WorldWideScience

Sample records for model performance model

  1. Well performance model

    International Nuclear Information System (INIS)

    Thomas, L.K.; Evans, C.E.; Pierson, R.G.; Scott, S.L.

    1992-01-01

    This paper describes the development and application of a comprehensive oil or gas well performance model. The model contains six distinct sections: stimulation design, tubing and/or casing flow, reservoir and near-wellbore calculations, production forecasting, wellbore heat transmission, and economics. These calculations may be performed separately or in an integrated fashion with data and results shared among the different sections. The model analysis allows evaluation of all aspects of well completion design, including the effects on future production and overall well economics

  2. NIF capsule performance modeling

    Directory of Open Access Journals (Sweden)

    Weber S.

    2013-11-01

    Full Text Available Post-shot modeling of NIF capsule implosions was performed in order to validate our physical and numerical models. Cryogenic layered target implosions and experiments with surrogate targets produce an abundance of capsule performance data including implosion velocity, remaining ablator mass, times of peak x-ray and neutron emission, core image size, core symmetry, neutron yield, and x-ray spectra. We have attempted to match the integrated data set with capsule-only simulations by adjusting the drive and other physics parameters within expected uncertainties. The simulations include interface roughness, time-dependent symmetry, and a model of mix. We were able to match many of the measured performance parameters for a selection of shots.

  3. Principles of Sonar Performance Modeling

    NARCIS (Netherlands)

    Ainslie, M.A.

    2010-01-01

    Sonar performance modelling (SPM) is concerned with the prediction of quantitative measures of sonar performance, such as probability of detection. It is a multidisciplinary subject, requiring knowledge and expertise in the disparate fields of underwater acoustics, acoustical oceanography, sonar

  4. Characterising performance of environmental models

    NARCIS (Netherlands)

    Bennett, N.D.; Croke, B.F.W.; Guariso, G.; Guillaume, J.H.A.; Hamilton, S.H.; Jakeman, A.J.; Marsili-Libelli, S.; Newham, L.T.H.; Norton, J.; Perrin, C.; Pierce, S.; Robson, B.; Seppelt, R.; Voinov, A.; Fath, B.D.; Andreassian, V.

    2013-01-01

    In order to use environmental models effectively for management and decision-making, it is vital to establish an appropriate level of confidence in their performance. This paper reviews techniques available across various fields for characterising the performance of environmental models with focus

  5. Multiprocessor performance modeling with ADAS

    Science.gov (United States)

    Hayes, Paul J.; Andrews, Asa M.

    1989-01-01

    A graph managing strategy referred to as the Algorithm to Architecture Mapping Model (ATAMM) appears useful for the time-optimized execution of application algorithm graphs in embedded multiprocessors and for the performance prediction of graph designs. This paper reports the modeling of ATAMM in the Architecture Design and Assessment System (ADAS) to make an independent verification of ATAMM's performance prediction capability and to provide a user framework for the evaluation of arbitrary algorithm graphs. Following an overview of ATAMM and its major functional rules are descriptions of the ADAS model of ATAMM, methods to enter an arbitrary graph into the model, and techniques to analyze the simulation results. The performance of a 7-node graph example is evaluated using the ADAS model and verifies the ATAMM concept by substantiating previously published performance results.

  6. Firm Sustainability Performance Index Modeling

    Directory of Open Access Journals (Sweden)

    Che Wan Jasimah Bt Wan Mohamed Radzi

    2015-12-01

    Full Text Available The main objective of this paper is to develop a model for a firm sustainability performance index by applying both classical and Bayesian structural equation modeling (parametric and semi-parametric modeling). Both techniques are applied to research data collected through a survey of the food manufacturing industries of China, Taiwan, and Malaysia. To estimate the firm sustainability performance index, we consider three main indicators: knowledge management, organizational learning, and business strategy. Based on both the Bayesian and classical methodologies, we confirmed that knowledge management and business strategy have a significant impact on the firm sustainability performance index.

  7. Air Conditioner Compressor Performance Model

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Ning; Xie, YuLong; Huang, Zhenyu

    2008-09-05

    During the past three years, the Western Electricity Coordinating Council (WECC) Load Modeling Task Force (LMTF) has led the effort to develop the new modeling approach. As part of this effort, the Bonneville Power Administration (BPA), Southern California Edison (SCE), and Electric Power Research Institute (EPRI) Solutions tested 27 residential air-conditioning units to assess their response to delayed voltage recovery transients. After completing these tests, different modeling approaches were proposed; among them, a performance modeling approach proved to be one of the three favored for its simplicity and its ability to recreate different SVR events satisfactorily. Funded by the California Energy Commission (CEC) under its load modeling project, researchers at Pacific Northwest National Laboratory (PNNL) led the follow-on task to analyze the motor testing data and derive the parameters needed to develop a performance model for the single-phase air-conditioning (SPAC) unit. To derive the performance model, PNNL researchers first used the motor voltage and frequency ramping test data to obtain the real (P) and reactive (Q) power versus voltage (V) and frequency (f) curves. Then, curve fitting was used to develop the P-V, Q-V, P-f, and Q-f relationships for the motor running and stalling states. The resulting performance model ignores the dynamic response of the air-conditioning motor. Because the inertia of the air-conditioning motor is very small (H<0.05), the motor moves from one steady state to another in a few cycles. The performance model is therefore a fair representation of the motor's behavior in both running and stalling states.
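
    The curve-fitting step described above can be sketched as follows. This is a minimal illustration with made-up ramp-test numbers, not PNNL's data or fitted coefficients, and it fits only the running-state P-V and Q-V relationships as quadratics (an assumed functional form).

```python
import numpy as np

# Hypothetical ramp-test samples: per-unit voltage vs. measured real power (kW)
# and reactive power (kVAr) for a single-phase air conditioner in the running state.
V = np.array([0.80, 0.85, 0.90, 0.95, 1.00, 1.05, 1.10])
P = np.array([3.10, 3.18, 3.25, 3.31, 3.40, 3.48, 3.55])   # real power
Q = np.array([1.90, 1.78, 1.70, 1.64, 1.60, 1.58, 1.57])   # reactive power

# Fit quadratic P(V) and Q(V) curves, one plausible form of the static
# (dynamics-ignored) performance model sketched in the abstract.
p_coeff = np.polyfit(V, P, deg=2)
q_coeff = np.polyfit(V, Q, deg=2)

def running_state_pq(v):
    """Return (P, Q) predicted by the fitted running-state curves."""
    return np.polyval(p_coeff, v), np.polyval(q_coeff, v)

if __name__ == "__main__":
    p_hat, q_hat = running_state_pq(0.92)
    print(f"P(0.92 pu) ~ {p_hat:.2f} kW, Q(0.92 pu) ~ {q_hat:.2f} kVAr")
```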

  8. MODELING SUPPLY CHAIN PERFORMANCE VARIABLES

    Directory of Open Access Journals (Sweden)

    Ashish Agarwal

    2005-01-01

    Full Text Available In order to understand the dynamic behavior of the variables that can play a major role in performance improvement in a supply chain, a System Dynamics-based model is proposed. The model provides an effective framework for analyzing the different variables affecting supply chain performance, and causal relationships among these variables have been identified. Variables emanating from performance measures such as gaps in customer satisfaction, cost minimization, lead-time reduction, service level improvement and quality improvement have been identified as goal-seeking loops. The proposed System Dynamics-based model analyzes the effect of the dynamic behavior of these variables over a period of 10 years on the performance of a case supply chain in the automotive business.
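
    As a rough illustration of the goal-seeking loops mentioned above, the sketch below simulates a single stock-and-flow loop over a 10-year horizon. The variable choice, target level, and adjustment time are invented for the example and are not taken from the paper's System Dynamics model.

```python
# Illustrative-only system dynamics sketch (not the paper's actual model):
# a single goal-seeking loop in which the gap between target and actual
# customer satisfaction drives corrective effort that closes the gap.
dt, years = 0.1, 10.0
steps = int(years / dt)

target = 0.95            # desired customer-satisfaction level (assumed)
satisfaction = 0.60      # initial level of the stock (assumed)
adjustment_time = 1.5    # years needed to close the gap (loop strength, assumed)

history = []
for _ in range(steps):
    gap = target - satisfaction          # performance gap (goal-seeking signal)
    flow = gap / adjustment_time         # corrective flow into the stock
    satisfaction += flow * dt            # Euler integration of the stock
    history.append(satisfaction)

print(f"satisfaction after {years:.0f} years: {history[-1]:.3f}")
```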

  9. Model Performance Evaluation and Scenario Analysis (MPESA)

    Science.gov (United States)

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).

  10. Behavior model for performance assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Brown-VanHoozer, S. A.

    1999-07-23

    Every individual channels information differently, based on their preference for the sensory modality or representational system (visual, auditory, or kinesthetic) that they tend to favor most (their primary representational system (PRS)). Therefore, some of us access and store our information primarily visually first, some auditorily, and others kinesthetically (through feel and touch), which in turn establishes our information processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task--the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive and motor systems stimulated and influenced by the three sensory modalities: visual, auditory and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance toward reducing human errors. Existing models give predictions based on time values or response times for a particular event, and these may be summed and averaged for a generalization of behavior(s). However, without a basic understanding of how the behavior was predicated through a decision-making strategy process, predictive models are overall inefficient in their analysis of the means by which behavior was generated. What is seen is the end result.

  11. Behavior model for performance assessment

    International Nuclear Information System (INIS)

    Brown-VanHoozer, S. A.

    1999-01-01

    Every individual channels information differently, based on their preference for the sensory modality or representational system (visual, auditory, or kinesthetic) that they tend to favor most (their primary representational system (PRS)). Therefore, some of us access and store our information primarily visually first, some auditorily, and others kinesthetically (through feel and touch), which in turn establishes our information processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task--the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive and motor systems stimulated and influenced by the three sensory modalities: visual, auditory and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance toward reducing human errors. Existing models give predictions based on time values or response times for a particular event, and these may be summed and averaged for a generalization of behavior(s). However, without a basic understanding of how the behavior was predicated through a decision-making strategy process, predictive models are overall inefficient in their analysis of the means by which behavior was generated. What is seen is the end result.

  12. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    Full Text Available In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques for model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model, using an example taken from a management study.
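
    A minimal sketch of the kind of quantitative performance measures discussed above, using scikit-learn on synthetic data (the management-study data are not available here); hold-out AUC and the Brier score stand in for the paper's specific measures.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, brier_score_loss
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the management-study data used in the paper.
X, y = make_classification(n_samples=2000, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
prob = model.predict_proba(X_te)[:, 1]

# Two common quantitative performance measures for a logistic model:
# discrimination on held-out data (AUC) and calibration (Brier score).
print(f"hold-out AUC: {roc_auc_score(y_te, prob):.3f}")
print(f"Brier score:  {brier_score_loss(y_te, prob):.3f}")
```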

  13. Model description and evaluation of model performance: DOSDIM model

    International Nuclear Information System (INIS)

    Lewyckyj, N.; Zeevaert, T.

    1996-01-01

    DOSDIM was developed to assess the impact to man from routine and accidental atmospheric releases. It is a compartmental, deterministic, radiological model. For an accidental release, dynamic transfer factors are used, in contrast to a routine release, for which equilibrium transfer factors are used. Parameter values were chosen to be conservative. Transfers between compartments are described by first-order differential equations. 2 figs
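
    The first-order compartmental transfer structure described above can be illustrated with a generic two-compartment system. The rate constants and compartment names below are assumptions made for the example, not DOSDIM parameter values.

```python
from scipy.integrate import solve_ivp

# Generic two-compartment sketch with first-order transfers, in the spirit of
# the DOSDIM description; all rates are illustrative (units of 1/day).
k12, k21, k_loss = 0.30, 0.05, 0.01

def rhs(t, y):
    c1, c2 = y
    dc1 = -k12 * c1 + k21 * c2 - k_loss * c1   # e.g. soil compartment
    dc2 = k12 * c1 - k21 * c2                  # e.g. vegetation compartment
    return [dc1, dc2]

sol = solve_ivp(rhs, t_span=(0.0, 100.0), y0=[1.0, 0.0])
c1_end, c2_end = sol.y[:, -1]
print(f"inventories after 100 d: compartment 1 = {c1_end:.3f}, compartment 2 = {c2_end:.3f}")
```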

  14. Calibration of PMIS pavement performance prediction models.

    Science.gov (United States)

    2012-02-01

    Improve the accuracy of TxDOT's existing pavement performance prediction models through calibrating these models using actual field data obtained from the Pavement Management Information System (PMIS). Ensure logical performance superiority patte...

  15. Human Performance Models of Pilot Behavior

    Science.gov (United States)

    Foyle, David C.; Hooey, Becky L.; Byrne, Michael D.; Deutsch, Stephen; Lebiere, Christian; Leiden, Ken; Wickens, Christopher D.; Corker, Kevin M.

    2005-01-01

    Five modeling teams from industry and academia were chosen by the NASA Aviation Safety and Security Program to develop human performance models (HPM) of pilots performing taxi operations and runway instrument approaches with and without advanced displays. One representative from each team will serve as a panelist to discuss their team's model architecture, augmentations and advancements to HPMs, and aviation-safety related lessons learned. Panelists will discuss how modeling results are influenced by a model's architecture and structure, the role of the external environment, specific modeling advances, and future directions and challenges for human performance modeling in aviation.

  16. Estuarine modeling: Does a higher grid resolution improve model performance?

    Science.gov (United States)

    Ecological models are useful tools to explore cause effect relationships, test hypothesis and perform management scenarios. A mathematical model, the Gulf of Mexico Dissolved Oxygen Model (GoMDOM), has been developed and applied to the Louisiana continental shelf of the northern ...

  17. Performance Measurement Model A TarBase model with ...

    Indian Academy of Sciences (India)

    rohit

    Model A: 8.0, 2.0, 94.52%, 88.46%, 76, 108, 12, 12, 0.86, 0.91, 0.78, 0.94. Model B: 2.0, 2.0, 93.18%, 89.33%, 64, 95, 10, 9, 0.88, 0.90, 0.75, 0.98. The above results for TEST-1 show details for our two models (Model A and Model B). Performance of Model A after adding the 32 negative dataset of MiRTif to our testing set (MiRecords) ...

  18. Analysing the temporal dynamics of model performance for hydrological models

    NARCIS (Netherlands)

    Reusser, D.E.; Blume, T.; Schaefli, B.; Zehe, E.

    2009-01-01

    The temporal dynamics of hydrological model performance gives insights into errors that cannot be obtained from global performance measures assigning a single number to the fit of a simulated time series to an observed reference series. These errors can include errors in data, model parameters, or

  19. Generalization performance of regularized neural network models

    DEFF Research Database (Denmark)

    Larsen, Jan; Hansen, Lars Kai

    1994-01-01

    Architecture optimization is a fundamental problem of neural network modeling. The optimal architecture is defined as the one which minimizes the generalization error. This paper addresses estimation of the generalization performance of regularized, complete neural network models. Regularization...

  20. Assessing Ecosystem Model Performance in Semiarid Systems

    Science.gov (United States)

    Thomas, A.; Dietze, M.; Scott, R. L.; Biederman, J. A.

    2017-12-01

    In ecosystem process modelling, comparing outputs to benchmark datasets observed in the field is an important way to validate models, allowing the modelling community to track model performance over time and compare models at specific sites. Multi-model comparison projects as well as models themselves have largely been focused on temperate forests and similar biomes. Semiarid regions, on the other hand, are underrepresented in land surface and ecosystem modelling efforts, and yet will be disproportionately impacted by disturbances such as climate change due to their sensitivity to changes in the water balance. Benchmarking models at semiarid sites is an important step in assessing and improving models' suitability for predicting the impact of disturbance on semiarid ecosystems. In this study, several ecosystem models were compared at a semiarid grassland in southwestern Arizona using PEcAn, or the Predictive Ecosystem Analyzer, an open-source eco-informatics toolbox ideal for creating the repeatable model workflows necessary for benchmarking. Models included SIPNET, DALEC, JULES, ED2, GDAY, LPJ-GUESS, MAESPA, CLM, CABLE, and FATES. Comparison between model output and benchmarks such as net ecosystem exchange (NEE) tended to produce high root mean square error and low correlation coefficients, reflecting poor simulation of seasonality and the tendency for models to create much higher carbon sources than observed. These results indicate that ecosystem models do not currently adequately represent semiarid ecosystem processes.
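
    A small sketch of the benchmarking comparison described above: it computes RMSE, correlation, and bias between a synthetic "observed" NEE series and a deliberately biased "modeled" series. The data are fabricated and PEcAn's actual benchmarking workflow is not reproduced here.

```python
import numpy as np

# Synthetic daily NEE benchmark vs. a biased model run (illustrative only).
rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 365)
nee_obs = np.sin(t) + rng.normal(0, 0.2, 365)
nee_mod = 0.6 * np.sin(t + 0.5) + 0.3          # weak seasonality plus a source bias

rmse = np.sqrt(np.mean((nee_mod - nee_obs) ** 2))   # root mean square error
corr = np.corrcoef(nee_mod, nee_obs)[0, 1]          # correlation coefficient
bias = np.mean(nee_mod - nee_obs)                   # positive = model is a stronger C source

print(f"RMSE = {rmse:.3f}, r = {corr:.3f}, bias = {bias:.3f}")
```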

  1. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Whitmore, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kaffine, Leah [National Renewable Energy Lab. (NREL), Golden, CO (United States); Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron P. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.
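
    One way to compute the annualized prediction error quoted above is sketched below with hypothetical monthly energy totals; the sign convention and the numbers are assumptions, not values from the NREL validation report.

```python
import numpy as np

# Hypothetical monthly energy totals (kWh) for one system: SAM prediction vs. field data.
sam_kwh = np.array([11200, 12800, 15900, 17400, 19000, 19600,
                    19100, 18300, 16200, 14100, 11500, 10400])
measured_kwh = np.array([10900, 12500, 16100, 17800, 18600, 19900,
                         18800, 18600, 16000, 13800, 11900, 10700])

# Annualized prediction error (sign convention assumed: positive = SAM over-predicts).
annual_error = (sam_kwh.sum() - measured_kwh.sum()) / measured_kwh.sum()
print(f"annualized prediction error: {100 * annual_error:+.2f}%")
```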

  2. Biofilm carrier migration model describes reactor performance.

    Science.gov (United States)

    Boltz, Joshua P; Johnson, Bruce R; Takács, Imre; Daigger, Glen T; Morgenroth, Eberhard; Brockmann, Doris; Kovács, Róbert; Calhoun, Jason M; Choubert, Jean-Marc; Derlon, Nicolas

    2017-06-01

    The accuracy of a biofilm reactor model depends on the extent to which physical system conditions (particularly bulk-liquid hydrodynamics and their influence on biofilm dynamics) deviate from the ideal conditions upon which the model is based. It follows that an improved capacity to model a biofilm reactor does not necessarily rely on an improved biofilm model, but does rely on an improved mathematical description of the biofilm reactor and its components. Existing biofilm reactor models typically include a one-dimensional biofilm model, a process (biokinetic and stoichiometric) model, and a continuous flow stirred tank reactor (CFSTR) mass balance that [when organizing CFSTRs in series] creates a pseudo two-dimensional (2-D) model of bulk-liquid hydrodynamics approaching plug flow. In such a biofilm reactor model, the user-defined biofilm area is specified for each CFSTR; thereby, X carrier does not exit the boundaries of the CFSTR to which it is assigned or exchange boundaries with other CFSTRs in the series. The error introduced by this pseudo 2-D biofilm reactor modeling approach may adversely affect model results and limit the model user's capacity to accurately calibrate a model. This paper presents a new sub-model that describes the migration of X carrier and associated biofilms, and evaluates the impact that X carrier migration and axial dispersion have on simulated system performance. The relevance of the new biofilm reactor model to engineering situations is discussed by applying it to known biofilm reactor types and operational conditions.

  3. Modelling and evaluation of surgical performance using hidden Markov models.

    Science.gov (United States)

    Megali, Giuseppe; Sinigaglia, Stefano; Tonet, Oliver; Dario, Paolo

    2006-10-01

    Minimally invasive surgery has become very widespread in the last ten years. Since surgeons experience difficulties in learning and mastering minimally invasive techniques, the development of training methods is of great importance. While the introduction of virtual reality-based simulators has introduced a new paradigm in surgical training, skill evaluation methods are far from being objective. This paper proposes a method for defining a model of surgical expertise and an objective metric to evaluate performance in laparoscopic surgery. Our approach is based on the processing of kinematic data describing movements of surgical instruments. We use hidden Markov model theory to define an expert model that describes expert surgical gesture. The model is trained on kinematic data related to exercises performed on a surgical simulator by experienced surgeons. Subsequently, we use this expert model as a reference model in the definition of an objective metric to evaluate performance of surgeons with different abilities. Preliminary results show that, using different topologies for the expert model, the method can be efficiently used both for the discrimination between experienced and novice surgeons, and for the quantitative assessment of surgical ability.
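
    A sketch of the expert-model idea using the third-party hmmlearn package: an HMM is trained on expert kinematic sequences, and the log-likelihood of a new sequence under that model serves as the performance score. The feature set, number of hidden states, and all data below are assumptions, not the authors' setup.

```python
import numpy as np
from hmmlearn import hmm   # third-party package, assumed available

rng = np.random.default_rng(0)

# Hypothetical kinematic feature sequences (e.g. tool-tip velocity components, grasp angle).
expert_runs = [rng.normal(0.0, 1.0, size=(200, 4)) for _ in range(5)]
trainee_run = rng.normal(0.5, 1.5, size=(200, 4))

# Train the "expert model" on concatenated expert sequences.
X = np.vstack(expert_runs)
lengths = [len(run) for run in expert_runs]
expert_model = hmm.GaussianHMM(n_components=5, covariance_type="diag", n_iter=50)
expert_model.fit(X, lengths)

# Higher (less negative) log-likelihood => the gesture is more "expert-like".
print("expert self-score :", expert_model.score(expert_runs[0]))
print("trainee score     :", expert_model.score(trainee_run))
```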

  4. Modelling and Motivating Academic Performance.

    Science.gov (United States)

    Brennan, Geoffrey; Pettit, Philip

    1991-01-01

    Three possible motivators for college teachers (individual economic interest, academic virtue, and academic honor) suggest mechanisms that can be used to improve performance. Policies need to address all three motivators; economic levers alone may undermine alternative ways of supporting good work. (MSE)

  5. Assembly line performance and modeling

    Science.gov (United States)

    Rane, Arun B.; Sunnapwar, Vivek K.

    2017-09-01

    The automobile sector forms the backbone of the manufacturing sector. The vehicle assembly line is an important section of an automobile plant where repetitive tasks are performed one after another at different workstations. In this work, a methodology is proposed to reduce cycle time and the time lost due to important factors like equipment failure, shortage of inventory, absenteeism, set-up, material handling, rejection and fatigue, in order to improve output within given cost constraints. Various relationships between these factors, the corresponding cost and output are established by a scientific approach. The methodology is validated in three different vehicle assembly plants. The proposed methodology may help practitioners to optimize the assembly line using lean techniques.

  6. Photovoltaic performance models - A report card

    Science.gov (United States)

    Smith, J. H.; Reiter, L. R.

    1985-01-01

    Models for the analysis of photovoltaic (PV) systems' designs, implementation policies, and economic performance, have proliferated while keeping pace with rapid changes in basic PV technology and extensive empirical data compiled for such systems' performance. Attention is presently given to the results of a comparative assessment of ten well documented and widely used models, which range in complexity from first-order approximations of PV system performance to in-depth, circuit-level characterizations. The comparisons were made on the basis of the performance of their subsystem, as well as system, elements. The models fall into three categories in light of their degree of aggregation into subsystems: (1) simplified models for first-order calculation of system performance, with easily met input requirements but limited capability to address more than a small variety of design considerations; (2) models simulating PV systems in greater detail, encompassing types primarily intended for either concentrator-incorporating or flat plate collector PV systems; and (3) models not specifically designed for PV system performance modeling, but applicable to aspects of electrical system design. Models ignoring subsystem failure or degradation are noted to exclude operating and maintenance characteristics as well.

  7. Constrained bayesian inference of project performance models

    OpenAIRE

    Sunmola, Funlade

    2013-01-01

    Project performance models play an important role in the management of project success. When used for monitoring projects, they can offer predictive ability such as indications of possible delivery problems. Approaches for monitoring project performance relies on available project information including restrictions imposed on the project, particularly the constraints of cost, quality, scope and time. We study in this paper a Bayesian inference methodology for project performance modelling in ...

  8. ORGANIZATIONAL LEARNING AND PERFORMANCE. A CONCEPTUAL MODEL

    OpenAIRE

    Alexandra Luciana GUȚĂ

    2013-01-01

    Through this paper, our main objective is to propose a conceptual model that links the notions of organizational learning (as a capability and as a process) and organizational performance. Our contribution consists in analyzing the literature on organizational learning and organizational performance and in proposing an integrated model that comprises: organizational learning capability, the process of organizational learning, organizational performance, human capital (the value and uniqueness...

  9. Critical review of glass performance modeling

    International Nuclear Information System (INIS)

    Bourcier, W.L.

    1994-07-01

    Borosilicate glass is to be used for permanent disposal of high-level nuclear waste in a geologic repository. Mechanistic chemical models are used to predict the rate at which radionuclides will be released from the glass under repository conditions. The most successful and useful of these models link reaction path geochemical modeling programs with a glass dissolution rate law that is consistent with transition state theory. These models have been used to simulate several types of short-term laboratory tests of glass dissolution and to predict the long-term performance of the glass in a repository. Although mechanistically based, the current models are limited by a lack of unambiguous experimental support for some of their assumptions. The most severe problem of this type is the lack of an existing validated mechanism that controls long-term glass dissolution rates. Current models can be improved by performing carefully designed experiments and using the experimental results to validate the rate-controlling mechanisms implicit in the models. These models should be supported with long-term experiments to be used for model validation. The mechanistic basis of the models should be explored by using modern molecular simulations such as molecular orbital and molecular dynamics to investigate both the glass structure and its dissolution process

  10. Critical review of glass performance modeling

    Energy Technology Data Exchange (ETDEWEB)

    Bourcier, W.L. [Lawrence Livermore National Lab., CA (United States)

    1994-07-01

    Borosilicate glass is to be used for permanent disposal of high-level nuclear waste in a geologic repository. Mechanistic chemical models are used to predict the rate at which radionuclides will be released from the glass under repository conditions. The most successful and useful of these models link reaction path geochemical modeling programs with a glass dissolution rate law that is consistent with transition state theory. These models have been used to simulate several types of short-term laboratory tests of glass dissolution and to predict the long-term performance of the glass in a repository. Although mechanistically based, the current models are limited by a lack of unambiguous experimental support for some of their assumptions. The most severe problem of this type is the lack of an existing validated mechanism that controls long-term glass dissolution rates. Current models can be improved by performing carefully designed experiments and using the experimental results to validate the rate-controlling mechanisms implicit in the models. These models should be supported with long-term experiments to be used for model validation. The mechanistic basis of the models should be explored by using modern molecular simulations such as molecular orbital and molecular dynamics to investigate both the glass structure and its dissolution process.

  11. Driver Performance Model: 1. Conceptual Framework

    National Research Council Canada - National Science Library

    Heimerl, Joseph

    2001-01-01

    ...'. At the present time, no such comprehensive model exists. This report discusses a conceptual framework designed to encompass the relationships, conditions, and constraints related to direct, indirect, and remote modes of driving and thus provides a guide or 'road map' for the construction and creation of a comprehensive driver performance model.

  12. Performance of hedging strategies in interval models

    NARCIS (Netherlands)

    Roorda, Berend; Engwerda, Jacob; Schumacher, J.M.

    2005-01-01

    For a proper assessment of risks associated with the trading of derivatives, the performance of hedging strategies should be evaluated not only in the context of the idealized model that has served as the basis of strategy development, but also in the context of other models. In this paper we

  13. Iowa calibration of MEPDG performance prediction models.

    Science.gov (United States)

    2013-06-01

    This study aims to improve the accuracy of AASHTO Mechanistic-Empirical Pavement Design Guide (MEPDG) pavement performance predictions for Iowa pavement systems through local calibration of MEPDG prediction models. A total of 130 representative p...

  14. Performance modeling, stochastic networks, and statistical multiplexing

    CERN Document Server

    Mazumdar, Ravi R

    2013-01-01

    This monograph presents a concise mathematical approach for modeling and analyzing the performance of communication networks with the aim of introducing an appropriate mathematical framework for modeling and analysis as well as understanding the phenomenon of statistical multiplexing. The models, techniques, and results presented form the core of traffic engineering methods used to design, control and allocate resources in communication networks.The novelty of the monograph is the fresh approach and insights provided by a sample-path methodology for queueing models that highlights the importan

  15. Evaluation of models in performance assessment

    International Nuclear Information System (INIS)

    Dormuth, K.W.

    1993-01-01

    The reliability of models used for performance assessment for high-level waste repositories is a key factor in making decisions regarding the management of high-level waste. Model reliability may be viewed as a measure of the confidence that regulators and others have in the use of these models to provide information for decision making. The degree of reliability required for the models will increase as implementation of disposal proceeds and decisions become increasingly important to safety. Evaluation of the models by using observations of real systems provides information that assists the assessment analysts and reviewers in establishing confidence in the conclusions reached in the assessment. A continuing process of model calibration, evaluation, and refinement should lead to increasing reliability of models as implementation proceeds. However, uncertainty in the model predictions cannot be eliminated, so decisions will always be made under some uncertainty. Examples from the Canadian program illustrate the process of model evaluation using observations of real systems and its relationship to performance assessment. 21 refs., 2 figs

  16. Performance modeling, loss networks, and statistical multiplexing

    CERN Document Server

    Mazumdar, Ravi

    2009-01-01

    This monograph presents a concise mathematical approach for modeling and analyzing the performance of communication networks with the aim of understanding the phenomenon of statistical multiplexing. The novelty of the monograph is the fresh approach and insights provided by a sample-path methodology for queueing models that highlights the important ideas of Palm distributions associated with traffic models and their role in performance measures. Also presented are recent ideas of large buffer, and many sources asymptotics that play an important role in understanding statistical multiplexing. I

  17. Generating Performance Models for Irregular Applications

    Energy Technology Data Exchange (ETDEWEB)

    Friese, Ryan D.; Tallent, Nathan R.; Vishnu, Abhinav; Kerbyson, Darren J.; Hoisie, Adolfy

    2017-05-30

    Many applications have irregular behavior --- non-uniform input data, input-dependent solvers, irregular memory accesses, unbiased branches --- that cannot be captured using today's automated performance modeling techniques. We describe new hierarchical critical path analyses for the Palm model generation tool. To create a model's structure, we capture tasks along representative MPI critical paths. We create a histogram of critical tasks with parameterized task arguments and instance counts. To model each task, we identify hot instruction-level sub-paths and model each sub-path based on data flow, instruction scheduling, and data locality. We describe application models that generate accurate predictions for strong scaling when varying CPU speed, cache speed, memory speed, and architecture. We present results for the Sweep3D neutron transport benchmark; Page Rank on multiple graphs; Support Vector Machine with pruning; and PFLOTRAN's reactive flow/transport solver with domain-induced load imbalance.

  18. Tailored model abstraction in performance assessments

    International Nuclear Information System (INIS)

    Kessler, J.H.

    1995-01-01

    Total System Performance Assessments (TSPAs) are likely to be one of the most significant parts of making safety cases for the continued development and licensing of geologic repositories for the disposal of spent fuel and HLW. Thus, it is critical that the TSPA model capture the 'essence' of the physical processes relevant to demonstrating the appropriate regulation is met. But how much detail about the physical processes must be modeled and understood before there is enough confidence that the appropriate essence has been captured? In this summary the level of model abstraction that is required is discussed. Approaches for subsystem and total system performance analyses are outlined, and the role of best estimate models is examined. It is concluded that a conservative approach for repository performance, based on limited amount of field and laboratory data, can provide sufficient confidence for a regulatory decision

  19. Individualized Biomathematical Modeling of Fatigue and Performance

    Science.gov (United States)

    2008-05-29

    [Figure captions only: gray bars indicate scheduled sleep periods; light gray areas indicate nocturnal sleep periods; the condition shown in black is total sleep deprivation; panel (b) shows performance predictions according to the new model; data early in each waking period are omitted to avoid confounds from sleep inertia.]

  20. Advances in HTGR fuel performance models

    International Nuclear Information System (INIS)

    Stansfield, O.M.; Goodin, D.T.; Hanson, D.L.; Turner, R.F.

    1985-01-01

    Advances in HTGR fuel performance models have improved the agreement between observed and predicted performance and contributed to an enhanced position of the HTGR with regard to investment risk and passive safety. Heavy metal contamination is the source of about 55% of the circulating activity in the HTGR during normal operation, and the remainder comes primarily from particles which failed because of defective or missing buffer coatings. These failed particles make up about a 5 x 10^-4 fraction of the total core inventory. In addition to prediction of fuel performance during normal operation, the models are used to determine fuel failure and fission product release during core heat-up accident conditions. The mechanistic nature of the models, which incorporate all important failure modes, permits the prediction of performance from the relatively modest accident temperatures of a passively safe HTGR to the much more severe accident conditions of the larger 2240-MW(t) HTGR. (author)

  1. Data Model Performance in Data Warehousing

    Science.gov (United States)

    Rorimpandey, G. C.; Sangkop, F. I.; Rantung, V. P.; Zwart, J. P.; Liando, O. E. S.; Mewengkang, A.

    2018-02-01

    Data warehouses have increasingly become important in organizations that have large amounts of data. A data warehouse is not a product but part of a solution for the decision support system of those organizations. The data model is the starting point for designing and developing data warehouse architectures; thus, the data model needs stable and consistent interfaces over a long period of time. The aim of this research is to determine which data model in data warehousing has the best performance. The research method is descriptive analysis, which has three main tasks: data collection and organization, analysis of data, and interpretation of data. The results of this research were examined with a statistical analysis method, which shows that there is no statistically significant difference among the data models used in data warehousing. Organizations can therefore utilize any of the four proposed data models when designing and developing a data warehouse.

  2. Performance Evaluation Model for Application Layer Firewalls.

    Directory of Open Access Journals (Sweden)

    Shichang Xuan

    Full Text Available Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.

  3. Performance Evaluation Model for Application Layer Firewalls.

    Science.gov (United States)

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
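
    As a generic illustration of the Erlangian queueing analysis mentioned in both records of this paper, the sketch below computes M/M/c (Erlang C) waiting probability and mean queueing delay for a hypothetical service-desk configuration; it is not the paper's layered firewall model.

```python
from math import factorial

def erlang_c_metrics(arrival_rate, service_rate, servers):
    """M/M/c steady-state metrics (Erlang C): waiting probability and mean queueing delay.

    Generic queueing sketch, not the firewall model's exact equations.
    """
    a = arrival_rate / service_rate            # offered load (Erlangs)
    rho = a / servers                          # utilisation, must be < 1
    if rho >= 1:
        raise ValueError("unstable queue: utilisation >= 1")

    # Erlang C probability that an arriving request must wait.
    p0_inv = (sum(a**k / factorial(k) for k in range(servers))
              + a**servers / (factorial(servers) * (1 - rho)))
    p_wait = (a**servers / (factorial(servers) * (1 - rho))) / p0_inv

    wq = p_wait / (servers * service_rate - arrival_rate)   # mean wait in queue
    return p_wait, wq

if __name__ == "__main__":
    # Hypothetical numbers: 90 requests/s, 25 requests/s per service desk, 4 desks.
    p_wait, wq = erlang_c_metrics(90.0, 25.0, 4)
    print(f"P(wait) = {p_wait:.3f}, mean queueing delay = {1000 * wq:.1f} ms")
```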

  4. Utilities for high performance dispersion model PHYSIC

    International Nuclear Information System (INIS)

    Yamazawa, Hiromi

    1992-09-01

    The description and usage of the utilities for the dispersion calculation model PHYSIC are summarized. The model was developed in a study on developing a high-performance SPEEDI system, with the purpose of introducing a meteorological forecast function into the environmental emergency response system. The procedure for a PHYSIC calculation consists of three steps: preparation of the relevant files, creation and submission of the JCL, and graphic output of the results. A user can carry out the above procedure with the help of the Geographical Data Processing Utility, the Model Control Utility, and the Graphic Output Utility. (author)

  5. Assessing The Performance of Hydrological Models

    Science.gov (United States)

    van der Knijff, Johan

    The performance of hydrological models is often characterized using the coefficient of efficiency, E. The sensitivity of E to extreme streamflow values, and the difficulty of deciding what value of E should be used as a threshold to identify 'good' models or model parameterizations, have proven to be serious shortcomings of this index. This paper reviews some alternative performance indices that have appeared in the literature. Legates and McCabe (1999) suggested a more generalized form of E, E'(j,B). Here, j is a parameter that controls how much emphasis is put on extreme streamflow values, and B defines a benchmark or 'null hypothesis' against which the results of the model are tested. E'(j,B) was used to evaluate a large number of parameterizations of a conceptual rainfall-runoff model, using 6 different combinations of j and B. First, the effect of j and B is explained. Second, it is demonstrated how the index can be used to explicitly test hypotheses about the model and the data. This approach appears to be particularly attractive if the index is used as a likelihood measure within a GLUE-type analysis.
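
    A compact sketch of the generalized efficiency E'(j,B) as it is described above: j weights the emphasis on extreme flows and B is the benchmark series. The exact form is inferred from the abstract and from Legates and McCabe (1999), so details may differ from the paper.

```python
import numpy as np

def generalized_efficiency(obs, sim, benchmark, j=2.0):
    """Generalized coefficient of efficiency E'(j, B).

    j controls the emphasis on extreme streamflow values (j=2 with the observed
    mean as benchmark recovers the classic coefficient of efficiency E);
    benchmark is the 'null hypothesis' series B the model must out-perform.
    Form inferred from the abstract; details may differ from the paper.
    """
    obs, sim, benchmark = map(np.asarray, (obs, sim, benchmark))
    num = np.sum(np.abs(obs - sim) ** j)
    den = np.sum(np.abs(obs - benchmark) ** j)
    return 1.0 - num / den

# Example: benchmark = observed mean (classic E), then a lag-1 benchmark with j=1.
obs = np.array([1.2, 3.4, 8.1, 5.0, 2.2, 1.1])
sim = np.array([1.0, 3.0, 7.2, 5.5, 2.6, 1.3])
print(generalized_efficiency(obs, sim, np.full_like(obs, obs.mean())))   # vs. mean
print(generalized_efficiency(obs, sim, np.roll(obs, 1), j=1.0))          # vs. lagged obs
```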

  6. A statistical model for predicting muscle performance

    Science.gov (United States)

    Byerly, Diane Leslie De Caix

    The objective of these studies was to develop a capability for predicting muscle performance and fatigue to be utilized for both space- and ground-based applications. To develop this predictive model, healthy test subjects performed a defined, repetitive dynamic exercise to failure using a Lordex spinal machine. Throughout the exercise, surface electromyography (SEMG) data were collected from the erector spinae using a Mega Electronics ME3000 muscle tester and surface electrodes placed on both sides of the back muscle. These data were analyzed using a 5th order Autoregressive (AR) model and statistical regression analysis. It was determined that an AR derived parameter, the mean average magnitude of AR poles, significantly correlated with the maximum number of repetitions (designated Rmax) that a test subject was able to perform. Using the mean average magnitude of AR poles, a test subject's performance to failure could be predicted as early as the sixth repetition of the exercise. This predictive model has the potential to provide a basis for improving post-space flight recovery, monitoring muscle atrophy in astronauts and assessing the effectiveness of countermeasures, monitoring astronaut performance and fatigue during Extravehicular Activity (EVA) operations, providing pre-flight assessment of the ability of an EVA crewmember to perform a given task, improving the design of training protocols and simulations for strenuous International Space Station assembly EVA, and enabling EVA work task sequences to be planned enhancing astronaut performance and safety. Potential ground-based, medical applications of the predictive model include monitoring muscle deterioration and performance resulting from illness, establishing safety guidelines in the industry for repetitive tasks, monitoring the stages of rehabilitation for muscle-related injuries sustained in sports and accidents, and enhancing athletic performance through improved training protocols while reducing
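
    The AR-derived parameter described above can be sketched as follows. The AR(5) fit here uses ordinary least squares on a synthetic SEMG-like signal, since the study's exact estimation procedure and data are not given in the abstract.

```python
import numpy as np

def ar_pole_mean_magnitude(x, order=5):
    """Fit an AR(order) model by least squares and return the mean magnitude of its
    poles -- the SEMG-derived parameter used above as a fatigue/performance predictor.
    (Least squares is used for simplicity; the study's estimator is not specified.)"""
    x = np.asarray(x, dtype=float)
    # Regression x[t] = a1*x[t-1] + ... + a_p*x[t-p] + e[t].
    X = np.column_stack([x[order - k - 1: len(x) - k - 1] for k in range(order)])
    y = x[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    # Poles are the roots of z^p - a1*z^(p-1) - ... - a_p = 0.
    poles = np.roots(np.concatenate(([1.0], -a)))
    return np.mean(np.abs(poles))

# Synthetic SEMG-like burst (smoothed noise) standing in for one exercise repetition.
rng = np.random.default_rng(2)
semg = np.convolve(rng.normal(size=2000), np.ones(5) / 5, mode="same")
print(f"mean AR-pole magnitude: {ar_pole_mean_magnitude(semg):.3f}")
```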

  7. New Diagnostics to Assess Model Performance

    Science.gov (United States)

    Koh, Tieh-Yong

    2013-04-01

    The comparison of model performance between the tropics and the mid-latitudes is particularly problematic for observables like temperature and humidity: in the tropics, these observables have little variation and so may give an apparent impression that model predictions are often close to observations; on the contrary, they vary widely in mid-latitudes and so the discrepancy between model predictions and observations might be unnecessarily over-emphasized. We have developed a suite of mathematically rigorous diagnostics that measures normalized errors accounting for the observed and modeled variability of the observables themselves. Another issue in evaluating model performance is the relative importance of getting the variance of an observable right versus getting the modeled variation to be in phase with the observed. The correlation-similarity diagram was designed to analyse the pattern error of a model by breaking it down into contributions from amplitude and phase errors. A final and important question pertains to the generalization of scalar diagnostics to analyse vector observables like wind. In particular, measures of variance and correlation must be properly derived to avoid the mistake of ignoring the covariance between north-south and east-west winds (hence wrongly assuming that the north-south and east-west directions form a privileged vector basis for error analysis). There is also a need to quantify systematic preferences in the direction of vector wind errors, which we make possible by means of an error anisotropy diagram. Although the suite of diagnostics is mentioned with reference to model verification here, it is generally applicable to quantify differences between two datasets (e.g. from two observation platforms). Reference publication: Koh, T. Y. et al. (2012), J. Geophys. Res., 117, D13109, doi:10.1029/2011JD017103. also available at http://www.ntu.edu.sg/home/kohty

  8. A practical model for sustainable operational performance

    International Nuclear Information System (INIS)

    Vlek, C.A.J.; Steg, E.M.; Feenstra, D.; Gerbens-Leenis, W.; Lindenberg, S.; Moll, H.; Schoot Uiterkamp, A.; Sijtsma, F.; Van Witteloostuijn, A.

    2002-01-01

    By means of a concrete model for sustainable operational performance, enterprises can report uniformly on the sustainability of their contributions to the economy, welfare and the environment. The development and design of a three-dimensional monitoring system is presented and discussed.

  9. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    Science.gov (United States)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors of disposal of LLW. Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model which supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example. A
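
    A toy version of the probabilistic PA workflow described above: uncertain inputs are sampled, propagated through a fictitious release-dilution-dose chain, and summarized against a performance objective. All distributions, units, and the 25 mrem/yr objective are illustrative assumptions, consistent with the abstract's statement that the model itself is generic and fictitious.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000   # Monte Carlo realizations

# Uncertain inputs (all distributions and units are illustrative assumptions).
release_rate = rng.lognormal(mean=np.log(1e-3), sigma=0.8, size=n)    # Ci/yr
dilution = rng.triangular(left=1e-6, mode=5e-6, right=2e-5, size=n)   # (Ci/m3) per (Ci/yr)
dose_factor = rng.normal(loc=50.0, scale=5.0, size=n)                 # mrem per (Ci/m3)

# Propagate each realization through the toy release -> transport -> dose chain.
dose = release_rate * dilution * dose_factor   # mrem/yr

mean, p95 = dose.mean(), np.percentile(dose, 95)
print(f"mean dose = {mean:.2e} mrem/yr, 95th percentile = {p95:.2e} mrem/yr")
print(f"fraction of realizations above a 25 mrem/yr objective: {(dose > 25).mean():.4f}")
```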

  10. DETRA: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Suolanen, V.

    1996-01-01

    The computer code DETRA is a generic tool for environmental transfer analyses of radioactive or stable substances. The code has been applied for various purposes, mainly problems related to the biospheric transfer of radionuclides both in safety analyses of disposal of nuclear wastes and in consideration of foodchain exposure pathways in the analyses of off-site consequences of reactor accidents. For each specific application an individually tailored conceptual model can be developed. The biospheric transfer analyses performed by the code are typically carried out for terrestrial, aquatic and food chain applications. 21 refs, 35 figs, 15 tabs

  11. Performance model for a CCTV-MTI

    International Nuclear Information System (INIS)

    Dunn, D.R.; Dunbar, D.L.

    1978-01-01

    CCTV-MTI (closed circuit television--moving target indicator) monitors represent typical components of access control systems, as for example in a material control and accounting (MC and A) safeguards system. This report describes a performance model for a CCTV-MTI monitor. The performance of a human in an MTI role is a separate problem and is not addressed here. This work was done in conjunction with the NRC sponsored LLL assessment procedure for MC and A systems which is presently under development. We develop a noise model for a generic camera system and a model for the detection mechanism for a postulated MTI design. These models are then translated into an overall performance model. Measures of performance are probabilities of detection and false alarm as a function of intruder-induced grey level changes in the protected area. Sensor responsivity, lens F-number, source illumination and spectral response were treated as design parameters. Some specific results are illustrated for a postulated design employing a camera with a Si-target vidicon. Reflectance or light level changes in excess of 10% due to an intruder will be detected with a very high probability for the portion of the visible spectrum with wavelengths above 500 nm. The resulting false alarm rate was less than one per year. We did not address sources of nuisance alarms due to adverse environments, reliability, resistance to tampering, nor did we examine the effects of the spatial frequency response of the optics. All of these are important and will influence overall system detection performance
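
    The detection and false-alarm probabilities mentioned above can be illustrated with a single-pixel threshold test under Gaussian camera noise. The noise level and threshold below are assumptions; the report's actual detector and camera-noise model are more involved.

```python
from math import erf, sqrt

def q(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * (1.0 - erf(x / sqrt(2.0)))

# Illustrative single-pixel MTI sketch (not the report's exact detector):
# a frame-difference statistic is compared against a threshold; camera noise
# is modelled as zero-mean Gaussian with standard deviation sigma (grey levels).
sigma = 2.0          # noise standard deviation in grey levels (assumed)
threshold = 8.0      # alarm threshold in grey levels (assumed)

p_false_alarm = q(threshold / sigma)
for delta in (5.0, 10.0, 15.0, 20.0):   # intruder-induced grey-level change
    p_detect = q((threshold - delta) / sigma)
    print(f"delta={delta:4.1f}  Pd={p_detect:.4f}  Pfa={p_false_alarm:.2e}")
```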

  12. Performance assessment modeling of pyrometallurgical process wasteforms

    International Nuclear Information System (INIS)

    Nutt, W.M.; Hill, R.N.; Bullen, D.B.

    1995-01-01

    Performance assessment analyses have been completed to estimate the behavior of high-level nuclear wasteforms generated from the pyrometallurgical processing of liquid metal reactor (LMR) and light water reactor (LWR) spent nuclear fuel. Waste emplaced in the proposed repository at Yucca Mountain is investigated as the basis for the study. The resulting cumulative actinide and fission product releases to the accessible environment within a 100,000 year period from the various pyrometallurgical process wasteforms are compared to those of directly disposed LWR spent fuel using the same total repository system model. The impact of differing radionuclide transport models on the overall release characteristics is investigated

  13. Performance Evaluation and Modelling of Container Terminals

    Science.gov (United States)

    Venkatasubbaiah, K.; Rao, K. Narayana; Rao, M. Malleswara; Challa, Suresh

    2018-02-01

    The present paper evaluates and analyzes the performance of 28 container terminals of south East Asia through data envelopment analysis (DEA), principal component analysis (PCA) and hybrid method of DEA-PCA. DEA technique is utilized to identify efficient decision making unit (DMU)s and to rank DMUs in a peer appraisal mode. PCA is a multivariate statistical method to evaluate the performance of container terminals. In hybrid method, DEA is integrated with PCA to arrive the ranking of container terminals. Based on the composite ranking, performance modelling and optimization of container terminals is carried out through response surface methodology (RSM).

  14. Cognitive performance modeling based on general systems performance theory.

    Science.gov (United States)

    Kondraske, George V

    2010-01-01

    General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).

  15. Hybrid Modeling Improves Health and Performance Monitoring

    Science.gov (United States)

    2007-01-01

    Scientific Monitoring Inc. was awarded a Phase I Small Business Innovation Research (SBIR) project by NASA's Dryden Flight Research Center to create a new, simplified health-monitoring approach for flight vehicles and flight equipment. The project developed a hybrid physical model concept that provided a structured approach to simplifying complex design models for use in health monitoring, allowing the output or performance of the equipment to be compared to what the design models predicted, so that deterioration or impending failure could be detected before there would be an impact on the equipment's operational capability. Based on the original modeling technology, Scientific Monitoring released I-Trend, a commercial health- and performance-monitoring software product named for its intelligent trending, diagnostics, and prognostics capabilities, as part of the company's complete ICEMS (Intelligent Condition-based Equipment Management System) suite of monitoring and advanced alerting software. I-Trend uses the hybrid physical model to better characterize the nature of health or performance alarms that result in "no fault found" false alarms. Additionally, the use of physical principles helps I-Trend identify problems sooner. I-Trend technology is currently in use in several commercial aviation programs, and the U.S. Air Force recently tapped Scientific Monitoring to develop next-generation engine health-management software for monitoring its fleet of jet engines. Scientific Monitoring has continued the original NASA work, this time under a Phase III SBIR contract with a joint NASA-Pratt & Whitney aviation security program on propulsion-controlled aircraft under missile-damaged aircraft conditions.

  16. Performance modeling of network data services

    Energy Technology Data Exchange (ETDEWEB)

    Haynes, R.A.; Pierson, L.G.

    1997-01-01

    Networks at major computational organizations are becoming increasingly complex. The introduction of large massively parallel computers and supercomputers with gigabyte memories is requiring greater and greater bandwidth for network data transfers to widely dispersed clients. For networks to provide adequate data transfer services to high performance computers and remote users connected to them, the networking components must be optimized from a combination of internal and external performance criteria. This paper describes research done at Sandia National Laboratories to model network data services and to visualize the flow of data from source to sink when using the data services.

  17. Performance Measurement Model A TarBase model with ...

    Indian Academy of Sciences (India)

    rohit

    [Fragment of a results table: C = Cost, G = Gamma, CV = Cross Validation, MCC = Matthews Correlation Coefficient; Test 1 reports C, G, CV, Accuracy, TP, TN, FP, FN ...] Conclusion: Without considering the MirTif negative dataset for training Model A and B classifiers, our Model A and B ...

  18. Modelling the predictive performance of credit scoring

    Directory of Open Access Journals (Sweden)

    Shi-Wei Shen

    2013-07-01

    Research purpose: The purpose of this empirical paper was to examine the predictive performance of credit scoring systems in Taiwan. Motivation for the study: Corporate lending remains a major business line for financial institutions. However, in light of the recent global financial crises, it has become extremely important for financial institutions to implement rigorous means of assessing clients seeking access to credit facilities. Research design, approach and method: Using a data sample of 10 349 observations drawn between 1992 and 2010, logistic regression models were utilised to examine the predictive performance of credit scoring systems. Main findings: A goodness-of-fit test demonstrated that credit scoring models that incorporated the Taiwan Corporate Credit Risk Index (TCRI) together with micro- and macroeconomic variables possessed greater predictive power. This suggests that macroeconomic variables do have explanatory power for default credit risk. Practical/managerial implications: The originality of the study was that three models were developed to predict corporate defaults based on different microeconomic and macroeconomic factors, such as the TCRI, asset growth rates, the stock index and gross domestic product. Contribution/value-add: The study utilises different goodness-of-fit measures and receiver operating characteristic curves to examine the robustness of the predictive power of these factors.
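    As a minimal sketch of the kind of scoring model described above, the Python snippet below fits a logistic regression to hypothetical firm-level and macroeconomic predictors and reports the area under the ROC curve; the variable names and synthetic data are illustrative placeholders, not the study's dataset.

        # Hypothetical sketch of a credit-default scoring model (not the paper's data).
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 2000
        X = np.column_stack([
            rng.normal(0.05, 0.2, n),   # asset growth rate (illustrative)
            rng.normal(0.0, 1.0, n),    # stock index return (illustrative)
            rng.normal(0.02, 0.01, n),  # GDP growth (illustrative)
        ])
        logit = -2.0 - 1.5 * X[:, 0] - 0.8 * X[:, 2] * 50
        y = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # synthetic default indicator

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))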

  19. Modelling fuel cell performance using artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Ogaji, S.O.T.; Singh, R.; Pilidis, P.; Diacakis, M. [Power Propulsion and Aerospace Engineering Department, Centre for Diagnostics and Life Cycle Costs, Cranfield University (United Kingdom)

    2006-03-09

    Over the last few years, fuel cell technology has been promisingly increasing its share in the generation of stationary power. Numerous pilot projects are operating worldwide, continuously increasing the number of operating hours, either as stand-alone devices or as part of gas turbine combined cycles. An essential tool for the adequate and dynamic analysis of such systems is a software model that enables the user to assess a large number of alternative options in the least possible time. On the other hand, the sphere of application of artificial neural networks has widened, covering such fields as medicine, finance and, unsurprisingly, engineering (diagnostics of faults in machines). Artificial neural networks have been described as diagrammatic representations of a mathematical equation that receives values (inputs) and gives out results (outputs). Artificial neural network systems have the capacity to recognise and associate patterns and, because of their inherent design features, they can be applied to linear and non-linear problem domains. In this paper, the performance of the fuel cell is modelled using artificial neural networks. The inputs to the network are variables that are critical to the performance of the fuel cell, while the outputs are the effects of changes in any one or all of the fuel cell design variables on its performance. Critical parameters for the cell include the geometrical configuration as well as the operating conditions. For the neural network, various network design parameters, such as the network size, training algorithm and activation functions, and their effects on the performance modelling are discussed. Results from the analysis as well as the limitations of the approach are presented and discussed. (author)
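    A hedged sketch of this kind of data-driven performance model is shown below: a small feed-forward network maps assumed operating-condition inputs (current density, temperature, pressure) to a cell voltage output. The input names, synthetic data and network settings are illustrative placeholders, not those used by the authors.

        # Hypothetical sketch: neural-network surrogate of fuel cell performance.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(1)
        n = 500
        current_density = rng.uniform(0.1, 1.2, n)      # A/cm^2 (illustrative)
        temperature = rng.uniform(330, 360, n)          # K (illustrative)
        pressure = rng.uniform(1.0, 3.0, n)             # bar (illustrative)

        # Synthetic "measured" voltage with a loosely polarisation-curve-like shape.
        voltage = (1.0 - 0.35 * current_density
                   + 0.002 * (temperature - 330)
                   + 0.02 * np.log(pressure)
                   + rng.normal(0, 0.01, n))

        X = np.column_stack([current_density, temperature, pressure])
        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(16, 16),
                                           max_iter=5000, random_state=0))
        model.fit(X, voltage)
        print(model.predict([[0.6, 345.0, 2.0]]))       # predicted cell voltage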

  20. Modelling fuel cell performance using artificial intelligence

    Science.gov (United States)

    Ogaji, S. O. T.; Singh, R.; Pilidis, P.; Diacakis, M.

    Over the last few years, fuel cell technology has been promisingly increasing its share in the generation of stationary power. Numerous pilot projects are operating worldwide, continuously increasing the number of operating hours, either as stand-alone devices or as part of gas turbine combined cycles. An essential tool for the adequate and dynamic analysis of such systems is a software model that enables the user to assess a large number of alternative options in the least possible time. On the other hand, the sphere of application of artificial neural networks has widened, covering such fields as medicine, finance and, unsurprisingly, engineering (diagnostics of faults in machines). Artificial neural networks have been described as diagrammatic representations of a mathematical equation that receives values (inputs) and gives out results (outputs). Artificial neural network systems have the capacity to recognise and associate patterns and, because of their inherent design features, they can be applied to linear and non-linear problem domains. In this paper, the performance of the fuel cell is modelled using artificial neural networks. The inputs to the network are variables that are critical to the performance of the fuel cell, while the outputs are the effects of changes in any one or all of the fuel cell design variables on its performance. Critical parameters for the cell include the geometrical configuration as well as the operating conditions. For the neural network, various network design parameters, such as the network size, training algorithm and activation functions, and their effects on the performance modelling are discussed. Results from the analysis as well as the limitations of the approach are presented and discussed.

  1. Model for measuring complex performance in an aviation environment

    International Nuclear Information System (INIS)

    Hahn, H.A.

    1988-01-01

    An experiment was conducted to identify models of pilot performance through the attainment and analysis of concurrent verbal protocols. Sixteen models were identified. Novice and expert pilots differed with respect to the models they used. Models were correlated to performance, particularly in the case of expert subjects. Models were not correlated to performance shaping factors (i.e. workload). 3 refs., 1 tab

  2. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water-/steam side has been formulated. The model has been formulated as a number of sub-models that are merged into an overall model for the complete boiler. Sub-models have been defined for the furnace, the convection zone (split in two: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel) and two models for, respectively, the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a number of Differential-Algebraic-Equation (DAE) systems. Subsequently MatLab/Simulink has been applied for carrying out the simulations. To be able to verify the simulated results, experiments have been carried out on a full-scale boiler plant.

  3. High Performance Modeling of Novel Diagnostics Configuration

    Science.gov (United States)

    Smith, Dalton; Gibson, John; Lodes, Rylie; Malcolm, Hayden; Nakamoto, Teagan; Parrack, Kristina; Trujillo, Christopher; Wilde, Zak; Los Alamos Laboratories Q-6 Students Team

    2017-06-01

    A novel diagnostics method to measure the Hayes Electric Effect was tested and verified against computerized models. Where standard PVDF diagnostics utilize piezoelectric materials to measure detonation pressure through strain-induced electrical signals, the PVDF was used in a novel technique by also detecting the detonation's induced electric field. The ALE-3D Hydro Codes predicted the performance by calculating detonation velocities, pressures, and arrival times. These theoretical results then validated the experimental use of the PVDF repurposed to specifically track the Hayes Electric Effect. Los Alamos National Laboratories Q-6.

  4. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    Sub-models have been defined for the furnace, the convection zone (split in two: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel) and two models for, respectively, the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a number of Differential-Algebraic-Equation (DAE) systems. Subsequently MatLab/Simulink has been applied for carrying out the simulations ...

  5. Models and criteria for waste repository performance

    International Nuclear Information System (INIS)

    Smith, C.F.; Cohen, J.J.

    1981-03-01

    A primary objective of the Waste Management Program is to assure that public health is protected. Predictive modeling, to some extent, will play a role in assuring that this objective is met. This paper considers the requirements and limitations of predictive modeling in providing useful inputs to waste management decision making. Criteria development needs and the relation between criteria and models are also discussed

  6. ECOPATH: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Bergstroem, U.; Nordlinder, S.

    1996-01-01

    The model is based upon compartment theory and it is run in combination with a statistical error propagation method (PRISM, Gardner et al. 1983). It is intended to be generic for application to other sites by simply changing parameter values. It was constructed especially for this scenario. However, it is based upon an earlier model designed for calculating relations between the released amount of radioactivity and doses to critical groups, used for Swedish regulations concerning annual reports of released radioactivity from routine operation of Swedish nuclear power plants (Bergstroem and Nordlinder, 1991). The model handles exposure from deposition on terrestrial areas as well as deposition on lakes, starting with deposition values. 14 refs, 16 figs, 7 tabs

  7. FARMLAND: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Attwood, C.; Fayers, C.; Mayall, A.; Brown, J.; Simmonds, J.R.

    1996-01-01

    The FARMLAND model was originally developed for use in connection with continuous, routine releases of radionuclides, but because it has many time-dependent features it has been developed further for a single accidental release. The most recent version of FARMLAND is flexible and can be used to predict activity concentrations in food as a function of time after both accidental and routine releases of radionuclides. The effect of deposition at different times of the year can be taken into account. FARMLAND contains a suite of models which simulate radionuclide transfer through different parts of the foodchain. The models can be used in different combinations and offer the flexibility to assess a variety of radiological situations. The main foods considered are green vegetables, grain products, root vegetables, milk, meat and offal from cattle, and meat and offal from sheep. A large variety of elements can be considered although the degree of complexity with which some are modelled is greater than others; isotopes of caesium, strontium and iodine are treated in greatest detail. 22 refs, 12 figs, 10 tabs

  8. Models and criteria for LLW disposal performance

    International Nuclear Information System (INIS)

    Smith, C.F.; Cohen, J.J.

    1980-12-01

    A primary objective of the Low Level Waste (LLW) Management Program is to assure that public health is protected. Predictive modeling, to some extent, will play a role in meeting this objective. This paper considers the requirements and limitations of predictive modeling in providing useful inputs to waste management decision making. In addition, criteria development needs and the relation between criteria and models are discussed

  9. Models and criteria for LLW disposal performance

    International Nuclear Information System (INIS)

    Smith, C.F.; Cohen, J.J.

    1980-01-01

    A primary objective of the Low Level Waste (LLW) Management Program is to assure that public health is protected. Predictive modeling, to some extent, will play a role in meeting this objective. This paper considers the requirements and limitations of predictive modeling in providing useful inputs to waste management decision making. In addition, criteria development needs and the relation between criteria and models are discussed

  10. High-performance phase-field modeling

    KAUST Repository

    Vignal, Philippe

    2015-04-27

    Many processes in engineering and sciences involve the evolution of interfaces. Among the mathematical frameworks developed to model these types of problems, the phase-field method has emerged as a possible solution. Phase-fields nonetheless lead to complex nonlinear, high-order partial differential equations, whose solution poses mathematical and computational challenges. Guaranteeing some of the physical properties of the equations has led to the development of efficient algorithms and discretizations capable of recovering said properties by construction [2, 5]. This work builds on these ideas, and proposes novel discretization strategies that guarantee numerical energy dissipation for both conserved and non-conserved phase-field models. The temporal discretization is based on a novel method which relies on Taylor series and ensures strong energy stability. It is second-order accurate, and can also be rendered linear to speed-up the solution process [4]. The spatial discretization relies on Isogeometric Analysis, a finite element method that possesses the k-refinement technology and enables the generation of high-order, high-continuity basis functions. These basis functions are well suited to handle the high-order operators present in phase-field models. Two-dimensional and three-dimensional results of the Allen-Cahn, Cahn-Hilliard, Swift-Hohenberg and phase-field crystal equations will be presented, which corroborate the theoretical findings, and illustrate the robustness of the method. Results related to more challenging examples, namely the Navier-Stokes Cahn-Hilliard and a diffusion-reaction Cahn-Hilliard system, will also be presented. The implementation was done in PetIGA and PetIGA-MF, high-performance Isogeometric Analysis frameworks [1, 3], designed to handle non-linear, time-dependent problems.
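    For orientation, the short Python sketch below integrates the one-dimensional Allen-Cahn equation with a naive explicit finite-difference scheme; it is only a toy illustration of the kind of phase-field dynamics discussed above, not the energy-stable, isogeometric discretization proposed in the work, and the grid size, interface parameter and time step are arbitrary choices.

        # Toy 1-D Allen-Cahn solver (explicit Euler, periodic boundary conditions).
        # Illustrative only; parameters are arbitrary, not from the cited work.
        import numpy as np

        L, N = 1.0, 128
        dx = L / N
        eps = 0.05                   # interface width parameter (assumed)
        dt = 0.2 * dx**2 / eps**2    # small step for explicit stability
        u = 0.05 * np.random.default_rng(0).standard_normal(N)  # random initial phase

        for _ in range(20000):
            lap = (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / dx**2
            u = u + dt * (eps**2 * lap + u - u**3)   # du/dt = eps^2 u_xx + u - u^3

        print(u.min(), u.max())      # phases separate towards -1 and +1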

  11. Modelling

    CERN Document Server

    Spädtke, P

    2013-01-01

    Modeling of technical machines became a standard technique once computers became powerful enough to handle the amount of data relevant to the specific system. Simulation of an existing physical device requires knowledge of all relevant quantities. Electric fields given by the surrounding boundary as well as magnetic fields caused by coils or permanent magnets have to be known. Internal sources for both fields are sometimes taken into account, such as space charge forces or the internal magnetic field of a moving bunch of charged particles. The solver routines used are briefly described and some benchmarking is shown to estimate the necessary computing times for different problems. Different types of charged particle sources will be shown together with suitable models to describe their physics. Electron guns are covered as well as different ion sources (volume ion sources, laser ion sources, Penning ion sources, electron resonance ion sources, and H$^-$-sources) together with some remarks on beam transport.

  12. Dengue human infection model performance parameters.

    Science.gov (United States)

    Endy, Timothy P

    2014-06-15

    Dengue is a global health problem and of concern to travelers and deploying military personnel; the development and licensure of an effective tetravalent dengue vaccine is a public health priority. The dengue viruses (DENVs) are mosquito-borne flaviviruses transmitted by infected Aedes mosquitoes. Illness manifests across a clinical spectrum, with severe disease characterized by intravascular volume depletion and hemorrhage. DENV illness results from a complex interaction of viral properties and host immune responses. Dengue vaccine development efforts are challenged by immunologic complexity, the lack of an adequate animal model of disease, the absence of an immune correlate of protection, and only partially informative immunogenicity assays. A dengue human infection model (DHIM) will be an essential tool in developing potential dengue vaccines or antivirals. The potential performance parameters needed for a DHIM to support vaccine or antiviral candidates are discussed.

  13. The COD Model: Simulating Workgroup Performance

    Science.gov (United States)

    Biggiero, Lucio; Sevi, Enrico

    Though the question of the determinants of workgroup performance is one of the most central in organization science, precise theoretical frameworks and formal demonstrations are still missing. To fill this gap, the COD agent-based simulation model is presented here and used to study the effects of task interdependence and bounded rationality on workgroup performance. The first relevant finding is an algorithmic demonstration of the ordering of interdependencies in terms of complexity, showing that the parallel mode is the simplest, followed by the sequential and then by the reciprocal. This result is far from new in organization science, but what is remarkable is that it now has the strength of an algorithmic demonstration instead of resting on the authoritativeness of some scholar or on episodic empirical findings. The second important result is that the progressive introduction of realistic limits to agents' rationality dramatically reduces workgroup performance and leads to a rather interesting result: when agents' rationality is severely bounded, simple norms work better than complex norms. The third main finding is that when the complexity of interdependence is high, the appropriate coordination mechanism is agents' direct and active collaboration, which means teamwork.

  14. A Combat Mission Team Performance Model: Development and initial Application

    National Research Council Canada - National Science Library

    Silverman, Denise

    1997-01-01

    ... realistic combat scenarios. We present a conceptual model of team performance measurement in which aircrew coordination, team performance, mission performance and their interrelationships are operationally defined...

  15. Numerical modeling capabilities to predict repository performance

    International Nuclear Information System (INIS)

    1979-09-01

    This report presents a summary of current numerical modeling capabilities that are applicable to the design and performance evaluation of underground repositories for the storage of nuclear waste. The report includes codes that are available in-house, within Golder Associates and Lawrence Livermore Laboratories, as well as those that are generally available within the industry and universities. The first listing of programs consists of in-house codes in the subject areas of hydrology, solute transport, thermal and mechanical stress analysis, and structural geology. The second listing of programs is divided by subject into the following categories: site selection, structural geology, mine structural design, mine ventilation, hydrology, and mine design/construction/operation. These programs are not specifically designed for use in the design and evaluation of an underground repository for nuclear waste; but several or most of them may be so used

  16. Modelling Flexible Pavement Response and Performance

    DEFF Research Database (Denmark)

    Ullidtz, Per

    This textbook is primarily concerned with models for predicting the future condition of flexible pavements, as a function of traffic loading, climate, materials, etc., using analytical-empirical methods.

  17. HANDOVER MANAGEABILITY AND PERFORMANCE MODELING IN

    African Journals Online (AJOL)

    SOFTLINKS DIGITAL

    This work develops a model for interpreting implementation progress. The proposed progress monitoring model uses existing implementation artefact metrics to determine implementation velocity. As noted by McConnell [28], this velocity increases at the beginning and decreases near the end. A formal implementation ...

  18. Modeling, simulation and performance evaluation of parabolic ...

    African Journals Online (AJOL)

    A model of a parabolic trough power plant, taking into consideration the different losses associated with collection of the solar irradiance and thermal losses, is presented. MATLAB software is employed to model the power plant at reference state points. The code is then used to find the different reference values which are ...

  19. Detailed Performance Model for Photovoltaic Systems: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Tian, H.; Mancilla-David, F.; Ellis, K.; Muljadi, E.; Jenkins, P.

    2012-07-01

    This paper presents a modified current-voltage relationship for the single diode model. The single-diode model has been derived from the well-known equivalent circuit for a single photovoltaic cell. The modification presented in this paper accounts for both parallel and series connections in an array.
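    To make the underlying relationship concrete, the hedged Python sketch below evaluates the standard single-diode current-voltage equation and scales it to an Ns-series by Np-parallel array; the parameter values are generic placeholders and the scaling shown is the textbook form, not necessarily the specific modification proposed in the preprint.

        # Generic single-diode I-V relation, scaled to an Ns x Np array (illustrative values).
        import numpy as np
        from scipy.optimize import brentq

        q, k = 1.602e-19, 1.381e-23
        T = 298.15                      # cell temperature [K]
        Vt = k * T / q                  # thermal voltage
        Iph, I0, Rs, Rsh, n = 5.0, 1e-9, 0.02, 100.0, 1.3   # per-cell parameters (assumed)
        Ns, Np = 36, 2                  # series cells per string, parallel strings

        def cell_current(I_cell, V_cell):
            """Implicit single-diode equation f(I, V) = 0 for one cell."""
            return (Iph - I0 * (np.exp((V_cell + I_cell * Rs) / (n * Vt)) - 1.0)
                    - (V_cell + I_cell * Rs) / Rsh - I_cell)

        def array_current(V_array):
            V_cell = V_array / Ns                       # series connection divides voltage
            I_cell = brentq(cell_current, 0.0, Iph * 1.1, args=(V_cell,))
            return Np * I_cell                          # parallel strings add current

        for V in (0.0, 10.0, 18.0, 20.0):
            print(V, round(array_current(V), 3))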

  20. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    Science.gov (United States)

    The model performance evaluation consists of metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude and sequence errors.

  1. Modeling and optimization of LCD optical performance

    CERN Document Server

    Yakovlev, Dmitry A; Kwok, Hoi-Sing

    2015-01-01

    The aim of this book is to present the theoretical foundations of modeling the optical characteristics of liquid crystal displays, critically reviewing modern modeling methods and examining areas of applicability. The modern matrix formalisms of optics of anisotropic stratified media, most convenient for solving problems of numerical modeling and optimization of LCD, will be considered in detail. The benefits of combined use of the matrix methods will be shown, which generally provides the best compromise between physical adequacy and accuracy with computational efficiency and optimization fac

  2. Comparison of performance of simulation models for floor heating

    DEFF Research Database (Denmark)

    Weitzmann, Peter; Svendsen, Svend

    2005-01-01

    This paper describes the comparison of performance of simulation models for floor heating with different levels of detail in the modelling process. The models are compared in an otherwise identical simulation model containing room model, walls, windows, ceiling and ventilation system. By exchanging...

  3. Integrated thermodynamic model for ignition target performance

    Directory of Open Access Journals (Sweden)

    Springer P.T.

    2013-11-01

    We have derived a 3-dimensional synthetic model for NIF implosion conditions by predicting and optimizing fits to a broad set of x-ray and nuclear diagnostics obtained on each shot. By matching x-ray images, burn width, neutron time-of-flight ion temperature, yield, and fuel ρr, we obtain nearly unique constraints on conditions in the hotspot and fuel in a model that is entirely consistent with the observables. This model allows us to determine hotspot density, pressure, areal density (ρr), total energy, and other ignition-relevant parameters not available from any single diagnostic. This article describes the model and its application to National Ignition Facility (NIF) tritium–hydrogen–deuterium (THD) and DT implosion data, and provides an explanation for the large yield and ρr degradation compared to numerical code predictions.

  4. Mathematical Modeling of Circadian/Performance Countermeasures

    Data.gov (United States)

    National Aeronautics and Space Administration — We developed and refined our current mathematical model of circadian rhythms to incorporate melatonin as a marker rhythm. We used an existing physiologically based...

  5. Hydrologic Evaluation of Landfill Performance (HELP) Model

    Science.gov (United States)

    The program models rainfall, runoff, infiltration, and other water pathways to estimate how much water builds up above each landfill liner. It can incorporate data on vegetation, soil types, geosynthetic materials, initial moisture conditions, slopes, etc.

  6. The Five Key Questions of Human Performance Modeling.

    Science.gov (United States)

    Wu, Changxu

    2018-01-01

    By building computational (typically mathematical and computer simulation) models, human performance modeling (HPM) quantifies, predicts, and maximizes human performance and human-machine system productivity and safety. This paper describes and summarizes the five key questions of human performance modeling: 1) Why we build models of human performance; 2) What the expectations of a good human performance model are; 3) What the procedures and requirements in building and verifying a human performance model are; 4) How we integrate a human performance model with system design; and 5) What the possible future directions of human performance modeling research are. Recent and classic HPM findings are addressed in the five questions to provide new thinking on HPM's motivations, expectations, procedures, system integration and future directions.

  7. A unified tool for performance modelling and prediction

    International Nuclear Information System (INIS)

    Gilmore, Stephen; Kloul, Leila

    2005-01-01

    We describe a novel performability modelling approach, which facilitates the efficient solution of performance models extracted from high-level descriptions of systems. The notation which we use for our high-level designs is the Unified Modelling Language (UML) graphical modelling language. The technology which provides the efficient representation capability for the underlying performance model is the multi-terminal binary decision diagram (MTBDD)-based PRISM probabilistic model checker. The UML models are compiled through an intermediate language, the stochastic process algebra PEPA, before translation into MTBDDs for solution. We illustrate our approach on a real-world analysis problem from the domain of mobile telephony

  8. PV Performance Modeling Methods and Practices: Results from the 4th PV Performance Modeling Collaborative Workshop.

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    In 2014, the IEA PVPS Task 13 added the PVPMC as a formal activity to its technical work plan for 2014-2017. The goal of this activity is to expand the reach of the PVPMC to a broader international audience and help to reduce PV performance modeling uncertainties worldwide. One of the main deliverables of this activity is to host one or more PVPMC workshops outside the US to foster more international participation within this collaborative group. This report reviews the results of the first in a series of these joint IEA PVPS Task 13/PVPMC workshops. The 4th PV Performance Modeling Collaborative Workshop was held in Cologne, Germany at the headquarters of TÜV Rheinland on October 22-23, 2015.

  9. Persistence Modeling for Assessing Marketing Strategy Performance

    NARCIS (Netherlands)

    M.G. Dekimpe (Marnik); D.M. Hanssens (Dominique)

    2003-01-01

    The question of long-run market response lies at the heart of any marketing strategy that tries to create a sustainable competitive advantage for the firm or brand. A key challenge, however, is that only short-run results of marketing actions are readily observable. Persistence modeling

  10. Some useful characteristics of performance models

    International Nuclear Information System (INIS)

    Worledge, D.H.

    1985-01-01

    This paper examines the demands placed upon models of human cognitive decision processes in application to Probabilistic Risk Assessment. Successful models, for this purpose, should, 1) be based on proven or plausible psychological knowledge, e.g., Rasmussen's mental schematic, 2) incorporate opportunities for slips, 3) take account of the recursive nature, in time, of corrections to mistaken actions, and 4) depend on the crew's predominant mental states that accompany such recursions. The latter is equivalent to an explicit coupling between input and output of Rasmussen's mental schematic. A family of such models is proposed with observable rate processes mediating the (conscious) mental states involved. It is expected that the cumulative probability distributions corresponding to the individual rate processes can be identified with probability-time correlations of the HCR Human Cognitive Reliability type discussed elsewhere in this session. The functional forms of the conditional rates are intuitively shown to have simple characteristics that lead to a strongly recursive stochastic process with significant predictive capability. Models of the type proposed have few parts and form a representation that is intentionally far short of a fully transparent exposition of the mental process in order to avoid making impossible demands on data

  11. Evaluating Performances of Traffic Noise Models | Oyedepo ...

    African Journals Online (AJOL)

    Traffic noise in decibels dB(A) was measured at six locations using a 407780A Integrating Sound Level Meter, while spot speed and traffic volume were collected with a cine-camera. The predicted sound exposure level (SEL) was evaluated using the Burgess, British and FHWA models. The average noise level obtained is 77.64 ...

  12. HANDOVER MANAGEABILITY AND PERFORMANCE MODELING IN

    African Journals Online (AJOL)

    SOFTLINKS DIGITAL

    situations. Library Management Design Using Use Case. To model software using object-oriented design, a case study of the Bingham University Library Management System is used. Software is developed to automate the Bingham University manual library. The system will be stand-alone and will be designed with the ...

  13. Metrics for evaluating performance and uncertainty of Bayesian network models

    Science.gov (United States)

    Bruce G. Marcot

    2012-01-01

    This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...

  14. Modeling and Performance Analysis of Manufacturing Systems in ...

    African Journals Online (AJOL)

    This study deals with modeling and performance analysis of footwear manufacturing using Arena simulation modeling software. The study showed that modeling and simulation is a potential tool for modeling and analysis of manufacturing assembly lines like footwear manufacturing because it allows the researcher to ...

  15. Neuro-fuzzy model for evaluating the performance of processes ...

    Indian Academy of Sciences (India)

    In this work an Adaptive Neuro-Fuzzy Inference System (ANFIS) was used to model the periodic performance of some multi-input single-output (MISO) processes, namely: brewery operations (case study 1) and soap production (case study 2) processes. Two ANFIS models were developed to model the performance of the ...

  16. Performance and reliability model checking and model construction

    NARCIS (Netherlands)

    Hermanns, H.; Gnesi, Stefania; Schieferdecker, Ina; Rennoch, Axel

    2000-01-01

    Continuous-time Markov chains (CTMCs) are widely used to describe stochastic phenomena in many diverse areas. They are used to estimate performance and reliability characteristics of various nature, for instance to quantify throughputs of manufacturing systems, to locate bottlenecks in communication

  17. Modeling and analysis to quantify MSE wall behavior and performance.

    Science.gov (United States)

    2009-08-01

    To better understand potential sources of adverse performance of mechanically stabilized earth (MSE) walls, a suite of analytical models was studied using the computer program FLAC, a numerical modeling computer program widely used in geotechnical en...

  18. Characterization uncertainty and its effects on models and performance

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Treadway, A.H.

    1991-01-01

    Geostatistical simulation is being used to develop multiple geologic models of rock properties at the proposed Yucca Mountain repository site. Because each replicate model contains the same known information, and is thus essentially indistinguishable statistically from others, the differences between models may be thought of as representing the uncertainty in the site description. The variability among performance measures, such as ground water travel time, calculated using these replicate models therefore quantifies the uncertainty in performance that arises from uncertainty in site characterization.

  19. Evaluating Models of Human Performance: Safety-Critical Systems Applications

    Science.gov (United States)

    Feary, Michael S.

    2012-01-01

    This presentation is part of panel discussion on Evaluating Models of Human Performance. The purpose of this panel is to discuss the increasing use of models in the world today and specifically focus on how to describe and evaluate models of human performance. My presentation will focus on discussions of generating distributions of performance, and the evaluation of different strategies for humans performing tasks with mixed initiative (Human-Automation) systems. I will also discuss issues with how to provide Human Performance modeling data to support decisions on acceptability and tradeoffs in the design of safety critical systems. I will conclude with challenges for the future.

  20. Conceptual adsorption models and open issues pertaining to performance assessment

    International Nuclear Information System (INIS)

    Serne, R.J.

    1992-01-01

    Recently several articles have been published that question the appropriateness of the distribution coefficient, Rd, concept to quantify radionuclide migration. This paper addresses several distinct issues surrounding the modeling of nuclide retardation. The first section defines adsorption terminology and discusses various adsorption processes. The next section describes five commonly used adsorption conceptual models, specifically emphasizing which attributes that affect adsorption are explicitly accommodated in each model. I also review efforts to incorporate each adsorption model into performance assessment transport computer codes. The five adsorption conceptual models are (1) the constant Rd model, (2) the parametric Rd model, (3) isotherm adsorption models, (4) mass action adsorption models, and (5) surface-complexation with electrostatics models. The final section discusses the adequacy of the distribution ratio concept, the adequacy of transport calculations that rely on constant retardation factors and the status of incorporating sophisticated adsorption models into transport codes. 86 refs., 1 fig., 1 tab
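    As a small worked illustration of the constant-Rd approach mentioned above, the Python sketch below computes a retardation factor from a distribution coefficient using the familiar R = 1 + (rho_b/theta)*Kd relation and the resulting retarded transport velocity; the parameter values are arbitrary examples, not values from the paper.

        # Constant-Kd retardation factor illustration (arbitrary example values).
        rho_b = 1.6      # bulk density [kg/L]
        theta = 0.30     # volumetric water content [-]
        Kd = 5.0         # distribution coefficient [L/kg]
        v_water = 0.10   # pore-water velocity [m/yr]

        R = 1.0 + (rho_b / theta) * Kd      # retardation factor
        v_nuclide = v_water / R             # retarded radionuclide velocity

        print(f"R = {R:.1f}, nuclide velocity = {v_nuclide:.4f} m/yr")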

  1. The landscape of GPGPU performance modeling tools

    NARCIS (Netherlands)

    Madougou, S.; Varbanescu, A.; de Laat, C.; van Nieuwpoort, R.

    GPUs are gaining fast adoption as high-performance computing architectures, mainly because of their impressive peak performance. Yet most applications only achieve small fractions of this performance. While both programmers and architects have clear opinions about the causes of this performance gap,

  2. Atomic scale simulations for improved CRUD and fuel performance modeling

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Anders David Ragnar [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Cooper, Michael William Donald [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-06

    A more mechanistic description of fuel performance codes can be achieved by deriving models and parameters from atomistic scale simulations rather than fitting models empirically to experimental data. The same argument applies to modeling deposition of corrosion products on fuel rods (CRUD). Here are some results from publications in 2016 carried out using the CASL allocation at LANL.

  3. modeling the effect of bandwidth allocation on network performance

    African Journals Online (AJOL)

    In this paper, a new channel capacity model for interference-limited systems was obtained ... congestion admission control, with the intent of minimizing energy consumption at each terminal.

  4. Modelling of Box Type Solar Cooker Performance in a Tropical ...

    African Journals Online (AJOL)

    A thermal performance model of a box-type solar cooker with loaded water is presented. The model was developed using the method of Funk to estimate cooking power in terms of climatic and design parameters for a box-type solar cooker in a tropical environment. Coefficients for each term used in the model were determined ...

  5. Comparisons of Faulting-Based Pavement Performance Prediction Models

    Directory of Open Access Journals (Sweden)

    Weina Wang

    2017-01-01

    Faulting prediction is the core of concrete pavement maintenance and design. Highway agencies are always faced with the problem of low prediction accuracy, which causes costly maintenance. Although many researchers have developed performance prediction models, the accuracy of prediction has remained a challenge. This paper reviews performance prediction models and JPCP faulting models that have been used in past research. Then three models, including a multivariate nonlinear regression (MNLR) model, an artificial neural network (ANN) model, and a Markov chain (MC) model, are tested and compared using a set of actual pavement survey data taken on interstate highways with varying design features, traffic, and climate data. It is found that the MNLR model needs further recalibration, while the ANN model needs more data for training the network. The MC model seems a good tool for pavement performance prediction when data are limited, but it is based on visual inspections and not explicitly related to quantitative physical parameters. This paper then suggests that the further direction for developing performance prediction models is to incorporate the advantages and disadvantages of different models to obtain better accuracy.
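    A hedged sketch of the Markov-chain style of pavement performance prediction is given below: the condition states and transition probability matrix are invented for illustration, and the state distribution is simply propagated year by year.

        # Illustrative Markov-chain pavement deterioration model (made-up probabilities).
        import numpy as np

        # Condition states: 0 = good, 1 = fair, 2 = poor (hypothetical discretisation).
        P = np.array([
            [0.85, 0.13, 0.02],   # transitions from "good" in one year
            [0.00, 0.80, 0.20],   # transitions from "fair"
            [0.00, 0.00, 1.00],   # "poor" is absorbing until maintenance
        ])

        state = np.array([1.0, 0.0, 0.0])   # all sections start in "good" condition
        for year in range(1, 11):
            state = state @ P
            print(year, np.round(state, 3))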

  6. Identifying the connective strength between model parameters and performance criteria

    Directory of Open Access Journals (Sweden)

    B. Guse

    2017-11-01

    In hydrological models, parameters are used to represent the time-invariant characteristics of catchments and to capture different aspects of hydrological response. Hence, model parameters need to be identified based on their role in controlling the hydrological behaviour. For the identification of meaningful parameter values, multiple and complementary performance criteria are used that compare modelled and measured discharge time series. The reliability of the identification of hydrologically meaningful model parameter values depends on how distinctly a model parameter can be assigned to one of the performance criteria. To investigate this, we introduce the new concept of connective strength between model parameters and performance criteria. The connective strength assesses the intensity of the interrelationship between model parameters and performance criteria in a bijective way. In our analysis of connective strength, model simulations are carried out based on a Latin hypercube sampling. Ten performance criteria, including the Nash–Sutcliffe efficiency (NSE), the Kling–Gupta efficiency (KGE) and its three components (alpha, beta and r), as well as RSR (the ratio of the root mean square error to the standard deviation) for different segments of the flow duration curve (FDC), are calculated. With a joint analysis of two regression tree (RT) approaches, we derive how a model parameter is connected to different performance criteria. At first, RTs are constructed using each performance criterion as the target variable to detect the most relevant model parameters for each performance criterion. Secondly, RTs are constructed using each parameter as the target variable to detect which performance criteria are impacted by changes in the values of one distinct model parameter. Based on this, appropriate performance criteria are identified for each model parameter. In this study, a high bijective connective strength between model parameters and performance criteria
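    For reference, the Python sketch below computes two of the performance criteria named above, NSE and KGE, from simulated and observed discharge series; the series are random placeholders and the KGE shown is the standard formulation, which may differ in detail from the variant used in the paper.

        # NSE and KGE between simulated and observed discharge (placeholder data).
        import numpy as np

        def nse(sim, obs):
            return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def kge(sim, obs):
            r = np.corrcoef(sim, obs)[0, 1]          # linear correlation
            alpha = sim.std() / obs.std()            # variability ratio
            beta = sim.mean() / obs.mean()           # bias ratio
            return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

        rng = np.random.default_rng(42)
        obs = rng.gamma(2.0, 1.5, 365)               # synthetic "observed" discharge
        sim = obs * 1.05 + rng.normal(0, 0.3, 365)   # synthetic "simulated" discharge

        print("NSE:", round(nse(sim, obs), 3), "KGE:", round(kge(sim, obs), 3))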

  7. A performance comparison of atmospheric dispersion models over complex topography

    International Nuclear Information System (INIS)

    Kido, Hiroko; Oishi, Ryoko; Hayashi, Keisuke; Kanno, Mitsuhiro; Kurosawa, Naohiro

    2007-01-01

    A code system using a mass-consistent and Gaussian puff model was improved as a new option for atmospheric dispersion research. There are several atmospheric dispersion models for radionuclides. Because different models have both merits and disadvantages, it is necessary to choose the model that is most suitable for the surface conditions of the estimated region while considering the calculation time, accuracy, and purpose of the calculations being performed. Some models are less accurate when the topography is complex. It is important to understand the differences between the models for smooth and complex surfaces. In this study, the performances of the following four models were compared: (1) a Gaussian plume model; (2) a Gaussian puff model; (3) mass-consistent wind fields and a Gaussian puff model, improved in this study from one presented at the Aomori Energy Society of Japan 2005 Fall Meeting, D21; (4) a meso-scale meteorological model (RAMS: the Regional Atmospheric Modeling System) and a particle-type model (HYPACT: the RAMS Hybrid Particle and Concentration Transport Model) (reference: ATMET). (author)
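    For context, the sketch below evaluates the textbook Gaussian plume concentration formula for a continuous release with ground reflection; the dispersion coefficients use a simple assumed power-law fit and all numbers are illustrative, not taken from the comparison study.

        # Textbook Gaussian plume concentration (illustrative parameters only).
        import numpy as np

        Q = 1.0e6      # source strength [Bq/s] (assumed)
        u = 3.0        # wind speed [m/s] (assumed)
        H = 20.0       # effective release height [m] (assumed)

        def sigma(x):
            """Very rough power-law dispersion coefficients for a neutral class."""
            return 0.08 * x ** 0.90, 0.06 * x ** 0.85   # sigma_y, sigma_z [m]

        def concentration(x, y, z):
            sy, sz = sigma(x)
            return (Q / (2.0 * np.pi * u * sy * sz)
                    * np.exp(-y ** 2 / (2.0 * sy ** 2))
                    * (np.exp(-(z - H) ** 2 / (2.0 * sz ** 2))
                       + np.exp(-(z + H) ** 2 / (2.0 * sz ** 2))))  # ground reflection

        for x in (200.0, 500.0, 1000.0):
            print(x, concentration(x, 0.0, 0.0))   # centreline ground-level value [Bq/m^3]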

  8. Reference Manual for the System Advisor Model's Wind Power Performance Model

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, J.; Jorgenson, J.; Gilman, P.; Ferguson, T.

    2014-08-01

    This manual describes the National Renewable Energy Laboratory's System Advisor Model (SAM) wind power performance model. The model calculates the hourly electrical output of a single wind turbine or of a wind farm. The wind power performance model requires information about the wind resource, wind turbine specifications, wind farm layout (if applicable), and costs. In SAM, the performance model can be coupled to one of the financial models to calculate economic metrics for residential, commercial, or utility-scale wind projects. This manual describes the algorithms used by the wind power performance model, which is available in the SAM user interface and as part of the SAM Simulation Core (SSC) library, and is intended to supplement the user documentation that comes with the software.
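    As a hedged illustration of the kind of calculation such a performance model performs, the sketch below interpolates a made-up turbine power curve over synthetic hourly wind speeds and sums the annual energy; the power-curve points, wind speeds and loss factor are assumptions, not SAM's actual algorithm or data.

        # Simplified hourly wind-turbine output from a power curve (illustrative data).
        import numpy as np

        # Hypothetical power curve: wind speed [m/s] -> power [kW]
        ws_points = np.array([0, 3, 5, 8, 11, 13, 25, 25.01])
        kw_points = np.array([0, 0, 150, 900, 1900, 2000, 2000, 0])  # cut-out above 25 m/s

        rng = np.random.default_rng(7)
        hourly_ws = rng.weibull(2.0, 8760) * 7.0       # synthetic hub-height wind speeds

        hourly_kw = np.interp(hourly_ws, ws_points, kw_points)
        losses = 0.14                                   # lumped wake/electrical losses (assumed)
        annual_mwh = hourly_kw.sum() * (1 - losses) / 1000.0

        print(f"Annual energy: {annual_mwh:.0f} MWh")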

  9. Emerging Carbon Nanotube Electronic Circuits, Modeling, and Performance

    OpenAIRE

    Xu, Yao; Srivastava, Ashok; Sharma, Ashwani K.

    2010-01-01

    Current transport and dynamic models of carbon nanotube field-effect transistors are presented. A model of the single-walled carbon nanotube as an interconnect is also presented and extended to the modeling of single-walled carbon nanotube bundles. These models are applied in studying the performance of circuits such as the complementary carbon nanotube inverter pair and carbon nanotube interconnects. Cadence/Spectre simulations show that carbon nanotube field-effect transistor circuits can operate a...

  10. Team performance modeling for HRA in dynamic situations

    International Nuclear Information System (INIS)

    Shu Yufei; Furuta, Kazuo; Kondo, Shunsuke

    2002-01-01

    This paper proposes a team behavior network model that can simulate and analyze the response of an operator team to an incident in a dynamic and context-sensitive situation. The model is composed of four sub-models, which describe the context of team performance. They are the task model, event model, team model and human-machine interface model. Each operator demonstrates aspects of his/her specific cognitive behavior and interacts with other operators and the environment in order to deal with an incident. Individual human factors, which determine the basis of communication and interaction between individuals, and the cognitive process of an operator, such as information acquisition, state recognition, decision-making and action execution during development of an event scenario, are modeled. A case of feed and bleed operation in a pressurized water reactor under an emergency situation was studied and the result was compared with an experiment to check the validity of the proposed model

  11. Cost and Performance Assumptions for Modeling Electricity Generation Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Tidball, R.; Bluestein, J.; Rodriguez, N.; Knoke, S.

    2010-11-01

    The goal of this project was to compare and contrast utility scale power plant characteristics used in data sets that support energy market models. Characteristics include both technology cost and technology performance projections to the year 2050. Cost parameters include installed capital costs and operation and maintenance (O&M) costs. Performance parameters include plant size, heat rate, capacity factor or availability factor, and plant lifetime. Conventional, renewable, and emerging electricity generating technologies were considered. Six data sets, each associated with a different model, were selected. Two of the data sets represent modeled results, not direct model inputs. These two data sets include cost and performance improvements that result from increased deployment as well as resulting capacity factors estimated from particular model runs; other data sets represent model input data. For the technologies contained in each data set, the levelized cost of energy (LCOE) was also evaluated, according to published cost, performance, and fuel assumptions.
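    As a worked illustration of the LCOE metric mentioned above, the sketch below applies a standard simplified LCOE formula with a fixed charge rate; the capital cost, O&M, heat rate, fuel price and capacity factor are placeholder values, not figures from the compared data sets.

        # Simplified LCOE calculation (placeholder inputs, fixed-charge-rate form).
        capital_cost = 1800.0    # $/kW installed (assumed)
        fixed_om = 40.0          # $/kW-yr (assumed)
        variable_om = 3.0        # $/MWh (assumed)
        heat_rate = 7.5          # MMBtu/MWh (assumed)
        fuel_price = 4.0         # $/MMBtu (assumed)
        capacity_factor = 0.55   # (assumed)
        fcr = 0.08               # fixed charge rate annualising capital (assumed)

        mwh_per_kw_yr = 8760.0 * capacity_factor / 1000.0    # MWh generated per kW per year
        lcoe = ((capital_cost * fcr + fixed_om) / mwh_per_kw_yr
                + variable_om + heat_rate * fuel_price)

        print(f"LCOE = {lcoe:.1f} $/MWh")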

  12. Multitasking TORT under UNICOS: Parallel performance models and measurements

    International Nuclear Information System (INIS)

    Barnett, A.; Azmy, Y.Y.

    1999-01-01

    The existing parallel algorithms in the TORT discrete ordinates code were updated to function in a UNICOS environment. A performance model for the parallel overhead was derived for the existing algorithms. The largest contributors to the parallel overhead were identified and a new algorithm was developed. A parallel overhead model was also derived for the new algorithm. The predictions of the parallel performance models were compared to applications of the code to two TORT standard test problems and a large production problem. The parallel performance models agree well with the measured parallel overhead
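    A minimal sketch of this style of parallel performance model is shown below: an assumed serial time is combined with an Amdahl-type parallel term and a per-processor overhead to predict run time and speedup; the functional form and coefficients are illustrative, not the model actually derived for TORT.

        # Illustrative parallel-overhead performance model (assumed functional form).
        def predicted_time(p, t_serial=100.0, parallel_fraction=0.95,
                           overhead_per_proc=0.15):
            """Wall time on p processors: Amdahl-type term plus linear sync overhead."""
            t_par = t_serial * ((1 - parallel_fraction) + parallel_fraction / p)
            return t_par + overhead_per_proc * p

        for p in (1, 2, 4, 8, 16, 32):
            t = predicted_time(p)
            print(f"p={p:3d}  time={t:7.2f}  speedup={predicted_time(1)/t:5.2f}")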

  13. Multitasking TORT Under UNICOS: Parallel Performance Models and Measurements

    International Nuclear Information System (INIS)

    Azmy, Y.Y.; Barnett, D.A.

    1999-01-01

    The existing parallel algorithms in the TORT discrete ordinates code were updated to function in a UNICOS environment. A performance model for the parallel overhead was derived for the existing algorithms. The largest contributors to the parallel overhead were identified and a new algorithm was developed. A parallel overhead model was also derived for the new algorithm. The predictions of the parallel performance models were compared to applications of the code to two TORT standard test problems and a large production problem. The parallel performance models agree well with the measured parallel overhead

  14. Modeling the Mechanical Performance of Die Casting Dies

    Energy Technology Data Exchange (ETDEWEB)

    R. Allen Miller

    2004-02-27

    The following report covers work performed at Ohio State on modeling the mechanical performance of dies. The focus of the project was the development and, particularly, the verification of finite element techniques used to model and predict displacements and stresses in die casting dies. The work entails a major case study performed with an industrial partner on a production die and laboratory experiments performed at Ohio State.

  15. Automatic Performance Model Generation for Java Enterprise Edition (EE) Applications

    OpenAIRE

    Brunnert, Andreas; Vögele, Christian; Krcmar, Helmut

    2015-01-01

    The effort required to create performance models for enterprise applications is often out of proportion compared to their benefits. This work aims to reduce this effort by introducing an approach to automatically generate component-based performance models for running Java EE applications. The approach is applicable to all Java EE server products as it relies on standardized component types and interfaces to gather the required data for modeling an application. The feasibility of the approach...

  16. A Systemic Cause Analysis Model for Human Performance Technicians

    Science.gov (United States)

    Sostrin, Jesse

    2011-01-01

    This article presents a systemic, research-based cause analysis model for use in the field of human performance technology (HPT). The model organizes the most prominent barriers to workplace learning and performance into a conceptual framework that explains and illuminates the architecture of these barriers that exist within the fabric of everyday…

  17. Model for Agile Software Development Performance Monitoring

    OpenAIRE

    Žabkar, Nataša

    2013-01-01

    Agile methodologies have been in use for more than ten years and during this time they have proved to be efficient, even though the amount of empirical research is scarce, especially regarding agile software development performance monitoring. The most popular agile framework, Scrum, uses only one measure of performance: the amount of work remaining for implementation of a User Story from the Product Backlog or for implementation of a Task from the Sprint Backlog. In time the need for additional me...

  18. Models used to assess the performance of photovoltaic systems.

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua S.; Klise, Geoffrey T.

    2009-12-01

    This report documents the various photovoltaic (PV) performance models and software developed and utilized by researchers at Sandia National Laboratories (SNL) in support of the Photovoltaics and Grid Integration Department. In addition to PV performance models, hybrid system and battery storage models are discussed. A hybrid system using other distributed sources and energy storage can help reduce the variability inherent in PV generation, and due to the complexity of combining multiple generation sources and system loads, these models are invaluable for system design and optimization. Energy storage plays an important role in reducing PV intermittency and battery storage models are used to understand the best configurations and technologies to store PV generated electricity. Other researchers' models used by SNL are discussed, including some widely known models that incorporate algorithms developed at SNL. There are other models included in the discussion that are not used by or were not adopted from SNL research but may provide some benefit to researchers working on PV array performance, hybrid system models and energy storage. The paper is organized into three sections to describe the different software models as applied to photovoltaic performance, hybrid systems, and battery storage. For each model, there is a description which includes where to find the model, whether it is currently maintained and any references that may be available. Modeling improvements underway at SNL include quantifying the uncertainty of individual system components, the overall uncertainty in modeled vs. measured results and modeling large PV systems. SNL is also conducting research into the overall reliability of PV systems.

  19. Performance comparison of hydrological model structures during low flows

    Science.gov (United States)

    Staudinger, Maria; Stahl, Kerstin; Tallaksen, Lena M.; Clark, Martyn P.; Seibert, Jan

    2010-05-01

    Low flows are still poorly reproduced by common hydrological models since they are traditionally designed to reproduce peak flow situations as well as possible. As low flows become increasingly important in several target areas, there is a need to improve available models. We present a study that assesses the impact of model structure on low flow simulations. This is done using the Framework for Understanding Structural Errors (FUSE), which identifies the set of (subjective) decisions made when building a hydrological model and provides multiple options for each modeling decision. 79 models were built using the FUSE framework and applied to simulate stream flows in the Narsjø catchment in Norway (119 km²). To allow comparison, all new models were calibrated using an automatic optimization method. Low flow and recession analysis of the new models enables us to evaluate model performance focusing on different aspects by using various objective functions. Additionally, model structures responsible for poor performance, and hence unsuitable, can be detected. We focused on elucidating model performance during summer (August - October) and winter low flows, which evolve from entirely different hydrological processes in the Narsjø catchment. Summer low flows develop out of a lack of precipitation, while winter low flows are due to water storage in ice and snow. The results showed that simulations of summer low flows were consistently poorer than simulations of winter low flows when evaluating with an objective function focusing on low flows; here, the model structure influencing winter low flow simulations is the lower layer architecture. Different model structures were found to influence model performance during the summer season. The choice of other objective functions has the potential to affect such an evaluation. These findings call for the use of different model structures tailored to particular needs.

  20. A Spectral Evaluation of Models Performances in Mediterranean Oak Woodlands

    Science.gov (United States)

    Vargas, R.; Baldocchi, D. D.; Abramowitz, G.; Carrara, A.; Correia, A.; Kobayashi, H.; Papale, D.; Pearson, D.; Pereira, J.; Piao, S.; Rambal, S.; Sonnentag, O.

    2009-12-01

    Ecosystem processes are influenced by climatic trends at multiple temporal scales, including diel patterns and other mid-term climatic modes, such as interannual and seasonal variability. Because interactions between biophysical components of ecosystem processes are complex, it is important to test how models perform in the frequency (e.g. hours, days, weeks, months, years) and time (i.e. day of the year) domains in addition to traditional tests of annual or monthly sums. Here we present a spectral evaluation, using wavelet time series analysis, of model performance in seven Mediterranean Oak Woodlands that encompass three deciduous and four evergreen sites. We tested the performance of five models (CABLE, ORCHIDEE, BEPS, Biome-BGC, and JULES) on measured variables of gross primary production (GPP) and evapotranspiration (ET). In general, model performance fails at intermediate periods (e.g. weeks to months), likely because these models do not represent the water pulse dynamics that influence GPP and ET at these Mediterranean systems. To improve the performance of a model it is critical to first identify where and when the model fails. Only by identifying where a model fails can we improve model performance, use models as prognostic tools, and generate further hypotheses that can be tested by new experiments and measurements.

  1. ASYMMETRIC PRICE TRANSMISSION MODELING: THE IMPORTANCE OF MODEL COMPLEXITY AND THE PERFORMANCE OF THE SELECTION CRITERIA

    Directory of Open Access Journals (Sweden)

    Henry de-Graft Acquah

    2013-01-01

    Full Text Available Information criteria provide an attractive basis for selecting the best model from a set of competing asymmetric price transmission models or theories. However, little is understood about the sensitivity of the model selection methods to model complexity. This study therefore fits competing asymmetric price transmission models that differ in complexity to simulated data and evaluates the ability of the model selection methods to recover the true model. The results of Monte Carlo experimentation suggest that, in general, BIC, CAIC and DIC were superior to AIC when the true data-generating process was the standard error correction model, whereas AIC was more successful when the true model was the complex error correction model. It is also shown that the model selection methods performed better in large samples for a complex asymmetric data-generating process than for a standard asymmetric data-generating process. Except for complex models, AIC's performance did not make substantial gains in recovery rates as sample size increased. The research findings demonstrate the influence of model complexity on asymmetric price transmission model comparison and selection.
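
    As a hedged illustration of how such criteria are computed, the sketch below fits two nested least-squares models to synthetic data and compares their AIC and BIC. It uses a generic regression example rather than the paper's asymmetric error correction models; all data, names and noise levels are assumptions.

      import numpy as np

      def fit_ols(X, y):
          """Ordinary least squares; returns coefficients and residual sum of squares."""
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          rss = float(np.sum((y - X @ beta) ** 2))
          return beta, rss

      def aic_bic(rss, n, k):
          """Gaussian-likelihood AIC and BIC (up to a constant) for k estimated parameters."""
          ll_term = n * np.log(rss / n)
          return ll_term + 2 * k, ll_term + k * np.log(n)

      rng = np.random.default_rng(0)
      n = 200
      x1, x2 = rng.normal(size=n), rng.normal(size=n)
      y = 1.0 + 2.0 * x1 + rng.normal(scale=1.0, size=n)   # true process uses x1 only

      X_simple = np.column_stack([np.ones(n), x1])
      X_complex = np.column_stack([np.ones(n), x1, x2])    # extra, irrelevant regressor
      for name, X in [("simple", X_simple), ("complex", X_complex)]:
          _, rss = fit_ols(X, y)
          aic, bic = aic_bic(rss, n, X.shape[1])
          print(f"{name}: AIC = {aic:.1f}, BIC = {bic:.1f}")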

  2. A performance model of the OSI communication architecture

    Science.gov (United States)

    Kritzinger, P. S.

    1986-06-01

    An analytical model is proposed for predicting the performance of software implementations built according to the OSI basic reference model. The model uses the peer protocol standard of a layer as the reference description of an implementation of that layer. The model is basically a closed multiclass multichain queueing network with a processor-sharing center, modeling process contention at the processor, and a delay center, modeling times spent waiting for responses from the corresponding peer processes. Each individual transition of the protocol constitutes a different class, and each layer of the architecture forms a closed chain. Performance statistics include queue lengths and response times at the processor as a function of processor speed and the number of open connections. It is shown how to reduce the model should the protocol state space become very large. Numerical results based upon the derived formulas are given.
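
    The paper's model is a multiclass, multichain queueing network; the sketch below shows the simpler exact single-class Mean Value Analysis recursion for a closed network with a delay centre, which illustrates how throughput and response times are obtained as a function of the number of open connections. The service demand and delay values are illustrative assumptions, not parameters from the paper.

      def mva(demands, delay, n_customers):
          """Exact single-class Mean Value Analysis for a closed queueing network.
          demands : service demands at queueing (processor-like) centres [s]
          delay   : total think/peer-response delay [s]
          Returns system throughput and per-centre response times at the given population."""
          q = [0.0] * len(demands)                              # mean queue lengths
          for n in range(1, n_customers + 1):
              r = [d * (1.0 + qk) for d, qk in zip(demands, q)] # per-centre response times
              x = n / (delay + sum(r))                          # system throughput
              q = [x * rk for rk in r]                          # Little's law
          return x, r

      # Hypothetical values: CPU demand per protocol transition plus a peer-response delay.
      throughput, resp = mva(demands=[0.005], delay=0.050, n_customers=20)
      print(f"throughput = {throughput:.1f} transitions/s, "
            f"CPU response time = {resp[0] * 1000:.2f} ms")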

  3. Does model performance improve with complexity? A case study with three hydrological models

    Science.gov (United States)

    Orth, Rene; Staudinger, Maria; Seneviratne, Sonia I.; Seibert, Jan; Zappa, Massimiliano

    2015-04-01

    In recent decades considerable progress has been made in climate model development. Following the massive increase in computational power, models became more sophisticated. At the same time, simple conceptual models have also advanced. In this study we validate and compare three hydrological models of different complexity to investigate whether their performance varies accordingly. For this purpose we use runoff and also soil moisture measurements, which allow a truly independent validation, from several sites across Switzerland. The models are calibrated in similar ways with the same runoff data. Our results show that the more complex models HBV and PREVAH outperform the simple water balance model (SWBM) for runoff but not for soil moisture. Furthermore, the most sophisticated PREVAH model shows an added value compared to the HBV model only in the case of soil moisture. Focusing on extreme events, we find generally improved performance of the SWBM during drought conditions and degraded agreement with observations during wet extremes. For the more complex models we find the opposite behavior, probably because they were primarily developed for prediction of runoff extremes. As expected given their complexity, HBV and PREVAH have more problems with over-fitting. All models show a tendency towards better performance at lower altitudes as opposed to (pre-)alpine sites. The results vary considerably across the investigated sites. In contrast, the different metrics we consider to estimate the agreement between models and observations lead to similar conclusions, indicating that the performance of the considered models is similar at different time scales as well as for anomalies and long-term means. We conclude that added complexity does not necessarily lead to improved performance of hydrological models, and that performance can vary greatly depending on the considered hydrological variable (e.g. runoff vs. soil moisture) or hydrological conditions (floods vs. droughts).

  4. Dynamic vehicle model for handling performance using experimental data

    Directory of Open Access Journals (Sweden)

    SangDo Na

    2015-11-01

    Full Text Available An analytical vehicle model is essential for the development of vehicle design and performance. Various vehicle models have different complexities, assumptions and limitations depending on the type of vehicle analysis. An accurate full vehicle model is essential in representing the behaviour of the vehicle in order to estimate vehicle dynamic system performance such as ride comfort and handling. An experimental vehicle model is developed in this article, which employs experimental kinematic and compliance data measured between the wheel and chassis. From these data, a vehicle model, which includes dynamic effects due to vehicle geometry changes, has been developed. The experimental vehicle model was validated using an instrumented experimental vehicle and data such as a step-change steering input. This article presents a process to develop and validate an experimental vehicle model that enhances the accuracy of handling-performance prediction through a precise suspension model derived from measured vehicle data. The experimental force data obtained from a suspension parameter measuring device are employed for precise modelling of the steering and handling response. The steering system is modelled as a lumped model, with stiffness coefficients defined and identified by comparison with the steering stiffness obtained from the measured data. The outputs, specifically the yaw rate and lateral acceleration of the vehicle, are verified against experimental results.
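
    For readers unfamiliar with handling models, the sketch below simulates a step-steer manoeuvre with a linear single-track (bicycle) model and reports the resulting yaw rate and lateral acceleration. This is a generic textbook model with assumed parameter values, not the experimental kinematics-and-compliance model described in the article.

      import numpy as np

      # Minimal linear single-track ("bicycle") model: states are lateral velocity vy
      # and yaw rate r; the input is a step steering angle delta at the front axle.
      m, Iz = 1500.0, 2500.0          # mass [kg], yaw inertia [kg m^2]  (assumed values)
      a, b = 1.2, 1.6                 # CG-to-axle distances [m]
      Cf, Cr = 80000.0, 90000.0       # cornering stiffnesses [N/rad]
      vx = 25.0                       # forward speed [m/s]

      def derivatives(vy, r, delta):
          alpha_f = (vy + a * r) / vx - delta       # front slip angle
          alpha_r = (vy - b * r) / vx               # rear slip angle
          Fyf, Fyr = -Cf * alpha_f, -Cr * alpha_r   # linear tyre forces
          vy_dot = (Fyf + Fyr) / m - vx * r
          r_dot = (a * Fyf - b * Fyr) / Iz
          return vy_dot, r_dot

      dt, vy, r = 0.001, 0.0, 0.0
      for step in range(3000):                      # 3 s of simulation, forward Euler
          delta = 0.03 if step * dt > 0.5 else 0.0  # step steer of ~1.7 deg at t = 0.5 s
          vy_dot, r_dot = derivatives(vy, r, delta)
          vy += vy_dot * dt
          r += r_dot * dt
      ay = vy_dot + vx * r                          # lateral acceleration at the last step
      print(f"steady-state yaw rate ~ {r:.3f} rad/s, lateral acceleration ~ {ay:.2f} m/s^2")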

  5. Developing an Energy Performance Modeling Startup Kit

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2012-10-01

    In 2011, the NAHB Research Center began assessing the needs and motivations of residential remodelers regarding energy performance remodeling. This report outlines: the current remodeling industry and the role of energy efficiency; gaps and barriers to adding energy efficiency into remodeling; and support needs of professional remodelers to increase sales and projects involving improving home energy efficiency.

  6. Modeling vibrato and portamento in music performance

    NARCIS (Netherlands)

    Desain, P.W.M.; Honing, H.J.

    1999-01-01

    Research in the psychology of music dealing with expression is often concerned with the discrete aspects of music performance, and mainly concentrates on the study of piano music (partly because of the ease with which piano music can be reduced to discrete note events). However, on other

  7. WirelessHART modeling and performance evaluation

    NARCIS (Netherlands)

    Remke, Anne Katharina Ingrid; Wu, Xian

    2013-01-01

    In process industries wired supervisory and control networks are more and more replaced by wireless systems. Wireless communication inevitably introduces time delays and message losses, which may degrade the system reliability and performance. WirelessHART, as the first international standard for

  8. Studies in Software Cost Model Behavior: Do We Really Understand Cost Model Performance?

    Science.gov (United States)

    Lum, Karen; Hihn, Jairus; Menzies, Tim

    2006-01-01

    While there exists extensive literature on software cost estimation techniques, industry practice continues to rely upon standard regression-based algorithms. These software effort models are typically calibrated or tuned to local conditions using local data. This paper cautions that current approaches to model calibration often produce sub-optimal models because of the large variance problem inherent in cost data and because they include far more effort multipliers than the data support. Building optimal models requires that a wider range of models be considered, while correctly calibrating these models requires rejection rules that prune variables and records and use multiple criteria for evaluating model performance. The main contribution of this paper is to document a standard method that integrates formal model identification, estimation, and validation. It also documents what we call the large variance problem, which is a leading cause of cost model brittleness or instability.
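
    A hedged sketch of the kind of calibration the paper discusses: a log-linear (COCOMO-style) effort model fitted to a small, made-up local data set, followed by a leave-one-out check whose spread hints at the large-variance problem. The data, variable names and the single retained effort multiplier are assumptions for illustration, not the paper's method or results.

      import numpy as np

      # Calibrate: effort = A * size^B * EM   <=>   ln(effort) = ln(A) + B*ln(size) + c*ln(EM)
      size_kloc = np.array([10, 25, 40, 60, 90, 120, 150, 200], dtype=float)   # hypothetical
      em_cplx   = np.array([1.0, 1.1, 0.9, 1.2, 1.0, 1.3, 1.1, 1.2])           # one multiplier kept
      effort_pm = np.array([30, 70, 95, 170, 230, 380, 400, 560], dtype=float)

      X = np.column_stack([np.ones_like(size_kloc), np.log(size_kloc), np.log(em_cplx)])
      coef, *_ = np.linalg.lstsq(X, np.log(effort_pm), rcond=None)
      ln_a, b, c = coef
      print(f"A = {np.exp(ln_a):.2f}, size exponent B = {b:.2f}, multiplier exponent = {c:.2f}")

      # Leave-one-out check: a large spread across folds signals model brittleness.
      errs = []
      for i in range(len(effort_pm)):
          mask = np.arange(len(effort_pm)) != i
          c_i, *_ = np.linalg.lstsq(X[mask], np.log(effort_pm[mask]), rcond=None)
          pred = np.exp(X[i] @ c_i)
          errs.append(abs(pred - effort_pm[i]) / effort_pm[i])
      print(f"mean leave-one-out relative error = {np.mean(errs):.1%}")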

  9. Sustaining Team Performance: A Systems Model

    Science.gov (United States)

    1979-07-31

    a " forgetting curve ." Three cases were tested and retested under four different research schedules. A description c;’ the test cases follows. III-11...available to show fluctuation in Ph due to unit yearly training cycle. Another real-world military example of the classical forgetting curve is found in the...no practice between the acquisition and subsequent test of performance. The classical forgetting curve is believed to apply. The shape of curve depends

  10. Modelling swimming hydrodynamics to enhance performance

    OpenAIRE

    Marinho, D.A.; Rouboa, A.; Barbosa, Tiago M.; Silva, A.J.

    2010-01-01

    Swimming assessment is one of the most complex but outstanding and fascinating topics in biomechanics. Computational fluid dynamics (CFD) methodology is one of the different methods that have been applied in swimming research to observe and understand water movements around the human body and its application to improve swimming performance. CFD has been applied attempting to understand deeply the biomechanical basis of swimming. Several studies have been conducted willing to analy...

  11. Measuring broadband in Europe: development of a market model and performance index using structural equations modelling

    NARCIS (Netherlands)

    Lemstra, W.; Voogt, B.; Gorp, van N.

    2015-01-01

    This contribution reports on the development of a performance index and underlying market model with application to broadband developments in the European Union. The Structure–Conduct–Performance paradigm provides the theoretical grounding. Structural equations modelling was applied to determine the

  12. An Empirical Study of a Solo Performance Assessment Model

    Science.gov (United States)

    Russell, Brian E.

    2015-01-01

    The purpose of this study was to test a hypothesized model of solo music performance assessment. Specifically, this study investigates the influence of technique and musical expression on perceptions of overall performance quality. The Aural Musical Performance Quality (AMPQ) measure was created to measure overall performance quality, technique,…

  13. Comparison of the performance of net radiation calculation models

    DEFF Research Database (Denmark)

    Kjærsgaard, Jeppe Hvelplund; Cuenca, R.H.; Martinez-Cob, A.

    2009-01-01

    Daily values of net radiation are used in many applications of crop-growth modeling and agricultural water management. Measurements of net radiation are not part of the routine measurement program at many weather stations and are commonly estimated based on other meteorological parameters. Daily....... The performance of the empirical models was nearly identical at all sites. Since the empirical models were easier to use and simpler to calibrate than the physically based models, the results indicate that the empirical models can be used as a good substitute for the physically based ones when available...

  14. Developing an Energy Performance Modeling Startup Kit

    Energy Technology Data Exchange (ETDEWEB)

    Wood, A.

    2012-10-01

    In 2011, the NAHB Research Center began the first part of a multi-year effort by assessing the needs and motivations of residential remodelers regarding energy performance remodeling. The scope is multifaceted - all perspectives will be sought from remodeling firms ranging in size from small-scale, sole-proprietor to national. This will allow the Research Center to gain a deeper understanding of the remodeling and energy retrofit business and the needs of contractors when offering energy upgrade services. To determine the gaps and the motivation for energy performance remodeling, the NAHB Research Center conducted (1) an initial series of focus groups with remodelers at the 2011 International Builders' Show, (2) a second series of focus groups with remodelers at the NAHB Research Center in conjunction with the NAHB Spring Board meeting in DC, and (3) quantitative market research with remodelers based on the findings from the focus groups. The goal was threefold: to understand the current remodeling industry and the role of energy efficiency; to identify the gaps and barriers to adding energy efficiency into remodeling; and to quantify and prioritize the support needs of professional remodelers to increase sales and projects involving improving home energy efficiency. This report outlines all three of these tasks with remodelers.

  15. Evaluation Model of Organizational Performance for Small and Medium Enterprises

    Directory of Open Access Journals (Sweden)

    Carlos Augusto Passos

    2014-12-01

    Full Text Available In the 1980s, many tools for evaluating organizational performance were created. However, most of them are useful only to large companies and do not foster results in small and medium-sized enterprises (SMEs). In light of this fact, this article aims to propose an Organizational Performance Assessment (OPA) model which is flexible and adaptable to the reality of SMEs, based on the theoretical framework of various models and on comparisons using three major authors' criteria for evaluating OPA models. The research is descriptive and exploratory in character and qualitative in nature. The MADE-O model, according to the criteria described in the bibliography, is the one that best fits the needs of SMEs, and it was used as a baseline for the model proposed in this study, with adaptations pertaining to the BSC model. The model, called the Overall Performance Indicator – Environment (IDG-E), has as its main differential, in addition to the basis in the models mentioned above, the assessment of the external and internal environment weighted in modules of OPA. As SMEs are characterized by having few processes and people, the small number of performance indicators is another positive aspect. Evaluated against the criteria put forward by these authors, the model proved to be quite feasible for use in SMEs.

  16. Impact of reactive settler models on simulated WWTP performance.

    Science.gov (United States)

    Gernaey, K V; Jeppsson, U; Batstone, D J; Ingildsen, P

    2006-01-01

    Including a reactive settler model in a wastewater treatment plant model allows representation of the biological reactions taking place in the sludge blanket in the settler, something that is neglected in many simulation studies. The idea of including a reactive settler model is investigated for an ASM1 case study. Simulations with a whole plant model including the non-reactive Takács settler model are used as a reference, and are compared to simulation results considering two reactive settler models. The first is a return sludge model block removing oxygen and a user-defined fraction of nitrate, combined with a non-reactive Takács settler. The second is a fully reactive ASM1 Takács settler model. Simulations with the ASM1 reactive settler model predicted a 15.3% and 7.4% improvement of the simulated N removal performance, for constant (steady-state) and dynamic influent conditions respectively. The oxygen/nitrate return sludge model block predicts a 10% improvement of N removal performance under dynamic conditions, and might be the better modelling option for ASM1 plants: it is computationally more efficient and it will not overrate the importance of decay processes in the settler.

  17. Qualitative and quantitative examination of the performance of regional air quality models representing different modeling approaches

    International Nuclear Information System (INIS)

    Bhumralkar, C.M.; Ludwig, F.L.; Shannon, J.D.; McNaughton, D.

    1985-04-01

    The calculations of three different air quality models were compared with the best available observations. The comparisons were made without calibrating the models to improve agreement with the observations. Model performance was poor for short averaging times (less than 24 hours). Some of the poor performance can be traced to errors in the input meteorological fields, but errors exist at all levels. It should be noted that these models were not originally designed for treating short-term episodes. For short-term episodes, much of the variance in the data can arise from small spatial-scale features that tend to be averaged out over longer periods. These small spatial-scale features cannot be resolved with the coarse grids that are used for the meteorological and emissions inputs. Thus, it is not surprising that the models performed better for the longer averaging times. The models compared were RTM-II, ENAMAP-2 and ACID. (17 refs., 5 figs., 4 tabs)

  18. Confirming the Value of Swimming-Performance Models for Adolescents.

    Science.gov (United States)

    Dormehl, Shilo J; Robertson, Samuel J; Barker, Alan R; Williams, Craig A

    2017-10-01

    To evaluate the efficacy of existing performance models to assess the progression of male and female adolescent swimmers through a quantitative and qualitative mixed-methods approach. Fourteen published models were tested using retrospective data from an independent sample of Dutch junior national-level swimmers from when they were 12-18 y of age (n = 13). The degree of association by Pearson correlations was compared between the calculated differences from the models and quadratic functions derived from the Dutch junior national qualifying times. Swimmers were grouped based on their differences from the models and compared with their swimming histories that were extracted from questionnaires and follow-up interviews. Correlations of the deviations from both the models and quadratic functions derived from the Dutch qualifying times were all significant except for the 100-m breaststroke and butterfly and the 200-m freestyle for females (P backstroke for males and 200-m freestyle for males and females were almost directly proportional. In general, deviations from the models were accounted for by the swimmers' training histories. Higher levels of retrospective motivation appeared to be synonymous with higher-level career performance. This mixed-methods approach helped confirm the validity of the models that were found to be applicable to adolescent swimmers at all levels, allowing coaches to track performance and set goals. The value of the models in being able to account for the expected performance gains during adolescence enables quantification of peripheral factors that could affect performance.

  19. Disaggregation of Rainy Hours: Compared Performance of Various Models.

    Science.gov (United States)

    Ben Haha, M.; Hingray, B.; Musy, A.

    In the urban environment, the response times of catchments are usually short. To design or to diagnose waterworks in that context, it is necessary to describe rainfall events with a good time resolution: a 10 mn time step is often necessary. Such information is not always available. Rainfall disaggregation models have thus to be applied to produce that short time resolution information from rough rainfall data. The communication will present the performance obtained with several rainfall disaggregation models that allow for the disaggregation of rainy hours into six 10 mn rainfall amounts. The ability of the models to reproduce some statistical characteristics of rainfall (mean, variance, overall distribution of 10 mn rainfall amounts; extreme values of maximal rainfall amounts over different durations) is evaluated by means of different graphical and numerical criteria. The performance of simple models presented in some scientific papers or developed in the Hydram laboratory, as well as the performance of more sophisticated ones, is compared with the performance of the basic constant disaggregation model. The compared models are either deterministic or stochastic; for some of them the disaggregation is based on scaling properties of rainfall. The compared models are, in increasing complexity order: constant model, linear model (Ben Haha, 2001), Ormsbee Deterministic model (Ormsbee, 1989), Artificial Neural Network based model (Burian et al. 2000), Hydram Stochastic 1 and Hydram Stochastic 2 (Ben Haha, 2001), Multiplicative Cascade based model (Olsson and Berndtsson, 1998), Ormsbee Stochastic model (Ormsbee, 1989). The 625 rainy hours used for that evaluation (with an hourly rainfall amount greater than 5 mm) were extracted from the 21-year chronological rainfall series (10 mn time step) observed at the Pully meteorological station, Switzerland. The models were also evaluated when applied to different rainfall classes depending on the season first and on the
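
    As a minimal illustration of the two extremes of that model hierarchy, the sketch below disaggregates one rainy hour with the constant model and with a simple stochastic (Dirichlet-weight) model. The concentration parameter, the example depth and the evaluation loop are illustrative assumptions, not the Hydram or Ormsbee formulations.

      import numpy as np

      rng = np.random.default_rng(42)

      def disaggregate_constant(hourly_mm):
          """Constant model: split the hourly depth into six equal 10-minute amounts."""
          return np.full(6, hourly_mm / 6.0)

      def disaggregate_stochastic(hourly_mm, concentration=0.7):
          """Simple stochastic model: random Dirichlet weights that sum to the hourly depth."""
          weights = rng.dirichlet(np.full(6, concentration))
          return hourly_mm * weights

      hour = 8.4  # a hypothetical rainy hour of 8.4 mm
      print("constant  :", np.round(disaggregate_constant(hour), 2))
      print("stochastic:", np.round(disaggregate_stochastic(hour), 2))

      # Evaluation idea from the abstract: generate many disaggregated hours and compare
      # the variance and maxima of the 10-minute amounts with observed statistics.
      samples = np.array([disaggregate_stochastic(hour) for _ in range(1000)])
      print("mean 10-min maximum =", samples.max(axis=1).mean().round(2), "mm")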

  20. A Mathematical Model to Improve the Performance of Logistics Network

    Directory of Open Access Journals (Sweden)

    Muhammad Izman Herdiansyah

    2012-01-01

    Full Text Available The role of logistics nowadays is expanding from just providing transportation and warehousing to offering total integrated logistics. To remain competitive in the global market environment, business enterprises need to improve their logistics operations performance. The improvement will be achieved when we can provide a comprehensive analysis and optimize the network performance. In this paper, a mixed integer linear model for optimizing logistics network performance is developed. It provides a single-product multi-period multi-facility model, as well as the multi-product concept. The problem is modeled in the form of a network flow problem with the main objective to minimize total logistics cost. The problem can be solved using commercial linear programming packages such as CPLEX or LINDO. Even for small cases, the solver in Excel may also be used to solve such a model. Keywords: logistics network, integrated model, mathematical programming, network optimization
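
    The paper's formulation is a multi-period mixed-integer model; the sketch below solves a much smaller continuous relative, a single-period transportation problem, with SciPy's linprog to show how a minimum-cost network flow is set up. The costs, supplies and demands are made-up values, not the paper's instance.

      import numpy as np
      from scipy.optimize import linprog

      # Two facilities ship one product to three customers; minimise total shipping cost.
      cost = np.array([[4.0, 6.0, 9.0],      # facility A -> customers 1..3
                       [5.0, 4.0, 7.0]])     # facility B -> customers 1..3
      supply = np.array([80.0, 70.0])
      demand = np.array([40.0, 60.0, 50.0])

      c = cost.ravel()                        # decision variables x[f, k], flattened row-wise
      A_ub = np.zeros((2, 6)); b_ub = supply  # facility capacity constraints (<=)
      A_ub[0, 0:3] = 1.0; A_ub[1, 3:6] = 1.0
      A_eq = np.zeros((3, 6)); b_eq = demand  # each customer's demand must be met exactly
      for k in range(3):
          A_eq[k, k] = 1.0; A_eq[k, 3 + k] = 1.0

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
      print("total cost:", res.fun)
      print("shipments:\n", res.x.reshape(2, 3))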

  1. Faculty Performance Evaluation: The CIPP-SAPS Model.

    Science.gov (United States)

    Mitcham, Maralynne

    1981-01-01

    The issues of faculty performance evaluation for allied health professionals are addressed. Daniel Stufflebeam's CIPP (context-input-process-product) model is introduced and its development into a CIPP-SAPS (self-administrative-peer-student) model is pursued. (Author/CT)

  2. Technical performance of percutaneous and laminectomy leads analyzed by modeling

    NARCIS (Netherlands)

    Manola, L.; Holsheimer, J.

    2004-01-01

    The objective of this study was to compare the technical performance of laminectomy and percutaneous spinal cord stimulation leads with similar contact spacing by computer modeling. Monopolar and tripolar (guarded cathode) stimulation with both lead types in a low-thoracic spine model was simulated

  3. Neuro-fuzzy model for evaluating the performance of processes ...

    Indian Academy of Sciences (India)

    CHIDOZIE CHUKWUEMEKA NWOBI-OKOYE

    2017-11-16

    Nov 16, 2017 ... In this work an Adaptive Neuro-Fuzzy Inference System (ANFIS) was used to model the periodic performance of some ... Every node i in this layer is an adaptive node with a node function. ... spectral analysis and parameter optimization using genetic algorithm, the values of v10 and ...

  4. UNCONSTRAINED HANDWRITING RECOGNITION: LANGUAGE MODELS, PERPLEXITY, AND SYSTEM PERFORMANCE

    NARCIS (Netherlands)

    Marti, U-V.; Bunke, H.

    2004-01-01

    In this paper we present a number of language models and their behavior in the recognition of unconstrained handwritten English sentences. We use the perplexity to compare the different models and their prediction power, and relate it to the performance of a recognition system under different

  5. Mathematical Models of Elementary Mathematics Learning and Performance. Final Report.

    Science.gov (United States)

    Suppes, Patrick

    This project was concerned with the development of mathematical models of elementary mathematics learning and performance. Probabilistic finite automata and register machines with a finite number of registers were developed as models and extensively tested with data arising from the elementary-mathematics strand curriculum developed by the…

  6. Activity-Based Costing Model for Assessing Economic Performance.

    Science.gov (United States)

    DeHayes, Daniel W.; Lovrinic, Joseph G.

    1994-01-01

    An economic model for evaluating the cost performance of academic and administrative programs in higher education is described. Examples from its application at Indiana University-Purdue University Indianapolis are used to illustrate how the model has been used to control costs and reengineer processes. (Author/MSE)

  7. A Probabilistic Approach to Symbolic Performance Modeling of Parallel Systems

    NARCIS (Netherlands)

    Gautama, H.

    2004-01-01

    Performance modeling plays a significant role in predicting the effects of a particular design choice or in diagnosing the cause of some observed performance behavior. Especially for complex systems such as parallel computers, typically, an intended performance cannot be achieved without recourse to

  8. Asymptotic performance modelling of DCF protocol with prioritized channel access

    Science.gov (United States)

    Choi, Woo-Yong

    2017-11-01

    Recently, the modification of the DCF (Distributed Coordination Function) protocol by the prioritized channel access was proposed to resolve the problem that the DCF performance worsens exponentially as more nodes exist in IEEE 802.11 wireless LANs. In this paper, an asymptotic analytical performance model is presented to analyze the MAC performance of the DCF protocol with the prioritized channel access.

  9. Performance Implications of Business Model Change: A Case Study

    Directory of Open Access Journals (Sweden)

    Jana Poláková

    2015-01-01

    Full Text Available The paper deals with changes in performance level introduced by a change of business model. The selected case is a small family business undergoing substantial changes in response to structural changes in its markets. The authors use the concept of the business model to describe the value creation processes within the selected family business and, by contrasting the value creation processes before and after the change, demonstrate the role of the business model as a performance differentiator. This is illustrated with the use of business model canvases constructed on the basis of interviews, observations and document analysis. The two business model canvases allow for an explanation of cause-and-effect relationships within the business leading to the change in performance. The change in performance is assessed by a financial analysis of the business conducted over the period 2006–2012, which demonstrates changes in performance (ROA, ROE and ROS reached their lowest levels before the change of business model was introduced and grew after its introduction, with similar developments in the activity indicators of the family business). The described case study contributes to the concept of business modeling with arguments supporting its value as a strategic tool facilitating decisions related to value creation within the business.

  10. Performance evaluation:= (process algebra + model checking) x Markov chains

    NARCIS (Netherlands)

    Hermanns, H.; Larsen, K.G.; Nielsen, Mogens; Katoen, Joost P.

    2001-01-01

    Markov chains are widely used in practice to determine system performance and reliability characteristics. The vast majority of applications considers continuous-time Markov chains (CTMCs). This tutorial paper shows how successful model specification and analysis techniques from concurrency theory

  11. Thermal Model Predictions of Advanced Stirling Radioisotope Generator Performance

    Science.gov (United States)

    Wang, Xiao-Yen J.; Fabanich, William Anthony; Schmitz, Paul C.

    2014-01-01

    This presentation describes the capabilities of a three-dimensional thermal power model of the Advanced Stirling Radioisotope Generator (ASRG). The performance of the ASRG is presented for different scenarios, such as a Venus flyby with or without the auxiliary cooling system.

  12. Practical Techniques for Modeling Gas Turbine Engine Performance

    Science.gov (United States)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.

    2016-01-01

    The cost and risk associated with the design and operation of gas turbine engine systems has led to an increasing dependence on mathematical models. In this paper, the fundamentals of engine simulation will be reviewed, an example performance analysis will be performed, and relationships useful for engine control system development will be highlighted. The focus will be on thermodynamic modeling utilizing techniques common in industry, such as: the Brayton cycle, component performance maps, map scaling, and design point criteria generation. In general, these topics will be viewed from the standpoint of an example turbojet engine model; however, demonstrated concepts may be adapted to other gas turbine systems, such as gas generators, marine engines, or high bypass aircraft engines. The purpose of this paper is to provide an example of gas turbine model generation and system performance analysis for educational uses, such as curriculum creation or student reference.
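
    As a small companion to the thermodynamic topics listed above, the sketch below evaluates an ideal-gas Brayton cycle design point with assumed component efficiencies. The station numbering, temperatures and pressure ratio are illustrative assumptions, not values from the paper's turbojet example.

      # Minimal ideal-gas Brayton cycle point calculation with component efficiencies.
      # Stations: 1 = compressor inlet, 2 = compressor exit, 3 = turbine inlet, 4 = turbine exit.
      gamma, cp = 1.4, 1005.0        # ratio of specific heats, specific heat [J/(kg K)]
      T1, T3 = 288.15, 1400.0        # ambient and turbine-inlet temperatures [K] (assumed)
      pr = 12.0                      # compressor pressure ratio (assumed)
      eta_c, eta_t = 0.85, 0.90      # isentropic efficiencies (assumed)

      T2s = T1 * pr ** ((gamma - 1) / gamma)     # isentropic compressor exit temperature
      T2 = T1 + (T2s - T1) / eta_c               # actual compressor exit temperature
      T4s = T3 / pr ** ((gamma - 1) / gamma)     # isentropic turbine exit temperature
      T4 = T3 - eta_t * (T3 - T4s)               # actual turbine exit temperature

      w_comp = cp * (T2 - T1)                    # specific compressor work [J/kg]
      w_turb = cp * (T3 - T4)                    # specific turbine work [J/kg]
      q_in = cp * (T3 - T2)                      # specific heat addition [J/kg]
      eta_th = (w_turb - w_comp) / q_in          # cycle thermal efficiency
      print(f"net specific work = {(w_turb - w_comp) / 1000:.1f} kJ/kg, "
            f"thermal efficiency = {eta_th:.1%}")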

  13. Testing algorithms for a passenger train braking performance model.

    Science.gov (United States)

    2011-09-01

    "The Federal Railroad Administrations Office of Research and Development funded a project to establish performance model to develop, analyze, and test positive train control (PTC) braking algorithms for passenger train operations. With a good brak...

  14. LINDOZ model for Finland environment: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Galeriu, D.; Apostoaie, A.I.; Mocanu, N.; Paunescu, N.

    1996-01-01

    LINDOZ model was developed as a realistic assessment tool for radioactive contamination of the environment. It was designed to produce estimates for the concentration of the pollutant in different compartments of the terrestrial ecosystem (soil, vegetation, animal tissue, and animal products), and to evaluate human exposure to the contaminant (concentration in whole human body, and dose to humans) from inhalation, ingestion and external irradiation. The user can apply LINDOZ for both routine and accidental type of releases. 2 figs, 2 tabs

  15. Real-time individualization of the unified model of performance.

    Science.gov (United States)

    Liu, Jianbo; Ramakrishnan, Sridhar; Laxminarayan, Srinivas; Balkin, Thomas J; Reifman, Jaques

    2017-12-01

    Existing mathematical models for predicting neurobehavioural performance are not suited for mobile computing platforms because they cannot adapt model parameters automatically in real time to reflect individual differences in the effects of sleep loss. We used an extended Kalman filter to develop a computationally efficient algorithm that continually adapts the parameters of the recently developed Unified Model of Performance (UMP) to an individual. The algorithm accomplishes this in real time as new performance data for the individual become available. We assessed the algorithm's performance by simulating real-time model individualization for 18 subjects subjected to 64 h of total sleep deprivation (TSD) and 7 days of chronic sleep restriction (CSR) with 3 h of time in bed per night, using psychomotor vigilance task (PVT) data collected every 2 h during wakefulness. This UMP individualization process produced parameter estimates that progressively approached the solution produced by a post-hoc fitting of model parameters using all data. The minimum number of PVT measurements needed to individualize the model parameters depended upon the type of sleep-loss challenge, with ~30 required for TSD and ~70 for CSR. However, model individualization depended upon the overall duration of data collection, yielding increasingly accurate model parameters with greater number of days. Interestingly, reducing the PVT sampling frequency by a factor of two did not notably hamper model individualization. The proposed algorithm facilitates real-time learning of an individual's trait-like responses to sleep loss and enables the development of individualized performance prediction models for use in a mobile computing platform. © 2017 European Sleep Research Society.
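
    The sketch below illustrates the general idea of real-time parameter individualization with a scalar linear Kalman filter adapting one parameter of a toy lapse model as simulated PVT measurements arrive. It is not the Unified Model of Performance or the paper's extended Kalman filter; all model equations, noise levels and data are assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(1)
      true_k = 0.35                      # this subject's true sensitivity (lapses per hour awake)
      meas_noise = 2.0                   # PVT measurement noise (standard deviation, lapses)

      k_est, P = 0.2, 1.0                # prior parameter estimate and variance (population values)
      Q, R = 1e-4, meas_noise ** 2       # process (slow drift) and measurement noise variances

      for hours_awake in range(2, 40, 2):                          # a PVT every 2 h during TSD
          z = true_k * hours_awake + rng.normal(0, meas_noise)     # observed lapses
          P = P + Q                                  # predict: parameter as a slow random walk
          H = hours_awake                            # measurement model: z = H * k + noise
          K = P * H / (H * P * H + R)                # Kalman gain
          k_est = k_est + K * (z - H * k_est)        # update parameter estimate
          P = (1 - K * H) * P                        # update its variance
      print(f"individualized estimate k = {k_est:.3f} (true value {true_k})")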

  16. Model complexity and performance: How far can we simplify?

    Science.gov (United States)

    Raick, C.; Soetaert, K.; Grégoire, M.

    2006-07-01

    Handling model complexity and reliability is a key area of research today. While complex models containing sufficient detail have become possible due to increased computing power, they often lead to too much uncertainty. On the other hand, very simple models often crudely oversimplify the real ecosystem and can not be used for management purposes. Starting from a complex and validated 1D pelagic ecosystem model of the Ligurian Sea (NW Mediterranean Sea), we derived simplified aggregated models in which either the unbalanced algal growth, the functional group diversity or the explicit description of the microbial loop was sacrificed. To overcome the problem of data availability with adequate spatial and temporal resolution, the outputs of the complex model are used as the baseline of perfect knowledge to calibrate the simplified models. Objective criteria of model performance were used to compare the simplified models’ results to the complex model output and to the available data at the DYFAMED station in the central Ligurian Sea. We show that even the simplest (NPZD) model is able to represent the global ecosystem features described by the complex model (e.g. primary and secondary productions, particulate organic matter export flux, etc.). However, a certain degree of sophistication in the formulation of some biogeochemical processes is required to produce realistic behaviors (e.g. the phytoplankton competition, the potential carbon or nitrogen limitation of the zooplankton ingestion, the model trophic closure, etc.). In general, a 9 state-variable model that has the functional group diversity removed, but which retains the bacterial loop and the unbalanced algal growth, performs best.
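
    For readers unfamiliar with the simplest model class mentioned above, the sketch below integrates a minimal NPZD box model with forward Euler and checks that total nitrogen is conserved. The parameter values are generic illustrative choices, not those of the Ligurian Sea model.

      # Minimal NPZD (nutrient-phytoplankton-zooplankton-detritus) box model.
      mu_max, k_n = 1.0, 0.5      # max phytoplankton growth rate [1/d], uptake half-saturation
      g_max, k_p = 0.6, 0.8       # max grazing rate [1/d], grazing half-saturation
      assim = 0.7                 # zooplankton assimilation efficiency
      m_p, m_z = 0.05, 0.08       # phytoplankton and zooplankton mortality [1/d]
      remin = 0.1                 # detritus remineralization rate [1/d]

      N, P, Z, D = 4.0, 0.5, 0.2, 0.1   # initial stocks [mmol N/m3] (assumed)
      dt, days = 0.05, 120
      for _ in range(int(days / dt)):
          uptake = mu_max * N / (k_n + N) * P
          grazing = g_max * P / (k_p + P) * Z
          dN = -uptake + remin * D
          dP = uptake - grazing - m_p * P
          dZ = assim * grazing - m_z * Z
          dD = (1 - assim) * grazing + m_p * P + m_z * Z - remin * D
          N, P, Z, D = N + dN * dt, P + dP * dt, Z + dZ * dt, D + dD * dt
      # The four tendencies sum to zero, so total nitrogen should stay constant.
      print(f"after {days} d: N={N:.2f}, P={P:.2f}, Z={Z:.2f}, D={D:.2f} (total={N+P+Z+D:.2f})")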

  17. Gold-standard performance for 2D hydrodynamic modeling

    Science.gov (United States)

    Pasternack, G. B.; MacVicar, B. J.

    2013-12-01

    Two-dimensional, depth-averaged hydrodynamic (2D) models are emerging as an increasingly useful tool for environmental water resources engineering. One of the remaining technical hurdles to the wider adoption and acceptance of 2D modeling is the lack of standards for 2D model performance evaluation when the riverbed undulates, causing lateral flow divergence and convergence. The goal of this study was to establish a gold-standard that quantifies the upper limit of model performance for 2D models of undulating riverbeds when topography is perfectly known and surface roughness is well constrained. A review was conducted of published model performance metrics and the value ranges exhibited by models thus far for each one. Typically predicted velocity differs from observed by 20 to 30 % and the coefficient of determination between the two ranges from 0.5 to 0.8, though there tends to be a bias toward overpredicting low velocity and underpredicting high velocity. To establish a gold standard as to the best performance possible for a 2D model of an undulating bed, two straight, rectangular-walled flume experiments were done with no bed slope and only different bed undulations and water surface slopes. One flume tested model performance in the presence of a porous, homogenous gravel bed with a long flat section, then a linear slope down to a flat pool bottom, and then the same linear slope back up to the flat bed. The other flume had a PVC plastic solid bed with a long flat section followed by a sequence of five identical riffle-pool pairs in close proximity, so it tested model performance given frequent undulations. Detailed water surface elevation and velocity measurements were made for both flumes. Comparing predicted versus observed velocity magnitude for 3 discharges with the gravel-bed flume and 1 discharge for the PVC-bed flume, the coefficient of determination ranged from 0.952 to 0.987 and the slope for the regression line was 0.957 to 1.02. Unsigned velocity
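
    The sketch below computes the performance metrics discussed in the abstract - coefficient of determination, regression slope and mean relative error between observed and predicted velocity magnitudes - for a small, made-up set of measurement points; the values are illustrative assumptions, not the flume data.

      import numpy as np

      def velocity_metrics(observed, predicted):
          """Coefficient of determination, regression slope of predicted on observed,
          and mean relative error."""
          observed = np.asarray(observed, float)
          predicted = np.asarray(predicted, float)
          r2 = np.corrcoef(observed, predicted)[0, 1] ** 2
          slope, _ = np.polyfit(observed, predicted, 1)
          rel_err = np.mean(np.abs(predicted - observed) / np.abs(observed))
          return r2, slope, rel_err

      # Hypothetical depth-averaged velocity magnitudes [m/s] at measurement points.
      obs = np.array([0.21, 0.35, 0.48, 0.62, 0.80, 0.95, 1.10, 1.32])
      sim = np.array([0.25, 0.33, 0.50, 0.60, 0.78, 0.99, 1.05, 1.28])
      r2, slope, rel_err = velocity_metrics(obs, sim)
      print(f"R^2 = {r2:.3f}, regression slope = {slope:.3f}, mean relative error = {rel_err:.1%}")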

  18. Model of service-oriented catering supply chain performance evaluation

    OpenAIRE

    Gou, Juanqiong; Shen, Guguan; Chai, Rui

    2013-01-01

    Purpose: The aim of this paper is to construct a performance evaluation model for the service-oriented catering supply chain. Design/methodology/approach: Based on research into the current situation of the catering industry, this paper summarizes the characteristics of the catering supply chain and then presents a service-oriented catering supply chain model based on a platform of logistics and information. Finally, the fuzzy AHP method is used to evaluate the performance of service-oriented catering ...

  19. ASSESSING INDIVIDUAL PERFORMANCE ON INFORMATION TECHNOLOGY ADOPTION: A NEW MODEL

    OpenAIRE

    Diah Hari Suryaningrum

    2012-01-01

    This paper aims to propose a new model for assessing individual performance in information technology adoption. The new model to assess individual performance was derived from two different theories: the decomposed theory of planned behavior and task-technology fit theory. Although many researchers have tried to expand these theories, some of their efforts may lack sound theoretical assumptions. To overcome this problem and enhance the coherence of the integration, I used a theory from social scien...

  20. Performance of Air Pollution Models on Massively Parallel Computers

    DEFF Research Database (Denmark)

    Brown, John; Hansen, Per Christian; Wasniewski, Jerzy

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on the computers. Using a realistic large-scale model, we gain detailed insight about the performance of the three computers when used to solve large-scale scientific problems...... that involve several types of numerical computations. The computers considered in our study are the Connection Machines CM-200 and CM-5, and the MasPar MP-2216...

  1. Human performance modeling for system of systems analytics.

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, Kevin R.; Lawton, Craig R.; Basilico, Justin Derrick; Longsine, Dennis E. (INTERA, Inc., Austin, TX); Forsythe, James Chris; Gauthier, John Henry; Le, Hai D.

    2008-10-01

    A Laboratory-Directed Research and Development project was initiated in 2005 to investigate Human Performance Modeling in a System of Systems analytic environment. SAND2006-6569 and SAND2006-7911 document interim results from this effort; this report documents the final results. The problem is difficult because of the number of humans involved in a System of Systems environment and the generally poorly defined nature of the tasks that each human must perform. A two-pronged strategy was followed: one prong was to develop human models using a probability-based method similar to that first developed for relatively well-understood probability-based performance modeling; the other prong was to investigate more state-of-the-art human cognition models. The probability-based modeling resulted in a comprehensive addition of human-modeling capability to the existing SoSAT computer program. The cognitive modeling resulted in an increased understanding of what is necessary to incorporate cognition-based models into a System of Systems analytic environment.

  2. Four-Stroke, Internal Combustion Engine Performance Modeling

    Science.gov (United States)

    Wagner, Richard C.

    In this thesis, two models of four-stroke, internal combustion engines are created and compared. The first model predicts the intake and exhaust processes using isentropic flow equations augmented by discharge coefficients. The second model predicts the intake and exhaust processes using a compressible, time-accurate, Quasi-One-Dimensional (Q1D) approach. Both models employ the same heat release and reduced-order modeling of the cylinder charge. Both include friction and cylinder loss models so that the predicted performance values can be compared to measurements. The results indicate that the isentropic-based model neglects important fluid mechanics and returns inaccurate results. The Q1D flow model, combined with the reduced-order model of the cylinder charge, is able to capture the dominant intake and exhaust fluid mechanics and produces results that compare well with measurement. Fluid friction, convective heat transfer, piston ring and skirt friction and temperature-varying specific heats in the working fluids are all shown to be significant factors in engine performance predictions. Charge blowby is shown to play a lesser role.

  3. Global climate model performance over Alaska and Greenland

    DEFF Research Database (Denmark)

    Walsh, John E.; Chapman, William L.; Romanovsky, Vladimir

    2008-01-01

    The performance of a set of 15 global climate models used in the Coupled Model Intercomparison Project is evaluated for Alaska and Greenland, and compared with the performance over broader pan-Arctic and Northern Hemisphere extratropical domains. Root-mean-square errors relative to the 1958...... of the models are generally much larger than the biases of the composite output, indicating that the systematic errors differ considerably among the models. There is a tendency for the models with smaller errors to simulate a larger greenhouse warming over the Arctic, as well as larger increases of Arctic...... to narrowing the uncertainty and obtaining more robust estimates of future climate change in regions such as Alaska, Greenland, and the broader Arctic....

  4. CORPORATE FORESIGHT AND PERFORMANCE: A CHAIN-OF-EFFECTS MODEL

    DEFF Research Database (Denmark)

    Jissink, Tymen; Huizingh, Eelko K.R.E.; Rohrbeck, René

    2015-01-01

    In this paper we develop and validate a measurement scale for corporate foresight and examine its impact on performance in a chain-of-effects model. We conceptualize corporate foresight as an organizational ability consisting of five distinct dimensions: information scope, method usage, people, formal organization, and culture. We investigate the relation of corporate foresight with three innovation performance dimensions – new product success, new product innovativeness, and financial performance. We use partial-least-squares structural equations modelling to assess our measurement models ... performance dimensions. Implications of our findings, and limitations and future research avenues, are discussed....

  5. A network application for modeling a centrifugal compressor performance map

    Science.gov (United States)

    Nikiforov, A.; Popova, D.; Soldatova, K.

    2017-08-01

    The approximation of aerodynamic performance of a centrifugal compressor stage and vaneless diffuser by neural networks is presented. Advantages, difficulties and specific features of the method are described. An example of a neural network and its structure is shown. The performances in terms of efficiency, pressure ratio and work coefficient of 39 model stages within the range of flow coefficient from 0.01 to 0.08 were modeled with mean squared error 1.5 %. In addition, the loss and friction coefficients of vaneless diffusers of relative widths 0.014-0.10 are modeled with mean squared error 2.45 %.
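
    As a hedged illustration of approximating a performance map with a neural network, the sketch below fits a small scikit-learn multilayer perceptron to a synthetic efficiency-versus-flow-coefficient curve. The network size, the synthetic curve and the noise level are assumptions for illustration; the paper's own network structure and stage data are not reproduced here.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      # Synthetic training data: a parabola-like efficiency curve versus flow coefficient.
      rng = np.random.default_rng(0)
      phi = rng.uniform(0.01, 0.08, 300)                                        # flow coefficient
      eta = 0.86 - 120.0 * (phi - 0.05) ** 2 + rng.normal(0, 0.004, phi.size)   # efficiency

      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                                         random_state=0))
      model.fit(phi.reshape(-1, 1), eta)

      phi_test = np.array([[0.03], [0.05], [0.07]])
      print("predicted efficiency:", np.round(model.predict(phi_test), 3))
      mse = np.mean((model.predict(phi.reshape(-1, 1)) - eta) ** 2)
      print(f"training MSE = {mse:.2e}")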

  6. Assessment of performance of survival prediction models for cancer prognosis

    Directory of Open Access Journals (Sweden)

    Chen Hung-Chia

    2012-07-01

    Full Text Available Abstract Background Cancer survival studies are commonly analyzed using survival-time prediction models for cancer prognosis. A number of different performance metrics are used to ascertain the concordance between the predicted risk score of each patient and the actual survival time, but these metrics can sometimes conflict. Alternatively, patients are sometimes divided into two classes according to a survival-time threshold, and binary classifiers are applied to predict each patient's class. Although this approach has several drawbacks, it does provide natural performance metrics such as positive and negative predictive values to enable unambiguous assessments. Methods We compare the survival-time prediction and survival-time threshold approaches to analyzing cancer survival studies. We review and compare common performance metrics for the two approaches. We present new randomization tests and cross-validation methods to enable unambiguous statistical inferences for several performance metrics used with the survival-time prediction approach. We consider five survival prediction models consisting of one clinical model, two gene expression models, and two models from combinations of clinical and gene expression models. Results A public breast cancer dataset was used to compare several performance metrics using five prediction models. (1) For some prediction models, the hazard ratio from fitting a Cox proportional hazards model was significant, but the two-group comparison was insignificant, and vice versa. (2) The randomization test and cross-validation were generally consistent with the p-values obtained from the standard performance metrics. (3) Binary classifiers highly depended on how the risk groups were defined; a slight change of the survival threshold for assignment of classes led to very different prediction results. Conclusions (1) Different performance metrics for evaluation of a survival prediction model may give different conclusions in
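
    One of the standard concordance metrics for such risk scores, Harrell's C-index, can be computed directly; the sketch below does so for a handful of made-up survival times, event indicators and risk scores. The pair-counting convention shown is a common simplification that ignores ties in survival time, and the data are assumptions, not the breast cancer dataset used in the paper.

      import numpy as np

      def concordance_index(time, event, risk):
          """Harrell's C-index: fraction of comparable patient pairs in which the patient
          with the higher predicted risk experiences the event earlier (0.5 = random)."""
          time, event, risk = map(np.asarray, (time, event, risk))
          concordant, comparable = 0.0, 0
          for i in range(len(time)):
              if not event[i]:
                  continue          # a pair is comparable only if the earlier time is an event
              for j in range(len(time)):
                  if time[j] > time[i]:
                      comparable += 1
                      if risk[i] > risk[j]:
                          concordant += 1
                      elif risk[i] == risk[j]:
                          concordant += 0.5
          return concordant / comparable

      # Hypothetical survival times (months), event indicators (1 = death) and risk scores.
      time  = [12, 30, 45, 8, 60, 22]
      event = [1,  0,  1, 1,  0,  1]
      risk  = [2.1, 0.4, 0.9, 2.8, 0.2, 0.3]
      print(f"C-index = {concordance_index(time, event, risk):.3f}")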

  7. Some considerations for validation of repository performance assessment models

    International Nuclear Information System (INIS)

    Eisenberg, N.

    1991-01-01

    Validation is an important aspect of the regulatory uses of performance assessment. A substantial body of literature exists indicating the manner in which validation of models is usually pursued. Because performance models for a nuclear waste repository cannot be tested over the long time periods for which the model must make predictions, the usual avenue for model validation is precluded. Further impediments to model validation include a lack of fundamental scientific theory to describe important aspects of repository performance and an inability to easily deduce the complex, intricate structures characteristic of a natural system. A successful strategy for validation must attempt to resolve these difficulties in a direct fashion. Although some procedural aspects will be important, the main reliance of validation should be on scientific substance and logical rigor. The level of validation needed will be mandated, in part, by the uses to which these models are put, rather than by the ideal of validation of a scientific theory. Because of the importance of the validation of performance assessment models, the NRC staff has engaged in a program of research and international cooperation to seek progress in this important area. 2 figs., 16 refs

  8. Maintenance personnel performance simulation (MAPPS): a model for predicting maintenance performance reliability in nuclear power plants

    International Nuclear Information System (INIS)

    Knee, H.E.; Krois, P.A.; Haas, P.M.; Siegel, A.I.; Ryan, T.G.

    1983-01-01

    The NRC has developed a structured, quantitative, predictive methodology in the form of a computerized simulation model for assessing maintainer task performance. The objective of the overall program is to develop, validate, and disseminate a practical, useful, and acceptable methodology for the quantitative assessment of NPP maintenance personnel reliability. The program was organized into four phases: (1) scoping study, (2) model development, (3) model evaluation, and (4) model dissemination. The program is currently nearing completion of Phase 2 - Model Development

  9. System Level Modelling and Performance Estimation of Embedded Systems

    DEFF Research Database (Denmark)

    Tranberg-Hansen, Anders Sejer

    The advances seen in the semiconductor industry within the last decade have brought the possibility of integrating evermore functionality onto a single chip forming functionally highly advanced embedded systems. These integration possibilities also imply that as the design complexity increases, so...... an efficient system level design methodology, a modelling framework for performance estimation and design space exploration at the system level is required. This thesis presents a novel component based modelling framework for system level modelling and performance estimation of embedded systems. The framework...... is performed by having the framework produce detailed quantitative information about the system model under investigation. The project is part of the national Danish research project, Danish Network of Embedded Systems (DaNES), which is funded by the Danish National Advanced Technology Foundation. The project...

  10. Causal Analysis for Performance Modeling of Computer Programs

    Directory of Open Access Journals (Sweden)

    Jan Lemeire

    2007-01-01

    Full Text Available Causal modeling and the accompanying learning algorithms provide useful extensions for in-depth statistical investigation and automation of performance modeling. We enlarged the scope of existing causal structure learning algorithms by using the form-free information-theoretic concept of mutual information and by introducing the complexity criterion for selecting direct relations among equivalent relations. The underlying probability distribution of experimental data is estimated by kernel density estimation. We then reported on the benefits of a dependency analysis and the decompositional capacities of causal models. Useful qualitative models, providing insight into the role of every performance factor, were inferred from experimental data. This paper reports on the results for a LU decomposition algorithm and on the study of the parameter sensitivity of the Kakadu implementation of the JPEG-2000 standard. Next, the analysis was used to search for generic performance characteristics of the applications.
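
    As a minimal illustration of the information-theoretic ingredient described above, the sketch below estimates the mutual information between candidate performance factors and a synthetic run-time response using scikit-learn's nearest-neighbour estimator. The factors, the response formula and the sample size are assumptions for illustration, not the paper's LU-decomposition or JPEG-2000 experiments.

      import numpy as np
      from sklearn.feature_selection import mutual_info_regression

      # Synthetic data: run time depends nonlinearly on problem size and thread count,
      # and not at all on a dummy factor.
      rng = np.random.default_rng(0)
      n = 500
      size = rng.uniform(100, 1000, n)
      threads = rng.integers(1, 9, n).astype(float)
      dummy = rng.normal(size=n)
      runtime = 0.002 * size ** 1.5 / threads + rng.normal(0, 1.0, n)

      X = np.column_stack([size, threads, dummy])
      mi = mutual_info_regression(X, runtime, random_state=0)
      for name, value in zip(["size", "threads", "dummy"], mi):
          print(f"MI({name}; runtime) = {value:.3f} nats")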

  11. Construction Of A Performance Assessment Model For Zakat Management Institutions

    Directory of Open Access Journals (Sweden)

    Sri Fadilah

    2016-12-01

    Full Text Available The objective of the research is to examine performance evaluation using the Balanced Scorecard model. The research is motivated by the large gap between the potential of zakat (alms and religious tax in Islam), estimated at as much as 217 trillion rupiahs, and the realization of the collected zakat funds, which reached only three trillion. This indicates that the performance of zakat management organizations in collecting zakat is still very low. On the other hand, the quantity and the quality of zakat management organizations have to be improved, which means a performance evaluation model is needed as an evaluation tool. The model construct is a performance evaluation model that can be implemented by zakat management organizations. Organizational performance evaluation with the Balanced Scorecard model will be effective if it is supported by three aspects, namely PI, BO and TQM. This research uses an explanatory method and the data analysis tool SEM/PLS. Data collection techniques are questionnaires, interviews and documentation. The result of this research shows that PI, BO and TQM simultaneously and partially have a significant effect on organizational performance.

  12. Longitudinal modeling in sports: young swimmers' performance and biomechanics profile.

    Science.gov (United States)

    Morais, Jorge E; Marques, Mário C; Marinho, Daniel A; Silva, António J; Barbosa, Tiago M

    2014-10-01

    New theories about dynamical systems highlight the multi-factorial interplay between determinant factors to achieve higher sports performances, including in swimming. Longitudinal research provides useful information on athletes' changes and how training helps them to excel. These questions may be addressed in one single procedure such as latent growth modeling. The aim of the study was to model a latent growth curve of young swimmers' performance and biomechanics over a season. Fourteen boys (12.33 ± 0.65 years old) and 16 girls (11.15 ± 0.55 years old) were evaluated. Performance, stroke frequency, speed fluctuation, arm's propelling efficiency, active drag, active drag coefficient and power to overcome drag were collected at four different moments of the season. Latent growth curve modeling was computed to understand the longitudinal variation of performance (endogenous variables) over the season according to the biomechanics (exogenous variables). Latent growth curve modeling showed a high inter- and intra-subject variability in the performance growth. Gender had a significant effect at the baseline and during the performance growth. At each evaluation moment, different variables had a meaningful effect on performance (M1: Da, β = -0.62; M2: Da, β = -0.53; M3: η(p), β = 0.59; M4: SF, β = -0.57; all P < .001). The models' goodness-of-fit was 1.40 ⩽ χ²/df ⩽ 3.74 (good-reasonable). Latent modeling is a comprehensive way to gather insight about young swimmers' performance over time. Different variables were mainly responsible for the performance improvement. A gender gap and intra- and inter-subject variability were verified. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Hybrid Corporate Performance Prediction Model Considering Technical Capability

    Directory of Open Access Journals (Sweden)

    Joonhyuck Lee

    2016-07-01

    Full Text Available Many studies have tried to predict corporate performance and stock prices to enhance investment profitability using qualitative approaches such as the Delphi method. However, developments in data processing technology and machine-learning algorithms have resulted in efforts to develop quantitative prediction models in various managerial subject areas. We propose a quantitative corporate performance prediction model that applies the support vector regression (SVR algorithm to solve the problem of the overfitting of training data and can be applied to regression problems. The proposed model optimizes the SVR training parameters based on the training data, using the genetic algorithm to achieve sustainable predictability in changeable markets and managerial environments. Technology-intensive companies represent an increasing share of the total economy. The performance and stock prices of these companies are affected by their financial standing and their technological capabilities. Therefore, we apply both financial indicators and technical indicators to establish the proposed prediction model. Here, we use time series data, including financial, patent, and corporate performance information of 44 electronic and IT companies. Then, we predict the performance of these companies as an empirical verification of the prediction performance of the proposed model.
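
    A hedged sketch of the modelling idea: support vector regression with its hyperparameters tuned by a crude random search standing in for the paper's genetic algorithm, evaluated by cross-validation on synthetic indicator data. The search ranges, feature construction and sample size are assumptions, not the 44-company dataset or the paper's optimization procedure.

      import numpy as np
      from sklearn.svm import SVR
      from sklearn.model_selection import cross_val_score

      # Synthetic features mimic scaled financial/patent indicators; y is "performance".
      rng = np.random.default_rng(0)
      n, d = 120, 6
      X = rng.normal(size=(n, d))
      y = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.3 * X[:, 2] ** 2 + rng.normal(0, 0.2, n)

      best_score, best_params = -np.inf, None
      for _ in range(40):                                  # random-search "generations"
          params = dict(C=10 ** rng.uniform(-1, 3),
                        epsilon=10 ** rng.uniform(-3, 0),
                        gamma=10 ** rng.uniform(-3, 1))
          score = cross_val_score(SVR(kernel="rbf", **params), X, y, cv=5,
                                  scoring="neg_mean_squared_error").mean()
          if score > best_score:
              best_score, best_params = score, params
      print("best params:", {k: round(v, 4) for k, v in best_params.items()})
      print(f"best cross-validated MSE = {-best_score:.4f}")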

  14. Facial Performance Transfer via Deformable Models and Parametric Correspondence.

    Science.gov (United States)

    Asthana, Akshay; de la Hunty, Miles; Dhall, Abhinav; Goecke, Roland

    2012-09-01

    The issue of transferring facial performance from one person's face to another's has been an area of interest for the movie industry and the computer graphics community for quite some time. In recent years, deformable face models, such as the Active Appearance Model (AAM), have made it possible to track and synthesize faces in real time. Not surprisingly, deformable face model-based approaches for facial performance transfer have gained tremendous interest in the computer vision and graphics community. In this paper, we focus on the problem of real-time facial performance transfer using the AAM framework. We propose a novel approach of learning the mapping between the parameters of two completely independent AAMs, using them to facilitate the facial performance transfer in a more realistic manner than previous approaches. The main advantage of modeling this parametric correspondence is that it allows a "meaningful" transfer of both the nonrigid shape and texture across faces irrespective of the speakers' gender, shape, and size of the faces, and illumination conditions. We explore linear and nonlinear methods for modeling the parametric correspondence between the AAMs and show that the sparse linear regression method performs the best. Moreover, we show the utility of the proposed framework for a cross-language facial performance transfer that is an area of interest for the movie dubbing industry.
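
    The parametric-correspondence idea can be illustrated with a regularized linear mapping between two parameter spaces; the sketch below learns such a mapping with ridge regression on synthetic "AAM parameter" vectors. The dimensions, noise levels and data are assumptions; real AAM parameters and the paper's learning procedure are not used here.

      import numpy as np
      from sklearn.linear_model import Ridge

      # Synthetic stand-in for frame-by-frame parameters of two independently built AAMs.
      rng = np.random.default_rng(0)
      n_frames, d_src, d_dst = 400, 12, 10
      src = rng.normal(size=(n_frames, d_src))                   # source actor's parameters
      true_map = rng.normal(scale=0.5, size=(d_src, d_dst))
      dst = src @ true_map + rng.normal(scale=0.05, size=(n_frames, d_dst))  # target actor

      mapper = Ridge(alpha=1.0).fit(src, dst)       # regularized linear correspondence
      new_performance = rng.normal(size=(1, d_src)) # an unseen source frame
      transferred = mapper.predict(new_performance) # parameters to drive the target model
      print("transferred parameter vector:", np.round(transferred, 2))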

  15. Toward a Subjective Measurement Model for Firm Performance

    Directory of Open Access Journals (Sweden)

    Luiz Artur Ledur Brito

    2012-05-01

    Full Text Available Firm performance is a relevant construct in strategic management research and frequently used as a dependent variable. Despite this relevance, there is hardly a consensus about its definition, dimensionality and measurement, which limits advances in research and understanding of the concept. This article proposes and tests a measurement model for firm performance, based on subjective indicators. The model is grounded in stakeholder theory and a review of empirical articles. Confirmatory Factor Analyses, using data from 116 Brazilian senior managers, were used to test its fit and psychometric properties. The final model had six first-order dimensions: profitability, growth, customer satisfaction, employee satisfaction, social performance, and environmental performance. A second-order financial performance construct, influencing growth and profitability, correlated with the first-order intercorrelated, non-financial dimensions. Results suggest dimensions cannot be used interchangeably, since they represent different aspects of firm performance, and corroborate the idea that stakeholders have different demands that need to be managed independently. Researchers and practitioners may use the model to fully treat performance in empirical studies and to understand the impact of strategies on multiple performance facets.

  16. A Composite Model for Employees' Performance Appraisal and Improvement

    Science.gov (United States)

    Manoharan, T. R.; Muralidharan, C.; Deshmukh, S. G.

    2012-01-01

    Purpose: The purpose of this paper is to develop an innovative method of performance appraisal that will be useful for designing a structured training programme. Design/methodology/approach: Employees' performance appraisals are conducted using new approaches, namely data envelopment analysis and an integrated fuzzy model. Interpretive structural…

  17. A Model for Effective Performance in the Indonesian Navy.

    Science.gov (United States)

    1987-06-01

    Navy Leadership and Management Competency Model; McBer Competent Managers Model. ... leadership and managerial skills which emphasize effective performance of the officers in managing the human resources under their command and supervision. By effective performance we mean officers who not only know about management theories, but who possess the characteristics, knowledge, skill, and

  18. Discussion of various models related to cloud performance

    OpenAIRE

    Kande, Chaitanya Krishna

    2015-01-01

    This paper discusses the various models related to cloud computing. Knowing the metrics related to the infrastructure is critical to enhancing the performance of cloud services. Various metrics related to clouds, such as pageview response time, admission control and enforcing elasticity of the cloud infrastructure, are crucial in analyzing the characteristics of the cloud in order to enhance cloud performance.

  19. Human performance modeling for system of systems analytics: soldier fatigue.

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, Craig R.; Campbell, James E.; Miller, Dwight Peter

    2005-10-01

    The military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives as can be seen in the Department of Defense's (DoD) Defense Modeling and Simulation Office's (DMSO) Master Plan (DoD 5000.59-P 1995). To this goal, the military is currently spending millions of dollars on programs devoted to HPM in various military contexts. Examples include the Human Performance Modeling Integration (HPMI) program within the Air Force Research Laboratory, which focuses on integrating HPMs with constructive models of systems (e.g. cockpit simulations) and the Navy's Human Performance Center (HPC) established in September 2003. Nearly all of these initiatives focus on the interface between humans and a single system. This is insufficient in the era of highly complex network centric SoS. This report presents research and development in the area of HPM in a system-of-systems (SoS). Specifically, this report addresses modeling soldier fatigue and the potential impacts soldier fatigue can have on SoS performance.

  20. Performance modeling of parallel algorithms for solving neutron diffusion problems

    International Nuclear Information System (INIS)

    Azmy, Y.Y.; Kirk, B.L.

    1995-01-01

    Neutron diffusion calculations are the most common computational methods used in the design, analysis, and operation of nuclear reactors and related activities. Here, mathematical performance models are developed for the parallel algorithm used to solve the neutron diffusion equation on message passing and shared memory multiprocessors represented by the Intel iPSC/860 and the Sequent Balance 8000, respectively. The performance models are validated through several test problems, and these models are used to estimate the performance of each of the two considered architectures in situations typical of practical applications, such as fine meshes and a large number of participating processors. While message passing computers are capable of producing speedup, the parallel efficiency deteriorates rapidly as the number of processors increases. Furthermore, the speedup fails to improve appreciably for massively parallel computers so that only small- to medium-sized message passing multiprocessors offer a reasonable platform for this algorithm. In contrast, the performance model for the shared memory architecture predicts very high efficiency over a wide range of number of processors reasonable for this architecture. Furthermore, the model efficiency of the Sequent remains superior to that of the hypercube if its model parameters are adjusted to make its processors as fast as those of the iPSC/860. It is concluded that shared memory computers are better suited for this parallel algorithm than message passing computers
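
    The speedup and efficiency behaviour described above can be illustrated with a generic fixed-overhead performance model: per-step time is the compute work divided by the processor count plus a latency-and-bandwidth communication term. This is not the authors' validated iPSC/860 or Sequent model; all constants below are assumed for illustration only.

```python
def predicted_time(p, n_cells, t_cell=2e-6, latency=1e-4, bandwidth=2.8e6, words_per_face=8):
    compute = n_cells * t_cell / p                     # evenly divided sweep work
    if p == 1:
        return compute                                 # no communication on one processor
    faces = (n_cells / p) ** (2.0 / 3.0)               # surface cells exchanged per step
    comm = 2 * latency + faces * words_per_face * 8 / bandwidth   # 8 bytes per word
    return compute + comm

n_cells = 100_000
t1 = predicted_time(1, n_cells)
for p in (2, 4, 8, 16, 32, 64):
    speedup = t1 / predicted_time(p, n_cells)
    print(f"P={p:3d}  speedup={speedup:5.1f}  efficiency={speedup / p:4.2f}")
```

    Even this toy model reproduces the qualitative conclusion that efficiency deteriorates as communication overhead grows relative to the shrinking per-processor workload.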

  1. Comparative performance of high-fidelity training models for flexible ureteroscopy: Are all models effective?

    Directory of Open Access Journals (Sweden)

    Shashikant Mishra

    2011-01-01

    Full Text Available Objective: We performed a comparative study of high-fidelity training models for flexible ureteroscopy (URS). Our objective was to determine whether high-fidelity non-virtual reality (VR) models are as effective as the VR model in teaching flexible URS skills. Materials and Methods: Twenty-one trained urologists without clinical experience of flexible URS underwent dry lab simulation practice. After a warm-up period of 2 h, tasks were performed on two high-fidelity non-VR models (Uro-scopic Trainer™; Endo-Urologie-Modell™) and a high-fidelity VR model (URO Mentor™). The participants were divided equally into three batches with rotation on each of the three stations for 30 min. Performance of the trainees was evaluated by an expert ureteroscopist using a pass rating and a global rating score (GRS). The participants rated a face validity questionnaire at the end of each session. Results: The GRS improved statistically at the evaluation performed after the second rotation (P<0.001) for batches 1, 2 and 3. Pass ratings also improved significantly for all training models when the third and first rotations were compared (P<0.05). The batch that was trained on the VR-based model had more improvement in pass ratings on the second rotation but did not achieve statistical significance. Most of the realism domains were rated higher for the VR model as compared with the non-VR models, except the realism of the flexible endoscope. Conclusions: All the models used for training flexible URS were effective in increasing the GRS and pass ratings irrespective of VR status.

  2. An analytical model of the HINT performance metric

    Energy Technology Data Exchange (ETDEWEB)

    Snell, Q.O.; Gustafson, J.L. [Scalable Computing Lab., Ames, IA (United States)

    1996-10-01

    The HINT benchmark was developed to provide a broad-spectrum metric for computers and to measure performance over the full range of memory sizes and time scales. We have extended our understanding of why HINT performance curves look the way they do and can now predict the curves using an analytical model based on simple hardware specifications as input parameters. Conversely, by fitting the experimental curves with the analytical model, hardware specifications such as memory performance can be inferred to provide insight into the nature of a given computer system.

  3. Modeling the marketing strategy-performance relationship : towards an hierarchical marketing performance framework

    OpenAIRE

    Huizingh, Eelko K.R.E.; Zengerink, Evelien

    2001-01-01

    Accurate measurement of marketing performance is an important topic for both marketing academics and marketing managers. Many researchers have recognized that marketing performance measurement should go beyond financial measurement. In this paper we propose a conceptual framework that models marketing performance as a sequence of intermediate performance measures ultimately leading to financial performance. This framework, called the Hierarchical Marketing Performance (HMP) framework, starts ...

  4. Modelling of green roof hydrological performance for urban drainage applications

    DEFF Research Database (Denmark)

    Locatelli, Luca; Mark, Ole; Mikkelsen, Peter Steen

    2014-01-01

    Green roofs are being widely implemented for stormwater management and their impact on the urban hydrological cycle can be evaluated by incorporating them into urban drainage models. This paper presents a model of green roof long term and single event hydrological performance. The model includes...... surface and subsurface storage components representing the overall retention capacity of the green roof which is continuously re-established by evapotranspiration. The runoff from the model is described through a non-linear reservoir approach. The model was calibrated and validated using measurement data...... from 3 different extensive sedum roofs in Denmark. These data consist of high-resolution measurements of runoff, precipitation and atmospheric variables in the period 2010–2012. The hydrological response of green roofs was quantified based on statistical analysis of the results of a 22-year (1989...
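
    A single-storage sketch of the retention plus non-linear-reservoir idea described above (not the calibrated multi-storage model of the paper): evapotranspiration re-establishes the retention capacity, excess water enters a detention store drained by Q = k·S^m. Capacities, coefficients and the forcing series are assumed.

```python
def green_roof_runoff(rain_mm, et_mm, retention_cap=25.0, k=0.1, m=1.5, dt=1.0):
    retention, detention, runoff = 0.0, 0.0, []
    for p, et in zip(rain_mm, et_mm):
        retention = max(0.0, retention - et * dt)      # ET frees up retention capacity
        retention += p * dt                            # rainfall fills retention first
        excess = max(0.0, retention - retention_cap)   # water the roof cannot retain
        retention = min(retention, retention_cap)
        detention += excess
        q = min(detention, k * detention ** m * dt)    # non-linear reservoir outflow
        detention -= q
        runoff.append(round(q, 2))
    return runoff

rain = [0, 12, 3, 0, 0, 20, 0, 0]        # mm per time step (assumed)
et   = [1, 0.5, 1, 1.2, 1.2, 0.3, 1, 1]  # mm per time step (assumed)
print(green_roof_runoff(rain, et))
```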

  5. Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance

    Energy Technology Data Exchange (ETDEWEB)

    Estep, Donald [Colorado State Univ., Fort Collins, CO (United States); El-Azab, Anter [Florida State Univ., Tallahassee, FL (United States); Pernice, Michael [Idaho National Lab. (INL), Idaho Falls, ID (United States); Peterson, John W. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Polyakov, Peter [Univ. of Wyoming, Laramie, WY (United States); Tavener, Simon [Colorado State Univ., Fort Collins, CO (United States); Xiu, Dongbin [Purdue Univ., West Lafayette, IN (United States); Univ. of Utah, Salt Lake City, UT (United States)

    2017-03-23

    In this project, we will address the challenges associated with constructing high fidelity multiscale models of nuclear fuel performance. We (*) propose a novel approach for coupling mesoscale and macroscale models, (*) devise efficient numerical methods for simulating the coupled system, and (*) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis for computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratories on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.

  6. Review of Methods for Buildings Energy Performance Modelling

    Science.gov (United States)

    Krstić, Hrvoje; Teni, Mihaela

    2017-10-01

    Research presented in this paper gives a brief review of methods used for buildings energy performance modelling. This paper also gives a comprehensive review of the advantages and disadvantages of available methods, as well as the input parameters used for modelling buildings energy performance. The European Directive EPBD obliges the implementation of an energy certification procedure which gives an insight into buildings energy performance via existing energy certificate databases. Some of the methods for buildings energy performance modelling mentioned in this paper are developed by employing data sets of buildings which have already undergone an energy certification procedure. Such a database is used in this paper, where the majority of buildings in the database have already gone through some form of partial retrofitting – replacement of windows or installation of thermal insulation – but still have poor energy performance. The case study presented in this paper utilizes an energy certificates database obtained from residential units in Croatia (over 400 buildings) in order to determine the dependence between buildings energy performance and variables from the database by using statistical dependence tests. Building energy performance in the database is presented with the building energy efficiency rate (from A+ to G), which is based on specific annual energy needs for heating for referential climatic data [kWh/(m2a)]. Independent variables in the database are the surfaces and volume of the conditioned part of the building, building shape factor, energy used for heating, CO2 emission, building age and year of reconstruction. Research results presented in this paper give an insight into the possibilities of methods used for buildings energy performance modelling. Further on, it gives an analysis of dependencies between buildings energy performance as a dependent variable and independent variables from the database. Presented results could be used for development of new building energy performance

  7. PORFLOW Modeling Supporting The H-Tank Farm Performance Assessment

    International Nuclear Information System (INIS)

    Jordan, J. M.; Flach, G. P.; Westbrook, M. L.

    2012-01-01

    Numerical simulations of groundwater flow and contaminant transport in the vadose and saturated zones have been conducted using the PORFLOW code in support of an overall Performance Assessment (PA) of the H-Tank Farm. This report provides technical detail on selected aspects of PORFLOW model development and describes the structure of the associated electronic files. The PORFLOW models for the H-Tank Farm PA, Rev. 1 were updated with grout, solubility, and inventory changes. The aquifer model was refined. In addition, a set of flow sensitivity runs were performed to allow flow to be varied in the related probabilistic GoldSim models. The final PORFLOW concentration values are used as input into a GoldSim dose calculator

  8. PORFLOW Modeling Supporting The H-Tank Farm Performance Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, J. M.; Flach, G. P.; Westbrook, M. L.

    2012-08-31

    Numerical simulations of groundwater flow and contaminant transport in the vadose and saturated zones have been conducted using the PORFLOW code in support of an overall Performance Assessment (PA) of the H-Tank Farm. This report provides technical detail on selected aspects of PORFLOW model development and describes the structure of the associated electronic files. The PORFLOW models for the H-Tank Farm PA, Rev. 1 were updated with grout, solubility, and inventory changes. The aquifer model was refined. In addition, a set of flow sensitivity runs were performed to allow flow to be varied in the related probabilistic GoldSim models. The final PORFLOW concentration values are used as input into a GoldSim dose calculator.

  9. Observer analysis and its impact on task performance modeling

    Science.gov (United States)

    Jacobs, Eddie L.; Brown, Jeremy B.

    2014-05-01

    Fire fighters use relatively low cost thermal imaging cameras to locate hot spots and fire hazards in buildings. This research describes the analyses performed to study the impact of thermal image quality on fire fighter fire hazard detection task performance. Using human perception data collected by the National Institute of Standards and Technology (NIST) for fire fighters detecting hazards in a thermal image, an observer analysis was performed to quantify the sensitivity and bias of each observer. Using this analysis, the subjects were divided into three groups representing three different levels of performance. The top-performing group was used for the remainder of the modeling. Models were developed which related image quality factors such as contrast, brightness, spatial resolution, and noise to task performance probabilities. The models were fitted to the human perception data using logistic regression, as well as probit regression. Probit regression was found to yield superior fits and showed that models with not only 2nd order parameter interactions, but also 3rd order parameter interactions performed the best.
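
    A hedged sketch of the logistic-versus-probit comparison using statsmodels. The columns (detected, contrast, brightness, resolution, noise) are hypothetical stand-ins for the NIST perception data, the synthetic data generator is an assumption, and only second-order interactions are shown.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the perception trials: 1 = hazard detected, 0 = missed.
rng = np.random.default_rng(1)
n = 1500
df = pd.DataFrame({
    "contrast":   rng.uniform(0, 1, n),
    "brightness": rng.uniform(0, 1, n),
    "resolution": rng.uniform(0, 1, n),
    "noise":      rng.uniform(0, 1, n),
})
logit_p = -1 + 3 * df.contrast + 1.5 * df.resolution - 2 * df.noise
df["detected"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Fit both link functions on the same second-order interaction formula and compare fits.
formula = "detected ~ (contrast + brightness + resolution + noise) ** 2"
logit_fit = smf.logit(formula, df).fit(disp=False)
probit_fit = smf.probit(formula, df).fit(disp=False)
print("logit AIC :", round(logit_fit.aic, 1))
print("probit AIC:", round(probit_fit.aic, 1))
```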

  10. How well can we forecast future model error and uncertainty by mining past model performance data

    Science.gov (United States)

    Solomatine, Dimitri

    2016-04-01

    Consider a hydrological model Y(t) = M(X(t), P), where X = vector of inputs; P = vector of parameters; Y = model output (typically flow); t = time. In cases where there is enough past data on the performance of model M, it is possible to use this data to build a (data-driven) model EC of the error of model M. This model EC will be able to forecast the error E when a new input X is fed into model M; then, by subtracting E from the model prediction Y, a better estimate of Y can be obtained. Model EC is usually called the error corrector (in meteorology, a bias corrector). However, we may go further in characterizing model deficiencies, and instead of using the error (a real value) we may consider a more sophisticated characterization, namely a probabilistic one. So instead of a model EC of the error of model M, it is also possible to build a model U of the uncertainty of model M; if uncertainty is described as the model error distribution D, this model will calculate its properties - mean, variance, other moments, and quantiles. The general form of this model could be: D = U(RV), where RV = vector of relevant variables having influence on model uncertainty (to be identified e.g. by mutual information analysis); D = vector of variables characterizing the error distribution (typically, two or more quantiles). There is one aspect which is not always explicitly mentioned in uncertainty analysis work. In our view it is important to distinguish the following main types of model uncertainty: 1. The residual uncertainty of models. In this case the model parameters and/or model inputs are considered to be fixed (deterministic), i.e. the model is considered to be optimal (calibrated) and deterministic. Model error is considered as the manifestation of uncertainty. If there is enough past data about the model errors (i.e. its uncertainty), it is possible to build a statistical or machine learning model of uncertainty trained on this data. Here the following methods can be mentioned: (a) quantile regression (QR
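
    One way to realize the uncertainty model U = U(RV) described above is to regress chosen quantiles of the past model error on the relevant variables RV. The sketch below uses gradient boosting with a quantile loss on entirely synthetic data; the variables, quantile levels and learner are assumptions, and classical quantile regression could be substituted.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(2)
RV = rng.normal(size=(1000, 4))             # e.g. recent rainfall, flow, season (assumed)
model_error = 0.5 * RV[:, 0] + rng.normal(scale=np.abs(RV[:, 1]) + 0.1)  # heteroscedastic

# One quantile model per requested level of the error distribution D.
quantile_models = {
    q: GradientBoostingRegressor(loss="quantile", alpha=q).fit(RV, model_error)
    for q in (0.05, 0.5, 0.95)
}

new_rv = RV[:5]
lower = quantile_models[0.05].predict(new_rv)
upper = quantile_models[0.95].predict(new_rv)
print("90% predictive error bands:", list(zip(lower.round(2), upper.round(2))))
```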

  11. Aerodynamic drag modeling of alpine skiers performing giant slalom turns.

    Science.gov (United States)

    Meyer, Frédéric; Le Pelley, David; Borrani, Fabio

    2012-06-01

    Aerodynamic drag plays an important role in performance for athletes practicing sports that involve high-velocity motions. In giant slalom, the skier is continuously changing his/her body posture, and this affects the energy dissipated in aerodynamic drag. It is therefore important to quantify this energy to understand the dynamic behavior of the skier. The aims of this study were to model the aerodynamic drag of alpine skiers in giant slalom simulated conditions and to apply these models in a field experiment to estimate energy dissipated through aerodynamic drag. The aerodynamic characteristics of 15 recreational male and female skiers were measured in a wind tunnel while holding nine different skiing-specific postures. The drag and the frontal area were recorded simultaneously for each posture. Four generalized and two individualized models of the drag coefficient were built, using different sets of parameters. These models were subsequently applied in a field study designed to compare the aerodynamic energy losses between a dynamic and a compact skiing technique. The generalized models estimated aerodynamic drag with an accuracy of between 11.00% and 14.28%, and the individualized models estimated aerodynamic drag with an accuracy between 4.52% and 5.30%. The individualized model used for the field study showed that using a dynamic technique led to 10% more aerodynamic drag energy loss than using a compact technique. The individualized models were capable of discriminating different techniques performed by advanced skiers and seemed more accurate than the generalized models. The models presented here offer a simple yet accurate method to estimate the aerodynamic drag acting upon alpine skiers while rapidly moving through the range of positions typical to turning technique.

  12. Port performance evaluation tool based on microsimulation model

    Directory of Open Access Journals (Sweden)

    Tsavalista Burhani Jzolanda

    2017-01-01

    Full Text Available As port performance is becoming correlated with national competitiveness, the issue of port performance evaluation has gained significance. Port performance can simply be indicated by port service levels to the ship (e.g., throughput, waiting time for berthing, etc.), as well as by the utilization level of equipment and facilities within a certain period. The performance evaluation can then be used as a tool to develop related policies for improving the port's performance to be more effective and efficient. However, the evaluation is frequently conducted based on a deterministic approach, which hardly captures the natural variations of port parameters. Therefore, this paper presents a stochastic microsimulation model for investigating the impacts of port parameter variations on port performance. The variations are derived from actual data in order to provide more realistic results. The model is further developed using MATLAB and Simulink based on queuing theory.
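
    A minimal stochastic berth-queue microsimulation in Python (the paper's model was built with MATLAB/Simulink): ships arrive with random interarrival times and occupy one of a fixed number of berths until service completes. Distributions, berth count and rates are assumptions.

```python
import heapq
import numpy as np

def simulate_port(n_ships=2000, berths=3, mean_interarrival=6.0, mean_service=15.0, seed=0):
    rng = np.random.default_rng(seed)
    arrivals = np.cumsum(rng.exponential(mean_interarrival, n_ships))   # hours
    service = rng.exponential(mean_service, n_ships)
    berth_free = [0.0] * berths            # times at which each berth becomes free
    heapq.heapify(berth_free)
    waits = []
    for t_arr, t_srv in zip(arrivals, service):
        t_free = heapq.heappop(berth_free)          # earliest available berth
        start = max(t_arr, t_free)
        waits.append(start - t_arr)
        heapq.heappush(berth_free, start + t_srv)
    waits = np.array(waits)
    return waits.mean(), (waits > 0).mean()

avg_wait, frac_waiting = simulate_port()
print(f"average wait for a berth: {avg_wait:.1f} h, share of ships waiting: {frac_waiting:.0%}")
```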

  13. Aqua/Aura Updated Inclination Adjust Maneuver Performance Prediction Model

    Science.gov (United States)

    Boone, Spencer

    2017-01-01

    This presentation will discuss the updated Inclination Adjust Maneuver (IAM) performance prediction model that was developed for Aqua and Aura following the 2017 IAM series. This updated model uses statistical regression methods to identify potential long-term trends in maneuver parameters, yielding improved predictions when re-planning past maneuvers. The presentation has been reviewed and approved by Eric Moyer, ESMO Deputy Project Manager.

  14. Dynamic Experiments and Constitutive Model Performance for Polycarbonate

    Science.gov (United States)

    2014-07-01

    Storage and loss tangent moduli for PC; DMA experiments performed at 1 Hz and shifted to 100 Hz showing the α and β transition regions. ... The author would also like to thank Dr. Adam D. Mulliken for courteously providing the experimental results and the Abaqus version of the model. ... In 1955, the Ree-Eyring model further accounted for microstructural mechanisms by relating molecular motions to yield behavior

  15. Does segmentation always improve model performance in credit scoring?

    OpenAIRE

    Bijak, Katarzyna; Thomas, Lyn C.

    2012-01-01

    Credit scoring allows for the credit risk assessment of bank customers. A single scoring model (scorecard) can be developed for the entire customer population, e.g. using logistic regression. However, it is often expected that segmentation, i.e. dividing the population into several groups and building separate scorecards for them, will improve the model performance. The most common statistical methods for segmentation are the two-step approaches, where logistic regression follows Classificati...
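
    A hedged sketch of the question posed in the title: compare a single logistic-regression scorecard against segmentation, here with a shallow decision tree defining the segments and a separate logistic scorecard per segment (a simplified stand-in for the two-step approaches mentioned above). The synthetic data and tree depth are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=5000, n_features=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Single scorecard for the whole population.
single = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc_single = roc_auc_score(y_te, single.predict_proba(X_te)[:, 1])

# Segmentation: tree leaves define segments, one scorecard per segment.
segmenter = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X_tr, y_tr)
seg_tr, seg_te = segmenter.apply(X_tr), segmenter.apply(X_te)
scores = np.zeros(len(y_te))
for seg in np.unique(seg_tr):
    tr_mask, te_mask = seg_tr == seg, seg_te == seg
    if not te_mask.any():
        continue
    if len(np.unique(y_tr[tr_mask])) < 2:       # degenerate segment: fall back
        scores[te_mask] = single.predict_proba(X_te[te_mask])[:, 1]
        continue
    card = LogisticRegression(max_iter=1000).fit(X_tr[tr_mask], y_tr[tr_mask])
    scores[te_mask] = card.predict_proba(X_te[te_mask])[:, 1]
auc_segmented = roc_auc_score(y_te, scores)

print(f"single scorecard AUC:     {auc_single:.3f}")
print(f"segmented scorecards AUC: {auc_segmented:.3f}")
```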

  16. Building Information Modeling (BIM) for Indoor Environmental Performance Analysis

    DEFF Research Database (Denmark)

    The report is a part of a research assignment carried out by students in the 5 ECTS course “Project Byggeri – [entitled as: Building Information Modeling (BIM) – Modeling & Analysis]”, during the 3rd semester of the master's degree in Civil and Architectural Engineering, Department of Engineering, Aarhus...... University. This includes seven papers describing BIM for Sustainability, concentrating specifically on individual topics regarding Indoor Environment Performance Analysis....

  17. Spatial variability and parametric uncertainty in performance assessment models

    International Nuclear Information System (INIS)

    Pensado, Osvaldo; Mancillas, James; Painter, Scott; Tomishima, Yasuo

    2011-01-01

    The problem of defining an appropriate treatment of distribution functions (which could represent spatial variability or parametric uncertainty) is examined based on a generic performance assessment model for a high-level waste repository. The generic model incorporated source term models available in GoldSim®, the TDRW code for contaminant transport in sparse fracture networks with a complex fracture-matrix interaction process, and a biosphere dose model known as BDOSE™. Using the GoldSim framework, several Monte Carlo sampling approaches and transport conceptualizations were evaluated to explore the effect of various treatments of spatial variability and parametric uncertainty on dose estimates. Results from a model employing a representative source and ensemble-averaged pathway properties were compared to results from a model allowing for stochastic variation of transport properties along streamline segments (i.e., explicit representation of spatial variability within a Monte Carlo realization). We concluded that the sampling approach and the definition of an ensemble representative do influence consequence estimates. In the examples analyzed in this paper, approaches considering limited variability of a transport resistance parameter along a streamline increased the frequency of fast pathways resulting in relatively high dose estimates, while those allowing for broad variability along streamlines increased the frequency of 'bottlenecks' reducing dose estimates. On this basis, simplified approaches with limited consideration of variability may suffice for intended uses of the performance assessment model, such as evaluation of site safety. (author)

  18. Conceptual Modeling of Performance Indicators of Higher Education Institutions

    OpenAIRE

    Kahveci, Tuba Canvar; Taşkın, Harun; Toklu, Merve Cengiz

    2013-01-01

    Measuring and analyzing any type of organization are carried out by different actors in the organization. The performance indicators of a performance management system increase according to the products or services of the organization. These indicators should also be defined for all levels of the organization. Finally, all of these characteristics make the performance evaluation process more complex for organizations. In order to manage this complexity, the process should be modeled at the beginning...

  19. Aircraft Anomaly Detection Using Performance Models Trained on Fleet Data

    Science.gov (United States)

    Gorinevsky, Dimitry; Matthews, Bryan L.; Martin, Rodney

    2012-01-01

    This paper describes an application of data mining technology called Distributed Fleet Monitoring (DFM) to Flight Operational Quality Assurance (FOQA) data collected from a fleet of commercial aircraft. DFM transforms the data into aircraft performance models, flight-to-flight trends, and individual flight anomalies by fitting a multi-level regression model to the data. The model represents aircraft flight performance and takes into account fixed effects: flight-to-flight and vehicle-to-vehicle variability. The regression parameters include aerodynamic coefficients and other aircraft performance parameters that are usually identified by aircraft manufacturers in flight tests. Using DFM, the multi-terabyte FOQA data set with half a million flights was processed in a few hours. The anomalies found include wrong values of computed variables (e.g., aircraft weight), sensor failures and biases, and failures, biases, and trends in flight actuators. These anomalies were missed by the existing airline monitoring of FOQA data exceedances.

  20. Model of service-oriented catering supply chain performance evaluation

    Directory of Open Access Journals (Sweden)

    Juanqiong Gou

    2013-03-01

    Full Text Available Purpose: The aim of this paper is to construct a performance evaluation model for a service-oriented catering supply chain. Design/methodology/approach: Based on research into the current situation of the catering industry, this paper summarizes the characteristics of the catering supply chain and then presents a service-oriented catering supply chain model based on a platform of logistics and information. Finally, the fuzzy AHP method is used to evaluate the performance of the service-oriented catering supply chain. Findings: With the analysis of the characteristics of the catering supply chain, we construct the performance evaluation model in order to guarantee food safety, logistics efficiency, price stability and so on. Practical implications: In order to evolve an efficient and effective service supply chain, the model can be used not only for an enterprise's own improvement, but also for selecting different customers and choosing a different model of development. Originality/value: This paper gives a new definition of the service-oriented catering supply chain and offers a model to evaluate the performance of this catering supply chain.
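
    The AHP step behind the paper's fuzzy-AHP evaluation can be illustrated in simplified crisp form: derive criterion weights from a pairwise comparison matrix via its principal eigenvector and check consistency. The criteria and judgments below are assumptions, not values from the paper.

```python
import numpy as np

criteria = ["food safety", "logistics efficiency", "price stability", "service level"]
A = np.array([[1,   3,   5,   3],
              [1/3, 1,   3,   1],
              [1/5, 1/3, 1,   1/2],
              [1/3, 1,   2,   1]], dtype=float)       # assumed pairwise judgments

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                              # normalized priority vector

n = len(A)
ci = (eigvals.real[k] - n) / (n - 1)                  # consistency index
cr = ci / 0.90                                        # Saaty's random index for n = 4
print(dict(zip(criteria, weights.round(3))), f"CR = {cr:.2f}")
```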

  1. In Silico Modeling of Gastrointestinal Drug Absorption: Predictive Performance of Three Physiologically Based Absorption Models.

    Science.gov (United States)

    Sjögren, Erik; Thörn, Helena; Tannergren, Christer

    2016-06-06

    Gastrointestinal (GI) drug absorption is a complex process determined by formulation, physicochemical and biopharmaceutical factors, and GI physiology. Physiologically based in silico absorption models have emerged as a widely used and promising supplement to traditional in vitro assays and preclinical in vivo studies. However, there remains a lack of comparative studies between different models. The aim of this study was to explore the strengths and limitations of the in silico absorption models Simcyp 13.1, GastroPlus 8.0, and GI-Sim 4.1, with respect to their performance in predicting human intestinal drug absorption. This was achieved by adopting an a priori modeling approach and using well-defined input data for 12 drugs associated with incomplete GI absorption and related challenges in predicting the extent of absorption. This approach better mimics the real situation during formulation development where predictive in silico models would be beneficial. Plasma concentration-time profiles for 44 oral drug administrations were calculated by convolution of model-predicted absorption-time profiles and reported pharmacokinetic parameters. Model performance was evaluated by comparing the predicted plasma concentration-time profiles, Cmax, tmax, and exposure (AUC) with observations from clinical studies. The overall prediction accuracies for AUC, given as the absolute average fold error (AAFE) values, were 2.2, 1.6, and 1.3 for Simcyp, GastroPlus, and GI-Sim, respectively. The corresponding AAFE values for Cmax were 2.2, 1.6, and 1.3, respectively, and those for tmax were 1.7, 1.5, and 1.4, respectively. Simcyp was associated with underprediction of AUC and Cmax; the accuracy decreased with decreasing predicted fabs. A tendency for underprediction was also observed for GastroPlus, but there was no correlation with predicted fabs. There were no obvious trends for over- or underprediction for GI-Sim. The models performed similarly in capturing dependencies on dose and
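
    The AAFE metric quoted above is commonly computed as 10 raised to the mean absolute log10 ratio of predicted to observed values, so an AAFE of 1.3 means predictions are off by a factor of about 1.3 on average. A small illustration with assumed numbers:

```python
import numpy as np

def aafe(predicted, observed):
    predicted, observed = np.asarray(predicted, float), np.asarray(observed, float)
    return 10 ** np.mean(np.abs(np.log10(predicted / observed)))

auc_obs = [120.0, 45.0, 300.0, 80.0]      # assumed observed exposures
auc_pred = [100.0, 60.0, 250.0, 95.0]     # assumed model predictions
print(f"AAFE = {aafe(auc_pred, auc_obs):.2f}")
```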

  2. Thermal Model Predictions of Advanced Stirling Radioisotope Generator Performance

    Science.gov (United States)

    Wang, Xiao-Yen J.; Fabanich, William Anthony; Schmitz, Paul C.

    2014-01-01

    This paper presents recent thermal model results of the Advanced Stirling Radioisotope Generator (ASRG). The three-dimensional (3D) ASRG thermal power model was built using the Thermal Desktop(trademark) thermal analyzer. The model was correlated with ASRG engineering unit test data and ASRG flight unit predictions from Lockheed Martin's (LM's) I-deas(trademark) TMG thermal model. The auxiliary cooling system (ACS) of the ASRG is also included in the ASRG thermal model. The ACS is designed to remove waste heat from the ASRG so that it can be used to heat spacecraft components. The performance of the ACS is reported under nominal conditions and during a Venus flyby scenario. The results for the nominal case are validated with data from Lockheed Martin. Transient thermal analysis results of ASRG for a Venus flyby with a representative trajectory are also presented. In addition, model results of an ASRG mounted on a Cassini-like spacecraft with a sunshade are presented to show a way to mitigate the high temperatures of a Venus flyby. It was predicted that the sunshade can lower the temperature of the ASRG alternator by 20 C for the representative Venus flyby trajectory. The 3D model also was modified to predict generator performance after a single Advanced Stirling Convertor failure. The geometry of the Microtherm HT insulation block on the outboard side was modified to match deformation and shrinkage observed during testing of a prototypic ASRG test fixture by LM. Test conditions and test data were used to correlate the model by adjusting the thermal conductivity of the deformed insulation to match the post-heat-dump steady state temperatures. Results for these conditions showed that the performance of the still-functioning inboard ACS was unaffected.

  3. Evaluating Flight Crew Performance by a Bayesian Network Model

    Directory of Open Access Journals (Sweden)

    Wei Chen

    2018-03-01

    Full Text Available Flight crew performance is of great significance in keeping flights safe and sound. When evaluating crew performance, quantitative detailed behavior information may not be available. The present paper introduces the Bayesian Network to perform flight crew performance evaluation, which permits the utilization of multidisciplinary sources of objective and subjective information, despite sparse behavioral data. In this paper, the causal factors are selected based on the analysis of 484 aviation accidents caused by human factors. Then, a network termed the Flight Crew Performance Model is constructed. The Delphi technique helps to gather subjective data as a supplement to objective data from accident reports. The conditional probabilities are elicited by the leaky noisy MAX model. Two ways of inference for the BN, probability prediction and probabilistic diagnosis, are used, and some interesting conclusions are drawn, which could provide data support for making interventions for human error management in aviation safety.
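
    For binary variables, the leaky noisy-MAX used to elicit the conditional probabilities reduces to a leaky noisy-OR. The sketch below builds such a conditional probability table for a hypothetical "degraded crew performance" node; the cause names, their probabilities and the leak term are assumptions.

```python
from itertools import product

causes = {"fatigue": 0.7, "poor_CRM": 0.6, "weather_stress": 0.3}   # assumed p_i values
leak = 0.05                                                         # unmodelled causes

def p_effect(active):
    """P(degraded crew performance | set of active causes), leaky noisy-OR."""
    p_none = 1 - leak
    for cause, p in causes.items():
        if cause in active:
            p_none *= 1 - p            # each active cause independently fails with 1 - p
    return 1 - p_none

# Full conditional probability table over the three binary parents.
for states in product([False, True], repeat=len(causes)):
    active = {c for c, s in zip(causes, states) if s}
    print(sorted(active) or ["(none)"], round(p_effect(active), 3))
```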

  4. A model to describe the performance of the UASB reactor.

    Science.gov (United States)

    Rodríguez-Gómez, Raúl; Renman, Gunno; Moreno, Luis; Liu, Longcheng

    2014-04-01

    A dynamic model to describe the performance of the Upflow Anaerobic Sludge Blanket (UASB) reactor was developed. It includes dispersion, advection, and reaction terms, as well as the resistances through which the substrate passes before its biotransformation. The UASB reactor is viewed as several continuous stirred tank reactors connected in series. The good agreement between experimental and simulated results shows that the model is able to predict the performance of the UASB reactor (i.e. substrate concentration, biomass concentration, granule size, and height of the sludge bed).
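
    A hedged sketch of the tanks-in-series view of the sludge bed, with simple first-order substrate removal only (the paper's mass-transfer resistances, granule growth and biomass dynamics are not reproduced); flow, volumes and the rate constant are assumed.

```python
import numpy as np
from scipy.integrate import solve_ivp

n_tanks, Q, V, k = 5, 0.5, 2.0, 0.6     # tanks, flow (m3/h), volume per tank (m3), rate (1/h)
S_in = 2.0                              # influent substrate, kg COD/m3 (assumed)

def dSdt(t, S):
    inflow = np.concatenate(([S_in], S[:-1]))     # each tank is fed by the one below it
    return Q / V * (inflow - S) - k * S           # advection in/out plus reaction

sol = solve_ivp(dSdt, (0, 48), y0=np.zeros(n_tanks), max_step=0.5)
print("effluent substrate after 48 h:", round(sol.y[-1, -1], 3), "kg COD/m3")
```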

  5. Modeling the marketing strategy-performance relationship : towards an hierarchical marketing performance framework

    NARCIS (Netherlands)

    Huizingh, Eelko K.R.E.; Zengerink, Evelien

    2001-01-01

    Accurate measurement of marketing performance is an important topic for both marketing academics and marketing managers. Many researchers have recognized that marketing performance measurement should go beyond financial measurement. In this paper we propose a conceptual framework that models

  6. visCOS: An R-package to evaluate model performance of hydrological models

    Science.gov (United States)

    Klotz, Daniel; Herrnegger, Mathew; Wesemann, Johannes; Schulz, Karsten

    2016-04-01

    The evaluation of model performance is a central part of (hydrological) modelling. Much attention has been given to the development of evaluation criteria and diagnostic frameworks. (Klemeš, 1986; Gupta et al., 2008; among many others). Nevertheless, many applications exist for which objective functions do not yet provide satisfying summaries. Thus, the necessity to visualize results arises in order to explore a wider range of model capacities, be it strengths or deficiencies. Visualizations are usually devised for specific projects and these efforts are often not distributed to a broader community (e.g. via open source software packages). Hence, the opportunity to explicitly discuss a state-of-the-art presentation technique is often missed. We therefore present a comprehensive R-package for evaluating model performance by visualizing and exploring different aspects of hydrological time-series. The presented package comprises a set of useful plots and visualization methods, which complement existing packages, such as hydroGOF (Zambrano-Bigiarini et al., 2012). It is derived from practical applications of the hydrological models COSERO and COSEROreg (Kling et al., 2014). visCOS, providing an interface in R, represents an easy-to-use software package for visualizing and assessing model performance and can be implemented in the process of model calibration or model development. The package provides functions to load hydrological data into R, clean the data, process, visualize, explore and finally save the results in a consistent way. Together with an interactive zoom function of the time series, an online calculation of the objective functions for variable time-windows is included. Common hydrological objective functions, such as the Nash-Sutcliffe Efficiency and the Kling-Gupta Efficiency, can also be evaluated and visualized in different ways for defined sub-periods like hydrological years or seasonal sections. Many hydrologists use long-term water-balances as a
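
    The package itself is written in R; for reference, the two criteria named above can be computed generically as follows (the observed and simulated series are assumed example values).

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    """Kling-Gupta Efficiency (2009 formulation)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()            # variability ratio
    beta = sim.mean() / obs.mean()           # bias ratio
    return 1 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = [1.2, 3.4, 2.8, 5.1, 4.0]
sim = [1.0, 3.0, 3.2, 4.6, 4.3]
print(f"NSE = {nse(obs, sim):.2f}, KGE = {kge(obs, sim):.2f}")
```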

  7. Outdoor FSO Communications Under Fog: Attenuation Modeling and Performance Evaluation

    KAUST Repository

    Esmail, Maged Abdullah

    2016-07-18

    Fog is considered to be a primary challenge for free space optics (FSO) systems. It may cause attenuation of up to hundreds of decibels per kilometer. Hence, accurate modeling of fog attenuation will help telecommunication operators to engineer and appropriately manage their networks. In this paper, we examine fog measurement data coming from several locations in Europe and the United States and derive a unified channel attenuation model. Compared with existing attenuation models, the proposed model achieves an average root-mean-square error (RMSE) that is at least 9 dB lower. Moreover, we have investigated the statistical behavior of the channel and developed a probabilistic model under stochastic fog conditions. Furthermore, we studied the performance of the FSO system addressing various performance metrics, including signal-to-noise ratio (SNR), bit-error rate (BER), and channel capacity. Our results show that in communication environments with frequent fog, FSO is typically a short-range data transmission technology. Therefore, FSO will have its preferred market segment in future wireless fifth-generation/sixth-generation (5G/6G) networks having cell sizes that are lower than a 1-km diameter. Moreover, the results of our modeling and analysis can be applied in determining the switching/thresholding conditions in highly reliable hybrid FSO/radio-frequency (RF) networks.

  8. Effect of Using Extreme Years in Hydrologic Model Calibration Performance

    Science.gov (United States)

    Goktas, R. K.; Tezel, U.; Kargi, P. G.; Ayvaz, T.; Tezyapar, I.; Mesta, B.; Kentel, E.

    2017-12-01

    Hydrological models are useful in predicting and developing management strategies for controlling the system behaviour. Specifically, they can be used for evaluating streamflow at ungauged catchments and the effects of climate change or best management practices on water resources, or for identifying pollution sources in a watershed. This study is a part of a TUBITAK project named "Development of a geographical information system based decision-making tool for water quality management of Ergene Watershed using pollutant fingerprints". Within the scope of this project, the water resources in the Ergene Watershed are studied first. Stream gauges found in the basin are identified and daily streamflow measurements are obtained from the State Hydraulic Works of Turkey. The streamflow data are analysed using box-whisker plots, hydrographs and flow-duration curves, focusing on the identification of extreme periods, dry or wet. Then a hydrological model is developed for the Ergene Watershed using HEC-HMS in the Watershed Modeling System (WMS) environment. The model is calibrated for various time periods including dry and wet ones, and the performance of the calibration is evaluated using Nash-Sutcliffe Efficiency (NSE), correlation coefficient, percent bias (PBIAS) and root mean square error. It is observed that the calibration period affects the model performance, and the main purpose of the development of the hydrological model should guide calibration period selection. Acknowledgement: This study is funded by The Scientific and Technological Research Council of Turkey (TUBITAK) under Project Number 115Y064.

  9. The predictive performance and stability of six species distribution models.

    Science.gov (United States)

    Duan, Ren-Yan; Kong, Xiao-Quan; Huang, Min-Yi; Fan, Wei-Yi; Wang, Zhi-Gao

    2014-01-01

    Predicting species' potential geographical range by species distribution models (SDMs) is central to understanding their ecological requirements. However, the effects of using different modeling techniques need further investigation. In order to improve the prediction effect, we need to assess the predictive performance and stability of different SDMs. We collected the distribution data of five common tree species (Pinus massoniana, Betula platyphylla, Quercus wutaishanica, Quercus mongolica and Quercus variabilis) and simulated their potential distribution area using 13 environmental variables and six widely used SDMs: BIOCLIM, DOMAIN, MAHAL, RF, MAXENT, and SVM. Each model run was repeated 100 times (trials). We compared the predictive performance by testing the consistency between observations and simulated distributions and assessed the stability by the standard deviation, coefficient of variation, and the 99% confidence interval of Kappa and AUC values. The mean values of AUC and Kappa from MAHAL, RF, MAXENT, and SVM trials were similar and significantly higher than those from BIOCLIM and DOMAIN trials (p < 0.05). These four SDMs (MAHAL, RF, MAXENT, and SVM) had higher prediction accuracy, smaller confidence intervals, and were more stable and less affected by the random variable (randomly selected pseudo-absence points). According to the prediction performance and stability of SDMs, we can divide these six SDMs into two categories: a high performance and stability group including MAHAL, RF, MAXENT, and SVM, and a low performance and stability group consisting of BIOCLIM and DOMAIN. We highlight that choosing appropriate SDMs to address a specific problem is an important part of the modeling process.

  10. Physics based performance model of a UV missile seeker

    Science.gov (United States)

    James, I.

    2017-10-01

    Electro-optically (EO) guided surface-to-air missiles (SAMs) have developed to use ultraviolet (UV) wavebands supplementary to the more common infrared (IR) wavebands. Missiles such as the US Stinger have been around for some time; they have recently been joined by the Chinese FN-16 and the Russian SA-29 (Verba), and there is a much higher potential proliferation risk. The purpose of this paper is to introduce a first-principles, physics-based model of a typical seeker arrangement. The model is constructed from various calculations that aim to characterise the physical effects that will affect the performance of the system. Data have been gathered from a number of sources to provide realism to the variables within the model. It will be demonstrated that many of the variables have the power to dramatically alter the performance of the system as a whole. Further, data will be shown to illustrate the expected performance of a typical UV detector within a SAM in terms of detection range against a variety of target sizes. The trend in detection range against aircraft size and skin reflectivity will be shown to be non-linear; this is to be expected owing to the exponential decay of a signal passing through the atmosphere. Future work will validate the performance of the model against real-world performance data for cameras (when this is available) to ensure that it operates within acceptable errors.
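
    An illustrative detection-range calculation consistent with the exponential atmospheric decay mentioned above: the received signal is taken as proportional to target area times reflectivity times exp(-alpha*R)/R^2, and the range is found where this meets a detection threshold. The extinction coefficient, threshold and target parameters are assumptions, not values from the seeker model described in the paper.

```python
import numpy as np
from scipy.optimize import brentq

alpha = 0.4e-3            # atmospheric extinction coefficient, 1/m (assumed)
threshold = 1e-9          # minimum detectable signal, arbitrary units (assumed)

def received(R, area_m2, reflectivity):
    """Received UV signal versus range, up to an arbitrary constant."""
    return area_m2 * reflectivity * np.exp(-alpha * R) / R ** 2

def detection_range(area_m2, reflectivity):
    # Root of (signal - threshold), bracketed between 1 m and 50 km.
    return brentq(lambda R: received(R, area_m2, reflectivity) - threshold, 1.0, 50_000.0)

for area, rho in [(5.0, 0.05), (20.0, 0.05), (20.0, 0.20)]:
    print(f"area={area:5.1f} m2, reflectivity={rho:.2f} -> "
          f"detection range ~ {detection_range(area, rho) / 1000:.1f} km")
```

    Doubling the target area or reflectivity increases the range by much less than a factor of two, illustrating the non-linear trend noted in the abstract.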

  11. Performance of neutron kinetics models for ADS transient analyses

    International Nuclear Information System (INIS)

    Rineiski, A.; Maschek, W.; Rimpault, G.

    2002-01-01

    Within the framework of the SIMMER code development, neutron kinetics models for simulating transients and hypothetical accidents in advanced reactor systems, in particular in Accelerator Driven Systems (ADSs), have been developed at FZK/IKET in cooperation with CE Cadarache. SIMMER is a fluid-dynamics/thermal-hydraulics code, coupled with a structure model and a space-, time- and energy-dependent neutronics module for analyzing transients and accidents. The advanced kinetics models have also been implemented into KIN3D, a module of the VARIANT/TGV code (stand-alone neutron kinetics) for broadening application and for testing and benchmarking. In the paper, a short review of the SIMMER and KIN3D neutron kinetics models is given. Some typical transients related to ADS perturbations are analyzed. The general models of SIMMER and KIN3D are compared with more simple techniques developed in the context of this work to get a better understanding of the specifics of transients in subcritical systems and to estimate the performance of different kinetics options. These comparisons may also help in elaborating new kinetics models and extending existing computation tools for ADS transient analyses. The traditional point-kinetics model may give rather inaccurate transient reaction rate distributions in an ADS even if the material configuration does not change significantly. This inaccuracy is not related to the problem of choosing a 'right' weighting function: the point-kinetics model with any weighting function cannot take into account pronounced flux shape variations related to possible significant changes in the criticality level or to fast beam trips. To improve the accuracy of the point-kinetics option for slow transients, we have introduced a correction factor technique. The related analyses give a better understanding of 'long-timescale' kinetics phenomena in the subcritical domain and help to evaluate the performance of the quasi-static scheme in a particular case. One
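
    To make the beam-trip discussion concrete, a textbook one-delayed-group point-kinetics model of a source-driven subcritical core is sketched below. This is a generic illustration, not the SIMMER or KIN3D implementation, and all constants are assumed.

```python
import numpy as np
from scipy.integrate import solve_ivp

beta, Lambda, lam = 0.003, 1e-6, 0.08   # delayed fraction, generation time (s), decay (1/s)
rho = -0.01                             # subcritical reactivity (assumed)
S0 = 1e5                                # external spallation source strength (arbitrary)

def rhs(t, y, source):
    n, c = y
    dn = (rho - beta) / Lambda * n + lam * c + source
    dc = beta / Lambda * n - lam * c
    return [dn, dc]

# Steady state sustained by the source, then a beam trip at t = 1 s (source -> 0).
n0 = -S0 * Lambda / rho
c0 = beta / (Lambda * lam) * n0
before = solve_ivp(rhs, (0, 1), [n0, c0], args=(S0,), method="Radau")
after = solve_ivp(rhs, (1, 5), before.y[:, -1], args=(0.0,), method="Radau",
                  t_eval=[1.001, 1.1, 2.0, 5.0])
for t, n in zip(after.t, after.y[0]):
    print(f"t = {t:5.3f} s  n/n0 = {n / n0:.3f}")
```

    The prompt drop to the delayed-neutron-sustained level, followed by a slower decay, is the kind of fast flux-shape-free transient that point kinetics captures; the flux-shape changes discussed above are what it misses.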

  12. Positioning performance of the NTCM model driven by GPS Klobuchar model parameters

    Science.gov (United States)

    Hoque, Mohammed Mainul; Jakowski, Norbert; Berdermann, Jens

    2018-03-01

    Users of the Global Positioning System (GPS) utilize the Ionospheric Correction Algorithm (ICA), also known as the Klobuchar model, for correcting ionospheric signal delay or range error. Recently, we developed an ionosphere correction algorithm called the NTCM-Klobpar model for single frequency GNSS applications. The model is driven by a parameter computed from the GPS Klobuchar model and consequently can be used instead of the GPS Klobuchar model for ionospheric corrections. In the presented work we compare the positioning solutions obtained using NTCM-Klobpar with those using the Klobuchar model. Our investigation using worldwide ground GPS data from a quiet and a perturbed ionospheric and geomagnetic activity period of 17 days each shows that the 24-hour prediction performance of the NTCM-Klobpar is better than that of the GPS Klobuchar model in the global average. The root mean squared deviation of the 3D position errors is found to be about 0.24 and 0.45 m less for the NTCM-Klobpar compared to the GPS Klobuchar model during quiet and perturbed conditions, respectively. The presented algorithm has the potential to continuously improve the accuracy of GPS single frequency mass market devices with only a little software modification.

  13. Hybrid Building Performance Simulation Models for Industrial Energy Efficiency Applications

    Directory of Open Access Journals (Sweden)

    Peter Smolek

    2018-06-01

    Full Text Available In the challenge of achieving environmental sustainability, industrial production plants, as large contributors to the overall energy demand of a country, are prime candidates for applying energy efficiency measures. A modelling approach using cubes is used to decompose a production facility into manageable modules. All aspects of the facility are considered, classified into the building, energy system, production and logistics. This approach leads to specific challenges for building performance simulations since all parts of the facility are highly interconnected. To meet this challenge, models for the building, thermal zones, energy converters and energy grids are presented and the interfaces to the production and logistics equipment are illustrated. The advantages and limitations of the chosen approach are discussed. In an example implementation, the feasibility of the approach and models is shown. Different scenarios are simulated to highlight the models and the results are compared.

  14. Fracture modelling of a high performance armour steel

    Science.gov (United States)

    Skoglund, P.; Nilsson, M.; Tjernberg, A.

    2006-08-01

    The fracture characteristics of the high performance armour steel Armox 500T are investigated. Tensile mechanical experiments using samples with different notch geometries are used to investigate the effect of multi-axial stress states on the strain to fracture. The experiments are numerically simulated, and from the simulation the stress at the point of fracture initiation is determined as a function of strain; these data are then used to extract parameters for fracture models. A fracture model based on quasi-static experiments is suggested, and the model is tested against independent experiments performed at both static and dynamic loading. The results show that the fracture model gives reasonably good agreement between simulations and experiments at both static and dynamic loading conditions. This indicates that multi-axial loading is more important to the strain to fracture than the deformation rate in the investigated loading range. However, ongoing work will further characterise the fracture behaviour of Armox 500T.

  15. A PERFORMANCE MANAGEMENT MODEL FOR PHYSICAL ASSET MANAGEMENT

    Directory of Open Access Journals (Sweden)

    J.L. Jooste

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: There has been an emphasis shift from maintenance management towards asset management, where the focus is on reliable and operational equipment and on effective assets at optimum life-cycle costs. A challenge in the manufacturing industry is to develop an asset performance management model that is integrated with business processes and strategies. The authors developed the APM2 model to satisfy that requirement. The model has a generic reference structure and is supported by operational protocols to assist in operations management. It facilitates performance measurement, business integration and continuous improvement, whilst exposing industry to the latest developments in asset performance management.

    AFRIKAANSE OPSOMMING: There has been a shift in emphasis from maintenance management to asset management, with a focus on reliable and operational equipment as well as effective assets at optimum life-cycle cost. A challenge in the manufacturing industry is the development of a performance model for assets that is integrated with business processes and strategies. The authors developed the APM2 model to meet this need. The model has a generic reference structure, supported by operational instructions that promote operations management. It facilitates performance management, business integration and continuous improvement, while also exposing the industry to the latest developments in asset performance management.

  16. Product Data Model for Performance-driven Design

    Science.gov (United States)

    Hu, Guang-Zhong; Xu, Xin-Jian; Xiao, Shou-Ne; Yang, Guang-Wu; Pu, Fan

    2017-09-01

    When designing large-sized complex machinery products, the design focus is always on the overall performance; however, there is no established design theory and method that is performance-driven. In view of this deficiency of the existing design theory, and according to the performance features of complex mechanical products, performance indices are introduced into the traditional design theory of "Requirement-Function-Structure" to construct a new five-domain design theory of "Client Requirement-Function-Performance-Structure-Design Parameter". To support design practice based on this new theory, a product data model is established by using the performance indices and the mapping relationships between them and the other four domains. When the product data model is applied to high-speed train design, combining existing research results and relevant standards, the corresponding data model and its structure involving the five domains of high-speed trains are established; this can provide technical support for studying the relationships between typical performance indices and design parameters and for quickly achieving a high-speed train scheme design. The five domains provide a reference for the design specification and evaluation criteria of high-speed trains and a new idea for the train's parameter design.

  17. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  18. Summary of Calculation Performed with NPIC's New FGR Model

    International Nuclear Information System (INIS)

    Jiao Yongjun; Li Wenjie; Zhou Yi; Xing Shuo

    2013-01-01

    1. Introduction The NPIC modeling group has performed calculations on both real cases and idealized cases in the FUMEX II and III data packages. The performance code we used is COPERNIC 2.4, developed by AREVA, but a new FGR model has been added. Therefore, a comparison study has been made between the Bernard model (V2.2) and the new model, in order to evaluate the performance of the new model. As mentioned before, the focus of our study lies in thermal fission gas release, or more specifically the grain boundary bubble behaviors. 2. Calculation method There are some differences between the calculated burnup and the measured burnup in many real cases. Considering that FGR is significantly dependent on rod average burnup, a multiplicative factor on fuel rod linear power, i.e. FQE, is applied and adjusted in the calculations to ensure the calculated burnup generally equals the measured burnup. Also, a multiplicative factor on upper plenum volume, i.e. AOPL, is applied and adjusted in the calculations to ensure the calculated free volume equals the pre-irradiation data of total free volume in the rod. Cladding temperatures were entered if they were provided. Otherwise the cladding temperatures are calculated from the inlet coolant temperature. The results are presented in Excel form as an attachment to this paper, including thirteen real cases and three idealized cases. Three real cases (BK353, BK370, US PWR TSQ022) are excluded from validation of the new model, because the predicted athermal release is even greater than the measured release, which implies a negative thermal release. Obviously this is not reasonable for validation, but the results are also listed in Excel (sheet 'Cases excluded from validation'). 3. Results The results of 10 real cases are listed in sheet 'Steady case summary', which summarizes measured and predicted values of Bu and FGR for each case, and plots the M/P ratio of the FGR calculation by different models in COPERNIC. A statistical comparison was also made with three indexes, i
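
    The burnup-matching step described above, in which a multiplicative factor on rod linear power (FQE) is adjusted until the calculated rod-average burnup equals the measured value, amounts to a one-dimensional root-finding problem. The sketch below illustrates it with a bisection search; the calculated_burnup stand-in and all numerical values are hypothetical and do not represent COPERNIC.

```python
# Minimal sketch of the burnup-matching step: adjust a multiplicative factor
# on rod linear power (FQE) until the calculated rod-average burnup matches
# the measured value. `calculated_burnup` is a hypothetical stand-in for a
# fuel-performance code run; it is NOT the COPERNIC model.

def calculated_burnup(fqe, nominal_burnup=45.0):
    # Placeholder: assume burnup scales roughly linearly with the power factor.
    return nominal_burnup * fqe

def match_burnup(measured, lo=0.5, hi=1.5, tol=1e-4):
    """Bisection on FQE so that calculated burnup ~= measured burnup."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        diff = calculated_burnup(mid) - measured
        if abs(diff) < tol:
            return mid
        if diff > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    fqe = match_burnup(measured=42.3)  # MWd/kgU, illustrative value only
    print(f"FQE = {fqe:.4f}, burnup = {calculated_burnup(fqe):.2f} MWd/kgU")
```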

  19. Performance model for telehealth use in home health agencies.

    Science.gov (United States)

    Frey, Jocelyn; Harmonosky, Catherine M; Dansky, Kathryn H

    2005-10-01

    Increasingly, home health agencies (HHAs) are considering the value of implementing telehealth technology. However, questions arise concerning how to manage and use this technology to benefit patients, nurses, and the agency. Performance models will be beneficial to managers and decision makers in the home health field by providing quantitative information for present and future planning of staff and technology usage in the HHA. This paper presents a model that predicts the average daily census of the HHA as a function of statistically identified parameters. Average daily census was chosen as the outcome variable because it is a proxy measure of an agency's capacity. The model suggests that including a telehealth system in the HHA increases average daily census by 40%-90% depending on the number of nurse full-time equivalent(s) (FTEs) and amount of travel hours per month. The use of a home telecare system enhances HHA performance.
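
    The abstract describes a regression-style model in which average daily census is predicted from telehealth use, nurse FTEs, and monthly travel hours. A minimal ordinary-least-squares sketch of that kind of model is shown below; the predictor names, synthetic data, and fitted coefficients are illustrative assumptions, not the published model.

```python
import numpy as np

# Hypothetical agency-level observations: [telehealth (0/1), nurse FTEs,
# travel hours per month] -> average daily census. Values are synthetic.
X = np.array([
    [0, 10, 400],
    [0, 15, 550],
    [1, 10, 380],
    [1, 15, 500],
    [1, 20, 620],
])
y = np.array([60, 85, 95, 130, 170])

# Ordinary least squares with an intercept term.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
intercept, b_tele, b_fte, b_travel = coef

print(f"intercept={intercept:.1f}, telehealth={b_tele:.1f}, "
      f"FTE={b_fte:.1f}, travel={b_travel:.3f}")

# Predicted census for an agency with telehealth, 12 FTEs, 450 travel h/month.
x_new = np.array([1, 1, 12, 450])
print(f"predicted average daily census: {x_new @ coef:.1f}")
```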

  20. PHARAO laser source flight model: Design and performances

    Energy Technology Data Exchange (ETDEWEB)

    Lévèque, T., E-mail: thomas.leveque@cnes.fr; Faure, B.; Esnault, F. X.; Delaroche, C.; Massonnet, D.; Grosjean, O.; Buffe, F.; Torresi, P. [Centre National d’Etudes Spatiales, 18 avenue Edouard Belin, 31400 Toulouse (France); Bomer, T.; Pichon, A.; Béraud, P.; Lelay, J. P.; Thomin, S. [Sodern, 20 Avenue Descartes, 94451 Limeil-Brévannes (France); Laurent, Ph. [LNE-SYRTE, CNRS, UPMC, Observatoire de Paris, 61 avenue de l’Observatoire, 75014 Paris (France)

    2015-03-15

    In this paper, we describe the design and the main performances of the PHARAO laser source flight model. PHARAO is a laser cooled cesium clock specially designed for operation in space and the laser source is one of the main sub-systems. The flight model presented in this work is the first remote-controlled laser system designed for spaceborne cold atom manipulation. The main challenges arise from mechanical compatibility with space constraints, which impose a high level of compactness, a low electric power consumption, a wide range of operating temperature, and a vacuum environment. We describe the main functions of the laser source and give an overview of the main technologies developed for this instrument. We present some results of the qualification process. The characteristics of the laser source flight model, and their impact on the clock performances, have been verified in operational conditions.

  1. Ergonomic evaluation model of operational room based on team performance

    Directory of Open Access Journals (Sweden)

    YANG Zhiyi

    2017-05-01

    Full Text Available A theoretical calculation model based on the ergonomic evaluation of team performance was proposed in order to carry out the ergonomic evaluation of the layout design schemes of the action station in a multitasking operational room. This model was constructed in order to calculate and compare the theoretical value of team performance in multiple layout schemes by considering such substantial influential factors as frequency of communication, distance, angle, importance, human cognitive characteristics and so on. An experiment was finally conducted to verify the proposed model under the criteria of completion time and accuracy rating. As illustrated by the experiment results, the proposed approach is conducive to the prediction and ergonomic evaluation of the layout design schemes of the action station during early design stages, and provides a new theoretical method for the ergonomic evaluation, selection and optimization design of layout design schemes.
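
    The model above combines factors such as communication frequency, distance, angle, and importance into a theoretical team-performance value for each candidate layout. The weighted-sum sketch below shows one plausible way such factors could be combined; it is not the authors' formulation, and all link data and weights are hypothetical.

```python
# Hypothetical communication links for one layout of action stations:
# (frequency per hour, distance in m, relative angle in degrees, importance 0-1).
links = [
    (12, 1.5, 30, 0.9),
    (6, 3.0, 90, 0.6),
    (3, 4.5, 150, 0.4),
]

def link_cost(freq, dist, angle_deg, importance):
    """Higher cost = more effort: frequent, important links suffer more
    from large distances and awkward viewing/turning angles."""
    angle_penalty = 1.0 + angle_deg / 180.0
    return freq * importance * dist * angle_penalty

def layout_score(links):
    # Lower total cost -> better predicted team performance for this layout.
    return sum(link_cost(*lk) for lk in links)

print(f"layout cost: {layout_score(links):.1f}")
```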

  2. Thin film bulk acoustic wave devices : performance optimization and modeling

    OpenAIRE

    Pensala, Tuomas

    2011-01-01

    Thin film bulk acoustic wave (BAW) resonators and filters operating in the GHz range are used in mobile phones for the most demanding filtering applications and complement the surface acoustic wave (SAW) based filters. Their main advantages are small size and high performance at frequencies above 2 GHz. This work concentrates on the characterization, performance optimization, and modeling techniques of thin film BAW devices. Laser interferometric vibration measurements together with plat...

  3. Computational modelling of expressive music performance in hexaphonic guitar

    OpenAIRE

    Siquier, Marc

    2017-01-01

    Computational modelling of expressive music performance has been widely studied in the past. While previous work in this area has been mainly focused on classical piano music, there has been very little work on guitar music, and such work has focused on monophonic guitar playing. In this work, we present a machine learning approach to automatically generate expressive performances from non-expressive music scores for polyphonic guitar. We treated guitar as a hexaphonic instrument, obtaining ...

  4. A New Model to Simulate Energy Performance of VRF Systems

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Tianzhen; Pang, Xiufeng; Schetrit, Oren; Wang, Liping; Kasahara, Shinichi; Yura, Yoshinori; Hinokuma, Ryohei

    2014-03-30

    This paper presents a new model to simulate energy performance of variable refrigerant flow (VRF) systems in heat pump operation mode (either cooling or heating is provided but not simultaneously). The main improvement of the new model is the introduction of the evaporating and condensing temperature in the indoor and outdoor unit capacity modifier functions. The independent variables in the capacity modifier functions of the existing VRF model in EnergyPlus are mainly room wet-bulb temperature and outdoor dry-bulb temperature in cooling mode and room dry-bulb temperature and outdoor wet-bulb temperature in heating mode. The new approach allows compliance with different specifications of each indoor unit so that the modeling accuracy is improved. The new VRF model was implemented in a custom version of EnergyPlus 7.2. This paper first describes the algorithm for the new VRF model, which is then used to simulate the energy performance of a VRF system in a Prototype House in California that complies with the requirements of Title 24, the California Building Energy Efficiency Standards. The VRF system performance is then compared with three other types of HVAC systems: the Title 24-2005 Baseline system, the traditional High Efficiency system, and the EnergyStar Heat Pump system in three typical California climates: Sunnyvale, Pasadena and Fresno. Calculated energy savings from the VRF systems are significant. The HVAC site energy savings range from 51 to 85 percent, while the TDV (Time Dependent Valuation) energy savings range from 31 to 66 percent compared to the Title 24 Baseline Systems across the three climates. The largest energy savings are in Fresno climate followed by Sunnyvale and Pasadena. The paper discusses various characteristics of the VRF systems contributing to the energy savings. It should be noted that these savings are calculated using the Title 24 prototype House D under standard operating conditions. Actual performance of the VRF systems for real
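
    Capacity modifiers of the kind mentioned above are commonly expressed in EnergyPlus-style equipment models as biquadratic curves of two temperatures; the new model's change is to use evaporating and condensing temperature as the independent variables. The sketch below evaluates such a biquadratic modifier; the coefficient values and operating point are purely illustrative and are not the curves from EnergyPlus or the paper.

```python
def biquadratic(t1, t2, c):
    """Generic EnergyPlus-style biquadratic performance curve:
    value = c0 + c1*t1 + c2*t1^2 + c3*t2 + c4*t2^2 + c5*t1*t2."""
    return (c[0] + c[1] * t1 + c[2] * t1 ** 2 +
            c[3] * t2 + c[4] * t2 ** 2 + c[5] * t1 * t2)

# Hypothetical coefficients for a cooling capacity ratio modifier expressed
# in terms of evaporating (Te) and condensing (Tc) temperature, in deg C.
coeffs = [1.20, 0.020, 0.0002, -0.010, -0.0001, 0.0003]

rated_capacity_kw = 28.0
Te, Tc = 6.0, 45.0  # example operating condition (illustrative)
cap_ft = biquadratic(Te, Tc, coeffs)
available_capacity = rated_capacity_kw * cap_ft
print(f"capacity modifier = {cap_ft:.3f}, "
      f"available capacity = {available_capacity:.1f} kW")
```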

  5. A model of CCTV surveillance operator performance | Donald ...

    African Journals Online (AJOL)

    cognitive processes involved in visual search and monitoring – key activities of operators. The aim of this paper was to integrate the factors into a holistic theoretical model of performance for CCTV operators, drawing on areas such as vigilance, ...

  6. Item Response Theory Models for Performance Decline during Testing

    Science.gov (United States)

    Jin, Kuan-Yu; Wang, Wen-Chung

    2014-01-01

    Sometimes, test-takers may not be able to attempt all items to the best of their ability (with full effort) due to personal factors (e.g., low motivation) or testing conditions (e.g., time limit), resulting in poor performances on certain items, especially those located toward the end of a test. Standard item response theory (IRT) models fail to…

  7. Models for the financial-performance effects of Marketing

    NARCIS (Netherlands)

    Hanssens, D.M.; Dekimpe, Marnik; Wierenga, B.; van der Lans, R.

    We consider marketing-mix models that explicitly include financial performance criteria. These financial metrics are not only comparable across the marketing mix, they also relate well to investors’ evaluation of the firm. To that extent, we treat marketing as an investment in customer value

  8. Modeling performance measurement applications and implementation issues in DEA

    CERN Document Server

    Cook, Wade D

    2005-01-01

    Addresses advanced/new DEA methodology and techniques that are developed for modeling unique and new performance evaluation issues. Presents new DEA methodology and techniques via discussions on how to solve managerial problems. Provides an easy-to-use DEA software - DEAFrontier (www.deafrontier.com) - which is an excellent tool for both DEA researchers and practitioners.

  9. Performances of estimators of linear auto-correlated error model ...

    African Journals Online (AJOL)

    The performances of five estimators of linear models with autocorrelated disturbance terms are compared when the independent variable is exponential. The results reveal that for both small and large samples, the Ordinary Least Squares (OLS) compares favourably with the Generalized Least Squares (GLS) estimators in ...

  10. Towards a Social Networks Model for Online Learning & Performance

    Science.gov (United States)

    Chung, Kon Shing Kenneth; Paredes, Walter Christian

    2015-01-01

    In this study, we develop a theoretical model to investigate the association between social network properties, "content richness" (CR) in academic learning discourse, and performance. CR is the extent to which one contributes content that is meaningful, insightful and constructive to aid learning and by social network properties we…

  11. Quantitative modeling of human performance in complex, dynamic systems

    National Research Council Canada - National Science Library

    Baron, Sheldon; Kruser, Dana S; Huey, Beverly Messick

    1990-01-01

    ... Sheldon Baron, Dana S. Kruser, and Beverly Messick Huey, editors. Panel on Human Performance Modeling, Committee on Human Factors, Commission on Behavioral and Social Sciences and Education, National Research Council. NATIONAL ACADEMY PRESS, Washington, D.C. 1990.

  12. Performances Of Estimators Of Linear Models With Autocorrelated ...

    African Journals Online (AJOL)

    The performances of five estimators of linear models with Autocorrelated error terms are compared when the independent variable is autoregressive. The results reveal that the properties of the estimators when the sample size is finite is quite similar to the properties of the estimators when the sample size is infinite although ...

  13. Performances of estimators of linear model with auto-correlated ...

    African Journals Online (AJOL)

    Performances of estimators of linear model with auto-correlated error terms when the independent variable is normal. ... On the other hand, the same slope coefficients β , under Generalized Least Squares (GLS) decreased with increased autocorrelation when the sample size T is small. Journal of the Nigerian Association ...

  14. Performance Modeling for Heterogeneous Wireless Networks with Multiservice Overflow Traffic

    DEFF Research Database (Denmark)

    Huang, Qian; Ko, King-Tim; Iversen, Villy Bæk

    2009-01-01

    Multiservice loss analysis based on multi-dimensional Markov chain becomes intractable in these networks due to intensive computations required. This paper focuses on performance modeling for heterogeneous wireless networks based on a hierarchical overlay infrastructure. A method based on decomposition...

  15. Stutter-Step Models of Performance in School

    Science.gov (United States)

    Morgan, Stephen L.; Leenman, Theodore S.; Todd, Jennifer J.; Kentucky; Weeden, Kim A.

    2013-01-01

    To evaluate a stutter-step model of academic performance in high school, this article adopts a unique measure of the beliefs of 12,591 high school sophomores from the Education Longitudinal Study, 2002-2006. Verbatim responses to questions on occupational plans are coded to capture specific job titles, the listing of multiple jobs, and the listing…

  16. Performance in model transformations: experiments with ATL and QVT

    NARCIS (Netherlands)

    van Amstel, Marcel; Bosems, S.; Ivanov, Ivan; Ferreira Pires, Luis; Cabot, Jordi; Visser, Eelco

    Model transformations are increasingly being incorporated in software development processes. However, as systems being developed with transformations grow in size and complexity, the performance of the transformations tends to degrade. In this paper we investigate the factors that have an impact on

  17. Evaluation of the performance of DIAS ionospheric forecasting models

    Directory of Open Access Journals (Sweden)

    Tsagouri Ioanna

    2011-08-01

    Full Text Available Nowcasting and forecasting ionospheric products and services for the European region have been regularly provided since August 2006 through the European Digital upper Atmosphere Server (DIAS, http://dias.space.noa.gr). Currently, DIAS ionospheric forecasts are based on the online implementation of two models: (i) the solar wind driven autoregression model for ionospheric short-term forecast (SWIF), which combines historical and real-time ionospheric observations with solar-wind parameters obtained in real time at the L1 point from NASA ACE spacecraft, and (ii) the geomagnetically correlated autoregression model (GCAM), which is a time series forecasting method driven by a synthetic geomagnetic index. In this paper we investigate the operational ability and the accuracy of both DIAS models carrying out a metrics-based evaluation of their performance under all possible conditions. The analysis was established on the systematic comparison of models’ predictions with actual observations obtained over almost one solar cycle (1998–2007) at four European ionospheric locations (Athens, Chilton, Juliusruh and Rome) and on the comparison of the models’ performance against two simple prediction strategies, the median- and the persistence-based predictions during storm conditions. The results verify operational validity for both models and quantify their prediction accuracy under all possible conditions in support of operational applications but also of comparative studies in assessing or expanding the current ionospheric forecasting capabilities.
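
    The kind of metrics-based comparison against persistence- and median-based reference predictions described above can be sketched with a simple skill score (relative RMSE improvement). The time series, the pseudo-forecast, and the 27-value running median below are synthetic placeholders, not DIAS data or the exact metrics used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hourly foF2-like observations and a synthetic model forecast.
hours = np.arange(24 * 30)
obs = 6 + 2 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.3, hours.size)
model = obs + rng.normal(0, 0.25, hours.size)  # imperfect model forecast

# Reference prediction strategies: persistence (previous value) and a crude
# running median of the preceding values.
persistence = np.roll(obs, 1)
median_ref = np.array([np.median(obs[max(0, i - 27):i + 1])
                       for i in range(obs.size)])

def rmse(pred, truth):
    return float(np.sqrt(np.mean((pred - truth) ** 2)))

def skill(pred, ref, truth):
    """Skill score: 1 = perfect, 0 = no better than the reference strategy."""
    return 1.0 - rmse(pred, truth) / rmse(ref, truth)

print(f"skill vs persistence: {skill(model[1:], persistence[1:], obs[1:]):.2f}")
print(f"skill vs median:      {skill(model, median_ref, obs):.2f}")
```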

  18. 3D Massive MIMO Systems: Channel Modeling and Performance Analysis

    KAUST Repository

    Nadeem, Qurrat-Ul-Ain

    2015-03-01

    Multiple-input-multiple-output (MIMO) systems of current LTE releases are capable of adaptation in the azimuth only. More recently, the trend is to enhance the system performance by exploiting the channel's degrees of freedom in the elevation through the dynamic adaptation of the vertical antenna beam pattern. This necessitates the derivation and characterization of three-dimensional (3D) channels. Over the years, channel models have evolved to address the challenges of wireless communication technologies. In parallel to theoretical studies on channel modeling, many standardized channel models like COST-based models, 3GPP SCM, WINNER, ITU have emerged that act as references for industries and telecommunication companies to assess system-level and link-level performances of advanced signal processing techniques over real-like channels. Given the existing channels are only two dimensional (2D) in nature, a large effort in channel modeling is needed to study the impact of the channel component in the elevation direction. The first part of this work sheds light on the current 3GPP activity around 3D channel modeling and beamforming, an aspect that to our knowledge has not been extensively covered by a research publication. The standardized MIMO channel model is presented, that incorporates both the propagation effects of the environment and the radio effects of the antennas. In order to facilitate future studies on the use of 3D beamforming, the main features of the proposed 3D channel model are discussed. A brief overview of the future 3GPP 3D channel model being outlined for the next generation of wireless networks is also provided. In the subsequent part of this work, we present an information-theoretic channel model for MIMO systems that supports the elevation dimension. The model is based on the principle of maximum entropy, which enables us to determine the distribution of the channel matrix consistent with the prior information on the angles of departure and

  19. Performance of GeantV EM Physics Models

    Science.gov (United States)

    Amadio, G.; Ananya, A.; Apostolakis, J.; Aurora, A.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, P.; Carminati, F.; Cosmo, G.; Duhem, L.; Elvira, D.; Folger, G.; Gheata, A.; Gheata, M.; Goulas, I.; Iope, R.; Jun, S. Y.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.; Zhang, Y.

    2017-10-01

    The recent progress in parallel hardware architectures with deeper vector pipelines or many-cores technologies brings opportunities for HEP experiments to take advantage of SIMD and SIMT computing models. Launched in 2013, the GeantV project studies performance gains in propagating multiple particles in parallel, improving instruction throughput and data locality in HEP event simulation on modern parallel hardware architecture. Due to the complexity of geometry description and physics algorithms of a typical HEP application, performance analysis is indispensable in identifying factors limiting parallel execution. In this report, we will present design considerations and preliminary computing performance of GeantV physics models on coprocessors (Intel Xeon Phi and NVidia GPUs) as well as on mainstream CPUs.

  20. Performance modelling for product development of advanced window systems

    DEFF Research Database (Denmark)

    Appelfeld, David

    The research presented in this doctoral thesis shows how the product development (PD) of Complex Fenestration Systems (CFSs) can be facilitated by computer-based analysis to improve the energy efficiency of fenestration systems as well as to improve the indoor environment. The first chapter defines...... and methods, which can address interrelated performance parameters of CFS, are sought. It is possible to evaluate such systems by measurements; however, the high cost and complexity of the measurements are limiting factors. The studies in this thesis confirmed that the results from the performance measurements...... of CFSs can be interpreted by simulations and hence simulations can be used for the performance analysis of new CFSs. An advanced simulation model must often be developed and needs to be validated by measurements before the model can be reused. The validation of simulations against the measurements proved...

  1. Modeling the seakeeping performance of luxury cruise ships

    Science.gov (United States)

    Cao, Yu; Yu, Bao-Jun; Wang, Jian-Fang

    2010-09-01

    The seakeeping performance of a luxury cruise ship was evaluated during the concept design phase. By comparing numerical predictions based on 3-D linear potential flow theory in the frequency domain with the results of model tests, it was shown that the 3-D method predicted the seakeeping performance of the luxury cruise ship well. Based on the model, the seakeeping features of the luxury cruise ship were analyzed, and the influence of changes to the primary design parameters (center of gravity, inertial radius, etc.) was then examined. Based on the results, suggestions were proposed to improve the choice of parameters for luxury cruise ships during the concept design phase and thereby improve their seakeeping performance.

  2. Performance Models and Risk Management in Communications Systems

    CERN Document Server

    Harrison, Peter; Rüstem, Berç

    2011-01-01

    This volume covers recent developments in the design, operation, and management of telecommunication and computer network systems in performance engineering and addresses issues of uncertainty, robustness, and risk. Uncertainty regarding loading and system parameters leads to challenging optimization and robustness issues. Stochastic modeling combined with optimization theory ensures the optimum end-to-end performance of telecommunication or computer network systems. In view of the diverse design options possible, supporting models have many adjustable parameters and choosing the best set for a particular performance objective is delicate and time-consuming. An optimization based approach determines the optimal possible allocation for these parameters. Researchers and graduate students working at the interface of telecommunications and operations research will benefit from this book. Due to the practical approach, this book will also serve as a reference tool for scientists and engineers in telecommunication ...

  3. Performance of GeantV EM Physics Models

    CERN Document Server

    Amadio, G; Apostolakis, J; Aurora, A; Bandieramonte, M; Bhattacharyya, A; Bianchini, C; Brun, R; Canal P; Carminati, F; Cosmo, G; Duhem, L; Elvira, D; Folger, G; Gheata, A; Gheata, M; Goulas, I; Iope, R; Jun, S Y; Lima, G; Mohanty, A; Nikitina, T; Novak, M; Pokorski, W; Ribon, A; Seghal, R; Shadura, O; Vallecorsa, S; Wenzel, S; Zhang, Y

    2017-01-01

    The recent progress in parallel hardware architectures with deeper vector pipelines or many-cores technologies brings opportunities for HEP experiments to take advantage of SIMD and SIMT computing models. Launched in 2013, the GeantV project studies performance gains in propagating multiple particles in parallel, improving instruction throughput and data locality in HEP event simulation on modern parallel hardware architecture. Due to the complexity of geometry description and physics algorithms of a typical HEP application, performance analysis is indispensable in identifying factors limiting parallel execution. In this report, we will present design considerations and preliminary computing performance of GeantV physics models on coprocessors (Intel Xeon Phi and NVidia GPUs) as well as on mainstream CPUs.

  4. Performance of GeantV EM Physics Models

    Energy Technology Data Exchange (ETDEWEB)

    Amadio, G.; et al.

    2016-10-14

    The recent progress in parallel hardware architectures with deeper vector pipelines or many-cores technologies brings opportunities for HEP experiments to take advantage of SIMD and SIMT computing models. Launched in 2013, the GeantV project studies performance gains in propagating multiple particles in parallel, improving instruction throughput and data locality in HEP event simulation on modern parallel hardware architecture. Due to the complexity of geometry description and physics algorithms of a typical HEP application, performance analysis is indispensable in identifying factors limiting parallel execution. In this report, we will present design considerations and preliminary computing performance of GeantV physics models on coprocessors (Intel Xeon Phi and NVidia GPUs) as well as on mainstream CPUs.

  5. Electrical circuit models for performance modeling of Lithium-Sulfur batteries

    DEFF Research Database (Denmark)

    Knap, Vaclav; Stroe, Daniel Ioan; Teodorescu, Remus

    2015-01-01

    Energy storage technologies such as Lithium-ion (Li-ion) batteries are widely used in the present effort to move towards more ecological solutions in sectors like transportation or renewable-energy integration. However, today's Li-ion batteries are reaching their limits and not all demands...... of the industry are met yet. Therefore, researchers focus on alternative battery chemistries as Lithium-Sulfur (Li-S), which have a huge potential due to their high theoretical specific capacity (approx. 1675 Ah/kg) and theoretical energy density of almost 2600 Wh/kg. To analyze the suitability of this new...... emerging technology for various applications, there is a need for Li-S battery performance model; however, developing such models represents a challenging task due to batteries' complex ongoing chemical reactions. Therefore, the literature review was performed to summarize electrical circuit models (ECMs...
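
    Electrical circuit models of the type surveyed here usually represent the cell as an open-circuit-voltage source in series with an ohmic resistance and one or more RC pairs. The discrete-time, single-RC Thevenin-style sketch below illustrates the general idea; the OCV curve and all parameter values are hypothetical and are not fitted to Li-S cell data.

```python
import numpy as np

# Hypothetical Li-S-like parameters (not fitted to real cell data).
capacity_ah = 3.0
r0 = 0.05              # ohmic resistance [ohm]
r1, c1 = 0.03, 800.0   # single RC pair [ohm, F]

def ocv(soc):
    """Very rough two-plateau open-circuit-voltage curve (illustrative only)."""
    return 2.1 + 0.15 * soc + 0.1 * np.tanh(10 * (soc - 0.8))

def simulate(current_a, dt=1.0, t_end=3600.0, soc0=1.0):
    """Constant-current discharge of a 1-RC Thevenin equivalent circuit."""
    soc, v_rc, log = soc0, 0.0, []
    decay = np.exp(-dt / (r1 * c1))
    for t in np.arange(0.0, t_end, dt):
        soc -= current_a * dt / (capacity_ah * 3600.0)
        # Exact discretization of C1 dv/dt = i - v/R1 for constant current.
        v_rc = v_rc * decay + r1 * (1 - decay) * current_a
        v_term = ocv(soc) - r0 * current_a - v_rc
        log.append((t, soc, v_term))
    return log

last_t, last_soc, last_v = simulate(current_a=1.5)[-1]
print(f"after {last_t:.0f}s: SOC={last_soc:.2f}, terminal voltage={last_v:.3f} V")
```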

  6. Model tests on dynamic performance of RC shear walls

    International Nuclear Information System (INIS)

    Nagashima, Toshio; Shibata, Akenori; Inoue, Norio; Muroi, Kazuo.

    1991-01-01

    For the inelastic dynamic response analysis of a reactor building subjected to earthquakes, it is essentially important to properly evaluate its restoring force characteristics under dynamic loading condition and its damping performance. Reinforced concrete shear walls are the main structural members of a reactor building, and dominate its seismic behavior. In order to obtain the basic information on the dynamic restoring force characteristics and damping performance of shear walls, the dynamic test using a large shaking table, static displacement control test and the pseudo-dynamic test on the models of a shear wall were conducted. In the dynamic test, four specimens were tested on a large shaking table. In the static test, four specimens were tested, and in the pseudo-dynamic test, three specimens were tested. These tests are outlined. The results of these tests were compared, placing emphasis on the restoring force characteristics and damping performance of the RC wall models. The strength was higher in the dynamic test models than in the static test models mainly due to the effect of loading rate. (K.I.)

  7. A Practical Model to Perform Comprehensive Cybersecurity Audits

    Directory of Open Access Journals (Sweden)

    Regner Sabillon

    2018-03-01

    Full Text Available These days organizations continually face being targets of cyberattacks and cyberthreats; the sophistication and complexity of modern cyberattacks and the modus operandi of cybercriminals, including Techniques, Tactics and Procedures (TTP), keep growing at unprecedented rates. Cybercriminals are always adopting new strategies to plan and launch cyberattacks based on existing cybersecurity vulnerabilities and exploiting end users by using social engineering techniques. Cybersecurity audits are extremely important to verify that information security controls are in place and to detect weaknesses stemming from nonexistent or obsolete cybersecurity controls. This article presents an innovative and comprehensive cybersecurity audit model. The CyberSecurity Audit Model (CSAM) can be implemented to perform internal or external cybersecurity audits. This model can be used to perform single cybersecurity audits or can be part of any corporate audit program to improve cybersecurity controls. Any information security or cybersecurity audit team has the option either to perform a full audit of all cybersecurity domains or to select specific domains to audit certain areas that need control verification and hardening. The CSAM has 18 domains; Domain 1 is specific for Nation States and Domains 2-18 can be implemented at any organization. The organization can be any small, medium or large enterprise; the model is also applicable to any Non-Profit Organization (NPO).

  8. Instruction-level performance modeling and characterization of multimedia applications

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Y. [Los Alamos National Lab., NM (United States). Scientific Computing Group; Cameron, K.W. [Louisiana State Univ., Baton Rouge, LA (United States). Dept. of Computer Science

    1999-06-01

    One of the challenges for characterizing and modeling realistic multimedia applications is the lack of access to source codes. On-chip performance counters effectively resolve this problem by monitoring run-time behaviors at the instruction level. This paper presents a novel technique of characterizing and modeling workloads at the instruction level for realistic multimedia applications using hardware performance counters. A variety of instruction counts are collected from some multimedia applications, such as RealPlayer, GSM Vocoder, MPEG encoder/decoder, and speech synthesizer. These instruction counts can be used to form a set of abstract characteristic parameters directly related to a processor's architectural features. Based on microprocessor architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. Meanwhile, the bottleneck estimation can provide suggestions about viable architectural/functional improvement for certain workloads. The biggest advantage of this new characterization technique is a better understanding of processor utilization efficiency and architectural bottleneck for each application. This technique also provides predictive insight into future architectural enhancements and their effect on current codes. In this paper the authors also attempt to model architectural effects on processor utilization without memory influence. They derive formulas for calculating CPI_0, the CPI without memory effect, and they quantify utilization of architectural parameters. These equations are architecturally diagnostic and predictive in nature. Results provide promise in code characterization and empirical/analytical modeling.
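
    The counter-based characterization described above can be sketched as follows: given per-class instruction counts from hardware counters and assumed per-class issue costs, one can estimate a memory-free CPI (CPI_0) and subtract it from the measured CPI to isolate memory stalls. The counter names, per-class costs, and figures below are illustrative assumptions, not the authors' formulas.

```python
# Illustrative instruction counts, as might be read from hardware counters.
counts = {
    "int_alu": 4.0e9,
    "fp": 1.2e9,
    "load_store": 2.5e9,
    "branch": 0.8e9,
}

# Assumed average issue cost (cycles) per instruction class on an ideal
# memory system; these are made-up values for illustration only.
cost = {"int_alu": 0.5, "fp": 1.0, "load_store": 0.7, "branch": 0.9}

total_instructions = sum(counts.values())
cpi0 = sum(counts[k] * cost[k] for k in counts) / total_instructions

measured_cycles = 8.5e9                 # e.g. from a cycle counter
measured_cpi = measured_cycles / total_instructions
memory_stall_cpi = measured_cpi - cpi0  # portion attributable to the memory hierarchy

print(f"CPI_0 (no memory effect): {cpi0:.3f}")
print(f"measured CPI:             {measured_cpi:.3f}")
print(f"memory stall CPI:         {memory_stall_cpi:.3f}")
```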

  9. Modelling of performance of the ATLAS SCT detector

    International Nuclear Information System (INIS)

    Kazi, S.

    2000-01-01

    Full text: The ATLAS detector being built at LHC will use the SCT (semiconductor tracking) module for particle tracking in the inner core of the detector. An analytical/numerical model of the discriminator threshold dependence and the temperature dependence of the SCT module was derived. Measurements were conducted on the performance of the SCT module versus temperature and these results were compared with the predictions made by the model. The effect of radiation damage on the SCT detector was also investigated. The detector will operate for approximately 10 years, so a study was carried out on the effects of 10 years of radiation exposure on the SCT.

  10. Rethinking board role performance: Towards an integrative model

    Directory of Open Access Journals (Sweden)

    Babić Verica M.

    2011-01-01

    Full Text Available This research focuses on the board role evolution analysis which took place simultaneously with the development of different corporate governance theories and perspectives. The purpose of this paper is to provide understanding of key factors that make a board effective in the performance of its role. We argue that analysis of board role performance should incorporate both structural and process variables. This paper’s contribution is the development of an integrative model that aims to establish the relationship between the board structure and processes on the one hand, and board role performance on the other.

  11. Human performance models for computer-aided engineering

    Science.gov (United States)

    Elkind, Jerome I. (Editor); Card, Stuart K. (Editor); Hochberg, Julian (Editor); Huey, Beverly Messick (Editor)

    1989-01-01

    This report discusses a topic important to the field of computational human factors: models of human performance and their use in computer-based engineering facilities for the design of complex systems. It focuses on a particular human factors design problem -- the design of cockpit systems for advanced helicopters -- and on a particular aspect of human performance -- vision and related cognitive functions. By focusing in this way, the authors were able to address the selected topics in some depth and develop findings and recommendations that they believe have application to many other aspects of human performance and to other design domains.

  12. The integration of intrapreneurship into a performance management model

    Directory of Open Access Journals (Sweden)

    Thabo WL Foba

    2007-02-01

    Full Text Available This study aimed to investigate the feasibility of using the dynamics of intrapreneurship to develop a new generation performance management model based on the structural dynamics of the Balanced Score Card approach. The literature survey covered entrepreneurship, from which the construct, intrapreneurship, was synthesized. Reconstructive logic and Hermeneutic methodology were used in studying the performance management systems and the Balanced Score Card approach. The dynamics were then integrated into a new approach for the management of performance of intrapreneurial employees in the corporate environment. An unstructured opinion survey followed: a sample of intrapreneurship students evaluated and validated the model’s conceptual feasibility and probable practical value.

  13. Model Checking for a Class of Performance Properties of Fluid Stochastic Models

    NARCIS (Netherlands)

    Bujorianu, L.M.; Bujorianu, M.C.; Horváth, A.; Telek, M.

    2006-01-01

    Recently, there has been an explosive development of fluid approaches to computer and distributed systems. These approaches are inherently stochastic and generate continuous state space models. Usually, the performance measures for these systems are defined using probabilities of reaching certain sets

  14. Cooperative cognitive radio networking system model, enabling techniques, and performance

    CERN Document Server

    Cao, Bin; Mark, Jon W

    2016-01-01

    This SpringerBrief examines the active cooperation between users of Cooperative Cognitive Radio Networking (CCRN), exploring the system model, enabling techniques, and performance. The brief provides a systematic study on active cooperation between primary users and secondary users, i.e., (CCRN), followed by the discussions on research issues and challenges in designing spectrum-energy efficient CCRN. As an effort to shed light on the design of spectrum-energy efficient CCRN, they model the CCRN based on orthogonal modulation and orthogonally dual-polarized antenna (ODPA). The resource allocation issues are detailed with respect to both models, in terms of problem formulation, solution approach, and numerical results. Finally, the optimal communication strategies for both primary and secondary users to achieve spectrum-energy efficient CCRN are analyzed.

  15. Performance Evaluation of 3d Modeling Software for Uav Photogrammetry

    Science.gov (United States)

    Yanagi, H.; Chikatsu, H.

    2016-06-01

    UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAV and freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in the fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed. Consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modelling software are black box algorithms. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motive, Smart3DCapture and Pix4Dmapper were downloaded from the Internet and commercial software PhotoScan was also employed; investigations were performed in this paper using check points and images obtained from UAV.
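
    Accuracy evaluation against 3D coordinate check points, as used in the study above, reduces to differencing model-derived coordinates from surveyed ones and summarizing the residuals. The sketch below computes per-axis and 3D RMSE; all coordinates are synthetic placeholders, not values from the paper.

```python
import numpy as np

# Synthetic check points: surveyed ground truth vs. coordinates extracted
# from the reconstructed 3D model (X, Y, Z in metres). Values are illustrative.
surveyed = np.array([
    [100.00, 200.00, 50.00],
    [150.00, 210.00, 52.00],
    [120.00, 260.00, 48.50],
    [180.00, 240.00, 51.20],
])
modelled = surveyed + np.array([
    [0.02, -0.03, 0.06],
    [-0.01, 0.02, -0.04],
    [0.03, 0.01, 0.08],
    [-0.02, -0.02, 0.05],
])

residuals = modelled - surveyed
rmse_xyz = np.sqrt(np.mean(residuals ** 2, axis=0))   # per-axis RMSE
rmse_3d = np.sqrt(np.mean(np.sum(residuals ** 2, axis=1)))

print("RMSE X/Y/Z [m]:", np.round(rmse_xyz, 3))
print(f"3D RMSE [m]: {rmse_3d:.3f}")
```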

  16. PERFORMANCE EVALUATION OF 3D MODELING SOFTWARE FOR UAV PHOTOGRAMMETRY

    Directory of Open Access Journals (Sweden)

    H. Yanagi

    2016-06-01

    Full Text Available UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAV and freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in the fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed. Consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modelling software are black box algorithms. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motive, Smart3DCapture and Pix4Dmapper were downloaded from the Internet and commercial software PhotoScan was also employed; investigations were performed in this paper using check points and images obtained from UAV.

  17. Modeling and design of a high-performance hybrid actuator

    Science.gov (United States)

    Aloufi, Badr; Behdinan, Kamran; Zu, Jean

    2016-12-01

    This paper presents the model and design of a novel hybrid piezoelectric actuator which provides high active and passive performances for smart structural systems. The actuator is composed of a pair of curved pre-stressed piezoelectric actuators, the commercially available so-called THUNDER actuators, installed opposite each other using two clamping mechanisms constructed of in-plane fixable hinges, grippers and solid links. A full mathematical model is developed to describe the active and passive dynamics of the actuator and investigate the effects of its geometrical parameters on the dynamic stiffness, free displacement and blocked force properties. Among the literature that deals with piezoelectric actuators in which THUNDER elements are used as a source of electromechanical power, the proposed study is unique in that it presents a mathematical model that has the ability to predict the actuator characteristics and capture other phenomena, such as resonances, mode shapes, phase shifts, dips, etc. For model validation, the measurements of the free dynamic response per unit voltage and passive acceleration transmissibility of a particular actuator design are used to check the accuracy of the results predicted by the model. The results reveal that there is a good agreement between the model and experiment. Another experiment is performed to test the linearity of the actuator system by examining the variation of the output dynamic responses with varying forces and voltages at different frequencies. From the results, it can be concluded that the actuator acts approximately as a linear system at frequencies up to 1000 Hz. A parametric study is carried out here by applying the developed model to analyze the influence of the geometrical parameters of the fixable hinges on the active and passive actuator properties. The model predictions in the frequency range of 0-1000 Hz show that the hinge thickness, radius, and opening angle parameters have great effects on the frequency dynamic

  18. Radionuclide release rates from spent fuel for performance assessment modeling

    International Nuclear Information System (INIS)

    Curtis, D.B.

    1994-01-01

    In a scenario of aqueous transport from a high-level radioactive waste repository, the concentration of radionuclides in water in contact with the waste constitutes the source term for transport models, and as such represents a fundamental component of all performance assessment models. Many laboratory experiments have been done to characterize release rates and understand processes influencing radionuclide release rates from irradiated nuclear fuel. Natural analogues of these waste forms have been studied to obtain information regarding the long-term stability of potential waste forms in complex natural systems. This information from diverse sources must be brought together to develop and defend methods used to define source terms for performance assessment models. In this manuscript examples of measures of radionuclide release rates from spent nuclear fuel or analogues of nuclear fuel are presented. Each example represents a very different approach to obtaining a numerical measure and each has its limitations. There is no way to obtain an unambiguous measure of this or any parameter used in performance assessment codes for evaluating the effects of processes operative over many millennia. The examples are intended to suggest by example that in the absence of the ability to evaluate accuracy and precision, consistency of a broadly based set of data can be used as circumstantial evidence to defend the choice of parameters used in performance assessments

  19. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  20. A refined index of model performance: a rejoinder

    Science.gov (United States)

    Legates, David R.; McCabe, Gregory J.

    2013-01-01

    Willmott et al. [Willmott CJ, Robeson SM, Matsuura K. 2012. A refined index of model performance. International Journal of Climatology, forthcoming. DOI:10.1002/joc.2419.] recently suggest a refined index of model performance (dr) that they purport to be superior to other methods. Their refined index ranges from − 1.0 to 1.0 to resemble a correlation coefficient, but it is merely a linear rescaling of our modified coefficient of efficiency (E1) over the positive portion of the domain of dr. We disagree with Willmott et al. (2012) that dr provides a better interpretation; rather, E1 is more easily interpreted such that a value of E1 = 1.0 indicates a perfect model (no errors) while E1 = 0.0 indicates a model that is no better than the baseline comparison (usually the observed mean). Negative values of E1 (and, for that matter, dr McCabe [Legates DR, McCabe GJ. 1999. Evaluating the use of “goodness-of-fit” measures in hydrologic and hydroclimatic model validation. Water Resources Research 35(1): 233-241.] and Schaefli and Gupta [Schaefli B, Gupta HV. 2007. Do Nash values have value? Hydrological Processes 21: 2075-2080. DOI: 10.1002/hyp.6825.]. This important discussion focuses on the appropriate baseline comparison to use, and why the observed mean often may be an inadequate choice for model evaluation and development.
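
    For reference, the modified coefficient of efficiency and the refined index can be written down from their published definitions, which makes the rescaling relationship easy to check numerically: on the positive portion of dr's range, dr = (E1 + 1)/2. The sketch below computes both; the sample data are arbitrary.

```python
import numpy as np

def e1(obs, pred):
    """Modified coefficient of efficiency (Legates & McCabe 1999):
    E1 = 1 - sum|O - P| / sum|O - mean(O)|."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return 1.0 - np.sum(np.abs(obs - pred)) / np.sum(np.abs(obs - obs.mean()))

def dr(obs, pred, c=2.0):
    """Refined index of agreement (Willmott et al. 2012)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    a = np.sum(np.abs(pred - obs))
    b = c * np.sum(np.abs(obs - obs.mean()))
    return 1.0 - a / b if a <= b else b / a - 1.0

obs = [2.1, 3.4, 4.8, 5.0, 6.3, 7.9]
pred = [2.4, 3.1, 4.5, 5.6, 6.0, 7.5]

print(f"E1 = {e1(obs, pred):.3f}")
print(f"dr = {dr(obs, pred):.3f}")
# On the positive portion of dr's range, dr = (E1 + 1) / 2, a linear rescaling.
print(f"(E1 + 1) / 2 = {(e1(obs, pred) + 1) / 2:.3f}")
```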

  1. The predictive performance and stability of six species distribution models.

    Directory of Open Access Journals (Sweden)

    Ren-Yan Duan

    Full Text Available Predicting species' potential geographical range by species distribution models (SDMs) is central to understanding their ecological requirements. However, the effects of using different modeling techniques need further investigation. In order to improve the prediction effect, we need to assess the predictive performance and stability of different SDMs. We collected the distribution data of five common tree species (Pinus massoniana, Betula platyphylla, Quercus wutaishanica, Quercus mongolica and Quercus variabilis) and simulated their potential distribution area using 13 environmental variables and six widely used SDMs: BIOCLIM, DOMAIN, MAHAL, RF, MAXENT, and SVM. Each model run was repeated 100 times (trials). We compared the predictive performance by testing the consistency between observations and simulated distributions and assessed the stability by the standard deviation, coefficient of variation, and the 99% confidence interval of Kappa and AUC values. The mean values of AUC and Kappa from MAHAL, RF, MAXENT, and SVM trials were similar and significantly higher than those from BIOCLIM and DOMAIN trials (p<0.05), while the associated standard deviations and coefficients of variation were larger for BIOCLIM and DOMAIN trials (p<0.05), and the 99% confidence intervals for AUC and Kappa values were narrower for MAHAL, RF, MAXENT, and SVM. Compared to BIOCLIM and DOMAIN, the other SDMs (MAHAL, RF, MAXENT, and SVM) had higher prediction accuracy, smaller confidence intervals, and were more stable and less affected by the random variable (randomly selected pseudo-absence points). According to the prediction performance and stability of SDMs, we can divide these six SDMs into two categories: a high performance and stability group including MAHAL, RF, MAXENT, and SVM, and a low performance and stability group consisting of BIOCLIM and DOMAIN. We highlight that choosing appropriate SDMs to address a specific problem is an important part of the modeling process.
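
    The performance-and-stability comparison above rests on repeating each model run many times and summarizing AUC and Kappa across trials (mean, standard deviation, coefficient of variation, 99% confidence interval). The sketch below illustrates only that summarization step, using scikit-learn metrics on a synthetic pseudo-model; the data, trial count, and threshold are placeholders, not the SDM runs from the study.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, cohen_kappa_score

rng = np.random.default_rng(42)
n_trials, n_points = 100, 500

aucs, kappas = [], []
for _ in range(n_trials):
    # Synthetic presence/absence labels and a pseudo-model's suitability scores.
    y_true = rng.integers(0, 2, n_points)
    scores = np.clip(y_true * 0.6 + rng.normal(0.2, 0.25, n_points), 0, 1)
    y_pred = (scores > 0.5).astype(int)

    aucs.append(roc_auc_score(y_true, scores))
    kappas.append(cohen_kappa_score(y_true, y_pred))

for name, vals in [("AUC", np.array(aucs)), ("Kappa", np.array(kappas))]:
    mean, sd = vals.mean(), vals.std(ddof=1)
    cv = sd / mean
    ci99 = 2.576 * sd / np.sqrt(n_trials)  # normal-approximation 99% CI half-width
    print(f"{name}: mean={mean:.3f}, SD={sd:.3f}, CV={cv:.3f}, 99% CI ±{ci99:.3f}")
```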

  2. Econometric model as a regulatory tool in electricity distribution - Case Network Performance Assessment Model

    International Nuclear Information System (INIS)

    Honkapuro, S.; Lassila, J.; Viljainen, S.; Tahvanainen, K.; Partanen, J.

    2004-01-01

    Electricity distribution companies operate in the state of natural monopolies, since the building of parallel networks is not cost-effective. Monopoly companies do not have pressure from the open markets to keep their prices and costs at a reasonable level. The regulation of these companies is needed to prevent the misuse of the monopoly position. Regulation is usually focused either on the profit of the company or on the price of electricity. In this document, the usability of an econometric model in the regulation of electricity distribution companies is evaluated. A regulation method which determines the allowed income for each company with a generic computation model can be seen as an econometric model. As a special case of an econometric model, the method called Network Performance Assessment Model, NPAM (Naetnyttomodellen in Swedish), is analysed. NPAM was developed by the Swedish Energy Agency (STEM) for the regulation of electricity distribution companies. Both theoretical analysis and calculations of an example network area are presented in this document to find the major directing effects of the model. The parameters of NPAM, which are used in the calculations of this research report, were dated 30 March 2004. These parameters were the most recent available at the time the analysis was done. However, since NPAM is under development, the parameters have been constantly changing. Therefore slight changes in the results can occur if the calculations were made with the latest parameters. However, the main conclusions are the same and do not depend on the exact parameters. (orig.)

  3. Econometric model as a regulatory tool in electricity distribution. Case network performance assessment model

    International Nuclear Information System (INIS)

    Honkapuro, S.; Lassila, J.; Viljainen, S.; Tahvanainen, K.; Partanen, J.

    2004-01-01

    Electricity distribution companies operate in the state of natural monopolies, since the building of parallel networks is not cost-effective. Monopoly companies do not have pressure from the open markets to keep their prices and costs at a reasonable level. The regulation of these companies is needed to prevent the misuse of the monopoly position. Regulation is usually focused either on the profit of the company or on the price of electricity. A regulation method which determines the allowed income for each company with a generic computation model can be seen as an econometric model. In this document, the usability of an econometric model in the regulation of electricity distribution companies is evaluated. As a special case of an econometric model, the method called Network Performance Assessment Model, NPAM (Naetnyttomodellen in Swedish), is analysed. NPAM was developed by the Swedish Energy Agency (STEM) for the regulation of electricity distribution companies. Both theoretical analysis and calculations of an example network area are presented in this document to find the major directing effects of the model. The parameters of NPAM, which are used in the calculations of this research report, were dated 30 March 2004. These parameters were the most recent ones available at the time the analysis was done. However, since NPAM has been under development, the parameters have been constantly changing. Therefore slight changes might occur in the numerical results of the calculations if they were made with the latest set of parameters. However, the main conclusions are the same and do not depend on the exact parameters.

  4. Performance of fire behavior fuel models developed for the Rothermel Surface Fire Spread Model

    Science.gov (United States)

    Robert Ziel; W. Matt Jolly

    2009-01-01

    In 2005, 40 new fire behavior fuel models were published for use with the Rothermel Surface Fire Spread Model. These new models are intended to augment the original 13 developed in 1972 and 1976. As a compiled set of quantitative fuel descriptions that serve as input to the Rothermel model, the selected fire behavior fuel model has always been critical to the resulting...

  5. URBAN MODELLING PERFORMANCE OF NEXT GENERATION SAR MISSIONS

    Directory of Open Access Journals (Sweden)

    U. G. Sefercik

    2017-09-01

    Full Text Available In synthetic aperture radar (SAR) technology, urban mapping and modelling have become possible with the revolutionary missions TerraSAR-X (TSX) and Cosmo-SkyMed (CSK) since 2007. These satellites offer 1m spatial resolution in high-resolution spotlight imaging mode and are capable of high quality digital surface model (DSM) acquisition for urban areas utilizing interferometric SAR (InSAR) technology. With the advantage of independent generation from seasonal weather conditions, TSX and CSK DSMs are much in demand by scientific users. The performance of SAR DSMs is influenced by distortions such as layover, foreshortening, shadow and double-bounce, depending upon imaging geometry. In this study, the potential of DSMs derived from convenient 1m high-resolution spotlight (HS) InSAR pairs of CSK and TSX is validated by model-to-model absolute and relative accuracy estimations in an urban area. For the verification, an airborne laser scanning (ALS) DSM of the study area was used as the reference model. Results demonstrated that TSX and CSK urban DSMs are compatible in open, built-up and forest land forms with an absolute accuracy of 8–10 m. The relative accuracies based on the coherence of neighbouring pixels are superior to absolute accuracies both for CSK and TSX.

  6. Urban Modelling Performance of Next Generation SAR Missions

    Science.gov (United States)

    Sefercik, U. G.; Yastikli, N.; Atalay, C.

    2017-09-01

    In synthetic aperture radar (SAR) technology, urban mapping and modelling have become possible with the revolutionary missions TerraSAR-X (TSX) and Cosmo-SkyMed (CSK) since 2007. These satellites offer 1m spatial resolution in high-resolution spotlight imaging mode and are capable of high quality digital surface model (DSM) acquisition for urban areas utilizing interferometric SAR (InSAR) technology. With the advantage of independent generation from seasonal weather conditions, TSX and CSK DSMs are much in demand by scientific users. The performance of SAR DSMs is influenced by distortions such as layover, foreshortening, shadow and double-bounce, depending upon imaging geometry. In this study, the potential of DSMs derived from convenient 1m high-resolution spotlight (HS) InSAR pairs of CSK and TSX is validated by model-to-model absolute and relative accuracy estimations in an urban area. For the verification, an airborne laser scanning (ALS) DSM of the study area was used as the reference model. Results demonstrated that TSX and CSK urban DSMs are compatible in open, built-up and forest land forms with an absolute accuracy of 8-10 m. The relative accuracies based on the coherence of neighbouring pixels are superior to absolute accuracies both for CSK and TSX.

  7. Photovoltaic Pixels for Neural Stimulation: Circuit Models and Performance.

    Science.gov (United States)

    Boinagrov, David; Lei, Xin; Goetz, Georges; Kamins, Theodore I; Mathieson, Keith; Galambos, Ludwig; Harris, James S; Palanker, Daniel

    2016-02-01

    Photovoltaic conversion of pulsed light into pulsed electric current enables optically-activated neural stimulation with miniature wireless implants. In photovoltaic retinal prostheses, patterns of near-infrared light projected from video goggles onto subretinal arrays of photovoltaic pixels are converted into patterns of current to stimulate the inner retinal neurons. We describe a model of these devices and evaluate the performance of photovoltaic circuits, including the electrode-electrolyte interface. Characteristics of the electrodes measured in saline with various voltages, pulse durations, and polarities were modeled as voltage-dependent capacitances and Faradaic resistances. The resulting mathematical model of the circuit yielded dynamics of the electric current generated by the photovoltaic pixels illuminated by pulsed light. Voltages measured in saline with a pipette electrode above the pixel closely matched results of the model. Using the circuit model, our pixel design was optimized for maximum charge injection under various lighting conditions and for different stimulation thresholds. To speed discharge of the electrodes between the pulses of light, a shunt resistor was introduced and optimized for high frequency stimulation.
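
    A heavily reduced version of the circuit idea, a photocurrent source charging an electrode-electrolyte interface represented by a capacitance in parallel with a Faradaic resistance, plus a shunt resistor to speed inter-pulse discharge, can be simulated as below. The constant (voltage-independent) capacitance and all parameter values are simplifying assumptions; the published model uses voltage-dependent elements.

```python
# Simplified electrode/pixel parameters (illustrative values only; the
# published model uses voltage-dependent capacitance and Faradaic resistance).
c_dl = 1e-6       # double-layer capacitance [F]
r_far = 1e6       # Faradaic resistance [ohm]
r_shunt = 5e4     # shunt resistor across the electrode [ohm]
i_photo = 1e-6    # photocurrent during a light pulse [A]

def simulate(pulse_ms=4.0, period_ms=20.0, n_periods=3, dt=1e-6):
    """Forward-Euler integration of C dV/dt = I_photo(t) - V/R_far - V/R_shunt."""
    v, trace = 0.0, []
    n_steps = int(n_periods * period_ms * 1e-3 / dt)
    for step in range(n_steps):
        t = step * dt
        light_on = (t % (period_ms * 1e-3)) < pulse_ms * 1e-3
        i_src = i_photo if light_on else 0.0
        dv = (i_src - v / r_far - v / r_shunt) / c_dl
        v += dv * dt
        trace.append((t, v))
    return trace

trace = simulate()
peak = max(v for _, v in trace)
print(f"peak electrode voltage ~ {peak * 1e3:.1f} mV")
```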

  8. Modeling impact of environmental factors on photovoltaic array performance

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Jie; Sun, Yize; Xu, Yang [College of Mechanical Engineering, Donghua University NO.2999, North Renmin Road, Shanghai (China)

    2013-07-01

    This paper presents a methodology to model and quantify the impact of three environmental factors, the ambient temperature, the incident irradiance and the wind speed, upon the performance of a photovoltaic array operating under outdoor conditions. First, a simple correlation relating operating temperature to the three environmental variables is validated for the range of wind speeds studied, 2-8 m/s, and for irradiance values between 200 and 1000 W/m2. The root mean square error (RMSE) between modeled operating temperature and measured values is 1.19%, and the mean bias error (MBE) is -0.09%. The environmental factors studied influence the I-V curves, P-V curves, and maximum-power outputs of the photovoltaic array. A cell-to-module-to-array mathematical model for photovoltaic panels is established, and a segmented-iteration method is adopted to solve the implicit I-V curve expression and generate the model I-V curves. The model I-V curves and P-V curves coincide well with the measured data points. The RMSE between numerically calculated maximum-power outputs and experimentally measured ones is 0.2307%, while the MBE is 0.0183%. In addition, a multivariable non-linear regression equation is proposed to reduce the difference between numerically calculated and measured maximum power outputs over the range of high ambient temperature and irradiance at noon and in the early afternoon. In conclusion, the proposed method is reasonably simple and accurate.
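
    The sketch below shows the two ingredients such a methodology typically combines: an operating-temperature correlation driven by ambient temperature, irradiance and wind speed, and a single-diode I-V model evaluated at the resulting cell temperature. The Sandia-style temperature coefficients, the diode parameters and the 60-cell module scaling are generic placeholder values, not the correlation or the cell-to-module-to-array model fitted in the paper, and the shunt resistance is neglected so the curve can be written explicitly.

```python
import numpy as np

def cell_temperature(t_amb_c, irradiance, wind_speed, a=-3.56, b=-0.075):
    """Sandia-style module temperature model; coefficients here are generic
    open-rack values, not the correlation fitted in the paper."""
    return t_amb_c + irradiance * np.exp(a + b * wind_speed)

def iv_curve(irradiance, t_cell_c, i_ph_stc=8.0, i_0=1e-9, n=1.3, r_s=0.005, n_cells=60):
    """Simplified single-diode model per cell (shunt resistance neglected),
    scaled to a 60-cell module. All parameters are illustrative."""
    k_b, q = 1.380649e-23, 1.602176634e-19
    v_t = k_b * (t_cell_c + 273.15) / q
    i_ph = i_ph_stc * irradiance / 1000.0
    i = np.linspace(0.0, i_ph * 0.999, 200)
    v_cell = n * v_t * np.log((i_ph - i) / i_0 + 1.0) - i * r_s
    return n_cells * v_cell, i

t_cell = cell_temperature(t_amb_c=25.0, irradiance=800.0, wind_speed=4.0)
v, i = iv_curve(800.0, t_cell)
p_max = np.max(v * i)
print(f"cell temperature ~ {t_cell:.1f} C, module P_max ~ {p_max:.1f} W")
```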

  9. A conceptual model to improve performance in virtual teams

    Directory of Open Access Journals (Sweden)

    Shopee Dube

    2016-09-01

    Full Text Available Background: The vast improvement in communication technologies and sophisticated project management tools, methods and techniques has allowed geographically and culturally diverse groups to operate and function in a virtual environment. To succeed in this virtual environment where time and space are becoming increasingly irrelevant, organisations must define new ways of implementing initiatives. This virtual environment phenomenon has brought about the formation of virtual project teams that allow organisations to harness the skills and know-how of the best resources, irrespective of their location. Objectives: The aim of this article was to investigate performance criteria and develop a conceptual model which can be applied to enhance the success of virtual project teams. There are currently no clear guidelines on performance criteria for managing virtual project teams. Method: A qualitative research methodology was used in this article. The purpose of the content analysis was to explore the literature to understand the concept of performance in virtual project teams and to summarise the findings of the literature reviewed. Results: The research identified a set of performance criteria for virtual project teams as follows: leadership, trust, communication, team cooperation, reliability, motivation, comfort and social interaction. These were used to conceptualise the model. Conclusion: The conceptual model can be used in a holistic way to determine the overall performance of the virtual project team, but each factor can be analysed individually to determine its impact on the overall performance. The knowledge of performance criteria for virtual project teams could aid project managers in enhancing the success of these teams and taking a different approach to better manage and coordinate them.

  10. Modeling Windows in Energy Plus with Simple Performance Indices

    Energy Technology Data Exchange (ETDEWEB)

    Arasteh, Dariush; Kohler, Christian; Griffith, Brent

    2009-10-12

    The building energy simulation program, Energy Plus (E+), cannot use standard window performance indices (U, SHGC, VT) to model window energy impacts. Rather, E+ uses more accurate methods which require a physical description of the window. E+ needs to be able to accept U and SHGC indices as window descriptors because, often, these are all that is known about a window and because building codes, standards, and voluntary programs are developed using these terms. This paper outlines a procedure, developed for E+, which will allow it to use standard window performance indices to model window energy impacts. In this 'Block' model, a given U, SHGC, VT are mapped to the properties of a fictitious 'layer' in E+. For thermal conductance calculations, the 'Block' functions as a single solid layer. For solar optical calculations, the model begins by defining a solar transmittance (Ts) at normal incidence based on the SHGC. For properties at non-normal incidence angles, the 'Block' takes on the angular properties of multiple glazing layers; the number and type of layers defined by the U and SHGC. While this procedure is specific to E+, parts of it may have applicability to other window/building simulation programs.
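
    The mapping from whole-window indices to a fictitious layer can be sketched as below: standard film resistances are stripped from the U-value to obtain the conductance of a single solid layer, and a normal-incidence solar transmittance is taken as a fixed fraction of SHGC. The film coefficients, layer thickness and transmittance fraction are placeholder assumptions for illustration, not the actual rules of the E+ 'Block' procedure.

```python
def block_layer_from_indices(u_value, shgc, h_in=8.0, h_out=30.0, thickness=0.012):
    """Map whole-window U (W/m2K) and SHGC to a single fictitious solid layer.
    Film coefficients and the SHGC split are assumptions, not the E+ rules."""
    r_layer = 1.0 / u_value - 1.0 / h_in - 1.0 / h_out   # strip interior/exterior films
    conductivity = thickness / r_layer                   # W/mK for the chosen thickness
    tau_sol_normal = 0.86 * shgc                         # placeholder normal-incidence split
    return conductivity, tau_sol_normal

k, tau = block_layer_from_indices(u_value=1.8, shgc=0.40)
print(f"layer conductivity ~ {k:.3f} W/mK, normal solar transmittance ~ {tau:.2f}")
```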

  11. Decline curve based models for predicting natural gas well performance

    Directory of Open Access Journals (Sweden)

    Arash Kamari

    2017-06-01

    Full Text Available The productivity of a gas well declines over its production life, eventually to the point where it can no longer meet economic criteria. To address this, the production performance of gas wells should be predicted by applying reliable methods to analyse the decline trend. Therefore, reliable models are developed in this study on the basis of powerful artificial intelligence techniques, viz. the artificial neural network (ANN) modelling strategy, least square support vector machine (LSSVM) approach, adaptive neuro-fuzzy inference system (ANFIS), and decision tree (DT) method, for the prediction of cumulative gas production as well as initial decline rate multiplied by time as a function of the Arps decline-curve exponent and the ratio of initial gas flow rate over total gas flow rate. It was concluded that the results obtained from the models developed in the current study are in satisfactory agreement with the actual gas well production data. Furthermore, the results of the comparative study performed demonstrate that the LSSVM strategy is superior to the other models investigated for the prediction of both cumulative gas production and initial decline rate multiplied by time.
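
    For context, the sketch below evaluates the classical Arps decline-curve family that such data-driven models are trained to reproduce: the rate and its analytical integral (cumulative production) as functions of the initial rate, initial decline rate and decline exponent b. The well parameters used here are illustrative, not taken from the study's dataset.

```python
import numpy as np

def arps_rate(t, q_i, d_i, b):
    """Arps decline-curve rate: b = 0 exponential, 0 < b < 1 hyperbolic, b = 1 harmonic."""
    if b == 0:
        return q_i * np.exp(-d_i * t)
    return q_i / (1.0 + b * d_i * t) ** (1.0 / b)

def arps_cumulative(t, q_i, d_i, b):
    """Cumulative production obtained by integrating the rate analytically."""
    if b == 0:
        return (q_i / d_i) * (1.0 - np.exp(-d_i * t))
    if b == 1:
        return (q_i / d_i) * np.log(1.0 + d_i * t)
    return q_i / ((1.0 - b) * d_i) * (1.0 - (1.0 + b * d_i * t) ** (1.0 - 1.0 / b))

# Illustrative values only: initial rate 10 MMscf/d, 60%/yr initial decline, b = 0.5.
t_years = np.linspace(0.0, 10.0, 121)
q = arps_rate(t_years, q_i=10.0, d_i=0.6, b=0.5)
g_p = arps_cumulative(t_years, q_i=10.0, d_i=0.6, b=0.5) * 365.0  # MMscf (rate in /d, time in yr)
print(f"rate after 10 yr ~ {q[-1]:.2f} MMscf/d, cumulative ~ {g_p[-1]:.0f} MMscf")
```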

  12. Green roof hydrologic performance and modeling: a review.

    Science.gov (United States)

    Li, Yanling; Babcock, Roger W

    2014-01-01

    Green roofs reduce runoff from impervious surfaces in urban development. This paper reviews the technical literature on green roof hydrology. Laboratory experiments and field measurements have shown that green roofs can reduce stormwater runoff volume by 30 to 86%, reduce peak flow rate by 22 to 93%, and delay the peak flow by 0 to 30 min, thereby decreasing pollution, flooding and erosion during precipitation events. However, the effectiveness can vary substantially with design characteristics, making performance prediction difficult. Evaluation of the most recently published study findings indicates that the major factors affecting green roof hydrology are precipitation volume, precipitation dynamics, antecedent conditions, growth medium, plant species, and roof slope. This paper also evaluates the computer models commonly used to simulate hydrologic processes for green roofs, including the stormwater management model, soil water atmosphere and plant, SWMS-2D, HYDRUS, and other models that are shown to be effective for predicting precipitation response and economic benefits. The review findings indicate that green roofs are effective for reduction of runoff volume and peak flow and for delay of peak flow; however, no tool or model is available to predict expected performance for any given anticipated system based on design parameters that directly affect green roof hydrology.
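
    A minimal illustration of the substrate storage effect behind these retention figures is a single-bucket water balance, sketched below under assumed storage capacity, evapotranspiration rate and rainfall values; calibrated green-roof models such as those reviewed above resolve infiltration, drainage layers and vegetation in far more detail.

```python
def green_roof_runoff(rain_mm, et_mm_per_step, storage_max_mm, storage_0_mm=0.0):
    """Minimal bucket model: the growth medium stores water up to a maximum;
    rainfall above the available storage becomes runoff; ET empties the bucket
    between events. Purely illustrative, not a calibrated green-roof model."""
    storage = storage_0_mm
    runoff = []
    for p in rain_mm:
        storage += p
        spill = max(0.0, storage - storage_max_mm)
        storage -= spill
        storage = max(0.0, storage - et_mm_per_step)
        runoff.append(spill)
    return runoff

# 5-minute hyetograph (mm per step) for a short, intense event; values are made up.
rain = [0, 1, 3, 6, 8, 5, 2, 1, 0, 0, 0, 0]
q = green_roof_runoff(rain, et_mm_per_step=0.02, storage_max_mm=20.0)
retained = 1.0 - sum(q) / sum(rain)
print(f"runoff volume reduction ~ {retained:.0%}")
```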

  13. A Fluid Model for Performance Analysis in Cellular Networks

    Directory of Open Access Journals (Sweden)

    Coupechoux Marceau

    2010-01-01

    Full Text Available We propose a new framework to study the performance of cellular networks using a fluid model and we derive from this model analytical formulas for interference, outage probability, and spatial outage probability. The key idea of the fluid model is to consider the discrete base station (BS) entities as a continuum of transmitters that are spatially distributed in the network. This model allows us to obtain simple analytical expressions to reveal main characteristics of the network. In this paper, we focus on the downlink other-cell interference factor (OCIF), which is defined for a given user as the ratio of its outer cell received power to its inner cell received power. A closed-form formula of the OCIF is provided in this paper. From this formula, we are able to obtain the global outage probability as well as the spatial outage probability, which depends on the location of a mobile station (MS) initiating a new call. Our analytical results are compared to Monte Carlo simulations performed in a traditional hexagonal network. Furthermore, we demonstrate an application of the outage probability related to cell breathing and densification of cellular networks.
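
    The quantity the fluid model captures in closed form can also be estimated by brute force. The sketch below drops mobiles uniformly in a serving cell and computes the OCIF as the sum of outer-cell received powers over the serving-cell received power, under an assumed path-loss exponent and a simple square grid of interferers rather than the hexagonal layout used in the paper's reference simulations.

```python
import numpy as np

rng = np.random.default_rng(0)
eta = 3.5          # path-loss exponent (assumed)
r_cell = 1.0       # serving-cell radius; interferers sit on a grid of pitch 2*r_cell

# Base stations around the serving cell at the origin (square grid for simplicity).
coords = [(i, j) for i in range(-5, 6) for j in range(-5, 6) if (i, j) != (0, 0)]
bs = np.array(coords, dtype=float) * 2.0 * r_cell

def ocif(mobile_xy):
    """Other-cell interference factor: sum of outer-cell received powers over
    the serving-cell received power, with received power ~ distance^-eta."""
    d_serving = np.hypot(*mobile_xy)
    d_others = np.hypot(bs[:, 0] - mobile_xy[0], bs[:, 1] - mobile_xy[1])
    return np.sum(d_others ** -eta) / (d_serving ** -eta)

# Average OCIF over mobiles dropped uniformly in a disc inside the serving cell.
samples = rng.uniform(-r_cell, r_cell, size=(20000, 2))
samples = samples[np.hypot(samples[:, 0], samples[:, 1]) < r_cell]
print(f"mean OCIF ~ {np.mean([ocif(m) for m in samples]):.2f}")
```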

  14. Sensitivity of hydrological performance assessment analysis to variations in material properties, conceptual models, and ventilation models

    Energy Technology Data Exchange (ETDEWEB)

    Sobolik, S.R.; Ho, C.K.; Dunn, E. [Sandia National Labs., Albuquerque, NM (United States); Robey, T.H. [Spectra Research Inst., Albuquerque, NM (United States); Cruz, W.T. [Univ. del Turabo, Gurabo (Puerto Rico)

    1996-07-01

    The Yucca Mountain Site Characterization Project is studying Yucca Mountain in southwestern Nevada as a potential site for a high-level nuclear waste repository. Site characterization includes surface-based and underground testing. Analyses have been performed to support the design of an Exploratory Studies Facility (ESF) and the design of the tests performed as part of the characterization process, in order to ascertain that they have minimal impact on the natural ability of the site to isolate waste. The information in this report pertains to sensitivity studies evaluating previous hydrological performance assessment analyses to variation in the material properties, conceptual models, and ventilation models, and the implications of this sensitivity on previous recommendations supporting ESF design. This document contains information that has been used in preparing recommendations for Appendix I of the Exploratory Studies Facility Design Requirements document.

  15. Sensitivity of hydrological performance assessment analysis to variations in material properties, conceptual models, and ventilation models

    International Nuclear Information System (INIS)

    Sobolik, S.R.; Ho, C.K.; Dunn, E.; Robey, T.H.; Cruz, W.T.

    1996-07-01

    The Yucca Mountain Site Characterization Project is studying Yucca Mountain in southwestern Nevada as a potential site for a high-level nuclear waste repository. Site characterization includes surface-based and underground testing. Analyses have been performed to support the design of an Exploratory Studies Facility (ESF) and the design of the tests performed as part of the characterization process, in order to ascertain that they have minimal impact on the natural ability of the site to isolate waste. The information in this report pertains to sensitivity studies evaluating previous hydrological performance assessment analyses to variation in the material properties, conceptual models, and ventilation models, and the implications of this sensitivity on previous recommendations supporting ESF design. This document contains information that has been used in preparing recommendations for Appendix I of the Exploratory Studies Facility Design Requirements document.

  16. Modelling and Predicting Backstroke Start Performance Using Non-Linear And Linear Models

    Directory of Open Access Journals (Sweden)

    de Jesus Karla

    2018-03-01

    Full Text Available Our aim was to compare non-linear and linear mathematical model responses for backstroke start performance prediction. Ten swimmers randomly completed eight 15 m backstroke starts with feet over the wedge, four with hands on the highest horizontal and four on the vertical handgrip. Swimmers were videotaped using a dual media camera set-up, with the starts being performed over an instrumented block with four force plates. Artificial neural networks were applied to predict 5 m start time using kinematic and kinetic variables and to determine the accuracy of the mean absolute percentage error. Artificial neural networks predicted start time more robustly than the linear model with respect to changing training to the validation dataset for the vertical handgrip (3.95 ± 1.67 vs. 5.92 ± 3.27%). Artificial neural networks obtained a smaller mean absolute percentage error than the linear model in the horizontal (0.43 ± 0.19 vs. 0.98 ± 0.19%) and vertical handgrip (0.45 ± 0.19 vs. 1.38 ± 0.30%) using all input data. The best artificial neural network validation revealed a smaller mean absolute error than the linear model for the horizontal (0.007 vs. 0.04 s) and vertical handgrip (0.01 vs. 0.03 s). Artificial neural networks should be used for backstroke 5 m start time prediction due to the quite small differences among the elite level performances.

  17. Modelling and Predicting Backstroke Start Performance Using Non-Linear and Linear Models.

    Science.gov (United States)

    de Jesus, Karla; Ayala, Helon V H; de Jesus, Kelly; Coelho, Leandro Dos S; Medeiros, Alexandre I A; Abraldes, José A; Vaz, Mário A P; Fernandes, Ricardo J; Vilas-Boas, João Paulo

    2018-03-01

    Our aim was to compare non-linear and linear mathematical model responses for backstroke start performance prediction. Ten swimmers randomly completed eight 15 m backstroke starts with feet over the wedge, four with hands on the highest horizontal and four on the vertical handgrip. Swimmers were videotaped using a dual media camera set-up, with the starts being performed over an instrumented block with four force plates. Artificial neural networks were applied to predict 5 m start time using kinematic and kinetic variables and to determine the accuracy of the mean absolute percentage error. Artificial neural networks predicted start time more robustly than the linear model with respect to changing training to the validation dataset for the vertical handgrip (3.95 ± 1.67 vs. 5.92 ± 3.27%). Artificial neural networks obtained a smaller mean absolute percentage error than the linear model in the horizontal (0.43 ± 0.19 vs. 0.98 ± 0.19%) and vertical handgrip (0.45 ± 0.19 vs. 1.38 ± 0.30%) using all input data. The best artificial neural network validation revealed a smaller mean absolute error than the linear model for the horizontal (0.007 vs. 0.04 s) and vertical handgrip (0.01 vs. 0.03 s). Artificial neural networks should be used for backstroke 5 m start time prediction due to the quite small differences among the elite level performances.
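
    The model comparison described above boils down to fitting a small feed-forward network and a linear regression on the same predictors and comparing hold-out mean absolute percentage error. The sketch below reproduces that protocol on synthetic data standing in for the kinematic and kinetic variables; the generated data, network size and train/test split are assumptions, so the relative ranking here carries no evidential weight.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(1)

# Synthetic stand-in for kinematic/kinetic predictors and 5 m start time (s);
# the real study used force-plate and video-derived variables.
X = rng.normal(size=(80, 6))
y = 1.8 + 0.05 * X[:, 0] - 0.03 * X[:, 1] ** 2 + 0.02 * rng.normal(size=80)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

linear = LinearRegression().fit(X_tr, y_tr)
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X_tr, y_tr)

for name, model in [("linear", linear), ("ANN", ann)]:
    mape = mean_absolute_percentage_error(y_te, model.predict(X_te))
    print(f"{name}: MAPE = {100 * mape:.2f}%")
```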

  18. The Network Performance Assessment Model - Regulation with a Reference Network

    International Nuclear Information System (INIS)

    Larsson, Mats B.O.

    2003-11-01

    A new model - the Network Performance Assessment Model - has been developed gradually since 1998 in order to evaluate and benchmark local electricity grids. The model is intended to be a regulation tool for the Swedish local electricity networks, used by the Swedish Energy Agency. In spring 2004 the Network Performance Assessment Model will come into operation, based on the companies' results for 2003. The mission of the Network Performance Assessment Model is to evaluate the networks from a customer's point of view and establish a fair price level. In order to do that, the performance of the operator is evaluated. The performance is assessed against a price level that the consumer can be expected to accept, can agree to as fair, and is prepared to pay. This price level is based on the average cost of an efficient grid that would be built today with already known technology. The performance is accounted for in Customer Values. Customer Values are what can be created by someone but cannot be created better by someone else. The starting point is to look upon the companies from a customer's point of view. The factors that cannot be influenced by the companies are evaluated by fixed rules, valid for all companies. The rules reflect the differences. The cost for a connection is evaluated from the actual facts, i.e. the distances between the subscribers and the capacity demanded by each subscriber. This is done by creating a reference network with the capacity to fulfill the subscribers' demand. This is an efficient grid with no spare capacity and no excess capacity. The company's existing grid is of no importance, for either dimensioning or technology. Those factors which the company can influence, for example connection reliability, are evaluated from a customer perspective by measuring the actual reliability, measured as the number and duration of interruptions. When implemented to the regulation the Network

  19. Modeling time-lagged reciprocal psychological empowerment-performance relationships.

    Science.gov (United States)

    Maynard, M Travis; Luciano, Margaret M; D'Innocenzo, Lauren; Mathieu, John E; Dean, Matthew D

    2014-11-01

    Employee psychological empowerment is widely accepted as a means for organizations to compete in increasingly dynamic environments. Previous empirical research and meta-analyses have demonstrated that employee psychological empowerment is positively related to several attitudinal and behavioral outcomes including job performance. While this research positions psychological empowerment as an antecedent influencing such outcomes, a close examination of the literature reveals that this relationship is primarily based on cross-sectional research. Notably, evidence supporting the presumed benefits of empowerment has failed to account for potential reciprocal relationships and endogeneity effects. Accordingly, using a multiwave, time-lagged design, we model reciprocal relationships between psychological empowerment and job performance using a sample of 441 nurses from 5 hospitals. Incorporating temporal effects in a staggered research design and using structural equation modeling techniques, our findings provide support for the conventional positive correlation between empowerment and subsequent performance. Moreover, accounting for the temporal stability of variables over time, we found support for empowerment levels as positive influences on subsequent changes in performance. Finally, we also found support for the reciprocal relationship, as performance levels were shown to relate positively to changes in empowerment over time. Theoretical and practical implications of the reciprocal psychological empowerment-performance relationships are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
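
    The core of the analysis described above is a pair of cross-lagged equations: each wave-2 variable regressed on both wave-1 variables. The sketch below fits that minimal version with ordinary least squares on simulated two-wave data (the study itself used multiwave structural equation modeling); the sample size matches the paper but the data and coefficients are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 441  # same sample size as the study, but the data here are simulated

# Simulate two waves with weak reciprocal effects (coefficients are made up).
emp1 = rng.normal(size=n)
perf1 = 0.4 * emp1 + rng.normal(size=n)
emp2 = 0.6 * emp1 + 0.15 * perf1 + rng.normal(scale=0.8, size=n)
perf2 = 0.6 * perf1 + 0.20 * emp1 + rng.normal(scale=0.8, size=n)
df = pd.DataFrame(dict(emp1=emp1, perf1=perf1, emp2=emp2, perf2=perf2))

# Cross-lagged regressions: each wave-2 variable on both wave-1 variables.
m_perf = smf.ols("perf2 ~ perf1 + emp1", data=df).fit()
m_emp = smf.ols("emp2 ~ emp1 + perf1", data=df).fit()
print(m_perf.params.round(3))
print(m_emp.params.round(3))
```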

  20. Indonesian Private University Lecturer Performance Improvement Model to Improve a Sustainable Organization Performance

    Science.gov (United States)

    Suryaman

    2018-01-01

    Lecturer performance will affect the quality and carrying capacity of the sustainability of an organization, in this case the university. Many models have been developed to measure the performance of teachers, but few discuss the influence of lecturer performance itself on the sustainability of an organization. This study was conducted in…

  1. Models for the energy performance of low-energy houses

    DEFF Research Database (Denmark)

    Andersen, Philip Hvidthøft Delff

    of buildings is needed both in order to assess energy efficiency and to operate modern buildings economically. Energy signatures are a central tool in both energy performance assessment and decision making related to refurbishment of buildings. Also for operation of modern buildings with installations......-building. The building is well-insulated and features large modern energy-efficient windows and floor heating. These features lead to increased non-linear responses to solar radiation and longer time constants. The building is equipped with advanced control and measuring equipment. Experiments are designed and performed...... in order to identify important dynamical properties of the building, and the collected data is used for modeling. The thesis emphasizes the statistical model building and validation needed to identify dynamical systems. It is distinguished from earlier work by its focus on modern low-energy construction...
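
    Grey-box building models of the kind identified from such experiments are often written as low-order RC networks. The sketch below forward-simulates the simplest single-state version (one envelope resistance, one lumped capacitance, one effective solar aperture) on synthetic weather; the parameter values are placeholders, not estimates identified from the test building described above.

```python
import numpy as np

def simulate_rc_building(t_out, solar, heat_w, R=0.01, C=1.0e7, gA=3.0, t0=20.0, dt=3600.0):
    """One-state RC model of an indoor zone: R [K/W] envelope resistance,
    C [J/K] lumped capacitance, gA [m^2] effective solar aperture.
    Parameter values are placeholders, not identified from the test building."""
    t_in = np.empty(len(t_out) + 1)
    t_in[0] = t0
    for k in range(len(t_out)):
        flux = (t_out[k] - t_in[k]) / R + heat_w[k] + gA * solar[k]   # W into the zone
        t_in[k + 1] = t_in[k] + dt * flux / C
    return t_in[1:]

hours = np.arange(48)
t_out = 2.0 + 4.0 * np.sin(2 * np.pi * (hours - 14) / 24)              # outdoor temp, deg C
solar = np.clip(400 * np.sin(2 * np.pi * (hours - 6) / 24), 0, None)   # irradiance, W/m^2
heat = np.full(48, 1500.0)                                             # heating power, W
print(simulate_rc_building(t_out, solar, heat).round(1))
```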

  2. Lysimeter data as input to performance assessment models

    International Nuclear Information System (INIS)

    McConnell, J.W. Jr.

    1998-01-01

    The Field Lysimeter Investigations: Low-Level Waste Data Base Development Program is obtaining information on the performance of radioactive waste forms in a disposal environment. Waste forms fabricated using ion-exchange resins from EPICOR-117 prefilters employed in the cleanup of the Three Mile Island (TMI) Nuclear Power Station are being tested to develop a low-level waste data base and to obtain information on survivability of waste forms in a disposal environment. The program includes reviewing radionuclide releases from those waste forms in the first 7 years of sampling and examining the relationship between code input parameters and lysimeter data. Also, lysimeter data are applied to performance assessment source term models, and initial results from use of data in two models are presented

  3. WWER reactor fuel performance, modelling and experimental support. Proceedings

    International Nuclear Information System (INIS)

    Stefanova, S.; Chantoin, P.; Kolev, I.

    1994-01-01

    This publication is a compilation of 36 papers presented at the International Seminar on WWER Reactor Fuel Performance, Modelling and Experimental Support, organised by the Institute for Nuclear Research and Nuclear Energy (BG), in cooperation with the International Atomic Energy Agency. The Seminar was attended by 76 participants from 16 countries, including representatives of all major Russian plants and institutions responsible for WWER reactor fuel manufacturing, design and research. The reports are grouped in four chapters: 1) WWER Fuel Performance and Economics: Status and Improvement Prospects; 2) WWER Fuel Behaviour Modelling and Experimental Support; 3) Licensing of WWER Fuel and Fuel Analysis Codes; 4) Spent Fuel of WWER Plants. The reports from the corresponding four panel discussion sessions are also included. All individual papers are recorded in INIS as separate items.

  4. Integrated model for supplier selection and performance evaluation

    Directory of Open Access Journals (Sweden)

    Borges de Araújo, Maria Creuza

    2015-08-01

    Full Text Available This paper puts forward a model for selecting suppliers and evaluating the performance of those already working with a company. A simulation was conducted in the food industry, a sector of high significance in the economy of Brazil. The model enables the phases of selecting and evaluating suppliers to be integrated. This is important so that a company can have partnerships with suppliers who are able to meet its needs. Additionally, a group method is used to enable managers who will be affected by this decision to take part in the selection stage. Finally, the classes resulting from the performance evaluation are shown to support the contractor in choosing the most appropriate relationship with its suppliers.

  5. A Fuzzy Knowledge Representation Model for Student Performance Assessment

    DEFF Research Database (Denmark)

    Badie, Farshad

    Knowledge representation models based on Fuzzy Description Logics (DLs) can provide a foundation for reasoning in intelligent learning environments. While basic DLs are suitable for expressing crisp concepts and binary relationships, Fuzzy DLs are capable of processing degrees of truth/completeness about vague or imprecise information. This paper tackles the issue of representing fuzzy classes using OWL2 in a dataset describing Performance Assessment Results of Students (PARS).

  6. PERFORMANCE EVALUATION OF EMPIRICAL MODELS FOR VENTED LEAN HYDROGEN EXPLOSIONS

    OpenAIRE

    Anubhav Sinha; Vendra C. Madhav Rao; Jennifer X. Wen

    2017-01-01

    Explosion venting is a method commonly used to prevent or minimize damage to an enclosure caused by an accidental explosion. An estimate of the maximum overpressure generated through the explosion is an important parameter in the design of the vents. Various engineering models (Bauwens et al., 2012; Molkov and Bragin, 2015) and European (EN 14994) and USA (NFPA 68) standards are available to predict such overpressure. In this study, their performance is evaluated using a number of published exper...

  7. System performance modeling of extreme ultraviolet lithographic thermal issues

    International Nuclear Information System (INIS)

    Spence, P. A.; Gianoulakis, S. E.; Moen, C. D.; Kanouff, M. P.; Fisher, A.; Ray-Chaudhuri, A. K.

    1999-01-01

    Numerical simulation is used in the development of an extreme ultraviolet lithography Engineering Test Stand. Extensive modeling was applied to predict the impact of thermal loads on key lithographic parameters such as image placement error, focal shift, and loss of CD control. We show that thermal issues can be effectively managed to ensure that their impact on lithographic performance is maintained within design error budgets. (c) 1999 American Vacuum Society

  8. Introducing Model Predictive Control for Improving Power Plant Portfolio Performance

    DEFF Research Database (Denmark)

    Edlund, Kristian Skjoldborg; Bendtsen, Jan Dimon; Børresen, Simon

    2008-01-01

    This paper introduces a model predictive control (MPC) approach for construction of a controller for balancing the power generation against consumption in a power system. The objective of the controller is to coordinate a portfolio consisting of multiple power plant units in the effort to perform...... implementation consisting of a distributed PI controller structure, both in terms of minimising the overall cost but also in terms of the ability to minimise deviation, which is the classical objective....

  9. Thermal performance modeling of cross-flow heat exchangers

    CERN Document Server

    Cabezas-Gómez, Luben; Saíz-Jabardo, José Maria

    2014-01-01

    This monograph introduces a numerical computational methodology for thermal performance modeling of cross-flow heat exchangers, with applications in chemical, refrigeration and automobile industries. This methodology allows obtaining effectiveness-number of transfer units (e-NTU) data and has been used for simulating several standard and complex flow arrangements configurations of cross-flow heat exchangers. Simulated results have been validated through comparisons with results from available exact and approximate analytical solutions. Very accurate results have been obtained over wide ranges
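
    For reference, the closed-form approximation commonly quoted for a single-pass cross-flow exchanger with both fluids unmixed is sketched below; the monograph's numerical methodology generates e-NTU data for arrangements where no such closed form exists. The UA value, capacity rates and inlet temperatures in the example are arbitrary.

```python
import numpy as np

def effectiveness_crossflow_unmixed(ntu, c_r):
    """Approximate e-NTU relation for a single-pass cross-flow exchanger with
    both fluids unmixed (standard textbook correlation); c_r = C_min / C_max."""
    if c_r == 0.0:  # one stream with effectively infinite capacity rate
        return 1.0 - np.exp(-ntu)
    return 1.0 - np.exp((ntu ** 0.22 / c_r) * (np.exp(-c_r * ntu ** 0.78) - 1.0))

# Example figures (arbitrary): UA = 4 kW/K, C_min = 2 kW/K, C_max = 4 kW/K
ua, c_min, c_max = 4.0, 2.0, 4.0
ntu, c_r = ua / c_min, c_min / c_max
eps = effectiveness_crossflow_unmixed(ntu, c_r)
t_hot_in, t_cold_in = 150.0, 20.0
q = eps * c_min * (t_hot_in - t_cold_in)  # heat duty, kW
print(f"NTU = {ntu:.2f}, effectiveness = {eps:.3f}, duty = {q:.1f} kW")
```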

  10. 3D Massive MIMO Systems: Modeling and Performance Analysis

    KAUST Repository

    Nadeem, Qurrat-Ul-Ain

    2015-07-30

    Multiple-input-multiple-output (MIMO) systems of current LTE releases are capable of adaptation in the azimuth only. Recently, the trend is to enhance system performance by exploiting the channel's degrees of freedom in the elevation, which necessitates the characterization of 3D channels. We present an information-theoretic channel model for MIMO systems that supports the elevation dimension. The model is based on the principle of maximum entropy, which enables us to determine the distribution of the channel matrix consistent with the prior information on the angles. Based on this model, we provide an analytical expression for the cumulative distribution function (CDF) of the mutual information (MI) for systems with a single receive antenna and a finite number of transmit antennas in the general signal-to-interference-plus-noise-ratio (SINR) regime. The result is extended to systems with finite receive antennas in the low SINR regime. A Gaussian approximation to the asymptotic behavior of the MI distribution is derived for the regime of a large number of transmit antennas and paths. We corroborate our analysis with simulations that study the performance gains realizable through meticulous selection of the transmit antenna downtilt angles, confirming the potential of elevation beamforming to enhance system performance. The results are directly applicable to the analysis of 5G 3D-Massive MIMO-systems.
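
    The distribution being characterized can be approximated empirically for the simplest case treated in the paper, a single receive antenna and many transmit antennas. The sketch below uses i.i.d. Rayleigh fading and isotropic transmission as a stand-in for the maximum-entropy 3D channel and reads off empirical quantiles of the mutual information; the antenna count, SNR and channel statistics are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n_tx, snr_db, n_trials = 64, 0.0, 20_000
snr = 10 ** (snr_db / 10)

# i.i.d. Rayleigh fading as a stand-in for the maximum-entropy 3D channel model;
# the paper derives the MI distribution analytically for the structured case.
h = (rng.normal(size=(n_trials, n_tx)) + 1j * rng.normal(size=(n_trials, n_tx))) / np.sqrt(2)
mi = np.log2(1.0 + snr * np.sum(np.abs(h) ** 2, axis=1) / n_tx)

mi_sorted = np.sort(mi)
for p in (0.05, 0.5, 0.95):
    print(f"empirical CDF^-1({p}) ~ {mi_sorted[int(p * n_trials)]:.3f} bit/s/Hz")
```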

  11. Evaluation of CFVS Performance with SPARC Model and Application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Il; Na, Young Su; Ha, Kwang Soon; Cho, Song Won [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    The Containment Filtered Venting System (CFVS) is one of the important safety features for reducing the amount of fission products released into the environment by depressurizing the containment. KAERI has conducted an integrated performance verification test of the CFVS as part of a Ministry of Trade, Industry and Energy (MOTIE) project. For wet-type filters, codes such as SPARC, BUSCA and SUPRA are generally used; in particular, the SPARC model is included in MELCOR to calculate the fission product removal rate through pool scrubbing. In this study, CFVS performance is evaluated using the SPARC model in MELCOR according to the steam fraction in the containment. The calculation is mainly focused on the effect of the steam fraction in the containment, and the results are explained with the aerosol removal model in SPARC. A previous study on the OPR 1000 is used for the application. There were two CFVS valve opening periods, and the CFVS performance was found to differ in each case. The results of the study provide fundamental data that can be used to decide the CFVS operation time; however, more calculation data are necessary to generalize the result.

  12. Modelling the Progression of Male Swimmers’ Performances through Adolescence

    Directory of Open Access Journals (Sweden)

    Shilo J. Dormehl

    2016-01-01

    Full Text Available Insufficient data on adolescent athletes is contributing to the challenges facing youth athletic development and accurate talent identification. The purpose of this study was to model the progression of male sub-elite swimmers' performances during adolescence. The performances of 446 males (12–19 year olds) competing in seven individual events (50, 100, 200 m freestyle, 100 m backstroke, breaststroke, butterfly, 200 m individual medley) over an eight-year period at an annual international schools swimming championship, run under FINA regulations, were collected. Quadratic functions for each event were determined using mixed linear models. Thresholds of peak performance were achieved between the ages of 18.5 ± 0.1 (50 m freestyle and 200 m individual medley) and 19.8 ± 0.1 (100 m butterfly) years. The slowest rate of improvement was observed in the 200 m individual medley (20.7%) and the highest in the 100 m butterfly (26.2%). Butterfly does however appear to be one of the last strokes in which males specialise. The models may be useful as talent identification tools, as they predict the age at which an average sub-elite swimmer could potentially peak. The expected rate of improvement could serve as a tool in which to monitor and evaluate benchmarks.

  13. Modeling, Simulation and Performance Evaluation of Parabolic Trough

    African Journals Online (AJOL)

    Mekuannint

    demand. Heat exchangers are used to transfer heat energy from the heat transfer fluid (HTF) to water coming from feedwater heaters. In this paper a proposed .... flexibility. The TRNSYS modeling includes the TRNSYS field model and power model. The solar field model shown in Fig. 4 includes weather data processors ...

  14. Compact models and performance investigations for subthreshold interconnects

    CERN Document Server

    Dhiman, Rohit

    2014-01-01

    The book provides a detailed analysis of issues related to sub-threshold interconnect performance from the perspective of analytical approaches and design techniques. Particular emphasis is laid on the performance analysis of coupling noise and variability issues in the sub-threshold domain to develop efficient compact models. The proposed analytical approach gives physical insight into the parameters affecting the transient behavior of coupled interconnects. Remedial design techniques are also suggested to mitigate the effect of coupling noise. The effects of wire width, spacing between the wires, wi

  15. Performance prediction of industrial centrifuges using scale-down models.

    Science.gov (United States)

    Boychyn, M; Yim, S S S; Bulmer, M; More, J; Bracewell, D G; Hoare, M

    2004-12-01

    Computational fluid dynamics was used to model the high flow forces found in the feed zone of a multichamber-bowl centrifuge and reproduce these in a small, high-speed rotating disc device. Linking the device to scale-down centrifugation, permitted good estimation of the performance of various continuous-flow centrifuges (disc stack, multichamber bowl, CARR Powerfuge) for shear-sensitive protein precipitates. Critically, the ultra scale-down centrifugation process proved to be a much more accurate predictor of production multichamber-bowl performance than was the pilot centrifuge.

  16. Model for determining and optimizing delivery performance in industrial systems

    Directory of Open Access Journals (Sweden)

    Fechete Flavia

    2017-01-01

    Full Text Available Performance means achieving organizational objectives regardless of their nature and variety, and even exceeding them. Improving performance is one of the major goals of any company. Achieving global performance means not only attaining economic performance; other functions must also be taken into account, such as quality, delivery, costs and even employee satisfaction. This paper aims to improve the delivery performance of an industrial system, given its very low results. The delivery performance took into account all categories of performance indicators, such as on-time delivery, backlog efficiency and transport efficiency. The research was focused on optimizing the delivery performance of the industrial system using linear programming. Modeling the delivery function using linear programming led to precise quantities to be produced and delivered each month by the industrial system in order to minimize transport costs, satisfy customer orders and control stock. The optimization led to a substantial improvement in all four performance indicators that concern deliveries.
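
    The kind of monthly delivery plan described above can be written as a small linear program: decision variables for shipped quantities and end-of-month stock, a stock-balance constraint per month, capacity bounds, and a cost objective. The sketch below solves such a toy instance with scipy; the demand, capacity and cost figures are invented and the real model would cover the plant's full product mix.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical single-product plan over 3 months.
demand = np.array([120.0, 150.0, 180.0])      # units per month
capacity = np.array([160.0, 160.0, 160.0])    # production/shipping capacity per month
transport_cost = np.array([4.0, 5.0, 6.0])    # cost per unit shipped each month
holding_cost = 0.5                            # cost per unit held one month

# Variables: x = [ship_1, ship_2, ship_3, stock_1, stock_2, stock_3]
c = np.concatenate([transport_cost, holding_cost * np.ones(3)])

# Stock balance each month: ship_t + stock_{t-1} - stock_t = demand_t  (stock_0 = 0)
a_eq = np.array([
    [1, 0, 0, -1,  0,  0],
    [0, 1, 0,  1, -1,  0],
    [0, 0, 1,  0,  1, -1],
], dtype=float)
b_eq = demand

bounds = [(0, cap) for cap in capacity] + [(0, None)] * 3
res = linprog(c, A_eq=a_eq, b_eq=b_eq, bounds=bounds)
print(res.x.round(1), f"total cost = {res.fun:.1f}")
```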

  17. Modeling the Performance of Fast Mulipole Method on HPC platforms

    KAUST Repository

    Ibeid, Huda

    2012-04-06

    The current trend in high performance computing is pushing towards exascale computing. To achieve this exascale performance, future systems will have between 100 million and 1 billion cores, assuming gigahertz cores. Currently, there are many efforts studying the hardware and software bottlenecks for building an exascale system. It is important to understand and meet these bottlenecks in order to attain 10 PFLOPS performance. On the applications side, there is an urgent need to model application performance and to understand what changes need to be made to ensure continued scalability at this scale. Fast multipole methods (FMM) were originally developed for accelerating N-body problems for particle based methods. Nowadays, FMM is more than an N-body solver; recent trends in HPC have been to use FMMs in unconventional application areas. FMM is likely to be a main player in exascale due to its hierarchical nature and the techniques used to access the data via a tree structure, which allow many operations to happen simultaneously at each level of the hierarchy. In this thesis, we discuss the challenges for FMM on current parallel computers and future exascale architectures. Furthermore, we develop a novel performance model for FMM. The ultimate aim of this thesis is to ensure the scalability of FMM on the future exascale machines.

  18. Mars Propellant Liquefaction and Storage Performance Modeling using Thermal Desktop with an Integrated Cryocooler Model

    Science.gov (United States)

    Desai, Pooja; Hauser, Dan; Sutherlin, Steven

    2017-01-01

    NASA's current Mars architectures assume the production and storage of 23 tons of liquid oxygen on the surface of Mars over a duration of 500+ days. In order to do this in a mass-efficient manner, an energy-efficient refrigeration system will be required. Based on previous analysis, NASA has decided to do all liquefaction in the propulsion vehicle storage tanks. In order to allow for transient Martian environmental effects, a propellant liquefaction and storage system for a Mars Ascent Vehicle (MAV) was modeled using Thermal Desktop. The model consisted of a propellant tank containing a broad area cooling loop heat exchanger integrated with a reverse turbo Brayton cryocooler. Cryocooler sizing and performance modeling was conducted using MAV diurnal heat loads and radiator rejection temperatures predicted from a previous thermal model of the MAV. A system was also sized and modeled using an alternative heat rejection system that relies on a forced convection heat exchanger. Cryocooler mass, input power, and heat rejection for both systems were estimated and compared against sizing based on non-transient estimates.

  19. Investigating the performance of directional boundary layer model through staged modeling method

    Science.gov (United States)

    Jeong, Moon-Gyu; Lee, Won-Chan; Yang, Seung-Hune; Jang, Sung-Hoon; Shim, Seong-Bo; Kim, Young-Chang; Suh, Chun-Suk; Choi, Seong-Woon; Kim, Young-Hee

    2011-04-01

    BLM since the feasibility of the BLM has been investigated in many papers [4][5][6]. Instead of fitting the parameters to the wafer critical dimensions (CD) directly, we tried to use the aerial image (AI) from the rigorous simulator with the electromagnetic field (EMF) solver. That kind of method is usually known as the staged modeling method. To see the advantages of this method, we conducted several experiments and compared the results against the method of fitting to the wafer CD directly. Through the tests we observed some remarkable results and confirmed that the staged modeling had better performance in many respects.

  20. Synthesised model of market orientation-business performance relationship

    Directory of Open Access Journals (Sweden)

    G. Nwokah

    2006-12-01

    Full Text Available Purpose: The purpose of this paper is to assess the impact of market orientation on the performance of the organisation. While much empirical work has centered on market orientation, the generalisability of its impact on the performance of Food and Beverages organisations in the Nigerian context has been under-researched. Design/Methodology/Approach: The study adopted a triangulation methodology (quantitative and qualitative approach). Data was collected from key informants using a research instrument. Returned instruments were analyzed using nonparametric correlation through the use of the Statistical Package for Social Sciences (SPSS) version 10. Findings: The study validated the earlier instruments but did not find any strong association between market orientation and business performance in the Nigerian context using the food and beverages organisations for the study. The reasons underlying the weak relationship between market orientation and business performance of the Food and Beverages organisations are government policies, new product development, diversification, innovation and devaluation of the Nigerian currency. One important finding of this study is that market orientation leads to business performance through some moderating variables. Implications: The study recommends that the Nigerian Government should ensure a stable economy and make economic policies that will enhance existing business development in the country. Also, organisations should have performance measurement systems to detect the impact of investment on market orientation with the aim of knowing how the organisation works. Originality/Value: This study significantly refines the body of knowledge concerning the impact of market orientation on the performance of the organisation, and thereby offers a model of market orientation and business performance in the Nigerian context for marketing scholars and practitioners. This model will, no doubt, contribute to the body of

  1. Clinical laboratory as an economic model for business performance analysis

    Science.gov (United States)

    Buljanović, Vikica; Patajac, Hrvoje; Petrovečki, Mladen

    2011-01-01

    Aim To perform SWOT (strengths, weaknesses, opportunities, and threats) analysis of a clinical laboratory as an economic model that may be used to improve business performance of laboratories by removing weaknesses, minimizing threats, and using external opportunities and internal strengths. Methods Impact of possible threats to and weaknesses of the Clinical Laboratory at Našice General County Hospital business performance and use of strengths and opportunities to improve operating profit were simulated using models created on the basis of SWOT analysis results. The operating profit as a measure of profitability of the clinical laboratory was defined as total revenue minus total expenses and presented using a profit and loss account. Changes in the input parameters in the profit and loss account for 2008 were determined using opportunities and potential threats, and economic sensitivity analysis was made by using changes in the key parameters. The profit and loss account and economic sensitivity analysis were tools for quantifying the impact of changes in the revenues and expenses on the business operations of clinical laboratory. Results Results of simulation models showed that operational profit of €470 723 in 2008 could be reduced to only €21 542 if all possible threats became a reality and current weaknesses remained the same. Also, operational gain could be increased to €535 804 if laboratory strengths and opportunities were utilized. If both the opportunities and threats became a reality, the operational profit would decrease by €384 465. Conclusion The operational profit of the clinical laboratory could be significantly reduced if all threats became a reality and the current weaknesses remained the same. The operational profit could be increased by utilizing strengths and opportunities as much as possible. This type of modeling may be used to monitor business operations of any clinical laboratory and improve its financial situation by

  2. Clinical laboratory as an economic model for business performance analysis.

    Science.gov (United States)

    Buljanović, Vikica; Patajac, Hrvoje; Petrovecki, Mladen

    2011-08-15

    To perform SWOT (strengths, weaknesses, opportunities, and threats) analysis of a clinical laboratory as an economic model that may be used to improve business performance of laboratories by removing weaknesses, minimizing threats, and using external opportunities and internal strengths. Impact of possible threats to and weaknesses of the Clinical Laboratory at Našice General County Hospital business performance and use of strengths and opportunities to improve operating profit were simulated using models created on the basis of SWOT analysis results. The operating profit as a measure of profitability of the clinical laboratory was defined as total revenue minus total expenses and presented using a profit and loss account. Changes in the input parameters in the profit and loss account for 2008 were determined using opportunities and potential threats, and economic sensitivity analysis was made by using changes in the key parameters. The profit and loss account and economic sensitivity analysis were tools for quantifying the impact of changes in the revenues and expenses on the business operations of clinical laboratory. Results of simulation models showed that operational profit of €470 723 in 2008 could be reduced to only €21 542 if all possible threats became a reality and current weaknesses remained the same. Also, operational gain could be increased to €535 804 if laboratory strengths and opportunities were utilized. If both the opportunities and threats became a reality, the operational profit would decrease by €384 465. The operational profit of the clinical laboratory could be significantly reduced if all threats became a reality and the current weaknesses remained the same. The operational profit could be increased by utilizing strengths and opportunities as much as possible. This type of modeling may be used to monitor business operations of any clinical laboratory and improve its financial situation by implementing changes in the next fiscal

  3. Risk Prediction Models for Incident Heart Failure: A Systematic Review of Methodology and Model Performance.

    Science.gov (United States)

    Sahle, Berhe W; Owen, Alice J; Chin, Ken Lee; Reid, Christopher M

    2017-09-01

    Numerous models predicting the risk of incident heart failure (HF) have been developed; however, evidence of their methodological rigor and reporting remains unclear. This study critically appraises the methods underpinning incident HF risk prediction models. EMBASE and PubMed were searched for articles published between 1990 and June 2016 that reported at least 1 multivariable model for prediction of HF. Model development information, including study design, variable coding, missing data, and predictor selection, was extracted. Nineteen studies reporting 40 risk prediction models were included. Existing models have acceptable discriminative ability (C-statistics > 0.70), although only 6 models were externally validated. Candidate variable selection was based on statistical significance from a univariate screening in 11 models, whereas it was unclear in 12 models. Continuous predictors were retained in 16 models, whereas it was unclear how continuous variables were handled in 16 models. Missing values were excluded in 19 of 23 models that reported missing data, and the number of events per variable was below recommended levels in several models. Only 2 models presented recommended regression equations. There was significant heterogeneity in the discriminative ability of models with respect to age. This review identified HF risk prediction models with sufficient discriminative ability, although few are externally validated. Methods not recommended for the conduct and reporting of risk prediction modeling were frequently used, and resulting algorithms should be applied with caution. Copyright © 2017 Elsevier Inc. All rights reserved.
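
    The discrimination metric quoted throughout this record, the C-statistic, is equivalent to the area under the ROC curve and is straightforward to compute once a model outputs predicted risks. The sketch below fits a logistic model on a synthetic cohort and evaluates the C-statistic on held-out data; the data and predictors are invented and bear no relation to the cohorts pooled in the review.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

# Synthetic cohort: a handful of continuous risk factors and incident-HF labels.
X = rng.normal(size=(2000, 5))
risk = 1 / (1 + np.exp(-(-2.0 + 0.8 * X[:, 0] + 0.5 * X[:, 1])))
y = rng.random(2000) < risk

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
c_stat = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"C-statistic on held-out data ~ {c_stat:.2f}")
```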

  4. Performance Analysis of Different NeQuick Ionospheric Model Parameters

    Directory of Open Access Journals (Sweden)

    WANG Ningbo

    2017-04-01

    Full Text Available Galileo adopts the NeQuick model for single-frequency ionospheric delay corrections. For the standard operation of Galileo, the NeQuick model is driven by the effective ionization level parameter Az instead of the solar activity level index, and the three broadcast ionospheric coefficients are determined by a second-order polynomial through fitting the Az values estimated from globally distributed Galileo Sensor Stations (GSS). In this study, the processing strategies for the estimation of NeQuick ionospheric coefficients are discussed and the characteristics of the NeQuick coefficients are also analyzed. The accuracy of the Global Positioning System (GPS) broadcast Klobuchar, original NeQuick2 and fitted NeQuickC as well as Galileo broadcast NeQuickG models is evaluated over continental and oceanic regions, respectively, in comparison with the ionospheric total electron content (TEC) provided by global ionospheric maps (GIM), GPS test stations and the JASON-2 altimeter. The results show that NeQuickG can mitigate the ionospheric delay by 54.2%-65.8% on a global scale, and NeQuickC can correct for 71.1%-74.2% of the ionospheric delay. NeQuick2 performs at the same level as NeQuickG, which is a bit better than the GPS broadcast Klobuchar model.
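
    The percentages above are fractions of the first-order ionospheric delay, which scales as 40.3·TEC/f². The sketch below converts a slant TEC value into metres of delay at the E1/L1 frequency and shows how a correction percentage follows from comparing model TEC with a reference; the TEC numbers are illustrative, not GIM or altimeter data.

```python
def iono_delay_m(tec_tecu, freq_hz):
    """First-order ionospheric group delay: 40.3 * TEC / f^2, with TEC in electrons/m^2."""
    return 40.3 * (tec_tecu * 1e16) / freq_hz ** 2

f_e1 = 1575.42e6                  # Galileo E1 / GPS L1 carrier frequency, Hz
tec_true, tec_model = 30.0, 22.0  # TECU; illustrative values, not GIM data
residual = 1.0 - abs(tec_true - tec_model) / tec_true
print(f"slant delay ~ {iono_delay_m(tec_true, f_e1):.2f} m, "
      f"model corrects ~ {residual:.0%} of it")
```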

  5. The performance indicators of model projects. A special evaluation

    International Nuclear Information System (INIS)

    1995-11-01

    As a result of the acknowledgment of the key role of the Model Project concept in the Agency's Technical Co-operation Programme, the present review of the objectives of the model projects which are now in operation, was undertaken, as recommended by the Board of Governors, to determine at an early stage: the extent to which the present objectives have been defined in a measurable way; whether objectively verifiable performance indicators and success criteria had been identified for each project; whether mechanisms to obtain feedback on the achievements had been foreseen. The overall budget for the 23 model projects, as approved from 1994 to 1998, amounts to $32,557,560, of which 45% is funded by Technical Co-operation Fund. This represents an average investment of about $8 million per year, that is over 15% of the annual TC budget. The conceptual importance of the Model Project initiative, as well as the significant funds allocated to them, led the Secretariat to plan the methods to be used to determine their socio-economic impact. 1 tab

  6. Simulation model of a twin-tail, high performance airplane

    Science.gov (United States)

    Buttrill, Carey S.; Arbuckle, P. Douglas; Hoffler, Keith D.

    1992-01-01

    The mathematical model and associated computer program to simulate a twin-tailed high performance fighter airplane (McDonnell Douglas F/A-18) are described. The simulation program is written in the Advanced Continuous Simulation Language. The simulation math model includes the nonlinear six degree-of-freedom rigid-body equations, an engine model, sensors, and first order actuators with rate and position limiting. A simplified form of the F/A-18 digital control laws (version 8.3.3) is implemented. The simulated control law includes only inner-loop augmentation in the up-and-away flight mode. The aerodynamic forces and moments are calculated from a wind-tunnel-derived database using table look-ups with linear interpolation. The aerodynamic database has an angle-of-attack range of -10 to +90 degrees and a sideslip range of -20 to +20 degrees. The effects of elastic deformation are incorporated in a quasi-static-elastic manner. Elastic degrees of freedom are not actively simulated. In the engine model, the throttle-commanded steady-state thrust level and the dynamic response characteristics of the engine are based on airflow rate as determined from a table look-up. Afterburner dynamics are switched in at a threshold based on the engine airflow and commanded thrust.
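
    Table look-up with linear interpolation, as used for the aerodynamic database described above, is straightforward to prototype. The sketch below builds a toy lift-coefficient table over the quoted angle-of-attack and sideslip ranges and interpolates it; the coefficient values themselves are invented and are not the F/A-18 wind-tunnel data.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Toy aerodynamic table: lift coefficient vs. angle of attack and sideslip.
# The grid spans the ranges quoted for the database; the values are invented.
alpha_deg = np.linspace(-10.0, 90.0, 21)
beta_deg = np.linspace(-20.0, 20.0, 9)
cl_table = (0.05 * alpha_deg[:, None]) * np.cos(np.radians(beta_deg[None, :]))
cl_table = np.clip(cl_table, -0.5, 1.8)  # crude post-stall flattening

cl_lookup = RegularGridInterpolator((alpha_deg, beta_deg), cl_table, method="linear")

print(f"CL(12 deg alpha, 3 deg beta) ~ {cl_lookup([[12.0, 3.0]])[0]:.3f}")
```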

  7. GEN-IV Benchmarking of Triso Fuel Performance Models under accident conditions modeling input data

    Energy Technology Data Exchange (ETDEWEB)

    Collin, Blaise Paul [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    This document presents the benchmark plan for the calculation of particle fuel performance on safety testing experiments that are representative of operational accidental transients. The benchmark is dedicated to the modeling of fission product release under accident conditions by fuel performance codes from around the world, and the subsequent comparison to post-irradiation experiment (PIE) data from the modeled heating tests. The accident condition benchmark is divided into three parts: • The modeling of a simplified benchmark problem to assess potential numerical calculation issues at low fission product release. • The modeling of the AGR-1 and HFR-EU1bis safety testing experiments. • The comparison of the AGR-1 and HFR-EU1bis modeling results with PIE data. The simplified benchmark case, thereafter named NCC (Numerical Calculation Case), is derived from “Case 5” of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on coated particle fuel technology [IAEA 2012]. It is included so participants can evaluate their codes at low fission product release. “Case 5” of the IAEA CRP-6 showed large code-to-code discrepancies in the release of fission products, which were attributed to “effects of the numerical calculation method rather than the physical model” [IAEA 2012]. The NCC is therefore intended to check if these numerical effects subsist. The first two steps imply the involvement of the benchmark participants with a modeling effort following the guidelines and recommendations provided by this document. The third step involves the collection of the modeling results by Idaho National Laboratory (INL) and the comparison of these results with the available PIE data. The objective of this document is to provide all necessary input data to model the benchmark cases, and to give some methodology guidelines and recommendations in order to make all results suitable for comparison with each other. The participants should read

  8. ICT evaluation models and performance of medium and small enterprises

    Directory of Open Access Journals (Sweden)

    Bayaga Anass

    2014-01-01

    Full Text Available Building on prior research related to (1) the impact of information communication technology (ICT) and (2) operational risk management (ORM) in the context of medium and small enterprises (MSEs), the focus of this study was to investigate the relationship between (1) ICT operational risk management (ORM) and (2) performance of MSEs. To achieve this focus, the research investigated evaluation models for understanding the value of ICT ORM in MSEs. Multiple regression, Repeated-Measures Analysis of Variance (RM-ANOVA) and Repeated-Measures Multivariate Analysis of Variance (RM-MANOVA) were performed. The findings of the distribution revealed that only one variable made a significant percentage contribution to the level of ICT operation in MSEs, the Payback method (β = 0.410, p < .000). It may thus be inferred that the Payback method is the prominent variable, explaining the variation in the level of evaluation models affecting ICT adoption within MSEs. Conclusively, in answering the two questions (1) degree of variability explained and (2) predictors, the results revealed that the variable contributed approximately 88.4% of the variations in evaluation models affecting ICT adoption within MSEs. The analysis of variance also revealed that the regression coefficients were real and did not occur by chance

  9. Behavioral Model of High Performance Camera for NIF Optics Inspection

    International Nuclear Information System (INIS)

    Hackel, B M

    2007-01-01

    The purpose of this project was to develop software that will model the behavior of the high performance Spectral Instruments 1000 series Charge-Coupled Device (CCD) camera located in the Final Optics Damage Inspection (FODI) system on the National Ignition Facility. NIF's target chamber will be mounted with 48 Final Optics Assemblies (FOAs) to convert the laser light from infrared to ultraviolet and focus it precisely on the target. Following a NIF shot, the optical components of each FOA must be carefully inspected for damage by the FODI to ensure proper laser performance during subsequent experiments. Rapid image capture and complex image processing (to locate damage sites) will reduce shot turnaround time; thus increasing the total number of experiments NIF can conduct during its 30 year lifetime. Development of these rapid processes necessitates extensive offline software automation -- especially after the device has been deployed in the facility. Without access to the unique real device or an exact behavioral model, offline software testing is difficult. Furthermore, a software-based behavioral model allows for many instances to be running concurrently; this allows multiple developers to test their software at the same time. Thus it is beneficial to construct separate software that will exactly mimic the behavior and response of the real SI-1000 camera

  10. Exploring uncertainty and model predictive performance concepts via a modular snowmelt-runoff modeling framework

    Science.gov (United States)

    Tyler Jon Smith; Lucy Amanda. Marshall

    2010-01-01

    Model selection is an extremely important aspect of many hydrologic modeling studies because of the complexity, variability, and uncertainty that surrounds the current understanding of watershed-scale systems. However, development and implementation of a complete precipitation-runoff modeling framework, from model selection to calibration and uncertainty analysis, are...

  11. Analytical and numerical performance models of a Heisenberg Vortex Tube

    Science.gov (United States)

    Bunge, C. D.; Cavender, K. A.; Matveev, K. I.; Leachman, J. W.

    2017-12-01

    Analytical and numerical investigations of a Heisenberg Vortex Tube (HVT) are performed to estimate the cooling potential with cryogenic hydrogen. The Ranque-Hilsch Vortex Tube (RHVT) is a device that tangentially injects a compressed fluid stream into a cylindrical geometry to promote enthalpy streaming and temperature separation between inner and outer flows. The HVT is the result of lining the inside of a RHVT with a hydrogen catalyst. This is the first concept to utilize the endothermic heat of para-orthohydrogen conversion to aid primary cooling. A review of 1st order vortex tube models available in the literature is presented and adapted to accommodate cryogenic hydrogen properties. These first order model predictions are compared with 2-D axisymmetric Computational Fluid Dynamics (CFD) simulations.

  12. Performance of the demonstration model of DIOS FXT

    Science.gov (United States)

    Tawara, Yuzuru; Sakurai, Ikuya; Masuda, Tadashi; Torii, Tatsuharu; Matsushita, Kohji; Ramsey, Brian D.

    2009-08-01

    To search for the warm-hot intergalactic medium (WHIM), a small satellite mission DIOS (Diffuse Intergalactic Oxygen Surveyor) is planned, and a specially designed four-stage X-ray telescope (FXT) has been developed as the best-fit optics to provide a wide field of view and large effective area. Based on the previous design work and the mirror fabrication technology used in the Suzaku mission, we made a small demonstration model of DIOS FXT. This model has a focal length of 700 mm and consists of a quadrant housing and four-stage mirror sets with radii of 150-180 mm and a mirror height of 40 mm per stage. We performed X-ray measurements for one set of four-stage mirrors with a radius of 180 mm. From the results of the optical and X-ray measurements, it was found that tighter control is required over the positioning and fabrication of each mirror even to reach an angular resolution of several arcminutes.

  13. BISON and MARMOT Development for Modeling Fast Reactor Fuel Performance

    Energy Technology Data Exchange (ETDEWEB)

    Gamble, Kyle Allan Lawrence [Idaho National Lab. (INL), Idaho Falls, ID (United States)]; Williamson, Richard L. [Idaho National Lab. (INL), Idaho Falls, ID (United States)]; Schwen, Daniel [Idaho National Lab. (INL), Idaho Falls, ID (United States)]; Zhang, Yongfeng [Idaho National Lab. (INL), Idaho Falls, ID (United States)]; Novascone, Stephen Rhead [Idaho National Lab. (INL), Idaho Falls, ID (United States)]; Medvedev, Pavel G. [Idaho National Lab. (INL), Idaho Falls, ID (United States)]

    2015-09-01

    BISON and MARMOT are two codes under development at the Idaho National Laboratory for engineering scale and lower length scale fuel performance modeling. It is desired to add capabilities for fast reactor applications to these codes. The fast reactor fuel types under consideration are metal (U-Pu-Zr) and oxide (MOX). The cladding types of interest include 316SS, D9, and HT9. The purpose of this report is to outline the proposed plans for code development and provide an overview of the models added to the BISON and MARMOT codes for fast reactor fuel behavior. A brief overview of preliminary discussions on the formation of a bilateral agreement between the Idaho National Laboratory and the National Nuclear Laboratory in the United Kingdom is presented.

  14. Integrated healthcare networks' performance: a growth curve modeling approach.

    Science.gov (United States)

    Wan, Thomas T H; Wang, Bill B L

    2003-05-01

    This study examines the effects of integration on the performance ratings of the top 100 integrated healthcare networks (IHNs) in the United States. A strategic-contingency theory is used to identify the relationship of IHNs' performance to their structural and operational characteristics and integration strategies. To create a database for the panel study, the top 100 IHNs selected by the SMG Marketing Group in 1998 were followed up in 1999 and 2000. The data were merged with the Dorenfest data on information system integration. A growth curve model was developed and validated by the Mplus statistical program. Factors influencing the top 100 IHNs' performance in 1998 and their subsequent rankings in the consecutive years were analyzed. IHNs' initial performance scores were positively influenced by network size, number of affiliated physicians and profit margin, and were negatively associated with average length of stay and technical efficiency. The continuing high performance, judged by maintaining higher performance scores, tended to be enhanced by the use of more managerial or executive decision-support systems. Future studies should include time-varying operational indicators to serve as predictors of network performance.
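
    The record above fits a growth curve in Mplus; as a loose analogue of the same idea (a random-intercept, random-slope model of performance scores over the 1998-2000 panel, with covariates for initial status), one could use a linear mixed model in Python. Everything below, including the file and column names, is an assumption for illustration, not the study's specification.

```python
# A loose analogue (not the study's Mplus model) of a growth curve: a linear
# mixed model with a random intercept and random slope over time per network,
# plus hypothetical time-invariant covariates shifting the initial level.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ihn_panel.csv")    # hypothetical long-format panel, one row per IHN per year
md = smf.mixedlm("performance ~ year + size + physicians + profit_margin",
                 data=df, groups=df["ihn_id"], re_formula="~year")
fit = md.fit()
print(fit.summary())   # fixed effects ~ growth factors; random effects ~ network-level variation
```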

  15. Graphical User Interface for Simulink Integrated Performance Analysis Model

    Science.gov (United States)

    Durham, R. Caitlyn

    2009-01-01

    The J-2X Engine (built by Pratt & Whitney Rocketdyne), in the Upper Stage of the Ares I Crew Launch Vehicle, will only start within a certain range of temperature and pressure for Liquid Hydrogen and Liquid Oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible in order to save the maximum amount of time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink Model, without opening or altering the contents of the model. The GUI must allow for test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink Model, and get the output from the Simulink Model. The GUI was built using MATLAB, and will run the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI will construct a new Microsoft Excel file, as well as a MATLAB matrix file, using the output values for each test of the simulation so that they may be graphed and compared to other values.

  16. A performance measurement using balanced scorecard and structural equation modeling

    Directory of Open Access Journals (Sweden)

    Rosha Makvandi

    2014-02-01

    Full Text Available During the past few years, the balanced scorecard (BSC) has been widely used as a promising method for performance measurement. The BSC studies organizations in terms of four perspectives: customer, internal processes, learning and growth, and financial figures. This paper presents a hybrid of the BSC and structural equation modeling (SEM) to measure the performance of an Iranian university in the province of Alborz, Iran. The proposed study uses this conceptual method, designs a questionnaire and distributes it among some university students and professors. Using the SEM technique, the survey analyzes the data, and the results indicate that the university did poorly in terms of all four perspectives. The survey then identifies improvement targets by presenting the attributes necessary for performance improvement.

  17. The application of DEA model in enterprise environmental performance auditing

    Science.gov (United States)

    Li, F.; Zhu, L. Y.; Zhang, J. D.; Liu, C. Y.; Qu, Z. G.; Xiao, M. S.

    2017-01-01

    As a part of society, enterprises have an inescapable responsibility for environmental protection and governance. This article discusses the feasibility and necessity of enterprise environmental performance auditing and uses a DEA model to calculate the environmental performance of Haier as an example. Most of the reference data are selected and sorted from Haier's environmental reports published in 2008, 2009, 2011 and 2015, and some of the data come from published articles and fieldwork. All the results are calculated with the DEAP software and have high credibility. The analysis results of this article can give corporate managers an idea of how to use environmental performance auditing to adjust their corporate environmental investment quotas and change their companies' environmental strategies.
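
    As a sketch of the DEA idea used in this record (not the article's actual data, inputs, outputs, or model orientation), the following computes input-oriented CCR efficiency scores with a small linear program; the yearly input/output figures are invented for illustration.

```python
# Sketch: input-oriented CCR DEA efficiency via a linear program.  The
# input/output matrices are hypothetical yearly figures (e.g. energy use and
# emissions as inputs, output value as output) for a handful of DMUs (years).
import numpy as np
from scipy.optimize import linprog

X = np.array([[3.2, 2.9, 2.5, 2.1],      # inputs: rows = input kinds, cols = DMUs
              [1.8, 1.7, 1.4, 1.1]])
Y = np.array([[5.0, 5.4, 5.9, 6.3]])     # outputs: rows = output kinds

def ccr_efficiency(o):
    """Efficiency of DMU o: min theta s.t. X@lam <= theta*x_o, Y@lam >= y_o, lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                    # minimize theta
    A_in = np.hstack([-X[:, [o]], X])              # X@lam - theta*x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])      # -Y@lam <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun

for o in range(X.shape[1]):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```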

  18. PHARAO flight model: optical on ground performance tests

    Science.gov (United States)

    Lévèque, T.; Faure, B.; Esnault, F. X.; Grosjean, O.; Delaroche, C.; Massonnet, D.; Escande, C.; Gasc, Ph.; Ratsimandresy, A.; Béraud, S.; Buffe, F.; Torresi, P.; Larivière, Ph.; Bernard, V.; Bomer, T.; Thomin, S.; Salomon, C.; Abgrall, M.; Rovera, D.; Moric, I.; Laurent, Ph.

    2017-11-01

    PHARAO (Projet d'Horloge Atomique par Refroidissement d'Atomes en Orbite), which has been developed by CNES, is the first primary frequency standard specially designed for operation in space. PHARAO is the main instrument of the ESA mission ACES (Atomic Clock Ensemble in Space). ACES payload will be installed on-board the International Space Station (ISS) to perform fundamental physics experiments. All the sub-systems of the Flight Model (FM) have now passed the qualification process and the whole FM of the cold cesium clock, PHARAO, is being assembled and will undergo extensive tests. The expected performances in space are a frequency accuracy of less than 3×10⁻¹⁶ (with a final goal at 10⁻¹⁶) and a frequency stability of 10⁻¹³ τ⁻¹/². In this paper, we focus on the laser source performances and the main results on the cold atom manipulation.

  19. Advanced transport systems analysis, modeling, and evaluation of performances

    CERN Document Server

    Janić, Milan

    2014-01-01

    This book provides a systematic analysis, modeling and evaluation of the performance of advanced transport systems. It offers an innovative approach by presenting a multidimensional examination of the performance of advanced transport systems and transport modes, useful for both theoretical and practical purposes. Advanced transport systems for the twenty-first century are characterized by the superiority of one or several of their infrastructural, technical/technological, operational, economic, environmental, social, and policy performances as compared to their conventional counterparts. The advanced transport systems considered include: Bus Rapid Transit (BRT) and Personal Rapid Transit (PRT) systems in urban area(s), electric and fuel cell passenger cars, high speed tilting trains, High Speed Rail (HSR), Trans Rapid Maglev (TRM), Evacuated Tube Transport system (ETT), advanced commercial subsonic and Supersonic Transport Aircraft (STA), conventionally- and Liquid Hydrogen (LH2)-fuelled commercial air trans...

  20. Evaluating performances of simplified physically based landslide susceptibility models.

    Science.gov (United States)

    Capparelli, Giovanna; Formetta, Giuseppe; Versace, Pasquale

    2015-04-01

    Rainfall-induced shallow landslides cause significant damage, involving loss of life and property. Predicting locations susceptible to shallow landslides is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Usually two main approaches are used to accomplish this task: statistical or physically based models. This paper presents a package of GIS-based models for landslide susceptibility analysis. It was integrated in the NewAge-JGrass hydrological model using the Object Modeling System (OMS) modeling framework. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit (GOF) indices by comparing pixel-by-pixel model results and measurement data. Moreover, the package integration in NewAge-JGrass allows the use of other components such as geographic information system tools to manage input-output processes, and automatic calibration algorithms to estimate model parameters. The system offers the possibility to investigate and fairly compare the quality and robustness of models and model parameters, according to a procedure that includes: i) model parameter estimation by optimizing each GOF index separately, ii) model evaluation in the ROC plane using each optimal parameter set, and iii) GOF robustness evaluation by assessing their sensitivity to input parameter variation. This procedure was repeated for all three models. The system was applied to a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and Altilia municipality. The analysis showed that, among all the optimized indices and all three models, Average Index (AI) optimization coupled with model M3 is the best modeling solution for our test case. This research was funded by PON Project No. 01_01503 "Integrated Systems for Hydrogeological Risk

  1. SR 97 - Alternative models project. Discrete fracture network modelling for performance assessment of Aberg

    International Nuclear Information System (INIS)

    Dershowitz, B.; Eiben, T.; Follin, S.; Andersson, Johan

    1999-08-01

    As part of studies into the siting of a deep repository for nuclear waste, Swedish Nuclear Fuel and Waste Management Company (SKB) has commissioned the Alternative Models Project (AMP). The AMP is a comparison of three alternative modeling approaches for geosphere performance assessment for a single hypothetical site. The hypothetical site, arbitrarily named Aberg, is based on parameters from the Aespoe Hard Rock Laboratory in southern Sweden. The Aberg model domain, boundary conditions and canister locations are defined as a common reference case to facilitate comparisons between approaches. This report presents the results of a discrete fracture pathways analysis of the Aberg site, within the context of the SR 97 performance assessment exercise. The Aberg discrete fracture network (DFN) site model is based on consensus Aberg parameters related to the Aespoe HRL site. Discrete fracture pathways are identified from canister locations in a prototype repository design to the surface of the island or to the sea bottom. The discrete fracture pathways analysis presented in this report is used to provide the following parameters for SKB's performance assessment transport codes FARF31 and COMP23: * F-factor: Flow wetted surface normalized with regards to flow rate (yields an appreciation of the contact area available for diffusion and sorption processes) [TL⁻¹]. * Travel Time: Advective transport time from a canister location to the environmental discharge [T]. * Canister Flux: Darcy flux (flow rate per unit area) past a representative canister location [LT⁻¹]. In addition to the above, the discrete fracture pathways analysis in this report also provides information about: additional pathway parameters such as pathway length, pathway width, transport aperture, reactive surface area and transmissivity, percentage of canister locations with pathways to the surface discharge, spatial pattern of pathways and pathway discharges, visualization of pathways, and statistical
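
    For orientation only: assuming the standard definitions of the pathway parameters listed above, a pathway's F-factor and advective travel time accumulate segment by segment as in the sketch below. The segment values are invented and do not come from the Aberg analysis.

```python
# Sketch, assuming the usual definitions: along a discrete-fracture pathway the
# F-factor accumulates the flow-wetted surface (two fracture faces) divided by
# the flow rate in each segment, and the advective travel time accumulates
# segment volume over flow rate.  Segment values are invented for illustration.
segments = [
    # length [m], width [m], transport aperture [m], flow rate [m3/yr]
    dict(L=14.0, W=3.0, e_t=2.0e-4, Q=0.8),
    dict(L=52.0, W=5.5, e_t=5.0e-4, Q=2.1),
    dict(L=23.0, W=2.2, e_t=1.0e-4, Q=0.4),
]

F = sum(2.0 * s["L"] * s["W"] / s["Q"] for s in segments)           # [yr/m]
t_adv = sum(s["L"] * s["W"] * s["e_t"] / s["Q"] for s in segments)  # [yr]

print(f"F-factor    = {F:.1f} yr/m")
print(f"travel time = {t_adv:.3f} yr")
```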

  2. SR 97 - Alternative models project. Discrete fracture network modelling for performance assessment of Aberg

    Energy Technology Data Exchange (ETDEWEB)

    Dershowitz, B.; Eiben, T. [Golder Associates Inc., Seattle (United States)]; Follin, S.; Andersson, Johan [Golder Grundteknik KB, Stockholm (Sweden)]

    1999-08-01

    As part of studies into the siting of a deep repository for nuclear waste, Swedish Nuclear Fuel and Waste Management Company (SKB) has commissioned the Alternative Models Project (AMP). The AMP is a comparison of three alternative modeling approaches for geosphere performance assessment for a single hypothetical site. The hypothetical site, arbitrarily named Aberg, is based on parameters from the Aespoe Hard Rock Laboratory in southern Sweden. The Aberg model domain, boundary conditions and canister locations are defined as a common reference case to facilitate comparisons between approaches. This report presents the results of a discrete fracture pathways analysis of the Aberg site, within the context of the SR 97 performance assessment exercise. The Aberg discrete fracture network (DFN) site model is based on consensus Aberg parameters related to the Aespoe HRL site. Discrete fracture pathways are identified from canister locations in a prototype repository design to the surface of the island or to the sea bottom. The discrete fracture pathways analysis presented in this report is used to provide the following parameters for SKB's performance assessment transport codes FARF31 and COMP23: * F-factor: Flow wetted surface normalized with regards to flow rate (yields an appreciation of the contact area available for diffusion and sorption processes) [TL⁻¹]. * Travel Time: Advective transport time from a canister location to the environmental discharge [T]. * Canister Flux: Darcy flux (flow rate per unit area) past a representative canister location [LT⁻¹]. In addition to the above, the discrete fracture pathways analysis in this report also provides information about: additional pathway parameters such as pathway length, pathway width, transport aperture, reactive surface area and transmissivity, percentage of canister locations with pathways to the surface discharge, spatial pattern of pathways and pathway discharges, visualization of pathways, and

  3. The Social Responsibility Performance Outcomes Model: Building Socially Responsible Companies through Performance Improvement Outcomes.

    Science.gov (United States)

    Hatcher, Tim

    2000-01-01

    Considers the role of performance improvement professionals and human resources development professionals in helping organizations realize the ethical and financial power of corporate social responsibility. Explains the social responsibility performance outcomes model, which incorporates the concepts of societal needs and outcomes. (LRW)

  4. Thermal modelling of PV module performance under high ambient temperatures

    Energy Technology Data Exchange (ETDEWEB)

    Diarra, D.C.; Harrison, S.J. [Queen's Univ., Kingston, ON (Canada). Dept. of Mechanical and Materials Engineering Solar Calorimetry Lab]; Akuffo, F.O. [Kwame Nkrumah Univ. of Science and Technology, Kumasi (Ghana). Dept. of Mechanical Engineering]

    2005-07-01

    The actual performance of photovoltaic (PV) generators is typically lower than the results of tests conducted under standard test conditions because the radiant energy absorbed in the module under normal operation raises the temperature of the cell and other multilayer components. The increase in temperature translates into a lower conversion efficiency of the solar cells. In order to address these discrepancies, a thermal model of a characteristic PV module was developed to assess and predict its performance under real field conditions. The PV module consisted of monocrystalline silicon cells in EVA between a glass cover and a tedlar backing sheet. The EES program was used to compute the equilibrium temperature profile in the PV module. It was shown that heat is dissipated towards the bottom and the top of the module, and that its temperature can be much higher than the ambient temperature. Modelling results indicate that 70-75 per cent of the absorbed solar radiation is dissipated from the solar cells as heat, while 4.7 per cent of the solar energy is absorbed in the glass cover and the EVA. It was also shown that the operating temperature of the PV module decreases with increased wind speed. 2 refs.
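
    The record describes an energy-balance model built in EES; the sketch below reproduces only the general idea (absorbed irradiance raises cell temperature above ambient through a wind-dependent loss coefficient, and efficiency falls with temperature) with generic textbook coefficients, not the paper's fitted values.

```python
# Lumped steady-state sketch (illustrative coefficients, not the paper's model):
# cell temperature from an energy balance, then a linear temperature derate.
def cell_temperature(G, T_amb, wind):
    """Steady-state cell temperature [C] from a lumped loss-coefficient balance."""
    tau_alpha = 0.9                  # effective transmittance-absorptance (assumed)
    U_L = 20.0 + 6.0 * wind          # combined loss coefficient [W/m2K], wind-dependent (assumed)
    return T_amb + tau_alpha * G / U_L

def efficiency(T_cell, eta_ref=0.15, beta=0.0045, T_ref=25.0):
    """Cell efficiency with a linear temperature derate."""
    return eta_ref * (1.0 - beta * (T_cell - T_ref))

for wind in (1.0, 3.0, 6.0):         # m/s
    Tc = cell_temperature(G=1000.0, T_amb=35.0, wind=wind)
    print(f"wind {wind:.0f} m/s: T_cell = {Tc:.1f} C, eta = {efficiency(Tc):.3f}")
```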

  5. Water desalination price from recent performances: Modelling, simulation and analysis

    International Nuclear Information System (INIS)

    Metaiche, M.; Kettab, A.

    2005-01-01

    The subject of the present article is the technical simulation of seawater desalination by a one-stage reverse osmosis system. The objectives are to update the cost price using recent membrane and permeator performances, to use new means of simulation and modelling of desalination parameters, and to show the main parameters influencing the cost price. The Seawater Desalting centre of Djannet (Boumerdes, Algeria) is taken as the simulation example. Present performances allow water desalting at a price of 0.5 $/m³, an attractive and promising figure, with a very acceptable product water quality on the order of 269 ppm. It is important to run reverse osmosis desalting systems under high pressure, which further decreases the desalting cost and produces good quality water. A poor choice of operating conditions produces high prices and unacceptable quality. However, the price can be decreased further by relaxing the requirement on product quality. The seawater temperature has an effect on the cost price and quality. The installation of large desalting centres also contributes to the decrease in prices. The calculation involved is long and tedious and impossible to conduct without programming and informatics tools. The simulation model has proved very efficient in the design of desalination centres that can perform at much improved prices. (author)

  6. A Hybrid Fuzzy Model for Lean Product Development Performance Measurement

    Science.gov (United States)

    Osezua Aikhuele, Daniel; Mohd Turan, Faiz

    2016-02-01

    In the effort to meet emerging consumer demands for mass-customized products, many manufacturing companies are applying lean principles in their product development process, and this is gradually moving from a competitive advantage to a necessity. However, due to a lack of clear understanding of lean performance measurements, many of these companies are unable to implement and fully integrate the lean principle into their product development process. Extensive literature shows that only a few studies have focused systematically on lean product development performance (LPDP) evaluation. In order to fill this gap, the study therefore proposes a novel hybrid model based on the Fuzzy Reasoning Approach (FRA), and the extension of Fuzzy-AHP and Fuzzy-TOPSIS methods, for the assessment of LPDP. Unlike existing methods, the model considers the importance weight of each of the decision makers (experts), since the performance criteria/attributes are required to be rated and these experts have different levels of expertise. The rating is done using a new fuzzy Likert rating scale (membership scale) which is designed to address problems resulting from information loss/distortion due to closed-form scaling and the ordinal nature of the existing Likert scale.

  7. Uncertainty assessment in building energy performance with a simplified model

    Directory of Open Access Journals (Sweden)

    Titikpina Fally

    2015-01-01

    Full Text Available To assess a building's energy performance, the consumption predicted or estimated during the design stage is compared to the consumption measured when the building is operational. When evaluating this performance, many buildings show significant differences between the calculated and measured consumption. In order to assess the performance accurately and ensure the thermal efficiency of the building, it is necessary to evaluate the uncertainties involved not only in measurement but also those induced by the propagation of the dynamic and static input data through the model being used. The evaluation of measurement uncertainty is based on both the knowledge about the measurement process and the input quantities which influence the result of measurement. Measurement uncertainty can be evaluated within the framework of conventional statistics presented in the Guide to the Expression of Measurement Uncertainty (GUM) as well as by Bayesian Statistical Theory (BST). Another choice is the use of numerical methods like Monte Carlo Simulation (MCS). In this paper, we propose to evaluate the uncertainty associated with the use of a simplified model for the estimation of the energy consumption of a given building. A detailed review and discussion of these three approaches (GUM, MCS and BST) is given. An office building has been monitored and multiple temperature sensors have been mounted at candidate locations to obtain the required data. The monitored zone is composed of six offices and has an overall surface of 102 m².
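
    As a minimal illustration of the Monte Carlo (MCS) route mentioned in this record, the sketch below propagates a few uncertain inputs through a deliberately simplified annual heating-demand model and reports a 95 % coverage interval; the model form and all distributions are assumptions, not the paper's.

```python
# Monte Carlo propagation sketch: sample uncertain inputs, evaluate a simplified
# building-energy model for each draw, summarize the output distribution.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

U = rng.normal(0.45, 0.05, N)        # overall heat-loss coefficient [W/m2K] (assumed)
A = 102.0                            # floor area [m2], as quoted in the abstract
dT = rng.normal(18.0, 1.5, N)        # mean indoor-outdoor temperature difference [K] (assumed)
hours = rng.normal(5800, 200, N)     # heating hours per year (assumed)

E = U * A * dT * hours / 1000.0      # annual heating need [kWh], simplified model

print(f"mean = {E.mean():.0f} kWh, std = {E.std():.0f} kWh")
lo, hi = np.percentile(E, [2.5, 97.5])
print(f"95% coverage interval: [{lo:.0f}, {hi:.0f}] kWh")
```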

  8. Modelling of LOCA Tests with the BISON Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, Richard L [Idaho National Laboratory]; Pastore, Giovanni [Idaho National Laboratory]; Novascone, Stephen Rhead [Idaho National Laboratory]; Spencer, Benjamin Whiting [Idaho National Laboratory]; Hales, Jason Dean [Idaho National Laboratory]

    2016-05-01

    BISON is a modern finite-element based, multidimensional nuclear fuel performance code that is under development at Idaho National Laboratory (USA). Recent advances of BISON include the extension of the code to the analysis of LWR fuel rod behaviour during loss-of-coolant accidents (LOCAs). In this work, BISON models for the phenomena relevant to LWR cladding behaviour during LOCAs are described, followed by presentation of code results for the simulation of LOCA tests. Analysed experiments include separate effects tests of cladding ballooning and burst, as well as the Halden IFA-650.2 fuel rod test. Two-dimensional modelling of the experiments is performed, and calculations are compared to available experimental data. Comparisons include cladding burst pressure and temperature in separate effects tests, as well as the evolution of fuel rod inner pressure during ballooning and time to cladding burst. Furthermore, BISON three-dimensional simulations of separate effects tests are performed, which demonstrate the capability to reproduce the effect of azimuthal temperature variations in the cladding. The work has been carried out in the frame of the collaboration between Idaho National Laboratory and Halden Reactor Project, and the IAEA Coordinated Research Project FUMAC.

  9. From Performance Measurement to Strategic Management Model: Balanced Scorecard

    Directory of Open Access Journals (Sweden)

    Cihat Savsar

    2015-03-01

    Full Text Available Abstract: In today's competitive markets, one of the main conditions for the survival of enterprises is having an effective performance management system. Management must take decisions according to the performance of assets. In the transition from an industrial society to an information society, business structures have changed and the value of non-financial assets has increased. As a result, systems have emerged that are based on intangible assets and their measurement, rather than only on tangible assets. With economic and technological development, evaluating the business along a single dimension is no longer sufficient. Performance evaluation methods can be applied in business with an integrated approach through their accordance with business strategy, their link to the reward system, and the cause-effect links established between performance measures. The balanced scorecard is one of the most commonly used measurement methods. While it was first used in 1992 as a performance measurement tool, today it is also used as a strategic management model beyond its conventional uses. The BSC contains the customer, internal process, and learning and growth perspectives besides the financial perspective. The learning and growth perspective is a determinant of the other perspectives; the emphasis is on what needs to be accomplished in the other dimensions in order to achieve the objectives set out in the financial perspective. The paper also describes how strategy maps establish causal links between performance measures and targets and show how specified goals are to be achieved.

  10. Evaluation of ECOMOD Model performance for the Scenario 'Iput'

    International Nuclear Information System (INIS)

    Kryshev, A.I.; Sazykina, T.G.

    2003-01-01

    The main purpose of the model is a more detailed description of radionuclide transfer in food chains, including the dynamics in the early period after accidental release. Detailed modelling of the dynamics of radioactive depositions is beyond the purpose of the model. Standard procedures are used for assessing inhalation and external doses. Two versions of the ECOMOD model have been developed: (a) radionuclide transfer in terrestrial food chains - submodel ECOMOD-T; (b) radionuclide transfer in aquatic food chains - submodel ECOMOD-W

  11. A human performance modelling approach to intelligent decision support systems

    Science.gov (United States)

    Mccoy, Michael S.; Boys, Randy M.

    1987-01-01

    Manned space operations require that the many automated subsystems of a space platform be controllable by a limited number of personnel. To minimize the interaction required of these operators, artificial intelligence techniques may be applied to embed a human performance model within the automated, or semi-automated, systems, thereby allowing the derivation of operator intent. A similar application has previously been proposed in the domain of fighter piloting, where the demand for pilot intent derivation is primarily a function of limited time and high workload rather than limited operators. The derivation and propagation of pilot intent is presented as it might be applied to some programs.

  12. Use of total plant models for plant performance optimisation

    International Nuclear Information System (INIS)

    Ardron, K.H.

    2004-01-01

    Consideration is given to the mathematical techniques used by Nuclear Electric for steady state power plant analysis and performance optimisation. A quasi-Newton method is deployed to calculate the steady state followed by a model fitting procedure based on Lagrange's method to yield a fit to measured plant data. An optimising algorithm is used to establish maximum achievable power and efficiency. An example is described in which the techniques are applied to identify the plant constraints preventing output increases at a Nuclear Electric Advanced Gas Cooled Reactor. (author)

  13. Model complexity and performance: how far can we simplify?

    NARCIS (Netherlands)

    Raick, C.; Soetaert, K.E.R.; Grégoire, M.

    2006-01-01

    Handling model complexity and reliability is a key area of research today. While complex models containing sufficient detail have become possible due to increased computing power, they often lead to too much uncertainty. On the other hand, very simple models often crudely oversimplify the real

  14. Impact of reactive settler models on simulated WWTP performance

    DEFF Research Database (Denmark)

    Gernaey, Krist; Jeppsson, Ulf; Batstone, Damien J.

    2006-01-01

    Including a reactive settler model in a wastewater treatment plant model allows representation of the biological reactions taking place in the sludge blanket in the settler, something that is neglected in many simulation studies. The idea of including a reactive settler model is investigated for ...

  15. Modeling of high gain helical antenna for improved performance ...

    African Journals Online (AJOL)

    The modeling of the high-gain helical antenna structure is subdivided into three sections: introduction of helical structures, numerical analysis, and modeling and simulation based on the parameters of the helical antenna. The basic foundation software for the research paper is the Matlab technical computing software; the modeling was ...

  16. Effects of Stochastic Traffic Flow Model on Expected System Performance

    Science.gov (United States)

    2012-12-01

    feasible. 2.2.2 Brownian Bridge Stochastic Path Model: Like the linear model, the Brownian bridge (Karlin and Taylor 1981) model uses the idea of...

  17. Fuzzy regression modeling for tool performance prediction and degradation detection.

    Science.gov (United States)

    Li, X; Er, M J; Lim, B S; Zhou, J H; Gan, O P; Rutkowski, L

    2010-10-01

    In this paper, the viability of using Fuzzy-Rule-Based Regression Modeling (FRM) algorithm for tool performance and degradation detection is investigated. The FRM is developed based on a multi-layered fuzzy-rule-based hybrid system with Multiple Regression Models (MRM) embedded into a fuzzy logic inference engine that employs Self Organizing Maps (SOM) for clustering. The FRM converts a complex nonlinear problem to a simplified linear format in order to further increase the accuracy in prediction and rate of convergence. The efficacy of the proposed FRM is tested through a case study - namely to predict the remaining useful life of a ball nose milling cutter during a dry machining process of hardened tool steel with a hardness of 52-54 HRc. A comparative study is further made between four predictive models using the same set of experimental data. It is shown that the FRM is superior as compared with conventional MRM, Back Propagation Neural Networks (BPNN) and Radial Basis Function Networks (RBFN) in terms of prediction accuracy and learning speed.

  18. Modelling of Performance of Caisson Type Breakwaters under Extreme Waves

    Science.gov (United States)

    Güney Doǧan, Gözde; Özyurt Tarakcıoǧlu, Gülizar; Baykal, Cüneyt

    2016-04-01

    Many coastal structures are designed without considering the loads of tsunami-like waves or long waves although they are constructed in areas prone to encountering these waves. The performance of caisson type breakwaters under extreme swells is tested at the Middle East Technical University (METU) Coastal and Ocean Engineering Laboratory. This paper compares pressure measurements taken along the surface of caisson type breakwaters with those obtained from numerical modelling using IH2VOF, as well as the damage behavior of the breakwater under the same extreme swells tested in a wave flume at METU. Experiments are conducted in the 1.5 m wide wave flume, which is divided into two parallel sections (0.74 m wide each). A piston-type wave maker located at one end of the wave basin is used to generate the long wave conditions. The water depth is set to 0.4 m and kept constant during the experiments. A caisson type breakwater is constructed on one side of the divided flume. The model scale, based on the Froude similitude law, is chosen as 1:50. Seven different wave conditions are applied in the tests, with wave periods ranging from 14.6 s to 34.7 s, wave heights from 3.5 m to 7.5 m and steepnesses from 0.002 to 0.015 in prototype scale. The design wave parameters for the breakwater were 5 m wave height and 9.5 s wave period in prototype. To determine the damage to the breakwater, which was designed for this wave but tested under swell waves, video and photo analysis as well as breakwater profile measurements before and after each test are performed. Further investigations of the wave forces acting on the concrete blocks of the caisson structures are carried out via pressure measurements on the surfaces of these structures, which are fixed to the channel bottom. Finally, these pressure measurements will be compared with the results obtained from the numerical study using IH2VOF, which is one of the RANS models that can be applied to simulate

  19. Tree-based flood damage modeling of companies: Damage processes and model performance

    Science.gov (United States)

    Sieg, Tobias; Vogel, Kristin; Merz, Bruno; Kreibich, Heidi

    2017-07-01

    Reliable flood risk analyses, including the estimation of damage, are an important prerequisite for efficient risk management. However, not much is known about flood damage processes affecting companies. Thus, we conduct a flood damage assessment of companies in Germany with regard to two aspects. First, we identify relevant damage-influencing variables. Second, we assess the prediction performance of the developed damage models with respect to the gain by using an increasing amount of training data and a sector-specific evaluation of the data. Random forests are trained with data from two postevent surveys after flood events occurring in the years 2002 and 2013. For a sector-specific consideration, the data set is split into four subsets corresponding to the manufacturing, commercial, financial, and service sectors. Further, separate models are derived for three different company assets: buildings, equipment, and goods and stock. Calculated variable importance values reveal different variable sets relevant for the damage estimation, indicating significant differences in the damage process for various company sectors and assets. With an increasing number of data used to build the models, prediction errors decrease. Yet the effect is rather small and seems to saturate for a data set size of several hundred observations. In contrast, the prediction improvement achieved by a sector-specific consideration is more distinct, especially for damage to equipment and goods and stock. Consequently, sector-specific data acquisition and a consideration of sector-specific company characteristics in future flood damage assessments is expected to improve the model performance more than a mere increase in data.
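
    A minimal sketch of the workflow this record describes (random forests per company sector, with variable-importance values and cross-validated error): the file name, feature names, and damage target below are hypothetical stand-ins for the actual survey variables.

```python
# Sector-specific random forests with variable importance and CV error.
# All names are placeholders; the real surveys cover the 2002 and 2013 floods.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

df = pd.read_csv("company_flood_survey.csv")     # hypothetical post-event survey
features = ["water_depth", "duration", "precaution", "warning_lead_time", "building_area"]

for sector, sub in df.groupby("sector"):          # e.g. manufacturing, commercial, ...
    rf = RandomForestRegressor(n_estimators=500, random_state=0)
    score = cross_val_score(rf, sub[features], sub["damage_buildings"],
                            cv=5, scoring="neg_mean_absolute_error").mean()
    rf.fit(sub[features], sub["damage_buildings"])
    importance = pd.Series(rf.feature_importances_, index=features).sort_values()
    print(sector, f"CV MAE = {-score:.0f}")
    print(importance, "\n")
```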

  20. A Tool for Performance Modeling of Parallel Programs

    Directory of Open Access Journals (Sweden)

    J.A. González

    2003-01-01

    Full Text Available Current performance prediction analytical models try to characterize the performance behavior of actual machines through a small set of parameters. In practice, substantial deviations are observed. These differences are due to factors such as memory hierarchies or network latency. A natural approach is to associate a different proportionality constant with each basic block, and analogously, to associate different latencies and bandwidths with each "communication block". Unfortunately, using this approach implies that the evaluation of parameters must be done for each algorithm. This is a heavy task, implying experiment design, timing, statistics, pattern recognition and multi-parameter fitting algorithms. Software support is required. We present a compiler that takes as source a C program annotated with complexity formulas and produces as output an instrumented code. The trace files obtained from the execution of the resulting code are analyzed with an interactive interpreter, giving us, among other information, the values of those parameters.

  1. Assigning probability distributions to input parameters of performance assessment models

    International Nuclear Information System (INIS)

    Mishra, Srikanta

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.
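
    As a small illustration of two of the fitting approaches surveyed in the report (method of moments and maximum likelihood), the sketch below fits a lognormal distribution to a synthetic sample and runs a Kolmogorov-Smirnov goodness-of-fit test; the data are generated, not taken from the Yucca Mountain study.

```python
# Method-of-moments vs maximum-likelihood fit of a lognormal distribution,
# followed by a KS goodness-of-fit check.  Sample values are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.lognormal(mean=-18.0, sigma=1.2, size=200)   # synthetic "measurements"

# Method of moments on the log-transformed data
mu_mom, sigma_mom = np.log(data).mean(), np.log(data).std(ddof=1)

# Maximum likelihood via scipy (location fixed at 0 for a two-parameter lognormal)
shape_mle, loc, scale_mle = stats.lognorm.fit(data, floc=0)
mu_mle, sigma_mle = np.log(scale_mle), shape_mle

print(f"moments: mu = {mu_mom:.2f}, sigma = {sigma_mom:.2f}")
print(f"MLE:     mu = {mu_mle:.2f}, sigma = {sigma_mle:.2f}")

# Goodness of fit: Kolmogorov-Smirnov test against the MLE fit
print(stats.kstest(data, "lognorm", args=(shape_mle, loc, scale_mle)))
```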

  2. A modelling study of long term green roof retention performance.

    Science.gov (United States)

    Stovin, Virginia; Poë, Simon; Berretta, Christian

    2013-12-15

    This paper outlines the development of a conceptual hydrological flux model for the long term continuous simulation of runoff and drought risk for green roof systems. A green roof's retention capacity depends upon its physical configuration, but it is also strongly influenced by local climatic controls, including the rainfall characteristics and the restoration of retention capacity associated with evapotranspiration during dry weather periods. The model includes a function that links evapotranspiration rates to substrate moisture content, and is validated against observed runoff data. The model's application to typical extensive green roof configurations is demonstrated with reference to four UK locations characterised by contrasting climatic regimes, using 30-year rainfall time-series inputs at hourly simulation time steps. It is shown that retention performance is dependent upon local climatic conditions. Volumetric retention ranges from 0.19 (cool, wet climate) to 0.59 (warm, dry climate). Per event retention is also considered, and it is demonstrated that retention performance decreases significantly when high return period events are considered in isolation. For example, in Sheffield the median per-event retention is 1.00 (many small events), but the median retention for events exceeding a 1 in 1 yr return period threshold is only 0.10. The simulation tool also provides useful information about the likelihood of drought periods, for which irrigation may be required. A sensitivity study suggests that green roofs with reduced moisture-holding capacity and/or low evapotranspiration rates will tend to offer reduced levels of retention, whilst high moisture-holding capacity and low evapotranspiration rates offer the strongest drought resistance. Copyright © 2013 Elsevier Ltd. All rights reserved.
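
    A toy version of the conceptual flux model described above, assuming a single substrate moisture store with evapotranspiration scaled by relative moisture content; the capacity and input series are invented, and the real model runs at hourly steps over 30-year rainfall records.

```python
# Daily "bucket" sketch: rainfall fills the substrate store up to capacity,
# excess becomes runoff, and ET (scaled by relative moisture) restores
# retention capacity between events.  All parameter values are illustrative.
def green_roof_runoff(rain_mm, pet_mm, capacity_mm=30.0):
    storage, runoff = 0.5 * capacity_mm, []
    for P, PET in zip(rain_mm, pet_mm):
        et = PET * (storage / capacity_mm)        # ET limited by moisture content
        storage = max(storage - et, 0.0) + P
        spill = max(storage - capacity_mm, 0.0)   # excess leaves as runoff
        storage -= spill
        runoff.append(spill)
    return runoff

rain = [0, 12, 3, 0, 0, 25, 0, 0, 5, 0]           # mm/day, made-up series
pet  = [3, 1, 2, 4, 4, 1, 3, 4, 2, 3]             # mm/day, made-up series
q = green_roof_runoff(rain, pet)
retention = 1.0 - sum(q) / sum(rain)
print(f"runoff = {q}\nvolumetric retention = {retention:.2f}")
```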

  3. Assigning probability distributions to input parameters of performance assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, Srikanta [INTERA Inc., Austin, TX (United States)]

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.

  4. Modelling of green roofs' hydrologic performance using EPA's SWMM.

    Science.gov (United States)

    Burszta-Adamiak, E; Mrowiec, M

    2013-01-01

    Green roofs significantly increase water retention and thus improve the management of rain water in urban areas. In Poland, as in many other European countries, excess rainwater resulting from snowmelt and heavy rainfall contributes to the development of local flooding in urban areas. Opportunities to reduce surface runoff and flood risks are among the reasons why green roofs are increasingly likely to be used in this country as well. However, there are relatively few data on their in situ performance. In this study the storm water performance of green roof experimental plots was simulated using the Storm Water Management Model (SWMM) with the Low Impact Development (LID) Controls module (version 5.0.022). The model includes many parameters for each green roof layer, but the simulation results were unsatisfactory with respect to the hydrologic response of the green roofs. For the majority of the tested rain events, the Nash coefficient had negative values, indicating a weak fit between observed and simulated flow rates. The complexity of the LID module therefore does not translate into increased accuracy. Further research at a technical scale is needed to determine the role of the green roof slope, vegetation cover and the drying process during inter-event periods.

  5. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    Energy Technology Data Exchange (ETDEWEB)

    G. Saulnier and W. Statham

    2006-04-16

    The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Performance Assessment Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash-flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following analogous characteristics as compared to the Yucca Mountain repository site: (1) Analogous source--UO₂ uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geology--(i.e. fractured, welded, and altered rhyolitic ash-flow tuffs); (3) Analogous climate--Semiarid to arid; (4) Analogous setting--Volcanic tuffs overlie carbonate rocks; and (5) Analogous geochemistry--Oxidizing conditions. Analogous hydrogeology: The ore deposit lies in the unsaturated zone above the water table.

  6. A Water Treatment Case Study for Quantifying Model Performance with Multilevel Flow Modelling

    DEFF Research Database (Denmark)

    Nielsen, Emil Krabbe; Bram, Mads Valentin; Frutiger, Jerome

    2018-01-01

    Decision support systems are a key focus of research on developing control rooms to aid operators in making reliable decisions, and reducing incidents caused by human errors. For this purpose, models of complex systems can be developed to diagnose causes or consequences for specific alarms. Model ... plant experiments are used for validation of simple Multilevel Flow Modelling models of a hydrocyclone unit for oil removal from produced water.

  7. A performance model for the communication in fast multipole methods on high-performance computing platforms

    KAUST Repository

    Ibeid, Huda

    2016-03-04

    Exascale systems are predicted to have approximately 1 billion cores, assuming gigahertz cores. Limitations on affordable network topologies for distributed memory systems of such massive scale bring new challenges to the currently dominant parallel programing model. Currently, there are many efforts to evaluate the hardware and software bottlenecks of exascale designs. It is therefore of interest to model application performance and to understand what changes need to be made to ensure extrapolated scalability. The fast multipole method (FMM) was originally developed for accelerating N-body problems in astrophysics and molecular dynamics but has recently been extended to a wider range of problems. Its high arithmetic intensity combined with its linear complexity and asynchronous communication patterns make it a promising algorithm for exascale systems. In this paper, we discuss the challenges for FMM on current parallel computers and future exascale architectures, with a focus on internode communication. We focus on the communication part only; the efficiency of the computational kernels are beyond the scope of the present study. We develop a performance model that considers the communication patterns of the FMM and observe a good match between our model and the actual communication time on four high-performance computing (HPC) systems, when latency, bandwidth, network topology, and multicore penalties are all taken into account. To our knowledge, this is the first formal characterization of internode communication in FMM that validates the model against actual measurements of communication time. The ultimate communication model is predictive in an absolute sense; however, on complex systems, this objective is often out of reach or of a difficulty out of proportion to its benefit when there exists a simpler model that is inexpensive and sufficient to guide coding decisions leading to improved scaling. The current model provides such guidance.
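
    A highly simplified sketch of an alpha-beta (latency/bandwidth) communication estimate of the sort this record develops for FMM halo exchanges across octree levels; the box counts, neighbour counts, and message sizes below are rough geometric assumptions, not the validated model from the paper.

```python
# Alpha-beta estimate of per-process FMM communication time: at each octree
# level a process exchanges a "halo" of boxes with its neighbour partitions.
# All counts and sizes are rough assumptions for illustration.
import math

def fmm_comm_time(P, N, ncrit=64, alpha=2e-6, beta=5e-10, bytes_per_cell=256):
    levels = max(math.ceil(math.log(N / ncrit, 8)), 1)   # octree depth, 8 children per box
    t = 0.0
    for level in range(1, levels + 1):
        boxes_per_proc = max(8**level // P, 1)
        # rough surface count of halo boxes around a cubic partition (assumption)
        halo_boxes = int(6 * boxes_per_proc ** (2 / 3) + 12 * boxes_per_proc ** (1 / 3) + 8)
        neighbours = min(26, P - 1)                       # nearest-neighbour partitions
        t += neighbours * (alpha + beta * halo_boxes * bytes_per_cell)
    return t

for P in (128, 1024, 8192):
    print(f"P = {P:5d}: predicted communication time = {fmm_comm_time(P, N=10**8):.4f} s")
```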

  8. Generalized Thermohydraulics Module GENFLO for Combining With the PWR Core Melting Model, BWR Recriticality Neutronics Model and Fuel Performance Model

    International Nuclear Information System (INIS)

    Miettinen, Jaakko; Hamalainen, Anitta; Pekkarinen, Esko

    2002-01-01

    Thermal hydraulic simulation capability for accident conditions is currently needed at VTT in several programs. Traditional thermal hydraulic models are too heavy for simulation in analysis tasks where the main emphasis is rapid neutron dynamics or core melting. The GENFLO thermal hydraulic model has been developed at VTT for special applications in combined codes. The basic field equations in GENFLO are the phase mass, mixture momentum and phase energy conservation equations. Phase separation is solved with the drift flux model. The basic variables to be solved are the pressure, void fraction, mixture velocity, gas enthalpy, liquid enthalpy, and the concentrations of non-condensable gas fractions. The validation of the thermohydraulic solution alone includes large break LOCA reflooding experiments and, specifically for severe accident conditions, the QUENCH tests. In the recriticality analysis the core neutronics is simulated with the two-dimensional transient neutronics code TWODIM. Recriticality with one rapid prompt peak is expected during a severe accident scenario where the control rods have melted and ECCS reflooding is started after depressurization. The GENFLO module simulates the BWR thermohydraulics in this application. The core melting module has been developed for real-time operator training using the APROS engineering simulators. The core heatup, oxidation, metal and fuel pellet relocation and corium pool formation in the lower plenum are calculated. In this application the GENFLO model simulates the PWR vessel thermohydraulics. In the fuel performance analysis the fuel rod transient behavior is simulated with the FRAPTRAN code. GENFLO simulates the subchannel around a single fuel rod and delivers the heat transfer on the cladding surface to FRAPTRAN. The transient boundary conditions for the subchannel are transmitted from the system code for operational transients, loss of coolant accidents and

  9. A process-based model for cattle manure compost windrows: Model performance and application

    Science.gov (United States)

    A model was developed and incorporated in the Integrated Farm System Model (IFSM, v.4.3) that simulates important processes occurring during windrow composting of manure. The model, documented in an accompanying paper, predicts changes in windrow properties and conditions and the resulting emissions...

  10. Human performance modeling for system of systems analytics: combat performance-shaping factors.

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, Craig R.; Miller, Dwight Peter

    2006-01-01

    The US military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives. To support this goal, Sandia National Laboratories (SNL) has undertaken a program of HPM as an integral augmentation to its system-of-system (SoS) analytics capabilities. The previous effort, reported in SAND2005-6569, evaluated the effects of soldier cognitive fatigue on SoS performance. The current effort began with a very broad survey of any performance-shaping factors (PSFs) that might also affect soldier performance in combat situations. The work included consideration of three different approaches to cognition modeling and how appropriate they would be for application to SoS analytics. The bulk of this report categorizes 47 PSFs into three groups (internal, external, and task-related) and provides brief descriptions of how each affects combat performance, according to the literature. The PSFs were then assembled into a matrix with 22 representative military tasks and assigned one of four levels of estimated negative impact on task performance, based on the literature. Blank versions of the matrix were then sent to two ex-military subject-matter experts to be filled out based on their personal experiences. Data analysis was performed to identify the PSFs judged most influential by consensus. Results indicate that combat-related injury, cognitive fatigue, inadequate training, physical fatigue, thirst, stress, poor perceptual processing, and presence of chemical agents are among the PSFs with the most negative impact on combat performance.

  11. Performance of monitoring networks estimated from a Gaussian plume model

    International Nuclear Information System (INIS)

    Seebregts, A.J.; Hienen, J.F.A.

    1990-10-01

    In support of the ECN study on monitoring strategies after nuclear accidents, the present report describes the analysis of the performance of a monitoring network in a square grid. This network is used to estimate the distribution of the deposition pattern after a release of radioactivity into the atmosphere. The analysis is based upon a single release, a constant wind direction and an atmospheric dispersion according to a simplified Gaussian plume model. A technique is introduced to estimate the parameters in this Gaussian model based upon measurements at specific monitoring locations and linear regression, although this model is intrinsically non-linear. With these estimated parameters and the Gaussian model the distribution of the contamination due to deposition can be estimated. To investigate the relation between the network and the accuracy of the estimates for the deposition, deposition data have been generated with the Gaussian model, including a measurement error, by Monte Carlo simulation, and this procedure has been repeated for several grid sizes, dispersion conditions, numbers of measurements per location, and errors per single measurement. The present technique has also been applied for the mesh sizes of two networks in the Netherlands, viz. the Landelijk Meetnet Radioactiviteit (National Measurement Network on Radioactivity, mesh size approx. 35 km) and the proposed Landelijk Meetnet Nucleaire Incidenten (National Measurement Network on Nuclear Incidents, mesh size approx. 15 km). The results show accuracies of 11 and 7 percent, respectively, if monitoring locations are used more than 10 km away from the postulated accident site. These figures are based upon 3 measurements per location and a dispersion during neutral weather with a wind velocity of 4 m/s. For stable weather conditions and low wind velocities, i.e. a small plume, the calculated accuracies are at least a factor of 1.5 worse. The present type of analysis makes a cost-benefit approach to the
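
    A small illustration of the linear-regression trick this record refers to: at a fixed downwind distance, the crosswind Gaussian profile becomes linear in y² after taking logarithms, so ordinary least squares recovers the peak deposition and plume width from noisy grid measurements. All numbers below are synthetic, not from the report.

```python
# Linearized fit of a Gaussian crosswind profile C(y) = C0 * exp(-y^2 / (2*sigma^2)):
# ln C is linear in y^2, so least squares yields C0 and sigma from noisy data.
import numpy as np

rng = np.random.default_rng(0)
y = np.linspace(-3000.0, 3000.0, 13)            # monitoring points across the plume [m]
C0_true, sigma_true = 5.0e3, 900.0              # synthetic peak deposition and plume width
C = C0_true * np.exp(-y**2 / (2 * sigma_true**2)) * rng.lognormal(0.0, 0.1, y.size)

A = np.column_stack([np.ones_like(y), y**2])    # ln C = ln C0 - y^2 / (2 sigma^2)
coef, *_ = np.linalg.lstsq(A, np.log(C), rcond=None)
C0_hat = np.exp(coef[0])
sigma_hat = np.sqrt(-1.0 / (2.0 * coef[1]))

print(f"estimated C0 = {C0_hat:.0f} (true {C0_true:.0f})")
print(f"estimated sigma_y = {sigma_hat:.0f} m (true {sigma_true:.0f} m)")
```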

  12. A high performance finite element model for wind farm modeling in forested areas

    Science.gov (United States)

    Owen, Herbert; Avila, Matias; Folch, Arnau; Cosculluela, Luis; Prieto, Luis

    2015-04-01

    Wind energy has grown significantly during the past decade and is expected to continue growing in the fight against climate change. In the search for new land where the impact of wind turbines is small, several wind farms are currently being installed in forested areas. In order to optimize the distribution of the wind turbines within a wind farm, the Reynolds-averaged Navier-Stokes equations are solved over the domain of interest using either commercial or in-house codes. The existence of a canopy alters the atmospheric boundary layer wind profile close to the ground. Therefore, in order to obtain a more accurate representation of the flow in forested areas, modifications to both the Navier-Stokes and turbulence equations need to be introduced. Several existing canopy models have been tested on an academic problem, showing that the one proposed by Sogachev et al. gives the best results. This model has been implemented in an in-house CFD solver named Alya, a high-performance unstructured finite element code designed from scratch to run on the world's biggest supercomputers; its scalability has recently been tested up to 100,000 processors on both American and European supercomputers. During the past three years the code has been tuned and tested for wind energy problems, and recent efforts have focused on the canopy model following industry needs. In this work we benchmark our results on a wind farm that is currently being designed by Scottish Power and Iberdrola in Scotland. This is a very interesting real case with extensive experimental data from five different masts with anemometers at several heights, used to benchmark both the wind profiles and the speed-up obtained between different masts. Sixteen different wind directions are simulated. The numerical model provides very satisfactory results both for the masts that are affected by the canopy and for those that are not influenced by it.
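
    For readers unfamiliar with canopy modelling, the usual approach (of which the Sogachev et al. model is a refinement) adds a drag sink to the momentum equations and corresponding source and sink terms to the turbulence equations. A generic form, shown here as an illustration rather than the exact Alya implementation, is
    \[
      S_{u_i} = -\rho\, C_d\, a(z)\, \lvert \mathbf{u} \rvert\, u_i ,
    \]
    where $C_d$ is the canopy drag coefficient, $a(z)$ the leaf area density and $\lvert\mathbf{u}\rvert$ the local wind speed; analogous terms $S_k$ and $S_\varepsilon$ (or $S_\omega$) account for the production and enhanced dissipation of turbulence inside the canopy.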

  13. The better model to predict and improve pediatric health care quality: performance or importance-performance?

    Science.gov (United States)

    Olsen, Rebecca M; Bryant, Carol A; McDermott, Robert J; Ortinau, David

    2013-01-01

    The perpetual search for ways to improve pediatric health care quality has resulted in a multitude of assessments and strategies; however, there is little research evidence as to their conditions for maximum effectiveness. A major reason for the lack of evaluation research and successful quality improvement initiatives is the methodological challenge of measuring quality from the parent perspective. Performance-only and importance-performance models were compared to determine the better predictor of pediatric health care quality and the more successful method for improving the quality of care provided to children. Fourteen pediatric health care centers serving approximately 250,000 patients in 70,000 households in three West Central Florida counties were studied. A cross-sectional design was used to determine the importance and performance of 50 pediatric health care attributes and four global assessments of pediatric health care quality. Exploratory factor analysis revealed five dimensions of care (physician care, access, customer service, timeliness of services, and health care facility). Hierarchical multiple regression compared the performance-only and the importance-performance models. In-depth interviews, participant observations, and a direct cognitive structural analysis identified the 50 health care attributes included in a survey mailed to parents (n = 1,030); the tailored design method guided survey development and data collection. The importance-performance multiplicative additive model was the better predictor of pediatric health care quality. Attribute importance moderates performance and quality, making the importance-performance model superior for measuring and providing a deeper understanding of pediatric health care quality and a better method for improving the quality of care provided to children. Regardless of attribute performance, if the level of attribute importance is not taken into consideration, health care organizations may spend valuable resources on attributes that contribute little to perceived quality.

  14. PERFORMANCE EVALUATION OF 3D MODELING SOFTWARE FOR UAV PHOTOGRAMMETRY

    OpenAIRE

    H. Yanagi; H. Yanagi; H. Chikatsu

    2016-01-01

    UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAVs with freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed. Consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modeling software are black-box algorithms...

  15. Achievement emotions and academic performance: longitudinal models of reciprocal effects

    OpenAIRE

    Pekrun, Reinhard; Lichtenfeld, Stephanie; Marsh, Herbert, W.; Murayama, Kou; Goetz, Thomas

    2017-01-01

    A reciprocal effects model linking emotion and achievement over time is proposed. The model was tested using five annual waves of the PALMA longitudinal study, which investigated adolescents’ development in mathematics (grades 5-9; N = 3,425 German students; mean starting age = 11.7 years; representative sample). Structural equation modeling showed that positive emotions (enjoyment, pride) positively predicted subsequent achievement (math end-of-the-year grades and test scores), and that achievement...

  16. Methodology for Modeling Building Energy Performance across the Commercial Sector

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, B.; Long, N.; Torcellini, P.; Judkoff, R.; Crawley, D.; Ryan, J.

    2008-03-01

    This report uses EnergyPlus simulations of each building in the 2003 Commercial Buildings Energy Consumption Survey (CBECS) to document and demonstrate bottom-up methods of modeling the entire U.S. commercial buildings sector (EIA 2006). The ability to use a whole-building simulation tool to model the entire sector is of interest because the energy models enable us to answer subsequent 'what-if' questions that involve technologies and practices related to energy. This report documents how the whole-building models were generated from the building characteristics in 2003 CBECS and compares the simulation results to the survey data for energy use.

  17. Forecasting project schedule performance using probabilistic and deterministic models

    Directory of Open Access Journals (Sweden)

    S.A. Abdel Azeem

    2014-04-01

    Full Text Available Earned value management (EVM) was originally developed for cost management and has not been widely used for forecasting project duration. In addition, EVM-based formulas for cost or schedule forecasting are still deterministic and do not provide any information about the range of possible outcomes or the probability of meeting the project objectives. The objective of this paper is to develop three models to forecast the estimated duration at completion. Two of these models are deterministic: the earned value (EV) and earned schedule (ES) models. The third model is probabilistic and is developed based on a Kalman filter algorithm and earned schedule management. The accuracies of the EV, ES and Kalman Filter Forecasting Model (KFFM) through the different project periods are assessed and compared with other forecasting methods such as the Critical Path Method (CPM), which makes the time forecast at activity level by revising the actual reporting data for each activity at a certain data date. A case study project is used to validate the results of the three models, and the best model is selected based on the lowest average percentage of error. The results showed that the KFFM developed in this study provides probabilistic prediction bounds of project duration at completion and can be applied through the different project periods with smaller errors than those observed in the EV and ES forecasting models.
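
    As a concrete reference for the deterministic ES model, the sketch below (not code from the paper) computes earned schedule by linear interpolation on the planned-value curve and forecasts duration with EAC(t) = AT + (PD - ES)/SPI(t), one of the standard earned-schedule formulas; the sample numbers are invented.

      import numpy as np

      def earned_schedule(pv_curve, ev_now):
          """Earned schedule: the time at which planned value equalled the current EV."""
          pv = np.asarray(pv_curve, dtype=float)                   # cumulative PV per period
          c = max(int(np.searchsorted(pv, ev_now, side="right")) - 1, 0)
          if c >= len(pv) - 1:
              return float(len(pv) - 1)
          return c + (ev_now - pv[c]) / (pv[c + 1] - pv[c])        # linear interpolation

      pv_curve = [0, 10, 25, 45, 70, 90, 100]    # planned value (% complete) at periods 0..6
      ev_now, at, pd_planned = 38.0, 4.0, 6.0    # earned value, actual time, planned duration
      es = earned_schedule(pv_curve, ev_now)
      spi_t = es / at
      eac_t = at + (pd_planned - es) / spi_t     # forecast duration at completion
      print(f"ES = {es:.2f}, SPI(t) = {spi_t:.2f}, EAC(t) = {eac_t:.2f} periods")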

  18. Blocking performance of the hose model and the pipe model for VPN service provisioning over WDM optical networks

    Science.gov (United States)

    Wang, Haibo; Swee Poo, Gee

    2004-08-01

    We study the provisioning of virtual private network (VPN) service over WDM optical networks. For this purpose, we investigate the blocking performance of the hose model versus the pipe model for the provisioning. Two techniques are presented: an analytical queuing model and a discrete event simulation. The queuing model is developed from the multirate reduced-load approximation technique, and the simulation is done with the OPNET simulator. Several experimental situations were used. The blocking probabilities calculated from the two approaches show a close match, indicating that the multirate reduced-load approximation technique is capable of predicting the blocking performance for the pipe model and the hose model in WDM networks. A comparison of the blocking behavior of the two models shows that the hose model has superior blocking performance compared with the pipe model. By and large, the blocking probability of the hose model is better than that of the pipe model by a few orders of magnitude, particularly in low-load regions. The flexibility of the hose model, which allows the sharing of resources on a link among all connections, accounts for its superior performance.
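
    The multirate reduced-load approximation is built on per-link blocking formulas of the Erlang-B type. As background only (not the authors' model), the standard recursion for a link with C wavelength channels offered a erlangs of traffic is sketched below.

      def erlang_b(c_channels, offered_load):
          """Erlang-B blocking probability via the standard stable recursion."""
          b = 1.0
          for n in range(1, c_channels + 1):
              b = (offered_load * b) / (n + offered_load * b)
          return b

      # Example: 16 wavelengths per link carrying 10 erlangs of offered traffic
      print(f"blocking ~ {erlang_b(16, 10.0):.4f}")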

  19. Modeling and Performance Evaluation of a Top Gated Graphene MOSFET

    Directory of Open Access Journals (Sweden)

    Jith Sarker

    2017-08-01

    Full Text Available In recent years, graphene has become a promising material on the horizon of fabrication technology, due to its unique electronic properties, such as zero band gap, high saturation velocity and high electrical conductivity, together with extraordinary thermal, optical and mechanical properties such as high thermal conductivity, optical transparency, flexibility and thinness. Graphene-based devices deserve to be considered as a possible option for post-Si fabrication technology. In this paper, we have modelled a top-gated graphene metal oxide semiconductor field effect transistor (MOSFET). The surface-potential-dependent quantum capacitance is obtained self-consistently, along with linear and square-root approximation models. The gate-voltage dependence of the surface potential has been analyzed with graphical illustrations and the required mathematics. Output characteristics, transfer characteristics and transconductance behavior (as a function of gate voltage) have been investigated. Finally, the effect of channel length on device performance has been examined, and the variation of effective mobility and minimum carrier density with channel length has been observed. Considering all of these results, we conclude that graphene will be a successor in the post-silicon era and bring revolutionary changes to the field of fabrication technology.
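
    For reference, a commonly used closed form for the quantum capacitance of graphene as a function of the channel potential $V_{ch}$ (which may differ in detail from the expressions derived in the paper) is
    \[
      C_q = \frac{2 q^2 k_B T}{\pi (\hbar v_F)^2}\,
            \ln\!\left[ 2\left( 1 + \cosh\frac{q V_{ch}}{k_B T} \right) \right],
    \]
    which becomes linear in $\lvert V_{ch}\rvert$ when $q\lvert V_{ch}\rvert \gg k_B T$; the linear and square-root approximations mentioned above are simplifications of this kind of surface-potential dependence.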

  20. Modeling the Energy Performance of LoRaWAN.

    Science.gov (United States)

    Casals, Lluís; Mir, Bernat; Vidal, Rafael; Gomez, Carles

    2017-10-16

    LoRaWAN is a flagship Low-Power Wide Area Network (LPWAN) technology that has attracted much attention from the community in recent years. Many LoRaWAN end-devices, such as sensors or actuators, are expected not to be powered by the electricity grid; therefore, it is crucial to investigate the energy consumption of LoRaWAN. However, published works have only focused on this topic to a limited extent. In this paper, we present analytical models that allow the characterization of LoRaWAN end-device current consumption, lifetime and energy cost of data delivery. The models, which have been derived from measurements on a currently prevalent LoRaWAN hardware platform, allow us to quantify the impact of relevant physical and Medium Access Control (MAC) layer LoRaWAN parameters and mechanisms, as well as Bit Error Rate (BER) and collisions, on energy performance. Among other findings, evaluation results show that an appropriately configured LoRaWAN end-device platform powered by a battery of 2400 mAh can achieve a 1-year lifetime while sending one message every 5 min, and an asymptotic theoretical lifetime of 6 years for infrequent communication.
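
    The kind of lifetime figure quoted above follows from a duty-cycle-weighted average current. The sketch below uses illustrative current draws and durations for a Class A uplink cycle (not the paper's measured values) together with the 2400 mAh battery mentioned in the abstract.

      # Hypothetical (current_mA, duration_s) pairs for one uplink cycle; illustrative only
      phases = {
          "tx":    (40.0, 1.5),      # transmission of the uplink frame
          "rx":    (12.0, 0.5),      # the two receive windows
          "sleep": (0.005, 300.0),   # sleep until the next message (5-minute period)
      }

      period_s = sum(d for _, d in phases.values())
      charge_mas = sum(i * d for i, d in phases.values())    # mA*s consumed per cycle
      avg_current_ma = charge_mas / period_s

      battery_mah = 2400.0
      lifetime_years = battery_mah / avg_current_ma / 24.0 / 365.0
      print(f"average current ~ {avg_current_ma:.3f} mA, lifetime ~ {lifetime_years:.1f} years")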

  1. CLRP: Individual evaluation of model performance for scenario S

    International Nuclear Information System (INIS)

    Krajewski, P.

    1996-01-01

    The model CLRP was created in 1989 as part of the research project 'Long-Lived Post-Chernobyl Radioactivity and Radiation Protection Criteria for Risk Reduction', performed in cooperation with the US Environmental Protection Agency. The aim of this project was to examine the fate of long-lived radionuclides in the terrestrial ecosystem. Concentrations of Cs-137 and Cs-134 in particular components of the terrestrial ecosystem, e.g. soil, vegetation, animal tissues and animal products, are calculated as a function of time following deposition from the atmosphere. Based on these data, the whole-body content of each radionuclide as a function of time is calculated, and the dose to a specific organ may be estimated as an integral of the resultant dose rate over a sufficient period. In addition, the model allows estimation of the inhalation dose from the time-integrated air concentration and the external dose from total deposition using simple conversion factors. The program is designed to allow the simulation of many different radiological situations (chronic or acute releases) and dose-affecting countermeasures. Figs, tabs

  2. A Critical Analysis of Measurement Models of Export Performance

    Directory of Open Access Journals (Sweden)

    Jorge Carneiro

    2007-05-01

    Full Text Available Poor conceptualization of the export performance construct may undermine theory development efforts and may be one of the reasons behind the often conflicting findings in empirical research on the export performance phenomenon. This article reviews the conceptual and empirical literature and proposes a new analytical scheme that may serve as a standard for judging content validity and a guiding yardstick for drawing operational representations of the construct. A critical assessment of some of the most frequently cited measurement frameworks, followed by an analysis of recent (1999-2004 empirical research, leaves no doubt that there are flaws in the conceptualization and operationalization of the performance construct that ought to be addressed. A new measurement model is advanced along with some guidelines which are suggested for its future empirical validation. The new measurement framework allegedly improves on other past efforts in terms of breadth of coverage of the construct’s domain (content validity. It also offers a measurement perspective (with the simultaneous use of both formative and reflective approaches that appears to reflect better the nature of the construct.

  3. Modeling illumination performance of plastic optical fiber passive daylighting system

    International Nuclear Information System (INIS)

    Sulaiman, F.; Ahmad, A.; Ahmed, A.Z.

    2006-01-01

    One of the most direct methods of utilizing solar energy for energy conservation is to bring natural light indoors to light up an area. This paper reports on an investigation of the feasibility of utilizing large-core optical fibers to convey and distribute solar light passively throughout residential or commercial structures. The focus of this study is on the mathematical modeling of the illumination performance and the light transmission efficiency of solid-core end-light fiber for optical daylighting systems. The MATLAB simulations cover the optical fiber transmittance for glass and plastic fibers, illumination performance over lengths of plastic end-lit fiber, spectral transmission, light intensity loss through the large-diameter solid-core optical fibers, as well as the transmission efficiency of the optical fiber itself. It was found that plastic optical fiber has less transmission loss over the distance of the fiber run, which clearly shows that plastic optical fiber should be optimized for emitting visible light. The findings from the analysis of the performance of large-diameter optical fibers suggest that daylighting systems of this kind are feasible for energy-efficient lighting in commercial or residential buildings

  4. Job Demands-Control-Support model and employee safety performance.

    Science.gov (United States)

    Turner, Nick; Stride, Chris B; Carter, Angela J; McCaughey, Deirdre; Carroll, Anthony E

    2012-03-01

    The aim of this study was to explore whether work characteristics (job demands, job control, social support) comprising Karasek and Theorell's (1990) Job Demands-Control-Support framework predict employee safety performance (safety compliance and safety participation; Neal and Griffin, 2006). We used cross-sectional data of self-reported work characteristics and employee safety performance from 280 healthcare staff (doctors, nurses, and administrative staff) from Emergency Departments of seven hospitals in the United Kingdom. We analyzed these data using a structural equation model that simultaneously regressed safety compliance and safety participation on the main effects of each of the aforementioned work characteristics, their two-way interactions, and the three-way interaction among them, while controlling for demographic, occupational, and organizational characteristics. Social support was positively related to safety compliance, and both job control and the two-way interaction between job control and social support were positively related to safety participation. How work design is related to employee safety performance remains an important area for research and provides insight into how organizations can improve workplace safety. The current findings emphasize the importance of the co-worker in promoting both safety compliance and safety participation. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.

  5. FASTSim: A Model to Estimate Vehicle Efficiency, Cost and Performance

    Energy Technology Data Exchange (ETDEWEB)

    Brooker, A.; Gonder, J.; Wang, L.; Wood, E.; Lopp, S.; Ramroth, L.

    2015-05-04

    The Future Automotive Systems Technology Simulator (FASTSim) is a high-level advanced vehicle powertrain systems analysis tool supported by the U.S. Department of Energy’s Vehicle Technologies Office. FASTSim provides a quick and simple approach to compare powertrains and estimate the impact of technology improvements on light- and heavy-duty vehicle efficiency, performance, cost, and battery life over batches of real-world drive cycles. FASTSim’s calculation framework and balance among detail, accuracy, and speed enable it to simulate thousands of driven miles in minutes. The key components and vehicle outputs have been validated by comparing the model outputs to test data for many different vehicles to provide confidence in the results. A graphical user interface makes FASTSim easy and efficient to use. FASTSim is freely available for download from the National Renewable Energy Laboratory’s website (see www.nrel.gov/fastsim).
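
    FASTSim itself is a full powertrain simulator, but the core of any such tool is a road-load power balance over a drive cycle. A minimal, generic version with illustrative parameter values (not FASTSim's internals) is sketched below.

      def wheel_power_kw(speed_ms, accel_ms2, mass_kg=1500.0, cd=0.30,
                         frontal_area_m2=2.2, crr=0.009, rho=1.2, g=9.81):
          """Tractive power demand at the wheels from aerodynamic, rolling and inertial loads."""
          aero = 0.5 * rho * cd * frontal_area_m2 * speed_ms ** 2
          rolling = mass_kg * g * crr
          inertia = mass_kg * accel_ms2
          return (aero + rolling + inertia) * speed_ms / 1000.0

      # 100 km/h cruise versus the same speed while accelerating at 1 m/s^2
      v = 100.0 / 3.6
      print(f"cruise: {wheel_power_kw(v, 0.0):.1f} kW, accelerating: {wheel_power_kw(v, 1.0):.1f} kW")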

  6. Modelling ‘Headless’: Finance, Performance Art, and Paradoxy

    Directory of Open Access Journals (Sweden)

    Angus Cameron

    2015-07-01

    Full Text Available This essay reflects on the author’s experience of a collaborative performance art project – Headless, by Goldin+Senneby – which since 2007 has created a quasi ‘model’ of offshore finance. Headless is not a conventional economic model, but rather an example of an ‘imaginary economics’ that is peculiar to contemporary art (as in Velthuis, 2005). Because of the context in which the project has unfolded, it has been able to develop an effective engagement with and critique of the ‘real’ offshore world. Headless, the essay argues, moves beyond passive representation to become an active, if deliberately paradoxical, intervention in debates over the nature of finance, spatiality, privacy, and art.

  7. Computational Human Performance Modeling For Alarm System Design

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo

    2012-07-01

    The introduction of new technologies such as adaptive automation systems and advanced alarm processing and presentation techniques in nuclear power plants is already having an impact on the safety and effectiveness of plant operations, and also on the role of the control room operator. This impact is expected to escalate dramatically as more and more nuclear power utilities embark on upgrade projects in order to extend the lifetime of their plants. One of the most visible impacts in control rooms will be the need to replace aging alarm systems. Because most of these alarm systems use obsolete technologies, the methods, techniques and tools that were used to design the previous generation of alarm systems are no longer effective and need to be updated. The same applies to the need to analyze and redefine operators’ alarm handling tasks. In the past, methods for analyzing human tasks and workload have relied on crude, paper-based methods that often lacked traceability. New approaches are needed to allow analysts to model and represent the new concepts of alarm operation and human-system interaction. State-of-the-art task simulation tools are now available that offer a cost-effective and efficient method for examining the effect of operator performance in different conditions and operational scenarios. A discrete event simulation system was used by human factors researchers at the Idaho National Laboratory to develop a generic alarm handling model to examine operator performance with a simulated modern alarm system. It allowed analysts to evaluate alarm generation patterns as well as critical task times and human workload predicted by the system.
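
    As an indication of what such a discrete event simulation looks like in practice (the laboratory's tool is not named in the abstract), the sketch below uses the open-source SimPy library to model a single operator handling a Poisson stream of alarms over one shift; all rates and task times are invented for illustration.

      import random
      import simpy

      RESPONSE_TIMES = []

      def alarm_source(env, operator, rate_per_min=2.0):
          """Generate alarms as a Poisson process and hand each one to the operator."""
          while True:
              yield env.timeout(random.expovariate(rate_per_min))
              env.process(handle_alarm(env, operator))

      def handle_alarm(env, operator):
          raised = env.now
          with operator.request() as req:                    # single operator resource
              yield req
              yield env.timeout(random.uniform(0.2, 0.6))    # acknowledge and diagnose [min]
          RESPONSE_TIMES.append(env.now - raised)

      random.seed(0)
      env = simpy.Environment()
      operator = simpy.Resource(env, capacity=1)
      env.process(alarm_source(env, operator))
      env.run(until=480)                                     # one 8-hour shift [min]
      print(f"{len(RESPONSE_TIMES)} alarms handled, "
            f"mean response {sum(RESPONSE_TIMES) / len(RESPONSE_TIMES):.2f} min")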

  8. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    International Nuclear Information System (INIS)

    G.J. Saulnier Jr; W. Statham

    2006-01-01

    The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following characteristics as compared to the Yucca Mountain repository site. (1) Analogous source: UO2 uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geologic setting: fractured, welded, and altered rhyolitic ash flow tuffs overlying carbonate rocks; (3) Analogous climate: Semiarid to arid; (4) Analogous geochemistry: Oxidizing conditions; and (5) Analogous hydrogeology: The ore deposit lies in the unsaturated zone above the water table. The Nopal I deposit is approximately 8 ± 0.5 million years old and has been exposed to oxidizing conditions during the last 3.2 to 3.4 million years. The Pena Blanca Natural Analogue Model considers that the uranium oxide and uranium silicates in the ore deposit were originally analogous to uranium-oxide spent nuclear fuel. The Pena Blanca site has been characterized using field and laboratory investigations of its fault and fracture distribution, mineralogy, fracture fillings, seepage into the mine adits, regional hydrology, and mineralization that shows the extent of radionuclide migration. Three boreholes were drilled at the Nopal I mine site in 2003, and these boreholes have provided samples for lithologic characterization, water-level measurements, and water samples for laboratory analysis.

  9. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    Energy Technology Data Exchange (ETDEWEB)

    G.J. Saulnier Jr; W. Statham

    2006-03-10

    The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following characteristics as compared to the Yucca Mountain repository site. (1) Analogous source: UO2 uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geologic setting: fractured, welded, and altered rhyolitic ash flow tuffs overlying carbonate rocks; (3) Analogous climate: Semiarid to arid; (4) Analogous geochemistry: Oxidizing conditions; and (5) Analogous hydrogeology: The ore deposit lies in the unsaturated zone above the water table. The Nopal I deposit is approximately 8 ± 0.5 million years old and has been exposed to oxidizing conditions during the last 3.2 to 3.4 million years. The Pena Blanca Natural Analogue Model considers that the uranium oxide and uranium silicates in the ore deposit were originally analogous to uranium-oxide spent nuclear fuel. The Pena Blanca site has been characterized using field and laboratory investigations of its fault and fracture distribution, mineralogy, fracture fillings, seepage into the mine adits, regional hydrology, and mineralization that shows the extent of radionuclide migration. Three boreholes were drilled at the Nopal I mine site in 2003, and these boreholes have provided samples for lithologic characterization, water-level measurements, and water samples for laboratory analysis.

  10. Proposal for a Method for Business Model Performance Assessment: Toward an Experimentation Tool for Business Model Innovation

    Directory of Open Access Journals (Sweden)

    Antonio Batocchio

    2017-04-01

    Full Text Available The representation of business models has recently become widespread, especially in the pursuit of innovation. However, defining a company’s business model is sometimes limited to discussion and debate. This study observes the need for performance measurement so that business models can be data-driven. To meet this goal, the work proposes, as a hypothesis, a method that combines the practices of the Balanced Scorecard with a method of business model representation, the Business Model Canvas. The combination was based on a study of conceptual adaptation, resulting in an application roadmap. A case study application was performed to check the functionality of the proposition, focusing on startup organizations. It was concluded that, based on the performance assessment of the business model, it is possible to pursue change through experimentation, a path that can lead to business model innovation.

  11. Performance Evaluation of Infiltration Models in a Hydromorphic Soil ...

    African Journals Online (AJOL)

    Four infiltration models were investigated for their capacity to describe water infiltration into hydromorphic (gleysol) soil. The models were Kostiakov's (1932), Philip's (1957), Kostiakov-Lewis' (1982) and Modified Kostiakov (1978). Field measurement of infiltration was made using double ring infiltrometers on an ...
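
    For orientation, the cumulative-infiltration forms usually associated with these models are
    \[
      I_{\text{Kostiakov}} = k\,t^{a}, \qquad
      I_{\text{Philip}} = S\,t^{1/2} + A\,t, \qquad
      I_{\text{Kostiakov--Lewis}} = k\,t^{a} + f_0\,t,
    \]
    where $I$ is cumulative infiltration, $t$ is time, $S$ is sorptivity, $f_0$ is the steady (basic) infiltration rate, and $k$, $a$, $A$ are empirical constants; the modified Kostiakov variant likewise augments the power law with a steady-rate or constant term, though the exact form used in the paper is not given in this record.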

  12. Influence of input matrix representation on topic modelling performance

    CSIR Research Space (South Africa)

    De Waal, A

    2010-11-01

    Full Text Available Topic models explain a collection of documents with a small set of distributions over terms. These distributions over terms define the topics. Topic models ignore the structure of documents and use a bag-of-words approach which relies solely...
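
    The "input matrix representation" at issue is the document-term matrix handed to the topic model. A minimal sketch of two common choices (raw counts versus tf-idf weighting) on toy documents is shown below; it illustrates the idea only and is not the experimental setup of the report.

      from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

      docs = [
          "topic models explain documents with distributions over terms",
          "bag of words models ignore document structure",
          "term weighting changes the input matrix representation",
      ]

      # Raw term counts: the classic bag-of-words input for LDA-style topic models
      counts = CountVectorizer().fit_transform(docs)

      # tf-idf weighted representation of the same corpus
      tfidf = TfidfVectorizer().fit_transform(docs)

      print(counts.shape, tfidf.shape)   # both are (n_documents, n_terms)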

  13. Storm Water Management Model (SWMM): Performance Review and Gap Analysis

    Science.gov (United States)

    The Storm Water Management Model (SWMM) is a widely used tool for urban drainage design and planning. Hundreds of peer-reviewed articles and conference proceedings have been written describing applications of SWMM. This review focused on collecting information on model performance...

  14. Achievement Emotions and Academic Performance: Longitudinal Models of Reciprocal Effects

    Science.gov (United States)

    Pekrun, Reinhard; Lichtenfeld, Stephanie; Marsh, Herbert W.; Murayama, Kou; Goetz, Thomas

    2017-01-01

    A reciprocal effects model linking emotion and achievement over time is proposed. The model was tested using five annual waves of the Project for the Analysis of Learning and Achievement in Mathematics (PALMA) longitudinal study, which investigated adolescents' development in mathematics (Grades 5-9; N = 3,425 German students; mean starting age = 11.7 years; representative sample)…

  15. SR 97. Alternative models project. Channel network modelling of Aberg. Performance assessment using CHAN3D

    International Nuclear Information System (INIS)

    Gylling, B.; Moreno, L.; Neretnieks, I.

    1999-06-01

    In earlier papers, discussions of the mechanisms which are important in performance assessment in fractured media are given. The influence of these mechanisms has been demonstrated using CHAN3D. In this study CHAN3D has been used to simulate production of input data to COMP23 and FARF31. CHAN3D has been integrated with COMP23 in earlier studies, but it has not been used before to calculate input data to FARF31. In the normal use of CHAN3D, the transport part of the concept simulates far-field migration. The task in this study was to produce input data according to a specification, using a defined hypothetical repository located at the Aespoe HRL as a platform. During the process of applying CHAN3D to the site, the scaling of conductivity was studied, using both data from the Aespoe HRL and synthetic data. From the realisations performed, ensemble statistics of water travel time, flux at repository scale, flow-wetted surface and F-ratio values were calculated. Two typical realisations were studied in more detail, and the results for three specified canister positions were also highlighted. Exit locations for the released particles were studied. In each realisation statistics were calculated over these entities, and the values were post-processed to obtain performance measures of higher order. From the averaging over all the realisations it can be concluded that Monte Carlo stability is reached for the ensemble statistics. The presence of fracture zones has a large influence on flow and transport in the rock. However, for a single canister the result may be very different between realisations: in some realisations there may be a fast path to a fracture zone, whereas in others the opposite may hold. From the calculation of the flow over the boundaries between the regional model and the smaller local model, the consistency seems to be acceptable, considering that a perfect match of properties is hard to obtain.

  16. Performance of a quasi-steady model for hovering hummingbirds

    Directory of Open Access Journals (Sweden)

    Jialei Song

    2015-01-01

    Full Text Available A quasi-steady model describing the aerodynamics of hovering Ruby-throated hummingbirds is presented, both to study the extent to which this low-order model represents the flow physics of the bird and to separately quantify the forces arising from translational, rotational, and acceleration effects. Realistic wing kinematics are adopted and the model is calibrated against computational fluid dynamics (CFD) simulations of a corresponding revolving-wing model. The results show that the quasi-steady model is able to predict overall lift production reasonably well but fails to capture detailed force oscillations. The downstroke–upstroke asymmetry is consistent with that in the previous CFD study. Further analysis shows that significant rotational force is produced during mid-stroke rather than at wing reversal.

  17. Dst Index in the 2008 GEM Modeling Challenge - Model Performance for Moderate and Strong Magnetic Storms

    Science.gov (United States)

    Rastaetter, Lutz; Kuznetsova, Maria; Hesse, Michael; Chulaki, Anna; Pulkkinen, Antti; Ridley, Aaron J.; Gombosi, Tamas; Vapirev, Alexander; Raeder, Joachim; Wiltberger, Michael James

    2010-01-01

    The GEM 2008 modeling challenge efforts are expanding beyond comparing in-situ measurements in the magnetosphere and ionosphere to include the computation of indices to be compared. The Dst index measures the largest deviations of the horizontal magnetic field at 4 equatorial magnetometers from the quiet-time background field and is commonly used to track the strength of the magnetic disturbance of the magnetosphere during storms. Models can calculate a proxy Dst index in various ways, including using the Dessler-Parker-Sckopke relation with the energy of the ring current, or Biot-Savart integration of electric currents in the magnetosphere. The GEM modeling challenge investigates 4 space weather events, and we compare models available at the CCMC against each other and against the observed values of Dst. Models used include SWMF/BATSRUS, OpenGGCM, LFM, GUMICS (3D magnetosphere MHD models), Fok-RC, CRCM, RAM-SCB (kinetic drift models of the ring current), WINDMI (magnetosphere-ionosphere electric circuit model), and predictions based on an impulse response function (IRF) model and analytic coupling functions with inputs of solar wind data. In addition to the analysis of model-observation comparisons, we look at the way Dst is computed in global magnetosphere models. The default value of Dst computed by the SWMF model is based on Bz at the Earth's center. In addition to this, we present results obtained at different locations on the Earth's surface. We choose equatorial locations at local noon, dusk (18:00 hours), midnight and dawn (6:00 hours). The different virtual observatory locations reveal the variation around the earth-centered Dst value resulting from the distribution of electric currents in the magnetosphere during different phases of a storm.

  18. Discharge simulations performed with a hydrological model using bias corrected regional climate model input

    Directory of Open Access Journals (Sweden)

    S. C. van Pelt

    2009-12-01

    Full Text Available Studies have demonstrated that precipitation on Northern Hemisphere mid-latitudes has increased in the last decades and that it is likely that this trend will continue. This will have an influence on discharge of the river Meuse. The use of bias correction methods is important when the effect of precipitation change on river discharge is studied. The objective of this paper is to investigate the effect of using two different bias correction methods on output from a Regional Climate Model (RCM simulation. In this study a Regional Atmospheric Climate Model (RACMO2 run is used, forced by ECHAM5/MPIOM under the condition of the SRES-A1B emission scenario, with a 25 km horizontal resolution. The RACMO2 runs contain a systematic precipitation bias on which two bias correction methods are applied. The first method corrects for the wet day fraction and wet day average (WD bias correction and the second method corrects for the mean and coefficient of variance (MV bias correction. The WD bias correction initially corrects well for the average, but it appears that too many successive precipitation days were removed with this correction. The second method performed less well on average bias correction, but the temporal precipitation pattern was better. Subsequently, the discharge was calculated by using RACMO2 output as forcing to the HBV-96 hydrological model. A large difference was found between the simulated discharge of the uncorrected RACMO2 run, the WD bias corrected run and the MV bias corrected run. These results show the importance of an appropriate bias correction.
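
    As an illustration of the second (mean and coefficient of variation) approach, a generic sketch, not the authors' implementation, rescales a simulated precipitation series so that its mean and variability match observed targets:

      import numpy as np

      def mv_bias_correct(sim, obs_mean, obs_std):
          """Rescale a simulated series to match an observed mean and standard deviation."""
          sim = np.asarray(sim, dtype=float)
          corrected = obs_mean + (sim - sim.mean()) * (obs_std / sim.std())
          return np.clip(corrected, 0.0, None)     # precipitation cannot be negative

      rng = np.random.default_rng(1)
      sim_precip = rng.gamma(shape=0.8, scale=4.0, size=3650)   # synthetic daily RCM output [mm]
      corrected = mv_bias_correct(sim_precip, obs_mean=2.1, obs_std=5.0)
      print(f"mean {corrected.mean():.2f} mm/day, std {corrected.std():.2f} mm/day")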

  19. Model Engine Performance Measurement From Force Balance Instrumentation

    Science.gov (United States)

    Jeracki, Robert J.

    1998-01-01

    A large scale model representative of a low-noise, high bypass ratio turbofan engine was tested for acoustics and performance in the NASA Lewis 9- by 15-Foot Low-Speed Wind Tunnel. This test was part of NASA's continuing Advanced Subsonic Technology Noise Reduction Program. The low tip speed fan, nacelle, and an un-powered core passage (with core inlet guide vanes) were simulated. The fan blades and hub are mounted on a rotating thrust and torque balance. The nacelle, bypass duct stators, and core passage are attached to a six component force balance. The two balance forces, when corrected for internal pressure tares, measure the total thrust-minus-drag of the engine simulator. Corrected for scaling and other effects, it is basically the same force that the engine supports would feel, operating at similar conditions. A control volume is shown and discussed, identifying the various force components of the engine simulator thrust and definitions of net thrust. Several wind tunnel runs with nearly the same hardware installed are compared, to identify the repeatability of the measured thrust-minus-drag. Other wind tunnel runs, with hardware changes that affected fan performance, are compared to the baseline configuration, and the thrust and torque effects are shown. Finally, a thrust comparison between the force balance and nozzle gross thrust methods is shown, and both yield very similar results.

  20. Modeling silica aerogel optical performance by determining its radiative properties

    Science.gov (United States)

    Zhao, Lin; Yang, Sungwoo; Bhatia, Bikram; Strobach, Elise; Wang, Evelyn N.

    2016-02-01

    Silica aerogel has been known as a promising candidate for high performance transparent insulation material (TIM). Optical transparency is a crucial metric for silica aerogels in many solar related applications. Both scattering and absorption can reduce the amount of light transmitted through an aerogel slab. Due to multiple scattering, the transmittance deviates from the Beer-Lambert law (exponential attenuation). To better understand its optical performance, we decoupled and quantified the extinction contributions of absorption and scattering separately by identifying two sets of radiative properties. The radiative properties are deduced from the measured total transmittance and reflectance spectra (from 250 nm to 2500 nm) of synthesized aerogel samples by solving the inverse problem of the 1-D Radiative Transfer Equation (RTE). The obtained radiative properties are found to be independent of the sample geometry and can be considered intrinsic material properties, which originate from the aerogel's microstructure. This finding allows for these properties to be directly compared between different samples. We also demonstrate that by using the obtained radiative properties, we can model the photon transport in aerogels of arbitrary shapes, where an analytical solution is difficult to obtain.

  1. An Efficient Framework Model for Optimizing Routing Performance in VANETs

    Science.gov (United States)

    Zulkarnain, Zuriati Ahmad; Subramaniam, Shamala

    2018-01-01

    Routing in Vehicular Ad hoc Networks (VANETs) is complicated by the highly dynamic mobility of nodes. The efficiency of a routing protocol is influenced by a number of factors, such as network density, bandwidth constraints, traffic load, and mobility patterns resulting in frequent changes in network topology. Therefore, Quality of Service (QoS) support is strongly needed to enhance the capability of the routing protocol and improve overall network performance. In this paper, we introduce a statistical framework model to address the problem of optimizing routing configuration parameters in Vehicle-to-Vehicle (V2V) communication. Our framework solution is based on the utilization of the network resources to reflect the current state of the network and to balance the trade-off between frequent changes in network topology and the QoS requirements. It consists of three stages: a simulation network stage used to execute different urban scenarios, a function stage used as a competitive approach to aggregate the weighted cost of the factors into a single value, and an optimization stage used to evaluate the communication cost and to obtain the optimal configuration based on the competitive cost. The simulation results show significant performance improvement in terms of Packet Delivery Ratio (PDR), Normalized Routing Load (NRL), Packet Loss (PL), and End-to-End Delay (E2ED). PMID:29462884

  2. Battery Performance Modelling and Simulation: a Neural Network Based Approach

    Science.gov (United States)

    Ottavianelli, Giuseppe; Donati, Alessandro

    2002-01-01

    This project has developed against the background of ongoing research within the Control Technology Unit (TOS-OSC) of the Special Projects Division at the European Space Operations Centre (ESOC) of the European Space Agency. The purpose of this research is to develop and validate an Artificial Neural Network (ANN) tool able to model, simulate and predict the Cluster II battery system's performance degradation. (The Cluster II mission consists of four spacecraft flying in tetrahedral formation, aimed at observing and studying the interaction between the Sun and the Earth by passing in and out of our planet's magnetic field.) This prototype tool, named BAPER and developed with a commercial neural network toolbox, could be used to support short- and medium-term mission planning in order to improve and maximise battery lifetime, determining the best future charge/discharge cycles for the batteries given their present states, in view of a Cluster II mission extension. This study focuses on the five silver-cadmium batteries onboard Tango, the fourth Cluster II satellite, but time constraints have so far allowed an assessment of only the first battery. In their most basic form, ANNs are hyper-dimensional curve fits for non-linear data. With their remarkable ability to derive meaning from complicated or imprecise historical data, ANNs can be used to extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques. ANNs learn by example, which is why they can be described as inductive, or data-based, models for the simulation of input/target mappings. A trained ANN can be thought of as an "expert" in the category of information it has been given to analyse, and this expert can then be used, as in this project, to provide projections for new situations of interest and answer "what if" questions. The most appropriate algorithm, in terms of training speed and memory storage requirements, is clearly the Levenberg-Marquardt algorithm.
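
    As a generic illustration of this data-driven approach (not the BAPER tool itself, whose inputs and toolbox are not detailed here), a small feed-forward network can be fitted to synthetic telemetry-like features to predict a battery performance index:

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(4)
      # Hypothetical features: depth of discharge, temperature [C], cycle count
      X = rng.uniform([0.1, 0.0, 0.0], [1.0, 30.0, 500.0], size=(500, 3))
      # Synthetic target: a normalized performance index with mild degradation trends
      y = 1.0 - 0.25 * X[:, 0] - 0.002 * X[:, 1] - 0.0006 * X[:, 2] + rng.normal(0, 0.01, 500)

      model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0).fit(X, y)
      print(model.predict([[0.5, 20.0, 250.0]]))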

  3. Human Performance Modeling for Dynamic Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory; Mandelli, Diego [Idaho National Laboratory

    2015-08-01

    Part of the U.S. Department of Energy’s (DOE’s) Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk framework. In this paper, we review simulation-based and non-simulation-based human reliability analysis (HRA) methods. This paper summarizes the foundational information needed to develop a feasible approach to modeling human interactions in RISMC simulations.

  4. Limits of performance for the model reduction problem of hidden Markov models

    KAUST Repository

    Kotsalis, Georgios

    2015-12-15

    We introduce system theoretic notions of a Hankel operator, and Hankel norm for hidden Markov models. We show how the related Hankel singular values provide lower bounds on the norm of the difference between a hidden Markov model of order n and any lower order approximant of order n̂ < n.
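
    The flavour of the result parallels the classical Hankel-norm bound from linear systems theory: writing $\sigma_1 \ge \sigma_2 \ge \dots$ for the Hankel singular values of the full-order model $G$, any approximant $\hat G$ of order $\hat n < n$ satisfies
    \[
      \lVert G - \hat G \rVert \;\ge\; \sigma_{\hat n + 1}(G),
    \]
    with the norm and the Hankel operator understood in the hidden Markov model sense defined in the paper.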

  5. An Integrated Performance Evaluation Model for the Photovoltaics Industry

    Directory of Open Access Journals (Sweden)

    He-Yau Kang

    2012-04-01

    Full Text Available Global warming is causing damaging changes to the climate around the world. Because of environmental protection concerns and natural resource scarcity, alternative forms of energy, such as wind energy, fire energy, hydropower energy, geothermal energy, solar energy, biomass energy, ocean power and natural gas, are gaining attention as means of meeting global energy demands. Following Japan’s nuclear plant disaster in March 2011, people are demanding alternative energy resources that not only produce little or no air pollution and greenhouse gases, but also offer a high level of safety. Solar energy, which depends on an infinite resource, the sun, is one of the most promising renewable energy sources from the perspective of environmental sustainability. Currently, the manufacturing cost of solar cells is still very high, and the power conversion efficiency is low. Therefore, photovoltaics (PV) firms must continue to invest in research and development, commit to product differentiation, achieve economies of scale, and consider the possibility of vertical integration, in order to strengthen their competitiveness and to acquire the maximum benefit from the PV market. This research proposes a performance evaluation model integrating the analytic hierarchy process (AHP) and data envelopment analysis (DEA) to assess the current business performance of PV firms. AHP is applied to obtain experts’ opinions on the importance of the factors, and DEA is used to determine which firms are efficient. A case study is performed on the crystalline silicon PV firms in Taiwan. The findings shall help the firms determine their strengths and weaknesses and provide directions for future improvements in business operations.

  6. Correlation between human observer performance and model observer performance in differential phase contrast CT

    International Nuclear Information System (INIS)

    Li, Ke; Garrett, John; Chen, Guang-Hong

    2013-01-01

    Purpose: With the recently expanding interest and developments in x-ray differential phase contrast CT (DPC-CT), the evaluation of its task-specific detection performance and comparison with the corresponding absorption CT under a given radiation dose constraint become increasingly important. Mathematical model observers are often used to quantify the performance of imaging systems, but their correlations with actual human observers need to be confirmed for each new imaging method. This work is an investigation of the effects of stochastic DPC-CT noise on the correlation of detection performance between model and human observers with signal-known-exactly (SKE) detection tasks. Methods: The detectabilities of different objects (five disks with different diameters and two breast lesion masses) embedded in an experimental DPC-CT noise background were assessed using both model and human observers. The detectability of the disk and lesion signals was then measured using five types of model observers, including the prewhitening ideal observer, the nonprewhitening (NPW) observer, the nonprewhitening observer with eye filter and internal noise (NPWEi), the prewhitening observer with eye filter and internal noise (PWEi), and the channelized Hotelling observer (CHO). The same objects were also evaluated by four human observers using the two-alternative forced choice method. The results from the model observer experiment were quantitatively compared to the human observer results to assess the correlation between the two techniques. Results: The contrast-to-detail (CD) curve generated by the human observers for the disk-detection experiments shows that the required contrast to detect a disk is inversely proportional to the square root of the disk size. Based on the CD curves, the ideal and NPW observers tend to systematically overestimate the performance of the human observers. The NPWEi and PWEi observers did not predict human performance well either, as the slopes of their CD curves...
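
    For orientation, the NPW observer mentioned above applies the expected difference signal as a matched template. A generic sketch of its test statistic and the resulting detectability index, using synthetic 1-D data rather than the DPC-CT images of the study, is:

      import numpy as np

      def npw_detectability(signal_present, signal_absent):
          """Non-prewhitening observer: template = mean signal-present minus mean signal-absent."""
          sp = np.asarray(signal_present, dtype=float)     # shape (n_images, n_pixels)
          sa = np.asarray(signal_absent, dtype=float)
          template = sp.mean(axis=0) - sa.mean(axis=0)
          t_sp, t_sa = sp @ template, sa @ template        # scalar test statistics
          return (t_sp.mean() - t_sa.mean()) / np.sqrt(0.5 * (t_sp.var() + t_sa.var()))

      rng = np.random.default_rng(2)
      signal = np.zeros(64); signal[28:36] = 0.5           # a small "disk" in a 1-D profile
      absent = rng.normal(0.0, 1.0, (200, 64))             # noise-only realizations
      present = rng.normal(0.0, 1.0, (200, 64)) + signal   # signal-known-exactly realizations
      print(f"d' ~ {npw_detectability(present, absent):.2f}")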

  7. An integrated control-oriented modelling for HVAC performance benchmarking

    NARCIS (Netherlands)

    Satyavada, Harish; Baldi, S.

    2016-01-01

    Energy efficiency in building heating, ventilating and air conditioning (HVAC) equipment requires the development of accurate models for testing HVAC control strategies and the corresponding energy consumption. In order to make HVAC control synthesis computationally affordable, such models need to be control-oriented, i.e. simple enough for controller design while still capturing the relevant dynamics.

  8. Sustainable innovation, business models and economic performance: an overview

    NARCIS (Netherlands)

    Montalvo Corral, C.

    2013-01-01

    Sustainable development requires radical and systemic innovations. Such innovations can be more effectively created and studied when building on the concept of business models. This concept provides firms with a holistic framework to envision and implement sustainable innovations. For researchers,

  9. Modeling the performance of a manually operated legume dehuller ...

    African Journals Online (AJOL)

    Abstract. Dimensional analysis was used to establish a model equation for predicting the relationships between the dimensionless hulling efficiency (η/M) and the dehulling stress (τ), dehulling length (L), throughput (F) and dehulling speed (S).

  10. Simplified Predictive Models for CO2 Sequestration Performance Assessment

    Science.gov (United States)

    Mishra, Srikanta; RaviGanesh, Priya; Schuetter, Jared; Mooney, Douglas; He, Jincong; Durlofsky, Louis

    2014-05-01

    We present results from an ongoing research project that seeks to develop and validate a portfolio of simplified modeling approaches that will enable rapid feasibility and risk assessment for CO2 sequestration in deep saline formation. The overall research goal is to provide tools for predicting: (a) injection well and formation pressure buildup, and (b) lateral and vertical CO2 plume migration. Simplified modeling approaches that are being developed in this research fall under three categories: (1) Simplified physics-based modeling (SPM), where only the most relevant physical processes are modeled, (2) Statistical-learning based modeling (SLM), where the simulator is replaced with a "response surface", and (3) Reduced-order method based modeling (RMM), where mathematical approximations reduce the computational burden. The system of interest is a single vertical well injecting supercritical CO2 into a 2-D layered reservoir-caprock system with variable layer permeabilities. In the first category (SPM), we use a set of well-designed full-physics compositional simulations to understand key processes and parameters affecting pressure propagation and buoyant plume migration. Based on these simulations, we have developed correlations for dimensionless injectivity as a function of the slope of fractional-flow curve, variance of layer permeability values, and the nature of vertical permeability arrangement. The same variables, along with a modified gravity number, can be used to develop a correlation for the total storage efficiency within the CO2 plume footprint. In the second category (SLM), we develop statistical "proxy models" using the simulation domain described previously with two different approaches: (a) classical Box-Behnken experimental design with a quadratic response surface fit, and (b) maximin Latin Hypercube sampling (LHS) based design with a Kriging metamodel fit using a quadratic trend and Gaussian correlation structure. For roughly the same number of
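
    A minimal example of the statistical-learning ingredient, a Latin Hypercube design with a Gaussian-process (Kriging) fit, is sketched below on a toy two-parameter response; it is illustrative only and uses none of the project's simulation data (the project additionally uses a maximin design criterion and a quadratic trend).

      import numpy as np
      from scipy.stats import qmc
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def toy_simulator(x):
          """Stand-in for the reservoir simulator: response over two scaled parameters."""
          return np.sin(3 * x[:, 0]) + x[:, 1] ** 2

      X = qmc.LatinHypercube(d=2, seed=3).random(n=30)   # 30 design points in [0, 1]^2
      y = toy_simulator(X)

      proxy = GaussianProcessRegressor(kernel=RBF(length_scale=0.3)).fit(X, y)
      X_new = np.array([[0.2, 0.7], [0.8, 0.1]])
      print(proxy.predict(X_new), toy_simulator(X_new))   # proxy versus "true" response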

  11. Do Some Business Models Perform Better than Others?

    OpenAIRE

    Malone, Thomas; Weill, Peter; Lai, Richard; D'Urso, Victoria; Herman, George; Apel, Thomas; Woerner, Stephanie

    2006-01-01

    This paper defines four basic business models based on what asset rights are sold (Creators, Distributors, Landlords and Brokers) and four variations of each based on what type of assets are involved (Financial, Physical, Intangible, and Human). Using this framework, we classified the business models of all 10,970 publicly traded firms in the US economy from 1998 through 2002. Some of these classifications were done manually, based on the firms' descriptions of sources of revenue in their fin...

  12. AGING PERFORMANCE OF MODEL 9975 PACKAGE FLUOROELASTOMER O-RINGS

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, E.; Daugherty, W.; Skidmore, E.; Dunn, K.; Fisher, D.

    2011-05-31

    The influence of temperature and radiation on Viton® GLT and GLT-S fluoroelastomer O-rings is an ongoing research focus at the Savannah River National Laboratory. The O-rings are credited for leaktight containment in the Model 9975 shipping package used for transportation of plutonium-bearing materials. At the Savannah River Site, the Model 9975 packages are being used for interim storage. Primary research efforts have focused on surveillance of O-rings from actual packages, leak testing of seals at bounding aging conditions and the effect of aging temperature on compression stress relaxation behavior, with the goal of service life prediction for long-term storage conditions. Recently, an additional effort to evaluate the effect of aging temperature on the oxidation of the materials has begun. Degradation in the mechanical properties of elastomers is directly related to the oxidation of the polymer. Sensitive measurements of the oxidation rate can be performed in a more timely manner than waiting for a measurable change in mechanical properties, especially at service temperatures. Measuring the oxidation rate therefore provides a means to validate the assumption that the degradation mechanism(s) do not change between the elevated temperatures used for accelerated aging and the lower service temperatures. Monitoring the amount of oxygen uptake by the material over time at various temperatures can provide increased confidence in lifetime predictions. A preliminary oxygen consumption analysis of a Viton GLT-based fluoroelastomer compound (Parker V0835-75) using an Oxzilla II differential oxygen analyzer in the temperature range of 40-120 °C was performed. Early data suggest that oxygen consumption rates may level off within the first 100,000 hours (10-12 years) at 40 °C and that sharp changes in the degradation mechanism (stress relaxation) are not expected over the temperature range examined. This is consistent with the known long-term heat aging resistance of these fluoroelastomer materials.

  13. Modeling Friction Performance of Drill String Torsional Oscillation Using Dynamic Friction Model

    Directory of Open Access Journals (Sweden)

    Xingming Wang

    2017-01-01

    Full Text Available Drill string torsional and longitudinal oscillation can significantly reduce axial drag in horizontal drilling. An improved theoretical model for the analysis of the frictional force was proposed, based on microscopic contact deformation theory and a bristle model. The resulting dynamic friction model, established for drill strings in a wellbore, was used to determine the relationship between changes in friction force and drill string torsional vibration. The model results were in good agreement with the experimental data, verifying the accuracy of the established model. Analysis of the influence of drilling mud properties indicated an approximately linear relationship between the axial friction force and the dynamic shear and viscosity. The influence of drill string torsional oscillation on the axial friction force is discussed. The results indicated that transverse velocity of the drill string is a prerequisite for reducing axial friction, and that even a low-amplitude torsional vibration speed can significantly reduce axial friction; further increasing the amplitude of the transverse vibration speed does not significantly reduce axial friction further. Finally, by incorporating typical field drilling parameters, this model can accurately describe the friction behavior and quantitatively predict the frictional resistance in horizontal drilling.
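
    The bristle-model family referred to here is typified by the LuGre formulation, in which an internal bristle-deflection state $z$ drives the friction force; whether the paper uses exactly this form is not stated in the record, but the canonical equations are
    \[
      \frac{dz}{dt} = v - \frac{\lvert v\rvert}{g(v)}\,z, \qquad
      F = \sigma_0 z + \sigma_1 \frac{dz}{dt} + \sigma_2 v, \qquad
      g(v) = \frac{F_c + (F_s - F_c)\,e^{-(v/v_s)^2}}{\sigma_0},
    \]
    where $v$ is the relative sliding velocity, $\sigma_0$, $\sigma_1$, $\sigma_2$ are the bristle stiffness, micro-damping and viscous coefficients, and $F_c$, $F_s$, $v_s$ are the Coulomb force, static force and Stribeck velocity.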

  14. DiamondTorre Algorithm for High-Performance Wave Modeling

    Directory of Open Access Journals (Sweden)

    Vadim Levchenko

    2016-08-01

    Full Text Available Effective algorithms for the numerical modeling of physical media are discussed. The computation rate of such problems is limited by memory bandwidth when implemented with traditional algorithms. The numerical solution of the wave equation is considered. A finite difference scheme with a cross stencil and a high order of approximation is used. The DiamondTorre algorithm is constructed with regard to the specifics of the GPGPU’s (general purpose graphics processing unit) memory hierarchy and parallelism. The advantages of this algorithm are a high level of data locality and the property of asynchrony, which allows one to effectively utilize all levels of GPGPU parallelism. The computational intensity of the algorithm is greater than that of the best traditional algorithms with stepwise synchronization. As a consequence, it becomes possible to overcome the above-mentioned limitation. The algorithm is implemented with CUDA. For the scheme with the second order of approximation, a calculation performance of 50 billion cells per second is achieved. This exceeds the result of the best traditional algorithm by a factor of five.
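    For reference, the conventional stepwise-synchronized cross-stencil update that DiamondTorre is compared against looks like the sketch below (a second-order scheme for the 1D wave equation); every time step streams the full arrays through memory, which is why such schemes are bandwidth bound. This is an illustration only, not the DiamondTorre algorithm itself.

```python
# Conventional cross-stencil update for the 1D wave equation (2nd order),
# shown as the memory-bandwidth-bound baseline; not the DiamondTorre scheme.
import numpy as np

nx, nt = 1000, 500
c, dx = 1.0, 1.0 / 1000
dt = 0.5 * dx / c                           # CFL-stable time step
C2 = (c * dt / dx) ** 2

x = np.linspace(0.0, 1.0, nx)
u_prev = np.exp(-200.0 * (x - 0.5) ** 2)    # initial Gaussian pulse
u = u_prev.copy()                           # zero initial velocity

for _ in range(nt):
    u_next = np.empty_like(u)
    u_next[1:-1] = (2.0 * u[1:-1] - u_prev[1:-1]
                    + C2 * (u[2:] - 2.0 * u[1:-1] + u[:-2]))
    u_next[0] = u_next[-1] = 0.0            # fixed boundaries
    u_prev, u = u, u_next

print("max |u| after", nt, "steps:", float(np.abs(u).max()))
```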

  15. Fast Performance Computing Model for Smart Distributed Power Systems

    Directory of Open Access Journals (Sweden)

    Umair Younas

    2017-06-01

    Full Text Available Plug-in Electric Vehicles (PEVs) are becoming a more prominent solution than fossil-fuel car technology due to their significant role in Greenhouse Gas (GHG) reduction, flexible storage, and ancillary service provision as a Distributed Generation (DG) resource in Vehicle-to-Grid (V2G) regulation mode. However, large-scale penetration of PEVs and the growing demand of energy-intensive Data Centers (DCs) bring undesirable load peaks in electricity demand and hence impose supply-demand imbalance and threaten the reliability of the wholesale and retail power markets. In order to overcome these challenges, the proposed research considers a smart Distributed Power System (DPS) comprising conventional sources, renewable energy, V2G regulation, and flexible storage energy resources. Moreover, price- and incentive-based Demand Response (DR) programs are implemented to sustain the balance between net demand and available generating resources in the DPS. In addition, we adopted a novel strategy to implement the computationally intensive jobs of the proposed DPS model, including incoming load profiles, V2G regulation, battery State of Charge (SOC) indication, and fast computation in a decision-based automated DR algorithm, using the Fast Performance Computing resources of DCs. In response, the DPS provides economical and stable power to the DCs under strict power quality constraints. Finally, the improved results are verified using a case study of ISO California integrated with hybrid generation.

  16. Computational fluid dynamics analysis of cyclist aerodynamics: performance of different turbulence-modelling and boundary-layer modelling approaches.

    Science.gov (United States)

    Defraeye, Thijs; Blocken, Bert; Koninckx, Erwin; Hespel, Peter; Carmeliet, Jan

    2010-08-26

    This study aims at assessing the accuracy of computational fluid dynamics (CFD) for applications in sports aerodynamics, for example for drag predictions of swimmers, cyclists or skiers, by evaluating the applied numerical modelling techniques by means of detailed validation experiments. In this study, a wind-tunnel experiment on a scale model of a cyclist (scale 1:2) is presented. Apart from three-component forces and moments, also high-resolution surface pressure measurements on the scale model's surface, i.e. at 115 locations, are performed to provide detailed information on the flow field. These data are used to compare the performance of different turbulence-modelling techniques, such as steady Reynolds-averaged Navier-Stokes (RANS), with several k-epsilon and k-omega turbulence models, and unsteady large-eddy simulation (LES), and also boundary-layer modelling techniques, namely wall functions and low-Reynolds number modelling (LRNM). The commercial CFD code Fluent 6.3 is used for the simulations. The RANS shear-stress transport (SST) k-omega model shows the best overall performance, followed by the more computationally expensive LES. Furthermore, LRNM is clearly preferred over wall functions to model the boundary layer. This study showed that there are more accurate alternatives for evaluating flow around bluff bodies with CFD than the standard k-epsilon model combined with wall functions, which is often used in CFD studies in sports. 2010 Elsevier Ltd. All rights reserved.

  17. Incorporating a 360 Degree Evaluation Model IOT Transform the USMC Performance Evaluation System

    Science.gov (United States)

    2005-02-08

    Paper examining the incorporation of a 360-degree evaluation model in order to transform the USMC performance evaluation system (EWS 2005; subject area: Manpower; period covered: 2005).

  18. Computational Model-Based Prediction of Human Episodic Memory Performance Based on Eye Movements

    Science.gov (United States)

    Sato, Naoyuki; Yamaguchi, Yoko

    Subjects' episodic memory performance is not simply reflected by eye movements. We use a ‘theta phase coding’ model of the hippocampus to predict subjects' memory performance from their eye movements. Results demonstrate the ability of the model to predict subjects' memory performance. These studies provide a novel approach to computational modeling in the human-machine interface.

  19. Microstructural Modeling of Brittle Materials for Enhanced Performance and Reliability.

    Energy Technology Data Exchange (ETDEWEB)

    Teague, Melissa Christine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Teague, Melissa Christine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rodgers, Theron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rodgers, Theron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grutzik, Scott Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grutzik, Scott Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Meserole, Stephen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Meserole, Stephen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    Brittle failure is often influenced by difficult-to-measure and variable microstructure-scale stresses. Recent advances in photoluminescence spectroscopy (PLS), including improved confocal laser measurement and rapid spectroscopic data collection, have established the potential to map stresses with microscale spatial resolution (<2 microns). Advanced PLS was successfully used to investigate both residual and externally applied stresses in polycrystalline alumina at the microstructure scale. The measured average stresses matched those estimated from beam theory to within one standard deviation, validating the technique. Modeling the residual stresses within the microstructure produced general agreement in comparison with the experimentally measured results. Microstructure-scale modeling is primed to take advantage of advanced PLS to enable its refinement and validation, eventually enabling microstructure modeling to become a predictive tool for brittle materials.

  20. Evaluating the performance and utility of regional climate models

    DEFF Research Database (Denmark)

    Christensen, Jens H.; Carter, Timothy R.; Rummukainen, Markku

    2007-01-01

    This special issue of Climatic Change contains a series of research articles documenting co-ordinated work carried out within a 3-year European Union project 'Prediction of Regional scenarios and Uncertainties for Defining European Climate change risks and Effects' (PRUDENCE). The main objective...... weather events and (7) implications of the results for policy. A paper summarising the related MICE (Modelling the Impact of Climate Extremes) project is also included. The second part of the issue contains 12 articles that focus in more detail on some of the themes summarised in the overarching papers....... The PRUDENCE results represent the first comprehensive, continental-scale intercomparison and evaluation of high resolution climate models and their applications, bringing together climate modelling, impact research and social sciences expertise on climate change....

  1. Lifetime-Aware Cloud Data Centers: Models and Performance Evaluation

    Directory of Open Access Journals (Sweden)

    Luca Chiaraviglio

    2016-06-01

    Full Text Available We present a model to evaluate the server lifetime in cloud data centers (DCs). In particular, when the server power level is decreased, the failure rate tends to be reduced as a consequence of the limited number of components powered on. However, the variation between the different power states triggers a failure rate increase. We therefore consider these two effects in a server lifetime model, subject to an energy-aware management policy. We then evaluate our model in a realistic case study. Our results show that the impact on the server lifetime is far from negligible. As a consequence, we argue that a lifetime-aware approach should be pursued to decide how and when to apply a power state change to a server.
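    The trade-off described above can be illustrated with a toy calculation: lower power states reduce the baseline failure rate, while every power-state change adds a wear-out penalty. The sketch below is illustrative only; the failure rates, the transition penalty and the schedules are placeholders, not parameters of the cited model.

```python
# Toy illustration of the two competing lifetime effects (placeholder numbers).
FAILURE_RATE = {"full": 2.0e-5, "low": 1.2e-5, "sleep": 0.6e-5}  # failures/hour
TRANSITION_PENALTY = 1.0e-5    # extra failure probability per power-state change

def mean_lifetime_years(schedule):
    """schedule: repeating daily list of (state, hours) segments."""
    daily_prob = sum(FAILURE_RATE[s] * h for s, h in schedule)
    daily_prob += len(schedule) * TRANSITION_PENALTY      # switching wear-out
    return 1.0 / daily_prob / 365.0                       # crude MTTF estimate

always_on  = [("full", 24.0)]
aggressive = [("full", 8.0), ("low", 8.0), ("sleep", 8.0)]
print("always on :", round(mean_lifetime_years(always_on), 1), "years")
print("aggressive:", round(mean_lifetime_years(aggressive), 1), "years")
```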

  2. Modelling of cloudless solar radiation for PV module performance analysis

    International Nuclear Information System (INIS)

    Dusabe, D.; Munda, J.; Jimoh, A.

    2009-01-01

    The empirical model developed in this study uses standard specifications together with actual solar radiation and cell temperature to predict voltage-current characteristics of a photovoltaic panel under varying weather conditions. The paper focuses on the modelling of hourly cloudless solar radiation to provide the insolation on a PV module of any orientation, located at any site. The model is built in MATLAB/Simulink environment to provide a tool that may be loaded in the library. It is found that the predicted solar radiation strongly agrees with the experimental data from the National Renewable Energy Laboratory (NREL). Further, a satisfactory agreement between the predicted voltage - current curves and laboratory measurements is obtained. (authors)
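    A minimal cloudless-sky irradiance calculation in the same spirit is sketched below. It uses the Kasten-Young air-mass approximation and the widely quoted Meinel empirical attenuation DNI = 1353 * 0.7^(AM^0.678) W/m^2, projected onto the module plane; it is an illustration only, not the empirical model developed in the paper.

```python
# Simple cloudless-sky plane-of-array beam irradiance (illustrative only).
import math

def air_mass(zenith_deg):
    """Kasten-Young air-mass approximation."""
    z = zenith_deg
    return 1.0 / (math.cos(math.radians(z)) + 0.50572 * (96.07995 - z) ** -1.6364)

def clear_sky_poa(zenith_deg, incidence_deg):
    """Beam irradiance on the module plane for a cloudless sky [W/m^2]."""
    if zenith_deg >= 90.0 or incidence_deg >= 90.0:
        return 0.0
    dni = 1353.0 * 0.7 ** (air_mass(zenith_deg) ** 0.678)   # Meinel approximation
    return dni * math.cos(math.radians(incidence_deg))

# Sun 30 deg from zenith, module surface within 10 deg of facing the sun
print(round(clear_sky_poa(30.0, 10.0), 1), "W/m^2")
```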

  3. Multi-Site Validation of the SWAT Model on the Bani Catchment: Model Performance and Predictive Uncertainty

    Directory of Open Access Journals (Sweden)

    Jamilatou Chaibou Begou

    2016-04-01

    Full Text Available The objective of this study was to assess the performance and predictive uncertainty of the Soil and Water Assessment Tool (SWAT) model on the Bani River Basin, at catchment and subcatchment levels. The SWAT model was calibrated using the Generalized Likelihood Uncertainty Estimation (GLUE) approach. Potential Evapotranspiration (PET) and biomass were considered in the verification of model output accuracy. Global Sensitivity Analysis (GSA) was used for identifying important model parameters. Results indicated a good performance of the global model at daily as well as monthly time steps, with adequate predictive uncertainty. PET was found to be overestimated but biomass was better predicted in agricultural land and forest. Surface runoff represents the dominant process in streamflow generation in that region. Individual calibration at the subcatchment scale yielded better performance than when the global parameter sets were applied. These results are very useful and provide support to further studies on regionalization to make predictions in ungauged basins.
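    The general GLUE recipe used here (Monte Carlo sampling, a behavioural likelihood threshold, likelihood-weighted prediction bounds) can be sketched as follows. A toy linear-reservoir model stands in for SWAT, and the threshold, likelihood measure and synthetic data are illustrative, not the study's setup.

```python
# Minimal GLUE sketch with a toy rainfall-runoff model standing in for SWAT.
import numpy as np

rng = np.random.default_rng(1)

def toy_model(rain, k):
    """Stand-in model: single linear reservoir with outflow coefficient k."""
    q, s = [], 0.0
    for r in rain:
        s += r
        out = k * s
        s -= out
        q.append(out)
    return np.array(q)

def nse(obs, sim):
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def weighted_quantile(values, weights, q):
    order = np.argsort(values)
    cdf = np.cumsum(weights[order])
    return np.interp(q, cdf / cdf[-1], values[order])

rain = rng.gamma(2.0, 2.0, size=200)
obs = toy_model(rain, 0.35) + rng.normal(0.0, 0.3, size=200)  # synthetic "observations"

samples = rng.uniform(0.05, 0.95, size=2000)                  # Monte Carlo parameter sets
sims = np.array([toy_model(rain, k) for k in samples])
scores = np.array([nse(obs, s) for s in sims])

behavioural = scores > 0.5                                    # behavioural threshold
w = scores[behavioural] - 0.5
w /= w.sum()                                                  # likelihood weights
lo = np.array([weighted_quantile(sims[behavioural][:, t], w, 0.05) for t in range(len(obs))])
hi = np.array([weighted_quantile(sims[behavioural][:, t], w, 0.95) for t in range(len(obs))])
print("behavioural sets:", int(behavioural.sum()),
      "| mean 90% band width:", round(float((hi - lo).mean()), 3))
```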

  4. Target acquisition modeling over the exact optical path: extending the EOSTAR TDA with the TOD sensor performance model

    Science.gov (United States)

    Dijk, J.; Bijl, P.; Oppeneer, M.; ten Hove, R. J. M.; van Iersel, M.

    2017-10-01

    The Electro-Optical Signal Transmission and Ranging (EOSTAR) model is an image-based Tactical Decision Aid (TDA) for thermal imaging systems (MWIR/LWIR) developed for a sea environment with an extensive atmosphere model. The Triangle Orientation Discrimination (TOD) Target Acquisition model calculates the sensor and signal processing effects on a set of input triangle test pattern images, judges their orientation using humans or a Human Visual System (HVS) model and derives the system image quality and operational field performance from the correctness of the responses. Combining the TOD model with EOSTAR essentially makes it possible to model Target Acquisition (TA) performance over the exact path from scene to observer. In this method, ship-representative TOD test patterns are placed at the position of the real target; subsequently, the combined effects of the environment (atmosphere, background, etc.), sensor and signal processing on the image are calculated using EOSTAR, and finally the results are judged by humans. The thresholds are converted into Detection-Recognition-Identification (DRI) ranges of the real target. Experiments show that the combination of the TOD model and the EOSTAR model is indeed feasible. The resulting images look natural and provide insight into the possibilities of combining the two models. The TOD observation task can be done well by humans, and the measured TOD is consistent with analytical TOD predictions for the same camera that was modeled in the ECOMOS project.

  5. Comparing the performance of species distribution models of

    NARCIS (Netherlands)

    Valle , M.; van Katwijk, M.M.; de Jong, D.J.; Bouma, T.; Schipper, A.M.; Chust, G.; Benito, B.M.; Garmendia, J.M.; Borja, A.

    2013-01-01

    Intertidal seagrasses show high variability in their extent and location, with local extinctions and (re-)colonizations being inherent in their population dynamics. Suitable habitats are identified usually using Species Distribution Models (SDM), based upon the overall distribution of the species;

  6. Influence of horizontal resolution and ensemble size on model performance

    CSIR Research Space (South Africa)

    Dalton, A

    2014-10-01

    Full Text Available Computing costs increase with an increase in global model resolution and ensemble size. This paper strives to determine the extent to which resolution and ensemble size affect seasonal forecast skill when simulating mid-summer rainfall totals over...

  7. Comparative assessment of PV plant performance models considering climate effects

    DEFF Research Database (Denmark)

    Tina, Giuseppe; Ventura, Cristina; Sera, Dezso

    2017-01-01

    The methodological approach is based on comparative tests of the analyzed models applied to two PV plants, installed respectively in the north of Denmark (Aalborg) and in the south of Italy (Agrigento). The different ambient, operating and installation conditions make it possible to understand how these factors impact the precision...

  8. Development of a Simple Hydraulic Performance Model for ...

    African Journals Online (AJOL)

    ... model prediction is proposed. The procedure for conducting the field flow test in the determination of the Hazen William's friction factor is described by utilizing the set up and data of the field flow test carried out on 11th August 1988 on completion of the swabbing exercise. Journal of Civil Engineering, JKUAT (2001) Vol 6, ...

  9. Modeling Logistic Performance in Quantitative Microbial Risk Assessment

    NARCIS (Netherlands)

    Rijgersberg, H.; Tromp, S.O.; Jacxsens, L.; Uyttendaele, M.

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage

  10. Modeling the performance of coated LPG tanks engulfed in fires

    NARCIS (Netherlands)

    Cozzani, V.; Landucci, G.; Molag, M. (Menso)

    2009-01-01

    The improvement of passive fire protection of storage vessels is a key factor to enhance safety among the LPG distribution chain. A thermal and mechanical model based on finite elements simulations was developed to assess the behaviour of full size tanks used for LPG storage and transportation in

  11. Modeling the performance of coated LPG tanks engulfed in fires

    NARCIS (Netherlands)

    Landucci, G.; Molag, M.; Cozzani, V.

    2009-01-01

    The improvement of passive fire protection of storage vessels is a key factor to enhance safety among the LPG distribution chain. A thermal and mechanical model based on finite elements simulations was developed to assess the behaviour of full size tanks used for LPG storage and transportation in

  12. Modelling biomechanical performance and injuries for sport applications in MADYMO

    NARCIS (Netherlands)

    Forbes, P.A.; Wolski, S.; Cappon, H.; Ruimmerman, R.; Rodarius, C.

    2007-01-01

    MADYMO is the worldwide standard software for the design and analyses of safety devices that protect occupants in car crashes, such as seatbelts and airbags. It features generic multibody and finite element capabilities, a full range of predictive and efficient occupant models (both crash dummies

  13. Regional climate model performance and prediction of seasonal ...

    African Journals Online (AJOL)

    Knowledge about future climate provides valuable insights into how the challenges posed by climate change and variability can be addressed. ... Impacts Studies) in simulating rainfall and temperature over Uganda and also assess future impacts of climate when forced by an ensemble of two Global Climate Models (GCMs) ...

  14. Performances of estimators of linear model with auto-correlated ...

    African Journals Online (AJOL)

    A Monte Carlo study of the small-sample properties of five estimators of a linear model with autocorrelated error terms is discussed. The independent variable was specified as standard normal data. The estimators of the slope coefficients β, with the help of Ordinary Least Squares (OLS), increased with increased ...

  15. The European computer model for optronic system performance prediction (ECOMOS)

    NARCIS (Netherlands)

    Kessler, S.; Bijl, P.; Labarre, L.; Repasi, E.; Wittenstein, W.; Bürsing, H.

    2017-01-01

    ECOMOS is a multinational effort within the framework of an EDA Project Arrangement. Its aim is to provide a generally accepted and harmonized European computer model for computing nominal Target Acquisition (TA) ranges of optronic imagers operating in the Visible or thermal Infrared (IR). The

  16. Stochastic Modeling and Performance Analysis of Multimedia SoCs

    DEFF Research Database (Denmark)

    Raman, Balaji; Nouri, Ayoub; Gangadharan, Deepak

    2013-01-01

    Reliability and flexibility are among the key required features of a framework used to model a system. Existing approaches to design resource-constrained, soft-real time systems either provide guarantees for output quality or account for loss in the system, but not both. We propose two independent...

  17. Performances of some estimators of linear model with ...

    African Journals Online (AJOL)

    The estimators are compared by examining their finite-sample properties, namely: the sum of biases, sum of absolute biases, sum of variances and sum of the mean squared errors of the estimated parameters of the model. Results show that when the autocorrelation level is small (ρ=0.4), the MLGD estimator is best except when ...

  18. Constraining performance assessment models with tracer test results: a comparison between two conceptual models

    Science.gov (United States)

    McKenna, Sean A.; Selroos, Jan-Olof

    Tracer tests are conducted to ascertain solute transport parameters of a single rock feature over a 5-m transport pathway. Two different conceptualizations of double-porosity solute transport provide estimates of the tracer breakthrough curves. One of the conceptualizations (single-rate) employs a single effective diffusion coefficient in a matrix with infinite penetration depth. However, the tracer retention between different flow paths can vary as the ratio of flow-wetted surface to flow rate differs between the path lines. The other conceptualization (multirate) employs a continuous distribution of multiple diffusion rate coefficients in a matrix with variable, yet finite, capacity. Application of these two models with the parameters estimated on the tracer test breakthrough curves produces transport results that differ by orders of magnitude in peak concentration and time to peak concentration at the performance assessment (PA) time and length scales (100,000 years and 1,000 m). These differences are examined by calculating the time limits for the diffusive capacity to act as an infinite medium. These limits are compared across both conceptual models and also against characteristic times for diffusion at both the tracer test and PA scales. Additionally, the differences between the models are examined by re-estimating parameters for the multirate model from the traditional double-porosity model results at the PA scale. Results indicate that for each model the amount of the diffusive capacity that acts as an infinite medium over the specified time scale explains the differences between the model results and that tracer tests alone cannot provide reliable estimates of transport parameters for the PA scale. Results of Monte Carlo runs of the transport models with varying travel times and path lengths show consistent results between models and suggest that the variation in flow-wetted surface to flow rate along path lines is insignificant relative to variability in

  19. Assessment of classical performance measures and signature indices from Flow Duration Curves for model evaluation.

    Science.gov (United States)

    Ley, Rita; Hellebrand, Hugo; Casper, Markus C.; Fenicia, Fabrizio

    2015-04-01

    The result of model evaluation is strongly influenced by the choice of the performance measures used. There exists a large variety of performance measures, each with its strengths and weaknesses. Although all of them represent the ability of a hydrological model to reproduce observed stream flow, it is unclear which one is most appropriate for specific applications. The objective of this study is to investigate which performance measure is best suited to finding the best performing model structure for a single basin out of multiple model structures. We compare the usability of a new performance measure, the Standardized Signature Index Sum, with several classical statistical performance measures and hydrological performance measures such as the Root Mean Square Error or the Nash-Sutcliffe Efficiency. In contrast to the classical and hydrological performance measures, the Standardized Signature Index Sum is based on the comparison of observed and simulated Flow Duration Curves (FDCs). It combines the performance for different parts of the FDC into one measure considering the whole FDC and therefore the whole hydrograph. For this purpose, 12 model structures were generated using the SUPERFLEX modeling framework and applied to 53 meso-scale basins in Rhineland-Palatinate (Germany). For all calibrated models based on the 12 model structures and 53 basins, we calculate several performance measures and compare their usability for identifying the best performing model structure for each basin. In many cases the classical performance measures and the hydrological performance measures assigned similar values to seemingly different hydrographs simulated with different model structures. Therefore, these measures are not well suited for model comparison. The proposed Standardized Signature Index Sum is more effective in revealing differences between model results. Furthermore, it provides information on which part of the hydrograph a model fails in, and how. The Signature Index Sum allows for a
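    The two families of measures discussed above can be contrasted with a small sketch: classical hydrograph-based scores (RMSE, Nash-Sutcliffe Efficiency) versus a simple distance between observed and simulated Flow Duration Curves. The segment-based signature below is a generic illustration, not the exact Standardized Signature Index Sum of the study.

```python
# Classical scores vs. a simple FDC-based signature distance (illustrative).
import numpy as np

def rmse(obs, sim):
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

def nse(obs, sim):
    return float(1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2))

def fdc(q, exceedance):
    """Flow exceeded with the given probability (flow duration curve)."""
    return np.quantile(q, 1.0 - exceedance)

def fdc_signature_distance(obs, sim,
                           segments=((0.0, 0.2), (0.2, 0.7), (0.7, 1.0))):
    """Sum of normalized FDC errors over high-, mid- and low-flow segments."""
    total = 0.0
    for lo_p, hi_p in segments:
        p = np.linspace(lo_p + 1e-3, hi_p - 1e-3, 50)
        o, s = fdc(obs, p), fdc(sim, p)
        total += float(np.mean(np.abs(s - o)) / (np.mean(o) + 1e-12))
    return total

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 1.5, size=1000)
sim = obs * 1.1 + rng.normal(0.0, 0.2, size=1000)   # a biased, noisy "model"

print("RMSE:", round(rmse(obs, sim), 3),
      "| NSE:", round(nse(obs, sim), 3),
      "| FDC signature distance:", round(fdc_signature_distance(obs, sim), 3))
```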

  20. Performance model-directed data sieving for high-performance I/O

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yong; Lu, Yin; Amritkar, Prathamesh; Thakur, Rajeev; Zhuang, Yu

    2014-09-10

    Many scientific computing applications and engineering simulations exhibit noncontiguous I/O access patterns. Data sieving is an important technique to improve the performance of noncontiguous I/O accesses by combining small and noncontiguous requests into a large and contiguous request. It has been proven effective even though more data are potentially accessed than demanded. In this study, we propose a new data sieving approach, namely performance model-directed data sieving, or PMD data sieving for short. It improves the existing data sieving approach in two aspects: (1) it dynamically determines when it is beneficial to perform data sieving; and (2) it dynamically determines how to perform data sieving if beneficial. It improves the performance of the existing data sieving approach considerably and reduces memory consumption, as verified by both theoretical analysis and experimental results. Given the importance of supporting noncontiguous accesses effectively and reducing memory pressure in large-scale systems, the proposed PMD data sieving approach holds great promise and will have an impact on high-performance I/O systems.
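    The core decision can be illustrated with a toy cost model: compare issuing every small request separately against one large contiguous read that covers the whole extent. The latency and bandwidth figures below are illustrative placeholders, not the actual PMD performance model.

```python
# Toy cost model for the "is data sieving beneficial?" decision (illustrative).
def sieving_beneficial(requests, latency_s=5e-3, bandwidth_Bps=200e6):
    """requests: sorted list of (offset, length) byte ranges."""
    useful = sum(length for _, length in requests)
    first_off = requests[0][0]
    last_off, last_len = requests[-1]
    extent = (last_off + last_len) - first_off     # bytes read when sieving

    t_naive = len(requests) * latency_s + useful / bandwidth_Bps
    t_sieve = latency_s + extent / bandwidth_Bps   # one big request
    return t_sieve < t_naive, t_naive, t_sieve

# 64 noncontiguous 4 KiB reads spread over a ~16 MiB extent
reqs = [(i * 256 * 1024, 4096) for i in range(64)]
ok, t_naive, t_sieve = sieving_beneficial(reqs)
print("sieving beneficial:", ok,
      "| naive %.3f s vs sieving %.3f s" % (t_naive, t_sieve))
```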

  1. Modelling of Human Transplacental Transport as Performed in Copenhagen, Denmark

    DEFF Research Database (Denmark)

    Mathiesen, L.; Morck, T. A.; Zuri, G.

    2014-01-01

    classes of chemicals and nanoparticles for comparisons across chemical structures as well as different test systems. Our test systems are based on human material to bypass the extrapolation from animal data. By combining data from our two test systems, we are able to rank and compare the transport...... the relationships between maternal and foetal exposures to various compounds including pollutants such as polychlorinated biphenyls, polybrominated flame retardants, nanoparticles as well as recombinant human antibodies. The compounds have been studied in the human placenta perfusion model and to some extent...... in vitro with an established human monolayer trophoblast cell culture model. Results from our studies distinguish placental transport of substances by physicochemical properties, adsorption to placental tissue, binding to transport and receptor proteins and metabolism. We have collected data from different...

  2. Model-supported selection of distribution coefficients for performance assessment

    International Nuclear Information System (INIS)

    Ochs, M.; Lothenbach, B.; Shibata, Hirokazu; Yui, Mikazu

    1999-01-01

    A thermodynamic speciation/sorption model is used to illustrate typical problems encountered in the extrapolation of batch-type K d values to repository conditions. For different bentonite-groundwater systems, the composition of the corresponding equilibrium solutions and the surface speciation of the bentonite is calculated by treating simultaneously solution equilibria of soluble components of the bentonite as well as ion exchange and acid/base reactions at the bentonite surface. K d values for Cs, Ra, and Ni are calculated by implementing the appropriate ion exchange and surface complexation equilibria in the bentonite model. Based on this approach, hypothetical batch experiments are contrasted with expected conditions in compacted backfill. For each of these scenarios, the variation of K d values as a function of groundwater composition is illustrated for Cs, Ra, and Ni. The applicability of measured, batch-type K d values to repository conditions is discussed. (author)

  3. Competency-Based Model for Predicting Construction Project Managers Performance

    OpenAIRE

    Dainty, A. R. J.; Cheng, M.; Moore, D. R.

    2005-01-01

    Using behavioral competencies to influence human resource management decisions is gaining popularity in business organizations. This study identifies the core competencies associated with the construction management role and further, develops a predictive model to inform human resource selection and development decisions within large construction organizations. A range of construction managers took part in behavioral event interviews where staffs were asked to recount critical management inci...

  4. Zeolite Membranes: Ozone Detemplation, Modeling, and Performance Characterization

    OpenAIRE

    Kuhn, J.

    2009-01-01

    Membrane technology plays an increasingly important role in developing a more sustainable process industry. Zeolites are a novel class of membrane materials with unique properties enabling molecular sieving and affinity based separations. This thesis proposes some new concepts in zeolite membrane synthesis, application, and modeling. The influence of zeolite polarity is assessed and the use of a hydrophobic zeolite membrane for water separation is explored. Ozonication, a novel method for zeo...

  5. OPNET Modeler simulations of performance for multi nodes wireless systems

    Directory of Open Access Journals (Sweden)

    Krupanek Beata

    2016-01-01

    Full Text Available The paper presents a study of Quality of Service in modern wireless sensor networks. Such networks are characterized by small amounts of data transmitted at fixed intervals. Very often these data must be transmitted in real time, so data transmission delays should be well known. This article presents a multi-node network simulated in the packet-level simulator OPNET Modeler. Quality of service is nowadays very important, especially in multi-node systems such as home automation or measurement systems.

  6. Predicting Adaptive Performance in Multicultural Teams: A Causal Model

    Science.gov (United States)

    2008-02-01

    International Personality Item Pool – Five-Factor Model (IPIP-FFM), http://ipip.ori.org/, were used in the present study to assess neuroticism as an... IPIP personality scale. Based on Matsumoto et al.’s (2001) results, only those items that exceeded their established criterion for factor loadings... IPIP) were combined in a composite score representing cultural adjustment (α = .75). As described below, the factor of emotion regulation will be

  7. Performance Analysis and Modeling of Thermally Sprayed Resistive Heaters

    Science.gov (United States)

    Lamarre, Jean-Michel; Marcoux, Pierre; Perrault, Michel; Abbott, Richard C.; Legoux, Jean-Gabriel

    2013-08-01

    Many processes and systems require hot surfaces. These are usually heated using electrical elements located in their vicinity. However, this solution is subject to intrinsic limitations associated with heating element geometry and physical location. Thermally spraying electrical elements directly on surfaces can overcome these limitations by tailoring the geometry of the heating element to the application. Moreover, the element heat transfer is maximized by minimizing the distance between the heater and the surface to be heated. This article is aimed at modeling and characterizing resistive heaters sprayed on metallic substrates. Heaters were fabricated by using a plasma-sprayed alumina dielectric insulator and a wire flame-sprayed iron-based alloy resistive element. Samples were energized and kept at a constant temperature of 425 °C for up to 4 months. SEM cross-sectional observations revealed the formation of cracks at very specific locations in the alumina layer after thermal use. Finite-element modeling shows that these cracks originate from high local thermal stresses and can be predicted according to the considered geometry. The simulation model was refined using experimental parameters obtained by several techniques such as emissivity and time-dependent temperature profile (infra-red camera), resistivity (four-probe technique), thermal diffusivity (laser flash method), and mechanical properties (micro and nanoindentation). The influence of the alumina thickness and the substrate material on crack formation was evaluated.

  8. Achievement Emotions and Academic Performance: Longitudinal Models of Reciprocal Effects.

    Science.gov (United States)

    Pekrun, Reinhard; Lichtenfeld, Stephanie; Marsh, Herbert W; Murayama, Kou; Goetz, Thomas

    2017-09-01

    A reciprocal effects model linking emotion and achievement over time is proposed. The model was tested using five annual waves of the Project for the Analysis of Learning and Achievement in Mathematics (PALMA) longitudinal study, which investigated adolescents' development in mathematics (Grades 5-9; N = 3,425 German students; mean starting age = 11.7 years; representative sample). Structural equation modeling showed that positive emotions (enjoyment, pride) positively predicted subsequent achievement (math end-of-the-year grades and test scores), and that achievement positively predicted these emotions, controlling for students' gender, intelligence, and family socioeconomic status. Negative emotions (anger, anxiety, shame, boredom, hopelessness) negatively predicted achievement, and achievement negatively predicted these emotions. The findings were robust across waves, achievement indicators, and school tracks, highlighting the importance of emotions for students' achievement and of achievement for the development of emotions. © 2017 The Authors. Child Development © 2017 Society for Research in Child Development, Inc.

  9. Model-based optimization biofilm based systems performing autotrophic nitrogen removal using the comprehensive NDHA model

    DEFF Research Database (Denmark)

    Valverde Pérez, Borja; Ma, Yunjie; Morset, Martin

    Completely autotrophic nitrogen removal (CANR) can be obtained in single-stage biofilm-based bioreactors. However, their environmental footprint is compromised due to elevated N2O emissions. We developed a novel, spatially explicit biochemical process model of biofilm-based CANR systems that predicts...

  10. Middle-School Science Students' Scientific Modelling Performances Across Content Areas and Within a Learning Progression

    Science.gov (United States)

    Bamberger, Yael M.; Davis, Elizabeth A.

    2013-01-01

    This paper focuses on students' ability to transfer modelling performances across content areas, taking into consideration their improvement of content knowledge as a result of a model-based instruction. Sixty-five sixth grade students of one science teacher in an urban public school in the Midwestern USA engaged in scientific modelling practices that were incorporated into a curriculum focused on the nature of matter. Concept-process models were embedded in the curriculum, as well as emphasis on meta-modelling knowledge and modelling practices. Pre-post test items that required drawing scientific models of smell, evaporation, and friction were analysed. The level of content understanding was coded and scored, as were the following elements of modelling performance: explanation, comparativeness, abstraction, and labelling. Paired t-tests were conducted to analyse differences in students' pre-post tests scores on content knowledge and on each element of the modelling performances. These are described in terms of the amount of transfer. Students significantly improved in their content knowledge for the smell and the evaporation models, but not for the friction model, which was expected as that topic was not taught during the instruction. However, students significantly improved in some of their modelling performances for all the three models. This improvement serves as evidence that the model-based instruction can help students acquire modelling practices that they can apply in a new content area.

  11. Performance Estimation of Networked Business Models: Case Study on a Finnish eHealth Service Project

    Directory of Open Access Journals (Sweden)

    Marikka Heikkilä

    2014-08-01

    Full Text Available Purpose: The objective of this paper is to propose and demonstrate a framework for estimating performance in a networked business model. Design/methodology/approach: Our approach is design science, utilising action research in studying a case of four independent firms in the Health & Wellbeing sector aiming to jointly provide a new service for business and private customers. The duration of the research study is 3 years. Findings: We propose that a balanced set of performance indicators can be defined by paying attention to all main components of the business model, enriched with measures of network collaboration. The results highlight the importance of measuring all main components of the business model and also the business network partners’ view on trust, contracts and fairness. Research implications: This article contributes to the business model literature by combining business modelling with performance evaluation. The article points out that it is essential to create metrics that can be applied to evaluate and improve the business model blueprints, but it is also important to measure business collaboration aspects. Practical implications: Companies have already adopted the Business Model Canvas or similar business model tools to innovate new business models. We suggest that companies continue their business model innovation work by agreeing on a set of performance metrics, building on the business model components enriched with measures of network collaboration. Originality/value: This article contributes to the business model literature and praxis by combining business modelling with performance evaluation.

  12. Predicting the Impacts of Intravehicular Displays on Driving Performance with Human Performance Modeling

    Science.gov (United States)

    Mitchell, Diane Kuhl; Wojciechowski, Josephine; Samms, Charneta

    2012-01-01

    A challenge facing the U.S. National Highway Traffic Safety Administration (NHTSA), as well as international safety experts, is the need to educate car drivers about the dangers associated with performing distraction tasks while driving. Researchers working for the U.S. Army Research Laboratory have developed a technique for predicting the increase in mental workload that results when distraction tasks are combined with driving. They implement this technique using human performance modeling. They have predicted workload associated with driving combined with cell phone use. In addition, they have predicted the workload associated with driving military vehicles combined with threat detection. Their technique can be used by safety personnel internationally to demonstrate the dangers of combining distracter tasks with driving and to mitigate the safety risks.

  13. Uncertainty in Earth System Models: Benchmarks for Ocean Model Performance and Validation

    Science.gov (United States)

    Ogunro, O. O.; Elliott, S.; Collier, N.; Wingenter, O. W.; Deal, C.; Fu, W.; Hoffman, F. M.

    2017-12-01

    The mean ocean CO2 sink is a major component of the global carbon budget, with marine reservoirs holding about fifty times more carbon than the atmosphere. Phytoplankton play a significant role in the net carbon sink through photosynthesis and drawdown, such that about a quarter of anthropogenic CO2 emissions end up in the ocean. Biology greatly increases the efficiency of marine environments in CO2 uptake and ultimately reduces the impact of the persistent rise in atmospheric concentrations. However, a number of challenges remain in appropriate representation of marine biogeochemical processes in Earth System Models (ESM). These threaten to undermine the community effort to quantify seasonal to multidecadal variability in ocean uptake of atmospheric CO2. In a bid to improve analyses of marine contributions to climate-carbon cycle feedbacks, we have developed new analysis methods and biogeochemistry metrics as part of the International Ocean Model Benchmarking (IOMB) effort. Our intent is to meet the growing diagnostic and benchmarking needs of ocean biogeochemistry models. The resulting software package has been employed to validate DOE ocean biogeochemistry results by comparison with observational datasets. Several other international ocean models contributing results to the fifth phase of the Coupled Model Intercomparison Project (CMIP5) were analyzed simultaneously. Our comparisons suggest that the biogeochemical processes determining CO2 entry into the global ocean are not well represented in most ESMs. Polar regions continue to show notable biases in many critical biogeochemical and physical oceanographic variables. Some of these disparities could have first order impacts on the conversion of atmospheric CO2 to organic carbon. In addition, single forcing simulations show that the current ocean state can be partly explained by the uptake of anthropogenic emissions. Combined effects of two or more of these forcings on ocean biogeochemical cycles and ecosystems

  14. Modeling seismic performance of high-strength steel–ultra-high-performance concrete piers with modified Kent–Park model using fiber elements

    Directory of Open Access Journals (Sweden)

    Zhen Wang

    2016-02-01

    Full Text Available The seismic performance of ultra-high-performance concrete–high-strength steel piers was studied using fiber elements, which can accurately model the elastic–plastic behavior of members using fibers with different material constitutive relations. For high-strength steel–ultra-high-performance concrete piers, the modified Kent–Park model was utilized to describe the compressive stress–strain relations of ultra-high-performance concrete and high-strength steel-confined ultra-high-performance concrete, respectively, by determining four key parameters. A finite element model was established to simulate the hysteretic response; conduct parameter analysis including axial load ratio, longitudinal reinforcement ratio, and transverse reinforcement ratio; and assess the maximum ground acceleration capacity based on inelastic response spectra for high-strength steel–ultra-high-performance concrete piers. The conclusions are as follows. The modified Kent–Park model is shown to be effective against the experimental data. The calculated hysteretic curves of high-strength steel–ultra-high-performance concrete piers show good agreement with the experimental results. The three parameters have evident effects on the seismic performance of high-strength steel–ultra-high-performance concrete piers, which indicates that various seismic demands can be achieved by reasonable parameter settings. Compared to nonlinear dynamic analysis based on the finite element model, the results provided by inelastic response spectra are less conservative for short high-strength steel–ultra-high-performance concrete piers under high axial load ratios.
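    For reference, the classical modified Kent–Park (Scott-Park-Priestley) confined-concrete curve that the paper adapts is sketched below, with its usual parabolic ascending branch and linear descending branch. The UHPC-specific recalibration of the four key parameters is not reproduced; the section properties below are generic placeholders.

```python
# Classical modified Kent-Park confined-concrete stress-strain curve (generic
# coefficients; the paper's UHPC-specific calibration is not reproduced here).
import math

def kent_park_stress(eps, fc_mpa, rho_s, fyh_mpa, core_width_mm, hoop_spacing_mm):
    """Compressive stress [MPa] at strain eps (compression positive)."""
    K = 1.0 + rho_s * fyh_mpa / fc_mpa              # confinement enhancement
    eps0 = 0.002 * K                                # strain at peak stress
    if eps <= eps0:                                 # parabolic ascending branch
        x = eps / eps0
        return K * fc_mpa * (2.0 * x - x * x)
    Zm = 0.5 / ((3.0 + 0.29 * fc_mpa) / (145.0 * fc_mpa - 1000.0)
                + 0.75 * rho_s * math.sqrt(core_width_mm / hoop_spacing_mm)
                - 0.002 * K)                        # descending-branch slope
    return max(K * fc_mpa * (1.0 - Zm * (eps - eps0)), 0.2 * K * fc_mpa)

# Illustrative section: 120 MPa concrete, 1.5% hoops of 500 MPa steel
for eps in (0.001, 0.002, 0.004, 0.008, 0.015):
    print(eps, round(kent_park_stress(eps, 120.0, 0.015, 500.0, 300.0, 80.0), 1))
```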

  15. Techniques for Modeling Human Performance in Synthetic Environments: A Supplementary Review

    National Research Council Canada - National Science Library

    Ritter, Frank E; Shadbolt, Nigel R; Elliman, David; Young, Richard M; Gobet, Fernand; Baxter, Gordon D

    2003-01-01

    Selected recent developments and promising directions for improving the quality of models of human performance in synthetic environments are summarized, beginning with the potential uses and goals for behavioral models...

  16. Modeling the Energy Performance of LoRaWAN

    OpenAIRE

    Casals, Lluís; Mir, Bernat; Vidal, Rafael; Gomez, Carles

    2017-01-01

    LoRaWAN is a flagship Low-Power Wide Area Network (LPWAN) technology that has attracted much attention from the community in recent years. Many LoRaWAN end-devices, such as sensors or actuators, are expected not to be powered by the electricity grid; therefore, it is crucial to investigate the energy consumption of LoRaWAN. However, published works have only focused on this topic to a limited extent. In this paper, we present analytical models that allow the characterization of LoRaWAN...
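    At its simplest, such an energy model reduces to duty-cycle accounting: the average current of a class-A end-device is the time-weighted sum of per-state currents over one uplink cycle. The currents, durations and battery capacity below are illustrative placeholders, not values from the cited analytical models.

```python
# Bare-bones duty-cycle energy accounting for a LoRaWAN class-A device
# (all numbers are illustrative placeholders).
STATES = [                           # (name, current [mA], duration [s])
    ("wake+measure", 5.0,   0.5),
    ("transmit",     40.0,  0.10),   # time on air depends on SF/BW/payload
    ("rx windows",   11.0,  0.20),
    ("sleep",        0.002, 600.0),  # until the next uplink
]

def average_current_mA():
    cycle_s = sum(d for _, _, d in STATES)
    charge_mAs = sum(i * d for _, i, d in STATES)
    return charge_mAs / cycle_s

def battery_life_years(capacity_mAh=2400.0):
    return capacity_mAh / average_current_mA() / (24.0 * 365.0)

print("average current: %.4f mA" % average_current_mA())
print("estimated lifetime: %.1f years" % battery_life_years())
```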

  17. EURO-CORDEX regional climate models: Performance over Mediterranean region

    Science.gov (United States)

    Stilinović, Tomislav; Güttler, Ivan; Srnec, Lidija; Branković, Čedo

    2017-04-01

    Regional climate models (RCMs) are high-resolution versions of global climate models (GCMs), designed to achieve simulations at horizontal resolutions relevant for human activities on local and regional spatial scales, and to simulate relevant processes under historical and potential future climate conditions. In this study, a set of EURO-CORDEX simulations is evaluated over the Mediterranean region. All simulations were made at two horizontal resolutions (50 km and 12.5 km) and compared with the pan-European gridded dataset E-OBSv11 on a regular 0.25°×0.25° grid for two periods (1989-2008 for the ERA-Interim-driven ensemble of simulations; 1971-2000 for the GCM-driven ensemble of simulations). We evaluate the impacts of (1) the boundary conditions, (2) different horizontal resolutions (0.44°/50 km vs. 0.11°/12.5 km), and (3) the convective parametrization on systematic errors, especially in the case of the RegCM4 model extensively used at DHMZ. For each simulation, commonly used evaluation metrics are applied. They include: (1) spatially averaged differences between RCMs and observations, (2) the spatial 95th percentiles of simulated and observed temperature and precipitation, (3) spatial correlation coefficients between models and observations, (4) the ratio of spatial standard deviations between simulated and observed fields, and (5) the Spearman rank correlations between simulated and observed time series of spatially averaged temperature and precipitation. As commonly found in other studies, the total precipitation in RCM simulations is often overestimated and spatial correlations are noticeably lower than for temperature. The results highlight that the RegCM4 is able to capture the observed spatial variability of the Mediterranean temperature climate. This is indicated by high spatial correlations, with values larger than 0.9, and values of the normalized standard deviation below 1 for the Mediterranean region. The results
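    The evaluation metrics listed above are straightforward to compute once model and observed fields share a common grid; the sketch below applies them to synthetic arrays (NumPy/SciPy only), simply to make the definitions concrete.

```python
# The five evaluation metrics applied to synthetic model/observation fields.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
obs = rng.normal(15.0, 4.0, size=(60, 80))            # "observed" field
model = obs + rng.normal(0.5, 1.5, size=obs.shape)    # biased, noisy "RCM"
o, m = obs.ravel(), model.ravel()

bias = m.mean() - o.mean()                            # (1) spatially averaged difference
p95_obs, p95_mod = np.percentile(o, 95), np.percentile(m, 95)   # (2) spatial 95th percentiles
r_spatial = np.corrcoef(o, m)[0, 1]                   # (3) spatial correlation
std_ratio = m.std() / o.std()                         # (4) ratio of spatial std devs

# (5) Spearman rank correlation of spatially averaged time series (synthetic)
obs_ts = rng.normal(15.0, 2.0, size=30)
mod_ts = obs_ts + rng.normal(0.0, 1.0, size=30)
rho, _ = spearmanr(obs_ts, mod_ts)

print(f"bias={bias:.2f}  p95 obs/mod={p95_obs:.1f}/{p95_mod:.1f}  "
      f"r={r_spatial:.2f}  std ratio={std_ratio:.2f}  spearman={rho:.2f}")
```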

  18. Mathematically modelling the effects of pacing, finger strategies and urgency on numerical typing performance with queuing network model human processor.

    Science.gov (United States)

    Lin, Cheng-Jhe; Wu, Changxu

    2012-01-01

    Numerical typing is an important perceptual-motor task whose performance may vary with different pacing, finger strategies and urgency of situations. Queuing network-model human processor (QN-MHP), a computational architecture, allows performance of perceptual-motor tasks to be modelled mathematically. The current study enhanced QN-MHP with a top-down control mechanism, a close-loop movement control and a finger-related motor control mechanism to account for task interference, endpoint reduction, and force deficit, respectively. The model also incorporated neuromotor noise theory to quantify endpoint variability in typing. The model predictions of typing speed and accuracy were validated with Lin and Wu's (2011) experimental results. The resultant root-mean-squared errors were 3.68% with a correlation of 95.55% for response time, and 35.10% with a correlation of 96.52% for typing accuracy. The model can be applied to provide optimal speech rates for voice synthesis and keyboard designs in different numerical typing situations. An enhanced QN-MHP model was proposed in the study to mathematically account for the effects of pacing, finger strategies and internalised urgency on numerical typing performance. The model can be used to provide optimal pacing for voice synthesise systems and suggested optimal numerical keyboard designs under urgency.

  19. Analysis on fuel thermal conductivity model of the computer code for performance prediction of fuel rods

    International Nuclear Information System (INIS)

    Li Hai; Huang Chen; Du Aibing; Xu Baoyu

    2014-01-01

    Thermal conductivity is one of the most important parameters in computer codes for fuel rod performance prediction. Several fuel thermal conductivity models used in foreign computer codes, including thermal conductivity models for MOX fuel and UO2 fuel, are introduced in this paper. Thermal conductivities were calculated using these models, and the results were compared and analyzed. Finally, a thermal conductivity model for the domestic computer code for fast reactor fuel rod performance prediction is recommended. (authors)

  20. Modelling PV modules' performance in Sahelian climates

    Energy Technology Data Exchange (ETDEWEB)

    Diarra, D.C.; Akuffo, F.O. [Kwame Nkrumah Univ. of Science and Technology, Kumasi (Ghana). Dept. of Mechanical Engineering

    2003-08-01

    This paper describes the development of a thermo-optical model designed to evaluate the temperature of a photovoltaic (PV) module in an effort to design a cost-effective cooling system for PV modules operating under high ambient temperatures. The power output of a PV module is greatly reduced when its temperature rises. This loss in efficiency is particularly significant in Sahelian regions where PV modules are subjected to high solar radiation intensities and high ambient temperatures. The newly developed thermo-optical model confirms that most of the heat in a PV module is generated in the solar cell. The results of the analysis include: the optical absorption, reflection and transmission of the solar radiation incident on the module; the temperature distribution in the module; and, the heat transfer through the top and bottom of the module. At incidence angles of 60 degrees, approximately three-quarters of the heat is generated in the solar cell. The optical efficiency is 88.44 per cent at normal incidence angle and 82.48 per cent when the incidence angle is 60 degrees. It was determined that the cooling system should be located as close as possible to the solar cell in order to increase the thermal heat flow from the cell. 4 refs., 3 tabs., 4 figs.
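    A much simpler NOCT-style estimate already illustrates why cooling matters under Sahelian ambient temperatures; the sketch below is not the thermo-optical model of the paper, and the module parameters are typical datasheet-style placeholders.

```python
# NOCT-style cell temperature and power derating (illustrative placeholders).
NOCT = 45.0        # nominal operating cell temperature [deg C]
P_STC = 300.0      # rated power at standard test conditions [W]
GAMMA = -0.0040    # power temperature coefficient [1/deg C]

def cell_temperature(t_ambient, irradiance):
    """Standard NOCT approximation (irradiance in W/m^2)."""
    return t_ambient + (NOCT - 20.0) / 800.0 * irradiance

def module_power(t_ambient, irradiance):
    t_cell = cell_temperature(t_ambient, irradiance)
    return P_STC * (irradiance / 1000.0) * (1.0 + GAMMA * (t_cell - 25.0))

# Temperate vs. Sahelian midday ambient temperature at the same irradiance
for t_amb in (25.0, 42.0):
    print("Tamb=%.0f C -> Tcell=%.1f C, P=%.0f W"
          % (t_amb, cell_temperature(t_amb, 1000.0), module_power(t_amb, 1000.0)))
```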

  1. A service based estimation method for MPSoC performance modelling

    DEFF Research Database (Denmark)

    Tranberg-Hansen, Anders Sejer; Madsen, Jan; Jensen, Bjørn Sand

    2008-01-01

    This paper presents an abstract service based estimation method for MPSoC performance modelling which allows fast, cycle accurate design space exploration of complex architectures including multi processor configurations at a very early stage in the design phase. The modelling method uses a service...... oriented model of computation based on Hierarchical Colored Petri Nets and allows the modelling of both software and hardware in one unified model. To illustrate the potential of the method, a small MPSoC system, developed at Bang & Olufsen ICEpower a/s, is modelled and performance estimates are produced...

  2. Modeling the links between young swimmers' performance: energetic and biomechanic profiles.

    Science.gov (United States)

    Barbosa, Tiago M; Costa, Mário; Marinho, Daniel A; Coelho, Joel; Moreira, Marc; Silva, António J

    2010-08-01

    The aim was to develop a path-flow analysis model for young swimmers' performance based on biomechanical and energetic parameters, using structural equation modeling. Thirty-eight male young swimmers served as subjects. Performance was assessed by the 200-m freestyle event. For biomechanical assessment the stroke length, the stroke frequency and the swimming velocity were analyzed. Energetics assessment included the critical velocity, the stroke index and the propulsive efficiency. The confirmatory model explained 79% of swimming performance after deleting the stroke index-performance path, which was nonsignificant (SRMR = 0.06). As a conclusion, the model is appropriate to explain performance in young swimmers.

  3. Applicability and performance evaluation of QSAR models for bioconcentration in fish

    DEFF Research Database (Denmark)

    Lombardo, A.; Roncaglioni, A.; Rotoumenou, M. I.

    2011-01-01

    methods (e.g. in-vitro, QSAR or read-across). Goal of this work was to evaluate in which cases QSAR models can replace in-vivo tests. For this reason several QSAR models, developed within OSIRIS project (i.e. ChemPropTM and BCF regressions model for monovalent ionic compounds by Fu et al., 2009) or freely...... available (i.e. EPISuite, T.E.S.T., CAESAR, CORAL and logP-based equations) were tested. Performance and applicability of the models has been evaluated using a large dataset. The models were analysed both as regression models (in particular error distribution and correlation) and as classification models...

  4. Building and Running the Yucca Mountain Total System Performance Model in a Quality Environment

    International Nuclear Information System (INIS)

    D.A. Kalinich; K.P. Lee; J.A. McNeish

    2005-01-01

    A Total System Performance Assessment (TSPA) model has been developed to support the Safety Analysis Report (SAR) for the Yucca Mountain High-Level Waste Repository. The TSPA model forecasts repository performance over a 20,000-year simulation period. It has a high degree of complexity due to the complexity of its underlying process and abstraction models. This is reflected in the size of the model (a 27,000 element GoldSim file), its use of dynamic-linked libraries (14 DLLs), the number and size of its input files (659 files totaling 4.7 GB), and the number of model input parameters (2541 input database entries). TSPA model development and subsequent simulations with the final version of the model were performed to a set of Quality Assurance (QA) procedures. Due to the complexity of the model, comments on previous TSPAs, and the number of analysts involved (22 analysts in seven cities across four time zones), additional controls for the entire life-cycle of the TSPA model, including management, physical, model change, and input controls were developed and documented. These controls did not replace the QA. procedures, rather they provided guidance for implementing the requirements of the QA procedures with the specific intent of ensuring that the model development process and the simulations performed with the final version of the model had sufficient checking, traceability, and transparency. Management controls were developed to ensure that only management-approved changes were implemented into the TSPA model and that only management-approved model runs were performed. Physical controls were developed to track the use of prototype software and preliminary input files, and to ensure that only qualified software and inputs were used in the final version of the TSPA model. In addition, a system was developed to name, file, and track development versions of the TSPA model as well as simulations performed with the final version of the model

  5. Integrated Modeling of Process, Structures and Performance in Cast Parts

    DEFF Research Database (Denmark)

    Kotas, Petr

    and to defects occurrence. In other words, it is desired to eliminate all of the potential casting defects and at the same time to maximize the casting yield. The numerical optimization algorithm then takes these objectives and searches for a set of the investigated process, design or material parameters e......This thesis deals with numerical simulations of gravity sand casting processes for the production of large steel parts. The entire manufacturing process is numerically modeled and evaluated, taking into consideration mould filling, solidification, solid state cooling and the subsequent stress build.......g. chill design, riser design, gating system design, etc., which would satisfy these objectives the most. The first step in the numerical casting process simulation is to analyze mould filling where the emphasis is put on the gating system design. There are still a lot of foundry specialists who ignore...

  6. Nursery performance analysis within a model of management units

    Directory of Open Access Journals (Sweden)

    Maria Luiza Hexsel Segui

    2014-12-01

    Full Text Available Objective: Identify the nursery’s working spaces after implementation of a Unit Management model at a large and complex teaching hospital in south of Brazil. Method: exploratory descriptive study with a qualitative approach – Case Study. The data were collected with semi-structured interviews. Subjects: 15 nurses that represent the job positions at the institution. Results: All subjects are female, aged from 25 to 59 years, with high-level education, work experience at the hospital between 5 and 30 years, most of them have a post-graduate degree. The activities are divided on consulting services, public health surveillance, management, nursing care, on-the-job education, health education, research, and university-related management work. Conclusion: nurses working as assistants give support to public health surveillance; at Nursing Services, the activities are related to unit and care management; and at external administrative positions, they support general institutional demands.

  7. Transport modelling and gyrokinetic analysis of advanced high performance discharges

    International Nuclear Information System (INIS)

    Kinsey, J.E.; Imbeaux, F.; Staebler, G.M.; Budny, R.; Bourdelle, C.; Fukuyama, A.; Garbet, X.; Tala, T.; Parail, V.

    2005-01-01

    Predictive transport modelling and gyrokinetic stability analyses of demonstration hybrid (HYBRID) and advanced tokamak (AT) discharges from the International Tokamak Physics Activity (ITPA) profile database are presented. Both regimes have exhibited enhanced core confinement (above the conventional ITER reference H-mode scenario) but differ in their current density profiles. Recent contributions to the ITPA database have facilitated an effort to study the underlying physics governing confinement in these advanced scenarios. In this paper, we assess the level of commonality of the turbulent transport physics and the relative roles of the transport suppression mechanisms (i.e. E x B shear and Shafranov shift (α) stabilization) using data for select HYBRID and AT discharges from the DIII-D, JET and AUG tokamaks. GLF23 transport modelling and gyrokinetic stability analysis indicate that E x B shear and Shafranov shift stabilization play essential roles in producing the improved core confinement in both HYBRID and AT discharges. Shafranov shift stabilization is found to be more important in AT discharges than in HYBRID discharges. We have also examined the competition between the stabilizing effects of E x B shear and Shafranov shift stabilization and the destabilizing effects of higher safety factors and parallel velocity shear. Linear and nonlinear gyrokinetic simulations of idealized low and high safety factor cases reveal some interesting consequences. A low safety factor (i.e. HYBRID relevant) is directly beneficial in reducing the transport, and E x B shear stabilization can dominate parallel velocity shear destabilization allowing the turbulence to be quenched. However, at low-q/high current, Shafranov shift stabilization plays less of a role. Higher safety factors (as found in AT discharges), on the other hand, have larger amounts of Shafranov shift stabilization, but parallel velocity shear destabilization can prevent E x B shear quenching of the turbulent

  8. Transport modeling and gyrokinetic analysis of advanced high performance discharges

    International Nuclear Information System (INIS)

    Kinsey, J.; Imbeaux, F.; Bourdelle, C.; Garbet, X.; Staebler, G.; Budny, R.; Fukuyama, A.; Tala, T.; Parail, V.

    2005-01-01

    Predictive transport modeling and gyrokinetic stability analyses of demonstration hybrid (HYBRID) and Advanced Tokamak (AT) discharges from the International Tokamak Physics Activity (ITPA) profile database are presented. Both regimes have exhibited enhanced core confinement (above the conventional ITER reference H-mode scenario) but differ in their current density profiles. Recent contributions to the ITPA database have facilitated an effort to study the underlying physics governing confinement in these advanced scenarios. In this paper, we assess the level of commonality of the turbulent transport physics and the relative roles of the transport suppression mechanisms (i.e. ExB shear and Shafranov shift (α) stabilization) using data for select HYBRID and AT discharges from the DIII-D, JET, and AUG tokamaks. GLF23 transport modeling and gyrokinetic stability analysis indicate that ExB shear and Shafranov shift stabilization play essential roles in producing the improved core confinement in both HYBRID and AT discharges. Shafranov shift stabilization is found to be more important in AT discharges than in HYBRID discharges. We have also examined the competition between the stabilizing effects of ExB shear and Shafranov shift stabilization and the destabilizing effects of higher safety factors and parallel velocity shear. Linear and nonlinear gyrokinetic simulations of idealized low and high safety factor cases reveal some interesting consequences. A low safety factor (i.e. HYBRID relevant) is directly beneficial in reducing the transport, and ExB shear stabilization can win out over parallel velocity shear destabilization allowing the turbulence to be quenched. However, at low-q/high current, Shafranov shift stabilization plays less of a role. Higher safety factors (as found in AT discharges), on the other hand, have larger amounts of Shafranov shift stabilization, but parallel velocity shear destabilization can prevent ExB shear quenching of the turbulent

  9. CLIMBER-2: a climate system model of intermediate complexity. Pt. 1. Model description and performance for present climate

    Energy Technology Data Exchange (ETDEWEB)

    Petoukhov, V.; Ganopolski, A.; Brovkin, V.; Claussen, M.; Eliseev, A.; Kubatzki, C.; Rahmstorf, S.

    1998-02-01

    A 2.5-dimensional climate system model of intermediate complexity, CLIMBER-2, and its performance for present climate conditions are presented. The model consists of modules describing the atmosphere, ocean, sea ice, land surface processes, terrestrial vegetation cover, and the global carbon cycle. The modules interact (on-line) through fluxes of momentum, energy, water and carbon. The model has a coarse spatial resolution but nevertheless captures the major features of the Earth's geography. The model describes the temporal variability of the system on seasonal and longer time scales. Because the model does not employ any type of flux adjustment and has a fast turnaround time, it can be used to study climates significantly different from the present one and to perform long-term (multimillennial) simulations. The constraints for coupling the atmosphere and ocean without flux adjustment are discussed. The results of a model validation against present climate data show that the model successfully describes the seasonal variability of a large set of characteristics of the climate system, including the radiative balance, temperature, precipitation, ocean circulation and the cryosphere. (orig.) 62 refs.

  10. On the performance of a generic length scale turbulence model within an adaptive finite element ocean model

    Science.gov (United States)

    Hill, Jon; Piggott, M. D.; Ham, David A.; Popova, E. E.; Srokosz, M. A.

    2012-10-01

    Research into the use of unstructured mesh methods for ocean modelling has been growing steadily in the last few years. One advantage of using unstructured meshes is that one can concentrate resolution where it is needed. In addition, dynamic adaptive mesh optimisation (DAMO) strategies allow resolution to be concentrated when this is required. Despite the advantage that DAMO gives in terms of improving the spatial resolution where and when required, small-scale turbulence in the oceans still requires parameterisation. A two-equation, generic length scale (GLS) turbulence model (one equation for turbulent kinetic energy and another for a generic turbulence length-scale quantity) adds this parameterisation and can be used in conjunction with adaptive mesh techniques. In this paper, an implementation of the GLS turbulence parameterisation is detailed in a non-hydrostatic, finite-element, unstructured mesh ocean model, Fluidity-ICOM. The implementation is validated by comparison with both a laboratory-scale experiment and real-world observations, on both fixed and adaptive meshes. The model performs well, matching laboratory and observed data, with resolution being adjusted as necessary by DAMO. Flexibility in the prognostic fields used to construct the DAMO error metric is required to ensure the best performance. Moreover, the adaptive mesh models perform as well as fixed mesh models in terms of root mean square error to observation or theoretical mixed layer depths, but use fewer elements and hence have a reduced computational cost.
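
    For reference, the "generic turbulence length-scale quantity" mentioned above is conventionally written as a combination of the turbulent kinetic energy and a length scale; the standard GLS definition below is added for context and uses generic notation rather than the paper's own symbols.
```latex
% Generic length-scale variable of the two-equation GLS framework: k is the
% turbulent kinetic energy, l the turbulence length scale and c_mu^0 a stability
% constant; particular exponent choices (p, m, n) recover the k-epsilon,
% k-omega and k-kl closures.
\psi = \left(c_{\mu}^{0}\right)^{p} k^{m}\, l^{n}, \qquad
\varepsilon = \left(c_{\mu}^{0}\right)^{3} \frac{k^{3/2}}{l}
```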

  11. Analytical Performance Modeling and Validation of Intel’s Xeon Phi Architecture

    Energy Technology Data Exchange (ETDEWEB)

    Chunduri, Sudheer; Balaprakash, Prasanna; Morozov, Vitali; Vishwanath, Venkatram; Kumaran, Kalyan

    2017-01-01

    Modeling the performance of scientific applications on emerging hardware plays a central role in achieving extreme-scale computing goals. Analytical models that capture the interaction between applications and hardware characteristics are attractive because even a reasonably accurate model can be useful for performance tuning before the hardware is made available. In this paper, we develop a hardware model for Intel’s second-generation Xeon Phi architecture code-named Knights Landing (KNL) for the SKOPE framework. We validate the KNL hardware model by projecting the performance of mini-benchmarks and application kernels. The results show that our KNL model can project the performance with prediction errors of 10% to 20%. The hardware model also provides informative recommendations for code transformations and tuning.
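
    The SKOPE-based KNL model itself is not reproduced in this record. As a rough illustration of what an analytical hardware performance model does, the sketch below bounds a kernel's performance by peak compute rate and memory bandwidth (a simple roofline-style estimate); the peak numbers are illustrative assumptions, not Knights Landing specifications.
```python
# Illustrative roofline-style analytical bound (assumed peak rates, not KNL specs):
# a kernel is limited either by peak compute throughput or by memory bandwidth.

def attainable_gflops(arithmetic_intensity, peak_gflops=3000.0, peak_bw_gbs=400.0):
    """Attainable GFLOP/s for a kernel with the given arithmetic intensity (FLOPs/byte)."""
    return min(peak_gflops, peak_bw_gbs * arithmetic_intensity)

def predicted_runtime_s(total_flops, total_bytes, peak_gflops=3000.0, peak_bw_gbs=400.0):
    """Predicted runtime as the slower of the compute-bound and memory-bound times."""
    t_compute = total_flops / (peak_gflops * 1e9)
    t_memory = total_bytes / (peak_bw_gbs * 1e9)
    return max(t_compute, t_memory)

if __name__ == "__main__":
    flops, bytes_moved = 2e9, 24e9          # e.g. a stream-like kernel
    ai = flops / bytes_moved
    print(f"arithmetic intensity: {ai:.3f} FLOP/byte")
    print(f"attainable rate:      {attainable_gflops(ai):.1f} GFLOP/s")
    print(f"predicted runtime:    {predicted_runtime_s(flops, bytes_moved) * 1e3:.1f} ms")
```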

  12. Performance Evaluation of UML2-Modeled Embedded Streaming Applications with System-Level Simulation

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2009-01-01

    Full Text Available This article presents an efficient method to capture an abstract performance model of streaming data real-time embedded systems (RTESs). Unified Modeling Language version 2 (UML2) is used for the performance modeling and as a front-end for a tool framework that enables simulation-based performance evaluation and design-space exploration. The adopted application meta-model in UML resembles the Kahn Process Network (KPN) model and it is targeted at simulation-based performance evaluation. The application workload modeling is done using UML2 activity diagrams, and the platform is described with structural UML2 diagrams and model elements. These concepts are defined using a subset of the profile for Modeling and Analysis of Real-Time and Embedded (MARTE) systems from OMG and custom stereotype extensions. The goal of the performance modeling and simulation is to achieve early estimates of task response times, processing element, memory, and on-chip network utilizations, among other information that is used for design-space exploration. As a case study, a video codec application on multiple processors is modeled, evaluated, and explored. In comparison to related work, this is the first proposal that defines a transformation between UML activity diagrams and streaming data application workload meta-models and successfully adopts it for RTES performance evaluation.

  13. Models, Web-Based Simulations, and Integrated Analysis Techniques for Improved Logistical Performance

    National Research Council Canada - National Science Library

    Hill, Raymond

    2001-01-01

    ... Laboratory, Logistics Research Division, Logistics Readiness Branch to propose a research agenda entitled, "Models, Web-based Simulations, and Integrated Analysis Techniques for Improved Logistical Performance...

  14. A decision model to priorotise logistics performance indicators

    OpenAIRE

    Kucukaltan, Berk

    2016-01-01

    This thesis was submitted for the degree of Doctor of Philosophy and was awarded by Brunel University London. Performance measurement is an important concern that has recently attracted much attention in the logistics area from both practitioners and academics. The performance measurement of logistics companies is based upon diverse performance indicators. However, to date, limited attention has been paid to the performance measurement of logistics companies and, also, performance measureme...

  15. Performance evaluation of groundwater model hydrostratigraphy from airborne electromagnetic data and lithological borehole logs

    DEFF Research Database (Denmark)

    Marker, Pernille Aabye; Foged, N.; He, X.

    2015-01-01

    resistivity and clay fraction are classified into hydrostratigraphic zones using k-means clustering. Hydraulic conductivity values of the zones are estimated by hydrological calibration using hydraulic head and stream discharge observations. The method is applied to a Danish case study. Benchmarking ... hydrological performance by comparison of performance statistics from comparable hydrological models, the cluster model performed competitively. Calibrations of 11 hydrostratigraphic cluster models with 1-11 hydraulic conductivity zones showed improved hydrological performance with an increasing number ... of clusters. Beyond the 5-cluster model, hydrological performance did not improve. Due to the reproducibility and the possibility of method standardization and automation, we believe that hydrostratigraphic model generation with the proposed method has important prospects for groundwater models used in water resources...
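
    A minimal sketch of the clustering step described above is given below: resistivity and clay-fraction values are grouped into hydrostratigraphic zones with k-means. The data are synthetic placeholders rather than the Danish case-study data, and five zones are used only because the record reports no improvement beyond the 5-cluster model.
```python
# Hedged sketch of the clustering step: resistivity and clay fraction grouped into
# hydrostratigraphic zones with k-means. All data below are synthetic placeholders,
# not the Danish case-study data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
log_resistivity = rng.normal(1.5, 0.5, size=500)   # log10(ohm m), synthetic
clay_fraction = rng.uniform(0.0, 1.0, size=500)    # dimensionless, synthetic

features = StandardScaler().fit_transform(
    np.column_stack([log_resistivity, clay_fraction]))

n_zones = 5  # the study reports no improvement beyond the 5-cluster model
zones = KMeans(n_clusters=n_zones, n_init=10, random_state=0).fit_predict(features)
print(np.bincount(zones))  # number of cells assigned to each hydrostratigraphic zone
```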

  16. Hydrological Modeling in Northern Tunisia with Regional Climate Model Outputs: Performance Evaluation and Bias-Correction in Present Climate Conditions

    Directory of Open Access Journals (Sweden)

    Asma Foughali

    2015-07-01

    Full Text Available This work aims to evaluate the performance of a hydrological balance model in a watershed located in northern Tunisia (wadi Sejnane, 378 km2) in present climate conditions, using input variables provided by four regional climate models. A modified version (MBBH) of the lumped, single-layer surface model BBH (Bucket with Bottom Hole), in which pedo-transfer parameters estimated from watershed physiographic characteristics are introduced, is adopted to simulate the water balance components. Only two parameters, representing respectively the water retention capacity of the soil and the vegetation resistance to evapotranspiration, are calibrated using rainfall-runoff data. The evaluation criteria for the MBBH model calibration are: relative bias, mean square error and the ratio of mean actual evapotranspiration to mean potential evapotranspiration. Daily air temperature, rainfall and runoff observations are available from 1960 to 1984. The period 1960–1971 is selected for calibration while the period 1972–1984 is chosen for validation. Air temperature and precipitation series are provided by four regional climate models (DMI, ARP, SMH and ICT) from the European program ENSEMBLES, forced by two global climate models (GCMs): ECHAM and ARPEGE. The regional climate model outputs (precipitation and air temperature) are compared to the observations in terms of statistical distribution. The analysis was performed at the seasonal scale for precipitation. We found that RCM precipitation must be corrected before being introduced as MBBH input. Thus, a non-parametric quantile-quantile bias correction method together with a dry day correction is employed. Finally, simulated runoff generated using corrected precipitation from the regional climate model SMH is found to be the most acceptable, by comparison with runoff simulated using observed precipitation data, at reproducing the temporal variability of mean monthly runoff. The SMH model is the most accurate to
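
    A minimal sketch of the non-parametric quantile-quantile bias correction mentioned above is shown below, using empirical quantile mapping on synthetic rainfall series; the dry-day correction applied in the study is omitted, and all values are illustrative.
```python
# Minimal empirical quantile-mapping sketch for bias-correcting RCM precipitation
# against observations; the arrays are synthetic and the dry-day correction used
# in the study is omitted.
import numpy as np

def quantile_map(rcm_hist, obs_hist, values_to_correct):
    """Map RCM values onto the observed distribution via matching empirical quantiles."""
    quantiles = np.linspace(0.0, 1.0, 101)
    rcm_q = np.quantile(rcm_hist, quantiles)
    obs_q = np.quantile(obs_hist, quantiles)
    # Locate each value on the RCM quantile curve, then read off the observed
    # value at the same quantile.
    return np.interp(values_to_correct, rcm_q, obs_q)

rng = np.random.default_rng(1)
obs = rng.gamma(shape=0.8, scale=6.0, size=4000)   # observed daily rainfall (mm), synthetic
rcm = rng.gamma(shape=1.2, scale=3.0, size=4000)   # biased RCM rainfall (mm), synthetic
corrected = quantile_map(rcm, obs, rcm)
print(round(obs.mean(), 2), round(rcm.mean(), 2), round(corrected.mean(), 2))
```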

  17. Prediction models for performance and emissions of a dual fuel CI ...

    Indian Academy of Sciences (India)

    use artificial intelligence modelling techniques like fuzzy logic, Artificial Neural Network (ANN), Genetic Algorithm (GA), etc. This paper uses a neuro-fuzzy modelling technique, the Adaptive Neuro Fuzzy Inference System (ANFIS), for developing prediction models for performance and emission parameters of a dual fuel engine.

  18. Bayesian Comparison of Alternative Graded Response Models for Performance Assessment Applications

    Science.gov (United States)

    Zhu, Xiaowen; Stone, Clement A.

    2012-01-01

    This study examined the relative effectiveness of Bayesian model comparison methods in selecting an appropriate graded response (GR) model for performance assessment applications. Three popular methods were considered: deviance information criterion (DIC), conditional predictive ordinate (CPO), and posterior predictive model checking (PPMC). Using…

  19. A model for evaluating the social performance of construction waste management

    International Nuclear Information System (INIS)

    Yuan Hongping

    2012-01-01

    Highlights: ► Scant attention is paid to social performance of construction waste management (CWM). ► We develop a model for assessing the social performance of CWM. ► With the model, the social performance of CWM can be quantitatively simulated. - Abstract: Existing literature shows that considerable research effort has been devoted to the economic performance of construction waste management (CWM), but less attention has been paid to investigating its social performance. This study therefore attempts to develop a model for quantitatively evaluating the social performance of CWM by using a system dynamics (SD) approach. Firstly, major variables affecting the social performance of CWM are identified, and a holistic system for assessing the social performance of CWM is formulated in line with the feedback relationships underlying these variables. The developed system is then converted into an SD model using the software iThink. An empirical case study is finally conducted to demonstrate application of the model. Results of model validation indicate that the model is robust and reasonably reflects the situation of the real system under study. Findings of the case study offer helpful insights into effectively promoting the social performance of CWM for the project investigated. Furthermore, the model exhibits great potential to function as an experimental platform for dynamically evaluating the effects of management measures on improving the social performance of CWM in construction projects.

  20. CI-WATER HPC Model: Cyberinfrastructure to Advance High Performance Water Resources Modeling in the Intermountain Western U.S

    Science.gov (United States)

    Ogden, F. L.; Lai, W.; Douglas, C. C.; Miller, S. N.; Zhang, Y.

    2012-12-01

    The CI-WATER project is a cooperative effort between the Utah and Wyoming EPSCoR jurisdictions, and is funded through a cooperative agreement with the U.S. National Science Foundation EPSCoR. The CI-WATER project is acquiring hardware and developing software cyberinfrastructure (CI) to enhance accessibility of High Performance Computing for water resources modeling in the Western U.S. One of the components of the project is development of a large-scale, high-resolution, physically-based, data-driven, integrated computational water resources model, which we call the CI-WATER HPC model. The objective of this model development is to enable evaluation of integrated system behavior to guide and support water system planning and management by individual users, cities, or states. The model is first being tested in the Green River basin of Wyoming, which is the largest tributary to the Colorado River. The model will ultimately be applied to simulate the entire Upper Colorado River basin for hydrological studies, watershed management, economic analysis, as well as evaluation of potential changes in environmental policy and law, population, land use, and climate. In addition to hydrologically important processes simulated in many hydrological models, the CI-WATER HPC model will emphasize anthropogenic influences such as land use change, water resources infrastructure, irrigation practices, trans-basin diversions, and urban/suburban development. The model operates on an unstructured mesh, employing an adaptive mesh with grid sizes as small as 10 m as needed, particularly in high-elevation snowmelt regions. Data for the model are derived from remote sensing sources, atmospheric models and geophysical techniques. Monte-Carlo techniques and ensemble Kalman filtering methodologies are employed for data assimilation. The model includes application programming interface (API) standards to allow easy substitution of alternative process-level simulation routines, and provide post

  1. A Belief-Based Model of Air Traffic Controllers Performing Separation Assurance

    Science.gov (United States)

    Landry, S.J.

    2009-01-01

    A model of an air traffic controller performing a separation assurance task was produced. The model was designed to be simple to use and deploy in a simulator, but still provide realistic behavior. The model is based upon an evaluation of the safety function of the controller for separation assurance, and utilizes fast and frugal heuristics and belief networks to establish a knowledge set for the controller model. Based on this knowledge set, the controller acts to keep aircraft separated. Validation results are provided to demonstrate the model's performance.

  2. Results and Comparison from the SAM Linear Fresnel Technology Performance Model: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, M. J.

    2012-04-01

    This paper presents the new Linear Fresnel technology performance model in NREL's System Advisor Model. The model predicts the financial and technical performance of direct-steam-generation Linear Fresnel power plants, and can be used to analyze a range of system configurations. This paper presents a brief discussion of the model formulation and motivation, and provides extensive discussion of the model performance and financial results. The Linear Fresnel technology is also compared to other concentrating solar power technologies in both qualitative and quantitative measures. The Linear Fresnel model - developed in conjunction with the Electric Power Research Institute - provides users with the ability to model a variety of solar field layouts, fossil backup configurations, thermal receiver designs, and steam generation conditions. This flexibility aims to encompass current market solutions for the DSG Linear Fresnel technology, which is seeing increasing exposure in fossil plant augmentation and stand-alone power generation applications.

  3. Solving Enterprise Applications Performance Puzzles Queuing Models to the Rescue

    CERN Document Server

    Grinshpan, Leonid

    2012-01-01

    A groundbreaking scientific approach to solving enterprise applications performance problems. Enterprise applications are the information backbone of today's corporations, supporting vital business functions such as operational management, supply chain maintenance, customer relationship administration, business intelligence, accounting, procurement logistics, and more. Acceptable performance of enterprise applications is critical for a company's day-to-day operations as well as for its profitability. Unfortunately, troubleshooting poorly performing enterprise applications has traditionally
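
    The book's own models are not reproduced in this record; as a generic illustration of how a queuing model links server utilization to transaction response time, the sketch below evaluates the classical single-server M/M/1 formulas for a few load levels (all rates are made up).
```python
# Illustrative M/M/1 queuing calculation (not taken from the book): how server
# utilization drives transaction response time in an enterprise application.

def mm1_metrics(arrival_rate, service_rate):
    """Return utilization, mean number in system and mean response time for an M/M/1 queue."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable system: arrival rate must be below service rate")
    rho = arrival_rate / service_rate                       # utilization
    n_in_system = rho / (1.0 - rho)                         # mean number of requests in system
    response_time = 1.0 / (service_rate - arrival_rate)     # mean response time (s)
    return rho, n_in_system, response_time

for lam in (10, 40, 70, 90):                                # request arrival rates per second
    rho, n, r = mm1_metrics(lam, service_rate=100)
    print(f"load={lam:3d}/s  utilization={rho:.2f}  in-system={n:5.2f}  response={r * 1000:6.1f} ms")
```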

  4. A nonlinear model for the characterization and optimization of athletic training and performance

    Directory of Open Access Journals (Sweden)

    Turner James D.

    2017-02-01

    Full Text Available Study aim: Mathematical models of the relationship between training and performance facilitate the design of training protocols to achieve performance goals. However, current linear models do not account for nonlinear physiological effects such as saturation and over-training. This severely limits their practical applicability, especially for optimizing training strategies. This study describes, analyzes, and applies a new nonlinear model to account for these physiological effects. Material and methods: This study considers the equilibria and step response of the nonlinear differential equation model to show its characteristics and trends, optimizes training protocols using genetic algorithms to maximize performance by applying the model under various realistic constraints, and presents a case study fitting the model to human performance data. Results: The nonlinear model captures the saturation and over-training effects; produces realistic training protocols with training progression, a high-intensity phase, and a taper; and closely fits the experimental performance data. Fitting the model parameters to subsets of the data identifies which parameters have the largest variability but reveals that the performance predictions are relatively consistent. Conclusions: These findings provide a new mathematical foundation for modeling and optimizing athletic training routines subject to an individual’s personal physiology, constraints, and performance goals.
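
    The exact equations of the paper are not given in this record, so the sketch below is only a hedged illustration of a fitness-fatigue style model with a saturating response to training load; the functional form and every parameter value are assumptions, not the authors' model.
```python
# Hedged sketch of a fitness-fatigue style training-performance model with a
# saturating response to daily load. The functional form and every parameter
# value are illustrative assumptions, not the model published in the paper.
import numpy as np

def simulate(loads, k_fit=0.08, k_fat=0.30, tau_fit=45.0, tau_fat=11.0, sat=150.0, p0=100.0):
    """Daily (1-day Euler step) integration of fitness and fatigue states."""
    fitness, fatigue, performance = 0.0, 0.0, []
    for w in loads:
        response = sat * w / (sat + w)          # saturation: very large loads level off
        fitness += -fitness / tau_fit + response
        fatigue += -fatigue / tau_fat + response
        performance.append(p0 + k_fit * fitness - k_fat * fatigue)
    return np.array(performance)

loads = np.concatenate([np.linspace(20, 120, 60), np.full(14, 30)])  # build-up then a taper
p = simulate(loads)
print(f"start {p[0]:.1f}, end of build-up {p[59]:.1f}, end of taper {p[-1]:.1f}")
```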

  5. Comparative performance of Bayesian and AIC-based measures of phylogenetic model uncertainty.

    Science.gov (United States)

    Alfaro, Michael E; Huelsenbeck, John P

    2006-02-01

    Reversible-jump Markov chain Monte Carlo (RJ-MCMC) is a technique for simultaneously evaluating multiple related (but not necessarily nested) statistical models that has recently been applied to the problem of phylogenetic model selection. Here we use a simulation approach to assess the performance of this method and compare it to Akaike weights, a measure of model uncertainty that is based on the Akaike information criterion. Under conditions where the assumptions of the candidate models matched the generating conditions, both Bayesian and AIC-based methods perform well. The 95% credible interval contained the generating model close to 95% of the time. However, the size of the credible interval differed with the Bayesian credible set containing approximately 25% to 50% fewer models than an AIC-based credible interval. The posterior probability was a better indicator of the correct model than the Akaike weight when all assumptions were met but both measures performed similarly when some model assumptions were violated. Models in the Bayesian posterior distribution were also more similar to the generating model in their number of parameters and were less biased in their complexity. In contrast, Akaike-weighted models were more distant from the generating model and biased towards slightly greater complexity. The AIC-based credible interval appeared to be more robust to the violation of the rate homogeneity assumption. Both AIC and Bayesian approaches suggest that substantial uncertainty can accompany the choice of model for phylogenetic analyses, suggesting that alternative candidate models should be examined in analysis of phylogenetic data. [AIC; Akaike weights; Bayesian phylogenetics; model averaging; model selection; model uncertainty; posterior probability; reversible jump.].
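
    For context, the Akaike weights referred to above are conventionally computed from AIC differences; the standard definition below is added for reference and is not quoted from the paper.
```latex
% Akaike weight of model i among R candidate models (standard definition):
% Delta_i is the AIC difference from the best model.
\Delta_i = \mathrm{AIC}_i - \min_j \mathrm{AIC}_j, \qquad
w_i = \frac{\exp\!\left(-\tfrac{1}{2}\Delta_i\right)}{\sum_{r=1}^{R} \exp\!\left(-\tfrac{1}{2}\Delta_r\right)}
```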

  6. Performance Analysis of Transposition Models Simulating Solar Radiation on Inclined Surfaces: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Xie, Yu; Sengupta, Manajit

    2016-06-01

    Transposition models are widely used in the solar energy industry to simulate solar radiation on inclined photovoltaic (PV) panels. These transposition models have been developed using various assumptions about the distribution of the diffuse radiation, and most of the parameterizations in these models have been developed using hourly ground data sets. Numerous studies have compared the performance of transposition models, but this paper aims to understand the quantitative uncertainty in the state-of-the-art transposition models and the sources leading to the uncertainty using high-resolution ground measurements in the plane of array. Our results suggest that the amount of aerosol optical depth can affect the accuracy of isotropic models. The choice of empirical coefficients and the use of decomposition models can both result in uncertainty in the output from the transposition models. It is expected that the results of this study will ultimately lead to improvements of the parameterizations as well as the development of improved physical models.
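
    As background for the transposition step discussed above, the sketch below evaluates the simplest (isotropic-sky) transposition of beam, sky-diffuse and ground-reflected irradiance onto a tilted plane; the input irradiances, tilt and albedo are illustrative, and the anisotropic models examined in the paper add further terms.
```python
# Hedged sketch of the simplest (isotropic-sky) transposition: plane-of-array
# irradiance from beam, sky-diffuse and ground-reflected components. Inputs and
# albedo are illustrative; the anisotropic models compared in the paper add terms.
import math

def poa_isotropic(dni, dhi, ghi, tilt_deg, aoi_deg, albedo=0.2):
    """Plane-of-array irradiance (W/m2) assuming an isotropic diffuse sky."""
    tilt = math.radians(tilt_deg)
    beam = dni * max(math.cos(math.radians(aoi_deg)), 0.0)      # direct component on the panel
    sky_diffuse = dhi * (1.0 + math.cos(tilt)) / 2.0            # isotropic sky view factor
    ground = ghi * albedo * (1.0 - math.cos(tilt)) / 2.0        # ground-reflected component
    return beam + sky_diffuse + ground

print(round(poa_isotropic(dni=700, dhi=120, ghi=650, tilt_deg=30, aoi_deg=25), 1))
```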

  7. The contribution of high-performance computing and modelling for industrial development

    CSIR Research Space (South Africa)

    Sithole, Happy

    2017-10-01

    Full Text Available Performance Computing and Modelling for Industrial Development, Dr Happy Sithole and Dr Onno Ubbink. Strategic context: high-performance computing (HPC) combined with machine learning and artificial intelligence present opportunities to non...

  8. Maneuver Performance Assessment of the Cassini Spacecraft Through Execution-Error Modeling and Analysis

    Science.gov (United States)

    Wagner, Sean

    2014-01-01

    The Cassini spacecraft has executed nearly 300 maneuvers since 1997, providing ample data for execution-error model updates. With maneuvers through 2017, opportunities remain to improve on the models and remove biases identified in maneuver executions. This manuscript focuses on how execution-error models can be used to judge maneuver performance, while providing a means for detecting performance degradation. Additionally, this paper describes Cassini's execution-error model updates in August 2012. An assessment of Cassini's maneuver performance through OTM-368 on January 5, 2014 is also presented.

  9. Some concepts of model uncertainty for performance assessments of nuclear waste repositories

    International Nuclear Information System (INIS)

    Eisenberg, N.A.; Sagar, B.; Wittmeyer, G.W.

    1994-01-01

    Models of the performance of nuclear waste repositories will be central to making regulatory decisions regarding the safety of such facilities. The conceptual model of repository performance is represented by mathematical relationships, which are usually implemented as one or more computer codes. A geologic system may allow many conceptual models, which are consistent with the observations. These conceptual models may or may not have the same mathematical representation. Experiences in modeling the performance of a waste repository (which is, in part, a geologic system) show that this non-uniqueness of conceptual models is a significant source of model uncertainty. At the same time, each conceptual model has its own set of parameters and, usually, it is not possible to completely separate model uncertainty from parameter uncertainty for the repository system. Issues related to the origin of model uncertainty, its relation to parameter uncertainty, and its incorporation in safety assessments are discussed from a broad regulatory perspective. An extended example in which these issues are explored numerically is also provided

  10. Performance Requirements Modeling andAssessment for Active Power Ancillary Services

    DEFF Research Database (Denmark)

    Bondy, Daniel Esteban Morales; Thavlov, Anders; Tougaard, Janus Bundsgaard Mosbæk

    2017-01-01

    system operation, a reliable service delivery is required, yet it may not be appropriate to apply conventional performance requirements to new technologies and methods. The service performance requirements and assessment methods therefore need to be generalized and standardized in order to include future...... ancillary service sources. This paper develops a modeling method for ancillary services performance requirements, including performance and verification indices. The use of the modeling method and the indices is exemplified in two case studies....

  11. Methods to assess performance of models estimating risk of death in intensive care patients: a review.

    Science.gov (United States)

    Cook, D A

    2006-04-01

    Models that estimate the probability of death of intensive care unit patients can be used to stratify patients according to the severity of their condition and to control for casemix and severity of illness. These models have been used for risk adjustment in quality monitoring, administration, management and research, and as an aid to clinical decision making. Models such as the Mortality Prediction Model family, SAPS II, APACHE II, APACHE III and the organ system failure models provide estimates of the probability of in-hospital death of ICU patients. This review examines methods to assess the performance of these models. The key attributes of a model are discrimination (the accuracy of the ranking in order of probability of death) and calibration (the extent to which the model's prediction of probability of death reflects the true risk of death). These attributes should be assessed in existing models that predict the probability of patient mortality, and in any subsequent model that is developed for the purpose of estimating these probabilities. The literature contains a range of approaches for assessment, which are reviewed, and a survey of the methodologies used in studies of intensive care mortality models is presented. The systematic approach used by the Standards for Reporting Diagnostic Accuracy provides a framework to incorporate these theoretical considerations of model assessment, and recommendations are made for the evaluation and presentation of the performance of models that estimate the probability of death of intensive care patients.
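
    A small sketch of the two attributes highlighted above, computed on simulated data, is given below: discrimination via the c-statistic (ROC AUC) and calibration via a decile-wise comparison of predicted and observed mortality. The data and predictions are synthetic and do not come from any ICU scoring system.
```python
# Sketch of the two attributes discussed above, on simulated data: discrimination
# via the c-statistic (ROC AUC) and calibration via predicted vs. observed
# mortality by risk decile. No real ICU scoring system is used here.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
p_pred = rng.beta(2, 8, size=2000)          # predicted probabilities of death
died = rng.binomial(1, p_pred)              # simulated in-hospital outcomes

print("c-statistic:", round(roc_auc_score(died, p_pred), 3))

edges = np.quantile(p_pred, np.linspace(0, 1, 11))
for lo, hi in zip(edges[:-1], edges[1:]):
    mask = (p_pred >= lo) & (p_pred <= hi)
    print(f"predicted {p_pred[mask].mean():.3f}   observed {died[mask].mean():.3f}   n={mask.sum()}")
```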

  12. Predictors of Academic Performance of University Students: An Application of the Goal Efficacy Model

    Science.gov (United States)

    Klomegah, Roger Yao

    2007-01-01

    This study utilized the goal-efficacy model to examine a) the extent to which index scores of student self-efficacy, self-set goals, assigned goals, and ability (four variables in the model) could predict academic performance of university students; and b) the best predictor of academic performance. The sample comprised 103 undergraduate students…

  13. University Library Strategy Development: A Conceptual Model of Researcher Performance to Inform Service Delivery

    Science.gov (United States)

    Maddox, Alexia; Zhao, Linlin

    2017-01-01

    This case study presents a conceptual model of researcher performance developed by Deakin University Library, Australia. The model aims to organize research performance data into meaningful researcher profiles, referred to as researcher typologies, which support the demonstration of research impact and value. Three dimensions shaping researcher…

  14. The Pieter Schippers story : Almost 40 years of developments in sonar performance modelling in the Netherlands

    NARCIS (Netherlands)

    Schippers, P.; Colin, M.E.G.D.; Beerens, S.P.

    2014-01-01

    This paper is dedicated to the work of Pieter Schippers and gives an overview of his achievements in sonar performance modelling over his career. This publication is the last of a long list, many of which were published at UDT [1-5]. A historical review is presented of the sonar performance modeling work

  15. Modeling Students' Problem Solving Performance in the Computer-Based Mathematics Learning Environment

    Science.gov (United States)

    Lee, Young-Jin

    2017-01-01

    Purpose: The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can…
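
    A hedged sketch of the modeling approach named in this record (regularized logistic regression) is shown below on synthetic log data; the features and coefficients are invented for illustration and are not those of the paper.
```python
# Hedged sketch of regularized logistic regression predicting whether a student
# solves a problem, using invented log features; neither the features nor the
# coefficients come from the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(3)
n = 1000
hints_used = rng.poisson(2, n)
time_on_task = rng.exponential(120, n)               # seconds spent on the problem
prior_success = rng.uniform(0, 1, n)                 # past success rate
X = np.column_stack([hints_used, time_on_task, prior_success])
logit = 1.5 * prior_success - 0.4 * hints_used - 0.002 * time_on_task + 0.5
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))        # simulated solved / not solved

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(X_tr, y_tr)
print("held-out accuracy:", round(accuracy_score(y_te, model.predict(X_te)), 3))
```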

  16. Development and Validation of a Path Analytic Model of Students' Performance in Chemistry.

    Science.gov (United States)

    Anamuah-Mensah, Jophus; And Others

    1987-01-01

    Reported the development and validation of an integrated model of performance on chemical concept-volumetric analysis. Model was tested on 265 chemistry students in eight schools. Results indicated that for subjects using algorithms without understanding, performance on volumetric analysis problems was not influenced by proportional reasoning…

  17. A predictive model of flight crew performance in automated air traffic control and flight management operations

    Science.gov (United States)

    1995-01-01

    Prepared ca. 1995. This paper describes Air-MIDAS, a model of pilot performance in interaction with varied levels of automation in flight management operations. The model was used to predict the performance of a two person flight crew responding to c...

  18. Assessing the performance of prediction models: A framework for traditional and novel measures

    NARCIS (Netherlands)

    E.W. Steyerberg (Ewout); A.J. Vickers (Andrew); N.R. Cook (Nancy); T.A. Gerds (Thomas); M. Gonen (Mithat); N. Obuchowski (Nancy); M. Pencina (Michael); M.W. Kattan (Michael)

    2010-01-01

    textabstractThe performance of prediction models can be assessed using a variety of methods and metrics. Traditional measures for binary and survival outcomes include the Brier score to indicate overall model performance, the concordance (or c) statistic for discriminative ability (or area under the
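
    For reference, the Brier score mentioned above has a standard definition, reproduced below in generic notation (not quoted from the paper).
```latex
% Brier score for N subjects with predicted probabilities p_i and observed binary
% outcomes y_i (lower is better; 0.25 corresponds to an uninformative p_i = 0.5).
\mathrm{Brier} = \frac{1}{N}\sum_{i=1}^{N}\left(p_i - y_i\right)^2
```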

  19. A Model for Setting Performance Objectives for Salmonella in the Broiler Supply Chain

    NARCIS (Netherlands)

    Tromp, S.O.; Franz, E.; Rijgersberg, H.; Asselt, van E.D.; Fels-Klerx, van der H.J.

    2010-01-01

    A stochastic model for setting performance objectives for Salmonella in the broiler supply chain was developed. The goal of this study was to develop a model by which performance objectives for Salmonella prevalence at various points in the production chain can be determined, based on a preset final

  20. Using dynamical uncertainty models estimating uncertainty bounds on power plant performance prediction

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob; Mataji, B.

    2007-01-01

    of the prediction error. These proposed dynamical uncertainty models result in an upper and lower bound on the predicted performance of the plant. The dynamical uncertainty models are used to estimate the uncertainty of the predicted performance of a coal-fired power plant. The proposed scheme, which uses dynamical...

  1. Performance prediction model for distributed applications on multicore clusters

    CSIR Research Space (South Africa)

    Khanyile, NP

    2012-07-01

    Full Text Available Distributed processing offers a way of successfully dealing with computationally demanding applications such as scientific problems. Over the years, researchers have investigated ways to predict the performance of parallel algorithms. Amdahl’s law...
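
    Since the record invokes Amdahl's law, its standard form is added below for reference (generic notation, not quoted from the source).
```latex
% Amdahl's law: speedup on N processors of a program whose fraction p is
% parallelizable; the serial fraction (1 - p) bounds the achievable speedup.
S(N) = \frac{1}{(1 - p) + p/N}, \qquad \lim_{N \to \infty} S(N) = \frac{1}{1 - p}
```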

  2. Architecture and Programming Models for High Performance Intensive Computation

    Science.gov (United States)

    2016-06-29

    written in funJava, a functional programming language. • Implement selected test programs in funJava for demonstrating performance and energy efficiency ... The funJava programming language is a version of standard Java, restricted to a functional subset. This simple functional programming ...

  3. Performance Calculations - and Appendix I - Model XC-120 (M-107)

    Science.gov (United States)

    1950-09-25

    ... and cargo ... Take off with cargo pack and fly to the halfway point; drop pack and return to base. Gross weight defined at base without pack. ... Special Conditions or Standard Aircraft Characteristics: performance presented herein is that required by reference ( ) for Standard Aircraft ... horsepower available as used in the performance calculations of this report is defined as: THP = ..., where BHP = engine brake horsepower from engine

  4. Evaluation of performance of distributed delay model for chemotherapy-induced myelosuppression.

    Science.gov (United States)

    Krzyzanski, Wojciech; Hu, Shuhua; Dunlavey, Michael

    2018-04-01

    A distributed delay model has been introduced that replaces the transit compartments in the classic model of chemotherapy-induced myelosuppression with a convolution integral. The maturation of granulocyte precursors in the bone marrow is described by the gamma probability density function with shape parameter ν. If ν is a positive integer, the distributed delay model coincides with the classic model with ν transit compartments. The purpose of this work was to evaluate the performance of the distributed delay model, with particular focus on deterministic model identifiability in the presence of the shape parameter. The classic model served as a reference for comparison. Previously published white blood cell (WBC) count data in rats receiving bolus doses of 5-fluorouracil were fitted by both models. The negative two log-likelihood objective function (-2LL) and running times were used as the major markers of performance. A local sensitivity analysis was done to evaluate the impact of ν on the pharmacodynamic response (WBC count). The ν estimate was 1.46 (CV 16.1%), compared to ν = 3 for the classic model. The difference of 6.78 in -2LL between the classic model and the distributed delay model implied that the latter performed significantly better than the former according to the log-likelihood ratio test (P = 0.009), although the overall performance was only modestly better. The running times were 1 s and 66.2 min, respectively. The long running time of the distributed delay model was attributed to the computationally intensive evaluation of the convolution integral. The sensitivity analysis revealed that ν strongly influences the WBC response by controlling cell proliferation and the elimination of WBCs from the circulation. In conclusion, the distributed delay model was deterministically identifiable from typical cytotoxic data. Its performance was modestly better than that of the classic model, at the cost of a significantly longer running time.
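
    For context, the gamma maturation-time density and the convolution that define the distributed delay described above can be written as follows; the parameterization (shape ν, mean delay T) is a standard one, and the notation is assumed rather than copied from the paper.
```latex
% Gamma maturation-time density with shape nu and mean delay T (scale T/nu);
% for integer nu this reduces to a chain of nu transit compartments. The delayed
% proliferation signal is the convolution of the input k(t) with g.
g(t) = \frac{t^{\nu-1}}{\Gamma(\nu)\,(T/\nu)^{\nu}}\,
       \exp\!\left(-\frac{\nu t}{T}\right), \qquad
k_{\mathrm{delayed}}(t) = \int_{0}^{t} g(t - s)\, k(s)\, ds
```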

  5. Development of a Stochastically-driven, Forward Predictive Performance Model for PEMFCs

    Science.gov (United States)

    Harvey, David Benjamin Paul

    A one-dimensional multi-scale coupled, transient, and mechanistic performance model for a PEMFC membrane electrode assembly has been developed. The model explicitly includes each of the 5 layers within a membrane electrode assembly and solves for the transport of charge, heat, mass, species, dissolved water, and liquid water. Key features of the model include the use of a multi-step implementation of the HOR reaction on the anode, agglomerate catalyst sub-models for both the anode and cathode catalyst layers, a unique approach that links the composition of the catalyst layer to key properties within the agglomerate model and the implementation of a stochastic input-based approach for component material properties. The model employs a new methodology for validation using statistically varying input parameters and statistically-based experimental performance data; this model represents the first stochastic input driven unit cell performance model. The stochastic input driven performance model was used to identify optimal ionomer content within the cathode catalyst layer, demonstrate the role of material variation in potential low performing MEA materials, provide explanation for the performance of low-Pt loaded MEAs, and investigate the validity of transient-sweep experimental diagnostic methods.

  6. Evaluating performance of simplified physically based models for shallow landslide susceptibility

    Directory of Open Access Journals (Sweden)

    G. Formetta

    2016-11-01

    Full Text Available Rainfall-induced shallow landslides can lead to loss of life and significant damage to private and public properties, transportation systems, etc. Predicting locations that might be susceptible to shallow landslides is a complex task and involves many disciplines: hydrology, geotechnical science, geology, hydrogeology, geomorphology, and statistics. Two main approaches are commonly used: statistical or physically based models. Reliable model applications involve automatic parameter calibration, objective quantification of the quality of susceptibility maps, and model sensitivity analyses. This paper presents a methodology to systematically and objectively calibrate, verify, and compare different models and model performance indicators in order to identify and select the models whose behavior is the most reliable for particular case studies. The procedure was implemented in a package of models for landslide susceptibility analysis and integrated in the NewAge-JGrass hydrological model. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit indices by comparing pixel-by-pixel model results and measurement data. The integration of the package in NewAge-JGrass uses other components, such as geographic information system tools, to manage input–output processes, and automatic calibration algorithms to estimate model parameters. The system was applied for a case study in Calabria (Italy), along the Salerno–Reggio Calabria highway, between Cosenza and Altilia. The area is extensively subject to rainfall-induced shallow landslides mainly because of its complex geology and climatology. The analysis was carried out considering all the combinations of the eight optimized indices and the three models. Parameter calibration, verification, and model performance assessment were performed by a comparison with a detailed landslide
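
    A small sketch of the pixel-by-pixel verification idea described above is given below, computing two common contingency-table indices for a binary susceptibility map against observed landslide pixels on synthetic grids; the paper's full set of eight goodness-of-fit indices is not reproduced.
```python
# Sketch of pixel-by-pixel verification of a binary susceptibility map against
# observed landslide pixels, using two common contingency-table indices; the
# grids are synthetic and the paper's full set of eight indices is not reproduced.
import numpy as np

rng = np.random.default_rng(4)
observed = rng.binomial(1, 0.1, size=(100, 100))                   # synthetic landslide map
predicted = np.clip(observed + rng.binomial(1, 0.05, size=(100, 100)), 0, 1)

tp = np.sum((predicted == 1) & (observed == 1))
fp = np.sum((predicted == 1) & (observed == 0))
fn = np.sum((predicted == 0) & (observed == 1))
tn = np.sum((predicted == 0) & (observed == 0))

accuracy = (tp + tn) / observed.size
csi = tp / (tp + fp + fn)          # critical success index (threat score)
print(f"accuracy={accuracy:.3f}  CSI={csi:.3f}")
```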

  7. FRAPCON-3: Modifications to fuel rod material properties and performance models for high-burnup application

    International Nuclear Information System (INIS)

    Lanning, D.D.; Beyer, C.E.; Painter, C.L.

    1997-12-01

    This volume describes the fuel rod material and performance models that were updated for the FRAPCON-3 steady-state fuel rod performance code. The property and performance models were changed to account for behavior at extended burnup levels up to 65 Gwd/MTU. The property and performance models updated were the fission gas release, fuel thermal conductivity, fuel swelling, fuel relocation, radial power distribution, solid-solid contact gap conductance, cladding corrosion and hydriding, cladding mechanical properties, and cladding axial growth. Each updated property and model was compared to well characterized data up to high burnup levels. The installation of these properties and models in the FRAPCON-3 code along with input instructions are provided in Volume 2 of this report and Volume 3 provides a code assessment based on comparison to integral performance data. The updated FRAPCON-3 code is intended to replace the earlier codes FRAPCON-2 and GAPCON-THERMAL-2. 94 refs., 61 figs., 9 tabs

  8. FRAPCON-3: Modifications to fuel rod material properties and performance models for high-burnup application

    Energy Technology Data Exchange (ETDEWEB)

    Lanning, D.D.; Beyer, C.E.; Painter, C.L.

    1997-12-01

    This volume describes the fuel rod material and performance models that were updated for the FRAPCON-3 steady-state fuel rod performance code. The property and performance models were changed to account for behavior at extended burnup levels up to 65 Gwd/MTU. The property and performance models updated were the fission gas release, fuel thermal conductivity, fuel swelling, fuel relocation, radial power distribution, solid-solid contact gap conductance, cladding corrosion and hydriding, cladding mechanical properties, and cladding axial growth. Each updated property and model was compared to well characterized data up to high burnup levels. The installation of these properties and models in the FRAPCON-3 code along with input instructions are provided in Volume 2 of this report and Volume 3 provides a code assessment based on comparison to integral performance data. The updated FRAPCON-3 code is intended to replace the earlier codes FRAPCON-2 and GAPCON-THERMAL-2. 94 refs., 61 figs., 9 tabs.

  9. Is "Turnaround" a useful model for low-performing schools?

    Directory of Open Access Journals (Sweden)

    Sandra Stotsky

    2015-04-01

    -authorization draft is approved by Congress, Arkansas may find Excellence for All one of the few intervention programs recommended for its low-income, low-achieving schools despite its costs and no evidence that this new program can do better than America’s Choice did. At this point, it seems reasonable to suggest that there should be no federal or state requirement for "turnaround" partners or the "turnaround" model, whether or not the programs they promote address Common Core’s standards. We clearly do not need the federal government pushing states and local districts to pay for consultants and services to solve the problems in low-achieving schools for which they have had no solutions.

  10. Optimization of A(2)O BNR processes using ASM and EAWAG Bio-P models: model performance.

    Science.gov (United States)

    El Shorbagy, Walid E; Radif, Nawras N; Droste, Ronald L

    2013-12-01

    This paper presents the performance of an optimization model for a biological nutrient removal (BNR) system using the anaerobic-anoxic-oxic (A(2)O) process. The formulated model simulates removal of organics, nitrogen, and phosphorus using a reduced International Water Association (IWA) Activated Sludge Model #3 (ASM3) model and a Swiss Federal Institute for Environmental Science and Technology (EAWAG) Bio-P module. Optimal sizing is attained considering capital and operational costs. Process performance is evaluated against the effect of influent conditions, effluent limits, and selected parameters of various optimal solutions, with the following results: an increase of influent temperature from 10 degrees C to 25 degrees C decreases the annual cost by about 8.5%; an increase of influent flow from 500 to 2500 m(3)/h triples the annual cost; the A(2)O BNR system is more sensitive to variations in influent ammonia than in phosphorus concentration; and the maximum growth rate of autotrophic biomass was the most sensitive kinetic parameter in the optimization model.

  11. Probabilistic Seismic Performance Model for Tunnel Form Concrete Building Structures

    Directory of Open Access Journals (Sweden)

    S. Bahram Beheshti Aval

    2016-12-01

    Full Text Available Despite the widespread construction of mass-produced housing with the tunnel form structural system across the world, no dedicated seismic code has been published for the design of this type of construction. A literature survey reveals only a few studies on the seismic behavior of this structural system. The seismic performance of structures constructed with this technique, assessed with reliable numerical results and with consideration of the factors affecting structural behavior, is therefore highly relevant to a seismic code development process. In addition, due to the newness of this system, the damage observed in past earthquakes, and especially the random nature of future earthquakes, a probabilistic approach and the development of fragility curves within a next-generation Performance-Based Earthquake Engineering (PBEE) framework are important. In this study, the seismic behavior of 2-, 5- and 10-story tunnel form structures with a regular plan is examined. First, the performance levels of these structures under the design earthquake (return period of 475 years) are assessed with time history analysis and the pushover method, and then, through incremental dynamic analysis, fragility curves are extracted for different levels of damage in walls and spandrels. The results indicate that the case study structures have high capacity and strength and show appropriate seismic performance. Moreover, all three structures remained within the immediate occupancy performance level.
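
    For reference, fragility curves of the kind extracted in this study are conventionally expressed as a lognormal cumulative distribution function; the standard form below is added for context and is not quoted from the paper.
```latex
% Lognormal fragility function: probability of reaching or exceeding damage state
% DS at intensity measure IM = x, with median capacity theta and dispersion beta.
P(DS \mid IM = x) = \Phi\!\left(\frac{\ln(x/\theta)}{\beta}\right)
```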

  12. Adding propensity scores to pure prediction models fails to improve predictive performance

    Directory of Open Access Journals (Sweden)

    Amy S. Nowacki

    2013-08-01

    Full Text Available Background. Propensity score usage seems to be growing in popularity leading researchers to question the possible role of propensity scores in prediction modeling, despite the lack of a theoretical rationale. It is suspected that such requests are due to the lack of differentiation regarding the goals of predictive modeling versus causal inference modeling. Therefore, the purpose of this study is to formally examine the effect of propensity scores on predictive performance. Our hypothesis is that a multivariable regression model that adjusts for all covariates will perform as well as or better than those models utilizing propensity scores with respect to model discrimination and calibration.Methods. The most commonly encountered statistical scenarios for medical prediction (logistic and proportional hazards regression were used to investigate this research question. Random cross-validation was performed 500 times to correct for optimism. The multivariable regression models adjusting for all covariates were compared with models that included adjustment for or weighting with the propensity scores. The methods were compared based on three predictive performance measures: (1 concordance indices; (2 Brier scores; and (3 calibration curves.Results. Multivariable models adjusting for all covariates had the highest average concordance index, the lowest average Brier score, and the best calibration. Propensity score adjustment and inverse probability weighting models without adjustment for all covariates performed worse than full models and failed to improve predictive performance with full covariate adjustment.Conclusion. Propensity score techniques did not improve prediction performance measures beyond multivariable adjustment. Propensity scores are not recommended if the analytical goal is pure prediction modeling.

  13. Effects of error covariance structure on estimation of model averaging weights and predictive performance

    Science.gov (United States)

    Lu, Dan; Ye, Ming; Meyer, Philip D.; Curtis, Gary P.; Shi, Xiaoqing; Niu, Xu-Feng; Yabusaki, Steve B.

    2013-01-01

    obtained from the iterative two-stage method also improved predictive performance of the individual models and model averaging in both synthetic and experimental studies.

  14. Performance Analysis of Several GPS/Galileo Precise Point Positioning Models.

    Science.gov (United States)

    Afifi, Akram; El-Rabbany, Ahmed

    2015-06-19

    This paper examines the performance of several precise point positioning (PPP) models, which combine dual-frequency GPS/Galileo observations in the un-differenced and between-satellite single-difference (BSSD) modes. These include the traditional un-differenced model, the decoupled clock model, the semi-decoupled clock model, and the between-satellite single-difference model. We take advantage of the IGS-MGEX network products to correct for the satellite differential code biases and the orbital and satellite clock errors. Natural Resources Canada's GPSPace PPP software is modified to handle the various GPS/Galileo PPP models. A total of six data sets of GPS and Galileo observations at six IGS stations are processed to examine the performance of the various PPP models. It is shown that the traditional un-differenced GPS/Galileo PPP model, the GPS decoupled clock model, and the semi-decoupled clock GPS/Galileo PPP model improve the convergence time by about 25% in comparison with the un-differenced GPS-only model. In addition, the semi-decoupled GPS/Galileo PPP model improves the solution precision by about 25% compared to the traditional un-differenced GPS/Galileo PPP model. Moreover, the BSSD GPS/Galileo PPP model improves the solution convergence time by about 50%, in comparison with the un-differenced GPS PPP model, regardless of the type of BSSD combination used. As well, the BSSD model improves the precision of the estimated parameters by about 50% and 25% when the loose and the tight combinations are used, respectively, in comparison with the un-differenced GPS-only model. Comparable results are obtained through the tight combination when either a GPS or a Galileo satellite is selected as a reference.

  15. Range performance calculations using the NVEOL-Georgia Tech Research Institute 0.1- to 100-GHz radar performance model

    Science.gov (United States)

    Rodak, S. P.; Thomas, N. I.

    1983-05-01

    A computer model that can be used to calculate radar range performance at any frequency in the 0.1- to 100-GHz electromagnetic spectrum is described. Several numerical examples are used to demonstrate how to use the radar range performance model. Input/output documentation is included for each case that was run on the MERADCOM CDC 6600 computer at Fort Belvoir, Virginia.

  16. Examination of atmospheric dynamic model's performance over complex terrain under temporally changing synoptic meteorological conditions

    International Nuclear Information System (INIS)

    Nagai, Haruyasu; Yamazawa, Hiromi

    1995-01-01

    The mesoscale atmospheric dynamic model, a submodel of the numerical atmospheric dispersion model named PHYSIC, was improved and its performance was examined in a coastal area with complex terrain. To introduce temporally changing synoptic meteorological conditions into the model, the initial and boundary conditions were improved. Moreover, the land surface temperature calculations were modified so that the model can be applied to snow-covered areas. These improvements worked effectively in model simulations of four series of observations during winter and summer 1992. The model successfully simulated the wind fields and their temporal variations under conditions of strong westerlies and a land and sea breeze. Limitations of the model's performance caused by the temporal and spatial resolution of the input data are also discussed. (author)

  17. Beyond Performativity: A Pragmatic Model of Teacher Professional Learning

    Science.gov (United States)

    Lloyd, Margaret; Davis, James P.

    2018-01-01

    The intent and content of teacher professional learning has changed in recent times to meet the demands of performativity. In this article, we offer and demonstrate a pragmatic way to map teacher professional learning that both meets current demands and secures a place for teacher-led catalytic learning. To achieve this, we position identified…

  18. Performance Evaluation and Modeling of Internet Traffic of an ...

    African Journals Online (AJOL)

    Academic institutions require internet services to facilitate research and learning for students and members of staff. Moreover, the Federal University of Technology, Akure, a technologically oriented institution, deserves viable internet services for these purposes. Optimal performance and continual delivery of ...

  19. Gender consequences of a national performance-based funding model

    DEFF Research Database (Denmark)

    Nielsen, Mathias Wullum

    2017-01-01

    -regarded’ and highly selective journals and book publishers, and 1 and 5 points for equivalent scientific contributions via ‘normal level’ channels. On the basis of bibliometric data, the study shows that the BRI considerably widens the existing gender gap in researcher performance, since men on average receive more...... privileges collaborative research, which disadvantages women due to gender differences in collaborative network relations....

  20. A Brazilian theatre model meets Zulu performance conventions ...

    African Journals Online (AJOL)

    In July 2002, Christopher Hurst supervised Mbongiseni Buthelezi, a postgraduate student in Drama and Performance Studies, who conducted a Prison Theatre project at the Medium B Prison (a men's maximum security prison) at Westville Prison in Durban. Buthelezi used theatre workshop techniques to create a play that ...