WorldWideScience

Sample records for model requires evaluation

  1. How Many Model Evaluations Are Required To Predict The AEP Of A Wind Power Plant?

    International Nuclear Information System (INIS)

    Murcia, J P; Réthoré, P E; Natarajan, A; Sørensen, J D

    2015-01-01

    Wind farm flow models have advanced considerably with the use of large eddy simulations (LES) and Reynolds-averaged Navier-Stokes (RANS) computations. The main limitation of these techniques is their high computational cost, which makes their use for wind farm annual energy production (AEP) predictions expensive. The objective of the present paper is to minimize the number of model evaluations required to capture the wind power plant's AEP using stationary wind farm flow models. Polynomial chaos techniques are proposed based on an arbitrary Weibull-distributed wind speed and a von Mises-distributed wind direction. The correlation between wind direction and wind speed is captured by defining the Weibull parameters as functions of wind direction. To evaluate the accuracy of these methods, the expectation and variance of the wind farm power distribution are compared against the traditional binning method with trapezoidal and Simpson's integration rules. The wind farm flow model used in this study is the semi-empirical wake model developed by Larsen [1]. Three test cases are studied: a single turbine, a simple offshore wind power plant, and a real one. A reduced number of model evaluations for a general wind power plant is proposed based on the convergence of the present method for each case. (paper)
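    The reference ("binning") AEP estimator that polynomial chaos is compared against can be sketched as follows. This is an illustrative toy only: the power curve and the direction-dependent Weibull parameters below are hypothetical placeholders, not the Larsen wake model or any real wind farm data.

```python
import numpy as np

def power(u):
    """Toy normalized power curve: cubic ramp from cut-in (3 m/s) to rated (12 m/s)."""
    p = np.clip(((u - 3.0) / (12.0 - 3.0)) ** 3, 0.0, 1.0)
    return np.where(u > 25.0, 0.0, p)  # cut-out above 25 m/s

def weibull_pdf(u, k, A):
    """Weibull density with shape k and scale A."""
    return (k / A) * (u / A) ** (k - 1) * np.exp(-((u / A) ** k))

def expected_power(n_speed_evals):
    """Mean power over 36 direction sectors, each with its own Weibull fit."""
    theta = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)
    k = 2.0 + 0.2 * np.sin(theta)   # shape varies with wind direction
    A = 8.0 + 1.5 * np.cos(theta)   # scale varies with wind direction
    u = np.linspace(0.01, 30.0, n_speed_evals)
    sector_means = []
    for ki, Ai in zip(k, A):
        f = power(u) * weibull_pdf(u, ki, Ai)
        # trapezoidal rule over wind speed (the binning reference method)
        sector_means.append(np.sum((f[1:] + f[:-1]) * np.diff(u)) / 2.0)
    return float(np.mean(sector_means))  # uniform direction weighting for brevity

for n in (10, 40, 160):  # cost grows linearly with evaluations per sector
    print(n, round(expected_power(n), 4))
```

    The convergence of such an estimate with the number of wind-speed evaluations per sector is exactly the cost the paper seeks to reduce, since each evaluation of a real wake model is expensive.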

  2. How Many Model Evaluations Are Required To Predict The AEP Of A Wind Power Plant?

    DEFF Research Database (Denmark)

    Murcia Leon, Juan Pablo; Réthoré, Pierre-Elouan; Natarajan, Anand

    2015-01-01

    (AEP) predictions expensive. The objective of the present paper is to minimize the number of model evaluations required to capture the wind power plant's AEP using stationary wind farm flow models. Polynomial chaos techniques are proposed based on arbitrary Weibull distributed wind speed and von Mises...... distributed wind direction. The correlation between wind direction and wind speed is captured by defining Weibull parameters as functions of wind direction. In order to evaluate the accuracy of these methods the expectation and variance of the wind farm power distributions are compared against...... the traditional binning method with trapezoidal and Simpson's integration rules. The wind farm flow model used in this study is the semi-empirical wake model developed by Larsen [1]. Three test cases are studied: a single turbine, a simple and a real offshore wind power plant. A reduced number of model...

  3. Requirements Modeling with the Aspect-oriented User Requirements Notation (AoURN): A Case Study

    Science.gov (United States)

    Mussbacher, Gunter; Amyot, Daniel; Araújo, João; Moreira, Ana

    The User Requirements Notation (URN) is a recent ITU-T standard that supports requirements engineering activities. The Aspect-oriented URN (AoURN) adds aspect-oriented concepts to URN, creating a unified framework that allows for scenario-based, goal-oriented, and aspect-oriented modeling. AoURN is applied to the car crash crisis management system (CCCMS), modeling its functional and non-functional requirements (NFRs). AoURN generally models all use cases, NFRs, and stakeholders as individual concerns, and provides general guidelines for concern identification. AoURN handles interactions between concerns, capturing their dependencies and conflicts as well as their resolutions. We present a qualitative comparison of aspect-oriented techniques for scenario-based and goal-oriented requirements engineering. An evaluation based on metrics adapted from the literature, together with a task-based evaluation, suggests that AoURN models are more scalable than URN models and exhibit better modularity, reusability, and maintainability.

  4. Evaluation of olive flowering at low latitude sites in Argentina using a chilling requirement model

    Energy Technology Data Exchange (ETDEWEB)

    Aybar, V.E.; Melo-Abreu, J.P. de; Searles, P.S.; Matias, A.G.; Del Rio, C.; Caballero, C. M.; Rousseaux, M.C.

    2015-07-01

    Olive production has expanded significantly from the Mediterranean Basin into the New World over the last two decades. In some cases, cultivars of European origin have been introduced at a large commercial scale with little previous evaluation of potential productivity. The objective of this study was to evaluate whether a temperature-driven simulation model developed in the Mediterranean Basin to predict normal flowering occurrence and flowering date using cultivar-specific thermal requirements was suitable for the low-latitude areas of Northwest Argentina. The model was validated at eight sites over several years and a wide elevation range (350–1200 m above mean sea level) for three cultivars (‘Arbequina’, ‘Frantoio’, ‘Leccino’) with potentially different chilling requirements. In ‘Arbequina’, normal flowering was observed at almost all sites and in all years, while normal flowering events in ‘Frantoio’ and ‘Leccino’ were uncommon. The model successfully predicted whether flowering would be normal in 92% and 83% of the cases for ‘Arbequina’ and ‘Frantoio’, respectively, but was somewhat less successful for ‘Leccino’ (61%). When flowering occurred, the predicted flowering date was within ±7 days of the observed date in 71% of the cases. Overall, the model results indicate that cultivar-specific simulation models may be used as an approximate tool to predict whether individual cultivars will be successful in new growing areas. In Northwest Argentina, the model could be used to identify cultivars to replace ‘Frantoio’ and ‘Leccino’ and to simulate global warming scenarios. (Author)
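    The core of a chilling-requirement test can be sketched as a chill-hours accumulation rule. This is a minimal stand-in only: the paper's model uses cultivar-specific thermal requirements and a more detailed formulation, and the temperature band (0–7.2 °C) and the 400 h requirement below are illustrative placeholders.

```python
# Simplified chill-hours sketch of the normal-flowering test.
def chill_hours(hourly_temps_c, lower=0.0, upper=7.2):
    """Count hours spent inside the chilling-effective temperature band."""
    return sum(1 for t in hourly_temps_c if lower <= t <= upper)

def normal_flowering_predicted(hourly_temps_c, required_hours):
    """A cultivar flowers normally if its chilling requirement is met."""
    return chill_hours(hourly_temps_c) >= required_hours

cool_site = [5.0] * 600 + [12.0] * 400   # e.g. a high-elevation winter
warm_site = [5.0] * 200 + [15.0] * 800   # e.g. a low-elevation winter

print(normal_flowering_predicted(cool_site, 400))  # True
print(normal_flowering_predicted(warm_site, 400))  # False
```

    A rule of this shape explains the cultivar contrast in the abstract: a low-requirement cultivar (like ‘Arbequina’) meets its threshold at most sites, while high-requirement cultivars (like ‘Frantoio’ and ‘Leccino’) rarely do at warm, low-latitude sites.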

  5. Evaluating the Impact of Design-Driven Requirements Using SysML

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research will develop SysML requirements modeling patterns and scripts to automate the evaluation of the impact of design driven requirements....

  6. Evaluation and decision of products conceptual design schemes based on customer requirements

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Hong Zhong; Li, Yan Feng; Liu, Yu; Wang, Zhonglai [University of Electronic Science and Technology of China, Sichuan (China); Liu, Wenhai [China Science Patent Trademark Agents Ltd., Beijing (China)

    2011-09-15

    Within a competitive market environment, understanding customer requirements is crucial for all corporations seeking to obtain market share and survive the competition. Only products that closely meet customer requirements can win in the marketplace. Therefore, customer requirements play a very important role in the evaluation and decision process for conceptual design schemes of products. In this paper, an evaluation and decision method based on customer requirements is presented. It uses the importance of each customer requirement, the satisfaction degree of each evaluation metric with respect to its specification, and evaluation metrics that model customer requirements to evaluate, via the proposed BP neural networks, the degree to which each design scheme satisfies specific customer requirements. In the evaluation and decision process, fuzzy sets describe the importance of customer requirements, the relationships between customer requirements and evaluation metrics, and the satisfaction degree of each scheme with respect to customer requirements, while crisp sets describe the satisfaction degree of each metric with respect to its specification. The effectiveness of the proposed method is demonstrated by an example of front-suspension fork design for mountain bikes.
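    The scheme-ranking step can be sketched as follows. Note the hedge: the paper evaluates schemes with a trained BP neural network, whereas here a simple fuzzy-weighted average stands in for that aggregation, and all importances and satisfaction degrees are hypothetical values.

```python
# Sketch of ranking design schemes by fuzzy-weighted requirement satisfaction.
# Importances are triangular fuzzy numbers defuzzified by their centroid;
# satisfaction degrees are crisp numbers in [0, 1].

def centroid(tri):
    a, b, c = tri  # (low, mode, high) of a triangular fuzzy number
    return (a + b + c) / 3.0

importance = [centroid(t) for t in [(0.6, 0.8, 1.0),   # requirement 1
                                    (0.3, 0.5, 0.7),   # requirement 2
                                    (0.1, 0.2, 0.3)]]  # requirement 3

# Satisfaction degree of each design scheme with respect to each requirement.
schemes = {"A": [0.9, 0.6, 0.4], "B": [0.7, 0.8, 0.9]}

def score(satisfaction):
    """Importance-weighted average satisfaction, weights normalized to sum 1."""
    total = sum(importance)
    return sum(w / total * s for w, s in zip(importance, satisfaction))

ranking = sorted(schemes, key=lambda s: score(schemes[s]), reverse=True)
print({s: round(score(v), 3) for s, v in schemes.items()}, "ranking:", ranking)
```

    Scheme B wins here despite scheme A scoring higher on the most important requirement, because B satisfies the remaining requirements much better; a trained network can additionally capture nonlinear interactions that a weighted average cannot.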

  7. Evaluation and decision of products conceptual design schemes based on customer requirements

    International Nuclear Information System (INIS)

    Huang, Hong Zhong; Li, Yan Feng; Liu, Yu; Wang, Zhonglai; Liu, Wenhai

    2011-01-01

    Within a competitive market environment, understanding customer requirements is crucial for all corporations seeking to obtain market share and survive the competition. Only products that closely meet customer requirements can win in the marketplace. Therefore, customer requirements play a very important role in the evaluation and decision process for conceptual design schemes of products. In this paper, an evaluation and decision method based on customer requirements is presented. It uses the importance of each customer requirement, the satisfaction degree of each evaluation metric with respect to its specification, and evaluation metrics that model customer requirements to evaluate, via the proposed BP neural networks, the degree to which each design scheme satisfies specific customer requirements. In the evaluation and decision process, fuzzy sets describe the importance of customer requirements, the relationships between customer requirements and evaluation metrics, and the satisfaction degree of each scheme with respect to customer requirements, while crisp sets describe the satisfaction degree of each metric with respect to its specification. The effectiveness of the proposed method is demonstrated by an example of front-suspension fork design for mountain bikes.

  8. Conceptual modelling of human resource evaluation process

    Directory of Open Access Journals (Sweden)

    Negoiţă Doina Olivia

    2017-01-01

    Taking into account the highly diverse tasks that employees must fulfil to meet the complex requirements of today's consumers, the human resource within an enterprise has become a strategic element for developing and exploiting products that meet market expectations. Organizations therefore encounter difficulties when approaching the human resource evaluation process. Hence, the aim of the current paper is to design a conceptual model of this process that allows enterprises to develop a specific methodology. To design the conceptual model, Business Process Modelling instruments were employed: the Adonis Community Edition Business Process Management Toolkit with the ADONIS BPMS notation. The conceptual model was developed based on in-depth secondary research regarding the human resource evaluation process. The proposed conceptual model represents a generic workflow (sequential and/or simultaneous activities), which can be extended to cover an enterprise's own requirements when conducting a human resource evaluation process. Enterprises can benefit from using software instruments for business process modelling, as these enable process analysis and evaluation (predefined/specific queries) as well as model optimization (simulations).

  9. A fuzzy model for exploiting customer requirements

    Directory of Open Access Journals (Sweden)

    Zahra Javadirad

    2016-09-01

    Nowadays, quality function deployment (QFD) is one of the total quality management tools, in which customers' views and requirements are perceived and, using various techniques, the production requirements and operations are improved. The QFD department, after identifying and analyzing the competitors, takes customers' feedback to meet their demands for the products compared with the competitors. In this study, a comprehensive model for assessing the importance of customer requirements in an organization's products or services is proposed. The proposed study uses linguistic variables, as a more comprehensive approach, to increase the precision of the evaluations. The importance of these requirements specifies the strengths and weaknesses of the organization in meeting the requirements relative to competitors. The results of these experiments show that the proposed method performs better than the other methods.

  10. The EU model evaluation group

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1999-01-01

    The Model Evaluation Group (MEG) was launched in 1992, growing out of the Major Technological Hazards Programme within EU/DG XII. The goal of MEG was to improve the culture in which models were developed, particularly by encouraging voluntary model evaluation procedures based on a formalised and consensus protocol. The evaluation was intended to assess the fitness-for-purpose of the models being used as a measure of their quality. The approach adopted focused on developing a generic model evaluation protocol and subsequently targeting it onto specific areas of application. Five such developments have been initiated: on heavy gas dispersion, liquid pool fires, gas explosions, human factors, and momentum fires. The quality of models is an important element in complying with the 'Seveso Directive', which requires that the safety reports submitted to the authorities comprise an assessment of the extent and severity of the consequences of identified major accidents. Further, the quality of models becomes important in the land-use planning process, where the proximity of industrial sites to vulnerable areas may be critical. (Copyright (c) 1999 Elsevier Science B.V., Amsterdam. All rights reserved.)

  11. A Hybrid Parallel Execution Model for Logic Based Requirement Specifications (Invited Paper)

    Directory of Open Access Journals (Sweden)

    Jeffrey J. P. Tsai

    1999-05-01

    It is well known that undiscovered errors in a requirements specification are extremely expensive to fix when discovered in the software maintenance phase. Errors in the requirements phase can be reduced through validation and verification of the requirements specification, and many logic-based requirements specification languages have been developed to achieve these goals. However, the execution and reasoning of a logic-based requirements specification can be very slow. An effective way to improve performance is to execute and reason about the logic-based requirements specification in parallel. In this paper, we present a hybrid model to facilitate the parallel execution of a logic-based requirements specification language. A logic-based specification is first processed by a data-dependency analysis technique that finds all the mode combinations that exist within a specification clause. This mode information is used to support a novel hybrid parallel execution model, which combines both top-down and bottom-up evaluation strategies. The new execution model can find the failure in the deepest node of the search tree at an early stage of the evaluation, thereby reducing the total number of nodes searched in the tree, the total number of processes that need to be generated, and the total number of communication channels needed in the search process. A simulator has been implemented to analyze the execution behavior of the new model. Experiments show significant improvement on several criteria.

  12. Site evaluation for nuclear installations. Safety requirements

    International Nuclear Information System (INIS)

    2003-01-01

    This Safety Requirements publication supersedes the Code on the Safety of Nuclear Power Plants: Siting, which was issued in 1988 as Safety Series No. 50-C-S (Rev. 1). It takes account of developments relating to site evaluations for nuclear installations since the Code on Siting was last revised. These developments include the issuing of the Safety Fundamentals publication on The Safety of Nuclear Installations, and the revision of various safety standards and other publications relating to safety. Requirements for site evaluation are intended to ensure adequate protection of site personnel, the public and the environment from the effects of ionizing radiation arising from nuclear installations. It is recognized that there are steady advances in technology and scientific knowledge, in nuclear safety and in what is considered adequate protection. Safety requirements change with these advances and this publication reflects the present consensus among States. This Safety Requirements publication was prepared under the IAEA programme on safety standards for nuclear installations. It establishes requirements and provides criteria for ensuring safety in site evaluation for nuclear installations. The Safety Guides on site evaluation listed in the references provide recommendations on how to meet the requirements established in this Safety Requirements publication. The objective of this publication is to establish the requirements for the elements of a site evaluation for a nuclear installation so as to characterize fully the site specific conditions pertinent to the safety of a nuclear installation. 
The purpose is to establish requirements for criteria, to be applied as appropriate to site and site-installation interaction in operational states and accident conditions, including those that could lead to emergency measures for: (a) Defining the extent of information on a proposed site to be presented by the applicant; (b) Evaluating a proposed site to ensure that the site

  13. 48 CFR 652.219-73 - Mentor Requirements and Evaluation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Mentor Requirements and... Mentor Requirements and Evaluation. As prescribed in 619.202-70(o)(2), insert the following clause: Mentor Requirements and Evaluation (APR 2004) (a) Mentor and protégé firms shall submit an evaluation to...

  14. 48 CFR 1052.219-75 - Mentor Requirements and Evaluation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Mentor Requirements and... Mentor Requirements and Evaluation. As prescribed in DTAR 1019.202-70, insert the following clause: Mentor Requirements and Evaluation (JAN 2000) (a) Mentor and protégé firms shall submit an evaluation to...

  15. Evaluating and reducing a model of radiocaesium soil-plant uptake

    Energy Technology Data Exchange (ETDEWEB)

    Tarsitano, D.; Young, S.D. [School of Biosciences, University of Nottingham, University Park, Nottingham, NG7 2RD (United Kingdom); Crout, N.M.J., E-mail: neil.crout@nottingham.ac.u [School of Biosciences, University of Nottingham, University Park, Nottingham, NG7 2RD (United Kingdom)

    2011-03-15

    An existing model of radiocaesium transfer to grasses was extended to include wheat and barley and parameterised using data from a wide range of soils and contact times. The model structure was revised and evaluated using a subset of the available data that was not used for model parameterisation. The resulting model was then used as a basis for systematic model reduction to test the utility of the model components. This analysis suggested that four model variables (relating to radiocaesium adsorption on organic matter and the pH sensitivity of the soil solution potassium concentration) and one model input (pH) are not required. The results of this analysis were used to develop a reduced model, which was further evaluated by comparison with observations. The reduced model had improved empirical performance and fewer adjustable parameters and soil characteristic inputs. Research highlights: A model of plant radiocaesium uptake is evaluated and re-parameterised. The representation of time-dependent changes in plant uptake is improved. Model reduction is applied to evaluate the model structure. A reduced model is identified which outperforms the previously reported model. The reduced model requires fewer soil-specific inputs.
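    The reduction test itself is generic and can be sketched on synthetic data: refit the model with a candidate component removed and check whether held-out error degrades. The linear toy data below stand in for the radiocaesium dataset; this is not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)
y = 2.0 * x + 0.5 + rng.normal(0.0, 0.05, 200)  # true relation is linear

train, test = slice(0, 150), slice(150, 200)  # parameterisation vs evaluation subsets

def holdout_rmse(degree):
    """Fit a polynomial on the training subset; score it on the held-out subset."""
    coef = np.polyfit(x[train], y[train], degree)
    pred = np.polyval(coef, x[test])
    return float(np.sqrt(np.mean((pred - y[test]) ** 2)))

full_rmse = holdout_rmse(2)     # "full" model keeps a quadratic term
reduced_rmse = holdout_rmse(1)  # "reduced" model drops it
print(round(full_rmse, 4), round(reduced_rmse, 4))
# Comparable held-out errors with fewer parameters: the extra term is not required.
```

    The abstract's conclusion follows the same logic at larger scale: components whose removal does not worsen (and here even improves) empirical performance are dropped, yielding a model with fewer adjustable parameters and fewer required inputs.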

  16. Model evaluation methodology applicable to environmental assessment models

    International Nuclear Information System (INIS)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models can be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities, and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established before such models are used. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. It is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model, and importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling is offered as a promising candidate for performing sensitivity analyses. Several different ways of viewing the validity of a model are presented, and criteria are given for selecting models for environmental assessment purposes.
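    Latin hypercube sampling, mentioned above as a candidate for the sensitivity-analysis task, can be sketched in a few lines: each of n equal-probability strata contributes exactly one sample per parameter, with strata shuffled independently per dimension. This is a minimal illustrative implementation, not tied to any particular assessment model.

```python
import numpy as np

def latin_hypercube(n, dims, rng):
    """Draw n points in [0, 1)^dims with one point per stratum in each dimension."""
    samples = np.empty((n, dims))
    for d in range(dims):
        # one uniform draw inside each of the n strata [i/n, (i+1)/n)
        strata = (np.arange(n) + rng.uniform(0.0, 1.0, n)) / n
        samples[:, d] = rng.permutation(strata)  # shuffle strata per dimension
    return samples

rng = np.random.default_rng(42)
pts = latin_hypercube(8, 2, rng)
# Stratification property: each interval [i/8, (i+1)/8) holds one point per axis.
for d in range(2):
    occupied = sorted(int(v * 8) for v in pts[:, d])
    assert occupied == list(range(8))
print(pts.shape)
```

    Compared with plain Monte Carlo, this stratification covers each parameter's marginal distribution evenly with far fewer model runs, which is why it suits expensive assessment models; the unit-cube samples would then be mapped through each parameter's inverse CDF (e.g. a lognormal one).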

  17. From requirements to Java in a snap model-driven requirements engineering in practice

    CERN Document Server

    Smialek, Michal

    2015-01-01

    This book provides a coherent methodology for Model-Driven Requirements Engineering which stresses the systematic treatment of requirements within the realm of modelling and model transformations. The underlying basic assumption is that detailed requirements models are used as first-class artefacts playing a direct role in constructing software. To this end, the book presents the Requirements Specification Language (RSL) that allows precision and formality, which eventually permits automation of the process of turning requirements into a working system by applying model transformations and co

  18. Visual soil evaluation - future research requirements

    Science.gov (United States)

    Emmet-Booth, Jeremy; Forristal, Dermot; Fenton, Owen; Ball, Bruce; Holden, Nick

    2017-04-01

    A review of Visual Soil Evaluation (VSE) techniques (Emmet-Booth et al., 2016) highlighted their established utility for soil quality assessment, though some limitations were identified. (1) The examination of aggregate size, visible intra-porosity, and shape forms a key assessment criterion in almost all methods, thus limiting evaluation to structural form. The addition of criteria that examine structure holistically may be desirable. For example, structural stability can be indicated using dispersion tests or by examining soil surface crusting, while the assessment of soil colour may indirectly indicate soil organic matter content, a contributor to stability. Organic matter assessment may also indicate structural resilience, along with rooting, earthworm numbers, or shrinkage cracking. (2) Soil texture may influence results or impede method deployment. Modification of procedures to account for extreme texture variation is desirable. For example, evidence of compaction in sandy or single-grain soils differs greatly from that in clayey soils. Some procedures incorporate separate classification systems or adjust deployment based on texture. (3) Research into the impacts of soil moisture content on VSE evaluation criteria is required. Criteria such as rupture resistance and shape may be affected by moisture content. It is generally recommended that methods are deployed on moist soils, and quantification of the influence of moisture variation on results is necessary. (4) Robust sampling strategies for method deployment are required. The handling of spatial variation differs between methods, but where methods can be deployed over large areas, clear instruction on sampling is required. Additionally, as emphasis has so far been placed on the agricultural production function of soil, the ability of VSE to explore structural quality in terms of carbon storage, water purification, and biodiversity support also requires research. References: Emmet-Booth, J.P., Forristal, P.D., Fenton, O., Ball, B

  19. 48 CFR 1852.219-79 - Mentor requirements and evaluation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Mentor requirements and... and Clauses 1852.219-79 Mentor requirements and evaluation. As prescribed in 1819.7215, insert the following clause: Mentor Requirements and Evaluation (MAY 2009) (a) The purpose of the NASA Mentor-Protégé...

  20. 48 CFR 552.219-76 - Mentor Requirements and Evaluation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Mentor Requirements and....219-76 Mentor Requirements and Evaluation. As prescribed in 519.7017(b), insert the following clause: Mentor Requirements and Evaluation (SEP 2009) (a) The purpose of the GSA Mentor-Protégé Program is for a...

  21. 48 CFR 752.219-71 - Mentor requirements and evaluation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Mentor requirements and....219-71 Mentor requirements and evaluation. As prescribed in AIDAR 719.273-11(b), insert the following clause: Mentor Requirements and Evaluation (July 13, 2007) (a) Mentor and Protégé firms shall submit an...

  22. A Novel Model for Security Evaluation for Compliance

    DEFF Research Database (Denmark)

    Hald, Sara Ligaard; Pedersen, Jens Myrup; Prasad, Neeli R.

    2011-01-01

    for Compliance (SEC) model offers a lightweight alternative for use by decision makers to get a quick overview of the security attributes of different technologies for easy comparison and requirement compliance evaluation. The scientific contribution is this new approach to security modelling as well...

  23. Evaluation of animal models of neurobehavioral disorders

    Directory of Open Access Journals (Sweden)

    Nordquist Rebecca E

    2009-02-01

    Abstract: Animal models play a central role in all areas of biomedical research. The process of animal model building, development, and evaluation has rarely been addressed systematically, despite the long history of using animal models in the investigation of neuropsychiatric disorders and behavioral dysfunctions. An iterative, multi-stage trajectory for developing animal models and assessing their quality is proposed. The process starts with defining the purpose(s) of the model, preferentially based on hypotheses about brain-behavior relationships. Then the model is developed and tested. The evaluation of the model takes scientific and ethical criteria into consideration. Model development requires a multidisciplinary approach. Preclinical and clinical experts should establish a set of scientific criteria which a model must meet. The scientific evaluation consists of assessing the replicability/reliability, the predictive, construct, and external validity/generalizability, and the relevance of the model. We emphasize the role of (systematic and extended) replications in the course of the validation process. One may apply a multiple-tiered 'replication battery' to estimate the reliability/replicability, validity, and generalizability of results. Compromised welfare is inherent in many deficiency models in animals. Unfortunately, 'animal welfare' is a vaguely defined concept, making it difficult to establish exact evaluation criteria. Weighing the animal's welfare, and considering whether action is indicated to reduce its discomfort, must accompany the scientific evaluation at every stage of the model building and evaluation process. Animal model building should be discontinued if the model does not meet the preset scientific criteria or when animal welfare is severely compromised. The application of the evaluation procedure is exemplified using the rat with neonatal hippocampal lesion as a proposed model of schizophrenia. In a manner congruent to

  24. Requirements Modeling with Agent Programming

    Science.gov (United States)

    Dasgupta, Aniruddha; Krishna, Aneesh; Ghose, Aditya K.

    Agent-oriented conceptual modeling notations are highly effective in representing requirements from an intentional stance and answering questions such as what goals exist, how key actors depend on each other, and what alternatives must be considered. In this chapter, we review an approach to executing i* models by translating them into a set of interacting agents implemented in the CASO language, and we suggest how reasoning can be performed with the requirements modeled (both functional and non-functional) using i* models. We particularly incorporate deliberation into the agent design. This allows us to benefit from the complementary representational capabilities of the two frameworks.

  25. Sizing and scaling requirements of a large-scale physical model for code validation

    International Nuclear Information System (INIS)

    Khaleel, R.; Legore, T.

    1990-01-01

    Model validation is an important consideration in the application of a code for performance assessment, and therefore in assessing the long-term behavior of the engineered and natural barriers of a geologic repository. Scaling considerations relevant to porous media flow are reviewed. An analysis approach is presented for determining the sizing requirements of a large-scale hydrology physical model. The physical model will be used to validate performance assessment codes that evaluate the long-term behavior of the repository isolation system. Numerical simulation results for sizing requirements are presented for a porous medium model in which the media properties are spatially uncorrelated.

  26. A systems evaluation model for selecting spent nuclear fuel storage concepts

    International Nuclear Information System (INIS)

    Postula, F.D.; Finch, W.C.; Morissette, R.P.

    1982-01-01

    This paper describes a system evaluation approach used to identify and evaluate monitored, retrievable fuel storage concepts that fulfill ten key criteria for meeting the functional requirements and system objectives of the National Nuclear Waste Management Program. The selection criteria include health and safety, schedules, costs, socio-economic factors and environmental factors. The methodology used to establish the selection criteria, develop a weight of importance for each criterion and assess the relative merit of each storage system is discussed. The impact of cost relative to technical criteria is examined along with experience in obtaining relative merit data and its application in the model. Topics considered include spent fuel storage requirements, functional requirements, preliminary screening, and Monitored Retrievable Storage (MRS) system evaluation. It is concluded that the proposed system evaluation model is universally applicable when many concepts in various stages of design and cost development need to be evaluated

  27. A Functional Model of Quality Assurance for Psychiatric Hospitals and Corresponding Staffing Requirements.

    Science.gov (United States)

    Kamis-Gould, Edna; And Others

    1991-01-01

    A model for quality assurance (QA) in psychiatric hospitals is described. Its functions (general QA, utilization review, clinical records, evaluation, management information systems, risk management, and infection control), subfunctions, and corresponding staffing requirements are reviewed. This model was designed to foster standardization in QA…

  28. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-01-01

The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a "living document" that will be modified over the course of the execution of this work

  9. Towards systematic evaluation of crop model outputs for global land-use models

    Science.gov (United States)

    Leclere, David; Azevedo, Ligia B.; Skalský, Rastislav; Balkovič, Juraj; Havlík, Petr

    2016-04-01

Land provides vital socioeconomic resources to society, however at the cost of large environmental degradation. Global integrated models combining high-resolution global gridded crop models (GGCMs) and global economic models (GEMs) are increasingly being used to inform sustainable solutions for agricultural land use. However, little effort has yet been made to evaluate and compare the accuracy of GGCM outputs. In addition, GGCM datasets require a large number of parameters whose values, and their variability across space, are weakly constrained: increasing the accuracy of such datasets has a very high computing cost. Innovative evaluation methods are required both to lend credibility to the global integrated models and to allow efficient parameter specification of GGCMs. We propose an evaluation strategy for GGCM datasets from the perspective of use in GEMs, illustrated with preliminary results from a novel dataset (the Hypercube) generated by the EPIC GGCM and used in the GLOBIOM land-use GEM to inform on present-day crop yield, water and nutrient input needs for 16 crops x 15 management intensities, at a spatial resolution of 5 arc-minutes. We adopt the following principle: evaluation should provide a transparent diagnosis of model adequacy for its intended use. We briefly describe how the Hypercube data is generated and how it articulates with GLOBIOM in order to transparently identify the performances to be evaluated, as well as the main assumptions and data processing involved. Expected performances include adequately representing the sub-national heterogeneity in crop yield and input needs: i) in space, ii) across crop species, and iii) across management intensities. We will present and discuss measures of these expected performances and weigh the relative contribution of crop model, input data and data processing steps in performances. We will also compare obtained yield gaps and main yield-limiting factors against the M3 dataset. Next steps include

  10. Evaluation-Function-based Model-free Adaptive Fuzzy Control

    Directory of Open Access Journals (Sweden)

    Agus Naba

    2016-12-01

Designs of adaptive fuzzy controllers (AFC) are commonly based on the Lyapunov approach, which requires a known model of the controlled plant. They need to consider a Lyapunov function candidate as an evaluation function to be minimized. In this study these drawbacks were handled by designing a model-free adaptive fuzzy controller (MFAFC) using an approximate evaluation function defined in terms of the current state, the next state, and the control action. MFAFC considers the approximate evaluation function as an evaluative control performance measure similar to the state-action value function in reinforcement learning. The simulation results of applying MFAFC to the inverted pendulum benchmark verified the proposed scheme's efficacy.
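A toy sketch of the abstract's idea, not the paper's controller: adapt a feedback gain using only an approximate evaluation function of (current state, next state, control action), with no plant model assumed by the adaptation law. The plant, cost terms, and step sizes here are invented.

```python
def evaluation(s, s_next, u):
    # Evaluative performance measure: next-state deviation from the
    # setpoint (0) plus a small control-effort penalty.
    return s_next ** 2 + 0.01 * u ** 2

def plant(s, u):
    # The "unknown" plant: used only to generate transitions, never
    # inspected by the adaptation law.
    return 0.9 * s + 0.5 * u

def adapt(gain, s, step=0.05):
    # Model-free adaptation: try slightly perturbed gains and keep the
    # one whose one-step evaluation is lowest.
    scored = []
    for g in (gain - step, gain, gain + step):
        u = -g * s
        scored.append((evaluation(s, plant(s, u), u), g))
    return min(scored)[1]

gain, s = 0.0, 1.0
for _ in range(100):
    gain = adapt(gain, s)
    s = plant(s, -gain * s)

print(round(gain, 2), abs(s) < 1e-6)
```

The evaluation function plays the role the abstract assigns to the state-action value function: it scores transitions, and the controller improves itself against that score rather than against a plant model.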

  11. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

In this paper we introduced a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. This model takes from the SERVQUAL method the five dimensions of requirements and three of characteristics, and from the QFD method the application methodology. The originality of the SQ model consists in computing a global index that reflects the level to which the quality characteristics fulfil the customers' requirements. In order to prove the viability of the SQ model, a software tool was developed and applied for the evaluation of a health care services provider.
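A QFD-style global index of the kind the SQ model computes can be sketched as follows. The requirement weights, relationship matrix, and performance values are all invented; the SQ model's actual dimensions and scales may differ.

```python
# Requirement weights (e.g. from SERVQUAL-like importance ratings).
req_weights = [0.40, 0.35, 0.25]

# QFD relationship matrix: rows are requirements, columns are quality
# characteristics; 9/3/1 is the conventional strong/medium/weak scale.
relationship = [
    [9, 3, 1],
    [3, 9, 3],
    [1, 3, 9],
]

# Measured fulfilment of each characteristic (0..1).
char_performance = [0.8, 0.6, 0.9]

# Characteristic importance: requirement-weighted column sums.
char_weights = [sum(w * row[j] for w, row in zip(req_weights, relationship))
                for j in range(3)]
total = sum(char_weights)

# Global index: fulfilment weighted by relative characteristic importance.
global_index = sum(cw / total * p
                   for cw, p in zip(char_weights, char_performance))
print(round(global_index, 3))
```

A single number in [0, 1] summarizes how well the characteristics, weighted by how strongly they serve the requirements, are being delivered, which is the role the abstract describes for the global index.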

  12. Evaluation models and evaluation use

    Science.gov (United States)

    Contandriopoulos, Damien; Brousselle, Astrid

    2012-01-01

    The use of evaluation results is at the core of evaluation theory and practice. Major debates in the field have emphasized the importance of both the evaluator’s role and the evaluation process itself in fostering evaluation use. A recent systematic review of interventions aimed at influencing policy-making or organizational behavior through knowledge exchange offers a new perspective on evaluation use. We propose here a framework for better understanding the embedded relations between evaluation context, choice of an evaluation model and use of results. The article argues that the evaluation context presents conditions that affect both the appropriateness of the evaluation model implemented and the use of results. PMID:23526460

  13. 49 CFR 40.285 - When is a SAP evaluation required?

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false When is a SAP evaluation required? 40.285 Section... § 40.285 When is a SAP evaluation required? (a) As an employee, when you have violated DOT drug and... unless you complete the SAP evaluation, referral, and education/treatment process set forth in this...

  14. Site descriptive modelling - strategy for integrated evaluation

    International Nuclear Information System (INIS)

    Andersson, Johan

    2003-02-01

The current document establishes the strategy to be used for achieving sufficient integration between disciplines in producing Site Descriptive Models during the Site Investigation stage. The Site Descriptive Model should be a multidisciplinary interpretation of geology, rock mechanics, thermal properties, hydrogeology, hydrogeochemistry, transport properties and ecosystems, using site investigation data from deep boreholes and from the surface as input. The modelling comprises the following iterative steps: evaluation of primary data, descriptive and quantitative modelling (in 3D), and overall confidence evaluation. Data are first evaluated within each discipline and then the evaluations are checked between the disciplines. Three-dimensional modelling (i.e. estimating the distribution of parameter values in space and its uncertainty) is made in a sequence, where the geometrical framework is taken from the geological model and in turn used by the rock mechanics, thermal and hydrogeological modelling, etc. The three-dimensional description should present the parameters with their spatial variability over a relevant and specified scale, with the uncertainty included in this description. Different alternative descriptions may be required. After the individual discipline modelling and uncertainty assessment, a phase of overall confidence evaluation follows. Relevant parts of the different modelling teams assess the suggested uncertainties and evaluate the feedback. These discussions should assess overall confidence by checking that all relevant data are used, checking that information in past model versions is considered, checking that the different kinds of uncertainty are addressed, checking whether suggested alternatives make sense and whether there is potential for additional alternatives, and by discussing, if appropriate, how additional measurements (i.e. more data) would affect confidence.
The findings as well as the modelling results are to be documented in a Site Description

  15. Assessment of compliance with regulatory requirements for a best estimate methodology for evaluation of ECCS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Un Chul; Jang, Jin Wook; Lim, Ho Gon; Jeong, Ik [Seoul National Univ., Seoul (Korea, Republic of); Sim, Suk Ku [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2000-03-15

The best estimate methodology for evaluation of ECCS proposed by KEPCO (KREM) uses a thermal-hydraulic best-estimate code, and the topical report for the methodology states that it meets the regulatory requirements of the USNRC regulatory guide. In this research, the assessment of compliance with regulatory requirements for the methodology is performed. The state of the licensing procedures of other countries and the best-estimate evaluation methodologies of Europe are also investigated. The applicability of the models and the propriety of the uncertainty analysis procedure of KREM are appraised, and compliance with the USNRC regulatory guide is assessed.

  16. 38 CFR 21.8030 - Requirement for evaluation of child.

    Science.gov (United States)

    2010-07-01

    ... evaluation of child. 21.8030 Section 21.8030 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS... Certain Children of Vietnam Veterans-Spina Bifida and Covered Birth Defects Evaluation § 21.8030 Requirement for evaluation of child. (a) Children to be evaluated. The VR&E Division will evaluate each child...

  17. An evaluation model for the definition of regulatory requirements on spent fuel pool cooling systems

    International Nuclear Information System (INIS)

    Izquierdo, J.M.

    1979-01-01

    A calculation model is presented for establishing regulatory requirements in the SFPCS System. The major design factors, regulatory and design limits and key parameters are discussed. A regulatory position for internal use is proposed. Finally, associated problems and experience are presented. (author)

  18. GENERAL REQUIREMENTS FOR SIMULATION MODELS IN WASTE MANAGEMENT

    International Nuclear Information System (INIS)

    Miller, Ian; Kossik, Rick; Voss, Charlie

    2003-01-01

    Most waste management activities are decided upon and carried out in a public or semi-public arena, typically involving the waste management organization, one or more regulators, and often other stakeholders and members of the public. In these environments, simulation modeling can be a powerful tool in reaching a consensus on the best path forward, but only if the models that are developed are understood and accepted by all of the parties involved. These requirements for understanding and acceptance of the models constrain the appropriate software and model development procedures that are employed. This paper discusses requirements for both simulation software and for the models that are developed using the software. Requirements for the software include transparency, accessibility, flexibility, extensibility, quality assurance, ability to do discrete and/or continuous simulation, and efficiency. Requirements for the models that are developed include traceability, transparency, credibility/validity, and quality control. The paper discusses these requirements with specific reference to the requirements for performance assessment models that are used for predicting the long-term safety of waste disposal facilities, such as the proposed Yucca Mountain repository

  19. Evaluating significance in linear mixed-effects models in R.

    Science.gov (United States)

    Luke, Steven G

    2017-08-01

    Mixed-effects models are being used ever more frequently in the analysis of experimental data. However, in the lme4 package in R the standards for evaluating significance of fixed effects in these models (i.e., obtaining p-values) are somewhat vague. There are good reasons for this, but as researchers who are using these models are required in many cases to report p-values, some method for evaluating the significance of the model output is needed. This paper reports the results of simulations showing that the two most common methods for evaluating significance, using likelihood ratio tests and applying the z distribution to the Wald t values from the model output (t-as-z), are somewhat anti-conservative, especially for smaller sample sizes. Other methods for evaluating significance, including parametric bootstrapping and the Kenward-Roger and Satterthwaite approximations for degrees of freedom, were also evaluated. The results of these simulations suggest that Type 1 error rates are closest to .05 when models are fitted using REML and p-values are derived using the Kenward-Roger or Satterthwaite approximations, as these approximations both produced acceptable Type 1 error rates even for smaller samples.
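The likelihood ratio test the paper evaluates can be sketched generically. This is a plain Python illustration of a 1-degree-of-freedom LRT between nested models (the chi-square survival function for 1 df equals `erfc(sqrt(x/2))`), not lme4's R interface, and the log-likelihood values are invented rather than taken from a fitted model.

```python
import math

def lrt_pvalue(loglik_full, loglik_reduced):
    """p-value for a 1-df likelihood ratio test between nested models.

    The LR statistic 2*(llf - llr) is referred to a chi-square(1)
    distribution, whose survival function is erfc(sqrt(x/2)).
    """
    stat = 2.0 * (loglik_full - loglik_reduced)
    return math.erfc(math.sqrt(stat / 2.0))

# Hypothetical log-likelihoods: model with vs. without the fixed effect.
p = lrt_pvalue(-1200.5, -1203.8)   # LR statistic = 6.6 on 1 df
print(round(p, 4))
```

The paper's point is that this p-value tends to be anti-conservative for small samples, which is why the Kenward-Roger and Satterthwaite approximations come out ahead in the simulations.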

  20. Required spatial resolution of hydrological models to evaluate urban flood resilience measures

    Science.gov (United States)

    Gires, A.; Giangola-Murzyn, A.; Tchiguirinskaia, I.; Schertzer, D.; Lovejoy, S.

    2012-04-01

During a flood in an urban area, several non-linear processes (rainfall, surface runoff, sewer flow, and sub-surface flow) interact. Fully distributed hydrological models are a useful tool to better understand these complex interactions between natural processes and the built environment. Developing an efficient model is a first step to improve the understanding of flood resilience in urban areas. Given that the previously mentioned underlying physical phenomena exhibit different relevant scales, determining the required spatial resolution of such a model is a tricky but necessary issue. For instance, such a model should be able to properly represent large-scale effects of local-scale flood resilience measures such as stop logs. The model should also be as simple as possible without being simplistic. In this paper we test two types of model. First we use an operational semi-distributed model over a 3400 ha peri-urban area located in Seine-Saint-Denis (North-East of Paris). In this model, the area is divided into sub-catchments of average size 17 ha that are considered as homogeneous, and only the sewer discharge is modelled. The rainfall data, whose resolution is 1 km in space and 5 min in time, comes from the C-band radar of Trappes, located in the West of Paris and operated by Météo-France. It was shown that the spatial resolution of both the model and the rainfall field did not enable the small-scale rainfall variability to be fully grasped. To achieve this, first an ensemble of realistic rainfall fields downscaled to a resolution of 100 m is generated with the help of multifractal space-time cascades whose characteristic exponents are estimated on the available radar data. Second, the corresponding ensemble of sewer hydrographs is simulated by inputting each rainfall realization to the model. It appears that the probability distribution of the simulated peak flow exhibits a power-law behaviour. This indicates that there is a great uncertainty associated with small scale
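The downscaling step can be illustrated with a toy one-dimensional discrete multiplicative cascade: each coarse cell is repeatedly split in two, with random weights that conserve the mean. The weight distribution and cascade depth here are invented; the paper's cascades are space-time and calibrated on the radar data.

```python
import random

def cascade(coarse_value, levels, rng):
    """Refine one coarse rainfall value by repeated random splitting."""
    field = [coarse_value]
    for _ in range(levels):
        finer = []
        for v in field:
            w = rng.uniform(0.2, 1.8)             # random weight, mean 1
            finer.extend([v * w, v * (2.0 - w)])  # sibling weights sum to 2
        field = finer
    return field

rng = random.Random(42)
# Four halvings take a ~1 km pixel down to ~62.5 m cells.
field = cascade(10.0, levels=4, rng=rng)
print(len(field), round(sum(field) / len(field), 6))
```

Because each split conserves the parent value exactly, the ensemble adds realistic small-scale variability without changing the coarse-scale rainfall amount, which is what makes the downscaled realizations usable as model input.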

  1. The EMEFS model evaluation

    International Nuclear Information System (INIS)

    Barchet, W.R.; Dennis, R.L.; Seilkop, S.K.; Banic, C.M.; Davies, D.; Hoff, R.M.; Macdonald, A.M.; Mickle, R.E.; Padro, J.; Puckett, K.; Byun, D.; McHenry, J.N.; Karamchandani, P.; Venkatram, A.; Fung, C.; Misra, P.K.; Hansen, D.A.; Chang, J.S.

    1991-12-01

    The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs
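Difference statistics and correlations of the kind the evaluation protocol prescribes can be computed as below. The observed and predicted deposition values are invented, purely to show the calculations.

```python
import math

obs  = [2.1, 3.4, 1.8, 4.0, 2.9]   # hypothetical observed values
pred = [2.5, 3.1, 2.2, 3.6, 3.3]   # hypothetical model predictions

n = len(obs)
# Difference statistics: mean bias and root-mean-square error.
bias = sum(p - o for p, o in zip(pred, obs)) / n
rmse = math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / n)

# Pearson correlation between observed and predicted values.
mo = sum(obs) / n
mp = sum(pred) / n
cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
corr = cov / math.sqrt(sum((o - mo) ** 2 for o in obs) *
                       sum((p - mp) ** 2 for p in pred))

print(round(bias, 3), round(rmse, 3), round(corr, 3))
```

In a protocol like EMEFS's, these numbers quantify model performance while the scatter plots and time series present the same comparisons graphically.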

  2. The EMEFS model evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Barchet, W.R. (Pacific Northwest Lab., Richland, WA (United States)); Dennis, R.L. (Environmental Protection Agency, Research Triangle Park, NC (United States)); Seilkop, S.K. (Analytical Sciences, Inc., Durham, NC (United States)); Banic, C.M.; Davies, D.; Hoff, R.M.; Macdonald, A.M.; Mickle, R.E.; Padro, J.; Puckett, K. (Atmospheric Environment Service, Downsview, ON (Canada)); Byun, D.; McHenry, J.N.

    1991-12-01

    The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs.

  3. A hierarchical modeling methodology for the definition and selection of requirements

    Science.gov (United States)

    Dufresne, Stephane

This dissertation describes the development of a requirements analysis methodology that takes into account the concept of operations and the hierarchical decomposition of aerospace systems. At the core of the methodology, the Analytic Network Process (ANP) is used to ensure the traceability between the qualitative and quantitative information present in the hierarchical model. The proposed methodology is applied to the requirements definition of a hurricane-tracker Unmanned Aerial Vehicle. Three research objectives are identified in this work: (1) improve the requirements mapping process by matching the stakeholder expectations with the concept of operations, systems and available resources; (2) reduce the epistemic uncertainty surrounding the requirements and requirements mapping; and (3) improve the requirements down-selection process by taking into account the level of importance of the criteria and the available resources. Several challenges are associated with the identification and definition of requirements. The complexity of the system implies that a large number of requirements are needed to define the systems. These requirements are defined early in the conceptual design, where the level of knowledge is relatively low and the level of uncertainty is large. The proposed methodology intends to increase the level of knowledge and reduce the level of uncertainty by guiding the design team through a structured process. To address these challenges, a new methodology is created to flow down the requirements from the stakeholder expectations to the systems alternatives. A taxonomy of requirements is created to classify the information gathered during the problem definition. Subsequently, the operational and systems functions and measures of effectiveness are integrated into a hierarchical model to allow the traceability of the information. Monte Carlo methods are used to evaluate the variations of the hierarchical model elements and consequently reduce the
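The Monte Carlo idea, evaluating how variations in the model's elements affect the down-selection, can be sketched on a flattened (non-networked) weight model. The criteria, weights, alternatives, and perturbation size are all invented; the dissertation's ANP supermatrix is far richer than this.

```python
import random

# Hypothetical criterion weights and alternative scores for a UAV trade.
base_weights = {"endurance": 0.5, "payload": 0.3, "cost": 0.2}
alternatives = {
    "fixed_wing": {"endurance": 0.8, "payload": 0.6, "cost": 0.5},
    "rotorcraft": {"endurance": 0.4, "payload": 0.7, "cost": 0.8},
}

def best(weights):
    """Alternative with the highest weighted score under these weights."""
    def score(a):
        return sum(weights[c] * v for c, v in alternatives[a].items())
    return max(alternatives, key=score)

# Monte Carlo: perturb the weights and count how often each alternative
# ranks first, a simple measure of how robust the down-selection is.
rng = random.Random(0)
wins = {a: 0 for a in alternatives}
for _ in range(2000):
    perturbed = {c: max(w + rng.gauss(0, 0.1), 1e-6)
                 for c, w in base_weights.items()}
    wins[best(perturbed)] += 1

print(wins)
```

If one alternative wins for nearly every weight perturbation, the epistemic uncertainty in the weights does not threaten the selection; a close split would flag a decision that needs better-constrained requirements.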

  4. Requirements model generation to support requirements elicitation: The Secure Tropos experience

    NARCIS (Netherlands)

    Kiyavitskaya, N.; Zannone, N.

    2008-01-01

    In recent years several efforts have been devoted by researchers in the Requirements Engineering community to the development of methodologies for supporting designers during requirements elicitation, modeling, and analysis. However, these methodologies often lack tool support to facilitate their

  5. Evaluation of onset of nucleate boiling models

    Energy Technology Data Exchange (ETDEWEB)

    Huang, LiDong [Heat Transfer Research, Inc., College Station, TX (United States)], e-mail: lh@htri.net

    2009-07-01

    This article discusses available models and correlations for predicting the required heat flux or wall superheat for the Onset of Nucleate Boiling (ONB) on plain surfaces. It reviews ONB data in the open literature and discusses the continuing efforts of Heat Transfer Research, Inc. in this area. Our ONB database contains ten individual sources for ten test fluids and a wide range of operating conditions for different geometries, e.g., tube side and shell side flow boiling and falling film evaporation. The article also evaluates literature models and correlations based on the data: no single model in the open literature predicts all data well. The prediction uncertainty is especially higher in vacuum conditions. Surface roughness is another critical criterion in determining which model should be used. However, most models do not directly account for surface roughness, and most investigators do not provide surface roughness information in their published findings. Additional experimental research is needed to improve confidence in predicting the required wall superheats for nucleation boiling for engineering design purposes. (author)
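As a representative example of the kind of model the article reviews (not one it necessarily recommends), a classical Davis-Anderson-type ONB criterion gives the required wall superheat as sqrt(8 sigma T_sat q'' / (rho_v h_fg k_l)). The property values below are approximate figures for water at atmospheric pressure.

```python
import math

def onb_wall_superheat(q, sigma, t_sat, rho_v, h_fg, k_l):
    """Davis-Anderson-type ONB criterion: wall superheat [K] required
    for nucleation at heat flux q [W/m^2]; SI units, t_sat in kelvin."""
    return math.sqrt(8.0 * sigma * t_sat * q / (rho_v * h_fg * k_l))

# Water at ~1 atm (approximate properties): surface tension, saturation
# temperature, vapour density, latent heat, liquid thermal conductivity.
dt = onb_wall_superheat(q=1e5, sigma=0.059, t_sat=373.15,
                        rho_v=0.60, h_fg=2.257e6, k_l=0.68)
print(round(dt, 2))
```

Note that this form carries no surface-roughness term, which is exactly the limitation the article raises: most models ignore roughness even though it strongly affects which cavities can nucleate.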

  6. Evaluation of onset of nucleate boiling models

    International Nuclear Information System (INIS)

    Huang, LiDong

    2009-01-01

    This article discusses available models and correlations for predicting the required heat flux or wall superheat for the Onset of Nucleate Boiling (ONB) on plain surfaces. It reviews ONB data in the open literature and discusses the continuing efforts of Heat Transfer Research, Inc. in this area. Our ONB database contains ten individual sources for ten test fluids and a wide range of operating conditions for different geometries, e.g., tube side and shell side flow boiling and falling film evaporation. The article also evaluates literature models and correlations based on the data: no single model in the open literature predicts all data well. The prediction uncertainty is especially higher in vacuum conditions. Surface roughness is another critical criterion in determining which model should be used. However, most models do not directly account for surface roughness, and most investigators do not provide surface roughness information in their published findings. Additional experimental research is needed to improve confidence in predicting the required wall superheats for nucleation boiling for engineering design purposes. (author)

  7. Systems and context modeling approach to requirements analysis

    Science.gov (United States)

    Ahuja, Amrit; Muralikrishna, G.; Patwari, Puneet; Subhrojyoti, C.; Swaminathan, N.; Vin, Harrick

    2014-08-01

    Ensuring completeness and correctness of the requirements for a complex system such as the SKA is challenging. Current system engineering practice includes developing a stakeholder needs definition, a concept of operations, and defining system requirements in terms of use cases and requirements statements. We present a method that enhances this current practice into a collection of system models with mutual consistency relationships. These include stakeholder goals, needs definition and system-of-interest models, together with a context model that participates in the consistency relationships among these models. We illustrate this approach by using it to analyze the SKA system requirements.

  8. Evaluation of uncertainty in geological framework models at Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Bagtzoglou, A.C.; Stirewalt, G.L.; Henderson, D.B.; Seida, S.B.

    1995-01-01

    The first step towards determining compliance with the performance objectives for both the repository system and the geologic setting at Yucca Mountain requires the development of detailed geostratigraphic models. This paper proposes an approach for the evaluation of the degree of uncertainty inherent in geologic maps and associated three-dimensional geological models. Following this approach, an assessment of accuracy and completeness of the data and evaluation of conceptual uncertainties in the geological framework models can be performed

  9. Data requirements for integrated near field models

    International Nuclear Information System (INIS)

    Wilems, R.E.; Pearson, F.J. Jr.; Faust, C.R.; Brecher, A.

    1981-01-01

    The coupled nature of the various processes in the near field require that integrated models be employed to assess long term performance of the waste package and repository. The nature of the integrated near field models being compiled under the SCEPTER program are discussed. The interfaces between these near field models and far field models are described. Finally, near field data requirements are outlined in sufficient detail to indicate overall programmatic guidance for data gathering activities

  10. The ACCENT-protocol: a framework for benchmarking and model evaluation

    Directory of Open Access Journals (Sweden)

    V. Grewe

    2012-05-01

We summarise results from a workshop on "Model Benchmarking and Quality Assurance" of the EU-Network of Excellence ACCENT, including results from other activities (e.g. COST Action 732) and publications. A formalised evaluation protocol is presented, i.e. a generic formalism describing the procedure of how to perform a model evaluation. This includes eight steps and examples from global model applications which are given for illustration. The first and important step is concerning the purpose of the model application, i.e. the addressed underlying scientific or political question. We give examples to demonstrate that there is no model evaluation per se, i.e. without a focused purpose. Model evaluation is testing, whether a model is fit for its purpose. The following steps are deduced from the purpose and include model requirements, input data, key processes and quantities, benchmark data, quality indicators, sensitivities, as well as benchmarking and grading. We define "benchmarking" as the process of comparing the model output against either observational data or high fidelity model data, i.e. benchmark data. Special focus is given to the uncertainties, e.g. in observational data, which have the potential to lead to wrong conclusions in the model evaluation if not considered carefully.

  11. The ACCENT-protocol: a framework for benchmarking and model evaluation

    Science.gov (United States)

    Grewe, V.; Moussiopoulos, N.; Builtjes, P.; Borrego, C.; Isaksen, I. S. A.; Volz-Thomas, A.

    2012-05-01

    We summarise results from a workshop on "Model Benchmarking and Quality Assurance" of the EU-Network of Excellence ACCENT, including results from other activities (e.g. COST Action 732) and publications. A formalised evaluation protocol is presented, i.e. a generic formalism describing the procedure of how to perform a model evaluation. This includes eight steps and examples from global model applications which are given for illustration. The first and important step is concerning the purpose of the model application, i.e. the addressed underlying scientific or political question. We give examples to demonstrate that there is no model evaluation per se, i.e. without a focused purpose. Model evaluation is testing, whether a model is fit for its purpose. The following steps are deduced from the purpose and include model requirements, input data, key processes and quantities, benchmark data, quality indicators, sensitivities, as well as benchmarking and grading. We define "benchmarking" as the process of comparing the model output against either observational data or high fidelity model data, i.e. benchmark data. Special focus is given to the uncertainties, e.g. in observational data, which have the potential to lead to wrong conclusions in the model evaluation if not considered carefully.

  12. Evaluating predictive models of software quality

    International Nuclear Information System (INIS)

    Ciaschini, V; Canaparo, M; Ronchieri, E; Salomoni, D

    2014-01-01

Applications from the High Energy Physics scientific community are constantly growing and implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine if the models reasonably map reality for the applications under evaluation, and finally we concluded by suggesting directions for further studies.

  13. Evaluating Predictive Models of Software Quality

    Science.gov (United States)

    Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.

    2014-06-01

Applications from the High Energy Physics scientific community are constantly growing and implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine if the models reasonably map reality for the applications under evaluation, and finally we concluded by suggesting directions for further studies.

  14. Standardizing the performance evaluation of short-term wind prediction models

    DEFF Research Database (Denmark)

    Madsen, Henrik; Pinson, Pierre; Kariniotakis, G.

    2005-01-01

    Short-term wind power prediction is a primary requirement for efficient large-scale integration of wind generation in power systems and electricity markets. The choice of an appropriate prediction model among the numerous available models is not trivial, and has to be based on an objective evaluation of model performance. This paper proposes a standardized protocol for the evaluation of short-term wind-power prediction systems. A number of reference prediction models are also described, and their use for performance comparison is analysed. The use of the protocol is demonstrated using results from both on-shore and off-shore wind farms. The work was developed in the frame of the Anemos project (EU R&D project), where the protocol has been used to evaluate more than 10 prediction systems.
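Evaluation protocols of this kind typically report forecast errors normalised by installed capacity, together with an improvement score over a naive persistence reference. A hedged sketch of two such metrics (function names and the exact normalisation are illustrative, not taken from the paper's protocol):

```python
def nmae(forecast, actual, capacity):
    """Mean absolute error normalised by installed capacity (in [0, 1])."""
    n = len(actual)
    return sum(abs(f - a) for f, a in zip(forecast, actual)) / (n * capacity)

def persistence_skill(forecast, actual, capacity):
    """Improvement factor over the naive persistence reference, which
    simply predicts the previously observed value."""
    persistence = actual[:-1]
    e_model = nmae(forecast[1:], actual[1:], capacity)
    e_ref = nmae(persistence, actual[1:], capacity)
    return 1.0 - e_model / e_ref

# Illustrative numbers: power observations and forecasts on a 20 MW farm.
obs = [10.0, 12.0, 11.0, 13.0]
fc = [10.0, 11.5, 11.5, 12.5]
print(persistence_skill(fc, obs, capacity=20.0))
```

A skill of 0 means the model is no better than persistence; values approaching 1 indicate a large reduction in normalised error.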

  15. Risk assessment and remedial policy evaluation using predictive modeling

    International Nuclear Information System (INIS)

    Linkov, L.; Schell, W.R.

    1996-01-01

    As a result of nuclear industry operation and accidents, large areas of natural ecosystems have been contaminated by radionuclides and toxic metals. Extensive societal pressure has been exerted to decrease the radiation dose to the population and to the environment. Thus, in making abatement and remediation policy decisions, not only economic costs but also human and environmental risk assessments are desired. This paper introduces a general framework for risk assessment and remedial policy evaluation using predictive modeling. Ecological risk assessment requires evaluation of the radionuclide distribution in ecosystems. The FORESTPATH model is used for predicting the radionuclide fate in forest compartments after deposition as well as for evaluating the efficiency of remedial policies. The time of intervention and the radionuclide deposition profile were predicted to be crucial for remediation efficiency. Risk assessment conducted for a critical group of forest users in Belarus shows that consumption of forest products (berries and mushrooms) leads to about a 0.004% risk of a fatal cancer annually. Cost-benefit analysis for forest cleanup suggests that complete removal of the organic layer is too expensive for application in Belarus and a better methodology is required. In conclusion, the FORESTPATH modelling framework could have wide applications in the environmental remediation of radionuclides and toxic metals as well as in dose reconstruction and risk assessment.

  16. 45 CFR 2519.800 - What are the evaluation requirements for Higher Education programs?

    Science.gov (United States)

    2010-10-01

    ... (Continued) CORPORATION FOR NATIONAL AND COMMUNITY SERVICE, HIGHER EDUCATION INNOVATIVE PROGRAMS FOR COMMUNITY SERVICE, Evaluation Requirements, § 2519.800: What are the evaluation requirements for Higher Education programs? (45 Public Welfare, Vol. 4, 2010-10-01 edition)

  17. The IIR evaluation model

    DEFF Research Database (Denmark)

    Borlund, Pia

    2003-01-01

    An alternative approach to evaluation of interactive information retrieval (IIR) systems, referred to as the IIR evaluation model, is proposed. The model provides a framework for the collection and analysis of IR interaction data. The aim of the model is two-fold: 1) to facilitate the evaluation ...

  18. Dependencies among Architectural Views Got from Software Requirements Based on a Formal Model

    Directory of Open Access Journals (Sweden)

    Osis Janis

    2014-12-01

    Full Text Available A system architect has software requirements and some unspecified knowledge about a problem domain (e.g., an enterprise) as source information for the assessment and evaluation of possible solutions and for reaching the target point, a preliminary software design. The deciding factor is the architect’s experience and expertise in the problem domain (“AS-IS”). The proposed approach is dedicated to assisting a system architect in making an appropriate decision on the solution (“TO-BE”). It is based on a formal mathematical model, the Topological Functioning Model (TFM). Compliant TFMs can be transformed into software architectural views. The paper demonstrates and discusses tracing dependency links from the requirements to and between the architectural views.

  19. Multi-criteria evaluation of hydrological models

    Science.gov (United States)

    Rakovec, Oldrich; Clark, Martyn; Weerts, Albrecht; Hill, Mary; Teuling, Ryan; Uijlenhoet, Remko

    2013-04-01

    Over the last years, there has been a tendency in the hydrological community to move from simple conceptual models towards more complex, physically/process-based hydrological models. This is because conceptual models often fail to simulate the dynamics of the observations. However, there is little agreement on how much complexity needs to be considered within the complex process-based models. One way to proceed is to improve understanding of what is important and unimportant in the models considered. The aim of this ongoing study is to evaluate structural model adequacy using alternative conceptual and process-based models of hydrological systems, with an emphasis on understanding how model complexity relates to observed hydrological processes. Some of the models require considerable execution time, and computationally frugal sensitivity analysis, model calibration and uncertainty quantification methods are well-suited to providing important insights for models with lengthy execution times. The current experiment evaluates two versions of the Framework for Understanding Structural Errors (FUSE), which both enable running model inter-comparison experiments. One supports computationally efficient conceptual models, and the second supports more process-based models that tend to have longer execution times. The conceptual FUSE combines components of 4 existing conceptual hydrological models. The process-based framework consists of different forms of Richards' equation, numerical solutions, groundwater parameterizations and hydraulic conductivity distributions. The hydrological analysis of the model processes has evolved from focusing only on simulated runoff (the final model output) to also including other criteria such as soil moisture and groundwater levels. Parameter importance and associated structural importance are evaluated using different types of sensitivity analysis techniques, making use of both robust global methods (e.g. Sobol') as well as several
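The Sobol' method mentioned in the abstract attributes a share of the output variance to each input parameter. A minimal sketch of a first-order Sobol' index estimator in the pick-freeze form (function and variable names are illustrative, not from the FUSE framework; a production study would use a dedicated library and quasi-random sampling):

```python
import random

def sobol_first_order(f, dim, i, n=200_000, seed=42):
    """Monte Carlo estimate of the first-order Sobol' index S_i, using the
    pick-freeze estimator S_i = E[f(B) * (f(A_B^i) - f(A))] / Var(f)."""
    rng = random.Random(seed)
    cross_sum = 0.0
    fa_vals = []
    for _ in range(n):
        a = [rng.random() for _ in range(dim)]   # row of sample matrix A
        b = [rng.random() for _ in range(dim)]   # row of sample matrix B
        ab = a[:]
        ab[i] = b[i]                             # A with column i taken from B
        fa, fb, fab = f(a), f(b), f(ab)
        fa_vals.append(fa)
        cross_sum += fb * (fab - fa)
    mean = sum(fa_vals) / n
    var = sum((v - mean) ** 2 for v in fa_vals) / n
    return (cross_sum / n) / var

# Additive toy model on [0,1]^2: analytically S for the first input is 0.8.
toy = lambda x: x[0] + 0.5 * x[1]
print(sobol_first_order(toy, 2, 0))
```

For this additive toy model the exact first-order index of the first input is Var(x0)/Var(f) = 0.8, which the Monte Carlo estimate approaches as the sample size grows.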

  20. Extending enterprise architecture modelling with business goals and requirements

    Science.gov (United States)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  1. Development and application of emission models for verification and evaluation of user requirements in the sector of environmental planning

    International Nuclear Information System (INIS)

    Fister, G.

    2001-04-01

    In chapter I, two basic emission models for the calculation of emission inventories are presented: the bottom-up and the top-down approach. Their characteristics, typical international and national fields of application, and the requirements for these approaches are discussed. A separate chapter describes a detailed comparison between two different emission balances. These characterize the same regional area but are based on different emission models (top-down and bottom-up approaches). The structures of these approaches are analyzed, and emission sectors are adjusted for a comparison of detailed emission data. Differences are pointed out and reasons for discrepancies are discussed. Based on the results of this investigation, limits for the fields of application of the two approaches are set and substantiated. An application of the results of the above-mentioned comparison is shown in the following part. Following the Kyoto Protocol commitment and the Lower Austria Climate Protection Program, the current and future emission situation of Lower Austria is discussed. Other types of emission inventories are included in the discussion, and a top-down-based approach for a local splitting of Austrian reduction potentials of greenhouse gases is developed. Another step in the Lower Austria Climate Protection Program is the investigation of all funding in Lower Austria with respect to its ozone and climate relevance. The survey and evaluation of this funding are described in detail. Further analyses, including quantitative aspects, are made of housing grants. Taking all aspects into consideration, the actual situation regarding ozone- and climate-related emissions is shown. Changes in the requirements of emission inventories over the last decade are discussed, and experiences in applying emission approaches are mentioned. Concluding this work, an outlook on calculating emission inventories is given. (author)

  2. Assessment of reflood heat transfer model of COBRA-TF against ABB-CE evaluation model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S. I.; Lee, S. Y.; Park, C. E.; Choi, H. R.; Choi, C. J. [Korea Power Engineering Company Inc., Taejon (Korea, Republic of)

    2000-05-01

    According to 10 CFR 50 Appendix K, an ECCS performance evaluation model should be based on the experimental data of FLECHT and be conservative compared with the experimental data. To meet this requirement, ABB-CE has a complicated code structure as follows: COMPERC-II calculates three reflood rates, FLELAPC and HTCOF calculate the reflood heat transfer coefficients, and finally STRIKIN-II calculates the cladding temperature using the reflood heat transfer coefficients calculated in the previous stage. In this paper, to investigate whether or not COBRA-TF satisfies the requirements of Appendix K, the reflood heat transfer coefficient of COBRA-TF was assessed against the ABB-CE MOD-2C model. It was found that COBRA-TF properly predicts the experimental data and is more conservative than the ABB-CE MOD-2C model. Based on these results, it can be concluded that the reflood heat transfer coefficients calculated by COBRA-TF meet the requirements of Appendix K.

  3. A database model for evaluating material accountability safeguards effectiveness against protracted theft

    International Nuclear Information System (INIS)

    Sicherman, A.; Fortney, D.S.; Patenaude, C.J.

    1993-07-01

    DOE Material Control and Accountability Order 5633.3A requires that facilities handling special nuclear material evaluate their effectiveness against protracted theft (repeated thefts of small quantities of material, typically occurring over an extended time frame, to accumulate a goal quantity). Because a protracted theft attempt can extend over time, material accountability-like (MA) safeguards may help detect a protracted theft attempt in progress. Inventory anomalies, and material not in its authorized location when requested for processing, are examples of MA detection mechanisms. Crediting such detection in evaluations, however, requires taking into account potential insider subversion of MA safeguards. In this paper, the authors describe a database model for evaluating MA safeguards effectiveness against protracted theft that addresses potential subversion. The model includes a detailed yet practical structure for characterizing various types of MA activities, lists of potential insider MA defeat methods and access/authority related to MA activities, and an initial implementation of built-in MA detection probabilities. This database model, implemented in the new Protracted Insider module of ASSESS (Analytic System and Software for Evaluating Safeguards and Security), helps facilitate the systematic collection of relevant information about MA activity steps and "standardizes" MA safeguards evaluations.

  4. FEM-model of the Naesliden Mine: requirements and limitations at the outset of the project. [Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Krauland, N.

    1980-05-15

    The design of any instrument depends entirely on the application planned for the instrument. This applies also to the FEM representation of the Naesliden Mine. With reference to the aims of the project, the requirements on the model are outlined with regard to:
    - simulation of the mining process;
    - modelling with special reference to the aims of the project;
    - comparison of FEM results with in situ observations to determine the validity of the model.
    The proposed model is two-dimensional and incorporates joint elements to simulate the weak alteration zone between the orebody and sidewall rock. The remainder of the model exhibits linear elastic behaviour. This model is evaluated with respect to the given requirements. The limitations of the chosen model are outlined.

  5. Obs4MIPS: Satellite Observations for Model Evaluation

    Science.gov (United States)

    Ferraro, R.; Waliser, D. E.; Gleckler, P. J.

    2017-12-01

    This poster will review the current status of the obs4MIPs project, whose purpose is to provide a limited collection of well-established and documented datasets for comparison with Earth system models (https://www.earthsystemcog.org/projects/obs4mips/). These datasets have been reformatted to correspond with the CMIP5 model output requirements, and include technical documentation specifically targeted for their use in model output evaluation. The project holdings now exceed 120 datasets with observations that directly correspond to CMIP5 model output variables, with new additions in response to the CMIP6 experiments. With the growth in climate model output data volume, it is increasingly difficult to bring the model output and the observations together to do evaluations. The positioning of the obs4MIPs datasets within the Earth System Grid Federation (ESGF) allows the use of currently available and planned online tools within the ESGF to perform analysis using model output and observational datasets without necessarily downloading everything to a local workstation. This past year, obs4MIPs updated its submission guidelines to closely align with changes in the CMIP6 experiments, and is implementing additional indicators and ancillary data to allow users to more easily determine the efficacy of an obs4MIPs dataset for specific evaluation purposes. This poster will present the new guidelines and indicators, and update the list of current obs4MIPs holdings and their connection to the ESGF evaluation and analysis tools currently available and being developed for the CMIP6 experiments.

  6. Applicability evaluation on the conservative metal-water reaction(MWR) model implemented into the SPACE code

    International Nuclear Information System (INIS)

    Lee, Suk Ho; You, Sung Chang; Kim, Han Gon

    2011-01-01

    The SBLOCA (Small Break Loss-of-Coolant Accident) evaluation methodology for the APR1400 (Advanced Power Reactor 1400) is under development using the SPACE code. The goal of the development of this methodology is to set up a conservative evaluation methodology in accordance with Appendix K of 10CFR50 by the end of 2012. In order to develop the Appendix K version of the SPACE code, code modification is considered through implementation of the required evaluation models. Among the conservative models required in the SPACE code, the metal-water reaction (MWR) model, the critical flow model, the Critical Heat Flux (CHF) model and the post-CHF model must be implemented. At present, the integration of the models to generate the Appendix K version of SPACE is in its preliminary stage. Among them, the conservative MWR model and its applicability in the code are introduced in this paper.

  7. Model-Based Requirements Management in Gear Systems Design Based On Graph-Based Design Languages

    Directory of Open Access Journals (Sweden)

    Kevin Holder

    2017-10-01

    Full Text Available For several decades, a widespread consensus concerning the enormous importance of an in-depth clarification of the specifications of a product has been observed. A weak clarification of specifications is repeatedly listed as a main cause of the failure of product development projects. Requirements, which can be defined as the purpose, goals, constraints, and criteria associated with a product development project, play a central role in the clarification of specifications. The collection of activities which ensure that requirements are identified, documented, maintained, communicated, and traced throughout the life cycle of a system, product, or service can be referred to as “requirements engineering”. These activities can be supported by a collection and combination of strategies, methods, and tools which are appropriate for the clarification of specifications. Numerous publications describe the strategy and the components of requirements management. Furthermore, recent research investigates its industrial application. Simultaneously, promising developments of graph-based design languages for a holistic digital representation of the product life cycle are presented. Current developments realize graph-based languages by means of the diagrams of the Unified Modelling Language (UML), and allow the automatic generation and evaluation of multiple product variants. The research presented in this paper seeks to present a method to combine the advantages of a conscious requirements management process and graph-based design languages. Consequently, the main objective of this paper is the investigation of a model-based integration of requirements in a product development process by means of graph-based design languages. The research method is based on an in-depth analysis of an exemplary industrial product development, a gear system for so-called “Electrical Multiple Units” (EMU). Important requirements were abstracted from a gear system

  8. DECISION MAKING MODELING OF CONCRETE REQUIREMENTS

    Directory of Open Access Journals (Sweden)

    Suhartono Irawan

    2001-01-01

    Full Text Available This paper presents the results of an experimental evaluation of predicted versus in-practice concrete strength. The scope of the evaluation is the optimisation of the cement content for different concrete grades, achieved by bringing the target mean value of test cubes closer to the required characteristic strength value through a reduction of the standard deviation. Keywords: concrete mix design, acceptance control, optimisation, cement content.
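The evaluation described above rests on the standard margin relationship between characteristic and target mean strength, f_target = f_ck + k·σ, where k ≈ 1.64 corresponds to a 5% defective rate under a normal distribution of cube strengths. A minimal sketch (the numeric values are illustrative, not from the paper; design codes differ in the margin they prescribe):

```python
def target_mean_strength(f_ck, std_dev, k=1.64):
    """Target mean strength = characteristic strength + margin k * sigma.
    k = 1.64 assumes a 5% defective rate under a normal distribution."""
    return f_ck + k * std_dev

# Tighter quality control (smaller sigma) lowers the target mean, and with
# it the cement content needed to reach that mean (values illustrative).
for sigma in (8.0, 5.0, 3.0):
    print(f"sigma = {sigma} MPa -> target mean = "
          f"{target_mean_strength(30.0, sigma):.1f} MPa")
```

This is the mechanism behind the paper's finding: reducing the standard deviation of test cubes brings the target mean closer to the characteristic strength, allowing the cement content to be optimised downward.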

  9. Customer requirement modeling and mapping of numerical control machine

    Directory of Open Access Journals (Sweden)

    Zhongqi Sheng

    2015-10-01

    Full Text Available In order to better obtain information about customer requirements and to develop products meeting those requirements, it is necessary to systematically analyze and handle the customer requirements. This article uses the product service system of a numerical control machine as its research object and studies customer requirement modeling and mapping oriented toward configuration design. It introduces the concept of the requirement unit, expounds the customer requirement decomposition rules, and establishes a customer requirement model; it builds the house of quality using quality function deployment and determines the weights of the technical features of the product and service; it explores the relevance rules between data using rough set theory, establishes a rule database, and solves for the target values of the technical features of the product. Using an economical turning-center series numerical control machine as an example, it verifies the rationality of the proposed customer requirement model.
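The house-of-quality step described above reduces to weighting the requirement-to-feature relationship scores by the importance of each customer requirement. A minimal illustration (function names, importances, and the 9/3/1 scoring are conventional QFD practice, not values taken from the article):

```python
def technical_weights(importance, relationships):
    """House-of-quality roll-up: the weight of technical feature j is the
    sum over customer requirements i of importance[i] * relationships[i][j]."""
    n_features = len(relationships[0])
    return [
        sum(importance[i] * relationships[i][j] for i in range(len(importance)))
        for j in range(n_features)
    ]

# Two customer requirements (importance 5 and 3) against two technical
# features, scored on the conventional 9/3/1 relationship scale.
print(technical_weights([5, 3], [[9, 1], [3, 9]]))
```

The resulting weights (54 and 32 here) rank the technical features by how strongly they serve the weighted customer requirements.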

  10. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

    An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. This paper addresses the need for new techniques that enable the modeling and consistency checking for legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers...

  11. EVALUATION OF RAINFALL-RUNOFF MODELS FOR MEDITERRANEAN SUBCATCHMENTS

    Directory of Open Access Journals (Sweden)

    A. Cilek

    2016-06-01

    Full Text Available The development and application of rainfall-runoff models have been a cornerstone of hydrological research for many decades. The amount of rainfall and its intensity and variability control the generation of runoff and the erosional processes operating at different scales. These interactions can be greatly variable in Mediterranean catchments with marked hydrological fluctuations. The aim of the study was to evaluate the performance of a rainfall-runoff model for rainfall-runoff simulation in a Mediterranean subcatchment. The Pan-European Soil Erosion Risk Assessment (PESERA), a simplified hydrological process-based approach, was used in this study to combine hydrological surface runoff factors. In total, 128 input layers derived from a data set that includes climate, topography, land use, crop type, planting date, and soil characteristics are required to run the model. Initial ground cover was estimated from the Landsat ETM data provided by ESA. This hydrological model was evaluated in terms of its performance in the Goksu River Watershed, Turkey, located in the Central Eastern Mediterranean Basin of Turkey. The area is approximately 2000 km2. The landscape is dominated by bare ground, agricultural land, and forests. The average annual rainfall is 636.4 mm. This study is of significant importance for evaluating different model performances in a complex Mediterranean basin. The results provided comprehensive insight, including the advantages and limitations of modelling approaches in the Mediterranean environment.

  12. Reactor core modeling practice: Operational requirements, model characteristics, and model validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1997-01-01

    The physical models implemented in power plant simulators have greatly increased in performance and complexity in recent years. This process has been enabled by the ever-increasing computing power available at affordable prices. This paper describes this process from several angles: first, the operational requirements that are most critical from the point of view of model performance, for both normal and off-normal operating conditions; second, core model characteristics in the light of the solutions implemented by Thomson Training and Simulation (TT and S) in several full-scope simulators recently built and delivered for Dutch, German, and French nuclear power plants; and finally, the model validation procedures, which are of course an integral part of model development and which are becoming more and more severe as performance expectations increase. As a conclusion, it may be asserted that in the core modeling field, as in other areas, the general improvement in the quality of simulation codes has resulted in a fairly rapid convergence towards mainstream engineering-grade calculations. This is a remarkable performance in view of the stringent real-time requirements which the simulation codes must satisfy, as well as the extremely wide range of operating conditions that they are called upon to cover with good accuracy. (author)

  13. An effective quality model for evaluating mobile websites

    International Nuclear Information System (INIS)

    Hassan, W.U.; Nawaz, M.T.; Syed, T.H.; Naseem, A.

    2015-01-01

    The evolution of Web development in recent years has caused the emergence of the new area of mobile computing. The mobile phone has been transformed into a high-speed processing device capable of running processes which previously were supposed to run only on computers. Modern mobile phones now have the capability to process data faster than desktop systems, and with the inclusion of 3G and 4G networks, the mobile phone became the prime choice for users to send and receive data from any device. As a result, there has been a major increase in the need for and development of mobile websites, but due to the uniqueness of mobile website usage as compared to desktop websites, there is a need to focus on the quality aspect of mobile websites. To increase and preserve the quality of mobile websites, a quality model is required which is designed specifically to evaluate mobile website quality. To design such a model, a survey-based methodology is used to gather information from different users regarding unique website usage on mobile devices. On the basis of this information, a mobile website quality model is presented which aims to evaluate the quality of mobile websites. In the proposed model, some sub-characteristics are designed to evaluate mobile websites in particular. The result is a proposed model that aims to evaluate the features of a website which are important in the context of its deployment and usability on the mobile platform. (author)

  14. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally used Nash-Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against a dataset independent from that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. (3) Model structural inadequacies, whereby model structure may inadequately represent
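The Nash-Sutcliffe statistic discussed in the checklist is defined as NSE = 1 − Σ(obs − sim)² / Σ(obs − mean(obs))². A minimal reference implementation (illustrative, not the paper's code):

```python
def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    1.0 is a perfect fit; 0.0 means the model is no better than
    simply predicting the mean of the observations."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ssm = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / ssm

print(nash_sutcliffe([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]))
```

Because NSE is dominated by squared errors on high flows, two quite different simulations can score similarly, which is one reason the checklist recommends supplementing it with alternative statistics.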

  15. Design Of Measurements For Evaluating Readiness Of Technoware Components To Meet The Required Standard Of Products

    Science.gov (United States)

    Fauzi, Ilham; Muharram Hasby, Fariz; Irianto, Dradjad

    2018-03-01

    Although the government is able to make mandatory standards that must be obeyed by industry, the respective industries themselves often have difficulty fulfilling the requirements described in those standards. This is especially true in many small and medium-sized enterprises that lack the capital required to invest in standard-compliant equipment and machinery. This study aims to develop a set of measurement tools for evaluating the readiness of production technology with respect to the requirements of a product standard, based on the quality function deployment (QFD) method. By combining the QFD methodology, the UNESCAP technometric model [9], and the Analytic Hierarchy Process (AHP), this model is used to measure a firm's capability to fulfil government standards in the toy-making industry. Expert opinions from both the governmental officers responsible for setting and implementing standards and the industry practitioners responsible for managing manufacturing processes were collected and processed to find the technological capabilities that should be improved by the firm to fulfil the existing standard. This study showed that the proposed model can be used successfully to measure the gap between the requirements of the standard and the readiness of the technoware technological component in a particular firm.
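The AHP step mentioned above conventionally derives criterion weights as the principal eigenvector of a pairwise-comparison matrix. A small sketch using power iteration (the matrix values are hypothetical, not from the study):

```python
def ahp_weights(matrix, iters=100):
    """Criterion weights as the (normalised) principal eigenvector of a
    pairwise-comparison matrix, approximated by power iteration."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]
    return w

# Hypothetical, perfectly consistent 3-criterion comparison matrix:
# criterion 1 is twice as important as 2 and four times as important as 3.
pairwise = [
    [1.0, 2.0, 4.0],
    [0.5, 1.0, 2.0],
    [0.25, 0.5, 1.0],
]
print([round(x, 3) for x in ahp_weights(pairwise)])
```

For a perfectly consistent matrix like this one, the weights come out exactly proportional to (4, 2, 1); in practice a consistency ratio check would accompany the weight computation.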

  16. Extending enterprise architecture modelling with business goals and requirements

    NARCIS (Netherlands)

    Engelsman, W.; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten J.

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling

  17. Vehicle systems and payload requirements evaluation. [computer programs for identifying launch vehicle system requirements

    Science.gov (United States)

    Rea, F. G.; Pittenger, J. L.; Conlon, R. J.; Allen, J. D.

    1975-01-01

    Techniques developed for identifying launch vehicle system requirements for NASA automated space missions are discussed. Emphasis is placed on the development of computer programs and the investigation of astrionics for OSS missions and Scout. The Earth Orbit Mission Program-1, which performs linear error analysis of launch vehicle dispersions for both vehicle and navigation system factors, is described, along with the Interactive Graphic Orbit Selection program, which allows the user to select orbits which satisfy mission requirements and to evaluate the necessary injection accuracy.

  18. Integrated Assessment Model Evaluation

    Science.gov (United States)

    Smith, S. J.; Clarke, L.; Edmonds, J. A.; Weyant, J. P.

    2012-12-01

    Integrated assessment models of climate change (IAMs) are widely used to provide insights into the dynamics of the coupled human and socio-economic system, including emission mitigation analysis and the generation of future emission scenarios. Similar to the climate modeling community, the integrated assessment community has a two-decade history of model inter-comparison, which has served as one of the primary venues for model evaluation and confirmation. While analysis of historical trends in the socio-economic system has long played a key role in diagnostics of future scenarios from IAMs, formal hindcast experiments are just now being contemplated as evaluation exercises. Some initial thoughts on setting up such IAM evaluation experiments are discussed. Socio-economic systems do not follow strict physical laws, which means that evaluation needs to take place in a context, unlike that of physical system models, in which there are few fixed, unchanging relationships. Of course strict validation of even earth system models is not possible (Oreskes et al. 2004), a fact borne out by the inability of models to constrain the climate sensitivity. Energy-system models have also been grappling with some of the same questions over the last quarter century. For example, one of "the many questions in the energy field that are waiting for answers in the next 20 years" identified by Hans Landsberg in 1985 was "Will the price of oil resume its upward movement?" Of course we are still asking this question today. While, arguably, even fewer constraints apply to socio-economic systems, numerous historical trends and patterns have been identified, although often only in broad terms, that are used to guide the development of model components, parameter ranges, and scenario assumptions. IAM evaluation exercises are expected to provide useful information for interpreting model results and improving model behavior.
A key step is the recognition of model boundaries, that is, what is inside

  19. Milestone-specific, Observed data points for evaluating levels of performance (MODEL) assessment strategy for anesthesiology residency programs.

    Science.gov (United States)

    Nagy, Christopher J; Fitzgerald, Brian M; Kraus, Gregory P

    2014-01-01

    Anesthesiology residency programs will be expected to have Milestones-based evaluation systems in place by July 2014 as part of the Next Accreditation System. The San Antonio Uniformed Services Health Education Consortium (SAUSHEC) anesthesiology residency program developed and implemented a Milestones-based feedback and evaluation system a year ahead of schedule. It has been named the Milestone-specific, Observed Data points for Evaluating Levels of performance (MODEL) assessment strategy. The "MODEL Menu" and the "MODEL Blueprint" are tools that other anesthesiology residency programs can use in developing their own Milestones-based feedback and evaluation systems prior to ACGME-required implementation. Data from our early experience with the streamlined MODEL blueprint assessment strategy showed substantially improved faculty compliance with reporting requirements. The MODEL assessment strategy provides programs with a workable assessment method for residents, and important Milestones data points to programs for ACGME reporting.

  20. Modelling of radon control and air cleaning requirements in underground uranium mines

    International Nuclear Information System (INIS)

    El Fawal, M.; Gadalla, A.

    2014-01-01

    As part of a comprehensive study concerned with controlling workplace short-lived radon daughter concentrations in underground uranium mines to safe levels, a computer program has been developed and verified to calculate ventilation parameters, e.g. local pressures, flow rates and radon daughter concentration levels. The program is composed of two parts: one for mine ventilation and one for radon daughter level calculations. It has been validated in an actual case study to calculate the radon concentration levels, pressures and flow rates required to maintain acceptable radon concentrations at each point of the mine. The required fan static pressure and the approximate energy consumption were also estimated. The results of the calculations were evaluated and compared with a similar investigation, and the calculated values were found to be in good agreement with the corresponding values obtained using the ''REDES'' standard ventilation modelling software. The developed computer model can serve as a tool to help evaluate ventilation systems proposed by the mining authority, assisting the uranium mining industry in maintaining the health and safety of underground workers while efficiently achieving economic production targets. It could also be used for regulatory inspection and radiation protection assessments of underground workers, and for the effective design, assessment and management of underground mine ventilation systems. With this model, the radon decay product concentrations in working level units, the pressure drops and the flow rates required to reach acceptable radon concentrations relative to the recommended levels at different extraction points in the mine, as well as the fan static pressure, can be estimated; these are not available using other software. (author)

  1. Empirically evaluating decision-analytic models.

    Science.gov (United States)

    Goldhaber-Fiebert, Jeremy D; Stout, Natasha K; Goldie, Sue J

    2010-08-01

    Model-based cost-effectiveness analyses support decision-making. To augment model credibility, evaluation via comparison to independent, empirical studies is recommended. We developed a structured reporting format for model evaluation and conducted a structured literature review to characterize current model evaluation recommendations and practices. As an illustration, we applied the reporting format to evaluate a microsimulation of human papillomavirus and cervical cancer. The model's outputs and uncertainty ranges were compared with multiple outcomes from a study of long-term progression from high-grade precancer (cervical intraepithelial neoplasia [CIN]) to cancer. Outcomes included 5 to 30-year cumulative cancer risk among women with and without appropriate CIN treatment. Consistency was measured by model ranges overlapping study confidence intervals. The structured reporting format included: matching baseline characteristics and follow-up, reporting model and study uncertainty, and stating metrics of consistency for model and study results. Structured searches yielded 2963 articles with 67 meeting inclusion criteria and found variation in how current model evaluations are reported. Evaluation of the cervical cancer microsimulation, reported using the proposed format, showed a modeled cumulative risk of invasive cancer for inadequately treated women of 39.6% (30.9-49.7) at 30 years, compared with the study: 37.5% (28.4-48.3). For appropriately treated women, modeled risks were 1.0% (0.7-1.3) at 30 years, study: 1.5% (0.4-3.3). To support external and projective validity, cost-effectiveness models should be iteratively evaluated as new studies become available, with reporting standardized to facilitate assessment. Such evaluations are particularly relevant for models used to conduct comparative effectiveness analyses.
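
    The consistency metric described above is simple to state precisely: a model result is consistent with a study outcome when the model's uncertainty range overlaps the study's confidence interval. A minimal sketch, using the 30-year cumulative risk intervals quoted in the abstract:

```python
# Consistency check: model uncertainty range vs. study confidence interval.

def ranges_overlap(model_range, study_ci):
    m_lo, m_hi = model_range
    s_lo, s_hi = study_ci
    return m_lo <= s_hi and s_lo <= m_hi

# Inadequately treated women at 30 years:
# model 39.6% (30.9-49.7) vs. study 37.5% (28.4-48.3)
inadequate = ranges_overlap((30.9, 49.7), (28.4, 48.3))
# Appropriately treated women at 30 years:
# model 1.0% (0.7-1.3) vs. study 1.5% (0.4-3.3)
appropriate = ranges_overlap((0.7, 1.3), (0.4, 3.3))
```

Both comparisons overlap, matching the paper's conclusion that the microsimulation is consistent with the study on these outcomes.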

  2. Study on visibility evaluation model which is considered field factors; Field factor wo koryoshita shininsei hyoka model ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    Nakanishi, M; Hagiwara, T [Hokkaido University, Sapporo (Japan)

    1997-10-01

    The present study proposes a model to evaluate the visual performance of road traffic facilities required by drivers. Two factors were employed to obtain a suitable contrast for drivers in driving situations. One factor is a suitable luminance range, derived from the minimum required luminance and the glare luminance. The other is the field factor. The model was shown to be capable of providing the visibility range in some cases. 8 refs., 4 figs., 2 tabs.

  3. A model for evaluating beef cattle rations considering effects of ruminal fiber mass

    OpenAIRE

    Henrique,Douglas Sampaio; Lana,Rogério de Paula; Vieira,Ricardo Augusto Mendonça; Fontes,Carlos Augusto de Alencar; Botelho,Mosar Faria

    2011-01-01

    A mathematical model based on Cornell Net Carbohydrate and Protein System (CNCPS) was developed and adapted in order to evaluate beef cattle rations at tropical climate conditions. The presented system differs from CNCPS in the modeling of insoluble particles' digestion and passage kinetics, which enabled the estimation of fiber mass in rumen and its effects on animal performance. The equations used to estimate metabolizable protein and net energy requirements for gain, net energy requirement...

  4. IDENTIFYING OPERATIONAL REQUIREMENTS TO SELECT SUITABLE DECISION MODELS FOR A PUBLIC SECTOR EPROCUREMENT DECISION SUPPORT SYSTEM

    Directory of Open Access Journals (Sweden)

    Mohamed Adil

    2014-10-01

    Full Text Available Public sector procurement should be a transparent and fair process, and strict legal requirements are enforced on it to make it a standardised process. To make fair decisions when selecting suppliers, a practical method that adheres to the legal requirements is important. The research underlying this paper aimed to identify a suitable Multi-Criteria Decision Analysis (MCDA) method for the specific legal and functional needs of the Maldivian public sector. To identify these operational requirements, a set of focus group interviews was conducted in the Maldives with public officials responsible for procurement decision making. Based on the operational requirements identified through the focus groups, a criteria-based evaluation of published MCDA methods was performed to identify those suitable for e-procurement decision making. This paper describes the identification of the operational requirements and the results of the evaluation to select suitable decision models for the Maldivian context.
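
    A criteria-based screening of this kind can be sketched as a scoring table: each candidate MCDA method is rated against the operational requirements, and methods meeting a threshold are shortlisted. The method names, criteria and scores below are illustrative assumptions, not the paper's actual evaluation data.

```python
# Hypothetical criteria-based shortlisting of MCDA methods.

scores = {
    "weighted_sum": {"transparency": 5, "auditability": 4, "ease_of_use": 5},
    "AHP":          {"transparency": 4, "auditability": 5, "ease_of_use": 3},
    "ELECTRE":      {"transparency": 2, "auditability": 3, "ease_of_use": 2},
}

def shortlist(scores, criteria, threshold):
    """Sum each method's ratings over the criteria; keep those at/above threshold."""
    totals = {m: sum(s[c] for c in criteria) for m, s in scores.items()}
    return sorted(m for m, t in totals.items() if t >= threshold)

suitable = shortlist(scores, ["transparency", "auditability", "ease_of_use"], 12)
```

In practice the criteria would come directly from the focus-group findings, and the ratings from a structured review of each published method.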

  5. A model for evaluating beef cattle rations considering effects of ruminal fiber mass

    Directory of Open Access Journals (Sweden)

    Douglas Sampaio Henrique

    2011-11-01

    Full Text Available A mathematical model based on the Cornell Net Carbohydrate and Protein System (CNCPS) was developed and adapted in order to evaluate beef cattle rations under tropical climate conditions. The presented system differs from CNCPS in the modeling of insoluble particles' digestion and passage kinetics, which enabled the estimation of fiber mass in the rumen and its effects on animal performance. The equations used to estimate metabolizable protein and net energy requirements for gain, net energy requirement for maintenance and total efficiency of metabolizable energy utilization were obtained from scientific articles published in Brazil. The parameters of the regression equations in these papers were estimated using data from Bos indicus purebred and crossbred animals reared under tropical conditions. The model was evaluated using a database of 368 observations originally published in 11 doctoral theses, 14 master's dissertations and four scientific articles. Outputs of the model can be considered adequate.

  6. A goal-oriented requirements modelling language for enterprise architecture

    NARCIS (Netherlands)

    Quartel, Dick; Engelsman, W.; Jonkers, Henk; van Sinderen, Marten J.

    2009-01-01

    Methods for enterprise architecture, such as TOGAF, acknowledge the importance of requirements engineering in the development of enterprise architectures. Modelling support is needed to specify, document, communicate and reason about goals and requirements. Current modelling techniques for

  7. Cleanliness Policy Implementation: Evaluating Retribution Model to Rise Public Satisfaction

    Science.gov (United States)

    Dailiati, Surya; Hernimawati; Prihati; Chintia Utami, Bunga

    2018-05-01

    This research addresses the evaluation of the cleanliness retribution policy, which has not been able to optimally improve the local revenue (PAD) of Pekanbaru City, nor the city's cleanliness. This was estimated to be caused by the performance of the Garden and Sanitation Department not being in accordance with the requirements of Pekanbaru City society. The research method used in this study is a mixed method with a sequential exploratory strategy. Data were collected through observation, interviews and documentation for the qualitative research, and through questionnaires for the quantitative research. The collected data were analyzed with the interactive model of Miles and Huberman for the qualitative research and with multiple regression analysis for the quantitative research. The results indicate that the model of cleanliness policy implementation that can increase the PAD of Pekanbaru City and improve people's satisfaction is divided into two parts: the evaluation model and the society satisfaction model. The evaluation model is influenced by the criteria of effectiveness, efficiency, adequacy, equity, responsiveness and appropriateness, while the society satisfaction model is influenced by the variables of society satisfaction, intentions, goals, plans, programs, and the appropriateness of the cleanliness retribution collection policy.

  8. Introducing Program Evaluation Models

    Directory of Open Access Journals (Sweden)

    Raluca GÂRBOAN

    2008-02-01

    Full Text Available Programs and project evaluation models can be extremely useful in project planning and management. The aim is to set the right questions as soon as possible in order to see in time and deal with the unwanted program effects, as well as to encourage the positive elements of the project impact. In short, different evaluation models are used in order to minimize losses and maximize the benefits of the interventions upon small or large social groups. This article introduces some of the most recently used evaluation models.

  9. Models of protein and amino acid requirements for cattle

    Directory of Open Access Journals (Sweden)

    Luis Orlindo Tedeschi

    2015-03-01

    Full Text Available Protein supply and requirements by ruminants have been studied for more than a century. These studies led to the accumulation of lots of scientific information about digestion and metabolism of protein by ruminants as well as the characterization of the dietary protein in order to maximize animal performance. During the 1980s and 1990s, when computers became more accessible and powerful, scientists began to conceptualize and develop mathematical nutrition models, and to program them into computers to assist with ration balancing and formulation for domesticated ruminants, specifically dairy and beef cattle. The most commonly known nutrition models developed during this period were the National Research Council (NRC in the United States, Agricultural Research Council (ARC in the United Kingdom, Institut National de la Recherche Agronomique (INRA in France, and the Commonwealth Scientific and Industrial Research Organization (CSIRO in Australia. Others were derivative works from these models with different degrees of modifications in the supply or requirement calculations, and the modeling nature (e.g., static or dynamic, mechanistic, or deterministic. Circa 1990s, most models adopted the metabolizable protein (MP system over the crude protein (CP and digestible CP systems to estimate supply of MP and the factorial system to calculate MP required by the animal. The MP system included two portions of protein (i.e., the rumen-undegraded dietary CP - RUP - and the contributions of microbial CP - MCP as the main sources of MP for the animal. Some models would explicitly account for the impact of dry matter intake (DMI on the MP required for maintenance (MPm; e.g., Cornell Net Carbohydrate and Protein System - CNCPS, the Dutch system - DVE/OEB, while others would simply account for scurf, urinary, metabolic fecal, and endogenous contributions independently of DMI. 
All models included milk yield and its components in estimating MP required for lactation

  10. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting it and regarding it as an important innovation. However, there is already evidence of high failure risk in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports requirements analysis for ERP projects.

  11. Simulations, evaluations and models. Vol. 1

    International Nuclear Information System (INIS)

    Brehmer, B.; Leplat, J.

    1992-01-01

    Papers presented at the Fourth MOHAWC (Models of Human Activities in Work Context) workshop. The general theme was simulations, evaluations and models. The emphasis was on time in relation to the modelling of human activities in modern, high-tech work. Such work often requires people to control dynamic systems, and the behaviour and misbehaviour of these systems in time is a principal focus of work in, for example, a modern process plant. The papers report on microworlds and on their innovative uses, both in the form of experiments and in a new form of use, that of testing a program which performs diagnostic reasoning. They present new aspects of the problem of time in process control, showing the importance of considering the time scales of dynamic tasks, both in individual decision making and in distributed decision making, and providing new formalisms, both for the representation of time and for reasoning involving time in diagnosis. (AB)

  12. Nuclear models relevant to evaluation

    International Nuclear Information System (INIS)

    Arthur, E.D.; Chadwick, M.B.; Hale, G.M.; Young, P.G.

    1991-01-01

    The widespread use of nuclear models continues in the creation of data evaluations. The reasons include the extension of data evaluations to higher energies, the creation of data libraries for isotopic components of natural materials, and the production of evaluations for radioactive target species. In these cases, experimental data are often sparse or nonexistent. As this trend continues, the nuclear models employed in evaluation work move towards more microscopically-based theoretical methods, prompted in part by the availability of increasingly powerful computational resources. Advances in nuclear models applicable to evaluation will be reviewed. These include advances in optical model theory, microscopic and phenomenological state and level density theory, unified models that consistently describe both equilibrium and nonequilibrium reaction mechanisms, and improved methodologies for the calculation of prompt radiation from fission. 84 refs., 8 figs

  13. Finite Element Models Development of Car Seats With Passive Head Restraints to Study Their Meeting Requirements for EURO NCAP

    Directory of Open Access Journals (Sweden)

    D. Yu. Solopov

    2014-01-01

    Full Text Available In performing calculations to evaluate the passive safety of car seats by computer modelling methods, it is desirable to use finite element models (FEM), which provide the greatest accuracy of calculation results. It is also expedient to use FEM that can be computed in a short time, giving preliminary results at short notice. The paper describes the features of evaluating the passive safety ensured by the developed FEM of seats with passive head restraints according to the requirements of EURO NCAP. In addition, the accuracy of the calculated results provided by the developed FEM was evaluated, in relation to results obtained by specialists of an organization conducting similar research (LSTC). This work was performed within the framework of a technique that allows the effective development of car seat designs with both passive and active head restraints meeting passive safety requirements. The calculations and experiments showed that, when evaluating by the EURO NCAP technique, the "rough" FEM (the 1st and 2nd levels) can be considered rational (in terms of the labour costs of their creation and problem solving, as well as result errors), and it is expedient to use them for preliminary and multivariate calculations. Detailed models (the 3rd level) provide the greatest accuracy, reached for the evaluated impact at a speed of 16 km/h under the "moderate impact" loading conditions; the relative error of full head acceleration is 12%. In the evaluation by EURO NCAP using the NIC criterion, it can be concluded that the seat models of the 2nd level (467 936 elements) and the 3rd level (1 255 358 elements) meet the passive safety requirements of EURO NCAP under "light", "moderate", and "heavy" impacts. For preliminary and multivariate calculations by EURO NCAP, a model of the middle level (consisting of 467

  14. A Statistical Evaluation of Atmosphere-Ocean General Circulation Models: Complexity vs. Simplicity

    OpenAIRE

    Robert K. Kaufmann; David I. Stern

    2004-01-01

    The principal tools used to model future climate change are General Circulation Models, which are deterministic, high-resolution, bottom-up models of the global atmosphere-ocean system that require large amounts of supercomputer time to generate results. But are these models a cost-effective way of predicting future climate change at the global level? In this paper we use modern econometric techniques to evaluate the statistical adequacy of three general circulation models (GCMs) by testing thre...

  15. Three Tier Unified Process Model for Requirement Negotiations and Stakeholder Collaborations

    Science.gov (United States)

    Niazi, Muhammad Ashraf Khan; Abbas, Muhammad; Shahzad, Muhammad

    2012-11-01

    This research paper carries out a pragmatic qualitative analysis of various models and approaches to requirements negotiation (a sub-process of the requirements management plan, which is an output of scope management's collect requirements process) and studies stakeholder collaboration methodologies (i.e. from within the communication management knowledge area). The experiential analysis encompasses two tiers: the first tier is a weighted scoring model, while the second tier develops SWOT matrices on the basis of the weighted scoring model's findings in order to select an appropriate requirements negotiation model. Finally, the results are presented with statistical pie charts. On the basis of these results for the prevalent negotiation models and approaches, a unified approach to requirements negotiation and stakeholder collaboration is proposed, in which the collaboration methodologies are embedded into the selected requirements negotiation model as internal parameters of the proposed process, alongside some external required parameters such as MBTI and opportunity analysis.

  16. Decision-tree approach to evaluating inactive uranium-processing sites for liner requirements

    International Nuclear Information System (INIS)

    Relyea, J.F.

    1983-03-01

    Recently, concern has been expressed about the potential toxic effects of both radon emission and the release of toxic elements in leachate from inactive uranium mill tailings piles. Remedial action may be required to meet disposal standards set by the states and the US Environmental Protection Agency (EPA). In some cases, a possible disposal option is the exhumation and reburial (either on site or at a new location) of tailings, with reliance on engineered barriers to satisfy the objectives established for remedial actions. Liners under disposal pits are the major engineered barrier for preventing contaminant release to ground and surface water. The purpose of this report is to provide a logical sequence of actions, in the form of a decision tree, which can be followed to show whether a selected tailings disposal design meets the objectives for subsurface contaminant release without a liner. This information can be used to determine the need for, and type of, liner at sites exhibiting a potential groundwater problem. The decision tree is based on the capability of hydrologic and mass transport models to predict the movement of water and contaminants over time. The types of modeling capabilities and the data needed for those models are described, and the steps required to predict water and contaminant movement are discussed. A demonstration of the decision tree procedure is given to aid the reader in evaluating the need for, and adequacy of, a liner.
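
    The core of such a decision tree reduces to comparing model-predicted contaminant release against the remedial-action objective along each credible pathway. A minimal hypothetical sketch (function name, thresholds and logic are illustrative, not from the report):

```python
# Hypothetical simplification of the liner decision logic.

def liner_required(predicted_release, objective, groundwater_pathway):
    """True when the design needs an engineered liner under this toy logic.

    predicted_release / objective: contaminant release rates in the same units,
    from a hydrologic/mass-transport model and the remedial-action standard.
    """
    if not groundwater_pathway:
        return False  # no credible groundwater pathway -> no liner needed
    return predicted_release > objective

needs_liner = liner_required(predicted_release=2.5, objective=1.0,
                             groundwater_pathway=True)
```

The actual decision tree adds intermediate branches (data availability, model capability, site characteristics) before this final comparison.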

  17. Evaluation of the autoregression time-series model for analysis of a noisy signal

    International Nuclear Information System (INIS)

    Allen, J.W.

    1977-01-01

    The autoregression (AR) time-series model of a continuous noisy signal was statistically evaluated to determine quantitatively the uncertainties of the model order, the model parameters, and the model's power spectral density (PSD). The result of such a statistical evaluation enables an experimenter to decide whether an AR model can adequately represent a continuous noisy signal and be consistent with the signal's frequency spectrum, and whether it can be used for on-line monitoring. Although evaluations of other types of signals have been reported in the literature, no direct reference has been found to AR model's uncertainties for continuous noisy signals; yet the evaluation is necessary to decide the usefulness of AR models of typical reactor signals (e.g., neutron detector output or thermocouple output) and the potential of AR models for on-line monitoring applications. AR and other time-series models for noisy data representation are being investigated by others since such models require fewer parameters than the traditional PSD model. For this study, the AR model was selected for its simplicity and conduciveness to uncertainty analysis, and controlled laboratory bench signals were used for continuous noisy data. (author)
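
    The kind of evaluation described above can be illustrated with a minimal sketch: fit an AR(1) model to a synthetic noisy signal and derive the model's power spectral density from the fitted parameter. The order, coefficient and sample size are illustrative assumptions, not values from the study.

```python
# Fit an AR(1) model to a synthetic noisy signal and evaluate its PSD.
import cmath
import math
import random

random.seed(0)
phi_true = 0.8
x = [0.0]
for _ in range(5000):
    x.append(phi_true * x[-1] + random.gauss(0.0, 1.0))

# Conditional least-squares estimate of the AR(1) coefficient:
# phi_hat = sum(x[t-1]*x[t]) / sum(x[t-1]^2)
num = sum(x[t - 1] * x[t] for t in range(1, len(x)))
den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
phi_hat = num / den

def ar1_psd(phi, f):
    """AR(1) model PSD at normalized frequency f (unit innovation variance)."""
    return 1.0 / abs(1.0 - phi * cmath.exp(-2j * math.pi * f)) ** 2
```

Assessing the uncertainty of `phi_hat` (e.g. its sampling variance) and of the implied PSD is exactly the kind of statistical evaluation the abstract describes; for a low-frequency-dominated signal the fitted PSD should peak near zero frequency.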

  18. Three new models for evaluation of standard involute spur gear mesh stiffness

    Science.gov (United States)

    Liang, Xihui; Zhang, Hongsheng; Zuo, Ming J.; Qin, Yong

    2018-02-01

    Time-varying mesh stiffness is one of the main internal excitation sources of gear dynamics. Accurate evaluation of gear mesh stiffness is crucial for gear dynamic analysis. This study is devoted to developing new models for spur gear mesh stiffness evaluation. Three models are proposed. The proposed model 1 can give very accurate mesh stiffness results, but the gear bore surface must be assumed to be rigid. Building on the proposed model 1, our research discovered that the angular deflection pattern of the gear bore surface of a pair of meshing gears under a constant torque basically follows a cosine curve. Based on this finding, two other models are proposed. The proposed model 2 evaluates gear mesh stiffness using angular deflections at different circumferential angles of an end surface circle of the gear bore. The proposed model 3 requires only the angular deflection at an arbitrary circumferential angle of an end surface circle of the gear bore, but it can only be used for a gear with the same tooth profile on all teeth. The proposed models are accurate in gear mesh stiffness evaluation and easy to use. Finite element analysis is used to validate the accuracy of the proposed models.
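
    The cosine finding quoted above lends itself to a simple least-squares sketch: sample the angular deflection at equally spaced circumferential angles and recover the amplitude and offset of the cosine pattern. The data here are synthetic and the amplitude/offset values are illustrative, not from the paper.

```python
# Recover a cosine deflection pattern A*cos(theta) + C from sampled angles.
import math

angles = [2 * math.pi * k / 36 for k in range(36)]
A_true, C_true = 1.0e-4, 5.0e-4  # illustrative amplitude and offset (rad)
defl = [A_true * math.cos(t) + C_true for t in angles]

# For equally spaced angles over a full period the cosine basis is orthogonal,
# so the least-squares fit has a closed form:
n = len(angles)
C_hat = sum(defl) / n
A_hat = 2 * sum(d * math.cos(t) for d, t in zip(defl, angles)) / n
```

In a model like the proposed model 2, such fitted deflections at the gear bore would then feed the mesh stiffness calculation in place of a rigid-bore assumption.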

  19. A website evaluation model by integration of previous evaluation models using a quantitative approach

    Directory of Open Access Journals (Sweden)

    Ali Moeini

    2015-01-01

    Full Text Available Given the growth of e-commerce, websites play an essential role in business success. Therefore, many authors have offered website evaluation models since 1995. However, the multiplicity and diversity of evaluation models make it difficult to integrate them into a single comprehensive model. In this paper, a quantitative method is used to integrate previous models into a comprehensive model that is compatible with them. In this approach, researcher judgment plays no role in the integration of models, and the new model derives its validity from 93 previous models and a systematic quantitative approach.

  20. Large scale Bayesian nuclear data evaluation with consistent model defects

    International Nuclear Information System (INIS)

    Schnabel, G

    2015-01-01

    The aim of nuclear data evaluation is the reliable determination of cross sections and related quantities of the atomic nuclei. To this end, evaluation methods are applied which combine the information of experiments with the results of model calculations. The evaluated observables with their associated uncertainties and correlations are assembled into data sets, which are required for the development of novel nuclear facilities, such as fusion reactors for energy supply, and accelerator driven systems for nuclear waste incineration. The efficiency and safety of such future facilities is dependent on the quality of these data sets and thus also on the reliability of the applied evaluation methods. This work investigated the performance of the majority of available evaluation methods in two scenarios. The study indicated the importance of an essential component in these methods, which is the frequently ignored deficiency of nuclear models. Usually, nuclear models are based on approximations and thus their predictions may deviate from reliable experimental data. As demonstrated in this thesis, the neglect of this possibility in evaluation methods can lead to estimates of observables which are inconsistent with experimental data. Due to this finding, an extension of Bayesian evaluation methods is proposed to take into account the deficiency of the nuclear models. The deficiency is modeled as a random function in terms of a Gaussian process and combined with the model prediction. This novel formulation conserves sum rules and allows to explicitly estimate the magnitude of model deficiency. Both features are missing in available evaluation methods so far. Furthermore, two improvements of existing methods have been developed in the course of this thesis. The first improvement concerns methods relying on Monte Carlo sampling. 
A Metropolis-Hastings scheme with a specific proposal distribution is suggested, which proved to be more efficient in the studied scenarios than the

  1. Evaluation of upper-shelf toughness requirements for reactor pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Gamble, R.M.; Zahoor, A. (NOVETECH Corp., Rockville, MD (USA)); Hiser, A. (Materials Engineering Associates, Inc., Lanham, MD (USA)); Ernst, H.A.; Pollitz, E.T. (Georgia Inst. of Tech., Atlanta, GA (USA))

    1990-04-01

    This work assesses and applies the criteria recommended by the ASME Subgroup on Evaluation Standards for the evaluation of reactor pressure vessel beltline materials having upper shelf Charpy energies less than 50 ft-lbs. The assessment included comparison of the upper shelf energies required by the criteria recommended for Service Level A and B conditions and criteria proposed for evaluation of postulated Service Level C and D events. The criteria recommended for Service Level A and B conditions was used to evaluate Linde 80 weld material. 9 refs., 4 figs.

  2. Evaluation of upper-shelf toughness requirements for reactor pressure vessels

    International Nuclear Information System (INIS)

    Gamble, R.M.; Zahoor, A.; Hiser, A.; Ernst, H.A.; Pollitz, E.T.

    1990-04-01

    This work assesses and applies the criteria recommended by the ASME Subgroup on Evaluation Standards for the evaluation of reactor pressure vessel beltline materials having upper shelf Charpy energies less than 50 ft-lbs. The assessment included comparison of the upper shelf energies required by the criteria recommended for Service Level A and B conditions and criteria proposed for evaluation of postulated Service Level C and D events. The criteria recommended for Service Level A and B conditions were used to evaluate Linde 80 weld material. 9 refs., 4 figs.

  3. Quick pace of property acquisitions requires two-stage evaluations

    International Nuclear Information System (INIS)

    Hollo, R.; Lockwood, S.

    1994-01-01

    The traditional method of evaluating oil and gas reserves may be too cumbersome for the quick pace of oil and gas property acquisition. An acquisition evaluator must decide quickly if a property meets basic purchase criteria. The current business climate requires a two-stage approach. First, the evaluator makes a quick assessment of the property and submits a bid. If the bid is accepted, the evaluator goes on with a detailed analysis, which represents the second stage. Acquisition of producing properties has become an important activity for many independent oil and gas producers, who must be able to evaluate reserves quickly enough to make effective business decisions yet accurately enough to avoid costly mistakes. Independents thus must be familiar with how transactions usually progress as well as with the basic methods of property evaluation. The paper discusses acquisition activity, the initial offer, the final offer, property evaluation, and fair market value.

  4. A model to evaluate quality and effectiveness of disease management.

    Science.gov (United States)

    Lemmens, K M M; Nieboer, A P; van Schayck, C P; Asin, J D; Huijsman, R

    2008-12-01

    Disease management has emerged as a new strategy to enhance quality of care for patients suffering from chronic conditions, and to control healthcare costs. So far, however, the effects of this strategy remain unclear. Although current models define the concept of disease management, they do not provide a systematic development or an explanatory theory of how disease management affects the outcomes of care. The objective of this paper is to present a framework for valid evaluation of disease-management initiatives. The evaluation model is built on two pillars of disease management: patient-related and professional-directed interventions. The effectiveness of these interventions is thought to be affected by the organisational design of the healthcare system. Disease management requires a multifaceted approach; hence disease-management programme evaluations should focus on the effects of multiple interventions, namely patient-related, professional-directed and organisational interventions. The framework has been built upon the conceptualisation of these disease-management interventions. Analysis of the underlying mechanisms of these interventions revealed that learning and behavioural theories support the core assumptions of disease management. The evaluation model can be used to identify the components of disease-management programmes and the mechanisms behind them, making valid comparison feasible. In addition, this model links the programme interventions to indicators that can be used to evaluate the disease-management programme. Consistent use of this framework will enable comparisons among disease-management programmes and outcomes in evaluation research.

  5. Requirements for data integration platforms in biomedical research networks: a reference model.

    Science.gov (United States)

    Ganzinger, Matthias; Knaup, Petra

    2015-01-01

    Biomedical research networks need to integrate research data among their members and with external partners. To support such data sharing activities, an adequate information technology infrastructure is necessary. To facilitate the establishment of such an infrastructure, we developed a reference model for the requirements. The reference model consists of five reference goals and 15 reference requirements. Using the Unified Modeling Language, the goals and requirements are related to each other. In addition, all goals and requirements are described textually in tables. This reference model can be used by research networks as a basis for a resource-efficient acquisition of their project-specific requirements. Furthermore, a concrete instance of the reference model is described for a research network on liver cancer. The reference model is transferred into a requirements model of the specific network. Based on this concrete requirements model, a service-oriented information technology architecture is derived and also described in this paper.

  6. Evaluation procedure of software requirements specification for digital I and C of KNGR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Park, Jong Kyun; Lee, Ki Young; Kim, Jang Yeol; Cheon, Se Woo

    2001-06-01

    The accuracy of the specification of requirements of a digital system is of prime importance to the acceptance and success of the system. The development, use, and regulation of computer systems in nuclear reactor Instrumentation and Control (I and C) systems to enhance reliability and safety is a complex issue. This report is one of a series of reports from the Korean Next Generation Reactor (KNGR) Software Safety Verification and Validation (SSVV) Task, Korea Atomic Energy Research Institute, which investigates different aspects of computer software in reactor I and C systems and describes the engineering procedures for developing such software. The purpose of this guideline is to give the software safety evaluator the trail map between the code and standards layer and the design methodology and documents layer for the software important to safety in nuclear power plants. Recently, the requirements specification of safety-critical software systems, and their safety analysis, have come to be recognized as important issues in the software life cycle, and new regulatory positions and standards are being developed by regulatory and standardization organizations such as the IAEA, IEC, and IEEE. We presented the procedure for evaluating the software requirements specifications of the KNGR protection systems. We believe it can be useful for both licenser and licensee to conduct an evaluation of safety in the requirements phase of developing the software. The guideline consists of the requirements engineering for software of the KNGR protection systems in chapter 1, the evaluation checklist of software requirements specification in chapter 2.3, and the safety evaluation procedure of the KNGR software requirements specification in chapter 2.4.

  7. The case for applying an early-lifecycle technology evaluation methodology to comparative evaluation of requirements engineering research

    Science.gov (United States)

    Feather, Martin S.

    2003-01-01

    The premise of this paper is that there is a useful analogy between evaluation of proposed problem solutions and evaluation of requirements engineering research itself. Both of these application areas face the challenges of evaluation early in the lifecycle, of the need to consider a wide variety of factors, and of the need to combine inputs from multiple stakeholders in making these evaluations and subsequent decisions.

  8. Requirements for High Level Models Supporting Design Space Exploration in Model-based Systems Engineering

    OpenAIRE

    Haveman, Steven P.; Bonnema, G. Maarten

    2013-01-01

    Most formal models are used in detailed design and focus on a single domain. Few approaches exist that can effectively tie these lower-level models to a high-level system model during design space exploration. This complicates the validation of high-level system requirements during detailed design. In this paper, we define requirements for a high-level model that is firstly driven by key systems engineering challenges present in industry and secondly connects to several formal and d...

  9. Spatial resolution requirements for traffic-related air pollutant exposure evaluations

    Science.gov (United States)

    Batterman, Stuart; Chambliss, Sarah; Isakov, Vlad

    2014-09-01

    Vehicle emissions represent one of the most important air pollution sources in most urban areas, and elevated concentrations of pollutants found near major roads have been associated with many adverse health impacts. To understand these impacts, exposure estimates should reflect the spatial and temporal patterns observed for traffic-related air pollutants. This paper evaluates the spatial resolution and zonal systems required to accurately estimate intraurban and near-road exposures to traffic-related air pollutants. The analyses use the detailed information assembled for a large (800 km²) area centered on Detroit, Michigan, USA. Concentrations of nitrogen oxides (NOx) due to vehicle emissions were estimated using hourly traffic volumes and speeds on 9700 links representing all but minor roads in the city, the MOVES2010 emission model, the RLINE dispersion model, local meteorological data, a temporal resolution of 1 h, and spatial resolution as fine as 10 m. Model estimates were joined with the corresponding shape files to estimate residential exposures for 700,000 individuals at property parcel, census block, census tract, and ZIP code levels. We evaluate joining methods, the spatial resolution needed to meet specific error criteria, and the extent of exposure misclassification. To portray traffic-related air pollutant exposure, raster or inverse-distance-weighted interpolations are superior to nearest-neighbor approaches, and interpolations between receptors and points of interest should not exceed about 40 m near major roads, and 100 m at larger distances. For census tracts and ZIP codes, average exposures are overestimated since few individuals live very near major roads, the range of concentrations is compressed, most exposures are misclassified, and high concentrations near roads are entirely omitted. While smaller zones improve performance considerably, even block-level data can misclassify many individuals. To estimate exposures and impacts of traffic
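
    The inverse-distance-weighted joining of receptor concentrations to residence locations mentioned in this record can be sketched as follows. The receptor layout, the concentrations, and the power parameter are illustrative assumptions, not values from the study.

```python
import numpy as np

def idw_interpolate(receptors, values, points, power=2.0):
    """Inverse-distance-weighted concentration estimate at each point of interest.

    receptors : (n, 2) receptor coordinates in metres
    values    : (n,)   modelled concentrations at the receptors
    points    : (m, 2) residence locations to estimate
    """
    dists = np.linalg.norm(points[:, None, :] - receptors[None, :, :], axis=2)
    dists = np.maximum(dists, 1e-9)          # avoid division by zero at a receptor
    weights = 1.0 / dists**power
    return (weights * values).sum(axis=1) / weights.sum(axis=1)

# Hypothetical receptor grid with NOx decaying away from a road along x = 0
receptors = np.array([[x, y] for x in (10.0, 50.0, 100.0) for y in (0.0, 50.0)])
values = 100.0 / (1.0 + receptors[:, 0] / 20.0)
homes = np.array([[30.0, 25.0], [80.0, 10.0]])   # one near the road, one farther
est = idw_interpolate(receptors, values, homes)
```

    Because IDW is a convex combination of receptor values, estimates stay within the observed range, and the home nearer the road receives the higher exposure estimate.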

  10. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species.

    Science.gov (United States)

    Adams, Matthew P; Collier, Catherine J; Uthicke, Sven; Ow, Yan X; Langlois, Lucas; O'Brien, Katherine R

    2017-01-04

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluated twelve published empirical models for temperature-dependent tropical seagrass photosynthesis, based on two criteria: (1) goodness of fit, and (2) how easily biologically-meaningful parameters can be obtained. All models were formulated in terms of parameters characterising the thermal optimum (Topt) for maximum photosynthetic rate (Pmax). These parameters indicate the upper thermal limits of seagrass photosynthetic capacity, and hence can be used to assess the vulnerability of seagrass to temperature change. Our study exemplifies an approach to model selection which optimises the usefulness of empirical models for both modellers and ecologists alike.
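
    Recovering the biologically meaningful parameters Topt and Pmax from a fitted photosynthesis-temperature curve, as the study recommends, can be sketched with a simple quadratic response. The data and the quadratic form are illustrative assumptions, not one of the paper's twelve models.

```python
import numpy as np

# Hypothetical photosynthesis-temperature data with a thermal optimum near 30 °C
rng = np.random.default_rng(1)
T = np.linspace(20.0, 40.0, 21)                  # temperature (°C)
P_true = 12.0 - 0.05 * (T - 30.0) ** 2           # quadratic response: Pmax=12, Topt=30
P_obs = P_true + rng.normal(0.0, 0.2, T.size)    # measurement noise

# Fit a parabola and read off the biologically meaningful parameters at the vertex
a, b, c = np.polyfit(T, P_obs, 2)
T_opt = -b / (2.0 * a)                           # thermal optimum
P_max = c - b**2 / (4.0 * a)                     # maximum photosynthetic rate
```

    The point of the exercise is that Topt and Pmax are interpretable and transferable, whereas the raw polynomial coefficients a, b, c are not.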

  11. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species

    Science.gov (United States)

    Adams, Matthew P.; Collier, Catherine J.; Uthicke, Sven; Ow, Yan X.; Langlois, Lucas; O'Brien, Katherine R.

    2017-01-01

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluated twelve published empirical models for temperature-dependent tropical seagrass photosynthesis, based on two criteria: (1) goodness of fit, and (2) how easily biologically-meaningful parameters can be obtained. All models were formulated in terms of parameters characterising the thermal optimum (Topt) for maximum photosynthetic rate (Pmax). These parameters indicate the upper thermal limits of seagrass photosynthetic capacity, and hence can be used to assess the vulnerability of seagrass to temperature change. Our study exemplifies an approach to model selection which optimises the usefulness of empirical models for both modellers and ecologists alike.

  12. Model-based human reliability analysis: prospects and requirements

    International Nuclear Information System (INIS)

    Mosleh, A.; Chang, Y.H.

    2004-01-01

    Major limitations of the conventional methods for human reliability analysis (HRA), particularly those developed for operator response analysis in probabilistic safety assessments (PSA) of nuclear power plants, are summarized as a motivation for the need and a basis for developing requirements for the next generation of HRA methods. It is argued that a model-based approach that provides explicit cognitive causal links between operator behaviors and directly or indirectly measurable causal factors should be at the core of the advanced methods. An example of such a causal model is briefly reviewed which, due to its complexity and input requirements, can currently be implemented only in a dynamic PSA environment. The computer simulation code developed for this purpose is also described briefly, together with current limitations in the models, data, and the computer implementation.

  13. Evaluation of atmospheric-pressure change in tornado using Fujita model

    International Nuclear Information System (INIS)

    Shimizu, Juntaro; Ohtsubo, Shunsuke

    2017-01-01

    Evaluation of the atmospheric-pressure change (APC) in a tornado is necessary to assess the integrity of nuclear-related facilities. The Rankine model has been most frequently used to theoretically calculate the APC in a tornado. The result, however, is considered to be overly conservative because the Rankine model wind speed at the ground is larger than that in reality. On the other hand, the wind speed of the Fujita model is closer to that of actual tornadoes but is expressed by more complicated algebraic equations than those of the Rankine model. Also, because it is impossible to analytically derive the APC equation using the Fujita model, numerical computation is required. A previous study employed the finite element method (FEM) for such a purpose. However, a general-purpose FEM code often requires complicated input parameters. In order to conduct parametric studies to evaluate the integrity of facilities for various tornado cases, the finite-difference method code “TORPEC”, which is specialized to analyze the APC, was developed as a convenient design tool. TORPEC is based on Poisson’s equation derived from the Navier-Stokes equation. It also runs on widely available technical calculation software such as Microsoft® Excel VBA or MATLAB®. Taking advantage of such convenience, various calculations have been conducted to reveal the characteristics of the APC as functions of the maximum tangential wind speed, axial position and tornado radius. TORPEC is used as a benchmark in the present paper. The case study results obtained by TORPEC show a constant ratio of the pressure drop of the Fujita model against the Rankine model. This factor can be used to derive the Fujita model result from the Rankine model result without FEM analysis. (author)
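
    The Rankine-model pressure drop against which the Fujita-model results are compared has a closed form obtained by integrating cyclostrophic balance. A minimal sketch follows, with illustrative wind speed and core radius rather than values from the paper.

```python
import numpy as np

RHO = 1.226  # air density, kg/m^3

def rankine_pressure_drop(r, v_max, r_m):
    """Atmospheric-pressure drop (Pa) at radius r for a Rankine vortex.

    Integrates cyclostrophic balance dp/dr = rho*v^2/r with tangential wind
    v = v_max*r/r_m inside the core (r < r_m) and v = v_max*r_m/r outside.
    """
    r = np.asarray(r, dtype=float)
    inside = RHO * v_max**2 * (1.0 - r**2 / (2.0 * r_m**2))
    safe_r = np.maximum(r, 1e-12)            # guard r = 0 in the outer branch
    outside = RHO * v_max**2 * r_m**2 / (2.0 * safe_r**2)
    return np.where(r < r_m, inside, outside)

# Example: 100 m/s maximum tangential wind, 30 m core radius (illustrative)
dp_center = rankine_pressure_drop(0.0, 100.0, 30.0)   # central drop = rho*v_max^2
dp_core = rankine_pressure_drop(30.0, 100.0, 30.0)    # half the central drop
```

    The central pressure deficit is exactly twice the deficit at the core radius, which is the kind of closed-form result the Fujita model lacks, hence the need for a numerical code such as TORPEC.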

  14. Ex-vessel core catcher design requirements and preliminary concepts evaluation

    International Nuclear Information System (INIS)

    Friedland, A.J.; Tilbrook, R.W.

    1974-01-01

    As part of the overall study of the consequences of a hypothetical failure to scram following loss of pumping power, design requirements and preliminary concepts evaluation of an ex-vessel core catcher (EVCC) were performed. EVCC is the term applied to a class of devices whose primary objective is to provide a stable subcritical and coolable configuration within containment following a postulated accident in which it is assumed that core debris has penetrated the Reactor Vessel and Guard Vessel. Under these assumed conditions a set of functional requirements were developed for an EVCC and several concepts were evaluated. The studies were specifically directed toward the FFTF design considering the restraints imposed by the physical design and construction of the FFTF plant

  15. Information Models, Data Requirements, and Agile Data Curation

    Science.gov (United States)

    Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron

    2015-04-01

    The Planetary Data System's next generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements or statements about what is necessary for the system are collected and analyzed for input into the design stage of systems development. With the advent of big data the requirements associated with data have begun to dominate and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition the use of these data requirements during system's development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principle parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.

  16. 13 CFR 303.3 - Application requirements and evaluation criteria.

    Science.gov (United States)

    2010-01-01

    13 CFR 303.3 - Application requirements and evaluation criteria. Section 303.3, Business Credit and Assistance, ECONOMIC DEVELOPMENT ADMINISTRATION (2010-01-01). Criteria include the involvement of the Region's business leadership at each stage of the preparation of the CEDS, short-term...

  17. Modeling requirements for in situ vitrification

    International Nuclear Information System (INIS)

    MacKinnon, R.J.; Mecham, D.C.; Hagrman, D.L.; Johnson, R.W.; Murray, P.E.; Slater, C.E.; Marwil, E.S.; Weaver, R.A.; Argyle, M.D.

    1991-11-01

    This document outlines the requirements for the model being developed at the INEL which will provide analytical support for the ISV technology assessment program. The model includes representations of the electric potential field, thermal transport with melting, gas and particulate release, vapor migration, off-gas combustion and process chemistry. The modeling objectives are to (1) help determine the safety of the process by assessing the air and surrounding soil radionuclide and chemical pollution hazards, the nuclear criticality hazard, and the explosion and fire hazards, (2) help determine the suitability of the ISV process for stabilizing the buried wastes involved, and (3) help design laboratory and field tests and interpret results therefrom

  18. A required course in the development, implementation, and evaluation of clinical pharmacy services.

    Science.gov (United States)

    Skomo, Monica L; Kamal, Khalid M; Berdine, Hildegarde J

    2008-10-15

    To develop, implement, and assess a required pharmacy practice course to prepare pharmacy students to develop, implement, and evaluate clinical pharmacy services using a business plan model. Course content centered around the process of business planning and pharmacoeconomic evaluations. Selected business planning topics included literature evaluation, mission statement development, market evaluation, policy and procedure development, and marketing strategy. Selected pharmacoeconomic topics included cost-minimization analysis, cost-benefit analysis, cost-effectiveness analysis, cost-utility analysis, and health-related quality of life (HRQoL). Assessment methods included objective examinations, student participation, performance on a group project, and peer evaluation. One hundred fifty-three students were enrolled in the course. The mean scores on the objective examinations (100 points per examination) ranged from 82 to 85 points, with 25%-35% of students in the class scoring over 90, and 40%-50% of students scoring from 80 to 89. The mean scores on the group project (200 points) and classroom participation (50 points) were 183.5 and 46.1, respectively. The mean score on the peer evaluation was 30.8, with scores ranging from 27.5 to 31.7. The course provided pharmacy students with the framework necessary to develop and implement evidence-based disease management programs and to assure efficient, cost-effective utilization of pertinent resources in the provision of patient care.

  19. Non-parametric probabilistic forecasts of wind power: required properties and evaluation

    DEFF Research Database (Denmark)

    Pinson, Pierre; Nielsen, Henrik Aalborg; Møller, Jan Kloppenborg

    2007-01-01

    of a single or a set of quantile forecasts. The required and desirable properties of such probabilistic forecasts are defined and a framework for their evaluation is proposed. This framework is applied for evaluating the quality of two statistical methods producing full predictive distributions from point...
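
    One standard score for evaluating a quantile forecast of wind power, in the spirit of the evaluation framework described above, is the pinball (quantile) loss. This is a minimal sketch on synthetic data, not the paper's own framework; the Weibull-shaped power values are an assumption.

```python
import numpy as np

def pinball_loss(y, q_forecast, tau):
    """Mean pinball (quantile) loss of forecasts q_forecast at probability level tau."""
    diff = y - q_forecast
    return np.mean(np.where(diff >= 0.0, tau * diff, (tau - 1.0) * diff))

# Synthetic normalised wind power observations and two 0.9-quantile forecasts
rng = np.random.default_rng(2)
y = np.clip(rng.weibull(2.0, 1000) / 2.0, 0.0, 1.0)
calibrated = np.full(y.size, np.quantile(y, 0.9))   # close to the true 0.9 quantile
biased = np.full(y.size, np.quantile(y, 0.5))       # median misused as 0.9 quantile
score_calibrated = pinball_loss(y, calibrated, 0.9)
score_biased = pinball_loss(y, biased, 0.9)
```

    The pinball loss is minimised by the true quantile, so a well-calibrated quantile forecast scores strictly better than a biased one.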

  20. The Spiral-Interactive Program Evaluation Model.

    Science.gov (United States)

    Khaleel, Ibrahim Adamu

    1988-01-01

    Describes the spiral interactive program evaluation model, which is designed to evaluate vocational-technical education programs in secondary schools in Nigeria. Program evaluation is defined; utility oriented and process oriented models for evaluation are described; and internal and external evaluative factors and variables that define each…

  1. Process-level model evaluation: a snow and heat transfer metric

    Science.gov (United States)

    Slater, Andrew G.; Lawrence, David M.; Koven, Charles D.

    2017-04-01

    Land models require evaluation in order to understand results and guide future development. Examining functional relationships between model variables can provide insight into the ability of models to capture fundamental processes and aid in minimizing uncertainties or deficiencies in model forcing. This study quantifies the proficiency of land models to appropriately transfer heat from the soil through a snowpack to the atmosphere during the cooling season (Northern Hemisphere: October-March). Using the basic physics of heat diffusion, we investigate the relationship between seasonal amplitudes of soil versus air temperatures due to insulation from seasonal snow. Observations demonstrate the anticipated exponential relationship of attenuated soil temperature amplitude with increasing snow depth and indicate that the marginal influence of snow insulation diminishes beyond an effective snow depth of about 50 cm. A snow and heat transfer metric (SHTM) is developed to quantify model skill compared to observations. Land models within the CMIP5 experiment vary widely in SHTM scores, and deficiencies can often be traced to model structural weaknesses. The SHTM value for individual models is stable over 150 years of climate, 1850-2005, indicating that the metric is insensitive to climate forcing and can be used to evaluate each model's representation of the insulation process.
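
    The exponential attenuation of soil temperature amplitude with snow depth that underlies the SHTM can be sketched as a log-linear fit. The data here are synthetic: the roughly 50 cm effective depth is taken from the record, and all other numbers are illustrative.

```python
import numpy as np

# Synthetic cooling-season amplitude ratios: A_soil / A_air = exp(-d / d_eff)
rng = np.random.default_rng(3)
depth_cm = np.linspace(0.0, 120.0, 25)              # seasonal snow depth (cm)
d_eff_true = 50.0                                   # effective snow depth (cm)
ratio = np.exp(-depth_cm / d_eff_true) * np.exp(rng.normal(0.0, 0.05, depth_cm.size))

# Recover the effective depth from a log-linear fit of the attenuation curve
slope, _ = np.polyfit(depth_cm, np.log(ratio), 1)
d_eff_est = -1.0 / slope
```

    Beyond depths of about d_eff the marginal insulation gain is small, consistent with the record's observation that snow deeper than roughly 50 cm adds little further attenuation.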

  2. Evaluated experimental database on critical heat flux in WWER FA models

    International Nuclear Information System (INIS)

    Artamonov, S.; Sergeev, V.; Volkov, S.

    2015-01-01

    The paper presents the description of the evaluated experimental database on critical heat flux in WWER FA models of new designs. This database was developed on the basis of the experimental data obtained in the years of 2009-2012. In the course of its development, the database was reviewed in terms of completeness of the information about the experiments and its compliance with the requirements of Rostekhnadzor regulatory documents. The description of the experimental FA model characteristics and experimental conditions was specified. Besides, the experimental data were statistically processed with the aim to reject incorrect ones and the sets of experimental data on critical heat fluxes (CHF) were compared for different FA models. As a result, for the first time, the evaluated database on CHF in FA models of new designs was developed, that was complemented with analysis functions, and its main purpose is to be used in the process of development, verification and upgrading of calculation techniques. The developed database incorporates the data of 4183 experimental conditions obtained in 53 WWER FA models of various designs. Keywords: WWER reactor, fuel assembly, CHF, evaluated experimental data, database, statistical analysis. (author)

  3. Using cognitive modeling for requirements engineering in anesthesiology

    NARCIS (Netherlands)

    Pott, C; le Feber, J

    2005-01-01

    Cognitive modeling is a complexity reducing method to describe significant cognitive processes under a specified research focus. Here, a cognitive process model for decision making in anesthesiology is presented and applied in requirements engineering. Three decision making situations of

  4. Design requirements for SRB production control system. Volume 3: Package evaluation, modification and hardware

    Science.gov (United States)

    1981-01-01

    The software package evaluation was designed to analyze commercially available, field-proven, production control or manufacturing resource planning management technology and software package. The analysis was conducted by comparing SRB production control software requirements and conceptual system design to software package capabilities. The methodology of evaluation and the findings at each stage of evaluation are described. Topics covered include: vendor listing; request for information (RFI) document; RFI response rate and quality; RFI evaluation process; and capabilities versus requirements.

  5. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    Science.gov (United States)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficiently to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  6. Evaluation of clinical information modeling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives developing tools for clinical information modeling identified, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization. Clinical Information Models are specifications for representing the structure and semantic characteristics of clinical content in Electronic Health Record systems. This research defines, tests, and validates

  7. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    Full Text Available The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now, a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction on the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which helps to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users in specifying their requirements for a workflow management system.

  8. The EMEFS model evaluation. An interim report

    Energy Technology Data Exchange (ETDEWEB)

    Barchet, W.R. [Pacific Northwest Lab., Richland, WA (United States); Dennis, R.L. [Environmental Protection Agency, Research Triangle Park, NC (United States); Seilkop, S.K. [Analytical Sciences, Inc., Durham, NC (United States); Banic, C.M.; Davies, D.; Hoff, R.M.; Macdonald, A.M.; Mickle, R.E.; Padro, J.; Puckett, K. [Atmospheric Environment Service, Downsview, ON (Canada); Byun, D.; McHenry, J.N. [Computer Sciences Corp., Research Triangle Park, NC (United States); Karamchandani, P.; Venkatram, A. [ENSR Consulting and Engineering, Camarillo, CA (United States); Fung, C.; Misra, P.K. [Ontario Ministry of the Environment, Toronto, ON (Canada); Hansen, D.A. [Electric Power Research Inst., Palo Alto, CA (United States); Chang, J.S. [State Univ. of New York, Albany, NY (United States). Atmospheric Sciences Research Center

    1991-12-01

    The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs.

  9. A New Rapid Simplified Model for Urban Rainstorm Inundation with Low Data Requirements

    Directory of Open Access Journals (Sweden)

    Ji Shen

    2016-11-01

    Full Text Available This paper proposes a new rapid simplified inundation model (NRSIM) for flood inundation caused by rainstorms in an urban setting that can simulate the urban rainstorm inundation extent and depth in a data-scarce area. Drainage basins delineated from a floodplain map according to the distribution of the inundation sources serve as the calculation cells of NRSIM. To reduce data requirements and computational costs of the model, the internal topography of each calculation cell is simplified to a circular cone, and a mass conservation equation based on a volume spreading algorithm is established to simulate the interior water filling process. Moreover, an improved D8 algorithm is outlined for the simulation of water spilling between different cells. The performance of NRSIM is evaluated by comparing the simulated results with those from a traditional rapid flood spreading model (TRFSM) for various resolutions of digital elevation model (DEM) data. The results are as follows: (1) given high-resolution DEM data input, the TRFSM model has better performance in terms of precision than NRSIM; (2) the results from TRFSM are seriously affected by the decrease in DEM data resolution, whereas those from NRSIM are not; and (3) NRSIM always requires less computational time than TRFSM. Apparently, compared with the complex hydrodynamic or traditional rapid flood spreading model, NRSIM has much better applicability and cost-efficiency in real-time urban inundation forecasting for data-sparse areas.
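The interior water-filling step described above, with each cell's topography simplified to an inverted circular cone, reduces to inverting the cone-volume relation V = (pi/3)(R*h/H)^2 * h for the ponded depth h. A hedged sketch of that single step under this geometric assumption; the parameter names are illustrative and the paper's actual NRSIM equations are not reproduced here:

```python
import math

def water_depth_in_cone(volume, base_radius, cone_height):
    """Depth of ponded water in a cell idealized as an inverted circular cone
    (apex at the lowest point). Inverts V = (pi/3) * (R*h/H)**2 * h for h."""
    capacity = math.pi / 3.0 * base_radius**2 * cone_height  # full-cone volume
    if volume >= capacity:
        return cone_height  # cell is full; the excess would spill to neighbours
    return (3.0 * volume * cone_height**2 / (math.pi * base_radius**2)) ** (1.0 / 3.0)
```

Because depth grows with the cube root of stored volume, the fill computation is a closed-form expression per cell, which is what keeps such a simplified model cheap compared with hydrodynamic solvers.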

  10. Requirements-level semantics and model checking of object-oriented statecharts

    NARCIS (Netherlands)

    Eshuis, H.; Jansen, D.N.; Wieringa, Roelf J.

    2002-01-01

    In this paper we define a requirements-level execution semantics for object-oriented statecharts and show how properties of a system specified by these statecharts can be model checked using existing model-checking tool support. Our execution semantics is requirements-level because it uses the perfect

  11. Virtual reality technology as a tool for human factors requirements evaluation in design of the nuclear reactors control desks

    International Nuclear Information System (INIS)

    Grecco, Claudio H.S.; Santos, Isaac J.A.L.; Mol, Antonio C.A.; Carvalho, Paulo V.R.; Silva, Antonio C.F.; Ferreira, Francisco J.O.; Dutra, Marco A.M.

    2007-01-01

    Virtual Reality (VR) is an advanced computer interface technology that allows the user to interact with or explore a three-dimensional environment through the computer, as if they were part of the virtual world. This technology has great applicability in a wide range of areas of human knowledge. This paper presents a study on the use of VR as a tool for human factors requirements evaluation in the design of nuclear reactor control desks. Moreover, this paper presents a case study: a virtual model of the control desk, developed using virtual reality technology to be used in the human factors requirements evaluation. This case study was developed in the Virtual Reality Laboratory at IEN and comprises the stereo visualization of the Argonauta research nuclear reactor control desk for a static ergonomic evaluation using checklists, in accordance with the standards and international human factors guides for nuclear installations (IEC 1771, NUREG-0700). (author)

  12. Virtual reality technology as a tool for human factors requirements evaluation in design of the nuclear reactors control desks

    Energy Technology Data Exchange (ETDEWEB)

    Grecco, Claudio H.S.; Santos, Isaac J.A.L.; Mol, Antonio C.A.; Carvalho, Paulo V.R.; Silva, Antonio C.F.; Ferreira, Francisco J.O.; Dutra, Marco A.M. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]. E-mail: grecco@ien.gov.br; luquetti@ien.gov.br; mol@ien.gov.br; paulov@ien.gov.br; tonico@ien.gov.br; fferreira@ien.gov.br; dutra@ien.gov.br

    2007-07-01

    Virtual Reality (VR) is an advanced computer interface technology that allows the user to interact with or explore a three-dimensional environment through the computer, as if they were part of the virtual world. This technology has great applicability in a wide range of areas of human knowledge. This paper presents a study on the use of VR as a tool for human factors requirements evaluation in the design of nuclear reactor control desks. Moreover, this paper presents a case study: a virtual model of the control desk, developed using virtual reality technology to be used in the human factors requirements evaluation. This case study was developed in the Virtual Reality Laboratory at IEN and comprises the stereo visualization of the Argonauta research nuclear reactor control desk for a static ergonomic evaluation using checklists, in accordance with the standards and international human factors guides for nuclear installations (IEC 1771, NUREG-0700). (author)

  13. Value-Focused Thinking Model to Evaluate SHM System Alternatives From Military end User Requirements Point of View

    Directory of Open Access Journals (Sweden)

    Klimaszewski Sławomir

    2016-12-01

    Full Text Available The article describes a Value-Focused Thinking (VFT) model developed in order to evaluate various alternatives for the implementation of a Structural Health Monitoring (SHM) system on a military aircraft. Four SHM system alternatives are considered, based on: visual inspection (the current approach), piezoelectric (PZT) sensors, Fiber Bragg Grating (FBG) sensors, and Comparative Vacuum Monitoring (CVM) sensors. A numerical example is shown to illustrate the model's capability. Sensitivity analyses are performed on values such as Cost, Performance, Aircraft Availability, and Technology Readiness Level in order to examine the influence of these values on the overall value of structural state of awareness provided by each SHM system alternative.
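VFT evaluations of this kind typically score each alternative with a weighted additive value function, and the sensitivity analysis then perturbs the weights. A sketch under that assumption; the weights and per-alternative scores below are invented for illustration, not taken from the article:

```python
# Hedged sketch of a weighted additive value model in the spirit of VFT.
# All numbers are hypothetical placeholders.
def overall_value(scores, weights):
    """Weighted additive value of one SHM alternative; weights sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * scores[k] for k in weights)

weights = {"cost": 0.3, "performance": 0.3, "availability": 0.2, "trl": 0.2}
pzt    = {"cost": 0.6, "performance": 0.8, "availability": 0.7, "trl": 0.9}
visual = {"cost": 0.9, "performance": 0.4, "availability": 0.5, "trl": 1.0}
```

A sensitivity analysis would rerun `overall_value` over a grid of weight vectors and report where the ranking of alternatives flips.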

  14. Capturing security requirements for software systems.

    Science.gov (United States)

    El-Hadary, Hassan; El-Kassas, Sherif

    2014-07-01

    Security is often an afterthought during software development. Realizing security early, especially in the requirement phase, is important so that security problems can be tackled early enough before going further in the process, avoiding rework. A more effective approach for security requirement engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirement elicitation based on problem frames. The methodology aims at early integration of security with software development. The main goal of the methodology is to assist developers in eliciting adequate security requirements in a more systematic way during the requirement engineering process. A security catalog, based on problem frames, is constructed in order to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. We have made use of evaluation criteria to evaluate the resulting security requirements, concentrating on the identification of conflicts among requirements. We have shown that more complete security requirements can be elicited by such a methodology, in addition to the assistance offered to developers in eliciting security requirements in a more systematic way.

  15. Capturing security requirements for software systems

    Directory of Open Access Journals (Sweden)

    Hassan El-Hadary

    2014-07-01

    Full Text Available Security is often an afterthought during software development. Realizing security early, especially in the requirement phase, is important so that security problems can be tackled early enough before going further in the process, avoiding rework. A more effective approach for security requirement engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirement elicitation based on problem frames. The methodology aims at early integration of security with software development. The main goal of the methodology is to assist developers in eliciting adequate security requirements in a more systematic way during the requirement engineering process. A security catalog, based on problem frames, is constructed in order to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. We have made use of evaluation criteria to evaluate the resulting security requirements, concentrating on the identification of conflicts among requirements. We have shown that more complete security requirements can be elicited by such a methodology, in addition to the assistance offered to developers in eliciting security requirements in a more systematic way.

  16. Capturing security requirements for software systems

    Science.gov (United States)

    El-Hadary, Hassan; El-Kassas, Sherif

    2014-01-01

    Security is often an afterthought during software development. Realizing security early, especially in the requirement phase, is important so that security problems can be tackled early enough before going further in the process, avoiding rework. A more effective approach for security requirement engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirement elicitation based on problem frames. The methodology aims at early integration of security with software development. The main goal of the methodology is to assist developers in eliciting adequate security requirements in a more systematic way during the requirement engineering process. A security catalog, based on problem frames, is constructed in order to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. We have made use of evaluation criteria to evaluate the resulting security requirements, concentrating on the identification of conflicts among requirements. We have shown that more complete security requirements can be elicited by such a methodology, in addition to the assistance offered to developers in eliciting security requirements in a more systematic way. PMID:25685514

  17. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    ...There are, however, many advantages that could be harvested from such knowledge, like size, cost and efficiency improvements. In this paper a recently proposed power requirement model for active loudspeakers is experimentally validated and the model is expanded to include the closed and vented type enclosures...

  18. Resident Evaluation of a Required Telepsychiatry Clinical Experience.

    Science.gov (United States)

    Teshima, John; Hodgins, Michael; Boydell, Katherine M; Pignatiello, Antonio

    2016-04-01

    The authors explored resident experiences of telepsychiatry clinical training. This paper describes an analysis of evaluation forms completed by psychiatry residents following a required training experience in telepsychiatry. Retrospective numeric and narrative data were collected from 2005 to 2012. Using a five-point Likert-type scale (1 = strongly disagree and 5 = strongly agree), residents rated the session on the following characteristics: the overall experience, interest in participating in telepsychiatry in the future, understanding of service provision to underserved areas, telepsychiatry as a mode of service delivery, and the unique aspects of telepsychiatry work. The authors also conducted a content analysis of narrative comments in response to open-ended questions about the positive and negative aspects of the training experience. In all, 88% of residents (n = 335) completed an anonymous evaluation following their participation in telepsychiatry consultation sessions. Numeric results were mostly positive and indicated that the experience was interesting and enjoyable, enhanced interest in participating in telepsychiatry in the future, and increased understanding of providing psychiatric services to underserved communities. Narrative data demonstrated that the most valuable aspects of training included the knowledge acquired in terms of establishing rapport and engaging with patients, using the technology, working collaboratively, identifying different approaches used, and awareness of the complexity of cases. Resident desire for more training of this nature was prevalent, specifically a wish for more detail, additional time for discussion and debriefing, and further explanation of the unique aspects of telepsychiatry as a mode of delivery. More evaluation of telepsychiatry training, elective or required, is needed. The context of this training offered potential side benefits of learning about interprofessional and collaborative care for the

  19. Overview of contaminant arrival distributions as general evaluation requirements

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    The environmental consequences of subsurface contamination problems can be completely and effectively evaluated by fulfilling the following five requirements: Determine each present or future outflow boundary of contaminated groundwater; provide the location/arrival-time distributions; provide the location/outflow-quantity distributions; provide these distributions for each individual chemical or biological constituent of environmental importance; and use the arrival distributions to determine the quantity and concentration of each contaminant that will interface with the environment as time passes. The arrival distributions on which these requirements are based provide a reference point for communication among scientists and public decision makers by enabling complicated scientific analyses to be presented as simple summary relationships

  20. Evaluating the power consumption of wireless sensor network applications using models.

    Science.gov (United States)

    Dâmaso, Antônio; Freitas, Davi; Rosa, Nelson; Silva, Bruno; Maciel, Paulo

    2013-03-13

    Power consumption is the main concern in developing Wireless Sensor Network (WSN) applications. Consequently, several strategies have been proposed for investigating the power consumption of this kind of application. These strategies can help to predict the WSN lifetime, provide recommendations to application developers, and may optimize the energy consumed by WSN applications. While measurement is a well-known and precise strategy for power consumption evaluation, it is very costly, tedious, and may be unfeasible considering the (usually) large number of WSN nodes. Furthermore, due to the inherent dynamism of WSNs, the instrumentation required by measurement techniques makes their use difficult in several different scenarios. In this context, this paper presents an approach for evaluating the power consumption of WSN applications by using simulation models, along with a set of tools to automate the proposed approach. Starting from programming language code, we automatically generate consumption models used to predict the power consumption of WSN applications. In order to evaluate the proposed approach, we compare the results obtained by using the generated models against those obtained by measurement.
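Consumption models of this kind are often state-based: a node's total energy is the sum, over radio/CPU states, of per-state power draw times the time spent in that state. A minimal illustrative sketch; the power figures are invented placeholders, not the paper's generated models:

```python
# Hedged sketch of a state-based power consumption model for a WSN node.
# Power draws (milliwatts) per state are hypothetical example values.
STATE_POWER_MW = {"sleep": 0.02, "idle": 1.3, "rx": 60.0, "tx": 55.0}

def energy_mj(schedule):
    """Energy in millijoules for a schedule of (state, seconds) intervals."""
    return sum(STATE_POWER_MW[state] * seconds for state, seconds in schedule)

# Example schedule: 2 s transmitting, then 10 s asleep.
consumed = energy_mj([("tx", 2.0), ("sleep", 10.0)])
```

Tools that generate such models from application code essentially derive the `schedule` (state residence times) from the program's control flow, then evaluate a sum like this to predict lifetime.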

  1. Model-based Assessment for Balancing Privacy Requirements and Operational Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Knirsch, Fabian [Salzburg Univ. (Austria); Engel, Dominik [Salzburg Univ. (Austria); Frincu, Marc [Univ. of Southern California, Los Angeles, CA (United States); Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States)

    2015-02-17

    The smart grid changes the way energy is produced and distributed. In addition, both energy and information are exchanged bidirectionally among participating parties. Therefore, heterogeneous systems have to cooperate effectively in order to achieve a common high-level use case, such as smart metering for billing or demand response for load curtailment. Furthermore, a substantial amount of personal data is often needed for achieving that goal. Capturing and processing personal data in the smart grid increases customer concerns about privacy, and in addition, certain statutory and operational requirements regarding privacy-aware data processing and storage have to be met. An increase in privacy constraints, however, often limits the operational capabilities of the system. In this paper, we present an approach that automates the process of finding an optimal balance between privacy requirements and operational requirements in a smart grid use case and application scenario. This is achieved by formally describing use cases in an abstract model and by finding an algorithm that determines the optimum balance by forward mapping privacy and operational impacts. For this optimal balancing algorithm, both a numeric approximation and, if feasible, an analytic assessment are presented and investigated. The system is evaluated by applying the tool to a real-world use case from the University of Southern California (USC) microgrid.

  2. Individual model evaluation and probabilistic weighting of models

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-01-01

    This note stresses the importance of trying to assess the accuracy of each model individually. Putting a Bayesian probability distribution on a population of models faces conceptual and practical complications, and apparently can come only after the work of evaluating the individual models. Moreover, the primary issue is "How good is this model?" Therefore, the individual evaluations are first in both chronology and importance. They are not easy, but some ideas are given here on how to perform them.

  3. Responsive and Open Learning Environments (ROLE): Requirements, Evaluation and Reflection

    Directory of Open Access Journals (Sweden)

    Effie Lai-Chong Law

    2013-02-01

    Full Text Available Coordinating requirements engineering (RE) and evaluation studies across heterogeneous technology-enhanced learning (TEL) environments is deemed challenging, because each of them is situated in a specific organizational, technical and socio-cultural context. We have dealt with such challenges in the ROLE project (http://www.role-project.eu/), in which five test-beds are involved in deploying and evaluating Personal Learning Environments (PLEs). They include Higher Education Institutions (HEIs) and global enterprises in and beyond Europe, representing a range of values and assumptions. While the diversity provides fertile grounds for validating our research ideas, it poses many challenges for conducting comparison studies. In the paper, we first provide an overview of the ROLE project, focusing on its missions and aims. Next we present a Web 2.0-inspired RE approach called Social Requirements Engineering (SRE). Then we depict our initial attempts to evaluate the ROLE framework and report some preliminary findings. One major outcome is that the technology adoption process must work on the basis of existing LMS, extending them with the ROLE functionality rather than embracing LMS functionality in ROLE.

  4. Requirements for an evaluation infrastructure for reliable pervasive healthcare research

    DEFF Research Database (Denmark)

    Wagner, Stefan Rahr; Toftegaard, Thomas Skjødeberg; Bertelsen, Olav W.

    2012-01-01

    The need for a non-intrusive evaluation infrastructure platform to support research on reliable pervasive healthcare in the unsupervised setting is analyzed and challenges and possibilities are identified. A list of requirements is presented and a solution is suggested that would allow researchers...

  5. Rock mechanics models evaluation report

    International Nuclear Information System (INIS)

    1987-08-01

    This report documents the evaluation of the thermal and thermomechanical models and codes for repository subsurface design and for design constraint analysis. The evaluation was based on a survey of the thermal and thermomechanical codes and models that are applicable to subsurface design, followed by a Kepner-Tregoe (KT) structured decision analysis of the codes and models. The primary recommendations of the analysis are that the DOT code be used for two-dimensional thermal analysis and that the STEALTH and HEATING 5/6 codes be used for three-dimensional and complicated two-dimensional thermal analysis. STEALTH and SPECTROM 32 are recommended for thermomechanical analyses. The other evaluated codes should be considered for use in certain applications. A separate review of salt creep models indicates that the commonly used exponential time law model is appropriate for use in repository design studies. 38 refs., 1 fig., 7 tabs.

  6. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-11-01

    Full Text Available The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we review existing probabilistic forecast verification methods and apply them to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: Generalized Likelihood Uncertainty Estimation (GLUE) and the Shuffled Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service Sacramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information beyond what is already available as part of traditional model validation methodology and considers the entire ensemble, or uncertainty range, in the approach.
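One widely used probabilistic verification measure that transfers directly from forecast verification to simulation ensembles is the continuous ranked probability score (CRPS), which for an m-member ensemble x and observation y can be estimated as mean_i|x_i - y| - (1/2) * mean_{i,j}|x_i - x_j|. A sketch of that standard estimator (it is offered here as a representative metric, not necessarily part of the paper's exact metric set):

```python
# Hedged sketch: empirical CRPS of an ensemble against one observation.
# Lower is better; for a single member it reduces to absolute error.
def crps_ensemble(members, obs):
    """CRPS estimated directly from ensemble members (NRG form)."""
    m = len(members)
    spread_to_obs = sum(abs(x - obs) for x in members) / m
    internal_spread = sum(abs(a - b) for a in members for b in members) / (2 * m * m)
    return spread_to_obs - internal_spread
```

The second term rewards ensembles that are appropriately spread, which is exactly the property deterministic measures applied to the ensemble mean cannot capture.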

  7. A Compositional Knowledge Level Process Model of Requirements Engineering

    NARCIS (Netherlands)

    Herlea, D.E.; Jonker, C.M.; Treur, J.; Wijngaards, W.C.A.

    2002-01-01

    In current literature few detailed process models for Requirements Engineering are presented: usually high-level activities are distinguished, without a more precise specification of each activity. In this paper the process of Requirements Engineering has been analyzed using knowledge-level

  8. Psychosocial adjustment of children with chronic illness: an evaluation of three models.

    Science.gov (United States)

    Gartstein, M A; Short, A D; Vannatta, K; Noll, R B

    1999-06-01

    This study was designed to assess social, emotional, and behavioral functioning of children with chronic illness and to evaluate three models addressing the impact of chronic illness on psychosocial functioning: discrete disease, noncategorical, and mixed. Families of children with cancer, sickle cell disease, hemophilia, and juvenile rheumatoid arthritis participated, along with families of classroom comparison peers without a chronic illness who had the closest date of birth and were of the same race and gender (COMPs). Mothers, fathers, and children provided information regarding current functioning of the child with chronic illness or the COMP child. Child Behavior Checklist and Children's Depression Inventory scores were examined. Results provided support for the noncategorical model. Thus, the mixed model evaluated in this study requires modifications before its effectiveness as a classification system can be demonstrated.

  9. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Eve...... requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach....

  10. Requirements for the evaluation of computational speech segregation systems

    DEFF Research Database (Denmark)

    May, Tobias; Dau, Torsten

    2014-01-01

    Recent studies on computational speech segregation reported improved speech intelligibility in noise when estimating and applying an ideal binary mask with supervised learning algorithms. However, an important requirement for such systems in technical applications is their robustness to acoustic...... associated with perceptual attributes in speech segregation. The results could help establish a framework for a systematic evaluation of future segregation systems....

  11. Evaluation and hydrological modelization in the natural hazard prevention

    International Nuclear Information System (INIS)

    Pla Sentis, Ildefonso

    2011-01-01

    Soil degradation negatively affects the soil's functions as a base to produce food, to regulate the hydrological cycle, and to maintain environmental quality. All over the world soil degradation is increasing, partly due to gaps or deficiencies in the evaluation of the processes and causes of this degradation in each specific situation. The processes of soil physical degradation are manifested through several problems such as compaction, runoff, hydric and aeolian erosion, and landslides, with collateral effects in situ and at a distance, often with disastrous consequences such as floods, landslides, sedimentation, droughts, etc. These processes are frequently associated with unfavorable changes in the hydrologic processes responsible for the water balance and soil hydric regimes, mainly derived from land use changes, different management practices, and climatic changes. The evaluation of these processes using simple simulation models, under several scenarios of climatic change, soil properties, and land use and management, would allow prediction of the occurrence of these disastrous processes and, consequently, the selection and application of the appropriate soil conservation practices to eliminate or reduce their effects. These simulation models require, as a base, detailed climatic information and hydrologic soil property data. Despite the existence of methodologies and commercial equipment (ever more sophisticated and precise) to measure the different physical and hydrological soil properties related to degradation processes, most of them are only applicable under very specific or laboratory conditions. Often indirect methodologies are used, based on relations or empirical indexes without adequate validation, which often lead to expensive mistakes in the evaluation of soil degradation processes and their effects on natural disasters. Simple field methodologies could be preferred, direct and adaptable to different soil types and climates and to the sample size and the spatial variability of the

  12. Adapting Evaluations of Alternative Payment Models to a Changing Environment.

    Science.gov (United States)

    Grannemann, Thomas W; Brown, Randall S

    2018-04-01

    To identify the most robust methods for evaluating alternative payment models (APMs) in the emerging health care delivery system environment. We assess the impact of widespread testing of alternative payment models on the ability to find credible comparison groups. We consider the applicability of factorial research designs for assessing the effects of these models. The widespread adoption of alternative payment models could effectively eliminate the possibility of comparing APM results with a "pure" control or comparison group unaffected by other interventions. In this new environment, factorial experiments have distinct advantages over the single-model experimental or quasi-experimental designs that have been the mainstay of recent tests of Medicare payment and delivery models. The best prospects for producing definitive evidence of the effects of payment incentives for APMs include fractional factorial experiments that systematically vary requirements and payment provisions within a payment model. © Health Research and Educational Trust.

  13. Reuse-centric Requirements Analysis with Task Models, Scenarios, and Critical Parameters

    Directory of Open Access Journals (Sweden)

    Cyril Montabert

    2007-02-01

    Full Text Available This paper outlines a requirements-analysis process that unites task models, scenarios, and critical parameters to exploit and generate reusable knowledge at the requirements phase. Through the deployment of a critical-parameter-based approach to task modeling, the process establishes an integrative, formalized model derived from scenarios that can be used for requirements characterization. Furthermore, not only can this entity serve as an interface to a knowledge repository relying on a critical-parameter-based taxonomy to support reuse, but its characterization in terms of critical parameters also allows the model to constitute a broader reuse solution. We discuss our vision for a user-centric and reuse-centric approach to requirements analysis, present previous efforts associated with this line of work, and describe the revisions made to extend the reuse potential and effectiveness of a previous iteration of a requirements tool implementing this process. Finally, the paper describes the sequence and nature of the activities involved in conducting our proposed requirements-analysis technique, concluding with a preview of ongoing work in the field that will explore the feasibility of designers using our approach.

  14. Evaluation of three energy balance-based evaporation models for estimating monthly evaporation for five lakes using derived heat storage changes from a hysteresis model

    Science.gov (United States)

    Duan, Zheng; Bastiaanssen, W. G. M.

    2017-02-01

    The heat storage change (Qt) can be a significant component of the energy balance in lakes, and it is important to account for Qt to obtain reasonable evaporation estimates at monthly and finer timescales when energy balance-based evaporation models are used. However, Qt has often been neglected in many studies due to the lack of the required water temperature data. A simple hysteresis model (Qt = a*Rn + b + c*dRn/dt) has been demonstrated to reasonably estimate Qt from the readily available net all-wave radiation (Rn) and three locally calibrated coefficients (a-c) for lakes and reservoirs. As a follow-up study, we evaluated whether this hysteresis model could enable energy balance-based evaporation models to yield good evaporation estimates. Representative monthly evaporation data were compiled from published literature and used as ground truth to evaluate three energy balance-based evaporation models for five lakes. The three models, of differing complexity, are De Bruin-Keijman (DK), Penman, and a new model referred to as Duan-Bastiaanssen (DB). All three models require Qt as input. Each model was run in three scenarios differing in the input Qt (S1: measured Qt; S2: modelled Qt from the hysteresis model; S3: neglecting Qt) to evaluate the impact of Qt on the modelled evaporation. The evaluation showed that the modelled Qt agreed well with its measured counterpart for all five lakes, confirming that the hysteresis model with locally calibrated coefficients can predict Qt with good accuracy for the same lake. Using modelled Qt as input, all three evaporation models yielded monthly evaporation comparable to that obtained with measured Qt, and significantly better than that obtained when neglecting Qt, for the five lakes. The DK model, requiring the minimum data, generally performed best, followed by the Penman and DB models.
This study demonstrated that, once the three coefficients are locally calibrated using historical data, the simple hysteresis model can offer
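The hysteresis model quoted above (Qt = a*Rn + b + c*dRn/dt) is simple enough to sketch directly; the code below approximates dRn/dt with finite differences over a monthly Rn series. The coefficient values are placeholders, since the paper calibrates a-c locally for each lake.

```python
import math

# Sketch of the hysteresis heat-storage model Qt = a*Rn + b + c*dRn/dt,
# with dRn/dt approximated by finite differences over a monthly Rn series.
# Coefficients a-c below are placeholders; the paper calibrates them per lake.

def heat_storage(rn, a, b, c, dt=1.0):
    """Estimate heat storage change Qt from a net-radiation series rn."""
    qt = []
    for i in range(len(rn)):
        lo, hi = max(i - 1, 0), min(i + 1, len(rn) - 1)
        drn_dt = (rn[hi] - rn[lo]) / ((hi - lo) * dt)  # central/one-sided diff
        qt.append(a * rn[i] + b + c * drn_dt)
    return qt

# Example: a smooth seasonal Rn cycle (W m^-2) with illustrative coefficients.
rn = [150 + 100 * math.sin(2 * math.pi * m / 12) for m in range(12)]
qt = heat_storage(rn, a=0.6, b=-20.0, c=15.0)
```

The dRn/dt term is what produces the hysteresis: Qt differs between the rising and falling limbs of the seasonal radiation cycle even at equal Rn.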

  15. Evidence used in model-based economic evaluations for evaluating pharmacogenetic and pharmacogenomic tests: a systematic review protocol.

    Science.gov (United States)

    Peters, Jaime L; Cooper, Chris; Buchanan, James

    2015-11-11

    Decision models can be used to conduct economic evaluations of new pharmacogenetic and pharmacogenomic tests to ensure they offer value for money to healthcare systems. These models require a great deal of evidence, yet research suggests the evidence used is diverse and of uncertain quality. By conducting a systematic review, we aim to investigate the test-related evidence used to inform decision models developed for the economic evaluation of genetic tests. We will search electronic databases including MEDLINE, EMBASE and NHS EEDs to identify model-based economic evaluations of pharmacogenetic and pharmacogenomic tests. The search will not be limited by language or date. Title and abstract screening will be conducted independently by 2 reviewers, with screening of full texts and data extraction conducted by 1 reviewer, and checked by another. Characteristics of the decision problem, the decision model and the test evidence used to inform the model will be extracted. Specifically, we will identify the reported evidence sources for the test-related evidence used, describe the study design and how the evidence was identified. A checklist developed specifically for decision analytic models will be used to critically appraise the models described in these studies. Variations in the test evidence used in the decision models will be explored across the included studies, and we will identify gaps in the evidence in terms of both quantity and quality. The findings of this work will be disseminated via a peer-reviewed journal publication and at national and international conferences. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  16. FTR power-to-melt study. Phase I. Evaluation of deterministic models

    International Nuclear Information System (INIS)

    1978-09-01

    SIEX is an HEDL fuel thermal performance code. It is designed to be a fast running code for steady state thermal performance analysis of coolant, cladding and fuel temperature throughout the history of a fuel element. The need to have a good predictive model coupled with short running time in the probabilistic analysis has made SIEX one of the potential deterministic models to be adopted. The probabilistic code to be developed will be a general thermal performance code acceptable on a national basis. It is, therefore, necessary to ensure that the physical model meets the requirements of an analytical tool. Since SIEX incorporates some physically-based correlated models, its general validity and limitations should be evaluated before being adopted

  17. A Tire Model for Off-Highway Vehicle Simulation and Comfort Evaluation

    DEFF Research Database (Denmark)

    Langer, Thomas Heegaard; Mouritsen, Ole Ø.; Ebbesen, Morten Kjeld

    2009-01-01

    Manufacturers of construction machinery are challenged to accommodate legal requirements on the vibration exposure associated with their products. Hence, the ability to evaluate ride comfort by virtual prototyping is needed. One of the derived necessities is a modeling approach that can handle big...... off-road tires on irregular terrain and even the passing of sharp corner obstacles. In this paper a simple tire model combining the well known slip theory and a displaced volume approach is presented. A non-gradient optimization routine is applied for parameter identification by minimizing...

  18. Irrigation Requirement Estimation Using Vegetation Indices and Inverse Biophysical Modeling

    Science.gov (United States)

    Bounoua, Lahouari; Imhoff, Marc L.; Franks, Shannon

    2010-01-01

    We explore an inverse biophysical modeling process forced by satellite and climatological data to quantify irrigation requirements in semi-arid agricultural areas. We constrain the carbon and water cycles modeled under both equilibrium (balance between vegetation and climate) and non-equilibrium (water added through irrigation) conditions. We postulate that the degree to which irrigated drylands depart from equilibrium climate conditions is related to the amount of irrigation. The amount of water required over and above precipitation is considered the irrigation requirement. For July, results show that spray irrigation added 1.3 mm of water per application, with applications every 24.6 hours. In contrast, drip irrigation required only 0.6 mm every 45.6 hours, or 46% of the per-application amount simulated for spray irrigation. The modeled estimates account for 87% of the total reported irrigation water use where soil salinity is not important, and 66% in saline lands.
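A quick arithmetic check of the figures quoted above; the daily-rate comparison is our own illustration, not a number reported in the abstract.

```python
# Figures from the abstract: per-application depth and interval for each method.
spray_mm, spray_interval_h = 1.3, 24.6
drip_mm, drip_interval_h = 0.6, 45.6

# The quoted 46% is the ratio of per-application depths.
per_event_ratio = drip_mm / spray_mm                 # ~0.46

# On an equivalent daily basis drip applies even less water, because its
# applications are also less frequent (our comparison, not the abstract's).
spray_mm_per_day = spray_mm * 24 / spray_interval_h  # ~1.27 mm/day
drip_mm_per_day = drip_mm * 24 / drip_interval_h     # ~0.32 mm/day
```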

  19. Local difference measures between complex networks for dynamical system model evaluation.

    Science.gov (United States)

    Lange, Stefan; Donges, Jonathan F; Volkholz, Jan; Kurths, Jürgen

    2015-01-01

    A faithful modeling of real-world dynamical systems necessitates model evaluation. A recent promising methodological approach to this problem has been based on complex networks, which in turn have proven useful for the characterization of dynamical systems. In this context, we introduce three local network difference measures and demonstrate their capabilities in the field of climate modeling, where these measures facilitate a spatially explicit model evaluation. Building on a recent study by Feldhoff et al. [8], we comparatively analyze statistical and dynamical regional climate simulations of the South American monsoon system. Several types of climate networks representing different aspects of rainfall dynamics are constructed from the modeled precipitation space-time series. Specifically, we define simple graphs based on positive as well as negative rank correlations between rainfall anomaly time series at different locations, and others based on spatial synchronizations of extreme rain events. An evaluation against respective networks built from daily satellite data provided by the Tropical Rainfall Measuring Mission 3B42 V7 reveals far greater differences in model performance between network types for a fixed but arbitrary climate model than between climate models for a fixed but arbitrary network type. We identify two sources of uncertainty in this respect. Firstly, climate variability limits fidelity, particularly in the case of the extreme event network; and secondly, larger geographical link lengths render link misplacements more likely, most notably in the case of the anticorrelation network; both contributions are quantified using suitable ensembles of surrogate networks. Our model evaluation approach is applicable to any multidimensional dynamical system, and especially our simple graph difference measures are highly versatile, as the graphs to be compared may be constructed in whatever way required.
Generalizations to directed as well as edge- and node
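As an illustration of the approach, the sketch below builds rank-correlation networks from two sets of space-time series and compares them with a simple local difference measure (the per-node fraction of mismatched links). This is our own minimal construction, not the paper's exact measures or thresholds.

```python
import numpy as np

# Minimal sketch: build rank-correlation climate networks from observed and
# modeled space-time series, then compare them node by node.

def rank_corr_network(series, threshold=0.5):
    """Adjacency matrix: link i-j where the rank correlation exceeds threshold."""
    ranks = np.argsort(np.argsort(series, axis=1), axis=1).astype(float)
    corr = np.corrcoef(ranks)          # Pearson on ranks = Spearman correlation
    return (corr > threshold) & ~np.eye(len(series), dtype=bool)

def local_link_difference(adj_a, adj_b):
    """Per-node fraction of links present in one network but not the other."""
    return (adj_a ^ adj_b).sum(axis=1) / (adj_a.shape[0] - 1)

rng = np.random.default_rng(0)
obs = rng.standard_normal((5, 100))              # 5 "locations", 100 time steps
mod = obs + 0.3 * rng.standard_normal((5, 100))  # a "model" with added noise
diff = local_link_difference(rank_corr_network(obs), rank_corr_network(mod))
```

Because the difference is computed per node, it can be mapped back onto the spatial grid, which is what makes the evaluation spatially explicit.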

  20. Data modeling and evaluation

    International Nuclear Information System (INIS)

    Bauge, E.; Hilaire, S.

    2006-01-01

    This lecture is devoted to the nuclear data evaluation process, during which the current knowledge (experimental or theoretical) of nuclear reactions is condensed and synthesised into a computer file (the evaluated data file) that application codes can process and use for simulation calculations. After an overview of the content of evaluated nuclear data files, we describe the different methods used for evaluating nuclear data. We specifically focus on the model-based approach, which we use to evaluate data in the continuum region. A few examples, drawn from the day-to-day practice of data evaluation, illustrate this lecture. Finally, we discuss the most likely perspectives for improvement of the evaluation process in the next decade. (author)

  1. Historical model evaluation data requirements

    International Nuclear Information System (INIS)

    Simpson, B.C.; McCain, D.J.

    1995-01-01

    Several studies about tank waste contents have been published using historical records of tank transactions and various analytical measurements. While these records offer a wealth of information, the results are questionable until error estimates associated with the results can be established. However, they do provide a direction for investigation. Two principal observations from the studies are: (1) Large quantities of individual waste types from the various separations processes were widely distributed throughout the tank farms, and (2) The compositions of many of these waste types are quite distinct from one another. A key assumption associated with these observations is that the effects of time and location on the tank wastes are either nominal or not discernable. Since each waste type has a distinct composition, it would benefit all programs to better quantify that composition, and establish an uncertainty for each element of that composition. Various process, disposal, or other decisions could then be made based on current information reducing the need for extended sampling and analysis

  2. Li-NMC Batteries Model Evaluation with Experimental Data for Electric Vehicle Application

    Directory of Open Access Journals (Sweden)

    Aleksandra Baczyńska

    2018-02-01

    Full Text Available The aim of this paper is to present a battery equivalent circuit for electric vehicle applications; the model described is dedicated to lithium-ion batteries. The purpose is to introduce an efficient and transparent method for developing a battery equivalent circuit model. Battery modeling requires, depending on the chosen method, either significant calculation or a highly developed mathematical model for optimization. The model is evaluated against real measurement data to demonstrate the performance of the method. Battery measurements based on charge/discharge tests at a fixed C-rate are presented to show the relation of the output voltage profiles to the battery state of charge. A pulse discharge test is presented to obtain the electric parameters of the battery equivalent circuit model, using a Thévenin circuit. Based on the characteristics of the Reverse Trike Ecologic Electric Vehicle (VEECO RT), used as a case study in this work, new values for vehicle autonomy and battery pack volume based on lithium nickel manganese cobalt oxide cells are evaluated.
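A first-order Thévenin equivalent circuit of the kind described can be simulated in a few lines; the parameter values below are invented for illustration, whereas the paper identifies them from pulse-discharge tests.

```python
# Illustrative first-order Thévenin model: open-circuit voltage in series with
# ohmic resistance R0 and one parallel RC branch (R1, C1). Parameter values
# are invented; the paper identifies them from pulse-discharge tests.

def simulate_thevenin(current, dt, voc=3.7, r0=0.05, r1=0.02, c1=2000.0):
    """Terminal voltage for a current profile (A, positive = discharge)."""
    v_rc = 0.0                      # voltage across the RC branch
    terminal = []
    for i_load in current:
        # RC branch: dv/dt = i/C1 - v/(R1*C1), explicit Euler step
        v_rc += dt * (i_load / c1 - v_rc / (r1 * c1))
        terminal.append(voc - r0 * i_load - v_rc)
    return terminal

# 10 A discharge pulse for 60 s, then rest: voltage sags, then relaxes.
profile = [10.0] * 60 + [0.0] * 60
v = simulate_thevenin(profile, dt=1.0)
```

The instantaneous sag (R0) and the slow relaxation after the pulse (R1, C1) are exactly the features a pulse-discharge test exposes for parameter identification.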

  3. Evaluation of an objective plan-evaluation model in the three dimensional treatment of nonsmall cell lung cancer

    International Nuclear Information System (INIS)

    Graham, Mary V.; Jain, Nilesh L.; Kahn, Michael G.; Drzymala, Robert E.; Purdy, James A.

    1996-01-01

    Purpose: Evaluation of three-dimensional (3D) radiotherapy plans is difficult because it requires the review of vast amounts of data. Selecting the optimal plan from a set of competing plans involves making trade-offs among the doses delivered to the target volumes and normal tissues. The purpose of this study was to test an objective plan-evaluation model and evaluate its clinical usefulness in 3D treatment planning for nonsmall cell lung cancer. Methods and Materials: Twenty patients with inoperable nonsmall cell lung cancer treated with definitive radiotherapy were studied using full 3D techniques for treatment design and implementation. For each patient, the evaluator (the treating radiation oncologist) initially ranked three plans using room-view dose-surface displays and dose-volume histograms, and identified the issues that needed to be improved. The three plans were then ranked by the objective plan-evaluation model. A figure of merit (FOM) was computed for each plan by combining the numerical score (utility in decision-theoretic terms) for each clinical issue. The utility was computed from a probability of occurrence of the issue and a physician-specific weight indicating its clinical relevance. The FOM was used to rank the competing plans for a patient, and the utility was used to identify issues that needed to be improved. These were compared with the initial evaluations of the physician and discrepancies were analyzed. The issues identified in the best treatment plan were then used to attempt further manual optimization of this plan. Results: For the 20 patients (60 plans) in the study, the final plan ranking produced by the plan-evaluation model had an initial 73% agreement with the ranking provided by the evaluator. After discrepant cases were reviewed by the physician, the model was usually judged more objective or 'correct'. In most cases the model was also able to correctly identify the issues that needed improvement in each plan. Subsequent
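The figure-of-merit computation described above can be sketched as follows; plan names, probabilities, and weights are invented, and we treat utilities as expected penalties (so lower combined scores rank better), which is one plausible reading of the model.

```python
# Sketch of the figure-of-merit idea (our formulation): each clinical issue
# contributes a utility = probability of occurrence x physician-assigned
# weight, and plans are ranked by the combined score (lower = less expected
# harm). Plan names and numbers are invented for illustration.

def figure_of_merit(issues):
    """Combine per-issue utilities into a single score for a plan."""
    return sum(prob * weight for prob, weight in issues)

plans = {
    "plan_A": [(0.10, 8.0), (0.30, 2.0)],   # (probability, clinical weight)
    "plan_B": [(0.05, 8.0), (0.60, 2.0)],
    "plan_C": [(0.20, 8.0), (0.10, 2.0)],
}
ranking = sorted(plans, key=lambda name: figure_of_merit(plans[name]))
```

Besides the ranking, the individual per-issue utilities point to which issue dominates a plan's score, mirroring how the model flags issues needing improvement.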

  4. Evaluation of the effect of torsemide on warfarin dosage requirements.

    Science.gov (United States)

    Lai, Sophia; Momper, Jeremiah D; Yam, Felix K

    2017-08-01

    Background According to drug interaction databases, torsemide may potentiate the effects of warfarin. Evidence for this drug-drug interaction, however, is conflicting and the clinical significance is unknown. Objective The aim of this study is to evaluate the impact of torsemide initiation on warfarin dosage requirements. Setting This study was conducted at the Veterans Affairs Healthcare System in San Diego, California. Method A retrospective cohort study was conducted using Veterans Affairs data from patients who were converted from bumetanide to torsemide between March 2014 and July 2014. Patients were also prescribed and taking warfarin during the observation period. Warfarin dosage requirements were evaluated to determine if any changes occurred within the first 3 months of starting torsemide. Main outcome measure The primary outcome was the average weekly warfarin dose before and after torsemide initiation. Results Eighteen patients met study inclusion criteria. The weekly warfarin dose before and after initiation of torsemide was not significantly different (34 ± 15 and 34 ± 13 mg, p > 0.05). Of those eighteen patients, only two experienced elevations in INR that required a decrease in warfarin dosage after torsemide initiation. Between those two patients, dosage reductions ranged from 5.3 to 18%. Conclusion These results indicated that most patients did not require any warfarin dosage adjustments after torsemide was initiated. The potential for interaction, however, still exists. While empiric warfarin dosage adjustments are not recommended when initiating torsemide, increased monitoring is warranted to minimize the risk of adverse effects.

  5. Technical evaluation of Tom Scurry Associates: Model PM-203 doorway monitor

    International Nuclear Information System (INIS)

    1978-07-01

    Under a basic assignment from the Office of Safeguards and Security, the Tom Scurry Associates Model PM-203 Personnel Doorway SNM Monitor was evaluated by LASL Group Q-2 against the DOE personnel doorway monitor standards. During the evaluation, a small change in detector shielding was required to eliminate low-sensitivity areas at the portal sides. With the modified shielding described, the PM-203 meets the Office of Safeguards and Security SNM doorway monitor specifications for detecting either ²³⁵U or ²³⁹Pu-²³³U. This system is also capable of monitoring ²³⁸Pu

  6. Bioprocesses: Modelling needs for process evaluation and sustainability assessment

    DEFF Research Database (Denmark)

    Jiménez-Gonzaléz, Concepcion; Woodley, John

    2010-01-01

    development such that they can also be used to evaluate processes against sustainability metrics, as well as economics as an integral part of assessments. Finally, property models will also be required based on compounds not currently present in existing databases. It is clear that many new opportunities......The next generation of process engineers will face a new set of challenges, with the need to devise new bioprocesses, with high selectivity for pharmaceutical manufacture, and for lower value chemicals manufacture based on renewable feedstocks. In this paper the current and predicted future roles...... of process system engineering and life cycle inventory and assessment in the design, development and improvement of sustainable bioprocesses are explored. The existing process systems engineering software tools will prove essential to assist this work. However, the existing tools will also require further...

  7. Estimates of nutritional requirements and use of Small Ruminant Nutrition System model for hair sheep in semiarid conditions

    Directory of Open Access Journals (Sweden)

    Alessandra Pinto de Oliveira

    2014-09-01

    Full Text Available The objective was to determine the efficiency of utilization of metabolizable energy for maintenance (km) and weight gain (kf) and the dietary requirements of total digestible nutrients (TDN) and metabolizable protein (MP), as well as to evaluate the Small Ruminant Nutrition System (SRNS) model for predicting the dry matter intake (DMI) and average daily gain (ADG) of Santa Ines lambs fed diets containing different levels of metabolizable energy (ME). Thirty-five non-castrated lambs, with an initial body weight (BW) of 14.77 ± 1.26 kg at approximately two months of age, were used. At the beginning of the experiment, five animals were slaughtered to serve as a reference for estimating the empty body weight (EBW) and initial body composition of the 30 remaining animals, which were distributed in a randomized block design with five treatments (1.13, 1.40, 1.73, 2.22 and 2.60 Mcal/kg DM) and six replicates. The requirement of metabolizable energy for maintenance was 78.53 kcal/kg EBW0.75/day, with a utilization efficiency of 66%. The average efficiency of metabolizable energy utilization for weight gain was 48%. The dietary requirements of TDN and MP increased with the BW and ADG of the animals. The SRNS model underestimated the DMI and ADG of the animals by 6.2% and 24.6%, respectively. We conclude that the values of km and kf are consistent with those observed in several studies with lambs raised in the tropics. The dietary requirements of TDN and MP of Santa Ines lambs for different BW and ADG are approximately 42% and 24% lower, respectively, than those suggested by the American system for evaluating food and nutrient requirements of small ruminants. The SRNS model was sensitive in predicting DMI in Santa Ines lambs; however, for ADG more studies are needed, since the model underestimated the response of the animals in this study.
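A worked example of the maintenance-energy figure reported above (78.53 kcal/kg EBW^0.75/day with km = 66%); the lamb weight chosen below is an arbitrary illustration.

```python
# Worked example of the reported maintenance requirement:
# ME_m = 78.53 kcal per kg EBW^0.75 per day, used with efficiency km = 66%.

def me_maintenance(ebw_kg, coef=78.53):
    """Daily metabolizable energy requirement for maintenance (kcal/day)."""
    return coef * ebw_kg ** 0.75

ebw = 20.0                  # empty body weight of a lamb, kg (arbitrary example)
me = me_maintenance(ebw)    # ~743 kcal ME/day
ne = 0.66 * me              # corresponding net energy at km = 66%
```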

  8. Evaluation of Models of the Reading Process.

    Science.gov (United States)

    Balajthy, Ernest

    A variety of reading process models have been proposed and evaluated in reading research. Traditional approaches to model evaluation specify the workings of a system in a simplified fashion to enable organized, systematic study of the system's components. Following are several statistical methods of model evaluation: (1) empirical research on…

  9. Evaluation and qualification of novel control techniques with safety requirements

    International Nuclear Information System (INIS)

    Gossner, S.; Wach, D.

    1985-01-01

    The paper discusses questions related to the assessment and qualification of new I and C systems. The tasks of nuclear power plant I and C, as well as the efficiency of the new techniques, are reviewed. Problems with the application of new I and C systems, and the state of their application in Germany and abroad, are addressed. Starting from the essential differences between conventional and new I and C systems, it is evaluated whether, and in which way, existing safety requirements can be met and to what extent new requirements need to be formulated. An overall concept has to be developed comprising the definition of graded requirement profiles for design and qualification. The associated qualification procedures and tools have to be adapted, developed and tuned to each other. (orig./HP) [de

  10. Comparative approaches from empirical to mechanistic simulation modelling in Land Evaluation studies

    Science.gov (United States)

    Manna, P.; Basile, A.; Bonfante, A.; Terribile, F.

    2009-04-01

    Land Evaluation (LE) comprises the procedures used to assess the suitability of land for a generic or specific use (e.g. biomass production). From the local to the regional and national scale, approaches to land use planning require a deep knowledge of the processes that drive the functioning of the soil-plant-atmosphere system. In the classical approaches, suitability is assessed through a qualitative comparison between the land/soil physical properties and the land use requirements. These approaches are quick and inexpensive to apply; however, they are based on empirical and qualitative models with a basic knowledge structure built for a specific landscape and for the specific object of the evaluation (e.g. a crop). The consequences are great difficulty in spatially extrapolating the LE results and the rigidity of the system. Modern techniques, instead, rely on the application of mechanistic and quantitative simulation modelling that allows a dynamic characterisation of the interrelated physical and chemical processes taking place in the soil landscape. Moreover, the insertion of physically based rules in the LE procedure may make it easier both to extend the results spatially and to change the object of the evaluation (e.g. crop species, nitrate dynamics, etc.). On the other hand, these modern approaches require input data of high quality and quantity, which causes a significant increase in costs. In this scenario, the LE expert is nowadays asked to choose the best LE methodology considering costs, complexity of the procedure and benefits in handling a specific land evaluation. In this work we performed a forage maize land suitability study by comparing 9 different methods of increasing complexity and cost. The study area, of about 2000 ha, is located in northern Italy in the Lodi plain (Po valley).
The 9 methods employed ranged from standard LE approaches to

  11. Models of Human Information Requirements: "When Reasonable Aiding Systems Disagree"

    Science.gov (United States)

    Corker, Kevin; Pisanich, Gregory; Shafto, Michael (Technical Monitor)

    1994-01-01

    Aircraft flight management and Air Traffic Control (ATC) automation are under development to maximize the economy of flight and to increase the capacity of the terminal area airspace while maintaining levels of flight safety equal to or better than current system performance. These goals are being realized by the introduction of flight management automation aiding and operations support systems on the flight deck and by new developments of ATC aiding systems that seek to optimize scheduling of aircraft while potentially reducing required separation and accounting for weather and wake vortex turbulence. Aiding systems on both the flight deck and the ground operate through algorithmic functions on models of the aircraft and of the airspace. These models may differ from each other as a result of variations in their models of the immediate environment. The resultant flight operations or ATC commands may differ in their response requirements (e.g. different preferred descent speeds or descent initiation points). The human operators in the system must then interact with the automation to reconcile differences and resolve conflicts. We have developed a model of human performance including cognitive functions (decision-making, rule-based reasoning, procedural interruption recovery and forgetting) that supports analysis of the information requirements for resolution of flight aiding and ATC conflicts. The model represents multiple individuals in the flight crew and in ATC. The model is supported in simulation on a Silicon Graphics' workstation using Allegro Lisp. Design guidelines for aviation automation aiding systems have been developed using the model's specification of information and team procedural requirements. Empirical data on flight deck operations from full-mission flight simulation are provided to support the model's predictions. The paper describes the model, its development and implementation, the simulation test of the model predictions, and the empirical

  12. Multi-site evaluation of terrestrial evaporation models using FLUXNET data

    KAUST Repository

    Ershadi, Ali

    2014-04-01

    We evaluated the performance of four commonly applied land surface evaporation models using a high-quality dataset of selected FLUXNET towers. The models that were examined include an energy balance approach (Surface Energy Balance System; SEBS), a combination-type technique (single-source Penman-Monteith; PM), a complementary method (advection-aridity; AA) and a radiation based approach (modified Priestley-Taylor; PT-JPL). Twenty FLUXNET towers were selected based upon satisfying stringent forcing data requirements and representing a wide range of biomes. These towers encompassed a number of grassland, cropland, shrubland, evergreen needleleaf forest and deciduous broadleaf forest sites. Based on the mean value of the Nash-Sutcliffe efficiency (NSE) and the root mean squared difference (RMSD), the order of overall performance of the models from best to worst were: ensemble mean of models (0.61, 64), PT-JPL (0.59, 66), SEBS (0.42, 84), PM (0.26, 105) and AA (0.18, 105) [statistics stated as (NSE, RMSD in Wm-2)]. Although PT-JPL uses a relatively simple and largely empirical formulation of the evaporative process, the technique showed improved performance compared to PM, possibly due to its partitioning of total evaporation (canopy transpiration, soil evaporation, wet canopy evaporation) and lower uncertainties in the required forcing data. The SEBS model showed low performance over tall and heterogeneous canopies, which was likely a consequence of the effects of the roughness sub-layer parameterization employed in this scheme. However, SEBS performed well overall. Relative to PT-JPL and SEBS, the PM and AA showed low performance over the majority of sites, due to their sensitivity to the parameterization of resistances. Importantly, it should be noted that no single model was consistently best across all biomes. Indeed, this outcome highlights the need for further evaluation of each model\\'s structure and parameterizations to identify sensitivities and their
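The two skill scores used in this comparison follow standard definitions, sketched below; the sample series are invented, since the site data are not reproduced here.

```python
import math

# Standard definitions of the two scores used in the comparison above.

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect; below 0 is worse than the mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def rmsd(obs, sim):
    """Root mean squared difference, in the units of the inputs (e.g. W m^-2)."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

obs = [100.0, 150.0, 200.0, 250.0]   # invented latent-heat-flux observations
sim = [110.0, 140.0, 210.0, 240.0]   # invented model output
```

NSE is normalized by observed variability (so it is comparable across sites), while RMSD keeps physical units, which is why the study reports both.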

  13. Mobility Models for Systems Evaluation

    Science.gov (United States)

    Musolesi, Mirco; Mascolo, Cecilia

    Mobility models are used to simulate and evaluate the performance of mobile wireless systems and the algorithms and protocols at the basis of them. The definition of realistic mobility models is one of the most critical and, at the same time, difficult aspects of the simulation of applications and systems designed for mobile environments. There are essentially two possible types of mobility patterns that can be used to evaluate mobile network protocols and algorithms by means of simulations: traces and synthetic models [130]. Traces are obtained by means of measurements of deployed systems and usually consist of logs of connectivity or location information, whereas synthetic models are mathematical models, such as sets of equations, which try to capture the movement of the devices.
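A classic synthetic model of the kind described is the random waypoint model, sketched below with illustrative parameters: each node repeatedly picks a random destination and speed and moves toward it in straight-line steps (pause times are omitted for brevity).

```python
import random

# Random waypoint sketch: one node on a square area repeatedly picks a random
# destination and speed and moves toward it (pause times omitted for brevity;
# all parameters are illustrative).

def random_waypoint(steps, area=1000.0, vmin=1.0, vmax=10.0, dt=1.0, seed=42):
    """Return a list of (x, y) positions on a square of side `area` metres."""
    rng = random.Random(seed)
    x, y = rng.uniform(0, area), rng.uniform(0, area)
    tx, ty = rng.uniform(0, area), rng.uniform(0, area)
    speed = rng.uniform(vmin, vmax)
    trace = []
    for _ in range(steps):
        dx, dy = tx - x, ty - y
        dist = (dx * dx + dy * dy) ** 0.5
        if dist <= speed * dt:          # waypoint reached: pick a new one
            x, y = tx, ty
            tx, ty = rng.uniform(0, area), rng.uniform(0, area)
            speed = rng.uniform(vmin, vmax)
        else:                           # advance toward the waypoint
            x += speed * dt * dx / dist
            y += speed * dt * dy / dist
        trace.append((x, y))
    return trace

trace = random_waypoint(500)
```

Generated traces like this can drive a connectivity simulation directly, whereas measured traces are replayed from logs; the trade-off between the two is exactly the one discussed above.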

  14. Evaluation of Uncertainties in hydrogeological modeling and groundwater flow analyses. Model calibration

    International Nuclear Information System (INIS)

    Ijiri, Yuji; Ono, Makoto; Sugihara, Yutaka; Shimo, Michito; Yamamoto, Hajime; Fumimura, Kenichi

    2003-03-01

    This study involves the evaluation of uncertainty in hydrogeological modeling and groundwater flow analysis. Three-dimensional groundwater flow at the Shobasama site in Tono was analyzed using two continuum models and one discontinuum model. The study domain covered an area of four kilometers in the east-west direction and six kilometers in the north-south direction. Moreover, to evaluate how the uncertainties in the hydrogeological structure models and the groundwater simulation results decreased as the investigation progressed, the models were updated and calibrated for several hydrogeological modeling and groundwater flow analysis techniques, based on newly acquired information and knowledge. The findings are as follows. When parameters and structures were updated in line with the previous year's conditions, no major differences in handling emerged between the modeling methods. Model calibration was performed by matching numerical simulations to observations of the pressure response caused by opening and closing a packer in the MIU-2 borehole. Each analysis technique reduced the residual sum of squares between observations and simulation results by adjusting hydrogeological parameters; however, each model adjusted different parameters, such as hydraulic conductivity, effective porosity, specific storage, and anisotropy. When calibrating models, it is sometimes impossible to explain the phenomena by adjusting parameters alone; in such cases, further investigation may be required to clarify details of the hydrogeological structure. Comparing the research from its beginning to this year leads to the following conclusions about the investigation: (1) Transient hydraulic data are an effective means of reducing the uncertainty of the hydrogeological structure. (2) Effective porosity for calculating pore water velocity of
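The calibration step described, adjusting hydrogeological parameters to minimize the residual sum of squares between observed and simulated pressure responses, can be illustrated with a toy forward model and a grid search; the exponential recovery model and all parameter values below are invented stand-ins.

```python
import math

# Toy version of the calibration step: adjust a single hypothetical parameter
# (a recovery rate k) so that a stand-in forward model best matches observed
# pressure responses, measured by the residual sum of squares.

def simulate_pressure(k, times):
    """Stand-in forward model: exponential pressure recovery with rate k."""
    return [1.0 - math.exp(-k * t) for t in times]

def rss(observed, simulated):
    """Residual sum of squares between observation and simulation."""
    return sum((o - s) ** 2 for o, s in zip(observed, simulated))

times = [0.5, 1.0, 2.0, 4.0, 8.0]
observed = simulate_pressure(0.7, times)    # synthetic "observations"

# Transparent grid search over candidate rates.
candidates = [0.1 * i for i in range(1, 21)]
best_k = min(candidates, key=lambda k: rss(observed, simulate_pressure(k, times)))
```

With several models adjusting different parameters, as in the study, each would run its own version of this loop, and equally good fits can hide quite different parameter sets, which is one reason calibration alone cannot resolve all structural uncertainty.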

  15. On Early Conflict Identification by Requirements Modeling of Energy System Control Structures

    DEFF Research Database (Denmark)

    Heussen, Kai; Gehrke, Oliver; Niemann, Hans Henrik

    2015-01-01

    issues early. For requirements formulation of control structures, cyber and physical aspects need to be jointly represented to express interdependencies, check for consistency and discover potentially conflicting requirements. Early identification of potential conflicts may prevent larger problems...... at later design stages. However, languages employed for requirements modeling today do not offer the expressiveness necessary to represent control purposes in relation to domain level interactions and therefore miss several types of interdependencies. This paper introduces the idea of control structure...... modeling for early requirements checking using a suitable modeling language, and illustrates how this approach enables the identification of several classes of controller conflict....

  16. Study on dynamic team performance evaluation methodology based on team situation awareness model

    International Nuclear Information System (INIS)

    Kim, Suk Chul

    2005-02-01

    The purpose of this thesis is to provide a theoretical framework, and an evaluation methodology, for the dynamic task performance of an operating team at a nuclear power plant under dynamic and tactical conditions such as a radiological accident. The thesis proposes a team dynamic task performance evaluation model, the so-called team crystallization model, derived from Endsley's situation awareness model and comprising four elements: state, information, organization, and orientation. Its quantification methods use a system dynamics approach, and its communication process model is based on a receding horizon control approach. The team crystallization model is a holistic approach for evaluating team dynamic task performance in conjunction with team situation awareness, considering physical system dynamics and team behavioral dynamics for tactical and dynamic tasks at a nuclear power plant. The model provides a systematic measure of time-dependent team effectiveness or performance as affected by multiple agents: plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope power plant simulator for 1,000 MWe pressurized water reactors, with four on-the-job operating groups and one expert group familiar with accident sequences. The simulated team dynamic task performance, referenced against key plant parameter behavior, the team-specific organizational center of gravity, and the cue-and-response matrix, showed good agreement with observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness, for example when recruiting new operating teams for a new plant in a cost-effective manner. Also, this model can be utilized as a systematic analysis tool for

  17. Study on dynamic team performance evaluation methodology based on team situation awareness model

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Suk Chul

    2005-02-15

    The purpose of this thesis is to provide a theoretical framework, and an evaluation methodology, for the dynamic task performance of an operating team at a nuclear power plant under dynamic and tactical conditions such as a radiological accident. The thesis proposes a team dynamic task performance evaluation model, the so-called team crystallization model, derived from Endsley's situation awareness model and comprising four elements: state, information, organization, and orientation. Its quantification methods use a system dynamics approach, and its communication process model is based on a receding horizon control approach. The team crystallization model is a holistic approach for evaluating team dynamic task performance in conjunction with team situation awareness, considering physical system dynamics and team behavioral dynamics for tactical and dynamic tasks at a nuclear power plant. The model provides a systematic measure of time-dependent team effectiveness or performance as affected by multiple agents: plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope power plant simulator for 1,000 MWe pressurized water reactors, with four on-the-job operating groups and one expert group familiar with accident sequences. The simulated team dynamic task performance, referenced against key plant parameter behavior, the team-specific organizational center of gravity, and the cue-and-response matrix, showed good agreement with observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness, for example when recruiting new operating teams for a new plant in a cost-effective manner. Also, this model can be utilized as a systematic analysis tool for

  18. Spreadsheet Decision Support Model for Training Exercise Material Requirements Planning

    National Research Council Canada - National Science Library

    Tringali, Arthur

    1997-01-01

    ... associated with military training exercises. The model combines the business practice of Material Requirements Planning and the commercial spreadsheet software capabilities of Lotus 1-2-3 to calculate the requirements for food, consumable...
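    The MRP logic such a spreadsheet model implements can be illustrated with a minimal period-by-period netting calculation. The function name and quantities below are hypothetical; the original decision support model runs in Lotus 1-2-3.

```python
def mrp_net_requirements(gross, on_hand, scheduled_receipts):
    """Period-by-period MRP netting: each period's net requirement is the
    gross requirement not covered by projected available inventory."""
    available = on_hand
    net = []
    for g, r in zip(gross, scheduled_receipts):
        available += r                      # receipts arrive first
        net.append(max(0, g - available))   # shortfall to be ordered
        available = max(0, available - g)   # carry remainder forward
    return net

# e.g. two exercise periods needing 10 and 20 rations with 15 on hand:
# mrp_net_requirements([10, 20], 15, [0, 0]) -> [0, 15]
```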

  19. Requirements for high level models supporting design space exploration in model-based systems engineering

    NARCIS (Netherlands)

    Haveman, Steven; Bonnema, Gerrit Maarten

    2013-01-01

    Most formal models are used in detailed design and focus on a single domain. Few approaches exist that can effectively tie these lower-level models to a high-level system model during design space exploration. This complicates the validation of high-level system requirements during

  20. Evaluation of energy requirements for all-electric range of plug-in hybrid electric two-wheeler

    International Nuclear Information System (INIS)

    Amjad, Shaik; Rudramoorthy, R.; Neelakrishnan, S.; Sri Raja Varman, K.; Arjunan, T.V.

    2011-01-01

    Recently, plug-in hybrid electric vehicles (PHEVs) have been emerging as one of the promising alternatives for improving the sustainability of transportation energy and air quality, especially in urban areas. The all-electric range in a PHEV design plays a significant role in the sizing and cost of the battery pack. This paper presents an evaluation of battery energy and power requirements for a plug-in hybrid electric two-wheeler over different all-electric ranges. An analytical vehicle model and a MATLAB simulation analysis are discussed. The MATLAB simulation results estimate the impact of the driving cycle and all-electric range on the energy capacity, additional mass and initial cost of lead-acid, nickel-metal hydride and lithium-ion batteries. The paper also examines the influence of cycle life on the annual cost of the battery pack and recommends a suitable battery pack for implementation in plug-in hybrid electric two-wheelers. -- Research highlights: → Evaluates the battery energy and power requirements for a plug-in hybrid electric two-wheeler. → Simulation results reveal that the IDC demands more energy and a higher battery cost than the ECE R40 cycle. → When cycle life is considered, the annual cost of the Ni-MH battery pack is lower than that of lead-acid and Li-ion packs.

  1. Evaluating topic models with stability

    CSIR Research Space (South Africa)

    De Waal, A

    2008-11-01

    Full Text Available Topic models are unsupervised techniques that extract likely topics from text corpora, by creating probabilistic word-topic and topic-document associations. Evaluation of topic models is a challenge because (a) topic models are often employed...
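    One common way to operationalize stability-based evaluation is to compare the topics produced by two runs of the same model, matching topics across runs and averaging the similarity of their top-word lists. The sketch below uses greedy matching and Jaccard similarity; these are illustrative assumptions, not necessarily the paper's exact measure.

```python
def jaccard(a, b):
    """Jaccard similarity between two collections of top words."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def topic_stability(run_a, run_b):
    """Greedily match each topic in run_a to its most similar unmatched
    topic in run_b, then average the pairwise similarities."""
    sims = []
    remaining = list(run_b)
    for topic in run_a:
        best = max(remaining, key=lambda t: jaccard(topic, t))
        sims.append(jaccard(topic, best))
        remaining.remove(best)
    return sum(sims) / len(sims)
```

A stable model should score near 1 when re-trained on the same corpus with a different random seed; scores near 0 suggest the extracted topics are artifacts of initialization.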

  2. Using a Mixed Model to Explore Evaluation Criteria for Bank Supervision: A Banking Supervision Law Perspective.

    Directory of Open Access Journals (Sweden)

    Sang-Bing Tsai

    Full Text Available Financial supervision means that monetary authorities have the power to supervise and manage financial institutions according to laws. Monetary authorities have this power because of the requirements of improving financial services, protecting the rights of depositors, adapting to industrial development, ensuring financial fair trade, and maintaining stable financial order. To establish evaluation criteria for bank supervision in China, this study integrates fuzzy theory and the decision making trial and evaluation laboratory (DEMATEL) method and proposes a fuzzy-DEMATEL model. First, fuzzy theory was applied to examine bank supervision criteria and analyze fuzzy semantics. Second, the fuzzy-DEMATEL model was used to calculate the degree to which financial supervision criteria mutually influence one another and their causal relationships. Finally, a model of evaluation criteria for bank and financial supervision was established.
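    The DEMATEL step, computing how strongly criteria influence one another and separating net causes from net effects, can be sketched in crisp (non-fuzzy) form as follows. The 3x3 direct-influence matrix is hypothetical; the fuzzy variant would first defuzzify expert judgments into such a matrix.

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def dematel(D, terms=200):
    """Crisp DEMATEL core: normalise the direct-influence matrix D, then
    accumulate the total-relation matrix T = N + N^2 + ... ~ N(I-N)^-1."""
    n = len(D)
    s = max(sum(row) for row in D)            # normalisation factor
    N = [[x / s for x in row] for row in D]
    T = [row[:] for row in N]
    term = [row[:] for row in N]
    for _ in range(terms):                    # truncated Neumann series
        term = matmul(term, N)
        T = [[T[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    given = [sum(T[i][j] for j in range(n)) for i in range(n)]     # row sums
    received = [sum(T[i][j] for i in range(n)) for j in range(n)]  # col sums
    prominence = [g + r for g, r in zip(given, received)]
    relation = [g - r for g, r in zip(given, received)]  # > 0: net cause
    return prominence, relation

# Hypothetical direct-influence scores among three supervision criteria:
prominence, relation = dematel([[0, 3, 2], [1, 0, 2], [1, 2, 0]])
```

Criteria with high prominence matter most overall; a positive relation marks a criterion as a net cause, which is what the causal-relationship analysis in the abstract refers to.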

  3. Using a Mixed Model to Explore Evaluation Criteria for Bank Supervision: A Banking Supervision Law Perspective

    Science.gov (United States)

    Tsai, Sang-Bing; Chen, Kuan-Yu; Zhao, Hongrui; Wei, Yu-Min; Wang, Cheng-Kuang; Zheng, Yuxiang; Chang, Li-Chung; Wang, Jiangtao

    2016-01-01

    Financial supervision means that monetary authorities have the power to supervise and manage financial institutions according to laws. Monetary authorities have this power because of the requirements of improving financial services, protecting the rights of depositors, adapting to industrial development, ensuring financial fair trade, and maintaining stable financial order. To establish evaluation criteria for bank supervision in China, this study integrates fuzzy theory and the decision making trial and evaluation laboratory (DEMATEL) method and proposes a fuzzy-DEMATEL model. First, fuzzy theory was applied to examine bank supervision criteria and analyze fuzzy semantics. Second, the fuzzy-DEMATEL model was used to calculate the degree to which financial supervision criteria mutually influence one another and their causal relationships. Finally, a model of evaluation criteria for bank and financial supervision was established. PMID:27992449

  4. Conducting requirements analyses for research using routinely collected health data: a model driven approach.

    Science.gov (United States)

    de Lusignan, Simon; Cashman, Josephine; Poh, Norman; Michalakidis, Georgios; Mason, Aaron; Desombre, Terry; Krause, Paul

    2012-01-01

    Medical research increasingly requires the linkage of data from different sources. Conducting a requirements analysis for a new application is an established part of software engineering, but is rarely reported in the biomedical literature, and no generic approaches have been published on how to link heterogeneous health data. A literature review was followed by a consensus process to define how requirements for research using multiple data sources might be modeled. We have developed a requirements analysis method: i-ScheDULEs. The first components of the modeling process are indexing and creating a rich picture of the research study. Secondly, we developed a series of reference models of progressive complexity: data flow diagrams (DFDs) to define data requirements; unified modeling language (UML) use case diagrams to capture study-specific and governance requirements; and finally, business process models using business process modeling notation (BPMN). These requirements and their associated models should become part of research study protocols.

  5. Price/Earnings Ratio Model through Dividend Yield and Required Yield Above Expected Inflation

    Directory of Open Access Journals (Sweden)

    Emil Mihalina

    2010-07-01

    Full Text Available The price/earnings ratio is the most popular and most widespread valuation model used to assess relative capital asset value on financial markets. In functional terms, company earnings over the very long term can be described with high statistical significance. Empirically, long-term statistics show that the yield demanded (required) on capital markets exhibits a certain regularity: investors first require a yield above the stable inflation rate, and then a dividend yield and a capital gain caused by the growth of earnings that influence the price, under the assumption that the P/E ratio is stable. By combining the Gordon model for current dividend value with the model of market capitalization of earnings (the price/earnings ratio), and bearing in mind the influence of the general price level on company earnings, it is possible to adjust the price/earnings ratio by deriving the required yield on capital markets, measured by a market index, as a function of dividend yield and the inflation rate above the stable inflation rate, increased by profit growth. The S&P 500 index, for example, has over the last 100 years grown by exactly the inflation rate above the stable inflation rate, increased by profit growth. A comparison of two series of price/earnings ratios, a modelled one and an average seven-year ratio, shows a notable correlation in their movement, with a three-year lag. It could therefore be hypothesized that three years of the expected inflation level, dividend yield and profit growth rate of the market index are discounted in current market prices. The conclusion is that, at present, the relationship between the adjusted average price/earnings ratio and its effect on the market index on the one hand, and the modelled price/earnings ratio on the other, can clearly show the expected dynamics and course of the following period.
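    Under Gordon-model assumptions, the market capitalization of earnings described above reduces to a justified P/E ratio. A minimal worked example, with illustrative figures only:

```python
def justified_pe(payout_ratio, required_return, growth):
    """Forward P/E implied by the Gordon growth model: P0 = D1 / (r - g),
    so dividing by next-year earnings E1 gives P0/E1 = payout / (r - g).
    Requires required_return > growth."""
    return payout_ratio / (required_return - growth)

# e.g. 50% payout, 9% required yield, 5% long-run earnings growth:
# justified_pe(0.5, 0.09, 0.05) -> 12.5
```

This makes the abstract's mechanism concrete: holding payout fixed, a higher required yield (e.g. from higher expected inflation) compresses the justified P/E, while faster profit growth expands it.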

  6. 34 CFR 300.305 - Additional requirements for evaluations and reevaluations.

    Science.gov (United States)

    2010-07-01

    ... evaluation (if appropriate) and as part of any reevaluation under this part, the IEP Team and other qualified... the measurable annual goals set out in the IEP of the child and to participate, as appropriate, in the...) of this section. (d) Requirements if additional data are not needed. (1) If the IEP Team and other...

  7. Multi-site evaluation of terrestrial evaporation models using FLUXNET data

    KAUST Repository

    Ershadi, Ali; McCabe, Matthew; Evans, Jason P.; Chaney, Nathaniel W.; Wood, Eric F.

    2014-01-01

    We evaluated the performance of four commonly applied land surface evaporation models using a high-quality dataset of selected FLUXNET towers. The models examined include an energy balance approach (Surface Energy Balance System; SEBS), a combination-type technique (single-source Penman-Monteith; PM), a complementary method (advection-aridity; AA) and a radiation-based approach (modified Priestley-Taylor; PT-JPL). Twenty FLUXNET towers were selected based upon satisfying stringent forcing data requirements and representing a wide range of biomes. These towers encompassed a number of grassland, cropland, shrubland, evergreen needleleaf forest and deciduous broadleaf forest sites. Based on the mean value of the Nash-Sutcliffe efficiency (NSE) and the root mean squared difference (RMSD), the order of overall performance of the models from best to worst was: ensemble mean of models (0.61, 64), PT-JPL (0.59, 66), SEBS (0.42, 84), PM (0.26, 105) and AA (0.18, 105) [statistics stated as (NSE, RMSD in W m-2)]. Although PT-JPL uses a relatively simple and largely empirical formulation of the evaporative process, the technique showed improved performance compared to PM, possibly due to its partitioning of total evaporation (canopy transpiration, soil evaporation, wet canopy evaporation) and lower uncertainties in the required forcing data. The SEBS model showed low performance over tall and heterogeneous canopies, which was likely a consequence of the roughness sub-layer parameterization employed in this scheme; however, SEBS performed well overall. Relative to PT-JPL and SEBS, PM and AA showed low performance over the majority of sites, due to their sensitivity to the parameterization of resistances. Importantly, no single model was consistently best across all biomes. Indeed, this outcome highlights the need for further evaluation of each model's structure and parameterizations to identify sensitivities and their
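    The two evaluation statistics used above, NSE and RMSD, are straightforward to compute from paired observed and simulated series; a minimal sketch:

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model
    is no better than predicting the observed mean."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

def rmsd(obs, sim):
    """Root mean squared difference, in the units of the data
    (latent heat flux in W m-2 in this study)."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))
```

Because NSE is dimensionless and RMSD keeps physical units, reporting the pair (as the abstract does) separates skill relative to climatology from absolute error magnitude.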

  8. Modelling requirements for future assessments based on FEP analysis

    International Nuclear Information System (INIS)

    Locke, J.; Bailey, L.

    1998-01-01

    This report forms part of a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed safety assessment. A scenario approach to performance assessment has been adopted. It is proposed that potential evolutions of a deep geological radioactive waste repository can be represented by a base scenario and a number of variant scenarios. The base scenario is chosen to be broad-ranging and to represent the natural evolution of the repository system and its surrounding environment. The base scenario is defined to include all those FEPs that are certain to occur and those which are judged likely to occur for a significant period of the assessment timescale. The structuring of FEPs on a Master Directed Diagram (MDD) provides a systematic framework for identifying those FEPs that form part of the natural evolution of the system and those, which may define alternative potential evolutions of the repository system. In order to construct a description of the base scenario, FEPs have been grouped into a series of conceptual models. Conceptual models are groups of FEPs, identified from the MDD, representing a specific component or process within the disposal system. It has been found appropriate to define conceptual models in terms of the three main components of the disposal system: the repository engineered system, the surrounding geosphere and the biosphere. For each of these components, conceptual models provide a description of the relevant subsystem in terms of its initial characteristics, subsequent evolution and the processes affecting radionuclide transport for the groundwater and gas pathways. The aim of this document is to present the methodology that has been developed for deriving modelling requirements and to illustrate the application of the methodology by

  9. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  10. A fuzzy TOPSIS model to evaluate the Business Intelligence competencies of Port Community Systems

    Directory of Open Access Journals (Sweden)

    Ghazanfari Mehdi

    2014-04-01

    Full Text Available Evaluation of the Business Intelligence (BI) competencies of port community systems before they are bought and deployed is of vital importance for establishing a decision-support environment for managers. This study proposes a new model which provides a simple approach to assessing the BI competencies of port community systems in an organization. This approach helps decision-makers to select an enterprise system with the intelligence capabilities appropriate to support managers' decision-making tasks. Thirty-four criteria for BI specifications were determined from a thorough review of the literature. The proposed model uses the fuzzy TOPSIS technique, which employs fuzzy weights of the criteria and fuzzy judgments of port community systems to compute the evaluation scores and rankings. The model was applied to evaluating, ranking and selecting the needed port community systems in a port and maritime organization, in order to validate the proposed model with a real application. Using the proposed model, organizations can assess, select, and purchase port community systems that will provide a better decision-support environment for their business systems.
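    A compact sketch of the TOPSIS ranking step follows, with triangular fuzzy weights reduced to crisp values by centroid defuzzification rather than the paper's fully fuzzy procedure. The decision matrix, weights, and criteria below are hypothetical.

```python
import math

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number (a, b, c)."""
    return sum(tfn) / 3.0

def topsis(matrix, weights, benefit):
    """Score alternatives by relative closeness to the ideal solution.
    matrix: alternatives x criteria; benefit[j]: True if larger is better."""
    m, n = len(matrix), len(matrix[0])
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m)))
             for j in range(n)]
    V = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    ideal = [max(V[i][j] for i in range(m)) if benefit[j]
             else min(V[i][j] for i in range(m)) for j in range(n)]
    anti = [min(V[i][j] for i in range(m)) if benefit[j]
            else max(V[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_pos = math.sqrt(sum((V[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_neg = math.sqrt(sum((V[i][j] - anti[j]) ** 2 for j in range(n)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Two hypothetical port community systems scored on two benefit criteria,
# with weights defuzzified from triangular fuzzy judgments:
w = [defuzzify((0.4, 0.5, 0.6)), defuzzify((0.2, 0.3, 0.4))]
scores = topsis([[5.0, 4.0], [2.0, 1.0]], w, [True, True])
```

A score near 1 means the system is close to the ideal on all weighted criteria; the fuzzy version carries the triangular numbers through the distance computation instead of collapsing them up front.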

  11. Ozone modeling for compliance planning: A synopsis of ''The Use of Photochemical Air Quality Models for Evaluating Emission Control Strategies: A Synthesis Report''

    International Nuclear Information System (INIS)

    Blanchard, C.L.

    1992-12-01

    The 1990 federal Clean Air Act Amendments require that many nonattainment areas use gridded, photochemical air quality models to develop compliance plans for meeting the ambient ozone standard. Both industry and regulatory agencies will need to consider explicitly the strengths and limitations of the models. Photochemical air quality models constitute the principal tool available for evaluating the relative effectiveness of alternative emission control strategies. Limitations in the utility of modeling results stem from the uncertainty and bias of predictions for modeled episodes, possible compensating errors, limitations in the number of modeled episodes, and incompatibility between deterministic model predictions and the statistical form of the air quality standard for ozone. If emissions estimates (including naturally produced ''biogenic'' emissions) are accurate, intensive aerometric data are available, and an evaluation of performance (including diagnostic evaluations) is successfully completed, gridded photochemical air quality models can determine (1) the types of emission controls - VOC, NOx, or both - that would be most effective for reducing ozone concentrations, and (2) the approximate magnitudes - to within about 20-40% - of the estimated ozone reductions

  12. Minimization of required model runs in the Random Mixing approach to inverse groundwater flow and transport modeling

    Science.gov (United States)

    Hoerning, Sebastian; Bardossy, Andras; du Plessis, Jaco

    2017-04-01

    Most geostatistical inverse groundwater flow and transport modelling approaches utilize a numerical solver to minimize the discrepancy between observed and simulated hydraulic heads and/or concentration values. The optimization procedure often requires many model runs, which for complex models lead to long run times. Random Mixing is a promising new geostatistical technique for inverse modelling. The method is an extension of the gradual deformation approach. It works by finding a field which preserves the covariance structure and maintains observed hydraulic conductivities. This field is perturbed by mixing it with new fields that fulfill the homogeneous conditions. This mixing is expressed as an optimization problem which aims to minimize the difference between the observed and simulated hydraulic heads and/or concentration values. To preserve the spatial structure, the mixing weights must lie on the unit hyper-sphere. We present a modification to the Random Mixing algorithm which significantly reduces the number of model runs required. The approach involves taking n equally spaced points on the unit circle as weights for mixing conditional random fields. Each of these mixtures provides a solution to the forward model at the conditioning locations. For each of the locations the solutions are then interpolated around the circle to provide solutions for additional mixing weights at very low computational cost. The interpolated solutions are used to search for a mixture which maximally reduces the objective function. This is in contrast to other approaches which evaluate the objective function for the n mixtures and then interpolate the obtained values. Keeping the mixture on the unit circle makes it easy to generate equidistant sampling points in the space; however, this means that only two fields are mixed at a time. Once the optimal mixture for two fields has been found, they are combined to form the input to the next iteration of the algorithm. This
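    The key saving, searching mixing weights on the unit circle without re-running the forward model for every candidate weight, can be illustrated with a toy linear operator standing in for the groundwater solver. All fields and observations below are hypothetical; for a linear forward model the interpolation of solutions around the circle is exact, so two runs suffice for the whole search.

```python
import math

# Toy linear forward model standing in for the groundwater solver:
# "heads" at two observation points as a linear map of a 3-cell field.
A = [[1.0, 0.5, 0.0],
     [0.0, 1.0, 0.5]]

def forward(field):
    return [sum(a * f for a, f in zip(row, field)) for row in A]

observed = [2.0, 1.5]

f1 = [1.0, 1.0, 1.0]    # current conditional field
f2 = [0.5, -1.0, 2.0]   # new field fulfilling the homogeneous conditions
h1, h2 = forward(f1), forward(f2)   # the only two expensive runs needed

def objective(theta):
    # Weights (cos t, sin t) lie on the unit circle, preserving the
    # spatial structure; for a linear model the solution mixes the same way.
    sim = [math.cos(theta) * a + math.sin(theta) * b
           for a, b in zip(h1, h2)]
    return sum((o - s) ** 2 for o, s in zip(observed, sim))

# Search equally spaced angles without any further model runs.
thetas = [2 * math.pi * k / 360 for k in range(360)]
theta_best = min(thetas, key=objective)
```

The best mixture then becomes the starting field for the next iteration, exactly as the abstract describes.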

  13. An evaluation framework for participatory modelling

    Science.gov (United States)

    Krueger, T.; Inman, A.; Chilvers, J.

    2012-04-01

    Strong arguments for participatory modelling in hydrology can be made on substantive, instrumental and normative grounds. These arguments have led to increasingly diverse groups of stakeholders (here anyone affecting or affected by an issue) getting involved in hydrological research and the management of water resources. In fact, participation has become a requirement of many research grants, programs, plans and policies. However, evidence of beneficial outcomes of participation as suggested by the arguments is difficult to generate and therefore rare. This is because outcomes are diverse, distributed, often tacit, and take time to emerge. In this paper we develop an evaluation framework for participatory modelling focussed on learning outcomes. Learning encompasses many of the potential benefits of participation, such as better models through diversity of knowledge and scrutiny, stakeholder empowerment, greater trust in models and ownership of subsequent decisions, individual moral development, reflexivity, relationships, social capital, institutional change, resilience and sustainability. Based on the theories of experiential, transformative and social learning, complemented by practitioner experience our framework examines if, when and how learning has occurred. Special emphasis is placed on the role of models as learning catalysts. We map the distribution of learning between stakeholders, scientists (as a subgroup of stakeholders) and models. And we analyse what type of learning has occurred: instrumental learning (broadly cognitive enhancement) and/or communicative learning (change in interpreting meanings, intentions and values associated with actions and activities; group dynamics). 
We demonstrate how our framework can be translated into a questionnaire-based survey conducted with stakeholders and scientists at key stages of the participatory process, and show preliminary insights from applying the framework within a rural pollution management situation in

  14. Quantification of Dynamic Model Validation Metrics Using Uncertainty Propagation from Requirements

    Science.gov (United States)

    Brown, Andrew M.; Peck, Jeffrey A.; Stewart, Eric C.

    2018-01-01

    The Space Launch System, NASA's new large launch vehicle for long range space exploration, is presently in the final design and construction phases, with the first launch scheduled for 2019. A dynamic model of the system has been created and is critical for calculation of interface loads and natural frequencies and mode shapes for guidance, navigation, and control (GNC). Because of the program and schedule constraints, a single modal test of the SLS will be performed while bolted down to the Mobile Launch Pad just before the first launch. A Monte Carlo and optimization scheme will be performed to create thousands of possible models based on given dispersions in model properties and to determine which model best fits the natural frequencies and mode shapes from modal test. However, the question still remains as to whether this model is acceptable for the loads and GNC requirements. An uncertainty propagation and quantification (UP and UQ) technique to develop a quantitative set of validation metrics that is based on the flight requirements has therefore been developed and is discussed in this paper. There has been considerable research on UQ and UP and validation in the literature, but very little on propagating the uncertainties from requirements, so most validation metrics are "rules-of-thumb;" this research seeks to come up with more reason-based metrics. One of the main assumptions used to achieve this task is that the uncertainty in the modeling of the fixed boundary condition is accurate, so therefore that same uncertainty can be used in propagating the fixed-test configuration to the free-free actual configuration. The second main technique applied here is the usage of the limit-state formulation to quantify the final probabilistic parameters and to compare them with the requirements. These techniques are explored with a simple lumped spring-mass system and a simplified SLS model. When completed, it is anticipated that this requirements-based validation
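    The limit-state idea, propagating assumed dispersions through the model and counting requirement violations, can be sketched with a single-degree-of-freedom frequency requirement. All numbers below are illustrative, not SLS values.

```python
import math
import random

def first_frequency_hz(k, m):
    """Natural frequency of a single-degree-of-freedom spring-mass system."""
    return math.sqrt(k / m) / (2.0 * math.pi)

FREQ_FLOOR_HZ = 1.0      # hypothetical requirement: frequency stays above this
random.seed(0)           # reproducible sampling

N = 20000
failures = 0
for _ in range(N):
    k = random.gauss(50.0, 5.0)   # stiffness dispersion, N/m (assumed)
    m = random.gauss(1.0, 0.05)   # mass dispersion, kg (assumed)
    if first_frequency_hz(k, m) < FREQ_FLOOR_HZ:   # limit state violated
        failures += 1

p_fail = failures / N     # Monte Carlo estimate of requirement violation
```

A requirements-based validation metric then compares p_fail against an allowable probability, instead of applying a rule-of-thumb frequency margin.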

  15. Input data requirements for performance modelling and monitoring of photovoltaic plants

    DEFF Research Database (Denmark)

    Gavriluta, Anamaria Florina; Spataru, Sergiu; Sera, Dezso

    2018-01-01

    This work investigates the input data requirements in the context of performance modeling of thin-film photovoltaic (PV) systems. The analysis focuses on the PVWatts performance model, well suited for on-line performance monitoring of PV strings, due to its low number of parameters and high......, modelling the performance of the PV modules at high irradiances requires a dataset of only a few hundred samples in order to obtain a power estimation accuracy of ~1-2%.
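    For context, a PVWatts-style DC power estimate combines an irradiance ratio with a linear cell-temperature correction, which is why so few parameters need fitting. A minimal sketch; the default temperature coefficient is an assumed thin-film-like value, not a fitted one.

```python
def pvwatts_dc(g_poa, t_cell, p_dc0, gamma=-0.002):
    """PVWatts-style DC power: nameplate rating p_dc0 scaled linearly with
    plane-of-array irradiance g_poa (W/m2) and corrected for cell
    temperature t_cell (degC). gamma is the power temperature coefficient
    in 1/degC; values near -0.002 are typical for thin film (assumed)."""
    g_ref, t_ref = 1000.0, 25.0
    return p_dc0 * (g_poa / g_ref) * (1.0 + gamma * (t_cell - t_ref))

# At reference conditions the model returns the nameplate rating:
# pvwatts_dc(1000.0, 25.0, 3000.0) -> 3000.0
```

With only p_dc0 and gamma to estimate, a few hundred high-irradiance samples can pin both down, consistent with the dataset sizes discussed in the abstract.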

  16. A nutrition mathematical model to account for dietary supply and requirements of energy and nutrients for domesticated small ruminants: the development and evaluation of the Small Ruminant Nutrition System

    Directory of Open Access Journals (Sweden)

    Luis Orlindo Tedeschi

    2008-07-01

    Full Text Available A mechanistic model that predicts nutrient requirements and biological values of feeds for sheep (Cornell Net Carbohydrate and Protein System; CNCPS-S) was expanded to include goats, and the name was changed to the Small Ruminant Nutrition System (SRNS). The SRNS uses animal and environmental factors to predict metabolizable energy (ME) and protein, and Ca and P requirements. Requirements for goats in the SRNS are predicted based on the equations developed for CNCPS-S, modified to account for specific requirements of goats, including maintenance, lactation, and pregnancy requirements, and body reserves. Feed biological values are predicted based on carbohydrate and protein fractions and their ruminal fermentation rates; forage, concentrate and liquid passage rates; and microbial growth. The evaluation of the SRNS for sheep using published papers (19 treatment means) indicated no mean bias (MB; 1.1 g/100 g) and a low root mean square prediction error (RMSPE; 3.6 g/100 g) when predicting dietary organic matter digestibility for diets not deficient in ruminal nitrogen. The SRNS accurately predicted gains and losses of shrunk body weight (SBW) of adult sheep (15 treatment means; MB = 5.8 g/d; RMSPE = 30 g/d) when diets were not deficient in ruminal nitrogen. The SRNS for sheep had MB varying from -34 to 1 g/d and RMSPE varying from 37 to 56 g/d when predicting average daily gain (ADG) of growing lambs (42 treatment means). The evaluation of the SRNS for goats based on literature data showed accurate predictions for ADG of kids (31 treatment means; RMSPE = 32.5 g/d; r2 = 0.85; concordance correlation coefficient (CCC) = 0.91), daily ME intake (21 treatment means; RMSPE = 0.24 Mcal/d; r2 = 0.99; CCC = 0.99), and energy balance (21 treatment means; RMSPE = 0.20 Mcal/d; r2 = 0.87; CCC = 0.90) of goats. In conclusion, the SRNS for sheep can accurately predict dietary organic matter digestibility, ADG of growing lambs and changes in SBW of mature sheep.
The SRNS
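    The evaluation statistics reported in this abstract (mean bias, root mean square prediction error, and the concordance correlation coefficient) can be computed from paired observed and predicted values; a minimal sketch:

```python
import math

def mean_bias(obs, sim):
    """Average of (predicted - observed); near zero means no systematic bias."""
    return sum(s - o for o, s in zip(obs, sim)) / len(obs)

def rmspe(obs, sim):
    """Root mean square prediction error, in the units of the data."""
    return math.sqrt(sum((s - o) ** 2 for o, s in zip(obs, sim)) / len(obs))

def ccc(obs, sim):
    """Lin's concordance correlation coefficient: penalises both poor
    correlation and deviation of the fit from the 1:1 line."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    vo = sum((o - mo) ** 2 for o in obs) / n
    vs = sum((s - ms) ** 2 for s in sim) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim)) / n
    return 2 * cov / (vo + vs + (mo - ms) ** 2)
```

Reporting CCC alongside r2, as the abstract does, distinguishes models that track observations tightly on the 1:1 line from models that merely correlate with them.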

  17. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  18. QUALITY OF AN ACADEMIC STUDY PROGRAMME - EVALUATION MODEL

    Directory of Open Access Journals (Sweden)

    Mirna Macur

    2016-01-01

    Full Text Available Quality of an academic study programme is evaluated by many: by employees (internal evaluation) and by external evaluators: experts, agencies and organisations. Internal and external evaluation of an academic programme follow a written structure that resembles one of the quality models. We believe the quality models (mostly derived from the EFQM excellence model) do not fit non-profit activities, policies and programmes very well, because these are much more complex than the environments from which the quality models derive (for example, the assembly line). Quality of an academic study programme is very complex and understood differently by various stakeholders, so we present dimensional evaluation in this article. Dimensional evaluation, as opposed to component and holistic evaluation, is a form of analytical evaluation in which the quality or value of the evaluand is determined by looking at its performance on multiple dimensions of merit or evaluation criteria. First, the stakeholders of a study programme and their views, expectations and interests are presented, followed by the evaluation criteria. Both are then joined into an evaluation model revealing which evaluation criteria can and should be evaluated by which stakeholder. The main research questions are posed and a research method for each dimension is listed.

  19. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    Energy Technology Data Exchange (ETDEWEB)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering and physics oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model

  20. An Integrated Modeling Approach to Evaluate and Optimize Data Center Sustainability, Dependability and Cost

    Directory of Open Access Journals (Sweden)

    Gustavo Callou

    2014-01-01

    Full Text Available Data centers have evolved dramatically in recent years, due to the advent of social networking services, e-commerce and cloud computing. Designers face conflicting requirements: the high availability levels demanded against low sustainability impact and cost. Approaches that evaluate and optimize these requirements are essential to support designers of data center architectures. Our work proposes an integrated approach to estimate and optimize these issues with the support of the developed environment, Mercury. Mercury is a tool for dependability, performance and energy-flow evaluation. The tool supports reliability block diagrams (RBD), stochastic Petri nets (SPNs), continuous-time Markov chains (CTMC) and energy flow (EFM) models. The EFM verifies the energy flow in data center architectures, taking into account the energy efficiency and the power capacity that each device can provide (for power systems) or extract (for cooling components). The EFM also estimates the sustainability impact and cost of data center architectures. Additionally, a methodology is provided to support the modeling, evaluation and optimization processes. Two case studies are presented to illustrate the adopted methodology on data center power systems.

  1. Requirements of Integrated Design Teams While Evaluating Advanced Energy Retrofit Design Options in Immersive Virtual Environments

    Directory of Open Access Journals (Sweden)

    Xue Yang

    2015-12-01

    Full Text Available One of the significant ways to reduce energy use in buildings is to implement advanced energy retrofits in existing buildings. Improving the energy performance of buildings through advanced energy retrofitting requires a clear understanding of the cost and energy implications of design alternatives from various engineering disciplines when different retrofit options are considered. The communication of retrofit design alternatives and their energy implications is essential in the decision-making process, as it affects the final retrofit selections and hence the energy efficiency of the retrofitted buildings. The objective of the research presented here was to identify a generic list of information requirements that need to be shared and collectively analyzed by integrated design teams during advanced energy retrofit design review meetings held in immersive settings. While identifying these requirements, the authors used an iterative, immersive-environment-based requirements elicitation approach. The technology was used as a means to better identify the information requirements that integrated design teams analyze as a group. This paper provides findings on the information requirements of integrated design teams when evaluating retrofit options in immersive virtual environments. The information requirements were identified through interactions with sixteen experts in the design and energy modeling domains, and validated with another group of participants consisting of six design experts who were experienced in integrated design processes. Industry practitioners can use the findings when deciding what information to share with integrated design team members during design review meetings that utilize immersive virtual environments.

  2. A review of typhoid fever transmission dynamic models and economic evaluations of vaccination.

    Science.gov (United States)

    Watson, Conall H; Edmunds, W John

    2015-06-19

    Despite a recommendation by the World Health Organization (WHO) that typhoid vaccines be considered for the control of endemic disease and outbreaks, programmatic use remains limited. Transmission models and economic evaluation may be informative in decision making about vaccine programme introductions and their role alongside other control measures. A literature search found few typhoid transmission models or economic evaluations relative to analyses of other infectious diseases of similar or lower health burden. Modelling suggests vaccines alone are unlikely to eliminate endemic disease in the short to medium term without measures to reduce transmission from asymptomatic carriage. The single identified data-fitted transmission model of typhoid vaccination suggests vaccines can reduce disease burden substantially when introduced programmatically but that indirect protection depends on the relative contribution of carriage to transmission in a given setting. This is an important source of epidemiological uncertainty, alongside the extent and nature of natural immunity. Economic evaluations suggest that typhoid vaccination can be cost-saving to health services if incidence is extremely high and cost-effective in other high-incidence situations, when compared to WHO norms. Targeting vaccination to the highest incidence age-groups is likely to improve cost-effectiveness substantially. Economic perspective and vaccine costs substantially affect estimates, with disease incidence, case-fatality rates, and vaccine efficacy over time also important determinants of cost-effectiveness and sources of uncertainty. Static economic models may under-estimate benefits of typhoid vaccination by omitting indirect protection. Typhoid fever transmission models currently require per-setting epidemiological parameterisation to inform their use in economic evaluation, which may limit their generalisability. 
We found no economic evaluation based on transmission dynamic modelling, and no

  3. Orbiter data reduction complex data processing requirements for the OFT mission evaluation team (level C)

    Science.gov (United States)

    1979-01-01

    This document addresses requirements for post-test data reduction in support of the Orbital Flight Tests (OFT) mission evaluation team, specifically those which are planned to be implemented in the ODRC (Orbiter Data Reduction Complex). Only those requirements which have been previously baselined by the Data Systems and Analysis Directorate configuration control board are included. This document serves as the control document between Institutional Data Systems Division and the Integration Division for OFT mission evaluation data processing requirements, and shall be the basis for detailed design of ODRC data processing systems.

  4. Mathematical modeling and evaluation of radionuclide transport parameters from the ANL Laboratory Analog Program

    International Nuclear Information System (INIS)

    Chen, B.C.J.; Hull, J.R.; Seitz, M.G.; Sha, W.T.; Shah, V.L.; Soo, S.L.

    1984-07-01

    Computer model simulation is required to evaluate the performance of proposed or future high-level radioactive waste geological repositories. However, the accuracy of a model in predicting the real situation depends on how well the values of the transport properties are prescribed as input parameters; knowledge of the transport parameters is therefore essential. We have modeled ANL's Laboratory Analog Program, which was designed to simulate the long-term radwaste migration process by groundwater flowing through a high-level radioactive waste repository. Using this model and experimental measurements, we have evaluated the neptunium (actinide) deposition velocity and analyzed the complex phenomena of simultaneous deposition, erosion, and re-entrainment of bentonite when groundwater flows through a narrow crack in basalt rock. The present modeling demonstrates that we can obtain the values of transport parameters, as added information at no additional cost, from the available measurements of laboratory analog experiments. 8 figures, 3 tables

  5. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    Full Text Available The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. BPMN is therefore widely applied in various areas, one of them being business process simulation. This paper addresses some BPMN-model-based business process simulation problems and formulates requirements for business process and resource models that enable their use in business process simulation.

  6. Pragmatic geometric model evaluation

    Science.gov (United States)

    Pamer, Robert

    2015-04-01

    Quantification of subsurface model reliability is mathematically and technically demanding, as there are many different sources of uncertainty and some of the factors can be assessed only subjectively. For many practical applications in industry or risk assessment (e.g. geothermal drilling), a quantitative estimate of possible geometric variations in depth units is preferred over relative numbers because costs are calculated for different scenarios. The talk gives an overview of several factors that affect the geometry of structural subsurface models based upon typical geological survey organization (GSO) data, such as geological maps, borehole data and the conceptually driven construction of subsurface elements (e.g. a fault network). Within the context of the trans-European project "GeoMol", uncertainty analysis has to be very pragmatic, partly because data rights, data policies and modelling software differ between the project partners. In a case study, a two-step evaluation methodology for geometric subsurface model uncertainty is being developed. In the first step, several models of the same volume of interest are calculated by omitting successively more input data types (seismic constraints, fault network, outcrop data). The positions of the various horizon surfaces are then compared. The procedure is equivalent to comparing data of various levels of detail, and therefore structural complexity. This gives a measure of the structural significance of each data set in space, and as a consequence areas of geometric complexity are identified. These areas are usually very data sensitive, hence geometric variability between individual data points in these areas is higher than in areas of low structural complexity. 
Instead of calculating a multitude of different models by varying some input data or parameters as it is done by Monte-Carlo-simulations, the aim of the second step of the evaluation procedure (which is part of the ongoing work) is to

  7. Research on the evaluation model of the software reliability in nuclear safety class digital instrumentation and control system

    International Nuclear Information System (INIS)

    Liu Ying; Yang Ming; Li Fengjun; Ma Zhanguo; Zeng Hai

    2014-01-01

    In order to analyze software reliability (SR) in nuclear safety class digital instrumentation and control (D-I&C) systems, firstly, the international software design standards were analyzed and the standards' framework was built; we found that D-I&C software should follow NUREG-0800 BTP 7-14, according to the NRC NUREG-0800 review requirements. Secondly, a quantitative evaluation model of SR using a Bayesian Belief Network, together with thirteen sub-model frameworks, was established. Thirdly, each sub-model and the weights of the corresponding indexes in the evaluation model were analyzed. Finally, a safety case was introduced. The models lay a foundation for the review and quantitative evaluation of SR in nuclear safety class D-I&C systems. (authors)

  8. Formal Requirements Modeling for Reactive Systems with Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon

    This dissertation presents the contributions of seven publications all concerned with the application of Coloured Petri Nets (CPN) to requirements modeling for reactive systems. The publications are introduced along with relevant background material and related work, and their contributions...... to take into concern that the behavior of human actors is less likely to be predictable than the behavior of e.g. mechanical components.   In the second approach, the CPN model is parameterized and utilizes a generic and reusable CPN module operating as an SD interpreter. In addition to distinguishing...... and events. A tool is presented that allows automated validation of the structure of CPN models with respect to the guidelines. Next, three publications on integrating Jackson's Problem Frames with CPN requirements models are presented: The first publication introduces a method for systematically structuring...

  9. 42 CFR 418.205 - Special requirements for hospice pre-election evaluation and counseling services.

    Science.gov (United States)

    2010-10-01

    ... evaluation and counseling services. 418.205 Section 418.205 Public Health CENTERS FOR MEDICARE & MEDICAID... Services § 418.205 Special requirements for hospice pre-election evaluation and counseling services. (a... evaluation and counseling services as specified in § 418.304(d) may be made to a hospice on behalf of a...

  10. Training effectiveness evaluation model

    International Nuclear Information System (INIS)

    Penrose, J.B.

    1993-01-01

    NAESCO's Training Effectiveness Evaluation Model (TEEM) integrates existing evaluation procedures with new procedures. The new procedures are designed to measure training impact on organizational productivity. TEEM seeks to enhance organizational productivity through proactive training focused on operation results. These results can be identified and measured by establishing and tracking performance indicators. Relating training to organizational productivity is not easy. TEEM is a team process. It offers strategies to assess more effectively organizational costs and benefits of training. TEEM is one organization's attempt to refine, manage and extend its training evaluation program

  11. Evaluation of HVS models in the application of medical image quality assessment

    Science.gov (United States)

    Zhang, L.; Cavaro-Menard, C.; Le Callet, P.

    2012-03-01

    In this study, four of the most widely used Human Visual System (HVS) models are applied to Magnetic Resonance (MR) images for a signal detection task. Their performances are evaluated against a gold standard derived from the radiologists' majority decision. Task-based image quality assessment requires taking into account the specificities of human perception, for which various HVS models have been proposed. However, to our knowledge, no work has been conducted to evaluate and compare the suitability of these models for the assessment of medical image quality. This pioneering study investigates the performance of different HVS models on medical images in terms of approximation to radiologist performance. We propose to score the performance of each HVS model using the AUC (Area Under the receiver operating characteristic Curve) and its variance estimate as the figure of merit. The radiologists' majority decision is used as the gold standard, so that the estimated AUC measures the distance between the HVS model and radiologist perception. To calculate the variance estimate of the AUC, we adopted the one-shot method, which is independent of the HVS model's output range. The results of this study will help to provide arguments for the application of an HVS model in our future medical image quality assessment metric.
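
The AUC figure of merit used here can be computed nonparametrically from a model's detection scores on signal-present and signal-absent images via the Mann-Whitney statistic. A minimal sketch follows (the function name and inputs are illustrative, and the one-shot variance estimate is omitted):

```python
def auc(scores_signal, scores_noise):
    """Nonparametric AUC: the probability that a signal-present image
    receives a higher detection score than a signal-absent one
    (Mann-Whitney / Wilcoxon statistic). Ties count one half."""
    wins = 0.0
    for s in scores_signal:
        for n in scores_noise:
            if s > n:
                wins += 1.0
            elif s == n:
                wins += 0.5
    return wins / (len(scores_signal) * len(scores_noise))
```

Because this estimator depends only on the rank ordering of the scores, it is insensitive to the output range of each HVS model, which is what makes AUC suitable for comparing heterogeneous models against the radiologists' gold standard.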

  12. Model-Independent Evaluation of Tumor Markers and a Logistic-Tree Approach to Diagnostic Decision Support

    Directory of Open Access Journals (Sweden)

    Weizeng Ni

    2014-01-01

    Full Text Available The sensitivity and specificity of individual tumor markers hardly meet clinical requirements. This challenge has given rise to many efforts, e.g., combining multiple tumor markers and employing machine learning algorithms. However, results from different studies are often inconsistent, which is partially attributed to the use of different evaluation criteria. Also, the wide use of model-dependent validation leads to a high possibility of data overfitting when complex models are used for diagnosis. We propose two model-independent criteria, namely area under the curve (AUC) and Relief, to evaluate the diagnostic values of individual and multiple tumor markers, respectively. For diagnostic decision support, we propose the use of a logistic tree, which combines a decision tree with logistic regression. Application to a colorectal cancer dataset shows that the proposed evaluation criteria produce results that are consistent with current knowledge. Furthermore, the simple and highly interpretable logistic tree has diagnostic performance that is competitive with other, more complex models.
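
As a sketch of the second criterion, the basic Relief algorithm weights each marker by how strongly an instance differs from its nearest neighbour of the other class (nearest miss) relative to its nearest neighbour of the same class (nearest hit). The following is an illustrative implementation for numeric, binary-labelled data, not the authors' code:

```python
def relief(X, y):
    """Basic Relief weighting for a binary-labelled dataset.
    Markers on which instances differ more from their nearest miss
    than from their nearest hit receive higher (more diagnostic) weights."""
    m, n_feat = len(X), len(X[0])
    w = [0.0] * n_feat

    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

    for i in range(m):
        xi, yi = X[i], y[i]
        hit = min((X[j] for j in range(m) if j != i and y[j] == yi),
                  key=lambda p: dist(p, xi))   # nearest same-class instance
        miss = min((X[j] for j in range(m) if y[j] != yi),
                   key=lambda p: dist(p, xi))  # nearest other-class instance
        for f in range(n_feat):
            w[f] += (abs(xi[f] - miss[f]) - abs(xi[f] - hit[f])) / m
    return w
```

On a toy dataset where marker 0 separates the classes and marker 1 is noise, marker 0 receives a positive weight and marker 1 a negative one, which is the ranking behaviour the evaluation criterion relies on.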

  13. Development of loca calculation capability with relap5-3D in accordance with the evaluation model methodology

    International Nuclear Information System (INIS)

    Liang, T.K.S.; Huan-Jen, Hung; Chin-Jang, Chang; Lance, Wang

    2001-01-01

    In light water reactors, particularly the pressurized water reactor (PWR), the severity of a LOCA (loss of coolant accident) limits how high the reactor power can be set. Although a best-estimate LOCA licensing methodology provides the greatest margin in the PCT (peak cladding temperature) evaluation during a LOCA, it generally takes more resources to develop. Instead, implementing the evaluation models required by Appendix K of 10 CFR 50 on an advanced thermal-hydraulic platform can also preserve a significant margin between the highest calculated PCT and the safety limit of 2200 F. The compliance of the current RELAP5-3D code with Appendix K of 10 CFR 50 has been evaluated, and it was found that there are ten areas where code assessment and/or further modification is required to satisfy the requirements set forth in Appendix K. The associated models for analyzing the consequent phenomena of a LOCA should follow the major regulatory concerns and are expected to give more conservative results than the best-estimate methodology. They are required to predict the decay power level, the blowdown hydraulics, the blowdown heat transfer, the flooding rate, and the flooding heat transfer. All ten areas covered by the above simulations have been further evaluated, and RELAP5-3D has been successfully modified to fulfill the associated requirements. In addition, to verify and assess the development of the Appendix K version of RELAP5-3D, nine separate-effect experiments were adopted. Through the assessments against these separate-effect experiments, the success of the code modification in accordance with Appendix K of 10 CFR 50 was demonstrated. We will apply another six sets of integral-effect experiments in the next step to assure the integral conservatism of the Appendix K version of RELAP5-3D in LOCA licensing evaluation. (authors)

  14. Nuclear models to 200 MeV for high-energy data evaluations. Vol.12

    International Nuclear Information System (INIS)

    Chadwick, M.; Reffo, G.; Dunford, C.L.; Oblozinsky, P.

    1998-01-01

    The work of the Nuclear Energy Agency's Subgroup 12 is described, which represents a collaborative effort to summarize the current status of nuclear reaction modelling codes and prioritize desired future model improvements. Nuclear reaction modelling codes that use appropriate physics in the energy region up to 200 MeV are the focus of this study, particularly those that have proved useful in nuclear data evaluation work. This study is relevant to developing needs in accelerator-driven technology programs, which require accurate nuclear data to high energies for enhanced radiation transport simulations to guide engineering design. (author)

  15. Capabilities and requirements for modelling radionuclide transport in the geosphere

    International Nuclear Information System (INIS)

    Paige, R.W.; Piper, D.

    1989-02-01

    This report gives an overview of geosphere flow and transport models suitable for use by the Department of the Environment in the performance assessment of radioactive waste disposal sites. An outline methodology for geosphere modelling is proposed, consisting of a number of different types of model. A brief description of each of the component models is given, indicating the purpose of the model, the processes being modelled and the methodologies adopted. Areas requiring development are noted. (author)

  16. A State-Based Modeling Approach for Efficient Performance Evaluation of Embedded System Architectures at Transaction Level

    Directory of Open Access Journals (Sweden)

    Anthony Barreteau

    2012-01-01

    Full Text Available Abstract models are necessary to assist system architects in the evaluation of hardware/software architectures and to cope with the still-increasing complexity of embedded systems. Efficient methods are required to create reliable models of system architectures and to allow early performance evaluation and fast exploration of the design space. In this paper, we present a specific transaction-level modeling approach for the performance evaluation of hardware/software architectures. This approach relies on a generic execution model that requires only light modeling effort. The created models are used to evaluate, by simulation, the expected processing and memory resources for various architectures. The proposed execution model relies on a specific computation method defined to improve the simulation speed of transaction-level models. The benefits of the proposed approach are highlighted through two case studies. The first case study is a didactic example illustrating the modeling approach; in this example, a simulation speed-up by a factor of 7.62 is achieved by using the proposed computation method. The second case study concerns the analysis of a communication receiver supporting part of the physical layer of the LTE protocol; in this case study, architecture exploration is conducted in order to improve the allocation of processing functions.

  17. Modelling and evaluation of surgical performance using hidden Markov models.

    Science.gov (United States)

    Megali, Giuseppe; Sinigaglia, Stefano; Tonet, Oliver; Dario, Paolo

    2006-10-01

    Minimally invasive surgery has become very widespread in the last ten years. Since surgeons experience difficulties in learning and mastering minimally invasive techniques, the development of training methods is of great importance. While virtual reality-based simulators have introduced a new paradigm in surgical training, skill evaluation methods are far from objective. This paper proposes a method for defining a model of surgical expertise and an objective metric to evaluate performance in laparoscopic surgery. Our approach is based on the processing of kinematic data describing the movements of surgical instruments. We use hidden Markov model theory to define an expert model that describes expert surgical gestures. The model is trained on kinematic data related to exercises performed on a surgical simulator by experienced surgeons. Subsequently, we use this expert model as a reference model in the definition of an objective metric to evaluate the performance of surgeons with different abilities. Preliminary results show that, using different topologies for the expert model, the method can be used efficiently both for discriminating between experienced and novice surgeons and for quantitatively assessing surgical ability.
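
The scoring step of such an approach can be sketched as follows: once kinematic data are quantized into discrete symbols, the forward algorithm yields the log-likelihood of a gesture sequence under the trained expert model, with higher values indicating motion more consistent with expert behaviour. This minimal discrete-HMM sketch is illustrative only; the paper's models are trained on continuous kinematic data:

```python
import math

def forward_loglik(obs, pi, A, B):
    """Scaled forward algorithm for a discrete-observation HMM.
    obs: sequence of symbol indices; pi[i]: initial state probabilities;
    A[i][j]: transition probabilities; B[i][k]: emission probabilities.
    Returns log P(obs | model)."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    loglik = 0.0
    for t in range(1, len(obs)):
        scale = sum(alpha)              # rescale to avoid underflow on long sequences
        loglik += math.log(scale)
        alpha = [a / scale for a in alpha]
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][obs[t]]
                 for j in range(n)]
    return loglik + math.log(sum(alpha))
```

An expert-vs-novice metric of the kind described above would score a candidate's gesture sequence under the expert model and compare the (length-normalized) log-likelihood against scores obtained from experienced surgeons.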

  18. Evaluating to Solve Educational Problems: An Alternative Model.

    Science.gov (United States)

    Friedman, Myles I.; Anderson, Lorin W.

    1979-01-01

    A 19-step general evaluation model is described through its four stages: identifying problems, prescribing program solutions, evaluating the operation of the program, and evaluating the effectiveness of the model. The role of the evaluator in decision making is also explored. (RAO)

  19. Modeling the allocation and economic evaluation of PV panels and wind turbines in urban areas

    NARCIS (Netherlands)

    Mohammadi, S.; Vries, de B.; Schaefer, W.F.; Timmermans, H.

    2014-01-01

    A model for allocating PV panels and wind turbines in urban areas is developed. Firstly, it examines the spatial and technical requirements for the installation of PV panels and wind turbines and then evaluates their economic feasibilities in order to generate the cost effective electricity neutral

  20. Market Competitiveness Evaluation of Mechanical Equipment with a Pairwise Comparisons Hierarchical Model.

    Science.gov (United States)

    Hou, Fujun

    2016-01-01

    This paper describes how market competitiveness evaluations concerning mechanical equipment can be made in multi-criteria decision environments. It is assumed that, when evaluating market competitiveness, there is a limited number of candidates with some required qualifications, and that the alternatives are pairwise compared on a ratio scale. The qualifications are depicted as criteria in a hierarchical structure. A hierarchical decision model called PCbHDM was used in this study, based on an analysis of its desirable traits. Illustration and comparison show that PCbHDM provides a convenient and effective tool for evaluating the market competitiveness of mechanical equipment. Researchers and practitioners might apply the findings of this paper when using PCbHDM.
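
Although PCbHDM has its own aggregation scheme, the core step of any pairwise-comparison hierarchical model is deriving priority weights from a ratio-scale comparison matrix. One common way to do this is the geometric-mean (row) method, sketched below; the function is illustrative and is not the PCbHDM itself:

```python
import math

def priority_weights(M):
    """Priority vector from a positive pairwise-comparison matrix via the
    geometric-mean (row) method. M[i][j] states how strongly alternative i
    is preferred to alternative j on a ratio scale (so M[j][i] = 1/M[i][j])."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]  # geometric mean of each row
    total = sum(gm)
    return [g / total for g in gm]                   # normalize to sum to 1
```

For a perfectly consistent matrix built from underlying weights, i.e. M[i][j] = w_i / w_j, the method recovers the weights exactly; with real expert judgments the matrix is only approximately consistent and the result is a best-fit ranking.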

  1. Evaluation of data requirements for computerized constructability analysis of pavement rehabilitation projects.

    Science.gov (United States)

    2013-08-01

    This research aimed to evaluate the data requirements for computer-assisted construction planning and staging methods that can be implemented in pavement rehabilitation projects in the state of Georgia. Results showed that two main issues for the...

  2. Required experimental accuracy to select between supersymmetrical models

    Science.gov (United States)

    Grellscheid, David

    2004-03-01

    We will present a method to decide a priori whether various supersymmetrical scenarios can be distinguished based on sparticle mass data alone. For each model, a scan over all free SUSY breaking parameters reveals the extent of that model's physically allowed region of sparticle-mass-space. Based on the geometrical configuration of these regions in mass-space, it is possible to obtain an estimate of the required accuracy of future sparticle mass measurements to distinguish between the models. We will illustrate this algorithm with an example. This talk is based on work done in collaboration with B C Allanach (LAPTH, Annecy) and F Quevedo (DAMTP, Cambridge).

  3. Modeling and simulation in the design and evaluation of conceptual safeguards systems

    International Nuclear Information System (INIS)

    Cobb, D.D.; Smith, D.B.

    1977-01-01

    Recent work in modeling nuclear fuel-cycle processes and the materials measurement and accountability portion of integrated safeguards systems is summarized. Process variability, especially in the levels of in-process holdup and in scrap and waste sidestreams, impacts significantly on the design of dynamic materials accounting and control systems. In the absence of operating data from modern facilities, detailed dynamic models of process operation are required in order to evaluate systems design concepts. Numerical data are presented from the simulated operation of major portions of spent fuel reprocessing, plutonium nitrate-to-oxide conversion, and mixed-oxide fuel-fabrication processes

  4. A Model for Evaluating Student Clinical Psychomotor Skills.

    Science.gov (United States)

    Fiel, Nicholas J.; And Others

    1979-01-01

    A long-range plan to evaluate medical students' physical examination skills was undertaken at the Ingham Family Medical Clinic at Michigan State University. The development of the psychomotor skills evaluation model to evaluate the skill of blood pressure measurement, tests of the model's reliability, and the use of the model are described. (JMD)

  5. Saphire models and software for ASP evaluations

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    Over the past three years, the Idaho National Engineering Laboratory (INEL) has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response, with a unique set of event trees for each plant class; (2) plant-specific fault trees using supercomponents; (3) generation and retention of all system and sequence cutsets; (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results; and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented

  6. Generic skills requirements (KSA model) towards future mechanical ...

    African Journals Online (AJOL)

    ... Statistics and Discriminant Analysis (DA) as required to achieve the objective of the study. This study will guide all future engineers, especially in the field of Mechanical Engineering in Malaysia to penetrate the job market according to the current market needs. Keywords: generic skills; KSA model; mechanical engineers; ...

  7. Modelling of capital requirements in the energy sector: capital market access. Final memorandum

    Energy Technology Data Exchange (ETDEWEB)

    1978-04-01

    Formal modelling techniques for analyzing the capital requirements of energy industries have been performed at DOE. A survey has been undertaken of a number of models which forecast energy-sector capital requirements or which detail the interactions of the energy sector and the economy. Models are identified which can be useful as prototypes for some portion of DOE's modelling needs. The models are examined to determine any useful data bases which could serve as inputs to an original DOE model. A selected group of models are examined which can comply with the stated capabilities. The data sources being used by these models are covered and a catalog of the relevant data bases is provided. The models covered are: capital markets and capital availability models (Fossil 1, Bankers Trust Co., DRI Macro Model); models of physical capital requirements (Bechtel Supply Planning Model, ICF Oil and Gas Model and Coal Model, Stanford Research Institute National Energy Model); macroeconomic forecasting models with input-output analysis capabilities (Wharton Annual Long-Term Forecasting Model, Brookhaven/University of Illinois Model, Hudson-Jorgenson/Brookhaven Model); utility models (MIT Regional Electricity Model-Baughman Joskow, Teknekron Electric Utility Simulation Model); and others (DRI Energy Model, DRI/Zimmerman Coal Model, and Oak Ridge Residential Energy Use Model).

  8. 45 CFR 2516.820 - What types of internal evaluation activities are required of programs?

    Science.gov (United States)

    2010-10-01

    ... required to: (a) Continuously assess management effectiveness, the quality of services provided, and the satisfaction of both participants and service recipients. Internal evaluations should seek frequent feedback...

  9. Commonsense Psychology and the Functional Requirements of Cognitive Models

    National Research Council Canada - National Science Library

    Gordon, Andrew S

    2005-01-01

    In this paper we argue that previous models of cognitive abilities (e.g. memory, analogy) have been constructed to satisfy functional requirements of implicit commonsense psychological theories held by researchers and nonresearchers alike...

  10. Licensing evaluation of CANDU-PHW nuclear power plants relative to U.S. regulatory requirements

    International Nuclear Information System (INIS)

    Erp, J.B. van

    1978-01-01

    Differences between the U.S. and Canadian approaches to safety and licensing are discussed. U.S. regulatory requirements are evaluated with regard to their applicability to CANDU-PHW reactors; conversely, the CANDU-PHW reactor is evaluated with respect to current Regulatory Requirements and Guides. A number of design modifications are proposed for incorporation into the CANDU-PHW reactor in order to facilitate its introduction into the U.S. These modifications are proposed solely for the purpose of maintaining consistency within the current U.S. regulatory system and not out of a need to improve the safety of current-design CANDU-PHW nuclear power plants. A number of issues are identified which still require resolution. Most of these issues concern design areas not (yet) covered by the ASME code. (author)

  11. Model evaluation of the phytoextraction potential of heavy metal hyperaccumulators and non-hyperaccumulators

    Energy Technology Data Exchange (ETDEWEB)

    Liang, H.-M. [Agricultural Biotechnology Research Center, Academia Sinica, 128 Section 2, Academia Road, Taipei, Taiwan 11529, Taiwan (China); Lin, T.-H. [Department of Statistics, National Taipei University, Taiwan (China); Chiou, J.-M. [Institute of Statistical Science, Academia Sinica, Taiwan (China); Yeh, K.-C., E-mail: kcyeh@gate.sinica.edu.t [Agricultural Biotechnology Research Center, Academia Sinica, 128 Section 2, Academia Road, Taipei, Taiwan 11529, Taiwan (China)

    2009-06-15

    Evaluation of the remediation ability of zinc/cadmium in hyper- and non-hyperaccumulator plant species through greenhouse studies is limited. To bridge the gap between greenhouse studies and field applications for phytoextraction, we used published data to examine the partitioning of heavy metals between plants and soil (defined as the bioconcentration factor). We compared the remediation ability of the Zn/Cd hyperaccumulators Thlaspi caerulescens and Arabidopsis halleri and the non-hyperaccumulators Nicotiana tabacum and Brassica juncea using a hierarchical linear model (HLM). A recursive algorithm was then used to evaluate how many harvest cycles were required to clean a contaminated site to meet Taiwan Environmental Protection Agency regulations. Despite the high bioconcentration factor of both hyperaccumulators, metal removal was still limited because of the plants' small biomass. Simulation with N. tabacum and the Cadmium model suggests further study and development of plants with high biomass and improved phytoextraction potential for use in environmental cleanup. - A quantitative solution enables the evaluation of Zn/Cd phytoextraction.
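
    The recursive harvest-cycle calculation described above can be sketched as follows. The bioconcentration factor, biomass, and soil figures are hypothetical stand-ins, not values from the paper; the point is that a high bioconcentration factor is offset by small biomass.

```python
def harvest_cycles(c_soil, limit, bcf, biomass_kg_per_m2, soil_kg_per_m2,
                   max_cycles=1000):
    """Number of harvests needed to bring soil concentration (mg/kg)
    below `limit`, assuming the metal removed per cycle is
    bcf * c_soil * biomass (bcf = plant/soil concentration ratio).
    Returns max_cycles if the limit is not reached by then."""
    cycles = 0
    while c_soil > limit and cycles < max_cycles:
        removed = bcf * c_soil * biomass_kg_per_m2   # mg removed per m2
        c_soil -= removed / soil_kg_per_m2           # new soil conc, mg/kg
        cycles += 1
    return cycles

# Hypothetical numbers: Cd at 10 mg/kg, a regulatory limit of 5 mg/kg,
# a hyperaccumulator with BCF = 10 but only 0.5 kg/m2 dry biomass,
# and 300 kg/m2 of topsoil per unit area.
print(harvest_cycles(10.0, 5.0, 10.0, 0.5, 300.0))
```

Even with a tenfold bioconcentration factor, dozens of harvests are needed here, illustrating the biomass limitation the abstract emphasizes.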

  12. Model evaluation of the phytoextraction potential of heavy metal hyperaccumulators and non-hyperaccumulators

    International Nuclear Information System (INIS)

    Liang, H.-M.; Lin, T.-H.; Chiou, J.-M.; Yeh, K.-C.

    2009-01-01

    Evaluation of the remediation ability of zinc/cadmium in hyper- and non-hyperaccumulator plant species through greenhouse studies is limited. To bridge the gap between greenhouse studies and field applications for phytoextraction, we used published data to examine the partitioning of heavy metals between plants and soil (defined as the bioconcentration factor). We compared the remediation ability of the Zn/Cd hyperaccumulators Thlaspi caerulescens and Arabidopsis halleri and the non-hyperaccumulators Nicotiana tabacum and Brassica juncea using a hierarchical linear model (HLM). A recursive algorithm was then used to evaluate how many harvest cycles were required to clean a contaminated site to meet Taiwan Environmental Protection Agency regulations. Despite the high bioconcentration factor of both hyperaccumulators, metal removal was still limited because of the plants' small biomass. Simulation with N. tabacum and the Cadmium model suggests further study and development of plants with high biomass and improved phytoextraction potential for use in environmental cleanup. - A quantitative solution enables the evaluation of Zn/Cd phytoextraction.

  13. Specification of advanced safety modeling requirements (Rev. 0).

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H.; Tautges, T. J.

    2008-06-30

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represents an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models

  14. Specification of advanced safety modeling requirements (Rev. 0)

    International Nuclear Information System (INIS)

    Fanning, T. H.; Tautges, T. J.

    2008-01-01

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represents an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models will

  15. Performance Evaluation of Sadoghi Hospital Based on «EFQM» Organizational Excellence Model

    Directory of Open Access Journals (Sweden)

    A Sanayeei

    2013-04-01

    Introduction: The health care environment that organizations have faced in recent years is characterized by a high level of dynamism and development. To survive in such conditions, performance evaluation can play an effective role in ensuring proper service quality. This study aimed to evaluate the performance of the Shahid Sadoughi Yazd hospital through the EFQM approach. Methods: This was a descriptive cross-sectional study. The data collection instrument was the EFQM Organizational Excellence Model questionnaire, which was completed by all the managers. The research data were gathered from a sample of 302 patients, staff, personnel, and medical staff working in different parts of the hospital. Stratified random samples were selected, and descriptive statistics were utilized to analyze the data. Results: The results revealed that the Shahid Sadoughi hospital acquired 185.41 points out of the total 500 points considered in the EFQM model. In other words, the rating reflects the fact that the hospital has not achieved the desired position defined by the model. Conclusion: Since the hospital's performance falls in the low-middle range, much more attention is required with regard to therapeutic management in this hospital. Therefore, codifying an efficient and effective program to improve the hospital's performance is necessary. Furthermore, it seems that the EFQM model can be considered a comprehensive model for performance evaluation in hospitals.

  16. Modeling for Green Supply Chain Evaluation

    Directory of Open Access Journals (Sweden)

    Elham Falatoonitoosi

    2013-01-01

    Green supply chain management (GSCM) has become a practical approach to developing environmental performance. Under strict regulations and stakeholder pressures, enterprises need to enhance and improve GSCM practices, which are influenced by both traditional and green factors. This study developed a causal evaluation model to guide selection of qualified suppliers by prioritizing various criteria and mapping causal relationships to find effective criteria for improving the green supply chain. The aim of the case study was to model and examine the influential and important main GSCM practices, namely, green logistics, organizational performance, green organizational activities, environmental protection, and green supplier evaluation. In the case study, the Decision-Making Trial and Evaluation Laboratory (DEMATEL) technique is applied to test the developed model. The result of the case study shows that only the "green supplier evaluation" and "green organizational activities" criteria of the model are in the cause group, and the other criteria are in the effect group.
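
    The cause/effect grouping produced by the DEMATEL technique can be sketched as follows. The 3x3 direct-influence matrix is invented for illustration; a real application would use expert-survey ratings over the five GSCM criteria named above.

```python
def mat_inv(m):
    """Invert a small square matrix by Gauss-Jordan elimination
    with partial pivoting."""
    n = len(m)
    aug = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
           for i, row in enumerate(m)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[pivot] = aug[pivot], aug[col]
        p = aug[col][col]
        aug[col] = [v / p for v in aug[col]]
        for r in range(n):
            if r != col:
                f = aug[r][col]
                aug[r] = [a - f * b for a, b in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

def dematel_groups(direct):
    """Classify each criterion as 'cause' or 'effect' from a
    direct-influence matrix: T = N (I - N)^-1, where N is the
    normalized matrix; criterion i is a cause when its row sum of T
    exceeds its column sum."""
    n = len(direct)
    s = max(sum(row) for row in direct)          # normalization constant
    N = [[v / s for v in row] for row in direct]
    I_minus_N = [[(1.0 if i == j else 0.0) - N[i][j] for j in range(n)]
                 for i in range(n)]
    inv = mat_inv(I_minus_N)
    T = [[sum(N[i][k] * inv[k][j] for k in range(n)) for j in range(n)]
         for i in range(n)]
    r = [sum(T[i][j] for j in range(n)) for i in range(n)]
    c = [sum(T[i][j] for i in range(n)) for j in range(n)]
    return ["cause" if r[i] - c[i] > 0 else "effect" for i in range(n)]

influence = [  # hypothetical expert ratings, 0 (none) to 4 (very high)
    [0, 3, 4],
    [1, 0, 2],
    [1, 1, 0],
]
print(dematel_groups(influence))
```

Here the first criterion exerts much more influence than it receives, so it lands in the cause group; the other two fall in the effect group.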

  17. Evaluating Energy Efficiency Policies with Energy-Economy Models

    Energy Technology Data Exchange (ETDEWEB)

    Mundaca, Luis; Neij, Lena; Worrell, Ernst; McNeil, Michael A.

    2010-08-01

    The growing complexities of energy systems, environmental problems and technology markets are driving and testing most energy-economy models to their limits. To further advance bottom-up models from a multidisciplinary energy efficiency policy evaluation perspective, we review and critically analyse bottom-up energy-economy models and corresponding evaluation studies on energy efficiency policies to induce technological change. We use the household sector as a case study. Our analysis focuses on decision frameworks for technology choice, type of evaluation being carried out, treatment of market and behavioural failures, evaluated policy instruments, and key determinants used to mimic policy instruments. Although the review confirms criticism related to energy-economy models (e.g. unrealistic representation of decision-making by consumers when choosing technologies), they provide valuable guidance for policy evaluation related to energy efficiency. Different areas to further advance models remain open, particularly related to modelling issues, techno-economic and environmental aspects, behavioural determinants, and policy considerations.

  18. Assessing Requirements Quality through Requirements Coverage

    Science.gov (United States)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software
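
    The notion of requirements-based test-adequacy coverage can be illustrated minimally as the fraction of requirements exercised by at least one executed test. The traceability data below is hypothetical, and the paper's actual metrics (from its reference [9]) are structural and considerably richer than this sketch.

```python
def requirements_coverage(req_to_tests, executed_tests):
    """Fraction of requirements covered by at least one executed test.

    req_to_tests: requirement id -> set of test ids that exercise it
    executed_tests: set of test ids that were actually run
    """
    covered = sum(
        1 for tests in req_to_tests.values()
        if tests & executed_tests
    )
    return covered / len(req_to_tests)

# Hypothetical traceability data.
trace = {
    "REQ-1": {"t1", "t2"},
    "REQ-2": {"t3"},
    "REQ-3": set(),          # no test exercises this requirement
}
print(requirements_coverage(trace, {"t1", "t3"}))
```

A requirement with an empty test set, like REQ-3 above, caps attainable coverage and signals either a missing test or an untestable (insufficiently defined) requirement.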

  19. Two-handed assisted laparoscopic surgery: Evaluation in an animal model

    Directory of Open Access Journals (Sweden)

    Eduardo Sanchez-de-Badajoz

    2014-10-01

    Purposes: To evaluate in an animal model the feasibility of a novel concept of hand-assisted surgery consisting of inserting two hands into the abdomen instead of one. The chosen procedure was retroperitoneal lymph node dissection (L-RPLND), which was performed in five pigs. Surgical Technique: A Pfannenstiel incision and a transverse epigastric incision were made, through which both hands were introduced. The scope was inserted through the umbilicus. The colon was moved medially and the dissection was performed as in open surgery using short conventional surgical instruments. Comments: The surgery was completed easily and safely in much the same way as in open surgery. Two-handed laparoscopy may be indicated in cases that today still require an open approach, as it apparently makes the operation easier and significantly shortens the surgery time. However, new opinions and trials are required.

  20. Further Evaluation of a Practitioner Model for Increasing Eye Contact in Children With Autism.

    Science.gov (United States)

    Rapp, John T; Cook, Jennifer L; Nuta, Raluca; Balagot, Carissa; Crouchman, Kayla; Jenkins, Claire; Karim, Sidrah; Watters-Wybrow, Chelsea

    2018-02-01

    Cook et al. recently described a progressive model for teaching children with autism spectrum disorder (ASD) to provide eye contact with an instructor following a name call. The model included the following phases: contingent praise only, contingent edibles plus praise, stimulus prompts plus contingent edibles and praise, contingent video and praise, schedule thinning, generalization assessments, and maintenance evaluations. In the present study, we evaluated the extent to which modifications to the model were needed to train 15 children with ASD to engage in eye contact. Results show that 11 of 15 participants acquired eye contact with the progressive model; however, eight participants required one or more procedural modifications to the model to acquire eye contact. In addition, the four participants who did not acquire eye contact received one or more modifications. Results also show that participants who acquired eye contact with or without modifications continued to display high levels of the behavior during follow-up probes. We discuss directions for future research with and limitations of this progressive model.

  1. A Hybrid Network Model to Extract Key Criteria and Its Application for Brand Equity Evaluation

    Directory of Open Access Journals (Sweden)

    Chin-Yi Chen

    2012-01-01

    Making a decision implies that there are alternative choices to be considered, and a major challenge of decision-making is to identify the adequate criteria for program planning or problem evaluation. The decision-makers' criteria consist of the characteristics or requirements each alternative must possess, and the alternatives are rated on how well they possess each criterion. We often use criteria developed and used by different researchers and institutions; these criteria have similar meanings and can be substituted for one another. Choosing from existing criteria offers a practical method to engineers hoping to derive a set of criteria for evaluating objects or programs. We have developed a hybrid model for extracting evaluation criteria which considers substitutions between the criteria. The model is developed based on Social Network Analysis and Maximum Mean De-Entropy algorithms. In this paper, the introduced methodology is also applied to analyze the criteria for assessing brand equity as an application example. The proposed model demonstrates that it is useful in planning feasibility criteria and has applications for other evaluation-planning purposes.

  2. Challenges for the registration of vaccines in emerging countries: Differences in dossier requirements, application and evaluation processes.

    Science.gov (United States)

    Dellepiane, Nora; Pagliusi, Sonia

    2018-06-07

    The divergence of regulatory requirements and processes in developing and emerging countries contributes to hampering vaccine registration, and therefore delays access to high-quality, safe and efficacious vaccines for the respective populations. This report focuses on providing insights into the heterogeneity of registration requirements in terms of the numbering structure and overall content of dossiers for marketing authorisation applications for vaccines in different areas of the world. While it also illustrates the divergence of regulatory processes in general, as well as the need to avoid redundant reviews, it does not claim to provide a comprehensive view of all processes or existing facilitating mechanisms, nor is it intended to touch upon the differences in assessments made by different regulatory authorities. This report describes the work of regulatory experts from vaccine manufacturing companies, during a meeting held in Geneva in May 2017, to identify and quantify differences in the requirements for vaccine registration across three aspects: the dossier numbering structure and contents, the application forms, and the evaluation procedures, in different countries and regions. Module 1 of the Common Technical Document (CTD) was compared across 10 countries. Modules 2-5 of the CTDs of two regions and three countries were compared to the CTD of the US FDA. The application forms of eight countries were compared, as were the registration procedures of 134 importing countries. The analysis indicates a high degree of divergence in numbering structure and content requirements. Possible interventions that would lead to significant improvements in registration efficiency include alignment of the CTD numbering structure, a standardised model application form, and better convergence of evaluation procedures. Copyright © 2018.

  3. [Decision modeling for economic evaluation of health technologies].

    Science.gov (United States)

    de Soárez, Patrícia Coelho; Soares, Marta Oliveira; Novaes, Hillegonda Maria Dutilh

    2014-10-01

    Most economic evaluations that participate in decision-making processes for incorporation and financing of technologies of health systems use decision models to assess the costs and benefits of the compared strategies. Despite the large number of economic evaluations conducted in Brazil, there is a pressing need to conduct an in-depth methodological study of the types of decision models and their applicability in our setting. The objective of this literature review is to contribute to the knowledge and use of decision models in the national context of economic evaluations of health technologies. This article presents general definitions about models and concerns with their use; it describes the main models: decision trees, Markov chains, micro-simulation, simulation of discrete and dynamic events; it discusses the elements involved in the choice of model; and exemplifies the models addressed in national economic evaluation studies of diagnostic and therapeutic preventive technologies and health programs.

  4. [Impact of a training model for the Child Development Evaluation Test in primary care].

    Science.gov (United States)

    Rizzoli-Córdoba, Antonio; Delgado-Ginebra, Ismael; Cruz-Ortiz, Leopoldo Alfonso; Baqueiro-Hernández, César Iván; Martain-Pérez, Itzamara Jacqueline; Palma-Tavera, Josuha Alexander; Villasís-Keever, Miguel Ángel; Reyes-Morales, Hortensia; O'Shea-Cuevas, Gabriel; Aceves-Villagrán, Daniel; Carrasco-Mendoza, Joaquín; Antillón-Ocampo, Fátima Adriana; Villagrán-Muñoz, Víctor Manuel; Halley-Castillo, Elizabeth; Vargas-López, Guillermo; Muñoz-Hernández, Onofre

    The Child Development Evaluation (CDE) Test is a screening tool designed and validated in Mexico for the early detection of child developmental problems. Professionals who will administer the test in primary care facilities must first acquire knowledge about the test in order to generate reliable results. The aim of this work was to evaluate the impact of a training model for primary care workers from different professions by comparing knowledge acquired during the training course. The study used a before/after design, with participation in a CDE Test training course as the intervention. The course took place in six different Mexican states from October to December 2013. The same questions were used before and after. A total of 394 participants were included. The distribution according to professional profile was as follows: general physicians 73.4%, nursing 7.7%, psychology 7.1%, nutrition 6.1% and other professions 5.6%. The questions with the lowest correct-answer rates were associated with the scoring of the CDE Test. In the initial evaluation, 64.9% obtained a grade lower than 20, compared with 1.8% in the final evaluation. In the initial evaluation only 1.8% passed, compared with 75.15% in the final evaluation. The proposed model allows the participants to acquire general knowledge about the CDE Test. To improve the general results in future training courses, the scoring and interpretation of the test should be reinforced during training, together with participants' prior reading of the material. Copyright © 2015 Hospital Infantil de México Federico Gómez. Publicado por Masson Doyma México S.A. All rights reserved.

  5. Presenting an Evaluation Model for the Cancer Registry Software.

    Science.gov (United States)

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer incidence is steadily growing, the cancer registry is of great importance as the main core of cancer control programs, and much software has been designed for this purpose. Therefore, establishing a comprehensive evaluation model is essential to evaluate and compare a wide range of such software. In this study, the criteria of cancer registry software were determined by studying the documents and two functional software packages in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the results of validation, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, when the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises an evaluation tool and an evaluation method. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. Based on the findings, a criteria-based evaluation method was chosen. The model of this study encompasses the various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation between general criteria and specific ones, while trying to keep the criteria comprehensive. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.
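
    The 75% agreement rule used during validation can be sketched as a simple filter over expert votes: a checklist criterion is retained only if at least 75% of the surveyed experts endorse it. The criterion names and votes below are invented for illustration.

```python
AGREEMENT_THRESHOLD = 0.75  # the 75% cutoff described in the abstract

def retained_criteria(votes):
    """Return the criteria endorsed by at least AGREEMENT_THRESHOLD
    of experts.

    votes: criterion name -> list of booleans (one vote per expert)
    """
    return [
        name for name, v in votes.items()
        if sum(v) / len(v) >= AGREEMENT_THRESHOLD
    ]

# Hypothetical expert votes on three candidate checklist criteria.
votes = {
    "patient demographics": [True, True, True, True],
    "tumor morphology":     [True, True, True, False],
    "export to CSV":        [True, False, False, False],
}
print(retained_criteria(votes))
```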

  6. Evaluation of the BPMN According to the Requirements of the Enterprise Architecture Methodology

    Directory of Open Access Journals (Sweden)

    Václav Řepa

    2012-04-01

    Full Text Available This article evaluates some characteristics of the Business Process Modelling Notation from the perspective of the business system modelling methodology. First, the enterprise architecture context of business process management and the importance of standards are discussed. Then the Business System Modelling Methodology is introduced, with special attention paid to the Business Process Meta-model as a basis for evaluating the BPMN features. Particular basic concepts from the Business Process Meta-model are mapped to the usable constructs of the BPMN and related issues are analysed. Finally, conclusions are drawn and the general context is discussed.

  7. CMAQ Model Evaluation Framework

    Science.gov (United States)

    CMAQ is tested to establish the modeling system’s credibility in predicting pollutants such as ozone and particulate matter. Evaluation of CMAQ has been designed to assess the model’s performance for specific time periods and for specific uses.

  8. Evaluation of models of particulate suspension for a thorium ore stockpile

    International Nuclear Information System (INIS)

    Smith, W.J.

    1983-01-01

    Fifteen mathematical models of particle saltation, suspension, and resuspension were reviewed and categorized. Appropriate models were applied to the estimation of particulate releases from a hypothetical thorium ore storage pile. An assumed location (near Lemhi Pass, Montana) was used to permit the development of site-specific information on ore characteristics and environmental influences. The available models were characterized in terms of suitability for representing aspects of the ore pile, such as rough surface features, wide particle size range, and site-specific climate. Five models were selected for detailed study. A computer code for each of these is given. Site-specific data for the assumed ore stockpile location were prepared. These data were manipulated to provide the input values required for each of the five models. Representative values and ranges for model variables are tabulated. The response of each model to input data for selected variables was determined, and each model was evaluated in terms of the physical realism of its responses and its overall ability to represent the features of an ore stockpile. The two models providing the best representation were a modified version of the dust suspension subroutine TAILPS from the computer code MILDOS, and the dust suspension formulation from the computer code REDIST. Their responses are physically reasonable, although they differ from each other for two parameters: ore moisture and surface roughness. With the input values judged most representative of an ore pile near Lemhi Pass, the estimated release of suspended particulates is on the order of 1 g/m²·yr.

  9. Management Model for Evaluation and Selection of Engineering Equipment Suppliers for Construction Projects in Iraq

    Directory of Open Access Journals (Sweden)

    Kadhim Raheem Erzaij

    2016-06-01

    Full Text Available Engineering equipment is an essential part of a construction project and is usually manufactured with long lead times, large costs and special engineering requirements. The construction manager wants that equipment delivered on the site-need date, in the right quantity, at an appropriate cost and with the required quality, and this entails an efficient supplier who can satisfy these targets. Selection of an engineering equipment supplier is a crucial managerial process: it requires evaluation of multiple suppliers according to multiple criteria. This process is usually performed manually and based on only a limited set of evaluation criteria, so better alternatives may be neglected. A three-stage survey comprising a number of public and private companies in the Iraqi construction sector was employed to identify the main criteria and sub-criteria for supplier selection and their priorities. The main criteria identified were product quality, commercial aspect, delivery, reputation and position, and system quality. An effective technique in multiple criteria decision making (MCDM), the analytic hierarchy process (AHP), was used to obtain importance weights for the criteria based on expert judgment. Thereafter, a management software system for Evaluation and Selection of Engineering Equipment Suppliers (ESEES) was developed based on the results obtained from the AHP. This model was validated in a case study at the Municipality of Baghdad involving actual cases of selecting pump suppliers for infrastructure projects. According to the experts, this model can improve the current supplier selection process and help decision makers adopt better choices in the domain of selecting engineering equipment suppliers.
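
    The AHP weighting step described in this record can be sketched as follows. This is a minimal illustration using the common row geometric-mean approximation; the pairwise comparison matrix below is a hypothetical example, not data from the study.

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights via row geometric means."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical expert judgments for the five main criteria named above:
# product quality, commercial aspect, delivery, reputation/position, system quality
matrix = [
    [1,   3,   2,   5,   4],
    [1/3, 1,   1/2, 2,   2],
    [1/2, 2,   1,   3,   3],
    [1/5, 1/2, 1/3, 1,   1],
    [1/4, 1/2, 1/3, 1,   1],
]
weights = ahp_weights(matrix)
print([round(w, 3) for w in weights])
```

    With these (invented) judgments, product quality receives the largest weight, mirroring the kind of priority ranking the survey produced.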

  10. Improved Traceability of Mission Concept to Requirements Using Model Based Systems Engineering

    Science.gov (United States)

    Reil, Robin

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the traditional document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This thesis presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to analyze the completeness and consistency of the requirements to the mission concept. Overall experience and methodology are presented for both the MBSE and original DBSE design efforts of SporeSat.

  11. Critical Business Requirements Model and Metrics for Intranet ROI

    OpenAIRE

    Luqi; Jacoby, Grant A.

    2005-01-01

    Journal of Electronic Commerce Research, Vol. 6, No. 1, pp. 1-30. This research provides the first theoretical model, the Intranet Efficiency and Effectiveness Model (IEEM), to measure intranet overall value contributions based on a corporation’s critical business requirements by applying a balanced baseline of metrics and conversion ratios linked to key business processes of knowledge workers, IT managers and business decision makers -- in effect, closing the gap of understanding...

  12. A Synergistic Approach for Evaluating Climate Model Output for Ecological Applications

    Directory of Open Access Journals (Sweden)

    Rachel D. Cavanagh

    2017-09-01

    Full Text Available Increasing concern about the impacts of climate change on ecosystems is prompting ecologists and ecosystem managers to seek reliable projections of physical drivers of change. The use of global climate models in ecology is growing, although drawing ecologically meaningful conclusions can be problematic. The expertise required to access and interpret output from climate and earth system models is hampering progress in utilizing them most effectively to determine the wider implications of climate change. To address this issue, we present a joint approach between climate scientists and ecologists that explores key challenges and opportunities for progress. As an exemplar, our focus is the Southern Ocean, notable for significant change with global implications, and on sea ice, given its crucial role in this dynamic ecosystem. We combined perspectives to evaluate the representation of sea ice in global climate models. With an emphasis on ecologically relevant criteria (sea ice extent and seasonality), we selected a subset of eight models that reliably reproduce extant sea ice distributions. While the model subset shows a similar mean change to the full ensemble in sea ice extent (approximately 50% decline in winter and 30% decline in summer), there is a marked reduction in the range. This improved the precision of projected future sea ice distributions by approximately one third, and means they are more amenable to ecological interpretation. We conclude that careful multidisciplinary evaluation of climate models, in conjunction with ongoing modeling advances, should form an integral part of utilizing model output.
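
    The selection logic described here, keep only ensemble members that reproduce the observed state, then compare the spread of their projections, can be sketched in a few lines. All extents and projected changes below are invented numbers for illustration, not CMIP output.

```python
observed = 18.5  # winter sea-ice extent, 10^6 km^2 (hypothetical value)

# model name: (simulated historical extent, projected % change in extent)
ensemble = {
    "m1": (18.2, -48), "m2": (25.0, -20), "m3": (18.9, -52),
    "m4": (12.0, -75), "m5": (19.1, -45), "m6": (17.8, -55),
}

# Keep models whose historical extent error is within a tolerance:
subset = {k: v for k, v in ensemble.items() if abs(v[0] - observed) <= 1.0}

full_changes = [c for _, c in ensemble.values()]
sub_changes = [c for _, c in subset.values()]
print(sorted(subset),
      "subset range:", max(sub_changes) - min(sub_changes),
      "full range:", max(full_changes) - min(full_changes))
```

    The subset's projection range is much narrower than the full ensemble's, which is the "marked reduction in the range" effect the authors report.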

  13. Sustainable BECCS pathways evaluated by an integrated assessment model

    Science.gov (United States)

    Kato, E.

    2017-12-01

    Negative emissions technologies, particularly Bioenergy with Carbon Capture and Storage (BECCS), are key components of mitigation strategies in the ambitious future socioeconomic scenarios analysed by integrated assessment models. Generally, scenarios aiming to keep the mean global temperature rise below 2°C above pre-industrial levels require net negative carbon emissions at the end of the 21st century. Also, in the context of the Paris Agreement, which acknowledges "a balance between anthropogenic emissions by sources and removals by sinks of greenhouse gases in the second half of this century", RD&D on negative emissions technologies in this decade plays a crucial role in the possibility of early deployment of the technology. Because producing the bioenergy feedstock needed to achieve the anticipated level of gross negative emissions potentially requires extensive use of land and water, research on how to develop sustainable BECCS scenarios is needed. Here, we present BECCS deployment scenarios that consider an economically viable bioenergy system flow, including power generation and conversion processes to liquid and gaseous fuels for transportation and heat, with consideration of sustainable global biomass use. In the modelling process, detailed bioenergy representations, i.e. various feedstocks and conversion technologies with and without CCS, are implemented in the integrated assessment (IA) model GRAPE (Global Relationship Assessment to Protect the Environment). Also, to overcome a general discrepancy in assumed future agricultural yields between 'top-down' IA models and 'bottom-up' estimates, which crucially affects the land-use pattern, we applied yield changes for food and energy crops consistent with process-based biophysical crop models under changing climate conditions. Using this framework, economically viable strategies for implementing a sustainable bioenergy and BECCS flow are evaluated in the scenarios targeting to keep global average

  14. Distributed Evaluation of Local Sensitivity Analysis (DELSA), with application to hydrologic models

    Science.gov (United States)

    Rakovec, O.; Hill, M. C.; Clark, M. P.; Weerts, A. H.; Teuling, A. J.; Uijlenhoet, R.

    2014-01-01

    This paper presents a hybrid local-global sensitivity analysis method termed the Distributed Evaluation of Local Sensitivity Analysis (DELSA), which is used here to identify important and unimportant parameters and evaluate how model parameter importance changes as parameter values change. DELSA uses derivative-based "local" methods to obtain the distribution of parameter sensitivity across the parameter space, which promotes consideration of sensitivity analysis results in the context of simulated dynamics. This work presents DELSA, discusses how it relates to existing methods, and uses two hydrologic test cases to compare its performance with the popular global, variance-based Sobol' method. The first test case is a simple nonlinear reservoir model with two parameters. The second test case involves five alternative "bucket-style" hydrologic models with up to 14 parameters applied to a medium-sized catchment (200 km²) in the Belgian Ardennes. Results show that in both examples, Sobol' and DELSA identify similar important and unimportant parameters, with DELSA enabling more detailed insight at much lower computational cost. For example, in the real-world problem the time delay in runoff is the most important parameter in all models, but DELSA shows that for about 20% of parameter sets it is not important at all and alternative mechanisms and parameters dominate. Moreover, the time delay was identified as important in regions producing poor model fits, whereas other parameters were identified as more important in regions of the parameter space producing better model fits. The ability to understand how parameter importance varies through parameter space is critical to inform decisions about, for example, additional data collection and model development. The ability to perform such analyses with modest computational requirements provides exciting opportunities to evaluate complicated models as well as many alternative models.
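
    The core DELSA idea, computing derivative-based local sensitivities at many points sampled across parameter space rather than a single global index, can be sketched with a toy nonlinear model. The model, parameter ranges, and scaling below are illustrative assumptions, not the paper's actual reservoir model.

```python
import random

def model(k, s):
    """Toy nonlinear response: outflow for storage s and rate coefficient k."""
    return k * s ** 1.5

def scaled_local_sensitivity(f, params, i, h=1e-6):
    """Central-difference sensitivity of f to parameter i, scaled by its value."""
    lo, hi = list(params), list(params)
    lo[i] -= h
    hi[i] += h
    return params[i] * (f(*hi) - f(*lo)) / (2 * h)

random.seed(0)
samples = [(random.uniform(0.1, 1.0), random.uniform(1.0, 10.0))
           for _ in range(1000)]
sens_k = [abs(scaled_local_sensitivity(model, p, 0)) for p in samples]
sens_s = [abs(scaled_local_sensitivity(model, p, 1)) for p in samples]

# DELSA reports the *distribution* of importance across parameter space,
# not one global number:
print(f"k: median {sorted(sens_k)[500]:.2f}, s: median {sorted(sens_s)[500]:.2f}")
```

    Plotting (or summarizing) the two sensitivity distributions shows how importance varies across the space, which is exactly the insight the abstract credits DELSA with providing at low computational cost.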

  15. Evaluation of weather-based rice yield models in India

    Science.gov (United States)

    Sudharsan, D.; Adinarayana, J.; Reddy, D. Raji; Sreenivas, G.; Ninomiya, S.; Hirafuji, M.; Kiura, T.; Tanaka, K.; Desai, U. B.; Merchant, S. N.

    2013-01-01

    The objective of this study was to compare two different rice simulation models—standalone (Decision Support System for Agrotechnology Transfer [DSSAT]) and web based (SImulation Model for RIce-Weather relations [SIMRIW])—with agrometeorological data and agronomic parameters for estimation of rice crop production in the southern semi-arid tropics of India. Studies were carried out on the BPT5204 rice variety to evaluate the two crop simulation models. Long-term experiments were conducted in a research farm of Acharya N G Ranga Agricultural University (ANGRAU), Hyderabad, India. Initially, results were obtained using 4 years (1994-1997) of data with weather parameters from a local weather station to evaluate DSSAT simulated results against observed values. Linear regression models used for the purpose showed a close relationship between DSSAT and observed yield. Subsequently, yield comparisons were also carried out with SIMRIW and DSSAT, and validated with actual observed values. As the correlation coefficients of the SIMRIW simulations were within acceptable limits, further rice experiments in the monsoon (Kharif) and post-monsoon (Rabi) agricultural seasons (2009, 2010 and 2011) were carried out with a location-specific distributed sensor network system. These proximal systems help to simulate dry weight, leaf area index and potential yield with the Java-based SIMRIW on a daily/weekly/monthly/seasonal basis. These dynamic parameters are useful to the farming community for necessary decision making in a ubiquitous manner. However, SIMRIW requires fine tuning for better results/decision making.
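
    The kind of model-versus-observation evaluation reported here comes down to simple agreement statistics. A minimal sketch, with invented yield numbers purely for illustration:

```python
import math

# Hypothetical observed vs. simulated rice yields (t/ha) for six seasons
observed  = [4.2, 4.8, 5.1, 4.5, 5.6, 4.9]
simulated = [4.0, 4.9, 5.3, 4.4, 5.4, 5.0]

n = len(observed)
mean_o = sum(observed) / n

# Root mean square error of simulation against observation
rmse = math.sqrt(sum((s - o) ** 2 for s, o in zip(simulated, observed)) / n)

# Coefficient of determination of the 1:1 fit
ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
ss_tot = sum((o - mean_o) ** 2 for o in observed)
r2 = 1 - ss_res / ss_tot

print(f"RMSE = {rmse:.3f} t/ha, R^2 = {r2:.3f}")
```

    Low RMSE together with a high R² is what "a close relationship between DSSAT and observed yield" amounts to quantitatively.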

  16. A 2nd generation static model of greenhouse energy requirements (horticern) : a comparison with dynamic models

    CERN Document Server

    Jolliet, O; Munday, G L

    1989-01-01

    Optimisation of a greenhouse and its components requires a suitable model permitting precise determination of its energy requirements. Existing static models are simple but lack precision; dynamic models, though more precise, are unsuitable for use over long periods and difficult to handle in practice. A theoretical study and measurements from the CERN trial greenhouse have allowed the development of a new static model named "HORTICERN", which is precise, easy to use for predicting energy consumption, and takes into account the effects of solar energy, wind and radiative loss to the sky. This paper compares the HORTICERN model with the dynamic models of Bot, Takakura, Van Bavel and Gembloux, and demonstrates that its precision is comparable; differences are on average less than 5%, and it is independent of the type of greenhouse (e.g. single or double glazing, Hortiplus, etc.) and climate. The HORTICERN method has been developed for PC use and is proving to be a powerful tool for greenhouse optimisation by research work...
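
    A static greenhouse energy model of the kind described reduces to a steady-state balance: envelope loss minus transmitted solar gain. The sketch below illustrates that logic only; the U-value, transmissivity, and operating conditions are illustrative assumptions, not HORTICERN's actual formulation or coefficients.

```python
def heating_demand(area_m2, u_value, t_in, t_out, solar_w, tau):
    """Steady-state heating demand (W): envelope loss minus solar gain."""
    loss = u_value * area_m2 * (t_in - t_out)  # W, conductive/radiative loss
    gain = tau * solar_w                       # W, solar radiation transmitted inside
    return max(0.0, loss - gain)               # demand cannot go negative

demand = heating_demand(area_m2=500, u_value=6.0, t_in=18, t_out=5,
                        solar_w=20_000, tau=0.7)
print(demand)  # 25000.0 W
```

    Summing such a balance over hourly or daily climate records is what makes a static model usable over long periods, which is the practical advantage claimed for HORTICERN over dynamic models.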

  17. Model Evaluation and Uncertainty in Agricultural Impacts Assessments: Results and Strategies from the Agricultural Model Intercomparison and Improvement Project (AgMIP)

    Science.gov (United States)

    Rosenzweig, C.; Hatfield, J.; Jones, J. W.; Ruane, A. C.

    2012-12-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) is an international effort to assess the state of global agricultural modeling and to understand climate impacts on the agricultural sector. AgMIP connects the climate science, crop modeling, and agricultural economic modeling communities to generate probabilistic projections of current and future climate impacts. The goals of AgMIP are to improve substantially the characterization of risk of hunger and world food security due to climate change and to enhance adaptation capacity in both developing and developed countries. This presentation will describe the general approach of AgMIP, highlight AgMIP efforts to evaluate climate, crop, and economic models, and discuss AgMIP uncertainty assessments. Model evaluation efforts will be outlined using examples from various facets of AgMIP, including climate scenario generation, the wheat crop model intercomparison, and the global agricultural economics model intercomparison being led in collaboration with the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP). Strategies developed to quantify uncertainty in each component of AgMIP, as well as the propagation of uncertainty through the climate-crop-economic modeling framework, will be detailed and preliminary uncertainty assessments that highlight crucial areas requiring improved models and data collection will be introduced.

  18. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    OpenAIRE

    Staccini, Pascal M.; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate the conceptual applicability and practical understandability by clinical staff and members of users teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning...

  19. A Fuzzy Comprehensive Evaluation Model for Sustainability Risk Evaluation of PPP Projects

    Directory of Open Access Journals (Sweden)

    Libiao Bai

    2017-10-01

    Full Text Available Evaluating the sustainability risk level of public–private partnership (PPP) projects can reduce project risk incidents and achieve the sustainable development of the organization. However, existing studies of PPP project risk management mainly focus on exploring the impact of financial and revenue risks but ignore sustainability risks, causing the concept of "sustainability" to be missing when evaluating the risk level of PPP projects. To evaluate the sustainability risk level, and to achieve the most important objective of providing a reference for the public and private sectors when making decisions on PPP project management, this paper constructs a factor system for the sustainability risk of PPP projects based on an extensive literature review and develops a mathematical model based on the fuzzy comprehensive evaluation model (FCEM) and failure mode, effects and criticality analysis (FMECA) for evaluating the sustainability risk level of PPP projects. In addition, this paper conducts a computational experiment based on a questionnaire survey to verify the effectiveness and feasibility of the proposed model. The results suggest that the model is reasonable for evaluating the sustainability risk level of PPP projects. To our knowledge, this paper is the first study to evaluate the sustainability risk of PPP projects, which would not only enrich the theories of project risk management, but also serve as a reference for the public and private sectors in sustainable planning and development. Keywords: sustainability risk eva
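
    The FCEM step can be sketched as a weighted fuzzy aggregation: criterion weights combined with a membership matrix give a grade vector. The weights, grade labels, and membership degrees below are illustrative assumptions, not the paper's survey results.

```python
risk_grades = ["low", "medium", "high"]
weights = [0.5, 0.3, 0.2]  # hypothetical criterion weights (e.g. financial, social, environmental)

# membership[i][j]: degree to which criterion i belongs to risk grade j
membership = [
    [0.2, 0.5, 0.3],
    [0.6, 0.3, 0.1],
    [0.1, 0.4, 0.5],
]

# Weighted-average fuzzy operator: B = W . R
b = [sum(w * row[j] for w, row in zip(weights, membership))
     for j in range(len(risk_grades))]

# Maximum-membership principle gives the overall verdict
verdict = risk_grades[b.index(max(b))]
print([round(x, 2) for x in b], verdict)
```

    FMECA-style criticality scores would typically feed into the membership matrix; here the aggregation step alone is shown.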

  20. Developing a conceptual model for selecting and evaluating online markets

    Directory of Open Access Journals (Sweden)

    Sadegh Feizollahi

    2013-04-01

    Full Text Available There is considerable evidence emphasizing the benefits of using new information and communication technologies in international business, and many believe that e-commerce can help satisfy customers' explicit and implicit requirements. Internet shopping is a concept that developed after the introduction of electronic commerce. Information technology (IT) and its applications, specifically the internet and e-mail, promoted the development of e-commerce in terms of advertising, motivation and information. With the development of new technologies, credit and financial exchange facilities were then built into websites to facilitate e-commerce. The study sent a total of 200 questionnaires to the target group (teachers, students, professionals and managers of commercial web sites) and collected 130 questionnaires for final evaluation. Cronbach's alpha was used to measure the reliability of the measurement instruments (questionnaires), and confirmatory factor analysis was employed to assure construct validity. In addition, path analysis was applied to address the research questions and determine the market selection models. In the present study, after examining different aspects of e-commerce, we provide a conceptual model for selecting and evaluating online markets in Iran. These findings provide a consistent, targeted and holistic framework for the development of the Internet market in the country.
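
    The Cronbach's alpha reliability check mentioned above is a one-formula computation: alpha = k/(k-1) * (1 - sum of item variances / variance of totals). A quick sketch with a made-up 5-respondent, 3-item response matrix:

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# rows = respondents, columns = questionnaire items (hypothetical Likert scores)
responses = [
    [4, 5, 4],
    [3, 4, 3],
    [5, 5, 4],
    [2, 3, 2],
    [4, 4, 5],
]
k = len(responses[0])
item_vars = [variance([r[i] for r in responses]) for i in range(k)]
totals = [sum(r) for r in responses]
alpha = (k / (k - 1)) * (1 - sum(item_vars) / variance(totals))
print(round(alpha, 3))
```

    Values above roughly 0.7 are conventionally taken as acceptable reliability, which is the threshold a validation study of this kind would check against.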

  1. Scenario Evaluator for Electrical Resistivity survey pre-modeling tool

    Science.gov (United States)

    Terry, Neil; Day-Lewis, Frederick D.; Robinson, Judith L.; Slater, Lee D.; Halford, Keith J.; Binley, Andrew; Lane, John W.; Werkema, Dale D.

    2017-01-01

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, such as the appropriate depth intervals or resolution of the methods. The relationship between ERI data and resistivity is nonlinear; therefore, these limitations depend on site conditions and survey design and are best assessed through forward and inverse modeling exercises prior to field investigations. In this approach, proposed field surveys are first numerically simulated given the expected electrical properties of the site, and the resulting hypothetical data are then analyzed using inverse models. Performing ERI forward/inverse modeling, however, requires substantial expertise and can take many hours to implement. We present a new spreadsheet-based tool, the Scenario Evaluator for Electrical Resistivity (SEER), which features a graphical user interface that allows users to manipulate a resistivity model and instantly view how that model would likely be interpreted by an ERI survey. The SEER tool is intended for use by those who wish to determine the value of including ERI to achieve project goals, and is designed to have broad utility in industry, teaching, and research.

  2. Generalized economic model for evaluating disposal costs at a low-level waste disposal facility

    International Nuclear Information System (INIS)

    Baird, R.D.; Rogers, V.C.

    1985-01-01

    An economic model is developed which can be used to evaluate the cash flows associated with the development, operation, closure, and long-term maintenance of a proposed low-level radioactive waste disposal facility and to determine the unit disposal charges and unit surcharges which might result. The model includes the effects of the nominal interest rate (rate of return on investment, or cost of capital), inflation rate, waste volume growth rate, site capacity, the duration of the various phases of the facility history, and the cash flows associated with each phase. The model uses standard discounted cash flow techniques on an after-tax basis to determine the unit disposal charge necessary to cover all costs and expenses and to generate an adequate rate of return on investment. It separately considers the cash flows associated with post-operational activities to determine the required unit surcharge. The model is applied to three reference facilities to determine the respective unit disposal charges and unit surcharges, for various parameter values. The sensitivity of the model's unit disposal charge results is also evaluated.
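
    The discounted-cash-flow logic described here amounts to solving for the unit charge whose discounted revenues cover discounted costs at the required rate of return. A minimal pre-tax sketch; all cash-flow figures, the rate, and the facility lifetime are hypothetical, not the paper's reference-facility data.

```python
def present_value(cashflows, rate):
    """Discount a list of year-end cash flows to time zero."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, 1))

rate = 0.10                 # assumed nominal required rate of return
volumes = [20_000] * 10     # m^3 of waste accepted per year (hypothetical)
costs = [5_000_000] * 10    # annual development/operating costs, $ (hypothetical)
closure_cost = 8_000_000    # lump-sum closure cost at end of year 10

pv_costs = present_value(costs, rate) + closure_cost / (1 + rate) ** 10
pv_volume = present_value(volumes, rate)

# Charge per m^3 that makes the net present value of the venture zero
unit_charge = pv_costs / pv_volume
print(round(unit_charge, 2))
```

    The paper's model additionally works on an after-tax basis and treats post-operational cash flows separately to derive the surcharge; the same PV-ratio structure applies to that calculation.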

  3. 45 CFR 2516.810 - What types of evaluations are grantees and subgrantees required to perform?

    Science.gov (United States)

    2010-10-01

    ... Welfare (Continued) CORPORATION FOR NATIONAL AND COMMUNITY SERVICE SCHOOL-BASED SERVICE-LEARNING PROGRAMS...? All grantees and subgrantees are required to perform internal evaluations which are ongoing efforts to assess performance and improve quality. Grantees and subgrantees may, but are not required to, arrange...

  4. Economics of the specification 6M safety re-evaluation and regulatory requirements

    International Nuclear Information System (INIS)

    Hopper, C.M.

    1985-01-01

    The objective of this work was to examine the potential economic impact of the DOT Specification 6M criticality safety re-evaluation and regulatory requirements. The examination was based upon comparative analyses of current authorized fissile material load limits for the 6M, current Federal regulations (and interpretations) limiting the contents of Type B fissile material packages, limiting aggregates of fissile material packages, and recent proposed fissile material mass limits derived from specialized criticality safety analyses of the 6M package. The work examines influences on cost in the transportation, handling, and storage of fissile materials. Depending upon facility throughput requirements (and assumed incremental costs of fissile material packaging, storage, and transport), operating, storage, and transportation costs can be reduced significantly. As an example of the pricing algorithm application based upon reasonable cost influences, the first-year cost reductions could exceed four times the cost of the packaging nuclear criticality safety re-evaluation. 1 tab

  5. Cognition and procedure representational requirements for predictive human performance models

    Science.gov (United States)

    Corker, K.

    1992-01-01

    Models and modeling environments for human performance are becoming significant contributors to early system design and analysis procedures. Issues of levels of automation, physical environment, informational environment, and manning requirements are being addressed by such man/machine analysis systems. The research reported here investigates the close interaction between models of human cognition and models that describe procedural performance. We describe a methodology for the decomposition of aircrew procedures that supports interaction with models of cognition on the basis of procedures observed; that serves to identify cockpit/avionics information sources and crew information requirements; and that provides the structure to support methods for function allocation among crew and aiding systems. Our approach is to develop an object-oriented, modular, executable software representation of the aircrew, the aircraft, and the procedures necessary to satisfy flight-phase goals. We then encode, in a time-based language, taxonomies of the conceptual, relational, and procedural constraints among the cockpit avionics and control system and the aircrew. We have designed and implemented a goals/procedures hierarchic representation sufficient to describe procedural flow in the cockpit. We then execute the procedural representation in simulation software and calculate the values of the flight instruments, aircraft state variables and crew resources using the constraints available from the relationship taxonomies. The system provides a flexible, extensible, manipulable and executable representation of aircrew and procedures that is generally applicable to crew/procedure task analysis. The representation supports developed methods of intent inference and is extensible to include issues of information requirements and functional allocation. We are attempting to link the procedural representation to models of cognitive functions to establish several intent inference methods.
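
    An object-oriented goals/procedures hierarchy of the kind described can be sketched very compactly: goals decompose into subgoals or executable procedures, and queries (here, total duration) recurse over the tree. The class names and the flight-deck example are illustrative, not the paper's actual taxonomy.

```python
class Procedure:
    """Leaf node: an executable crew action with a nominal duration."""
    def __init__(self, name, duration_s):
        self.name, self.duration_s = name, duration_s
    def total_duration(self):
        return self.duration_s

class Goal:
    """Interior node: a flight-phase goal decomposed into subgoals/procedures."""
    def __init__(self, name, children):
        self.name, self.children = name, children
    def total_duration(self):
        return sum(c.total_duration() for c in self.children)

# Hypothetical decomposition of one flight-phase goal
descend = Goal("initiate descent", [
    Procedure("set target altitude", 4),
    Goal("configure autopilot", [
        Procedure("select VNAV mode", 2),
        Procedure("verify mode annunciation", 3),
    ]),
])
print(descend.total_duration())  # → 9
```

    The same tree walk that sums durations can attach information requirements or resource demands to each node, which is how such a representation supports function allocation analyses.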

  6. Energy and nutrient deposition and excretion in the reproducing sow: model development and evaluation

    DEFF Research Database (Denmark)

    Hansen, A V; Strathe, A B; Theil, Peter Kappel

    2014-01-01

    requirements for maintenance, and fetal and maternal growth were described. In the lactating module, a factorial approach was used to estimate requirements for maintenance, milk production, and maternal growth. The priority for nutrient partitioning was assumed to be in the order of maintenance, milk production, and maternal growth, with body tissue losses constrained within biological limits. Global sensitivity analysis showed that nonlinearity in the parameters was small. The model outputs considered were the total protein and fat deposition, average urinary and fecal N excretion, average methane emission, manure carbon excretion, and manure production. The model was evaluated using independent data sets from the literature using root mean square prediction error (RMSPE) and concordance correlation coefficients. The gestation module predicted body fat gain better than body protein gain, which...
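
    The factorial logic of the lactating module, requirement = maintenance + milk + maternal growth, with body reserves covering any intake deficit, can be sketched as below. The maintenance coefficient and all input values are illustrative placeholders, not the paper's parameter estimates.

```python
def lactation_energy_balance(bw_kg, milk_mj, growth_mj, intake_mj):
    """Toy factorial energy balance for a lactating sow (MJ ME/day)."""
    maintenance = 0.44 * bw_kg ** 0.75  # assumed coefficient, metabolic body weight
    requirement = maintenance + milk_mj + growth_mj
    # Priority order: maintenance, then milk, then maternal growth;
    # any deficit is met by mobilizing body tissue (within biological limits).
    deficit = max(0.0, requirement - intake_mj)
    return {"requirement": requirement, "mobilized": deficit}

result = lactation_energy_balance(bw_kg=220, milk_mj=65.0,
                                  growth_mj=0.0, intake_mj=80.0)
print({k: round(v, 1) for k, v in result.items()})
```

    With these invented inputs the sow's requirement exceeds intake, so the sketch reports tissue mobilization, mirroring the partitioning priority the abstract describes.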

  7. Educational game models: conceptualization and evaluation ...

    African Journals Online (AJOL)

    Educational game models: conceptualization and evaluation. ... The Game Object Model (GOM), which marries educational theory and game design, forms the basis for the development of the Persona Outlining ...

  8. The Nuremberg Code subverts human health and safety by requiring animal modeling

    Directory of Open Access Journals (Sweden)

    Greek Ray

    2012-07-01

    Full Text Available Abstract Background The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive value of animal models when used as test subjects for human response to drugs and disease. We explore the use of animals for models in toxicity testing as an example of the problem with using animal models. Summary We conclude that the requirements for animal testing found in the Nuremberg Code were based on scientifically outdated principles, compromised by people with a vested interest in animal experimentation, serve no useful function, increase the cost of drug development, and prevent otherwise safe and efficacious drugs and therapies from being implemented.

  9. Biomechanical Evaluation of an Electric Power-Assisted Bicycle by a Musculoskeletal Model

    Science.gov (United States)

    Takehara, Shoichiro; Murakami, Musashi; Hase, Kazunori

    In this study, we construct an evaluation system for the muscular activity of the lower limbs when a human pedals an electric power-assisted bicycle. The evaluation system is composed of an electric power-assisted bicycle, a numerical simulator, and a motion capture system. The electric power-assisted bicycle in this study has a pedal with an attached force sensor. The numerical simulator for pedaling motion is a musculoskeletal model of a human. The motion capture system measures the joint angles of the lower limb. We examine the influence of the electric power-assist force on each muscle of the human trunk and legs. First, an experiment on pedaling motion is performed. Then, the musculoskeletal model is calculated using the experimental data. We discuss the influence of the electric power-assist on each muscle. It is found that muscular activity is decreased by the electric power-assisted bicycle, and the reduction of the muscular force required for pedaling motion is shown quantitatively for every muscle.

  10. Tijeras Arroyo Groundwater Current Conceptual Model and Corrective Measures Evaluation Report - December 2016.

    Energy Technology Data Exchange (ETDEWEB)

    Copland, John R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    This Tijeras Arroyo Groundwater Current Conceptual Model and Corrective Measures Evaluation Report (CCM/CME Report) has been prepared by the U.S. Department of Energy (DOE) and Sandia Corporation (Sandia) to meet requirements under the Sandia National Laboratories-New Mexico (SNL/NM) Compliance Order on Consent (the Consent Order). The Consent Order, entered into by the New Mexico Environment Department (NMED), DOE, and Sandia, became effective on April 29, 2004. The Consent Order identified the Tijeras Arroyo Groundwater (TAG) Area of Concern (AOC) as an area of groundwater contamination requiring further characterization and corrective action. This report presents an updated Conceptual Site Model (CSM) of the TAG AOC that describes the contaminant release sites, the geological and hydrogeological setting, and the distribution and migration of contaminants in the subsurface. The dataset used for this report includes the analytical results from groundwater samples collected through December 2015.

  11. Rock mechanics models evaluation report: Draft report

    International Nuclear Information System (INIS)

    1985-10-01

    This report documents the evaluation of the thermal and thermomechanical models and codes for repository subsurface design and for design constraint analysis. The evaluation was based on a survey of the thermal and thermomechanical codes and models that are applicable to subsurface design, followed by a Kepner-Tregoe (KT) structured decision analysis of the codes and models. The end result of the KT analysis is a balanced, documented recommendation of the codes and models which are best suited to conceptual subsurface design for the salt repository. The various laws for modeling the creep of rock salt are also reviewed in this report. 37 refs., 1 fig., 7 tabs
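The Kepner-Tregoe (KT) analysis mentioned above reduces, in its scoring step, to rating each candidate code against weighted criteria and recommending the highest weighted total. A hedged sketch of that step; the criteria names, weights, and ratings below are hypothetical illustrations, not values from the report:

```python
# Kepner-Tregoe-style weighted scoring: each candidate code is rated
# against weighted "wants" criteria; the highest weighted total wins.
# Criteria, weights, and ratings are invented for illustration.
criteria_weights = {"accuracy": 10, "documentation": 7, "run_cost": 5}

candidates = {
    "CODE_A": {"accuracy": 8, "documentation": 6, "run_cost": 9},
    "CODE_B": {"accuracy": 9, "documentation": 8, "run_cost": 5},
}

def kt_score(ratings):
    """Weighted total of a candidate's ratings."""
    return sum(criteria_weights[c] * r for c, r in ratings.items())

best = max(candidates, key=lambda name: kt_score(candidates[name]))
print(best)  # CODE_B (171 vs. 167)
```

The full KT method also screens candidates against mandatory "musts" before scoring; only codes passing every must-criterion would enter the dictionary above.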

  12. Evaluating Emulation-based Models of Distributed Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Stephen T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Gabert, Kasimir G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Tarman, Thomas D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Emulytics Initiatives

    2017-08-01

    Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The variety of uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  13. Ootw Tool Requirements in Relation to JWARS

    Energy Technology Data Exchange (ETDEWEB)

    Hartley III, D.S.; Packard, S.L.

    1998-01-01

    This document reports the results of the Office of the Secretary of Defense/Program Analysis & Evaluation (OSD/PA&E) sponsored project to identify how Operations Other Than War (OOTW) tool requirements relate to the Joint Warfare Simulation (JWARS) and, more generally, to joint analytical modeling and simulation (M&S) requirements. It includes recommendations about which OOTW tools (and functionality within tools) should be included in JWARS, which should be managed as joint analytical M&S tools, and which should be left for independent development.

  14. Nuclear safety culture evaluation model based on SSE-CMM

    International Nuclear Information System (INIS)

    Yang Xiaohua; Liu Zhenghai; Liu Zhiming; Wan Yaping; Peng Guojian

    2012-01-01

    Safety culture, which is of great significance to establishing safety objectives, characterizes the level of enterprise safety production and development. Traditional safety culture evaluation models emphasize the thinking and behavior of individuals and organizations, and pay attention to evaluation results while ignoring the process. Moreover, the determination of evaluation indicators lacks objective evidence. A novel multidimensional safety culture evaluation model, scientific and complete, is addressed by building a preliminary mapping between safety culture and the SSE-CMM's (Systems Security Engineering Capability Maturity Model) process areas and generic practices. The model focuses on enterprise system security engineering process evaluation and provides new ideas and scientific evidence for the study of safety culture. (authors)

  15. Spreadsheet Decision Support Model for Training Exercise Material Requirements Planning

    National Research Council Canada - National Science Library

    Tringali, Arthur

    1997-01-01

    This thesis focuses on developing a spreadsheet decision support model that can be used by combat engineer platoon and company commanders in determining the material requirements and estimated costs...

  16. An analytical modeling framework to evaluate converged networks through business-oriented metrics

    International Nuclear Information System (INIS)

    Guimarães, Almir P.; Maciel, Paulo R.M.; Matias, Rivalino

    2013-01-01

    Nowadays, society increasingly relies on convergent networks as an essential means for individuals, businesses, and governments. Strategies, methods, models and techniques for preventing and handling hardware or software failures, as well as avoiding performance degradation, are thus fundamental for prevailing in business. Issues such as operational costs, revenues and their relationship to key performance and dependability metrics are central for defining the required system infrastructure. Our work aims to provide system performance and dependability models for supporting optimization of infrastructure design with respect to business-oriented metrics. In addition, a methodology is also adopted to support both the modeling and the evaluation process. The results showed that the proposed methodology can significantly reduce the complexity of infrastructure design as well as improve the relationship between business and infrastructure aspects
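The link between a dependability metric and a business-oriented metric can be made concrete with a simple availability calculation. A hedged sketch; the MTBF/MTTR figures and downtime cost rate are hypothetical, not values from the paper:

```python
# Linking a dependability metric (steady-state availability) to a
# business-oriented metric (expected annual downtime cost).
# All numeric inputs below are invented for illustration.
mtbf_h = 2000.0             # mean time between failures, hours
mttr_h = 4.0                # mean time to repair, hours
revenue_loss_per_h = 500.0  # cost of one hour of downtime

availability = mtbf_h / (mtbf_h + mttr_h)
annual_downtime_h = (1 - availability) * 365 * 24
annual_cost = annual_downtime_h * revenue_loss_per_h

print(round(availability, 4))  # 0.998
print(round(annual_cost, 2))
```

Halving MTTR (faster repair) roughly halves the expected annual cost, which is the kind of trade-off such models let designers quantify before committing to an infrastructure.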

  17. Design of Training Systems (DOTS) Project: Test and Evaluation of Phase II Models

    Science.gov (United States)

    1976-04-01

    when the process being modeled is very much dependent upon human resources, precise requirement formulas are usually unavailable. In this...mixed integer formulation options. The SGRR, in a sense, is an automation of what is currently being done mentally by instructors and training...test and evaluation (T&E)

  18. Modeling and Evaluating Pilot Performance in NextGen: Review of and Recommendations Regarding Pilot Modeling Efforts, Architectures, and Validation Studies

    Science.gov (United States)

    Wickens, Christopher; Sebok, Angelia; Keller, John; Peters, Steve; Small, Ronald; Hutchins, Shaun; Algarin, Liana; Gore, Brian Francis; Hooey, Becky Lee; Foyle, David C.

    2013-01-01

    NextGen operations are associated with a variety of changes to the national airspace system (NAS) including changes to the allocation of roles and responsibilities among operators and automation, the use of new technologies and automation, additional information presented on the flight deck, and the entire concept of operations (ConOps). In the transition to NextGen airspace, aviation and air operations designers need to consider the implications of design or system changes on human performance and the potential for error. To ensure continued safety of the NAS, it will be necessary for researchers to evaluate design concepts and potential NextGen scenarios well before implementation. One approach for such evaluations is through human performance modeling. Human performance models (HPMs) provide effective tools for predicting and evaluating operator performance in systems. HPMs offer significant advantages over empirical, human-in-the-loop testing in that (1) they allow detailed analyses of systems that have not yet been built, (2) they offer great flexibility for extensive data collection, and (3) they do not require experimental participants, and thus can offer cost and time savings. HPMs differ in their ability to predict performance and safety with NextGen procedures, equipment and ConOps. Models also vary in terms of how they approach human performance (e.g., some focus on cognitive processing, others focus on discrete tasks performed by a human, while others consider perceptual processes), and in terms of their associated validation efforts. The objectives of this research effort were to support the Federal Aviation Administration (FAA) in identifying HPMs that are appropriate for predicting pilot performance in NextGen operations, to provide guidance on how to evaluate the quality of different models, and to identify gaps in pilot performance modeling research that could guide future research opportunities. This research effort is intended to help the FAA

  19. CRITICAL ANALYSIS OF EVALUATION MODEL LOMCE

    Directory of Open Access Journals (Sweden)

    José Luis Bernal Agudo

    2015-06-01

    Full Text Available The evaluation model projected in the LOMCE sinks its roots into neoliberal beliefs, reflecting a specific way of understanding the world. What matters is not the process but the results, with evaluation at the center of the teaching-learning processes. The model is poorly planned, since the theory that justifies it is not specified in coherent proposals; there is an excessive concern for excellence, and diversity is left out. A comprehensive way of understanding education should be recovered.

  20. Evaluation of integrated assessment model hindcast experiments: a case study of the GCAM 3.0 land use module

    Science.gov (United States)

    Snyder, Abigail C.; Link, Robert P.; Calvin, Katherine V.

    2017-11-01

    Hindcasting experiments (conducting a model forecast for a time period in which observational data are available) are being undertaken increasingly often by the integrated assessment model (IAM) community, across many scales of models. When they are undertaken, the results are often evaluated using global aggregates or otherwise highly aggregated skill scores that mask deficiencies. We select a set of deviation-based measures that can be applied on different spatial scales (regional versus global) to make evaluating the large number of variable-region combinations in IAMs more tractable. We also identify performance benchmarks for these measures, based on the statistics of the observational dataset, that allow a model to be evaluated in absolute terms rather than relative to the performance of other models at similar tasks. An ideal evaluation method for hindcast experiments in IAMs would feature both absolute measures for evaluation of a single experiment for a single model and relative measures to compare the results of multiple experiments for a single model or the same experiment repeated across multiple models, such as in community intercomparison studies. The performance benchmarks highlight the use of this scheme for model evaluation in absolute terms, providing information about the reasons a model may perform poorly on a given measure and therefore identifying opportunities for improvement. To demonstrate the use of and types of results possible with the evaluation method, the measures are applied to the results of a past hindcast experiment focusing on land allocation in the Global Change Assessment Model (GCAM) version 3.0. The question of how to more holistically evaluate models as complex as IAMs is an area for future research. We find quantitative evidence that global aggregates alone are not sufficient for evaluating IAMs that require global supply to equal global demand at each time period, such as GCAM. The results of this work indicate it is
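The abstract's idea of an absolute performance benchmark drawn from the statistics of the observational dataset can be illustrated with one common choice: comparing a model's RMS deviation against the spread of the observations themselves. This is a generic sketch under that assumption, not the specific measures of the GCAM study:

```python
import math

def rms_deviation(model, obs):
    """RMS deviation of model output from observations."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def benchmark(obs):
    # One possible absolute benchmark: the standard deviation of the
    # observations; a model whose RMS deviation exceeds it does no
    # better than always predicting the observed mean.
    mean = sum(obs) / len(obs)
    return math.sqrt(sum((o - mean) ** 2 for o in obs) / len(obs))

obs = [10.0, 12.0, 11.0, 13.0]    # synthetic observations
model = [10.5, 11.5, 11.5, 12.5]  # synthetic hindcast output

score = rms_deviation(model, obs)
print(score <= benchmark(obs))  # True: the model beats the mean-only benchmark
```

Because the benchmark depends only on the observations, the same pass/fail judgment can be made per region and per variable, without reference to other models, which is the "absolute" evaluation the abstract argues for.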

  1. Implicit misattribution of evaluative responses: contingency-unaware evaluative conditioning requires simultaneous stimulus presentations.

    Science.gov (United States)

    Hütter, Mandy; Sweldens, Steven

    2013-08-01

    Recent research has shown that evaluative conditioning (EC) procedures can change attitudes without participants' awareness of the contingencies between conditioned and unconditioned stimuli (Hütter, Sweldens, Stahl, Unkelbach, & Klauer, 2012). We present a theoretical explanation and boundary condition for the emergence of unaware EC effects based on the implicit misattribution of evaluative responses from unconditioned to conditioned stimuli. We hypothesize that such misattribution is only possible when conditioned and unconditioned stimuli are perceived simultaneously. Therefore we manipulate the simultaneity of the stimulus presentations and apply a process dissociation procedure to distinguish contingency-aware from contingency-unaware EC effects. A multinomial model indicates that with sequential presentations, EC effects do not occur without contingency awareness. However, unaware EC effects do occur with simultaneous presentations. The findings support dual-process theories of learning. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  2. Statistical modeling for visualization evaluation through data fusion.

    Science.gov (United States)

    Chen, Xiaoyu; Jin, Ran

    2017-11-01

    There is a high demand for data visualization that provides insights to users in various applications. However, a consistent, online visualization evaluation method to quantify mental workload or user preference is lacking, which leads to an inefficient visualization and user interface design process. Recently, the advancement of interactive and sensing technologies has made electroencephalogram (EEG) signals and eye movements, as well as visualization logs, available in user-centered evaluation. This paper proposes a data fusion model and the application procedure for quantitative and online visualization evaluation. Fifteen participants joined the study, which was based on three different visualization designs. The results provide a regularized regression model that can accurately predict the user's evaluation of task complexity, and indicate the significance of all three types of sensing data sets for visualization evaluation. This model can be widely applied to data visualization evaluation, and to other user-centered design evaluation and data analysis in human factors and ergonomics. Copyright © 2016 Elsevier Ltd. All rights reserved.
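A regularized regression of the kind the abstract reports can be sketched in closed form. The feature matrix below stands in for fused sensing features (EEG, eye movement, interaction logs) with synthetic values; the data, the ridge penalty, and the coefficient values are illustrative assumptions, not the study's model:

```python
import numpy as np

# Ridge (L2-regularized) regression in closed form:
#   w = (X^T X + lam * I)^(-1) X^T y
# Columns of X stand in for fused sensing features; values are synthetic.
rng = np.random.default_rng(0)
X = rng.normal(size=(15, 3))            # 15 participants, 3 fused features
true_w = np.array([0.8, -0.5, 0.3])     # "ground truth" used to generate y
y = X @ true_w + rng.normal(scale=0.05, size=15)  # noisy complexity ratings

lam = 0.1                               # regularization strength
w = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
print(np.round(w, 2))
```

The penalty term `lam * np.eye(3)` shrinks the coefficients toward zero, which stabilizes the fit when sensing features are correlated, a common situation when fusing multiple physiological signals.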

  3. A Convergent Participation Model for Evaluation of Learning Objects

    Directory of Open Access Journals (Sweden)

    John Nesbit

    2002-10-01

    Full Text Available The properties that distinguish learning objects from other forms of educational software - global accessibility, metadata standards, finer granularity and reusability - have implications for evaluation. This article proposes a convergent participation model for learning object evaluation in which representatives from stakeholder groups (e.g., students, instructors, subject matter experts, instructional designers, and media developers) converge toward more similar descriptions and ratings through a two-stage process supported by online collaboration tools. The article reviews evaluation models that have been applied to educational software and media, considers models for gathering and meta-evaluating individual user reviews that have recently emerged on the Web, and describes the peer review model adopted for the MERLOT repository. The convergent participation model is assessed in relation to other models and with respect to its support for eight goals of learning object evaluation: (1) aid for searching and selecting, (2) guidance for use, (3) formative evaluation, (4) influence on design practices, (5) professional development and student learning, (6) community building, (7) social recognition, and (8) economic exchange.

  4. Biology learning evaluation model in Senior High Schools

    Directory of Open Access Journals (Sweden)

    Sri Utari

    2017-06-01

    Full Text Available The study was to develop a Biology learning evaluation model in senior high schools that referred to the research and development model by Borg & Gall and the logic model. The evaluation model included the components of input, activities, output and outcomes. The developing procedures involved a preliminary study in the form of observation and theoretical review regarding the Biology learning evaluation in senior high schools. The product development was carried out by designing an evaluation model, designing an instrument, performing an instrument experiment and performing implementation. The instrument experiment involved teachers and students from Grade XII in senior high schools located in the City of Yogyakarta. For the data gathering technique and instrument, the researchers implemented an observation sheet, a questionnaire and a test. The questionnaire was applied in order to attain information regarding teacher performance, learning performance, classroom atmosphere and scientific attitude; on the other hand, the test was applied in order to attain information regarding Biology concept mastery. Then, for the analysis of the instrument construct, the researchers performed confirmatory factor analysis by means of the Lisrel 0.80 software, and the results of this analysis showed that the evaluation instrument was valid and reliable. The construct validity was between 0.43-0.79 while the reliability of the measurement model was between 0.88-0.94. Last but not least, the model feasibility test showed that the theoretical model was supported by the empirical data.

  5. Evaluation of a decontamination model

    International Nuclear Information System (INIS)

    Rippin, D.W.T.; Hanulik, J.; Schenker, E.; Ullrich, G.

    1981-02-01

    In the scale-up of a laboratory decontamination process difficulties arise due to the limited understanding of the mechanisms controlling the process. This paper contains some initial proposals which may contribute to the quantitative understanding of the chemical and physical factors which influence decontamination operations. General features required in a mathematical model to describe a fluid-solid reaction are discussed, and initial work is presented with a simple model which has had some success in describing the observed laboratory behaviour. (Auth.)

  6. Model Performance Evaluation and Scenario Analysis (MPESA)

    Science.gov (United States)

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM)

  7. Evaluation of the analysis models in the ASTRA nuclear design code system

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Nam Jin; Park, Chang Jea; Kim, Do Sam; Lee, Kyeong Taek; Kim, Jong Woon [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2000-11-15

    In the field of nuclear reactor design, the main practice has been the application of improved design code systems. During that process, a great deal of experience and knowledge was accumulated in processing input data, nuclear fuel reload design, and the production and analysis of design data. However, less effort was devoted to analyzing the methodology and to developing or improving those code systems. Recently, KEPCO Nuclear Fuel Company (KNFC) developed the ASTRA (Advanced Static and Transient Reactor Analyzer) code system for the purpose of nuclear reactor design and analysis. In the code system, two-group constants were generated from the CASMO-3 code system. The objective of this research is to analyze the analysis models used in the ASTRA/CASMO-3 code system. This evaluation requires in-depth comprehension of the models, which is as important as the development of the code system itself. Currently, most of the code systems used in domestic nuclear power plants were imported, so it is very difficult to maintain them and adapt them to changing situations. Therefore, the evaluation of the analysis models in the ASTRA nuclear reactor design code system is very important.

  8. Ethics Requirement Score: new tool for evaluating ethics in publications

    Science.gov (United States)

    dos Santos, Lígia Gabrielle; Fonseca, Ana Carolina da Costa e; Bica, Claudia Giuliano

    2014-01-01

    Objective To analyze ethical standards considered by health-related scientific journals, and to prepare the Ethics Requirement Score, a bibliometric index to be applied to scientific healthcare journals in order to evaluate criteria for ethics in scientific publication. Methods Journals related to healthcare selected from the Journal of Citation Reports™ 2010 database were considered as experimental units. Parameters related to publication ethics were analyzed for each journal. These parameters were acquired by analyzing the author’s guidelines or instructions on each journal website. The parameters considered were approval by an Internal Review Board, Declaration of Helsinki or Resolution 196/96, recommendations on plagiarism, need for application of Informed Consent Forms with the volunteers, declaration of confidentiality of patients, record in the database for clinical trials (if applicable), conflict of interest disclosure, and funding sources statement. Each item was analyzed considering its presence or absence. Result The foreign journals had a significantly higher Impact Factor than the Brazilian journals; however, no significant difference was observed in relation to the Ethics Requirement Score. There was no correlation between the Ethics Requirement Score and the Impact Factor. Conclusion Although the Impact Factor of foreign journals was considerably higher than that of the Brazilian publications, the results showed that the Impact Factor has no correlation with the proposed score. This allows us to state that the ethical requirements for publication in biomedical journals are not related to the comprehensiveness or scope of the journal. PMID:25628189
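A presence/absence index over the eight items listed in the abstract can be sketched directly. Scoring the items as a simple unweighted count is an assumption for illustration; the paper's exact scoring rule may differ:

```python
# The eight publication-ethics items named in the abstract, each recorded
# as present (True) or absent (False) in a journal's author guidelines.
# Summing them as an unweighted count is an illustrative assumption.
ITEMS = [
    "IRB approval",
    "Declaration of Helsinki / Resolution 196/96",
    "plagiarism recommendations",
    "informed consent form",
    "patient confidentiality",
    "clinical trial registration",
    "conflict of interest disclosure",
    "funding sources statement",
]

def ethics_requirement_score(journal_checklist):
    """Count how many of the eight items a journal's guidelines address."""
    return sum(1 for item in ITEMS if journal_checklist.get(item, False))

example = {item: True for item in ITEMS[:5]}  # hypothetical journal
print(ethics_requirement_score(example))  # 5
```

Because each journal reduces to a single integer from 0 to 8, the score can be correlated against the Impact Factor directly, which is the comparison the study reports.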

  9. Ethics Requirement Score: new tool for evaluating ethics in publications.

    Science.gov (United States)

    Santos, Lígia Gabrielle dos; Costa e Fonseca, Ana Carolina da; Bica, Claudia Giuliano

    2014-01-01

    To analyze ethical standards considered by health-related scientific journals, and to prepare the Ethics Requirement Score, a bibliometric index to be applied to scientific healthcare journals in order to evaluate criteria for ethics in scientific publication. Journals related to healthcare selected from the Journal of Citation Reports™ 2010 database were considered as experimental units. Parameters related to publication ethics were analyzed for each journal. These parameters were acquired by analyzing the author's guidelines or instructions on each journal website. The parameters considered were approval by an Internal Review Board, Declaration of Helsinki or Resolution 196/96, recommendations on plagiarism, need for application of Informed Consent Forms with the volunteers, declaration of confidentiality of patients, record in the database for clinical trials (if applicable), conflict of interest disclosure, and funding sources statement. Each item was analyzed considering its presence or absence. The foreign journals had a significantly higher Impact Factor than the Brazilian journals; however, no significant difference was observed in relation to the Ethics Requirement Score. There was no correlation between the Ethics Requirement Score and the Impact Factor. Although the Impact Factor of foreign journals was considerably higher than that of the Brazilian publications, the results showed that the Impact Factor has no correlation with the proposed score. This allows us to state that the ethical requirements for publication in biomedical journals are not related to the comprehensiveness or scope of the journal.

  10. Evaluation of semi-generic PBTK modeling for emergency risk assessment after acute inhalation exposure to volatile hazardous chemicals

    NARCIS (Netherlands)

    Olie, J Daniël N; Bessems, Jos G; Clewell, Harvey J; Meulenbelt, Jan; Hunault, Claudine C

    BACKGROUND: Physiologically Based Toxicokinetic Models (PBTK) may facilitate emergency risk assessment after chemical incidents with inhalation exposure, but they are rarely used due to their relative complexity and skill requirements. We aimed to tackle this problem by evaluating a semi-generic

  11. Evaluation of semi-generic PBTK modeling for emergency risk assessment after acute inhalation exposure to volatile hazardous chemicals

    NARCIS (Netherlands)

    Olie, J. Daniël N; Bessems, Jos G.; Clewell, Harvey J.; Meulenbelt, Jan; Hunault, Claudine C.

    2015-01-01

    BACKGROUND: Physiologically Based Toxicokinetic Models (PBTK) may facilitate emergency risk assessment after chemical incidents with inhalation exposure, but they are rarely used due to their relative complexity and skill requirements. We aimed to tackle this problem by evaluating a semi-generic

  12. Performance Evaluation Model for Application Layer Firewalls.

    Science.gov (United States)

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
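The three performance indicators the abstract names (throughput, delay, and packet loss rate) fall out of standard queuing formulas. A generic textbook sketch for a single finite-capacity M/M/1/K queue, not the paper's multi-layer Erlangian model; the arrival rate, service rate, and buffer size are hypothetical:

```python
# Basic M/M/1/K queue metrics: throughput, delay, and loss, the kinds of
# quantities the firewall model evaluates per layer. Inputs are invented.
def mm1k_metrics(lam, mu, K):
    """Steady-state metrics of an M/M/1 queue with at most K customers."""
    rho = lam / mu
    # Steady-state probabilities p_0 .. p_K
    if rho == 1:
        probs = [1.0 / (K + 1)] * (K + 1)
    else:
        norm = (1 - rho) / (1 - rho ** (K + 1))
        probs = [norm * rho ** n for n in range(K + 1)]
    loss = probs[K]                      # packet loss rate (arrival blocked)
    throughput = lam * (1 - loss)        # accepted arrival rate
    mean_in_system = sum(n * p for n, p in enumerate(probs))
    delay = mean_in_system / throughput  # mean sojourn time, by Little's law
    return throughput, delay, loss

tput, delay, loss = mm1k_metrics(lam=80.0, mu=100.0, K=10)
print(round(loss, 4))
```

Sweeping `mu` (service capacity) across the network, transport, and application stages under a fixed resource budget is the kind of allocation search the paper describes for finding the optimal scheme.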

  13. Performance Evaluation Model for Application Layer Firewalls.

    Directory of Open Access Journals (Sweden)

    Shichang Xuan

    Full Text Available Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.

  14. Communicating Sustainability: An Operational Model for Evaluating Corporate Websites

    Directory of Open Access Journals (Sweden)

    Alfonso Siano

    2016-09-01

    Full Text Available The interest in corporate sustainability has increased rapidly in recent years and has encouraged organizations to adopt appropriate digital communication strategies, in which the corporate website plays a key role. Despite this growing attention in both the academic and business communities, models for the analysis and evaluation of online sustainability communication have not been developed to date. This paper aims to develop an operational model to identify and assess the requirements of sustainability communication in corporate websites. It has been developed from a literature review on corporate sustainability and digital communication and the analysis of the websites of the organizations included in the “Global CSR RepTrak 2015” by the Reputation Institute. The model identifies the core dimensions of online sustainability communication (orientation, structure, ergonomics, content—OSEC), sub-dimensions, such as stakeholder engagement and governance tools, communication principles, and measurable items (e.g., presence of the materiality matrix, interactive graphs). A pilot study on the websites of the energy and utilities companies included in the Dow Jones Sustainability World Index 2015 confirms the applicability of the OSEC framework. Thus, the model can provide managers and digital communication consultants with an operational tool that is useful for developing an industry ranking and assessing the best practices. The model can also help practitioners to identify corrective actions in the critical areas of digital sustainability communication and avoid greenwashing.

  15. Modeling of requirement specification for safety critical real time computer system using formal mathematical specifications

    International Nuclear Information System (INIS)

    Sankar, Bindu; Sasidhar Rao, B.; Ilango Sambasivam, S.; Swaminathan, P.

    2002-01-01

    Full text: Real time computer systems are increasingly used for safety critical supervision and control of nuclear reactors. Typical application areas are supervision of the reactor core against coolant flow blockage, supervision of clad hot spots, supervision of undesirable power excursions, power control and control logic for fuel handling systems. The most frequent cause of fault in safety critical real time computer systems is traced to fuzziness in the requirement specification. To ensure the specified safety, it is necessary to model the requirement specification of safety critical real time computer systems using formal mathematical methods. Modeling eliminates the fuzziness in the requirement specification and also helps to prepare the verification and validation schemes. Test data can be easily designed from the model of the requirement specification. Z and B are popular languages used for modeling the requirement specification. A typical safety critical real time computer system for supervising the reactor core of the prototype fast breeder reactor (PFBR) against flow blockage is taken as a case study. Modeling techniques and the actual model are explained in detail. The advantages of modeling for ensuring safety are summarized.

  16. HIGH RESOLUTION RESISTIVITY LEAK DETECTION DATA PROCESSING and EVALUATION METHODS and REQUIREMENTS

    International Nuclear Information System (INIS)

    SCHOFIELD JS

    2007-01-01

    This document has two purposes: (1) describe how data generated by High Resolution Resistivity (HRR) leak detection (LD) systems deployed during single-shell tank (SST) waste retrieval operations are processed and evaluated; and (2) provide the basic review requirements for HRR data when HRR is deployed as a leak detection method during SST waste retrievals.

  17. Functional Fit Evaluation to Determine Optimal Ease Requirements in Canadian Forces Chemical Protective Gloves

    National Research Council Canada - National Science Library

    Tremblay-Lutter, Julie

    1995-01-01

    A functional fit evaluation of the Canadian Forces (CF) chemical protective lightweight glove was undertaken in order to quantify the amount of ease required within the glove for optimal functional fit...

  18. Developmental Education Evaluation Model.

    Science.gov (United States)

    Perry-Miller, Mitzi; And Others

    A developmental education evaluation model designed to be used at a multi-unit urban community college is described. The purpose of the design was to determine the cost effectiveness/worth of programs in order to initiate self-improvement. A needs assessment was conducted by interviewing and taping the responses of students, faculty, staff, and…

  19. Evaluation of INL Supplied MOOSE/OSPREY Model: Modeling Water Adsorption on Type 3A Molecular Sieve

    Energy Technology Data Exchange (ETDEWEB)

    Pompilio, L. M. [Syracuse University; DePaoli, D. W. [ORNL; Spencer, B. B. [ORNL

    2014-08-29

    The purpose of this study was to evaluate Idaho National Lab’s Multiphysics Object-Oriented Simulation Environment (MOOSE) software in modeling the adsorption of water onto type 3A molecular sieve (3AMS). MOOSE can be thought of as a computing framework within which applications modeling specific coupled phenomena can be developed and run. The application titled Off-gas SeParation and REcoverY (OSPREY) has been developed to model gas sorption in packed columns. The sorbate breakthrough curve calculated by MOOSE/OSPREY was compared to results previously obtained in the deep bed hydration tests conducted at Oak Ridge National Laboratory. The coding framework permits selection of various options, when they exist, for modeling a process. For example, the OSPREY module includes options to model the adsorption equilibrium with a Langmuir model or a generalized statistical thermodynamic adsorption (GSTA) model. The vapor solid equilibria and the operating conditions of the process (e.g., gas phase concentration) are required to calculate the concentration gradient driving the mass transfer between phases. Both the Langmuir and GSTA models were tested in this evaluation. Input variables were either known from experimental conditions, or were available (e.g., density) or were estimated (e.g., thermal conductivity of sorbent) from the literature. Variables were considered independent of time, i.e., rather than having a mass transfer coefficient that varied with time or position in the bed, the parameter was set to remain constant. The calculated results did not coincide with data from laboratory tests. The model accurately estimated the number of bed volumes processed for the given operating parameters, but breakthrough times were not accurately predicted, varying 50% or more from the data. The shape of the breakthrough curves also differed from the experimental data, indicating a much wider sorption band. Model modifications are needed to improve its utility and
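The Langmuir equilibrium option named above has a simple closed form, q = q_max·b·c/(1 + b·c); the sketch below evaluates it with hypothetical parameters (q_max and b are not fitted to 3A molecular sieve data).

```python
def langmuir_loading(c, q_max, b):
    """Equilibrium sorbent loading under the Langmuir isotherm:
    q = q_max * b * c / (1 + b * c)."""
    return q_max * b * c / (1.0 + b * c)

# Hypothetical parameters, purely illustrative (not 3AMS-fitted):
conc = [0.0, 0.01, 0.1, 1.0]                     # gas-phase concentrations
q = [langmuir_loading(c, q_max=0.20, b=50.0) for c in conc]
# Loading rises monotonically with concentration and saturates below q_max.
```

The saturating shape is what distinguishes Langmuir behavior from a linear isotherm and drives the shape of the computed breakthrough curve.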

  20. Model Penentuan Nilai Target Functional Requirement Berbasis Utilitas

    Directory of Open Access Journals (Sweden)

    Cucuk Nur Rosyidi

    2012-01-01

    Full Text Available In a product design and development process, a designer faces a problem to decide functional requirement (FR) target values. That decision is made under risk since it is conducted in the early design phase using incomplete information. A utility function can be used to reflect the decision maker's attitude towards risk in making such a decision. In this research, we develop a utility-based model to determine FR target values using a quadratic utility function and information from Quality Function Deployment (QFD). A pencil design is used as a numerical example, with a quadratic utility function for each FR. The model can be applied for balancing customer and designer interests in determining FR target values.
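A minimal sketch of decision-making with a quadratic utility function, as used for FR target values above: for quadratic utility, expected utility depends only on the mean and variance of the outcome, so a riskier target with the same mean scores lower. The coefficients below are illustrative assumptions, not taken from the paper.

```python
def quadratic_utility(x, a=0.0, b=2.0, c=1.0):
    """Quadratic utility U(x) = a + b*x - c*x**2 (risk-averse for c > 0)."""
    return a + b * x - c * x * x

def expected_utility(mu, var, a=0.0, b=2.0, c=1.0):
    """E[U(X)] for X with mean mu and variance var:
    E[U] = a + b*mu - c*(mu**2 + var)."""
    return a + b * mu - c * (mu * mu + var)

# Two candidate FR targets with the same mean attainment but different risk:
safe  = expected_utility(mu=0.8, var=0.01)
risky = expected_utility(mu=0.8, var=0.09)   # larger variance, lower E[U]
```

The variance penalty term is what lets the model trade off designer ambition against the risk of missing the customer requirement.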

  1. 75 FR 20269 - Regulatory Reporting Requirements for the Indian Community Development Block Grant Program

    Science.gov (United States)

    2010-04-19

    .... Second, this rule requires ICDBG grantees to use the Logic Model form developed as part of HUD's Notice of Funding Availability (NOFA) process. The required use of the Logic Model will conform the ICDBG reporting requirements to those of other HUD competitive funding programs, and enhance the evaluation of...

  2. Metrics for Evaluation of Student Models

    Science.gov (United States)

    Pelanek, Radek

    2015-01-01

    Researchers use many different metrics for evaluation of performance of student models. The aim of this paper is to provide an overview of commonly used metrics, to discuss properties, advantages, and disadvantages of different metrics, to summarize current practice in educational data mining, and to provide guidance for evaluation of student…

  3. Evaluating environmental and economic consequences of alternative pest management strategies: results of modeling workshops

    Science.gov (United States)

    Johnson, Richard L.; Andrews, Austin K.; Auble, Gregor T.L.; Ellison, Richard A.; Hamilton, David B.; Roelle, James E.; McNamee, Peter J.

    1983-01-01

    The U.S. Environmental Protection Agency (EPA) needs a comprehensive method to evaluate the human health and environmental effects of alternative agricultural pest management strategies. This project explored the utility of Adaptive Environmental Assessment (AEA) techniques for meeting this need. The project objectives were to produce models for environmental impact analysis, improve communications, identify research needs and data requirements, and demonstrate a process for resolving conflicts. The project was structured around the construction (in an initial 2 1/2-day workshop) and examination (in a second 2 1/2-day workshop) of a simulation model of a corn agroecosystem.

  4. Evaluation Model of Tea Industry Information Service Quality

    OpenAIRE

    Shi , Xiaohui; Chen , Tian’en

    2015-01-01

    International audience; According to characteristics of tea industry information service, this paper have built service quality evaluation index system for tea industry information service quality, R-cluster analysis and multiple regression have been comprehensively used to contribute evaluation model with a high practice and credibility. Proved by the experiment, the evaluation model of information service quality has a good precision, which has guidance significance to a certain extent to e...

  5. visCOS: An R-package to evaluate model performance of hydrological models

    Science.gov (United States)

    Klotz, Daniel; Herrnegger, Mathew; Wesemann, Johannes; Schulz, Karsten

    2016-04-01

    The evaluation of model performance is a central part of (hydrological) modelling. Much attention has been given to the development of evaluation criteria and diagnostic frameworks (Klemeš, 1986; Gupta et al., 2008; among many others). Nevertheless, many applications exist for which objective functions do not yet provide satisfying summaries. Thus, the necessity to visualize results arises in order to explore a wider range of model capacities, be it strengths or deficiencies. Visualizations are usually devised for specific projects and these efforts are often not distributed to a broader community (e.g. via open source software packages). Hence, the opportunity to explicitly discuss a state-of-the-art presentation technique is often missed. We therefore present a comprehensive R-package for evaluating model performance by visualizing and exploring different aspects of hydrological time-series. The presented package comprises a set of useful plots and visualization methods, which complement existing packages, such as hydroGOF (Zambrano-Bigiarini et al., 2012). It is derived from practical applications of the hydrological models COSERO and COSEROreg (Kling et al., 2014). visCOS, providing an interface in R, represents an easy-to-use software package for visualizing and assessing model performance and can be implemented in the process of model calibration or model development. The package provides functions to load hydrological data into R, clean the data, process, visualize, explore and finally save the results in a consistent way. Together with an interactive zoom function of the time series, an online calculation of the objective functions for variable time-windows is included. Common hydrological objective functions, such as the Nash-Sutcliffe Efficiency and the Kling-Gupta Efficiency, can also be evaluated and visualized in different ways for defined sub-periods like hydrological years or seasonal sections. Many hydrologists use long-term water-balances as a
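The two objective functions named above have compact definitions; here is a minimal sketch assuming the standard formulations of Nash-Sutcliffe and Kling-Gupta efficiency (this is a generic illustration, not visCOS's actual API).

```python
def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 - SSE / variance of observations."""
    m = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - m) ** 2 for o in obs)
    return 1.0 - sse / var

def kge(obs, sim):
    """Kling-Gupta Efficiency: 1 - sqrt((r-1)^2 + (alpha-1)^2 + (beta-1)^2),
    combining correlation r, variability ratio alpha, and bias ratio beta."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    so = (sum((o - mo) ** 2 for o in obs) / n) ** 0.5
    ss = (sum((s - ms) ** 2 for s in sim) / n) ** 0.5
    r = sum((o - mo) * (s - ms) for o, s in zip(obs, sim)) / (n * so * ss)
    alpha, beta = ss / so, ms / mo
    return 1.0 - ((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2) ** 0.5

obs = [1.0, 3.0, 2.0, 5.0, 4.0]           # toy discharge series
perfect = (nse(obs, obs), kge(obs, obs))   # both equal 1 for a perfect match
```

Both scores peak at 1; evaluating them over sub-periods (e.g. hydrological years), as the package does, simply means calling them on time-window slices.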

  6. An evaluation of BPMN modeling tools

    NARCIS (Netherlands)

    Yan, Z.; Reijers, H.A.; Dijkman, R.M.; Mendling, J.; Weidlich, M.

    2010-01-01

    Various BPMN modeling tools are available and it is close to impossible to understand their functional differences without simply trying them out. This paper presents an evaluation framework and presents the outcomes of its application to a set of five BPMN modeling tools. We report on various

  7. Modeling Energy and Development : An Evaluation of Models and Concepts

    NARCIS (Netherlands)

    Ruijven, Bas van; Urban, Frauke; Benders, René M.J.; Moll, Henri C.; Sluijs, Jeroen P. van der; Vries, Bert de; Vuuren, Detlef P. van

    2008-01-01

    Most global energy models are developed by institutes from developed countries, focusing primarily on issues that are important in industrialized countries. Evaluation of the results for Asia of the IPCC/SRES models shows that broad concepts of energy and development, the energy ladder and the

  8. Model evaluation of the phytoextraction potential of heavy metal hyperaccumulators and non-hyperaccumulators.

    Science.gov (United States)

    Liang, Hong-Ming; Lin, Ting-Hsiang; Chiou, Jeng-Min; Yeh, Kuo-Chen

    2009-06-01

    Evaluation of the remediation ability of zinc/cadmium in hyper- and non-hyperaccumulator plant species through greenhouse studies is limited. To bridge the gap between greenhouse studies and field applications for phytoextraction, we used published data to examine the partitioning of heavy metals between plants and soil (defined as the bioconcentration factor). We compared the remediation ability of the Zn/Cd hyperaccumulators Thlaspi caerulescens and Arabidopsis halleri and the non-hyperaccumulators Nicotiana tabacum and Brassica juncea using a hierarchical linear model (HLM). A recursive algorithm was then used to evaluate how many harvest cycles were required to clean a contaminated site to meet Taiwan Environmental Protection Agency regulations. Despite the high bioconcentration factor of both hyperaccumulators, metal removal was still limited because of the plants' small biomass. Simulation with N. tabacum and the Cadmium model suggests further study and development of plants with high biomass and improved phytoextraction potential for use in environmental cleanup.
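The recursive harvest-cycle evaluation described above can be sketched as a geometric removal process. All numbers below (BCF, biomass, soil mass, regulatory limit) are hypothetical; the point is the qualitative trade-off the abstract reports between a high bioconcentration factor and small biomass.

```python
def harvest_cycles(c_soil, c_limit, bcf, biomass, soil_mass):
    """Recursively count harvests needed to bring the soil metal
    concentration below a regulatory limit, assuming each crop removes a
    constant fraction bcf * biomass / soil_mass of the remaining metal."""
    removal = bcf * biomass / soil_mass
    if not 0.0 < removal < 1.0:
        raise ValueError("per-cycle removal fraction must lie in (0, 1)")
    if c_soil <= c_limit:
        return 0
    return 1 + harvest_cycles(c_soil * (1.0 - removal), c_limit,
                              bcf, biomass, soil_mass)

# Hypothetical figures: a high-BCF, low-biomass hyperaccumulator versus a
# moderate-BCF, high-biomass crop (units: mg/kg soil, kg/m^2):
hyper = harvest_cycles(100.0, 20.0, bcf=10.0, biomass=0.5, soil_mass=300.0)
bulky = harvest_cycles(100.0, 20.0, bcf=2.0, biomass=5.0, soil_mass=300.0)
```

Here the high-biomass crop needs fewer cycles despite its lower BCF, mirroring the study's conclusion that small biomass limits metal removal.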

  9. A versatile method for confirmatory evaluation of the effects of a covariate in multiple models

    DEFF Research Database (Denmark)

    Pipper, Christian Bressen; Ritz, Christian; Bisgaard, Hans

    2012-01-01

    Modern epidemiology often requires testing of the effect of a covariate on multiple end points from the same study. However, popular state of the art methods for multiple testing require the tests to be evaluated within the framework of a single model unifying all end points. This severely limits ... to provide a fine-tuned control of the overall type I error in a wide range of epidemiological experiments where in reality no other useful alternative exists. The methodology proposed is applied to a multiple-end-point study of the effect of neonatal bacterial colonization on development of childhood asthma.

  10. Estimating the incremental net health benefit of requirements for cardiovascular risk evaluation for diabetes therapies.

    Science.gov (United States)

    Chawla, Anita J; Mytelka, Daniel S; McBride, Stephan D; Nellesen, Dave; Elkins, Benjamin R; Ball, Daniel E; Kalsekar, Anupama; Towse, Adrian; Garrison, Louis P

    2014-03-01

    To evaluate the advantages and disadvantages of pre-approval requirements for safety data to detect cardiovascular (CV) risk contained in the December 2008 U.S. Food and Drug Administration (FDA) guidance for developing type 2 diabetes drugs compared with the February 2008 FDA draft guidance from the perspective of diabetes population health. We applied the incremental net health benefit (INHB) framework to quantify the benefits and risks of investigational diabetes drugs using a common survival metric (life-years [LYs]). We constructed a decision analytic model for clinical program development consistent with the requirements of each guidance and simulated diabetes drugs, some of which had elevated CV risk. Assuming constant research budgets, we estimate the impact of increased trial size on drugs investigated. We aggregate treatment benefit and CV risks for each approved drug over a 35-year horizon under each guidance. The quantitative analysis suggests that the December 2008 guidance adversely impacts diabetes population health. INHB was -1.80 million LYs, attributable to delayed access to diabetes therapies (-0.18 million LYs) and fewer drugs (-1.64 million LYs), but partially offset by reduced CV risk exposure (0.02 million LYs). Results were robust in sensitivity analyses. The health outcomes impact of all potential benefits and risks should be evaluated in a common survival measure, including health gain from avoided adverse events, lost health benefits from delayed or forgone efficacious products, and impact of alternative policy approaches. Quantitative analysis of the December 2008 FDA guidance for diabetes therapies indicates that negative impact on patient health will result. Copyright © 2014 The Authors. Pharmacoepidemiology and Drug Safety published by John Wiley & Sons, Ltd.
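The reported decomposition can be verified with simple arithmetic; the component figures are taken directly from the abstract.

```python
# Components of the incremental net health benefit reported in the abstract,
# in millions of life-years (LYs):
delayed_access = -0.18    # later market entry of diabetes therapies
fewer_drugs    = -1.64    # fewer drugs investigated under fixed budgets
reduced_risk   = +0.02    # offsetting gain from reduced CV risk exposure

inhb = delayed_access + fewer_drugs + reduced_risk   # -1.80 million LYs
```

The components sum exactly to the headline figure of -1.80 million LYs, which is how the common survival metric lets dissimilar benefits and risks be aggregated.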

  11. An Evaluation Model for Sustainable Development of China’s Textile Industry: An Empirical Study

    Science.gov (United States)

    Zhao, Hong; Lu, Xiaodong; Yu, Ting; Yin, Yanbin

    2018-04-01

    With the economy’s continuous rapid growth, the textile industry is required to search for new rules and adjust strategies in order to optimize industrial structure and rationalize social spending. The sustainable development of China’s textile industry is a comprehensive research subject. This study analyzed the status of China’s textile industry and constructed an evaluation model based on economic, ecologic, and social benefits. Analytic Hierarchy Process (AHP) and Data Envelopment Analysis (DEA) were used for an empirical study of the textile industry. The result of the evaluation model suggested that the current status of the industry has become the major problem in the sustainable development of China’s textile industry. It is nearly impossible to integrate into the global economy if no measures are taken. Enterprises in the textile industry should be reformed in terms of product design, raw material selection, technological reform, technological progress, and management, in accordance with the ideas and requirements of sustainable development. The results of this study are beneficial for 1) discovering the main elements restricting the industry’s sustainable development; 2) seeking corresponding solutions for policy formulation and implementation in the textile industry; and 3) providing references for enterprises’ development transformation in strategic deployment, fund allocation, and personnel assignment.
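A minimal sketch of the AHP step mentioned above, using the common normalized-column approximation to the priority weights; the pairwise judgements comparing the three benefit dimensions are hypothetical, not taken from the study.

```python
def ahp_weights(pairwise):
    """Approximate AHP priority weights by normalizing each column of a
    reciprocal pairwise-comparison matrix and averaging across rows."""
    n = len(pairwise)
    col_sums = [sum(row[j] for row in pairwise) for j in range(n)]
    return [sum(pairwise[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]

# Hypothetical judgements comparing economic, ecologic, and social benefit
# on Saaty's 1-9 scale (values are illustrative only):
m = [[1.0,      3.0,      5.0],
     [1.0 / 3,  1.0,      3.0],
     [1.0 / 5,  1.0 / 3,  1.0]]
w = ahp_weights(m)   # weights sum to 1; first criterion ranks highest here
```

A full AHP application would also check the consistency ratio of the judgement matrix before accepting the weights.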

  12. A model for photothermal responses of flowering in rice. II. Model evaluation.

    NARCIS (Netherlands)

    Yin, X.; Kropff, M.J.; Nakagawa, H.; Horie, T.; Goudriaan, J.

    1997-01-01

    A detailed nonlinear model, the 3s-Beta model, for photothermal responses of flowering in rice (Oryza sativa L.) was evaluated for predicting rice flowering date in field conditions. This model was compared with three other models: a three-plane linear model and two nonlinear models, viz., the

  13. Relevance of the c-statistic when evaluating risk-adjustment models in surgery.

    Science.gov (United States)

    Merkow, Ryan P; Hall, Bruce L; Cohen, Mark E; Dimick, Justin B; Wang, Edward; Chow, Warren B; Ko, Clifford Y; Bilimoria, Karl Y

    2012-05-01

    The measurement of hospital quality based on outcomes requires risk adjustment. The c-statistic is a popular tool used to judge model performance, but can be limited, particularly when evaluating specific operations in focused populations. Our objectives were to examine the interpretation and relevance of the c-statistic when used in models with increasingly similar case mix and to consider an alternative perspective on model calibration based on a graphical depiction of model fit. From the American College of Surgeons National Surgical Quality Improvement Program (2008-2009), patients were identified who underwent a general surgery procedure, and procedure groups were increasingly restricted: colorectal-all, colorectal-elective cases only, and colorectal-elective cancer cases only. Mortality and serious morbidity outcomes were evaluated using logistic regression-based risk adjustment, and model c-statistics and calibration curves were used to compare model performance. During the study period, 323,427 general, 47,605 colorectal-all, 39,860 colorectal-elective, and 21,680 colorectal cancer patients were studied. Mortality ranged from 1.0% in general surgery to 4.1% in the colorectal-all group, and serious morbidity ranged from 3.9% in general surgery to 12.4% in the colorectal-all procedural group. As case mix was restricted, c-statistics progressively declined from the general to the colorectal cancer surgery cohorts for both mortality and serious morbidity (mortality: 0.949 to 0.866; serious morbidity: 0.861 to 0.668). Calibration was evaluated graphically by examining predicted vs observed number of events over risk deciles. For both mortality and serious morbidity, there was no qualitative difference in calibration identified between the procedure groups. 
In the present study, we demonstrate how the c-statistic can become less informative and, in certain circumstances, can lead to incorrect model-based conclusions, as case mix is restricted and patients become
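The c-statistic discussed above is the probability of concordance between predicted risk and observed outcome; here is a minimal sketch with toy outcomes and risks (illustrative values, not NSQIP data).

```python
def c_statistic(y, p):
    """C-statistic (ROC AUC): the probability that a randomly chosen event
    receives a higher predicted risk than a randomly chosen non-event,
    counting ties as one half."""
    events = [pi for yi, pi in zip(y, p) if yi == 1]
    nonevents = [pi for yi, pi in zip(y, p) if yi == 0]
    wins = sum(1.0 if e > ne else 0.5 if e == ne else 0.0
               for e in events for ne in nonevents)
    return wins / (len(events) * len(nonevents))

# Toy outcomes (1 = event) and predicted risks:
y = [0, 0, 1, 0, 1, 1]
p = [0.1, 0.4, 0.35, 0.8, 0.65, 0.9]
auc = c_statistic(y, p)   # 6 of 9 event/non-event pairs ranked correctly
```

Because the statistic only measures rank discrimination, it can fall as case mix narrows even when calibration (predicted vs observed event counts) stays good, which is the study's central point.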

  14. A Linguistic Multigranular Sensory Evaluation Model for Olive Oil

    Directory of Open Access Journals (Sweden)

    Luis Martinez

    2008-06-01

    Full Text Available Evaluation is a process that analyzes elements in order to achieve different objectives such as quality inspection, marketing and other fields in industrial companies. This paper focuses on sensory evaluation, where the evaluated items are assessed by a panel of experts according to knowledge acquired via the human senses. In these evaluation processes the information provided by the experts carries uncertainty, vagueness and imprecision. The use of the Fuzzy Linguistic Approach [32] has provided successful results in modelling such information. In sensory evaluation, the experts on the panel may have differing degrees of knowledge about the evaluated items or indicators. So, it seems suitable that each expert could express their preferences in different linguistic term sets based on their own knowledge. In this paper, we present a sensory evaluation model that manages a multigranular linguistic evaluation framework based on a decision analysis scheme. This model is applied to the sensory evaluation process of olive oil.

  15. Model-driven requirements engineering (MDRE) for real-time ultra-wide instantaneous bandwidth signal simulation

    Science.gov (United States)

    Chang, Daniel Y.; Rowe, Neil C.

    2013-05-01

    While conducting a cutting-edge research in a specific domain, we realize that (1) requirements clarity and correctness are crucial to our success [1], (2) hardware is hard to change, most work is in software requirements development, coding and testing [2], (3) requirements are constantly changing, so that configurability, reusability, scalability, adaptability, modularity and testability are important non-functional attributes [3], (4) cross-domain knowledge is necessary for complex systems [4], and (5) if our research is successful, the results could be applied to other domains with similar problems. In this paper, we propose to use model-driven requirements engineering (MDRE) to model and guide our requirements/development, since models are easy to understand, execute, and modify. The domain for our research is Electronic Warfare (EW) real-time ultra-wide instantaneous bandwidth (IBW1) signal simulation. The proposed four MDRE models are (1) Switch-and-Filter architecture, (2) multiple parallel data bit streams alignment, (3) post-ADC and pre-DAC bits re-mapping, and (4) Discrete Fourier Transform (DFT) filter bank. This research is unique since the instantaneous bandwidth we are dealing with is in gigahertz range instead of conventional megahertz.
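Model (4) above, the DFT filter bank, can be sketched in its simplest critically sampled form; this is a generic illustration, not the paper's real-time implementation, and a practical design would add a polyphase prototype filter.

```python
import cmath

def dft_filter_bank(x, n_ch):
    """Critically sampled DFT filter bank: take successive blocks of n_ch
    samples and DFT each block; out[k] is the sub-band sequence of channel
    k. (A rectangular analysis window is assumed here.)"""
    out = [[] for _ in range(n_ch)]
    for start in range(0, len(x) - n_ch + 1, n_ch):
        block = x[start:start + n_ch]
        for k in range(n_ch):
            out[k].append(sum(block[n] * cmath.exp(-2j * cmath.pi * k * n / n_ch)
                              for n in range(n_ch)))
    return out

# A complex tone at channel 1's centre frequency lands only in channel 1:
n_ch = 8
x = [cmath.exp(2j * cmath.pi * n / n_ch) for n in range(64)]
bands = dft_filter_bank(x, n_ch)
```

Splitting an ultra-wide instantaneous bandwidth into narrower channels this way is what makes the per-channel sample rate tractable for real-time processing.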

  16. A logic model framework for evaluation and planning in a primary care practice-based research network (PBRN)

    Science.gov (United States)

    Hayes, Holly; Parchman, Michael L.; Howard, Ray

    2012-01-01

    Evaluating effective growth and development of a Practice-Based Research Network (PBRN) can be challenging. The purpose of this article is to describe the development of a logic model and how the framework has been used for planning and evaluation in a primary care PBRN. An evaluation team was formed consisting of the PBRN directors, staff and its board members. After the mission and the target audience were determined, facilitated meetings and discussions were held with stakeholders to identify the assumptions, inputs, activities, outputs, outcomes and outcome indicators. The long-term outcomes outlined in the final logic model are two-fold: 1.) Improved health outcomes of patients served by PBRN community clinicians; and 2.) Community clinicians are recognized leaders of quality research projects. The Logic Model proved useful in identifying stakeholder interests and dissemination activities as an area that required more attention in the PBRN. The logic model approach is a useful planning tool and project management resource that increases the probability that the PBRN mission will be successfully implemented. PMID:21900441

  17. Micro Data and General Equilibrium Models

    DEFF Research Database (Denmark)

    Browning, Martin; Hansen, Lars Peter; Heckman, James J.

    1999-01-01

    Dynamic general equilibrium models are required to evaluate policies applied at the national level. To use these models to make quantitative forecasts requires knowledge of an extensive array of parameter values for the economy at large. This essay describes the parameters required for different economic models, assesses the discordance between the macromodels used in policy evaluation and the microeconomic models used to generate the empirical evidence. For concreteness, we focus on two general equilibrium models: the stochastic growth model extended to include some forms of heterogeneity...

  18. Critical evaluation of paradigms for modelling integrated supply chains

    NARCIS (Netherlands)

    Van Dam, K.H.; Adhitya, A.; Srinivasan, R.; Lukszo, Z.

    2009-01-01

    Contemporary problems in process systems engineering often require model-based decision support tools. Among the various modelling paradigms, equation-based models and agent-based models are widely used to develop dynamic models of systems. Which is the most appropriate modelling paradigm for a

  19. NASA Standard for Models and Simulations: Philosophy and Requirements Overview

    Science.gov (United States)

    Blattnig, Steve R.; Luckring, James M.; Morrison, Joseph H.; Sylvester, Andre J.; Tripathi, Ram K.; Zang, Thomas A.

    2013-01-01

    Following the Columbia Accident Investigation Board report, the NASA Administrator chartered an executive team (known as the Diaz Team) to identify those CAIB report elements with NASA-wide applicability and to develop corrective measures to address each element. One such measure was the development of a standard for the development, documentation, and operation of models and simulations. This report describes the philosophy and requirements overview of the resulting NASA Standard for Models and Simulations.

  20. Lubrication Theory Model to Evaluate Surgical Alterations in Flow Mechanics of the Lower Esophageal Sphincter

    Science.gov (United States)

    Ghosh, Sudip K.; Brasseur, James G.; Zaki, Tamer; Kahrilas, Peter J.

    2003-11-01

    Surgery is commonly used to rebuild a weak lower esophageal sphincter (LES) and reduce reflux. Because the driving pressure (DP) is proportional to muscle tension generated in the esophagus, we developed models using lubrication theory to evaluate the consequences of surgery on muscle force required to open the LES and drive the flow. The models relate time changes in DP to lumen geometry and trans-LES flow with a manometric catheter. Inertial effects were included and found negligible. Two models, direct (opening specified) and indirect (opening predicted), were combined with manometric pressure and imaging data from normal and post-surgery LES. A very high sensitivity was predicted between the details of the DP and LES opening. The indirect model accurately captured LES opening and predicted a 3-phase emptying process, with phases I and III requiring rapid generation of muscle tone to open the LES and empty the esophagus. Data showed that phases I and III are adversely altered by surgery causing incomplete emptying. Parametric model studies indicated that changes to the surgical procedure can positively alter LES flow mechanics and improve clinical outcomes.

  1. State of the Art : Integrated Management of Requirements in Model-Based Software Engineering

    OpenAIRE

    Thörn, Christer

    2006-01-01

    This report describes the background and future of research concerning integrated management of requirements in model-based software engineering. The focus is on describing the relevant topics and existing theoretical backgrounds that form the basis for the research. The report describes the fundamental difficulties of requirements engineering for software projects, and proposes that the results and methods of models in software engineering can help leverage those problems. Taking inspiration...

  2. A participative evaluation model to refine academic support for first year Indigenous higher education students

    Directory of Open Access Journals (Sweden)

    Bronwyn Rossingh

    2012-03-01

    Full Text Available This paper presents an evaluative approach designed to provide a cycle of continuous improvement to retain Indigenous students during their first year of higher education.   The evaluation model operates in conjunction with a student academic enrichment program that is premised on valuing and respecting each student's background and life experience whilst building capability for learning success.  Data collected will be used for continual improvement of a newly developed innovative academic enrichment program that caters to the needs of Indigenous students.  The defining mechanisms of the model for measuring the first year experience are particularly meaningful for the Australian Centre For Indigenous Knowledges and Education as it moves into its inaugural year of operation in 2012. This preeminent time requires a flexible model to receive timely feedback in a reflexive environment where students guide the process as they continue their journey of accumulating knowledge and leave behind their contribution in shaping the landscape for future first year Indigenous students.  

  3. An Evaluation of Parametric and Nonparametric Models of Fish Population Response.

    Energy Technology Data Exchange (ETDEWEB)

    Haas, Timothy C.; Peterson, James T.; Lee, Danny C.

    1999-11-01

    Predicting the distribution or status of animal populations at large scales often requires the use of broad-scale information describing landforms, climate, vegetation, etc. These data, however, often consist of mixtures of continuous and categorical covariates and nonmultiplicative interactions among covariates, complicating statistical analyses. Using data from the interior Columbia River Basin, USA, we compared four methods for predicting the distribution of seven salmonid taxa using landscape information. Subwatersheds (mean size, 7800 ha) were characterized using a set of 12 covariates describing physiography, vegetation, and current land-use. The techniques included generalized logit modeling, classification trees, a nearest neighbor technique, and a modular neural network. We evaluated model performance using out-of-sample prediction accuracy via leave-one-out cross-validation and introduce a computer-intensive Monte Carlo hypothesis testing approach for examining the statistical significance of landscape covariates with the non-parametric methods. We found the modular neural network and the nearest-neighbor techniques to be the most accurate, but were difficult to summarize in ways that provided ecological insight. The modular neural network also required the most extensive computer resources for model fitting and hypothesis testing. The generalized logit models were readily interpretable, but were the least accurate, possibly due to nonlinear relationships and nonmultiplicative interactions among covariates. Substantial overlap among the statistically significant (P<0.05) covariates for each method suggested that each is capable of detecting similar relationships between responses and covariates. Consequently, we believe that employing one or more methods may provide greater biological insight without sacrificing prediction accuracy.
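The leave-one-out cross-validation used above to measure out-of-sample accuracy can be sketched generically; the 1-nearest-neighbour classifier and toy data below are placeholders, not the study's four methods or its landscape covariates.

```python
def loo_accuracy(xs, ys, fit, predict):
    """Leave-one-out cross-validation: for each point, fit on the remaining
    n-1 points, predict the held-out point, and return the hit fraction."""
    hits = 0
    for i in range(len(xs)):
        model = fit(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        hits += predict(model, xs[i]) == ys[i]
    return hits / len(xs)

# A toy 1-nearest-neighbour "model" standing in for the compared methods:
def fit_1nn(xs, ys):
    return list(zip(xs, ys))

def predict_1nn(model, x):
    return min(model, key=lambda pair: abs(pair[0] - x))[1]

acc = loo_accuracy([1.0, 1.2, 0.9, 5.0, 5.3, 4.8],
                   [0, 0, 0, 1, 1, 1], fit_1nn, predict_1nn)
```

Because each fold refits the model, the same harness can compare any of the techniques in the study (logit, trees, nearest neighbour, neural network) on equal footing.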

  4. Study on team evaluation. Team process model for team evaluation

    International Nuclear Information System (INIS)

    Sasou Kunihide; Ebisu, Mitsuhiro; Hirose, Ayako

    2004-01-01

    Several studies have been done to evaluate or improve team performance in the nuclear and aviation industries; crew resource management is the typical example. In addition, team evaluation has recently gathered interest for other teams of lawyers, medical staff, accountants, psychiatrists, executives, etc. However, most evaluation methods focus on the results of team behavior that can be observed through training or actual business situations. What is expected of a team is not only resolving problems but also training younger members destined to lead the next generation. Therefore, the authors set the final goal of this study as establishing a series of methods to evaluate and improve teams comprehensively, covering decision making, motivation, staffing, etc. As the first step, this study develops a team process model describing viewpoints for the evaluation. A team process is defined as a force that activates or inactivates the competencies of individuals, which are the components of a team's competency. To identify the team processes, the authors discussed the merits of team behavior with experienced training instructors and shift supervisors of nuclear/thermal power plants. The discussion identified four team merits and many components that realize those merits. Classifying those components into eight groups of team processes, namely 'Orientation', 'Decision Making', 'Power and Responsibility', 'Workload Management', 'Professional Trust', 'Motivation', 'Training' and 'Staffing', the authors propose a Team Process Model with two to four sub-processes in each team process. In the future, the authors will develop methods to evaluate some of the team processes for nuclear/thermal power plant operation teams. (author)

  5. World Integrated Nuclear Evaluation System: Model documentation

    International Nuclear Information System (INIS)

    1991-12-01

    The World Integrated Nuclear Evaluation System (WINES) is an aggregate demand-based partial equilibrium model used by the Energy Information Administration (EIA) to project long-term domestic and international nuclear energy requirements. WINES follows a top-down approach in which economic growth rates, delivered energy demand growth rates, and electricity demand are projected successively to ultimately forecast total nuclear generation and nuclear capacity. WINES could potentially be used to produce forecasts for any country or region in the world. Presently, WINES is being used to generate long-term forecasts for the United States and for all countries with commercial nuclear programs in the world, excluding countries located in centrally planned economic areas. Projections for the United States are developed for the period from 2010 through 2030, and for other countries for the period starting in 2000 or 2005 (depending on the country) through 2010. EIA uses a pipeline approach to project nuclear capacity for the period between 1990 and the starting year for which the WINES model is used. This approach involves a detailed accounting of existing nuclear generating units and units under construction, their capacities, their actual or estimated time of completion, and their estimated retirement dates. Further detail on this approach can be found in Appendix B of Commercial Nuclear Power 1991: Prospects for the United States and the World.

  6. Evaluation of Mid-Size Male Hybrid III Models for use in Spaceflight Occupant Protection Analysis

    Science.gov (United States)

    Putnam, J.; Somers, J.; Wells, J.; Newby, N.; Currie-Gregg, N.; Lawrence, C.

    2016-01-01

    Introduction: In an effort to improve occupant safety during dynamic phases of spaceflight, the National Aeronautics and Space Administration (NASA) has worked to develop occupant protection standards for future crewed spacecraft. One key aspect of these standards is the identification of injury mechanisms through anthropometric test devices (ATDs). Within this analysis, both physical and computational ATD evaluations are required to reasonably encompass the vast range of loading conditions any spaceflight crew may encounter. In this study the accuracy of publicly available mid-size male HIII ATD finite element (FE) models is evaluated within applicable loading conditions against extensive sled testing performed on their physical counterparts. Methods: A series of sled tests were performed at Wright-Patterson Air Force Base (WPAFB) employing variations of magnitude, duration, and impact direction to encompass the dynamic loading range expected in spaceflight. FE simulations were developed to the specifications of the test setup and driven using measured acceleration profiles. Both fast and detailed FE models of the mid-size male HIII were run to quantify differences in their accuracy and thus assess the applicability of each within this field. Results: Preliminary results identify the dependence of model accuracy on loading direction, magnitude, and rate. Additionally, the accuracy of individual response metrics is shown to vary across each model within the evaluated test conditions. Causes of model inaccuracy are identified based on the observed relationships. Discussion: Computational modeling provides an essential component of ATD injury metric evaluation used to ensure the safety of future spaceflight occupants. The assessment of current ATD models lays the groundwork for how these models can be used appropriately in the future. Identification of limitations and possible paths for improvement aids in the development of these effective analysis tools.

  7. The Alpha Stem Cell Clinic: a model for evaluating and delivering stem cell-based therapies.

    Science.gov (United States)

    Trounson, Alan; DeWitt, Natalie D; Feigal, Ellen G

    2012-01-01

    Cellular therapies require the careful preparation, expansion, characterization, and delivery of cells in a clinical environment. The delivery of cell therapies involves major challenges and high costs that will limit the number of companies able to fully evaluate their merit in clinical trials and will handicap their application in the present financial environment. Cells will be manufactured in good manufacturing practice or near-equivalent facilities with prerequisite safety practices in place, and cell delivery systems will be specialized and require well-trained medical and nursing staff, technicians or nurses trained to handle cells once delivered, patient counselors, as well as statisticians and database managers who will oversee the monitoring of patients in relatively long-term follow-up studies. The model proposed for Alpha Stem Cell Clinics will initially use the capacities and infrastructure that exist in the most advanced tertiary medical clinics for delivery of established bone marrow stem cell therapies. As the research evolves, they will incorporate improved procedures and cell preparations. This model enables commercialization of medical devices, reagents, and other products required for cell therapies. A carefully constructed cell therapy clinical infrastructure with the requisite scientific, technical, and medical expertise and operational efficiencies will have the capabilities to address three fundamental and critical functions: (1) fostering clinical trials; (2) evaluating and establishing safe and effective therapies; and (3) developing and maintaining the delivery of therapies approved by the Food and Drug Administration or other regulatory agencies.

  8. Evaluation model development for sprinkler irrigation uniformity ...

    African Journals Online (AJOL)

    A new evaluation method with accompanying software was developed to precisely calculate uniformity from catch-can test data, assuming sprinkler distribution data to be a continuous variable. Two interpolation steps are required to compute unknown water application depths at grid distribution points from radial ...
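    Uniformity computed from catch-can depths, as in the abstract above, is conventionally summarized by Christiansen's coefficient of uniformity (CU). The sketch below shows only that classic formula, not the paper's interpolation-based method; the catch-can depths are invented.

```python
def christiansen_cu(depths):
    """Christiansen's coefficient of uniformity (%) from catch-can depths:
    CU = 100 * (1 - mean absolute deviation / mean depth)."""
    mean = sum(depths) / len(depths)
    mad = sum(abs(d - mean) for d in depths) / len(depths)
    return 100.0 * (1.0 - mad / mean)

depths = [10.0, 12.0, 9.0, 11.0, 10.0, 8.0]  # mm caught in each can
print(round(christiansen_cu(depths), 1))  # → 90.0
```

    A perfectly uniform application gives CU = 100%; the interpolation steps the abstract mentions serve to densify the `depths` sample before a formula of this kind is applied.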

  9. Patient-centered care requires a patient-oriented workflow model.

    Science.gov (United States)

    Ozkaynak, Mustafa; Brennan, Patricia Flatley; Hanauer, David A; Johnson, Sharon; Aarts, Jos; Zheng, Kai; Haque, Saira N

    2013-06-01

    Effective design of health information technology (HIT) for patient-centered care requires consideration of workflow from the patient's perspective, termed 'patient-oriented workflow.' This approach organizes the building blocks of work around the patients who are moving through the care system. Patient-oriented workflow complements the more familiar clinician-oriented workflow approaches, and offers several advantages, including the ability to capture simultaneous, cooperative work, which is essential in care delivery. Patient-oriented workflow models can also provide an understanding of healthcare work taking place in various formal and informal health settings in an integrated manner. We present two cases demonstrating the potential value of patient-oriented workflow models. Significant theoretical, methodological, and practical challenges must be met to ensure adoption of patient-oriented workflow models. Patient-oriented workflow models define meaningful system boundaries and can lead to HIT implementations that are more consistent with cooperative work and its emergent features.

  10. Using measurements for evaluation of black carbon modeling

    Directory of Open Access Journals (Sweden)

    S. Gilardoni

    2011-01-01

    The ever increasing use of air quality and climate model assessments to underpin economic, public health, and environmental policy decisions makes effective model evaluation critical. This paper discusses the properties of black carbon, light attenuation, and absorption observations that are key to a reliable evaluation of black carbon models, and compares parametric and nonparametric statistical tools for quantifying the agreement between models and observations. Black carbon concentrations are simulated with the TM5/M7 global model from July 2002 to June 2003 at four remote sites (Alert, Jungfraujoch, Mace Head, and Trinidad Head) and two regional background sites (Bondville and Ispra). Equivalent black carbon (EBC) concentrations are calculated using light attenuation measurements from January 2000 to December 2005. Seasonal trends in the measurements are determined by fitting sinusoidal functions, and the representativeness of the period simulated by the model is verified based on the scatter of the experimental values relative to the fit curves. When the resolution of the model grid is larger than 1° × 1°, it is recommended to verify that the measurement site is representative of the grid cell. For this purpose, equivalent black carbon measurements at Alert, Bondville and Trinidad Head are compared to light absorption and elemental carbon measurements performed at different sites inside the same model grid cells. Comparison of these equivalent black carbon and elemental carbon measurements indicates that uncertainties in black carbon optical properties can compromise the comparison between model and observations. During model evaluation it is important to examine the extent to which a model is able to simulate the variability in the observations over different integration periods, as this will help to identify the most appropriate timescales. The agreement between model and observation is accurately described by the overlap of

  11. Hybrid supply chain model for material requirement planning under financial constraints: A case study

    Science.gov (United States)

    Curci, Vita; Dassisti, Michele; Josefa, Mula Bru; Manuel, Díaz Madroñero

    2014-10-01

    Supply chain models (SCMs) can potentially integrate different aspects of decision support for enterprise management tasks. The aim of the paper is to propose a hybrid mathematical programming model for the optimization of production requirements resource planning. The preliminary model was conceived bottom-up from a real industrial case and oriented toward maximizing cash flow. Despite the intense computational effort required to converge to a solution, the optimization produced good results in solving the objective function.

  12. On meeting capital requirements with a chance-constrained optimization model.

    Science.gov (United States)

    Atta Mills, Ebenezer Fiifi Emire; Yu, Bo; Gu, Lanlan

    2016-01-01

    This paper deals with a capital-to-risk-asset-ratio chance-constrained optimization model in the presence of loans, treasury bills, fixed assets and non-interest-earning assets. To model the dynamics of loans, we introduce a modified CreditMetrics approach. This leads to the development of a deterministic convex counterpart of the capital-to-risk-asset-ratio chance constraint. We analyze our model under the worst-case scenario, i.e., loan default. The theoretical model is analyzed by applying numerical procedures in order to derive valuable insights from a financial outlook. Our results suggest that our capital-to-risk-asset-ratio chance-constrained optimization model guarantees that banks meet the capital requirements of Basel III with a likelihood of 95% irrespective of changes in the future market value of assets.
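    A deterministic counterpart of a chance constraint, as described above, can be illustrated in its simplest form. This sketch is not the paper's CreditMetrics formulation: it assumes the future capital ratio is normally distributed, in which case P(ratio ≥ r_min) ≥ 0.95 reduces to mu - z₀.₉₅·sigma ≥ r_min; the bank figures are hypothetical.

```python
from statistics import NormalDist

def meets_capital_requirement(mu, sigma, r_min, confidence=0.95):
    """Deterministic counterpart of P(ratio >= r_min) >= confidence,
    assuming the future capital ratio is Normal(mu, sigma)."""
    z = NormalDist().inv_cdf(confidence)  # one-sided quantile, ~1.645 at 95%
    return mu - z * sigma >= r_min

# Basel III total capital ratio floor of 8%, hypothetical bank figures
print(meets_capital_requirement(mu=0.11, sigma=0.015, r_min=0.08))  # → True
print(meets_capital_requirement(mu=0.09, sigma=0.015, r_min=0.08))  # → False
```

    The convexity the paper exploits comes from exactly this reduction: the probabilistic requirement becomes a linear inequality in (mu, sigma), which standard solvers handle directly.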

  13. Popularity Evaluation Model for Microbloggers Online Social Network

    Directory of Open Access Journals (Sweden)

    Xia Zhang

    2014-01-01

    Recently, microblogging has been widely studied by researchers in the domain of online social networks (OSNs). How to evaluate the popularity of microblogging users is an important research question, with applications to commercial advertising, user behavior analysis, information dissemination, and so forth. Previous evaluation methods cannot effectively or accurately evaluate the popularity of microbloggers. In this paper, we propose a model based on electromagnetic field theory to analyze the popularity of microbloggers. The concept of a source in the microblogging field is first put forward, based on the concept of a source in an electromagnetic field; then, a user's microblogging flux is calculated according to his/her behaviors (sending or receiving feedback) on the microblogging platform; finally, we use three methods to calculate a user's microblogging flux density, which represents his/her popularity on the platform. In the experimental work, we evaluated our model using real microblogging data and selected the best of the three popularity measures. We also compared our model with the classic PageRank algorithm; the results show that our model is more effective and accurate in evaluating the popularity of microbloggers.
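    The flux and flux-density notions above can be made concrete only speculatively, since the abstract does not give the paper's formulas. In the hypothetical sketch below, both the flux definition (total feedback interactions) and the "area" it is divided by (audience size) are assumptions, not the authors' definitions.

```python
def microblog_flux(feedback_sent, feedback_received):
    """Assumed flux: total feedback interactions on the platform."""
    return feedback_sent + feedback_received

def flux_density(flux, audience_size):
    """Assumed density: flux normalised by audience size (the 'area')."""
    return flux / audience_size

flux = microblog_flux(feedback_sent=120, feedback_received=480)
print(flux_density(flux, audience_size=1000))  # → 0.6
```

    The paper's three density methods presumably differ in how this normalising "area" is chosen; any of them would slot into `flux_density` in place of the audience size used here.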

  14. Using satellite observations in performance evaluation for regulatory air quality modeling: Comparison with ground-level measurements

    Science.gov (United States)

    Odman, M. T.; Hu, Y.; Russell, A.; Chai, T.; Lee, P.; Shankar, U.; Boylan, J.

    2012-12-01

    Regulatory air quality modeling, such as State Implementation Plan (SIP) modeling, requires that model performance meets recommended criteria in the base-year simulations using period-specific, estimated emissions. The goal of the performance evaluation is to assure that the base-year modeling accurately captures the observed chemical reality of the lower troposphere. Any significant deficiencies found in the performance evaluation must be corrected before any base-case (with typical emissions) and future-year modeling is conducted. Corrections are usually made to model inputs such as emission-rate estimates or meteorology and/or to the air quality model itself, in modules that describe specific processes. Use of ground-level measurements that follow approved protocols is recommended for evaluating model performance. However, ground-level monitoring networks are spatially sparse, especially for particulate matter. Satellite retrievals of atmospheric chemical properties such as aerosol optical depth (AOD) provide spatial coverage that can compensate for the sparseness of ground-level measurements. Satellite retrievals can also help diagnose potential model or data problems in the upper troposphere. It is possible to achieve good model performance near the ground, but have, for example, erroneous sources or sinks in the upper troposphere that may result in misleading and unrealistic responses to emission reductions. Despite these advantages, satellite retrievals are rarely used in model performance evaluation, especially for regulatory modeling purposes, due to the high uncertainty in retrievals associated with various contaminations, for example by clouds. In this study, 2007 was selected as the base year for SIP modeling in the southeastern U.S. Performance of the Community Multiscale Air Quality (CMAQ) model, at a 12-km horizontal resolution, for this annual simulation is evaluated using both recommended ground-level measurements and non-traditional satellite

  15. Connecting Requirements to Architecture and Analysis via Model-Based Systems Engineering

    Science.gov (United States)

    Cole, Bjorn F.; Jenkins, J. Steven

    2015-01-01

    In traditional systems engineering practice, architecture, concept development, and requirements development are related but still separate activities. Concepts of operation, key technical approaches, and related proofs of concept are developed. These inform the formulation of an architecture at multiple levels, starting with the overall system composition and functionality and progressing into more detail. As this formulation is done, a parallel activity develops a set of English statements that constrain solutions. These requirements are often called "shall statements" since they are formulated to use "shall." The separation of requirements from design is exacerbated by well-meaning tools like the Dynamic Object-Oriented Requirements System (DOORS) that remain separate from engineering design tools. With the Europa Clipper project, efforts are being made to change the requirements development approach from a separate activity to one intimately embedded in the formulation effort. This paper presents a modeling approach and related tooling to generate English requirement statements from constraints embedded in architecture definition.

  16. Evaluation of Satellite and Model Precipitation Products Over Turkey

    Science.gov (United States)

    Yilmaz, M. T.; Amjad, M.

    2017-12-01

    Satellite-based remote sensing, gauge stations, and models are the three major platforms for acquiring precipitation datasets. Among them, satellites and models have the advantage of retrieving spatially and temporally continuous and consistent datasets, while uncertainty estimates of these retrievals are often required for many hydrological studies to understand the source and magnitude of the uncertainty in hydrological response parameters. In this study, satellite and model precipitation products are validated over various temporal scales (daily, 3-day, 7-day, 10-day and monthly) using in-situ precipitation observations from a network of 733 gauges across Turkey. Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) 3B42 version 7 and European Centre for Medium-Range Weather Forecasts (ECMWF) model estimates (daily, 3-day, 7-day and 10-day accumulated forecasts) are used in this study. Retrievals are evaluated for their mean and standard deviation, and their accuracies are evaluated via bias, root mean square error, error standard deviation and correlation coefficient statistics. Intensity-versus-frequency analysis and contingency-table statistics such as percent correct, probability of detection, false alarm ratio and critical success index are determined using daily time series. Both ECMWF forecasts and TRMM observations, on average, overestimate precipitation compared to gauge estimates; the wet biases are 10.26 mm/month and 8.65 mm/month for ECMWF and TRMM, respectively. RMSE values of ECMWF forecasts and TRMM estimates are 39.69 mm/month and 41.55 mm/month, respectively. Monthly correlations between Gauges-ECMWF, Gauges-TRMM and ECMWF-TRMM are 0.76, 0.73 and 0.81, respectively. The model and satellite error statistics are further compared against the gauge error statistics based on inverse distance weighting (IWD) analysis.
    Both the model and the satellite data have smaller IWD errors (14
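    The contingency-table skill scores named in the abstract follow directly from the counts of hits, misses, false alarms, and correct negatives in the daily time series. The sketch below uses the standard definitions of these scores; the counts themselves are invented for illustration.

```python
def skill_scores(hits, misses, false_alarms, correct_negatives):
    """Standard forecast-verification scores from a 2x2 contingency table."""
    total = hits + misses + false_alarms + correct_negatives
    return {
        "percent_correct": (hits + correct_negatives) / total,
        "prob_of_detection": hits / (hits + misses),
        "false_alarm_ratio": false_alarms / (hits + false_alarms),
        "critical_success_index": hits / (hits + misses + false_alarms),
    }

scores = skill_scores(hits=60, misses=20, false_alarms=40, correct_negatives=280)
print(scores["prob_of_detection"])       # → 0.75
print(scores["critical_success_index"])  # → 0.5
```

    Note that the critical success index ignores correct negatives, which is why it is preferred over percent correct when dry days dominate, as they do in daily precipitation series.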

  17. Network-Based Material Requirements Planning (NBMRP) in ...

    African Journals Online (AJOL)

    Network-Based Material Requirements Planning (NBMRP) in Product Development Project. ... International Journal of Development and Management Review ... To address the problems, this study evaluated the existing material planning practice, and formulated a NBMRP model out of the variables of the existing MRP and ...

  18. Evaluating Models of Human Performance: Safety-Critical Systems Applications

    Science.gov (United States)

    Feary, Michael S.

    2012-01-01

    This presentation is part of a panel discussion on Evaluating Models of Human Performance. The purpose of this panel is to discuss the increasing use of models in the world today, with a specific focus on how to describe and evaluate models of human performance. My presentation will focus on generating distributions of performance and on evaluating different strategies for humans performing tasks with mixed-initiative (human-automation) systems. I will also discuss issues with how to provide human performance modeling data to support decisions on acceptability and tradeoffs in the design of safety-critical systems. I will conclude with challenges for the future.

  19. Evaluation of Features, Events, and Processes (FEP) for the Biosphere Model

    International Nuclear Information System (INIS)

    Wasiolek, M. A.

    2003-01-01

    The purpose of this report is to document the evaluation of biosphere features, events, and processes (FEPs) that relate to the license application (LA) process as required by the U.S. Nuclear Regulatory Commission (NRC) regulations at 10 CFR 63.114 (d, e, and f) [DIRS 156605]. The evaluation determines whether specific biosphere-related FEPs should be included in or excluded from consideration in the Total System Performance Assessment (TSPA). This analysis documents the technical basis for screening decisions as required at 10 CFR 63.114 (d, e, and f) [DIRS 156605]. For FEPs that are included in the TSPA, this analysis provides a TSPA disposition, which summarizes how the FEP has been included and addressed in the TSPA model, and cites the analysis reports and model reports that provide the technical basis and description of its disposition. For FEPs that are excluded from the TSPA, this analysis report provides a screening argument, which identifies the basis for the screening decision (i.e., low probability, low consequence, or by regulation) and discusses the technical basis that supports that decision. In cases where a FEP covers multiple technical areas and is shared with other FEP analysis reports, this analysis may provide only a partial technical basis for the screening of the FEP. The full technical basis for these shared FEPs is addressed collectively by all FEP analysis reports that cover technical disciplines sharing a FEP. FEPs must be included in the TSPA unless they can be excluded by low probability, low consequence, or regulation. A FEP can be excluded from the TSPA by low probability per 10 CFR 63.114(d) [DIRS 156605], by showing that it has less than one chance in 10,000 of occurring over 10,000 years (or an approximately equivalent annualized probability of 10^-8). A FEP can be excluded from the TSPA by low consequence per 10 CFR 63.114 (e or f) [DIRS 156605], by showing that omitting the FEP would not significantly change the magnitude and
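    The equivalence stated above between "one chance in 10,000 over 10,000 years" and an annualized probability of about 10^-8 can be checked with one line of arithmetic, since an annual probability p gives a 10,000-year exceedance probability of 1 - (1 - p)^10000.

```python
# Check of the screening-threshold arithmetic: an annual probability of 1e-8
# compounds over 10,000 years to approximately a 1-in-10,000 chance.
p_annual = 1e-8
p_10000yr = 1 - (1 - p_annual) ** 10_000
print(p_10000yr)  # ≈ 1e-4, i.e. one chance in 10,000
```

    For probabilities this small the compounding is nearly linear (p_10000yr ≈ 10,000 · p_annual), which is why the regulation can quote the two thresholds interchangeably.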

  20. A funding model for health visiting: baseline requirements--part 1.

    Science.gov (United States)

    Cowley, Sarah

    2007-11-01

    A funding model proposed in two papers will outline the health visiting resource, including team skill mix, required to deliver the recommended approach of 'progressive universalism,' taking account of health inequalities, best evidence and impact on outcomes that might be anticipated. The model has been discussed as far as possible across the professional networks of both the Community Practitioners' and Health Visitors' Association (CPHVA) and United Kingdom Public Health Association (UKPHA), and is a consensus statement agreed by all who have participated.

  1. Metal Mixture Modeling Evaluation project: 2. Comparison of four modeling approaches

    Science.gov (United States)

    Farley, Kevin J.; Meyer, Joe; Balistrieri, Laurie S.; DeSchamphelaere, Karl; Iwasaki, Yuichi; Janssen, Colin; Kamo, Masashi; Lofts, Steve; Mebane, Christopher A.; Naito, Wataru; Ryan, Adam C.; Santore, Robert C.; Tipping, Edward

    2015-01-01

    As part of the Metal Mixture Modeling Evaluation (MMME) project, models were developed by the National Institute of Advanced Industrial Science and Technology (Japan), the U.S. Geological Survey (USA), HDR⎪HydroQual, Inc. (USA), and the Centre for Ecology and Hydrology (UK) to address the effects of metal mixtures on biological responses of aquatic organisms. A comparison of the 4 models, as they were presented at the MMME Workshop in Brussels, Belgium (May 2012), is provided herein. Overall, the models were found to be similar in structure (free ion activities computed by WHAM; specific or non-specific binding of metals/cations in or on the organism; specification of metal potency factors and/or toxicity response functions to relate metal accumulation to biological response). Major differences in modeling approaches are attributed to various modeling assumptions (e.g., single versus multiple types of binding site on the organism) and specific calibration strategies that affected the selection of model parameters. The models provided a reasonable description of additive (or nearly additive) toxicity for a number of individual toxicity test results. Less-than-additive toxicity was more difficult to describe with the available models. Because of limitations in the available datasets and the strong inter-relationships among the model parameters (log KM values, potency factors, toxicity response parameters), further evaluation of specific model assumptions and calibration strategies is needed.

  2. An evaluation of sex-age-kill (SAK) model performance

    Science.gov (United States)

    Millspaugh, Joshua J.; Skalski, John R.; Townsend, Richard L.; Diefenbach, Duane R.; Boyce, Mark S.; Hansen, Lonnie P.; Kammermeyer, Kent

    2009-01-01

    The sex-age-kill (SAK) model is widely used to estimate abundance of harvested large mammals, including white-tailed deer (Odocoileus virginianus). Despite a long history of use, few formal evaluations of SAK performance exist. We investigated how violations of the stable age distribution and stationary population assumption, changes to male or female harvest, stochastic effects (i.e., random fluctuations in recruitment and survival), and sampling efforts influenced SAK estimation. When the simulated population had a stable age distribution and λ > 1, the SAK model underestimated abundance. Conversely, when λ < 1, the SAK overestimated abundance. When changes to male harvest were introduced, SAK estimates were opposite the true population trend. In contrast, SAK estimates were robust to changes in female harvest rates. Stochastic effects caused SAK estimates to fluctuate about their equilibrium abundance, but the effect dampened as the size of the surveyed population increased. When we considered both stochastic effects and sampling error at a deer management unit scale the resultant abundance estimates were within ±121.9% of the true population level 95% of the time. These combined results demonstrate extreme sensitivity to model violations and scale of analysis. Without changes to model formulation, the SAK model will be biased when λ ≠ 1. Furthermore, any factor that alters the male harvest rate, such as changes to regulations or changes in hunter attitudes, will bias population estimates. Sex-age-kill estimates may be precise at large spatial scales, such as the state level, but less so at the individual management unit level. Alternative models, such as statistical age-at-harvest models, which require similar data types, might allow for more robust, broad-scale demographic assessments.

  3. A navigational evaluation model for content management systems

    International Nuclear Information System (INIS)

    Gilani, S.; Majeed, A.

    2016-01-01

    Web applications are widely used worldwide; however, it is important that the navigation of these websites is effective, to enhance usability. Navigation is not limited to links between pages; it is also how we complete a task. The navigational structure, presented as hypertext, is one of the most important components of a Web application besides content and presentation. The main objective of this paper is to explore the navigational structure of various open-source Content Management Systems (CMSs) from the developer's perspective. For this purpose three CMSs were chosen: WordPress, Joomla, and Drupal. The objective of the research is to identify the important navigational aspects present in these CMSs. Moreover, a comparative study of these CMSs in terms of navigational support is required. For this purpose an industrial survey was conducted based on our proposed navigational evaluation model. The results show that there is a correlation between the identified factors and that these CMSs provide helpful and effective navigational support to their users. (author)

  4. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    Science.gov (United States)

    The model performance evaluation consists of metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude-and-sequence errors.

  5. Diverse Secreted Effectors Are Required for Salmonella Persistence in a Mouse Infection Model

    Energy Technology Data Exchange (ETDEWEB)

    Kidwai, Afshan S.; Mushamiri, Ivy T.; Niemann, George; Brown, Roslyn N.; Adkins, Joshua N.; Heffron, Fred

    2013-08-12

    Salmonella enterica serovar Typhimurium causes typhoid-like disease in mice and is a model of typhoid fever in humans. One of the hallmarks of typhoid is persistence, the ability of the bacteria to survive in the host weeks after infection. Virulence factors called effectors facilitate this process by direct transfer to the cytoplasm of infected cells, thereby subverting cellular processes. Secretion of effectors to the cell cytoplasm takes place through multiple routes, including two separate type III secretion system (T3SS) apparatuses as well as outer membrane vesicles. The two T3SSs are encoded on separate pathogenicity islands, SPI-1 and SPI-2, with SPI-1 more strongly associated with the intestinal phase of infection, and SPI-2 with the systemic phase. Both T3SSs are required for persistence, but the effectors required have not been systematically evaluated. In this study, mutations in 48 described effectors were tested for persistence. We replaced each effector with a specific DNA barcode sequence by allelic exchange and co-infected with a wild-type reference to calculate the ratio of wild-type parent to mutant at different times after infection. The competitive index (CI) was determined by quantitative PCR in which primers that correspond to the barcode were used for amplification. Mutations in all but seven effectors reduced persistence, demonstrating that most effectors were required. One exception was CigR, a recently discovered effector that is widely conserved throughout enteric bacteria. Deletion of cigR increased lethality, suggesting that it may be an anti-virulence factor. The fact that almost all Salmonella effectors are required for persistence argues against redundant functions. This is different from effector repertoires in other intracellular pathogens such as Legionella.
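    The competitive-index calculation described above is a simple ratio of ratios: the mutant-to-wild-type abundance in the recovered population, normalized by the same ratio in the inoculum. The sketch below uses that standard definition; the qPCR-derived counts are invented for illustration.

```python
def competitive_index(mutant_out, wt_out, mutant_in, wt_in):
    """CI = (mutant/wild-type in output) / (mutant/wild-type in input).
    CI < 1 indicates the mutant is outcompeted, i.e. reduced persistence."""
    return (mutant_out / wt_out) / (mutant_in / wt_in)

# A mutant recovered at one-tenth of its input share: CI well below 1
print(competitive_index(mutant_out=100, wt_out=1000,
                        mutant_in=500, wt_in=500))  # → 0.1
```

    Normalizing by the inoculum ratio is what lets a 1:1 co-infection design tolerate imprecise input mixing: any input imbalance cancels out of the index.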

  6. Presenting an evaluation model of the trauma registry software.

    Science.gov (United States)

    Asadi, Farkhondeh; Paydar, Somayeh

    2018-04-01

Trauma accounts for roughly 10% of deaths worldwide and is considered a global concern. This problem has led healthcare policy makers and managers to adopt a basic strategy in this context. Trauma registries play an important and basic role in decreasing the mortality and disability caused by traumatic injuries. Today, various software packages are designed for trauma registries, and evaluating this software improves management and increases the efficiency and effectiveness of these systems. Therefore, the aim of this study is to present an evaluation model for trauma registry software. The present study is applied research. General and specific criteria of trauma registry software were identified by reviewing the literature, including books, articles, scientific documents, valid websites and related software in this domain. Based on these general and specific criteria and the related software, a model for evaluating trauma registry software was proposed. From the proposed model, a checklist was designed and its validity and reliability were evaluated. Using the Delphi technique, the model was presented to 12 experts and specialists; to analyze the results, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, once the model was approved by the experts and professionals, the final version of the evaluation model for trauma registry software was presented. The criteria of trauma registry software fall into two groups: 1- general criteria, 2- specific criteria. General criteria were classified into four main categories: 1- usability, 2- security, 3- maintainability, and 4- interoperability. Specific criteria were divided into four main categories: 1- data submission and entry, 2- reporting, 3- quality control, 4- decision and research support. The model presented in this research introduces the important general and specific criteria of trauma registry software.
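The Delphi step, scoring expert agreement against the 75% threshold, can be sketched as follows. The criterion names and votes below are hypothetical, not the study's data:

```python
def agreement(votes):
    """Fraction of panel experts approving a criterion (1 = approve, 0 = reject)."""
    return sum(votes) / len(votes)

def retained(criteria_votes, threshold=0.75):
    """Keep the criteria whose expert agreement meets the Delphi threshold."""
    return [name for name, votes in criteria_votes.items()
            if agreement(votes) >= threshold]

# Hypothetical ratings by a 12-expert panel for three candidate criteria
panel = {
    "usability":        [1] * 11 + [0],      # 11/12 agreement, kept
    "interoperability": [1] * 9  + [0] * 3,  # exactly 75%, kept
    "gamification":     [1] * 6  + [0] * 6,  # 50%, dropped
}
print(retained(panel))  # ['usability', 'interoperability']
```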

  7. Modelo de requisitos para sistemas embebidos: Model of requirements for embedded systems

    Directory of Open Access Journals (Sweden)

    Liliana González Palacio

    2008-07-01

Full Text Available In this paper a model of requirements for supporting the construction of embedded systems is presented. The Requirements Engineering methodologies currently proposed for this domain do not provide continuity across the development process, since they are strongly oriented towards the design stage and place a weaker emphasis on the analysis stage. Furthermore, such methodologies offer guidelines for treating requirements after they have been obtained, but do not propose tools, such as a requirements model, for obtaining them. This work is part of a research project whose objective is to propose a Requirements Engineering (RE) methodology for the analysis of embedded systems (ES). The proposed requirements model and its use are illustrated through an application case consisting of obtaining the requirements for a movement-sensing system embedded in a home alarm system.

  8. Evaluation and comparison of models and modelling tools simulating nitrogen processes in treatment wetlands

    DEFF Research Database (Denmark)

    Edelfeldt, Stina; Fritzson, Peter

    2008-01-01

In this paper, two ecological models of nitrogen processes in treatment wetlands have been evaluated and compared. These models were implemented, simulated, and visualized using the Modelica modelling and simulation language [P. Fritzson, Principles of Object-Oriented Modelling and Simulation with Modelica 2.1 (Wiley-IEEE Press, USA, 2004)] and an associated tool. The differences and similarities between the MathModelica Model Editor and three other ecological modelling tools have also been evaluated. The results show that the models can well be modelled and simulated in the MathModelica Model Editor, and that nitrogen decrease in a constructed treatment wetland should be described and simulated using the Nitrification/Denitrification model, as this model has the highest overall quality score and provides a more variable environment.

  9. Research on evaluation of enterprise project culture based on Denison model

    Directory of Open Access Journals (Sweden)

    Yucheng Zeng

    2015-05-01

Full Text Available Purpose: The purpose of this paper is to build an enterprise project culture evaluation model and search for the best evaluation method for Chinese enterprise project culture, on the basis of studying and drawing lessons from enterprise culture evaluation theory and methods at home and abroad. Design/methodology/approach: Referring to the Denison enterprise culture evaluation model, this paper optimizes it according to the distinctive features of enterprise project culture, designs the enterprise project culture evaluation model and proves the practicability of the model through empirical research. Findings: Through comparative analysis of domestic and foreign enterprise culture evaluation theory and methods, this paper finds that the Denison model is well suited to enterprise project culture evaluation; empirical research shows that a systematic project culture management framework has not yet formed in Chinese enterprises; and four factors in enterprise project culture have an important influence on improving project operation performance. Research limitations/implications: The research on evaluation of enterprise project culture based on the Denison model is a preliminary attempt; the design of the evaluation index system, evaluation model and scale structure still needs to be improved, but the thinking of this paper in this field provides a valuable reference for future research. Practical implications: This paper provides theoretical and practical support for evaluating the present situation of enterprise project culture construction and analyzing the advantages and disadvantages of project culture, which contributes to the "dialectical therapy" of enterprise project management, enterprise management and enterprise project culture construction. Originality/value: The main contribution of this paper is the introduction of the Denison enterprise culture model. Combining it with the actual situation of enterprises, this paper also builds the evaluation model for

  10. Local fit evaluation of structural equation models using graphical criteria.

    Science.gov (United States)

    Thoemmes, Felix; Rosseel, Yves; Textor, Johannes

    2018-03-01

    Evaluation of model fit is critically important for every structural equation model (SEM), and sophisticated methods have been developed for this task. Among them are the χ² goodness-of-fit test, decomposition of the χ², derived measures like the popular root mean square error of approximation (RMSEA) or comparative fit index (CFI), or inspection of residuals or modification indices. Many of these methods provide a global approach to model fit evaluation: A single index is computed that quantifies the fit of the entire SEM to the data. In contrast, graphical criteria like d-separation or trek-separation allow derivation of implications that can be used for local fit evaluation, an approach that is hardly ever applied. We provide an overview of local fit evaluation from the viewpoint of SEM practitioners. In the presence of model misfit, local fit evaluation can potentially help in pinpointing where the problem with the model lies. For models that do fit the data, local tests can identify the parts of the model that are corroborated by the data. Local tests can also be conducted before a model is fitted at all, and they can be used even for models that are globally underidentified. We discuss appropriate statistical local tests, and provide applied examples. We also present novel software in R that automates this type of local fit evaluation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
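The article's R software is not reproduced here, but the core idea of a local test can be sketched in Python. Data simulated from the chain X → M → Y must show X and Y marginally correlated yet uncorrelated given M; a partial-correlation check of that implied conditional independence is a minimal local fit test (all data below are simulated for illustration):

```python
import math
import random

def pearson(a, b):
    """Sample Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def partial_corr(x, y, z):
    """Correlation of x and y after controlling for z."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz**2) * (1 - ryz**2))

random.seed(0)
# Data generated from the chain X -> M -> Y, which implies X _||_ Y | M
x = [random.gauss(0, 1) for _ in range(5000)]
m = [0.8 * xi + random.gauss(0, 1) for xi in x]
y = [0.8 * mi + random.gauss(0, 1) for mi in m]

print(abs(pearson(x, y)) > 0.3)        # marginally dependent: True
print(abs(partial_corr(x, y, m)) < 0.05)  # near zero given M: True
```

A chain model fitted to such data would pass this local test; a large partial correlation would pinpoint exactly which implied independence the data contradict.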

  11. Capital Regulation, Liquidity Requirements and Taxation in a Dynamic Model of Banking

    NARCIS (Netherlands)

    Di Nicolo, G.; Gamba, A.; Lucchetta, M.

    2011-01-01

This paper formulates a dynamic model of a bank exposed to both credit and liquidity risk, which can resolve financial distress in three costly forms: fire sales, bond issuance and equity issuance. We use the model to analyze the impact of capital regulation, liquidity requirements and taxation on

  13. The Use of AMET & Automated Scripts for Model Evaluation

    Science.gov (United States)

Brief overview of EPA's new CMAQ website to be launched publicly in June 2017. Details on the upcoming release of the Atmospheric Model Evaluation Tool (AMET) and the creation of automated scripts for post-processing and evaluating air quality model data.

  14. Agent-based modeling as a tool for program design and evaluation.

    Science.gov (United States)

    Lawlor, Jennifer A; McGirr, Sara

    2017-12-01

Recently, systems thinking and systems science approaches have gained popularity in the field of evaluation; however, there has been relatively little exploration of how evaluators could use quantitative tools to assist in the implementation of systems approaches therein. The purpose of this paper is to explore potential uses of one such quantitative tool, agent-based modeling, in evaluation practice. To this end, we define agent-based modeling and offer potential uses for it in typical evaluation activities, including: engaging stakeholders, selecting an intervention, modeling program theory, setting performance targets, and interpreting evaluation results. We provide demonstrative examples from published agent-based modeling efforts both inside and outside the field of evaluation for each of the evaluative activities discussed. We further describe potential pitfalls of this tool and offer cautions for evaluators who may choose to implement it in their practice. Finally, the article concludes with a discussion of the future of agent-based modeling in evaluation practice and a call for more formal exploration of this tool as well as other approaches to simulation modeling in the field. Copyright © 2017 Elsevier Ltd. All rights reserved.
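As a flavour of the tool, here is a deliberately minimal agent-based model, invented for illustration rather than taken from the paper, in which program adoption spreads through random peer contact. An evaluator might use such a toy to explore plausible performance targets before committing to them:

```python
import random

random.seed(42)

class Agent:
    def __init__(self):
        self.adopted = False

def step(agents, meet=3, influence=0.5):
    """Each non-adopter meets a few random peers and adopts the program
    if at least half of those peers have already adopted."""
    for a in agents:
        if not a.adopted:
            peers = random.sample(agents, meet)
            if sum(p.adopted for p in peers) / meet >= influence:
                a.adopted = True

agents = [Agent() for _ in range(200)]
for a in random.sample(agents, 20):   # seed the program with 10% early adopters
    a.adopted = True

trajectory = []
for _ in range(15):
    step(agents)
    trajectory.append(sum(a.adopted for a in agents))

# Adoption never reverses in this toy model, so the curve is non-decreasing
print(trajectory[-1] >= trajectory[0])  # True
```

Varying `meet` and `influence` and re-running shows how sensitive the adoption curve, and hence any performance target derived from it, is to assumptions about peer influence.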

  15. Review of models used for determining consequences of UF6 release: Development of model evaluation criteria. Volume 1

    International Nuclear Information System (INIS)

    Nair, S.K.; Chambers, D.B.; Park, S.H.; Hoffman, F.O.

    1997-11-01

The objective of this study is to examine the usefulness and effectiveness of currently existing models that simulate the release of uranium hexafluoride from UF6-handling facilities, subsequent reactions of UF6 with atmospheric moisture, and the dispersion of UF6 and reaction products in the atmosphere. The study evaluates screening-level and detailed public-domain models that were specifically developed for UF6 and models that were originally developed for the treatment of dense gases but are applicable to UF6 release, reaction, and dispersion. The model evaluation process is divided into three specific tasks: model-component evaluation; applicability evaluation; and user interface and quality assurance and quality control (QA/QC) evaluation. Within the model-component evaluation process, a model's treatment of source term, thermodynamics, and atmospheric dispersion are considered and model predictions are compared with actual observations. Within the applicability evaluation process, a model's applicability to Integrated Safety Analysis, Emergency Response Planning, and Post-Accident Analysis, and to site-specific considerations are assessed. Finally, within the user interface and QA/QC evaluation process, a model's user-friendliness, presence and clarity of documentation, ease of use, etc. are assessed, along with its handling of QA/QC. This document presents the complete methodology used in the evaluation process.

  16. Evaluation of safety, an unavoidable requirement in the applications of ionizing radiations

    International Nuclear Information System (INIS)

    Jova Sed, Luis Andres

    2013-01-01

Safety assessments should be conducted as a means to evaluate compliance with safety requirements (and thus the application of the fundamental safety principles) for all facilities and activities, in order to determine the measures to be taken to ensure safety. They are an essential tool in decision making. For a long time we have linked safety assessment to nuclear facilities, and not to all practices involving the use of ionizing radiation in daily life. However, the main purpose of a safety assessment is to determine whether an appropriate level of safety has been achieved for an installation or activity, and whether the safety objectives and basic safety criteria set by the designer, operating organization and the regulatory body have been fulfilled, under the protection and safety requirements set out in the International Basic Safety Standards for Protection against Ionizing Radiation and for the Safety of Radiation Sources. This paper presents some criteria and personal experiences with the new international recommendations on this subject and their practical application in the region, and demonstrates the importance of this requirement. It reflects the need to train the personnel of the operator and the regulatory body in the proportionate application of this requirement in practices with ionizing radiation.

  17. Promoting Excellence in Nursing Education (PENE): Pross evaluation model.

    Science.gov (United States)

    Pross, Elizabeth A

    2010-08-01

    The purpose of this article is to examine the Promoting Excellence in Nursing Education (PENE) Pross evaluation model. A conceptual evaluation model, such as the one described here, may be useful to nurse academicians in the ongoing evaluation of educational programs, especially those with goals of excellence. Frameworks for evaluating nursing programs are necessary because they offer a way to systematically assess the educational effectiveness of complex nursing programs. This article describes the conceptual framework and its tenets of excellence. Copyright 2009 Elsevier Ltd. All rights reserved.

  18. An economic evaluation of photovoltaic grid connected systems (PVGCS) in Flanders for companies: A generic model

    International Nuclear Information System (INIS)

    Audenaert, Amaryllis; De Boeck, Liesje; De Cleyn, Sven; Lizin, Sebastien; Adam, Jean-Francois

    2010-01-01

In this paper an economic evaluation of photovoltaic grid connected systems (PVGCS) for companies situated in Flanders (Belgium) is conducted by using a generic Excel model. The model is unique in that it includes the dimension of taxation. This inclusion is required; otherwise the fiscal benefit of using solar panels is not accounted for. The model uses the cash flow projection method. This technique allows the calculation of the following classical evaluation criteria: net present value, internal rate of return, payback period, discounted payback period, profitability index, yield unit cost, yield unit revenue and break-even turnkey cost. Their outcome makes it possible to answer the question of whether installing a PVGCS in Flanders is a responsible financial investment for companies. Furthermore, the paper estimates whether the corporate environment is ready for a change in subsidy legislation. This change has recently been announced, so it is possible to gauge whether the current market situation would remain profitable under the future legislation. (author)
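The classical criteria named above follow directly from a cash-flow projection. A sketch with invented figures (not the paper's Flemish tariff data) shows three of them:

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the (negative) investment at t=0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-9):
    """Internal rate of return via bisection (assumes a single sign change)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def payback_period(cashflows):
    """First year in which the cumulative (undiscounted) cash flow turns non-negative."""
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return t
    return None

# Hypothetical PVGCS: 20,000 EUR turnkey cost, 2,500 EUR/year net revenue for 12 years
flows = [-20000] + [2500] * 12
print(npv(0.04, flows) > 0)      # positive NPV at a 4% discount rate: True
print(0.06 < irr(flows) < 0.08)  # IRR of roughly 7%: True
print(payback_period(flows))     # 8 (years)
```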

  19. Modelling basin-wide variations in Amazon forest productivity – Part 1: Model calibration, evaluation and upscaling functions for canopy photosynthesis

    Directory of Open Access Journals (Sweden)

    L. M. Mercado

    2009-07-01

    Full Text Available Given the importance of Amazon rainforest in the global carbon and hydrological cycles, there is a need to parameterize and validate ecosystem gas exchange and vegetation models for this region in order to adequately simulate present and future carbon and water balances. In this study, a sun and shade canopy gas exchange model is calibrated and evaluated at five rainforest sites using eddy correlation measurements of carbon and energy fluxes.

Results from the model-data evaluation suggest that with adequate parameterisation, photosynthesis models taking into account the separation of diffuse and direct irradiance and the dynamics of sunlit and shaded leaves can accurately represent photosynthesis in these forests. Also, stomatal conductance formulations that only take into account atmospheric demand fail to correctly simulate moisture and CO2 fluxes in forests with a pronounced dry season, particularly during afternoon conditions. Nevertheless, it is also the case that large uncertainties are associated not only with the eddy correlation data, but also with the estimates of ecosystem respiration required for model validation. To accurately simulate Gross Primary Productivity (GPP) and energy partitioning, the most critical parameters and model processes are the quantum yield of photosynthetic uptake, the maximum carboxylation capacity of Rubisco, and simulation of stomatal conductance.

Using this model-data synergy, we developed scaling functions to provide estimates of canopy photosynthetic parameters for a range of diverse forests across the Amazon region, utilising the best fitted parameter for maximum carboxylation capacity of Rubisco, and foliar nutrients (N and P) for all sites.
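A sun/shade (two-big-leaf) scheme of the kind calibrated here can be caricatured as the minimum of a light-limited rate (quantum yield times PAR) and a Rubisco-limited ceiling. The parameter values below are illustrative only, not the paper's calibrated values:

```python
def leaf_assimilation(par, amax, alpha=0.05):
    """Leaf assimilation (umol CO2/m2/s): the lesser of a light-limited rate
    (quantum yield alpha * PAR) and a Rubisco-limited ceiling amax."""
    return min(alpha * par, amax)

def canopy_gpp(direct_par, diffuse_par, sunlit_frac, amax, alpha=0.05):
    """Two-big-leaf canopy photosynthesis: sunlit leaves receive direct plus
    diffuse light, shaded leaves diffuse only, weighted by the sunlit fraction."""
    sun = leaf_assimilation(direct_par + diffuse_par, amax, alpha)
    shade = leaf_assimilation(diffuse_par, amax, alpha)
    return sunlit_frac * sun + (1 - sunlit_frac) * shade

# At equal total PAR, all-diffuse (overcast) light lets shaded leaves work
# harder, so canopy GPP can exceed the clear-sky case:
clear = canopy_gpp(direct_par=1200, diffuse_par=300, sunlit_frac=0.4, amax=30)
overcast = canopy_gpp(direct_par=0, diffuse_par=1500, sunlit_frac=0.4, amax=30)
print(clear, overcast)  # 21.0 30.0
```

This is why the separation of diffuse and direct irradiance, noted above as critical, matters for simulated GPP.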

  20. 14 CFR 382.133 - What are the requirements concerning the evaluation and use of passenger-supplied electronic...

    Science.gov (United States)

    2010-01-01

    ... evaluation and use of passenger-supplied electronic devices that assist passengers with respiration in the... What are the requirements concerning the evaluation and use of passenger-supplied electronic devices... to use in the passenger cabin during air transportation, a ventilator, respirator, continuous...

  1. Model of service-oriented catering supply chain performance evaluation

    Directory of Open Access Journals (Sweden)

    Juanqiong Gou

    2013-03-01

Full Text Available Purpose: The aim of this paper is to construct a performance evaluation model for a service-oriented catering supply chain. Design/methodology/approach: Based on research into the current situation of the catering industry, this paper summarizes the characteristics of the catering supply chain and then presents a service-oriented catering supply chain model built on a logistics and information platform. Finally, the fuzzy AHP method is used to evaluate the performance of the service-oriented catering supply chain. Findings: From the analysis of the characteristics of the catering supply chain, we construct a performance evaluation model intended to safeguard food safety, logistics efficiency, price stability and so on. Practical implications: Such an efficient and effective service supply chain model can be used not only for an enterprise's own improvement but also for selecting different customers and choosing among different development models. Originality/value: This paper gives a new definition of the service-oriented catering supply chain and offers a model to evaluate the performance of this catering supply chain.
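The paper applies fuzzy AHP; as a simplified crisp-AHP sketch, priority weights for hypothetical catering-supply-chain criteria can be derived from a pairwise-comparison matrix by the geometric-mean method (criteria and judgements below are invented, not the paper's):

```python
def ahp_weights(matrix):
    """Priority weights from a pairwise-comparison matrix via the
    geometric-mean (row geometric mean, then normalize) method."""
    n = len(matrix)
    gms = []
    for row in matrix:
        gm = 1.0
        for v in row:
            gm *= v
        gms.append(gm ** (1.0 / n))
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical judgements on Saaty's 1-9 scale for three criteria:
# food safety vs logistics efficiency vs price stability
comparisons = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]
weights = ahp_weights(comparisons)
print([round(w, 2) for w in weights])  # [0.65, 0.23, 0.12]; food safety dominates
```

Fuzzy AHP replaces the crisp entries with triangular fuzzy numbers before aggregation, but the normalization logic is the same.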

  2. Evaluation of potential crushed-salt constitutive models

    International Nuclear Information System (INIS)

    Callahan, G.D.; Loken, M.C.; Sambeek, L.L. Van; Chen, R.; Pfeifle, T.W.; Nieland, J.D.; Hansen, F.D.

    1995-12-01

Constitutive models describing the deformation of crushed salt are presented in this report. Ten constitutive models with potential to describe the phenomenological and micromechanical processes for crushed salt were selected from a literature search. Three of these ten constitutive models, termed the Sjaardema-Krieg, Zeuch, and Spiers models, were adopted as candidate constitutive models. The candidate constitutive models were generalized in a consistent manner to three-dimensional states of stress and modified to include the effects of temperature, grain size, and moisture content. A database including hydrostatic consolidation and shear consolidation tests conducted on Waste Isolation Pilot Plant and southeastern New Mexico salt was used to determine material parameters for the candidate constitutive models. Nonlinear least-squares model fitting to data from the hydrostatic consolidation tests, the shear consolidation tests, and a combination of the shear and hydrostatic tests produces three sets of material parameter values for the candidate models. The change in material parameter values from test group to test group indicates the empirical nature of the models. To evaluate the predictive capability of the candidate models, each parameter value set was used to predict each of the tests in the database. Based on the fitting statistics and the ability of the models to predict the test data, the Spiers model appeared to perform slightly better than the other two candidate models. The work reported here is a first-of-its-kind evaluation of constitutive models for reconsolidation of crushed salt. Questions remain to be answered. Deficiencies in models and databases are identified and recommendations for future work are made. 85 refs

  3. An evaluation of the predictive performance of distributional models for flora and fauna in north-east New South Wales.

    Science.gov (United States)

    Pearce, J; Ferrier, S; Scotts, D

    2001-06-01

    To use models of species distributions effectively in conservation planning, it is important to determine the predictive accuracy of such models. Extensive modelling of the distribution of vascular plant and vertebrate fauna species within north-east New South Wales has been undertaken by linking field survey data to environmental and geographical predictors using logistic regression. These models have been used in the development of a comprehensive and adequate reserve system within the region. We evaluate the predictive accuracy of models for 153 small reptile, arboreal marsupial, diurnal bird and vascular plant species for which independent evaluation data were available. The predictive performance of each model was evaluated using the relative operating characteristic curve to measure discrimination capacity. Good discrimination ability implies that a model's predictions provide an acceptable index of species occurrence. The discrimination capacity of 89% of the models was significantly better than random, with 70% of the models providing high levels of discrimination. Predictions generated by this type of modelling therefore provide a reasonably sound basis for regional conservation planning. The discrimination ability of models was highest for the less mobile biological groups, particularly the vascular plants and small reptiles. In the case of diurnal birds, poor performing models tended to be for species which occur mainly within specific habitats not well sampled by either the model development or evaluation data, highly mobile species, species that are locally nomadic or those that display very broad habitat requirements. Particular care needs to be exercised when employing models for these types of species in conservation planning.
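The discrimination measure used here, the area under the relative operating characteristic curve, equals the probability that a randomly chosen presence site receives a higher predicted value than a randomly chosen absence site. A small sketch with invented scores:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic: the probability
    that a random presence site outscores a random absence site (ties count 0.5)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical model outputs at independent evaluation sites
present = [0.9, 0.8, 0.7, 0.4]  # species recorded
absent  = [0.6, 0.3, 0.2, 0.1]  # species not recorded
print(auc(present, absent))  # 0.9375, i.e. high discrimination ability
```

An AUC of 0.5 corresponds to the random baseline against which the 89% of significant models were judged.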

  4. The Benefit of Ambiguity in Understanding Goals in Requirements Modelling

    DEFF Research Database (Denmark)

    Paay, Jeni; Pedell, Sonja; Sterling, Leon

    2011-01-01

This paper examines the benefit of ambiguity in describing goals in requirements modelling for the design of socio-technical systems using concepts from Agent-Oriented Software Engineering (AOSE) and ethnographic and cultural probe methods from Human Computer Interaction (HCI). The authors' aim is to preserve ambiguity in the process of elicitation and analysis through the use of empirically informed quality goals attached to functional goals. The authors demonstrate the benefit of articulating a quality goal without turning it into a functional goal. Their study shows that quality goals kept at a high level of abstraction, ambiguous and open for conversations through the modelling process, add richness to goal models and communicate quality attributes of the interaction being modelled to the design phase, where this ambiguity is regarded as a resource for design.

  5. Evaluation of atmospheric dispersion/consequence models supporting safety analysis

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Lazaro, M.A.; Woodard, K.

    1996-01-01

    Two DOE Working Groups have completed evaluation of accident phenomenology and consequence methodologies used to support DOE facility safety documentation. The independent evaluations each concluded that no one computer model adequately addresses all accident and atmospheric release conditions. MACCS2, MATHEW/ADPIC, TRAC RA/HA, and COSYMA are adequate for most radiological dispersion and consequence needs. ALOHA, DEGADIS, HGSYSTEM, TSCREEN, and SLAB are recommended for chemical dispersion and consequence applications. Additional work is suggested, principally in evaluation of new models, targeting certain models for continued development, training, and establishing a Web page for guidance to safety analysts
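The codes listed are dense-gas and consequence models; the screening-level calculation they generalize, for a passive (neutrally buoyant) release, is the ground-reflected Gaussian plume. A sketch with assumed dispersion coefficients (all numbers hypothetical):

```python
import math

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Ground-reflected Gaussian plume concentration (g/m^3) for a passive release.
    q: emission rate (g/s); u: wind speed (m/s); h: effective release height (m);
    y, z: crosswind and vertical receptor coordinates (m); sigma_y, sigma_z:
    dispersion coefficients (m) evaluated at the downwind distance of interest."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2))
                + math.exp(-(z + h)**2 / (2 * sigma_z**2)))  # image-source reflection
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Hypothetical screening case: 10 g/s release, 3 m/s wind, ground-level
# centreline receptor, sigmas assumed for ~500 m downwind in neutral conditions
c = gaussian_plume(q=10, u=3, y=0, z=0, h=20, sigma_y=36, sigma_z=18)
print(f"{c:.2e} g/m3")
```

A dense gas such as UF6 and its reaction products slumps and spreads differently, which is precisely why the evaluated codes (DEGADIS, SLAB, HGSYSTEM) go beyond this formula.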

  6. Evaluating the impact of strategic personnel policies using a MILP model: The public university case

    International Nuclear Information System (INIS)

    Torre, R. de la; Lusa, A.; Mateo, M.

    2016-01-01

Purpose: The main purpose of the paper is to evaluate the impact of diverse personnel policies around personnel promotion in the design of the strategic staff plan for a public university. Strategic staff planning consists in the determination of the size and composition of the workforce for an organization. Design/methodology/approach: The staff planning is solved using a Mixed Integer Linear Programming (MILP) model. The MILP model represents the organizational structure of the university, the personnel categories and capacity decisions, the demand requirements, the required service level and budget restrictions. All these aspects are translated into a set of data, as well as the parameters and constraints building up the mathematical model for optimization. The required data for the model is adopted from a Spanish public university. Findings: The development of appropriate policies for personnel promotion can effectively reduce the number of dismissals while proposing a transition towards different preferable workforce structures in the university. Research limitations/implications: The long-term staff plan for the university is solved by the MILP model considering a time horizon of 8 years. For this time horizon, the required input data is derived from current data of the university. Different scenarios are proposed considering different temporal trends for input data, such as demand and admissible promotional ratios for workers. Originality/value: The literature review reports a lack of formalized procedures for staff planning in universities that take into account, at the same time, the regulations on hiring, dismissals, promotions and workforce heterogeneity, all considered to optimize workforce size and composition while addressing not only economic criteria but also the required workforce expertise and the quality of the service offered. This paper adopts a formalized procedure developed by the authors in previous works, and exploits it to assess the
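At toy scale the staffing trade-off can be illustrated without a MILP solver. The brute-force sketch below (hypothetical salaries, teaching loads and constraints, not the paper's model) minimizes salary cost subject to demand coverage, a budget cap and a seniority floor:

```python
from itertools import product

# Hypothetical instance: choose how many assistant, associate and full
# professors to employ so teaching demand is covered at minimum salary cost.
salary   = {"assistant": 40, "associate": 55, "full": 75}   # kEUR/year
teaching = {"assistant": 24, "associate": 18, "full": 12}   # credits/year
demand   = 300                                              # credits to cover
budget   = 900                                              # kEUR/year

best = None
for n_asst, n_assoc, n_full in product(range(16), repeat=3):
    credits = (n_asst * teaching["assistant"] + n_assoc * teaching["associate"]
               + n_full * teaching["full"])
    cost = (n_asst * salary["assistant"] + n_assoc * salary["associate"]
            + n_full * salary["full"])
    # feasibility: cover demand, respect budget, keep at least 2 full professors
    if credits >= demand and cost <= budget and n_full >= 2:
        if best is None or cost < best[0]:
            best = (cost, (n_asst, n_assoc, n_full))

print(best)  # (630, (12, 0, 2)): 12 assistants and 2 full professors
```

A real MILP formulation adds the multi-year dimension, promotion flows between categories and dismissal penalties, which is where a solver becomes necessary.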

  7. Evaluating the impact of strategic personnel policies using a MILP model: The public university case

    Energy Technology Data Exchange (ETDEWEB)

    Torre, R. de la; Lusa, A.; Mateo, M.

    2016-07-01

    Purpose: The main purpose of the paper is to evaluate the impact of diverse personnel policies around personnel promotion in the design of the strategic staff plan for a public university. The strategic staff planning consists in the determination of the size and composition of the workforce for an organization. Design/methodology/approach: The staff planning is solved using a Mixed Integer Linear Programming (MILP) model. The MILP model represents the organizational structure of the university, the personnel categories and capacity decisions, the demand requirements, the required service level and budget restrictions. All these aspects are translated into a set of data, as well as the parameters and constraints building up the mathematical model for optimization. The required data for the model is adopted from a Spanish public university. Findings: The development of appropriate policies for personnel promotion can effectively reduce the number of dismissals while proposing a transition towards different preferable workforce structures in the university. Research limitations/implications: The long term staff plan for the university is solved by the MILP model considering a time horizon of 8 years. For this time horizon, the required input data is derived from current data of the university. Different scenarios are proposed considering different temporal trends for input data, such as in demand and admissible promotional ratios for workers. Originality/value: The literature review reports a lack of formalized procedures for staff planning in universities taking into account, at the same time, the regulations on hiring, dismissals, promotions and the workforce heterogeneity, all considered to optimize workforce size and composition addressing not only an economic criteria, but also the required workforce expertise and the quality in the service offered. 
This paper adopts a formalized procedure developed by the authors in previous works, and exploits it to assess the

  8. The Use of AMET and Automated Scripts for Model Evaluation

    Science.gov (United States)

    The Atmospheric Model Evaluation Tool (AMET) is a suite of software designed to facilitate the analysis and evaluation of meteorological and air quality models. AMET matches the model output for particular locations to the corresponding observed values from one or more networks ...

  9. A Model for Telestroke Network Evaluation

    DEFF Research Database (Denmark)

    Storm, Anna; Günzel, Franziska; Theiss, Stephan

    2011-01-01

analysis lacking, current telestroke reimbursement by third-party payers is limited to special contracts and not included in the regular billing system. Based on a systematic literature review and expert interviews with health care economists, third-party payers and neurologists, a Markov model was developed from the third-party payer perspective. In principle, it enables telestroke networks to conduct cost-effectiveness studies, because the majority of the required data can be extracted from health insurance companies’ databases and the telestroke network itself. The model presents a basis...

  10. Modelling of the activity system - development of an evaluation method for integrated system validation

    International Nuclear Information System (INIS)

    Norros, Leena; Savioja, Paula

    2004-01-01

    In this paper we present our recent research, which focuses on creating an evaluation method for human-system interfaces of complex systems. The method is aimed for use in the validation of modernised nuclear power plant (NPP) control rooms and other complex systems with high reliability requirements. The task in validation is to determine whether the human-system combination functions safely and effectively. This question can be operationalized as the selection of relevant operational features and their appropriate acceptance criteria. Thus, there is a need to ensure that the results of the evaluation can be generalized so that they serve the purpose of integrated system validation. The definition of appropriate acceptance criteria provides a basis for judging the appropriateness of the performance of the system. We propose that the operational situations and the acceptance criteria should be defined based on modelling of NPP operation, comprehended as an activity system. We developed a new core-task modelling framework. It is a formative modelling approach that combines causal, functional and understanding explanations of system performance. In this paper we reason how modelling can be used as a medium to determine the validity of the emerging control room system. (Author)

  11. A mathematical approach for evaluating Markov models in continuous time without discrete-event simulation.

    Science.gov (United States)

    van Rosmalen, Joost; Toy, Mehlika; O'Mahony, James F

    2013-08-01

    Markov models are a simple and powerful tool for analyzing the health and economic effects of health care interventions. These models are usually evaluated in discrete time using cohort analysis. The use of discrete time assumes that changes in health states occur only at the end of a cycle period. Discrete-time Markov models only approximate the process of disease progression, as clinical events typically occur in continuous time. The approximation can yield biased cost-effectiveness estimates for Markov models with long cycle periods and if no half-cycle correction is made. The purpose of this article is to present an overview of methods for evaluating Markov models in continuous time. These methods use mathematical results from stochastic process theory and control theory. The methods are illustrated using an applied example on the cost-effectiveness of antiviral therapy for chronic hepatitis B. The main result is a mathematical solution for the expected time spent in each state in a continuous-time Markov model. It is shown how this solution can account for age-dependent transition rates and discounting of costs and health effects, and how the concept of tunnel states can be used to account for transition rates that depend on the time spent in a state. The applied example shows that the continuous-time model yields more accurate results than the discrete-time model but does not require much computation time and is easily implemented. In conclusion, continuous-time Markov models are a feasible alternative to cohort analysis and can offer several theoretical and practical advantages.
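
    The closed-form occupancy result described in this abstract can be sketched for a toy model. The two-state generator and the discount rate below are hypothetical (the paper's own example concerns antiviral therapy for hepatitis B); the formula is the standard matrix-exponential solution for discounted time-in-state:

```python
import numpy as np
from scipy.linalg import expm

def expected_discounted_time(Q, T, r):
    """Expected discounted time spent in each state over [0, T] for a
    continuous-time Markov model with generator matrix Q and discount rate r.

    Closed form: int_0^T exp(-r t) expm(Q t) dt
               = (r I - Q)^{-1} (I - exp(-r T) expm(Q T)),
    valid when (r I - Q) is invertible.
    """
    n = Q.shape[0]
    I = np.eye(n)
    return np.linalg.solve(r * I - Q, I - np.exp(-r * T) * expm(Q * T))

# Hypothetical two-state illness model: leave 'well' at rate 1/year,
# 'sick' is absorbing; 3% annual discounting over a 50-year horizon.
Q = np.array([[-1.0, 1.0],
              [ 0.0, 0.0]])
occ = expected_discounted_time(Q, T=50.0, r=0.03)
print(occ[0, 0])   # discounted years in 'well', starting well
```

    Entry `occ[i, j]` is the discounted expected time in state `j` starting from state `i`; no cycle length or half-cycle correction is involved.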

  12. A qualitative evaluation approach for energy system modelling frameworks

    DEFF Research Database (Denmark)

    Wiese, Frauke; Hilpert, Simon; Kaldemeyer, Cord

    2018-01-01

    properties define how useful it is in regard to the existing challenges. For energy system models, evaluation methods exist, but we argue that many decisions upon properties are rather made on the model generator or framework level. Thus, this paper presents a qualitative approach to evaluate frameworks...

  13. Toward the Decision Tree for Inferring Requirements Maturation Types

    Science.gov (United States)

    Nakatani, Takako; Kondo, Narihito; Shirogane, Junko; Kaiya, Haruhiko; Hori, Shozo; Katamine, Keiichi

    Requirements are elicited step by step during the requirements engineering (RE) process. However, some types of requirements are elicited completely only after the scheduled requirements elicitation process is finished. Such a situation is regarded as problematic. In our study, the difficulty of eliciting various kinds of requirements is observed per component. We refer to the components as observation targets (OTs) and introduce the term “requirements maturation,” which means when and how requirements are elicited completely in the project. Requirements maturation is discussed on physical and logical OTs. OTs viewed from a logical viewpoint are called logical OTs, e.g. quality requirements. The requirements of physical OTs, e.g. modules, components, subsystems, etc., include functional and non-functional requirements. They are influenced by their requesters' environmental changes, as well as developers' technical changes. In order to infer the requirements maturation period of each OT, we need to know how much these factors influence the OTs' requirements maturation. Based on the observation of actual past projects, we defined the PRINCE (Pre Requirements Intelligence Net Consideration and Evaluation) model. It aims to guide developers in their observation of the requirements maturation of OTs. We quantitatively analyzed actual cases with their requirements elicitation processes and extracted essential factors that influence requirements maturation. The results of interviews with project managers were analyzed with WEKA, a data mining system, from which a decision tree was derived. This paper introduces the PRINCE model and the category of logical OTs to be observed. The decision tree that helps developers infer the maturation type of an OT is also described. We evaluate the tree through real projects and discuss its ability to infer requirements maturation types.

  14. Airline service quality evaluation: A review on concepts and models

    OpenAIRE

    Navid Haghighat

    2017-01-01

    This paper reviews the major service quality concepts and models which led to great developments in evaluating service quality, focusing on the improvement process of the models through discussing criticisms of each model. Criticisms against these models are discussed to clarify the development steps of newer models which led to the improvement of airline service quality models. The precise and accurate evaluation of service quality needs a reliable concept with comprehensive crite...

  15. Evaluation of scoring models for identifying the need for therapeutic intervention of upper gastrointestinal bleeding: A new prediction score model for Japanese patients.

    Science.gov (United States)

    Iino, Chikara; Mikami, Tatsuya; Igarashi, Takasato; Aihara, Tomoyuki; Ishii, Kentaro; Sakamoto, Jyuichi; Tono, Hiroshi; Fukuda, Shinsaku

    2016-11-01

    Multiple scoring systems have been developed to predict outcomes in patients with upper gastrointestinal bleeding. We determined how well these and a newly established scoring model predict the need for therapeutic intervention, excluding transfusion, in Japanese patients with upper gastrointestinal bleeding. We reviewed data from 212 consecutive patients with upper gastrointestinal bleeding. Patients requiring endoscopic intervention, operation, or interventional radiology were allocated to the therapeutic intervention group. Firstly, we compared areas under the curve for the Glasgow-Blatchford, Clinical Rockall, and AIMS65 scores. Secondly, the scores and factors likely associated with upper gastrointestinal bleeding were analyzed with a logistic regression analysis to form a new scoring model. Thirdly, the new model and the existing model were investigated to evaluate their usefulness. Therapeutic intervention was required in 109 patients (51.4%). The Glasgow-Blatchford score was superior to both the Clinical Rockall and AIMS65 scores for predicting therapeutic intervention need (area under the curve, 0.75 [95% confidence interval, 0.69-0.81] vs 0.53 [0.46-0.61] and 0.52 [0.44-0.60], respectively). Multivariate logistic regression analysis retained seven significant predictors in the model: systolic blood pressure upper gastrointestinal bleeding. © 2016 Japan Gastroenterological Endoscopy Society.
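
    The model comparison reported above hinges on the area under the ROC curve. A minimal, self-contained way to compute it is via the Mann-Whitney statistic; the risk scores and outcomes below are hypothetical stand-ins, not the study's data:

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic: the
    probability that a randomly chosen positive case scores higher than a
    randomly chosen negative case, with ties counting one half."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical risk scores for patients who did (1) / did not (0) need
# therapeutic intervention.
scores = [7, 12, 3, 5, 1, 10, 4, 9]
needed = [0,  1, 0, 1, 0,  1, 0, 1]
print(round(auc(scores, needed), 3))   # 15 of 16 positive/negative pairs ordered correctly
```

    Comparing scoring systems then amounts to computing this statistic for each score on the same patients, as the study does for the Glasgow-Blatchford, Clinical Rockall and AIMS65 scores.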

  16. Validating and Determining the Weight of Items Used for Evaluating Clinical Governance Implementation Based on Analytic Hierarchy Process Model

    Directory of Open Access Journals (Sweden)

    Elaheh Hooshmand

    2015-10-01

    Full Text Available Background The purpose of implementing a system such as Clinical Governance (CG is to integrate, establish and globalize distinct policies in order to improve quality through increasing professional knowledge and the accountability of healthcare professionals toward providing clinical excellence. Since CG is related to change, and change requires money and time, CG implementation has to be focused on priority areas that are in more dire need of change. The purpose of the present study was to validate and determine the significance of items used for evaluating CG implementation. Methods The present study was descriptive-quantitative in method and design. Items used for evaluating CG implementation were first validated by the Delphi method and then compared with one another and ranked based on the Analytical Hierarchy Process (AHP) model. Results The items that were validated for evaluating CG implementation in Iran include performance evaluation, training and development, personnel motivation, clinical audit, clinical effectiveness, risk management, resource allocation, policies and strategies, external audit, information system management, research and development, CG structure, implementation prerequisites, the management of patients’ non-medical needs, complaints and patients’ participation in the treatment process. The most important items based on their degree of significance were training and development, performance evaluation, and risk management. The least important items included the management of patients’ non-medical needs, patients’ participation in the treatment process and research and development. Conclusion The fundamental requirements of CG implementation included having an effective policy at national level, avoiding perfectionism, using the expertise and potentials of the entire country and the coordination of this model with other models of quality improvement such as accreditation and patient safety.

  17. Validating and determining the weight of items used for evaluating clinical governance implementation based on analytic hierarchy process model.

    Science.gov (United States)

    Hooshmand, Elaheh; Tourani, Sogand; Ravaghi, Hamid; Vafaee Najar, Ali; Meraji, Marziye; Ebrahimipour, Hossein

    2015-04-08

    The purpose of implementing a system such as Clinical Governance (CG) is to integrate, establish and globalize distinct policies in order to improve quality through increasing professional knowledge and the accountability of healthcare professionals toward providing clinical excellence. Since CG is related to change, and change requires money and time, CG implementation has to be focused on priority areas that are in more dire need of change. The purpose of the present study was to validate and determine the significance of items used for evaluating CG implementation. The present study was descriptive-quantitative in method and design. Items used for evaluating CG implementation were first validated by the Delphi method and then compared with one another and ranked based on the Analytical Hierarchy Process (AHP) model. The items that were validated for evaluating CG implementation in Iran include performance evaluation, training and development, personnel motivation, clinical audit, clinical effectiveness, risk management, resource allocation, policies and strategies, external audit, information system management, research and development, CG structure, implementation prerequisites, the management of patients' non-medical needs, complaints and patients' participation in the treatment process. The most important items based on their degree of significance were training and development, performance evaluation, and risk management. The least important items included the management of patients' non-medical needs, patients' participation in the treatment process and research and development. The fundamental requirements of CG implementation included having an effective policy at national level, avoiding perfectionism, using the expertise and potentials of the entire country and the coordination of this model with other models of quality improvement such as accreditation and patient safety. © 2015 by Kerman University of Medical Sciences.
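
    The AHP ranking step used in this study can be sketched in a few lines: weights come from the principal eigenvector of a pairwise-comparison matrix, and a consistency ratio checks the judgements. The comparison values below are hypothetical, not the study's, using three of the validated CG items:

```python
import numpy as np

def ahp_weights(A):
    """Priority weights from a pairwise-comparison matrix A (Saaty's AHP):
    the normalised principal eigenvector, plus the consistency ratio CR."""
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)                  # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
    cr = ci / ri if ri else 0.0                           # CR < 0.1 is acceptable
    return w, cr

# Hypothetical judgements: training & development vs performance evaluation
# vs risk management (A[i, j] = how much more important item i is than j).
A = np.array([[1.0,  2.0, 4.0],
              [0.5,  1.0, 2.0],
              [0.25, 0.5, 1.0]])
w, cr = ahp_weights(A)
print(np.round(w, 3), round(cr, 3))
```

    This example matrix is perfectly consistent, so the weights are exactly 4/7, 2/7 and 1/7 and CR is zero; real expert judgements typically yield a small positive CR.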

  18. A modeling ontology for integrating vulnerabilities into security requirements conceptual foundations

    NARCIS (Netherlands)

    Elahi, G.; Yu, E.; Zannone, N.; Laender, A.H.F.; Castano, S.; Dayal, U.; Casati, F.; Palazzo Moreira de Oliveira, J.

    2009-01-01

    Vulnerabilities are weaknesses in the requirements, design, and implementation, which attackers exploit to compromise the system. This paper proposes a vulnerability-centric modeling ontology, which aims to integrate empirical knowledge of vulnerabilities into the system development process. In

  19. Fraud Risk Modelling: Requirements Elicitation in the Case of Telecom Services

    DEFF Research Database (Denmark)

    Yesuf, Ahmed; Wolos, Lars Peter; Rannenberg, Kai

    2017-01-01

    Telecom providers are losing tremendous amounts of money due to fraud risks posed to Telecom services and products. Currently, they are mainly focusing on fraud detection approaches to reduce the impact of fraud risks against their services. However, fraud prevention approaches should also...... be investigated in order to further reduce fraud risks and improve the revenue of Telecom providers. Fraud risk modelling is a fraud prevention approach that aims at identifying the potential fraud risks, estimating the damage and setting up preventive mechanisms before the fraud risks lead to actual losses....... In this paper, we highlight the important requirements for a usable and context-aware fraud risk modelling approach for Telecom services. To do so, we have conducted two workshops with experts from a Telecom provider and experts from multi-disciplinary areas. In order to show and document the requirements, we...

  20. Issues in Value-at-Risk Modeling and Evaluation

    NARCIS (Netherlands)

    J. Daníelsson (Jón); C.G. de Vries (Casper); B.N. Jorgensen (Bjørn); P.F. Christoffersen (Peter); F.X. Diebold (Francis); T. Schuermann (Til); J.A. Lopez (Jose); B. Hirtle (Beverly)

    1998-01-01

    Discusses the issues in value-at-risk modeling and evaluation: the value of value at risk; horizon problems and extreme events in financial risk management; methods of evaluating value-at-risk estimates.

  1. Baseline requirements of the proposed action for the Transportation Management Division routing models

    International Nuclear Information System (INIS)

    Johnson, P.E.; Joy, D.S.

    1995-02-01

    The potential impacts associated with the transportation of hazardous materials are important to shippers, carriers, and the general public. This is particularly true for shipments of radioactive material. The shippers are primarily concerned with safety, security, efficiency, and equipment requirements. The carriers are concerned with the potential impact that radioactive shipments may have on their operations--particularly if such materials are involved in an accident. The general public has also expressed concerns regarding the safety of transporting radioactive and other hazardous materials through their communities. Because transportation routes are a central concern in hazardous material transport, the prediction of likely routes is the first step toward resolution of these issues. In response to these routing needs, several models have been developed over the past fifteen years at Oak Ridge National Laboratory (ORNL). The HIGHWAY routing model is used to predict routes for truck transportation, the INTERLINE routing model is used to predict both rail and barge routes, and the AIRPORT locator model is used to determine airports with specified criteria near a specific location. As part of the ongoing improvement of the US Department of Energy's (DOE) Environmental Management Transportation Management Division's (EM-261) computer systems and development efforts, a Baseline Requirements Assessment Session on the HIGHWAY, INTERLINE, and AIRPORT models was held at ORNL on April 27, 1994. The purpose of this meeting was to discuss the existing capabilities of the models and databases and to review enhancements of the models and databases to expand their usefulness. The results of the Baseline Requirements Assessment Session will be discussed in this report. The discussions pertaining to the different models are contained in separate sections

  2. [Logic model of the Franche-Comté Regional Health Project: advantages and limitations for the evaluation process].

    Science.gov (United States)

    Michaud, Claude; Sannino, Nadine; Duboudin, Cédric; Baudier, François; Guillin, Caroline; Billondeau, Christine; Mansion, Sylvie

    2014-01-01

    The French "Hospitals, patients, health and territories" law of July 2009 created the Regional Health Project (PRS) to support regional health policy, and requires evaluation of these projects. The construction of these projects, which includes prevention planning, care planning, and medical and social welfare planning, presents an unprecedented complexity in France, where evaluation programmes are still in their infancy. To support future evaluations, the Franche-Comté Regional Health Agency (ARS FC), assisted by the expertise of EFECT Consultants, decided to reconstruct the PRS logic model. This article analyzes the advantages and limitations of this approach. The resulting logic model allows visualization of the strategy adopted to achieve the Franche-Comté PRS ambitions and expected results. The model highlights four main aspects of structural change to the health system, often poorly visible in PRS presentation documents. This model also establishes links with the usual public policy evaluation issues and facilitates their prioritization. This approach also provides a better understanding of the importance of analysis of the programme construction in order to be effective rather than direct analysis of the effects, which constitutes the natural tendency of current practice. The main controversial limit concerns the retrospective design of the PRS framework, both in terms of the reliability of interpretation and adoption by actors not directly involved in this initiative.

  3. Evaluation Model for Sentient Cities

    Directory of Open Access Journals (Sweden)

    Mª Florencia Fergnani Brion

    2016-11-01

    Full Text Available In this article we present research on Sentient Cities and propose an assessment model for analysing whether a city is, or could potentially be, considered one. It can be used to evaluate the current situation of a city before introducing urban policies based on citizen participation in hybrid (physical and digital) environments. To that effect, we developed evaluation grids with the main elements that form a Sentient City and their measurement values. The Sentient City is a variation of the Smart City, also based on technological progress and innovation, but where the citizens are the principal agent. In this model, governments aim to have a participatory and sustainable system for achieving the Knowledge Society and Collective Intelligence development, as well as the city’s efficiency. They also increase communication channels between the Administration and citizens. In this new context, citizens are empowered because they have the opportunity to create a Local Identity and transform their surroundings through open and horizontal initiatives.

  4. Evaluation of Features, Events, and Processes (FEP) for the Biosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Wasiolek

    2003-10-09

    The purpose of this report is to document the evaluation of biosphere features, events, and processes (FEPs) that relate to the license application (LA) process as required by the U.S. Nuclear Regulatory Commission (NRC) regulations at 10 CFR 63.114 (d, e, and f) [DIRS 156605]. The evaluation determines whether specific biosphere-related FEPs should be included or excluded from consideration in the Total System Performance Assessment (TSPA). This analysis documents the technical basis for screening decisions as required at 10 CFR 63.114 (d, e, and f) [DIRS 156605]. For FEPs that are included in the TSPA, this analysis provides a TSPA disposition, which summarizes how the FEP has been included and addressed in the TSPA model, and cites the analysis reports and model reports that provide the technical basis and description of its disposition. For FEPs that are excluded from the TSPA, this analysis report provides a screening argument, which identifies the basis for the screening decision (i.e., low probability, low consequence, or by regulation) and discusses the technical basis that supports that decision. In cases where a FEP covers multiple technical areas and is shared with other FEP analysis reports, this analysis may provide only a partial technical basis for the screening of the FEP. The full technical basis for these shared FEPs is addressed collectively by all FEP analysis reports that cover technical disciplines sharing a FEP. FEPs must be included in the TSPA unless they can be excluded by low probability, low consequence, or regulation. A FEP can be excluded from the TSPA by low probability per 10 CFR 63.114(d) [DIRS 156605], by showing that it has less than one chance in 10,000 of occurring over 10,000 years (or an approximately equivalent annualized probability of 10⁻⁸).
A FEP can be excluded from the TSPA by low consequence per 10 CFR 63.114 (e or f) [DIRS 156605], by showing that omitting the FEP would not significantly change the magnitude and

  5. Empirical evaluation of the conceptual model underpinning a regional aquatic long-term monitoring program using causal modelling

    Science.gov (United States)

    Irvine, Kathryn M.; Miller, Scott; Al-Chokhachy, Robert K.; Archer, Erik; Roper, Brett B.; Kershner, Jeffrey L.

    2015-01-01

    Conceptual models are an integral facet of long-term monitoring programs. Proposed linkages between drivers, stressors, and ecological indicators are identified within the conceptual model of most mandated programs. We empirically evaluate a conceptual model developed for a regional aquatic and riparian monitoring program using causal models (i.e., Bayesian path analysis). We assess whether data gathered for regional status and trend estimation can also provide insights on why a stream may deviate from reference conditions. We target the hypothesized causal pathways for how anthropogenic drivers of road density, percent grazing, and percent forest within a catchment affect instream biological condition. We found instream temperature and fine sediments in arid sites and only fine sediments in mesic sites accounted for a significant portion of the maximum possible variation explainable in biological condition among managed sites. However, the biological significance of the direct effects of anthropogenic drivers on instream temperature and fine sediments were minimal or not detected. Consequently, there was weak to no biological support for causal pathways related to anthropogenic drivers’ impact on biological condition. With weak biological and statistical effect sizes, ignoring environmental contextual variables and covariates that explain natural heterogeneity would have resulted in no evidence of human impacts on biological integrity in some instances. For programs targeting the effects of anthropogenic activities, it is imperative to identify both land use practices and mechanisms that have led to degraded conditions (i.e., moving beyond simple status and trend estimation). Our empirical evaluation of the conceptual model underpinning the long-term monitoring program provided an opportunity for learning and, consequently, we discuss survey design elements that require modification to achieve question driven monitoring, a necessary step in the practice of

  6. Gamified Requirements Engineering: Model and Experimentation

    NARCIS (Netherlands)

    Lombriser, Philipp; Dalpiaz, Fabiano; Lucassen, Garm; Brinkkemper, Sjaak

    2016-01-01

    [Context & Motivation] Engaging stakeholders in requirements engineering (RE) influences the quality of the requirements and ultimately of the system to-be. Unfortunately, stakeholder engagement is often insufficient, leading to too few, low-quality requirements. [Question/problem] We aim to

  7. Airline service quality evaluation: A review on concepts and models

    Directory of Open Access Journals (Sweden)

    Navid Haghighat

    2017-12-01

    Full Text Available This paper reviews the major service quality concepts and models which led to great developments in evaluating service quality, focusing on the improvement process of the models through discussing criticisms of each model. Criticisms against these models are discussed to clarify the development steps of newer models which led to the improvement of airline service quality models. The precise and accurate evaluation of service quality needs a reliable concept with comprehensive criteria and effective measurement techniques as the fundamentals of a valuable framework. In this paper, service quality model improvement is described based on three major service quality concepts: the disconfirmation, performance and hierarchical concepts, which were developed subsequently. Reviewing various criteria and different measurement techniques, such as statistical analysis and multi-criteria decision making, assists researchers in gaining a clear understanding of the development of the evaluation framework in the airline industry. This study aims at promoting reliable frameworks for evaluating airline service quality in different countries and societies due to the economic, cultural and social aspects of each society.

  8. BALANCED SCORECARDS EVALUATION MODEL THAT INCLUDES ELEMENTS OF ENVIRONMENTAL MANAGEMENT SYSTEM USING AHP MODEL

    Directory of Open Access Journals (Sweden)

    Jelena Jovanović

    2010-03-01

    Full Text Available The research is oriented towards improvement of an environmental management system (EMS) using the BSC (Balanced Scorecard) model, which presents a strategic model of measurement and improvement of organisational performance. The research will present an approach to involving objectives and environmental management metrics (proposed by the literature review) in a conventional BSC in the AD "Barska plovidba" organisation. Further, we will test the creation of an ECO-BSC model based on business activities of non-profit organisations in order to improve the environmental management system in parallel with other systems of management. Using this approach we may obtain 4 models of BSC that include elements of the environmental management system for AD "Barska plovidba". Taking into account that implementation and evaluation need a long period of time in AD "Barska plovidba", the final choice will be based on ISO/IEC 14598 (Information technology - Software product evaluation) and ISO 9126 (Software engineering - Product quality) using the AHP method. Those standards are usually used for evaluation of the quality of software products and computer programs that serve in an organisation as support and factors for development. So, the AHP model will be based on evaluation criteria following the suggestions of the ISO 9126 standards and types of evaluation from two evaluation teams. Members of team 1 will be experts in BSC and environmental management systems who are not employed in the AD "Barska plovidba" organisation. The members of team 2 will be managers of the AD "Barska plovidba" organisation (including managers from the environmental department). Merging results based on the two previously created AHP models, one can obtain the most appropriate BSC that includes elements of the environmental management system. The chosen model will at the same time present a suggested approach for including ecological metrics in a conventional BSC model for a firm that has at least one ECO strategic orientation.

  9. Evaluation of multivariate calibration models transferred between spectroscopic instruments

    DEFF Research Database (Denmark)

    Eskildsen, Carl Emil Aae; Hansen, Per W.; Skov, Thomas

    2016-01-01

    In a setting where multiple spectroscopic instruments are used for the same measurements it may be convenient to develop the calibration model on a single instrument and then transfer this model to the other instruments. In the ideal scenario, all instruments provide the same predictions for the same samples using the transferred model. However, sometimes the success of a model transfer is evaluated by comparing the transferred model predictions with the reference values. This is not optimal, as uncertainties in the reference method will impact the evaluation. This paper proposes a new method...... for calibration model transfer evaluation. The new method is based on comparing predictions from different instruments, rather than comparing predictions and reference values. A total of 75 flour samples were available for the study. All samples were measured on ten near infrared (NIR) instruments from two...

  10. Knowledge representation requirements for model sharing between model-based reasoning and simulation in process flow domains

    Science.gov (United States)

    Throop, David R.

    1992-01-01

    The paper examines the requirements for the reuse of computational models employed in model-based reasoning (MBR) to support automated inference about mechanisms. Areas in which the theory of MBR is not yet completely adequate for using the information that simulations can yield are identified, and recent work in these areas is reviewed. It is argued that using MBR along with simulations forces the use of specific fault models. Fault models are used so that a particular fault can be instantiated into the model and run. This in turn implies that the component specification language needs to be capable of encoding any fault that might need to be sensed or diagnosed. It also means that the simulation code must anticipate all these faults at the component level.

  11. Business Process Modelling is an Essential Part of a Requirements Analysis. Contribution of EFMI Primary Care Working Group.

    Science.gov (United States)

    de Lusignan, S; Krause, P; Michalakidis, G; Vicente, M Tristan; Thompson, S; McGilchrist, M; Sullivan, F; van Royen, P; Agreus, L; Desombre, T; Taweel, A; Delaney, B

    2012-01-01

    To perform a requirements analysis of the barriers to conducting research linking primary care, genetic and cancer data. We extended our initial data-centric approach to include socio-cultural and business requirements. We created reference models of core data requirements common to most studies using unified modelling language (UML), dataflow diagrams (DFD) and business process modelling notation (BPMN). We conducted a stakeholder analysis and constructed DFD and UML diagrams for use cases based on simulated research studies. We used research output as a sensitivity analysis. Differences between the reference model and use cases identified study-specific data requirements. The stakeholder analysis identified: tensions, changes in specification, some indifference from data providers and enthusiastic informaticians urging inclusion of socio-cultural context. We identified requirements to collect information at three levels: micro (data items, which need to be semantically interoperable), meso (the medical record and data extraction), and macro (the health system and socio-cultural issues). BPMN clarified complex business requirements among data providers and vendors, and additional geographical requirements for patients to be represented in both linked datasets. High quality research output was the norm for most repositories. Reference models provide high-level schemata of the core data requirements. However, business requirements modelling identifies stakeholder issues and identifies what needs to be addressed to enable participation.

  12. Center for Integrated Nanotechnologies (CINT) Chemical Release Modeling Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Stirrup, Timothy Scott [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-12-20

    This evaluation documents the methodology and results of chemical release modeling for operations at Building 518, Center for Integrated Nanotechnologies (CINT) Core Facility. This evaluation is intended to supplement an update to the CINT [Standalone] Hazards Analysis (SHA). This evaluation also updates the original [Design] Hazards Analysis (DHA) completed in 2003 during the design and construction of the facility; since the original DHA, additional toxic materials have been evaluated and modeled to confirm the continued low hazard classification of the CINT facility and operations. This evaluation addresses the potential catastrophic release of the current inventory of toxic chemicals at Building 518 based on a standard query in the Chemical Information System (CIS).

  13. A SIL quantification approach based on an operating situation model for safety evaluation in complex guided transportation systems

    International Nuclear Information System (INIS)

    Beugin, J.; Renaux, D.; Cauffriez, L.

    2007-01-01

    Safety analysis in guided transportation systems is essential to avoid rare but potentially catastrophic accidents. This article presents a quantitative probabilistic model that integrates Safety Integrity Levels (SIL) for evaluating the safety of such systems. The standardized SIL indicator allows the safety requirements of each safety subsystem, function and/or piece of equipment to be specified, making SILs pivotal parameters in safety evaluation. However, different interpretations of SIL exist, and, faced with the complexity of guided transportation systems, the current SIL allocation methods are inadequate for the task of safety assessment. To remedy these problems, the model developed in this paper seeks to verify, during the design phase of a guided transportation system, whether or not the safety specifications established by the transport authorities allow the overall safety target to be attained (i.e., whether the SILs allocated to the different safety functions are sufficient to ensure the required level of safety). To meet this objective, the model is based both on the operating situation concept and on Monte Carlo simulation. The former allows safety systems to be formalized and their dynamics to be analyzed in order to show the evolution of the system in time and space, and the latter makes it possible to perform probabilistic calculations based on the scenario structure obtained.
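    The combination of SIL-derived failure rates and Monte Carlo simulation described in this abstract can be illustrated with a minimal sketch. The failure rates, mission length, and function names below are hypothetical placeholders, not values from the paper; each run samples whether any safety function fails during a mission under a Poisson failure model.

```python
import math
import random

# Hypothetical per-hour dangerous-failure rates for three safety functions;
# the values are merely illustrative of SIL 2-3 bands, not taken from the paper.
FAILURE_RATES = {"braking": 5e-7, "signalling": 2e-7, "door_control": 8e-7}
MISSION_HOURS = 10_000  # one simulated operating period

def scenario_fails(rates, hours, rng):
    """A hazardous event occurs if any safety function fails during the mission."""
    for rate in rates.values():
        # Probability of at least one failure in `hours` under a Poisson model
        p_fail = 1.0 - math.exp(-rate * hours)
        if rng.random() < p_fail:
            return True
    return False

def estimate_hazard_probability(n_runs=100_000, seed=1):
    rng = random.Random(seed)
    failures = sum(scenario_fails(FAILURE_RATES, MISSION_HOURS, rng)
                   for _ in range(n_runs))
    return failures / n_runs

print(f"Estimated hazard probability per mission: {estimate_hazard_probability():.4f}")
```

The analytic value for this toy case is 1 - exp(-1.5e-6 x 10 000), roughly 0.0149, and the simulation converges to it as the number of runs grows; the paper exploits the same convergence property on a much richer operating-situation model.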

  14. Evaluation of global climate models for Indian monsoon climatology

    International Nuclear Information System (INIS)

    Kodra, Evan; Ganguly, Auroop R; Ghosh, Subimal

    2012-01-01

    The viability of global climate models for forecasting the Indian monsoon is explored. Evaluation and intercomparison of model skills are employed to assess the reliability of individual models and to guide model selection strategies. Two dominant and unique patterns of Indian monsoon climatology are trends in maximum temperature and periodicity in total rainfall observed after 30 yr averaging over India. An examination of seven models and their ensembles reveals that no single model or model selection strategy outperforms the rest. The single-best model for the periodicity of Indian monsoon rainfall is the only model that captures a low-frequency natural climate oscillator thought to dictate the periodicity. The trend in maximum temperature, which most models are thought to handle relatively better, is best captured through a multimodel average compared to individual models. The results suggest a need to carefully evaluate individual models and model combinations, in addition to physical drivers where possible, for regional projections from global climate models. (letter)

  15. Comparative analysis of used car price evaluation models

    Science.gov (United States)

    Chen, Chuancan; Hao, Lulu; Xu, Cong

    2017-05-01

    An accurate used car price evaluation is a catalyst for the healthy development of the used car market. Data mining has been applied to predicting used car prices in several articles. However, little has been published comparing the use of different algorithms for used car price estimation. This paper collects more than 100,000 used car dealing records throughout China for an empirical analysis and a thorough comparison of two algorithms: linear regression and random forest. These two algorithms are used to predict used car price in three different models: a model for a certain car make, a model for a certain car series, and a universal model. Results show that random forest has a stable but not ideal effect in the price evaluation model for a certain car make, but it shows a great advantage in the universal model compared with linear regression. This indicates that random forest is an optimal algorithm when handling complex models with a large number of variables and samples, yet it shows no obvious advantage when coping with simple models with fewer variables.
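    The comparison described in this abstract can be sketched with scikit-learn on synthetic data. The real study used over 100,000 Chinese dealing records, which are not available here; the features and the multiplicative depreciation formula below are invented for illustration. Nonlinear depreciation is exactly the kind of structure where a random forest tends to beat plain linear regression:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Synthetic listings: age (years), mileage (10k km), engine size (L).
# These features and the price formula are hypothetical, not from the paper.
rng = np.random.default_rng(0)
n = 2000
age = rng.uniform(0, 15, n)
mileage = rng.uniform(0, 30, n)
engine = rng.uniform(1.0, 4.0, n)
# Depreciation is multiplicative and nonlinear, which favours tree ensembles
price = 30_000 * np.exp(-0.15 * age) * np.exp(-0.01 * mileage) * (1 + 0.2 * engine)
price += rng.normal(0, 500, n)  # pricing noise

X = np.column_stack([age, mileage, engine])
X_tr, X_te, y_tr, y_te = train_test_split(X, price, random_state=0)

lin = LinearRegression().fit(X_tr, y_tr)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print(f"linear R^2: {r2_score(y_te, lin.predict(X_te)):.3f}")
print(f"forest R^2: {r2_score(y_te, rf.predict(X_te)):.3f}")
```

On data this nonlinear the forest's test R-squared typically lands well above the linear model's, mirroring the paper's finding for the complex universal model.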

  16. Evaluation of the ARCHITECT urine NGAL assay: Assay performance, specimen handling requirements and biological variability

    NARCIS (Netherlands)

    Grenier, F.C.; Ali, S.; Syed, H.; Workman, R.; Martens, F.; Liao, M.; Wang, Y.; Wong, P.Y.

    2010-01-01

    Objectives: NGAL (Neutrophil Gelatinase-Associated Lipocalin) has emerged as a new biomarker for the identification of acute kidney injury. Reliable clinical evaluations require a simple, robust test method for NGAL, and knowledge of specimen handling and specimen stability characteristics. We

  17. Frequency domain reflectometry modeling for nondestructive evaluation of nuclear power plant cables

    Science.gov (United States)

    Glass, S. W.; Fifield, L. S.; Jones, A. M.; Hartman, T. S.

    2018-04-01

    Cable insulation polymers are among the materials most susceptible to age-related degradation within a nuclear power plant. This is recognized by both regulators and utilities, so all plants have developed cable aging management programs to detect damage before critical component failure, in compliance with regulatory guidelines. Although a wide range of tools are available to evaluate cables and cable systems, cable aging management programs vary in how condition monitoring and nondestructive examinations are conducted as utilities search for the most reliable and cost-effective ways to assess cable system condition. Frequency domain reflectometry (FDR) is emerging as a valuable tool for locating and assessing damaged portions of a cable system with minimal cost, and in most cases it requires access to only one of the cable terminal ends. Since laboratory studies to evaluate the use of FDR for inspection of aged cables can be expensive, and data interpretation may be confounded by multiple factors which influence results, a model-based approach is desired to parametrically investigate the effect of insulation material damage in a controlled manner. This work describes the development of a physics-based FDR model which uses finite element simulations of cable segments in conjunction with cascaded circuit element simulations to efficiently study a cable system. One or more segments of the cable system model have altered physical or electrical properties which represent the degree and location of damage in the system. This circuit model is then subjected to a simulated FDR examination. The modeling approach is verified using several experimental cases and by comparing it to a commercial simulator suitable for simulation of some cable configurations.
The model is used to examine a broad range of parameters including defect length, defect profile, degree of degradation, number and location of defects, FDR bandwidth, and addition of impedance-matched extensions to

  18. Technology Evaluation of Process Configurations for Second Generation Bioethanol Production using Dynamic Model-based Simulations

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Meyer, Anne S.; Gernaey, Krist

    2011-01-01

    An assessment of a number of different process flowsheets for bioethanol production was performed using dynamic model-based simulations. The evaluation employed diverse operational scenarios such as fed-batch, continuous, and continuous-with-recycle configurations. Each configuration was evaluated...... against the following benchmark criteria: yield (kg ethanol/kg dry biomass), final product concentration, and the number of unit operations required in the different process configurations. The results show that the process configuration for simultaneous saccharification and co-fermentation (SSCF) operating...... in continuous mode with a recycle of the SSCF reactor effluent results in the best productivity of bioethanol among the proposed process configurations, with a yield of 0.18 kg ethanol/kg dry biomass....

  19. Evaluation of hydrodynamic ocean models as a first step in larval dispersal modelling

    Science.gov (United States)

    Vasile, Roxana; Hartmann, Klaas; Hobday, Alistair J.; Oliver, Eric; Tracey, Sean

    2018-01-01

    Larval dispersal modelling, a powerful tool in studying population connectivity and species distribution, requires accurate estimates of the ocean state, on a high-resolution grid in both space (e.g. 0.5-1 km horizontal grid) and time (e.g. hourly outputs), particularly of current velocities and water temperature. These estimates are usually provided by hydrodynamic models, based on which larval trajectories and survival are computed. In this study we assessed the accuracy of two hydrodynamic models around Australia - the Bluelink ReANalysis (BRAN) and the Hybrid Coordinate Ocean Model (HYCOM) - through comparison with empirical data from the Australian National Moorings Network (ANMN). We evaluated the models' predictions of the seawater parameters most relevant to larval dispersal - temperature, u and v velocities, and current speed and direction - on the continental shelf, where spawning and nursery areas for major fishery species are located. The performance of each model in estimating ocean parameters was found to depend on the parameter investigated and to vary from one geographical region to another. Both the BRAN and HYCOM models systematically overestimated the mean water temperature, particularly in the top 140 m of the water column, with over 2 °C bias at some of the mooring stations. The HYCOM model was more accurate than BRAN for water temperature predictions in the Great Australian Bight and along the east coast of Australia. Skill scores between each model and the in situ observations showed lower accuracy in the models' predictions of u and v ocean current velocities compared to water temperature predictions. For both models, the lowest accuracy in predicting ocean current velocities, speed and direction was observed at 200 m depth. Low accuracy of both models' predictions was also observed in the top 10 m of the water column. BRAN had more accurate predictions of both u and v velocities in the upper 50 m of the water column at all mooring station locations. While HYCOM

  20. Improved Traceability of a Small Satellite Mission Concept to Requirements Using Model Based System Engineering

    Science.gov (United States)

    Reil, Robin L.

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the "traditional" document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This paper presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to demonstrate the completeness and consistency of the requirements with respect to the mission concept. Anecdotal information and process-duration metrics are presented for both the MBSE and original DBSE design efforts of SporeSat.

  1. Model-based segmentation in orbital volume measurement with cone beam computed tomography and evaluation against current concepts.

    Science.gov (United States)

    Wagner, Maximilian E H; Gellrich, Nils-Claudius; Friese, Karl-Ingo; Becker, Matthias; Wolter, Franz-Erich; Lichtenstein, Juergen T; Stoetzer, Marcus; Rana, Majeed; Essig, Harald

    2016-01-01

    Objective determination of the orbital volume is important in the diagnostic process and in evaluating the efficacy of medical and/or surgical treatment of orbital diseases. Tools designed to measure orbital volume with computed tomography (CT) often cannot be used with cone beam CT (CBCT) because of inferior tissue representation, although CBCT has the benefit of greater availability and lower patient radiation exposure. Therefore, a model-based segmentation technique is presented as a new method for measuring orbital volume and compared to alternative techniques. Both eyes from thirty subjects with no known orbital pathology who had undergone CBCT as a part of routine care were evaluated (n = 60 eyes). Orbital volume was measured with manual, atlas-based, and model-based segmentation methods. Volume measurements, volume determination time, and usability were compared between the three methods. Differences in means were tested for statistical significance using two-tailed Student's t tests. Neither atlas-based (26.63 ± 3.15 mm³) nor model-based (26.87 ± 2.99 mm³) measurements were significantly different from manual volume measurements (26.65 ± 4.0 mm³). However, the time required to determine orbital volume was significantly longer for manual measurements (10.24 ± 1.21 min) than for atlas-based (6.96 ± 2.62 min, p < 0.001) or model-based (5.73 ± 1.12 min, p < 0.001) measurements. All three orbital volume measurement methods examined can accurately measure orbital volume, although the atlas-based and model-based methods seem to be more user-friendly and less time-consuming. The new model-based technique achieves fully automated segmentation results, whereas all atlas-based segmentations required at least some manipulation of the anterior closing. Additionally, model-based segmentation can provide reliable orbital volume measurements when CT image quality is poor.

  2. Personal recommender systems for learners in lifelong learning: requirements, techniques and model

    NARCIS (Netherlands)

    Drachsler, Hendrik; Hummel, Hans; Koper, Rob

    2007-01-01

    Drachsler, H., Hummel, H. G. K., & Koper, R. (2008). Personal recommender systems for learners in lifelong learning: requirements, techniques and model. International Journal of Learning Technology, 3(4), 404-423.

  3. Animal models for evaluation of oral delivery of biopharmaceuticals

    DEFF Research Database (Denmark)

    Harloff-Helleberg, Stine; Nielsen, Line Hagner; Nielsen, Hanne Mørck

    2017-01-01

    of systems for oral delivery of biopharmaceuticals may result in new treatment modalities to increase the patient compliance and reduce product cost. In the preclinical development phase, use of experimental animal models is essential for evaluation of new formulation designs. In general, the limited oral...... bioavailability of biopharmaceuticals, of just a few percent, is expected, and therefore, the animal models and the experimental settings must be chosen with utmost care. More knowledge and focus on this topic is highly needed, despite experience from the numerous studies evaluating animal models for oral drug...... delivery of small molecule drugs. This review highlights and discusses pros and cons of the most currently used animal models and settings. Additionally, it also looks into the influence of anesthetics and sampling methods for evaluation of drug delivery systems for oral delivery of biopharmaceuticals...

  4. The model for evaluation of the effectiveness of civil service modernization

    Directory of Open Access Journals (Sweden)

    O. A. Lyndyuk

    2016-09-01

    Full Text Available The effectiveness of civil service modernization depends on the timely implementation of control measures and on evaluating the effectiveness of modernization processes and system components. The article analyzes the basic problems of evaluating the effectiveness of civil service modernization and the scientific papers on these issues. The basic theoretical approaches to the definitions of «assessment» and «evaluation» are studied. Existing theoretical and methodological approaches to the assessment process are analyzed and summarized, and the main methods of evaluating the effectiveness of civil service modernization, together with the most common assessment methods, are defined. Special analytical techniques suitable for evaluating the effectiveness of civil service modernization include: functional review, the Balanced Scorecard, taxonomic analysis, Key Performance Indicators, methods of multivariate analysis and others. Among the methods for studying consumer expectations about the effectiveness of civil service modernization, the following are singled out: questionnaires, surveys, interviews, testing, monitoring, and analysis of statistical sources, the contents of documents, reports and the regulatory framework. The methods for improving efficiency include benchmarking, reengineering, performance assessment models and more. The importance of a gradual replacement of cost-based evaluation methods by results-based evaluation methods is determined, as is the need for a comprehensive balanced scorecard for evaluation. With a view to mutually agreeing the principles, mechanisms and instruments for evaluating the effectiveness of civil service modernization, the expediency of systematic, targeted, synergistic, process, situational, strategic and resource approaches is grounded. The development of theoretical concepts and methodological principles for evaluating the effectiveness of civil service modernization should be based on the harmonious combination (integration of all

  5. Designing the model for evaluating business quality in Croatia

    Directory of Open Access Journals (Sweden)

    Ana Ježovita

    2015-01-01

    Full Text Available The main objective of the paper is to design a model for evaluating the financial quality of business operations. In that context, for the purposes of the paper, the financial quality of business operations is defined as the ability to achieve adequate values of individual financial ratios for evaluating financial position and performance. The objective of the model is to obtain a comprehensive conclusion about the financial quality of business operations using only the value of the function. The data used for designing the model are limited to the financial data available from the annual balance sheet and income statement. These limitations give companies of all sizes from the non-financial business economy sector the opportunity to use the designed model for evaluation purposes. The statistical methods used for designing the model are multivariate discriminant analysis and logistic regression. The discriminant analysis resulted in a function which includes the five individual financial ratios with the best discriminant power. Given the results obtained in the classification matrix, with a classification accuracy of 95.92% for the original sample and 96.06% for the independent sample, it can be concluded that it is possible to evaluate the financial quality of the business operations of companies in Croatia using a model composed of individual financial ratios. The logistic regression confirms the results obtained using discriminant analysis.
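    A minimal sketch of the paper's two statistical methods, discriminant analysis and logistic regression, applied to a synthetic two-class sample. The three ratios, their values, and the class structure below are invented for illustration; the actual model uses five ratios selected from Croatian company data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression

# Hypothetical financial ratios for "good" vs "poor" quality firms
# (e.g. liquidity, coverage, margin) -- entirely synthetic data.
rng = np.random.default_rng(42)
good = rng.normal(loc=[0.6, 1.8, 0.15], scale=0.1, size=(200, 3))
poor = rng.normal(loc=[0.3, 0.9, 0.02], scale=0.1, size=(200, 3))
X = np.vstack([good, poor])
y = np.array([1] * 200 + [0] * 200)

lda = LinearDiscriminantAnalysis().fit(X, y)
logit = LogisticRegression(max_iter=1000).fit(X, y)

# In-sample classification accuracy, analogous to the paper's classification matrix
print(f"LDA accuracy:   {lda.score(X, y):.3f}")
print(f"logit accuracy: {logit.score(X, y):.3f}")
```

With well-separated synthetic classes both methods classify almost perfectly, which mirrors the paper's observation that the logistic regression confirms the discriminant analysis results.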

  6. Application of Learning Curves for Didactic Model Evaluation: Case Studies

    Directory of Open Access Journals (Sweden)

    Felix Mödritscher

    2013-01-01

    Full Text Available The success of (online) courses depends, among other factors, on the underlying didactical models, which have traditionally been evaluated with qualitative and quantitative research methods. Several new evaluation techniques have been developed and established in recent years. One of them is 'learning curves', which aim at measuring the error rates of users when they interact with adaptive educational systems, thereby enabling the underlying models to be evaluated and improved. In this paper, we report how we have applied this new method in two case studies to show that learning curves are useful for evaluating didactical models and their implementation in educational platforms. Results show that the error rates follow a power law distribution over successive attempts if the didactical model of an instructional unit is valid. Furthermore, the initial error rate, the slope of the curve and the goodness of fit of the curve are valid indicators of the difficulty level of a course and the quality of its didactical model. In conclusion, the idea of applying learning curves to evaluate didactical models on the basis of usage data is considered valuable for supporting teachers and learning content providers in improving their online courses.
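    The power-law relationship described in this abstract can be checked by fitting a line in log-log space. The error rates below are synthetic, generated from E(n) = E1 * n**(-b) with hypothetical E1 = 0.4 and b = 0.8, so the fit recovers those parameters exactly; real usage data would scatter around the fitted line.

```python
import numpy as np

# Hypothetical error rates over ten attempts, following E(n) = E1 * n**(-b)
attempts = np.arange(1, 11)
error_rate = 0.4 * attempts ** -0.8

# A power law is a straight line in log-log space: log E = log E1 - b * log n
slope, intercept = np.polyfit(np.log(attempts), np.log(error_rate), 1)
E1, b = np.exp(intercept), -slope
print(f"initial error rate: {E1:.2f}, learning-rate exponent: {b:.2f}")
```

The two fitted quantities correspond to the indicators the paper uses: the initial error rate reflects course difficulty, and the exponent (the slope of the curve) reflects the quality of the didactical model.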

  7. Using the BSC Model to Evaluate the Financial Performance of the Urban Water and Wastewater Industry

    Directory of Open Access Journals (Sweden)

    Mahdi Goli Aysek

    2017-03-01

    Full Text Available Among the different models so far proposed for guiding and evaluating organizational performance, the balanced scorecard (BSC) model is the only one that has been found capable of guiding an organization towards its goals, from the lowest to the topmost levels, in an integrated, sustained, efficient, and effective manner. The model in question is based on the goals and strategies adopted by an organization and is, thus, a holistic approach that envisions the organization in all its aspects, leading to synergy among all the organization's divisions. Moreover, the model has been found capable of remedying the inadequacies of performance evaluation systems in firms which strive to comply with financial milestones that draw heavily on reducing the unit price through economies of scale and mass production. The present study initially investigates the effects of employing the criteria inherent to the BSC model on the financial performance evaluation of the urban water and wastewater industry. The required data are collected from 35 companies forming the statistical population over a four-year period from 2007 to 2010. The four independent variables belong to the SCR model, and performance evaluation (i.e., the sales efficiency rate) is the dependent one. Due to the insignificance of the coefficients of the independent variables and the lack of correlation among the dependent ones, the step-by-step method is employed to enter the values of the variables into the model when testing the research hypotheses. The new model is found to confirm all the hypotheses. Moreover, a direct relationship is established between the SCR criteria, on the one hand, and the firm's performance, on the other, such that any improvement in the SCR evaluation criteria directly leads to an improvement in performance. Finally, a value equal to unity obtained for hypothesis selection indicates the strong linear relationship holding between the financial SCR

  8. Dynamic Thermal Loads and Cooling Requirements Calculations for VAC Systems in Nuclear Fuel Processing Facilities Using Computer Aided Energy Conservation Models

    International Nuclear Information System (INIS)

    EL Fawal, M.M.; Gadalla, A.A.; Taher, B.M.

    2010-01-01

    In terms of nuclear safety, the most important function of ventilation air conditioning (VAC) systems is to maintain safe ambient conditions for components and structures important to safety inside the nuclear facility, and to maintain appropriate working conditions for the plant's operating and maintenance staff. As part of a study aimed at evaluating the performance of the VAC system of a nuclear fuel cycle facility (NFCF), a computer model was developed and verified to evaluate the thermal loads and cooling requirements for the different zones of a fuel processing facility. The program is based on the transfer function method (TFM) and is used to calculate the dynamic heat gain of various multilayer wall constructions and windows, hour by hour, at any orientation of the building. The developed model was verified by comparing its calculated results for the solar heat gain of a given building with the corresponding values calculated using the finite difference method (FDM) and the total equivalent temperature difference (TETD) method. As an example, the developed program was used to calculate the cooling loads of the different zones of a typical nuclear fuel facility; the results showed that the cooling capacities of the different cooling units in each zone of the facility meet the design requirements according to the safety regulations for nuclear facilities.

  9. Integrated model for supplier selection and performance evaluation

    Directory of Open Access Journals (Sweden)

    Borges de Araújo, Maria Creuza

    2015-08-01

    Full Text Available This paper puts forward a model for selecting suppliers and evaluating the performance of those already working with a company. A simulation was conducted in a food industry. This sector has high significance in the economy of Brazil. The model enables the phases of selecting and evaluating suppliers to be integrated. This is important so that a company can have partnerships with suppliers who are able to meet their needs. Additionally, a group method is used to enable managers who will be affected by this decision to take part in the selection stage. Finally, the classes resulting from the performance evaluation are shown to support the contractor in choosing the most appropriate relationship with its suppliers.

  10. Fuel Cycle Requirements Code (FLYER). Summary report

    International Nuclear Information System (INIS)

    Gift, E.H.; Goode, W.D.

    1976-01-01

    Planning for, and the analysis of, the fuel requirements of the nuclear industry requires the ability to evaluate contingencies in many areas of the nuclear fuel cycle. The areas of nuclear fuel utilization, both uranium and plutonium, and of separative work requirements are of particular interest. The Fuel Cycle Requirements (FLYER) model has been developed to provide a flexible, easily managed tool for obtaining a comprehensive analysis of the nuclear fuel cycle. The model allows analysis of the interactions among the nuclear capacity growth rate, reactor technology and mix, and uranium and plutonium recycling capabilities. The model was initially developed as a means of analyzing nuclear growth contingencies, with particular emphasis on uranium feed and separative work requirements. It served to provide the planning group with analyses similar to those of the OPA's NUFUEL code, which has only recently become available for general use. The model has recently been modified to account for some features of the fuel cycle in a more explicit manner than the NUFUEL code. For instance, the uranium requirements for all reactors installed in a given year are calculated for the total lifetime of those reactors. These values are accumulated in order to indicate the total uranium committed for reactors installed by any given year of the campaign. Similarly, interactions in the back end of the fuel cycle, such as the impacts resulting from limitations on the industrial capacity for reprocessing and mixed oxide fabrication of both light water reactor and breeder fuels, are handled explicitly. The principal features of the modified FLYER code are presented in summary form.

  11. REQUIREMENTS FOR SYSTEMS DEVELOPMENT LIFE CYCLE MODELS FOR LARGE-SCALE DEFENSE SYSTEMS

    Directory of Open Access Journals (Sweden)

    Kadir Alpaslan DEMIR

    2015-10-01

    Full Text Available Large-scale defense system projects are strategic for maintaining and increasing the national defense capability. Therefore, governments spend billions of dollars on the acquisition and development of large-scale defense systems. The scale of defense systems is always increasing and the costs to build them are skyrocketing. Today, defense systems are software intensive and they are either a system of systems or a part of one. Historically, the project performances observed in the development of these systems have been significantly poor when compared to other types of projects. It is obvious that the currently used systems development life cycle models are insufficient to address today's challenges of building these systems. Using a systems development life cycle model that is specifically designed for large-scale defense system developments and is effective in dealing with today's and near-future challenges will help to improve project performances. The first step in the development of a large-scale defense systems development life cycle model is the identification of requirements for such a model. This paper contributes to the body of literature in the field by providing a set of requirements for systems development life cycle models for large-scale defense systems. Furthermore, a research agenda is proposed.

  12. Influence of FRAPCON-1 evaluation models on fuel behavior calculations for commercial power reactors

    International Nuclear Information System (INIS)

    Chambers, R.; Laats, E.T.

    1981-01-01

    A preliminary set of nine evaluation models (EMs) was added to the FRAPCON-1 computer code, which is used to calculate fuel rod behavior in a nuclear reactor during steady-state operation. The intent was to provide an audit code to be used in the United States Nuclear Regulatory Commission (NRC) licensing activities when calculations of conservative fuel rod temperatures are required. The EMs place conservatisms on the calculation of rod temperature by modifying the calculation of rod power history, fuel and cladding behavior models, and materials properties correlations. Three of the nine EMs provide either input or model specifications, or set the reference temperature for stored energy calculations. The remaining six EMs were intended to add thermal conservatism through model changes. To determine the relative influence of these six EMs upon fuel behavior calculations for commercial power reactors, a sensitivity study was conducted. That study is the subject of this paper

  13. Evaluation of Cost Models and Needs & Gaps Analysis

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad

    2014-01-01

    they break down costs. This is followed by an in-depth analysis of stakeholders' needs for financial information derived from the 4C project stakeholder consultation. The stakeholders' needs analysis indicated that models should: • support accounting, but more importantly they should enable budgeting • be able...... This report 'D3.1—Evaluation of Cost Models and Needs & Gaps Analysis' provides an analysis of existing research related to the economics of digital curation and cost & benefit modelling. It reports upon the investigation of how well current models and tools meet stakeholders' needs for calculating and comparing financial information. Based on this evaluation, it aims to point out gaps that need to be bridged in order to increase the uptake of cost & benefit modelling and good practices that will enable costing and comparison of the costs of alternative scenarios—which in turn provides a starting point

  14. Methodology for Applying Cyber Security Risk Evaluation from BN Model to PSA Model

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Jin Soo; Heo, Gyun Young [Kyung Hee University, Yongin (Korea, Republic of); Kang, Hyun Gook [KAIST, Daejeon (Korea, Republic of); Son, Han Seong [Joongbu University, Chubu (Korea, Republic of)

    2014-08-15

    Digital equipment offers several advantages, such as cost, convenience, and availability, and the replacement of analog I and C equipment with digital systems is inevitable. Nuclear facilities have already started applying digital systems to I and C. However, nuclear facilities must also change their I and C systems even though adopting digital equipment is difficult because of the required level of safety, irradiation embrittlement, and cyber security. Cyber security, one of the major concerns in using digital equipment, can affect the whole integrity of nuclear facilities. For instance, cyber-attacks such as the SQL Slammer worm, Stuxnet, Duqu, and Flame have struck nuclear facilities. The regulatory authorities have published many regulatory requirement documents, such as U.S. NRC Regulatory Guides 5.71 and 1.152, IAEA guide NSS-17, IEEE standards, and KINS Regulatory Guides. One important problem in cyber security research for nuclear facilities is the difficulty of obtaining data through penetration experiments. Therefore, we build a cyber security risk evaluation model with a Bayesian network (BN) for the reactor protection system (RPS), one of the safety-critical systems that trip the reactor when an accident occurs; a BN can be used to overcome these problems. We propose a method to apply the BN cyber security model to the probabilistic safety assessment (PSA) model, which has been used for the safety assessment of the systems, structures, and components of a facility. The proposed method will be able to provide insight into safety as well as cyber risk to the facility.
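
A toy sketch of the kind of cyber risk quantification the abstract describes: a two-node Bayesian-network chain (attack attempt → RPS compromise), marginalized by explicit enumeration. All probabilities here are invented placeholders; the paper's actual BN for the reactor protection system is far larger.

```python
# Invented conditional probabilities for illustration only.
P_ATTACK = 0.1                 # P(attack attempt)
P_COMP_GIVEN_ATTACK = 0.05     # P(RPS compromise | attack)
P_COMP_GIVEN_NO_ATTACK = 0.0   # P(RPS compromise | no attack)

def p_compromise():
    """Marginal probability that the RPS is compromised,
    summed over both states of the parent 'attack' node."""
    return (P_ATTACK * P_COMP_GIVEN_ATTACK
            + (1 - P_ATTACK) * P_COMP_GIVEN_NO_ATTACK)
```

A marginal of this kind is the sort of number that could then enter a PSA model, for example as a basic-event probability in a fault tree, which is how the proposed BN-to-PSA mapping would connect the two models.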

  15. Methodology for Applying Cyber Security Risk Evaluation from BN Model to PSA Model

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Heo, Gyun Young; Kang, Hyun Gook; Son, Han Seong

    2014-01-01

    Digital equipment offers several advantages, such as cost, convenience, and availability, and the replacement of analog I and C equipment with digital systems is inevitable. Nuclear facilities have already started applying digital systems to I and C. However, nuclear facilities must also change their I and C systems even though adopting digital equipment is difficult because of the required level of safety, irradiation embrittlement, and cyber security. Cyber security, one of the major concerns in using digital equipment, can affect the whole integrity of nuclear facilities. For instance, cyber-attacks such as the SQL Slammer worm, Stuxnet, Duqu, and Flame have struck nuclear facilities. The regulatory authorities have published many regulatory requirement documents, such as U.S. NRC Regulatory Guides 5.71 and 1.152, IAEA guide NSS-17, IEEE standards, and KINS Regulatory Guides. One important problem in cyber security research for nuclear facilities is the difficulty of obtaining data through penetration experiments. Therefore, we build a cyber security risk evaluation model with a Bayesian network (BN) for the reactor protection system (RPS), one of the safety-critical systems that trip the reactor when an accident occurs; a BN can be used to overcome these problems. We propose a method to apply the BN cyber security model to the probabilistic safety assessment (PSA) model, which has been used for the safety assessment of the systems, structures, and components of a facility. The proposed method will be able to provide insight into safety as well as cyber risk to the facility.

  16. Basic Requirements for Systems Software Research and Development

    Science.gov (United States)

    Kuszmaul, Chris; Nitzberg, Bill

    1996-01-01

    Our success over the past ten years evaluating and developing advanced computing technologies has been due to a simple research and development (R&D) model. Our model has three phases: (a) evaluating the state of the art, (b) identifying problems and creating innovations, and (c) developing solutions and improving the state of the art. This cycle has four basic requirements: a large production testbed with real users, a diverse collection of state-of-the-art hardware, facilities for evaluation of emerging technologies and development of innovations, and control over system management on these testbeds. Future research will be irrelevant and future products will not work if any of these requirements is eliminated. In order to retain our effectiveness, the numerical aerospace simulator (NAS) must replace out-of-date production testbeds in as timely a fashion as possible, and cannot afford to ignore innovative designs such as new distributed shared memory machines, clustered commodity-based computers, and multi-threaded architectures.

  17. Evaluation of two ozone air quality modelling systems

    Directory of Open Access Journals (Sweden)

    S. Ortega

    2004-01-01

    The aim of this paper is to compare two different modelling systems and to evaluate their ability to simulate high ozone concentrations in typical summer episodes in the north of Spain near the metropolitan area of Barcelona. As the focus of the paper is the comparison of the two systems, we do not attempt to improve the agreement by adjusting the emission inventory or model parameters. The first model, or forecasting system, is made up of three modules. The first module is a mesoscale model (MASS), which provides the initial condition for the second module, a nonlocal boundary layer model based on the transilient turbulence scheme. The third module is a photochemical box model (OZIPR), which is applied in Eulerian and Lagrangian modes and receives suitable information from the two previous modules. The model forecast is evaluated against ground-based stations during summer 2001. The second model is MM5/UAM-V, a grid model designed to predict hourly three-dimensional ozone concentration fields. The model is applied to an ozone episode that occurred between 21 and 23 June 2001. Our results reflect the good performance of the two modelling systems when they are used in a specific episode.

  18. Example of emergency response model evaluation of studies using the Mathew/Adpic models

    International Nuclear Information System (INIS)

    Dickerson, M.H.; Lange, R.

    1986-04-01

    This report summarizes model evaluation studies conducted for the MATHEW/ADPIC transport and diffusion models during the past ten years. These models support the US Department of Energy Atmospheric Release Advisory Capability, an emergency response service for atmospheric releases of nuclear material. The field studies involving tracer releases used in these evaluations cover a broad range of meteorology, terrain, and tracer release heights, the three most important aspects of estimating air concentration values resulting from airborne releases of toxic material. Results of these studies show that the models can estimate air concentration values within a factor of 2 for 20% to 50% of the time and within a factor of 5 for 40% to 80% of the time. As the meteorology and terrain become more complex and the release height of the tracer increases, the accuracy of the model calculations degrades. This band of uncertainty appears to correctly represent the capability of these models at this time. A method for estimating angular uncertainty in the model calculations is described and used to suggest alternative methods for evaluating emergency response models.

  19. Modeling the dynamics of evaluation: a multilevel neural network implementation of the iterative reprocessing model.

    Science.gov (United States)

    Ehret, Phillip J; Monroe, Brian M; Read, Stephen J

    2015-05-01

    We present a neural network implementation of central components of the iterative reprocessing (IR) model. The IR model argues that the evaluation of social stimuli (attitudes, stereotypes) is the result of the IR of stimuli in a hierarchy of neural systems: The evaluation of social stimuli develops and changes over processing. The network has a multilevel, bidirectional feedback evaluation system that integrates initial perceptual processing and later developing semantic processing. The network processes stimuli (e.g., an individual's appearance) over repeated iterations, with increasingly higher levels of semantic processing over time. As a result, the network's evaluations of stimuli evolve. We discuss the implications of the network for a number of different issues involved in attitudes and social evaluation. The success of the network supports the IR model framework and provides new insights into attitude theory. © 2014 by the Society for Personality and Social Psychology, Inc.
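
The iterative, feedback-driven evaluation dynamic described above can be sketched with a toy two-layer network in which a higher "semantic" unit feeds back onto a lower "perceptual" unit, so the evaluation evolves over iterations rather than being computed in one pass. The layer sizes, weights, and update rule below are illustrative assumptions, not the published implementation.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def iterate_evaluation(stimulus, w_up=1.5, w_down=0.8, steps=20):
    """Return the trajectory of the 'semantic' evaluation unit over
    `steps` iterations of bidirectional reprocessing of `stimulus`."""
    percept, semantic = stimulus, 0.0
    trajectory = []
    for _ in range(steps):
        percept = sigmoid(stimulus + w_down * semantic)  # feedback from above
        semantic = sigmoid(w_up * percept - 1.0)         # feedforward, biased
        trajectory.append(semantic)
    return trajectory
```

Plotting the trajectory for a fixed stimulus shows the evaluation changing over early iterations and then settling, which is the qualitative behavior the IR model attributes to reprocessing in a hierarchy of systems.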

  20. Transport properties site descriptive model. Guidelines for evaluation and modelling

    International Nuclear Information System (INIS)

    Berglund, Sten; Selroos, Jan-Olof

    2004-04-01

    This report describes a strategy for the development of Transport Properties Site Descriptive Models within the SKB Site Investigation programme. Similar reports have been produced for the other disciplines in the site descriptive modelling (Geology, Hydrogeology, Hydrogeochemistry, Rock mechanics, Thermal properties, and Surface ecosystems). These reports are intended to guide the site descriptive modelling, but also to provide the authorities with an overview of modelling work that will be performed. The site descriptive modelling of transport properties is presented in this report and in the associated 'Strategy for the use of laboratory methods in the site investigations programme for the transport properties of the rock', which describes laboratory measurements and data evaluations. Specifically, the objectives of the present report are to: Present a description that gives an overview of the strategy for developing Site Descriptive Models, and which sets the transport modelling into this general context. Provide a structure for developing Transport Properties Site Descriptive Models that facilitates efficient modelling and comparisons between different sites. Provide guidelines on specific modelling issues where methodological consistency is judged to be of special importance, or where there is no general consensus on the modelling approach. The objectives of the site descriptive modelling process and the resulting Transport Properties Site Descriptive Models are to: Provide transport parameters for Safety Assessment. Describe the geoscientific basis for the transport model, including the qualitative and quantitative data that are of importance for the assessment of uncertainties and confidence in the transport description, and for the understanding of the processes at the sites. Provide transport parameters for use within other discipline-specific programmes. Contribute to the integrated evaluation of the investigated sites. 
The site descriptive modelling of …

  1. Modeling of the global carbon cycle - isotopic data requirements

    International Nuclear Information System (INIS)

    Ciais, P.

    1994-01-01

    Isotopes are powerful tools to constrain carbon cycle models. For example, combining the CO2 budget and the 13C budget makes it possible to calculate the net carbon fluxes between the atmosphere, ocean, and biosphere. Observations of natural and bomb-produced radiocarbon allow estimation of the gross carbon exchange fluxes between different reservoirs and of the time scales of carbon overturning in important reservoirs. 18O in CO2 is potentially a tool for deconvolving carbon fluxes within the land biosphere (assimilation vs. respiration). The scope of this article is to identify gaps in our present knowledge about isotopes in light of their use as constraints on the global carbon cycle. In the following we present a list of future data requirements for carbon cycle models. (authors)
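
The budget combination mentioned above (CO2 plus 13C) can be sketched as a two-equation linear system in the two unknown net sinks: the CO2 mass balance constrains their sum, and the 13C budget distinguishes ocean from land because the two sinks fractionate differently. The function below and every number in the test are illustrative assumptions, not measured fluxes or published fractionation factors.

```python
def double_deconvolution(emissions, growth, eps_ocean, eps_land, iso_forcing):
    """Solve the hypothetical two-equation budget
        F_ocean + F_land = emissions - growth            (CO2 budget)
        eps_ocean*F_ocean + eps_land*F_land = iso_forcing (13C budget)
    for the net ocean and land uptakes (same units as `emissions`)."""
    total = emissions - growth
    # 2x2 system solved by substitution: F_land = total - F_ocean
    f_ocean = (iso_forcing - eps_land * total) / (eps_ocean - eps_land)
    f_land = total - f_ocean
    return f_ocean, f_land
```

Any pair of independent budgets with distinct isotopic signatures supports the same algebra, which is why the abstract treats isotopes as constraints on otherwise underdetermined net fluxes.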

  2. Comprehensive environment-suitability evaluation model about Carya cathayensis

    International Nuclear Information System (INIS)

    Da-Sheng, W.; Li-Juan, L.; Qin-Fen, Y.

    2013-01-01

    Current studies of the relation between suitable environments and the distribution areas of Carya cathayensis Sarg. are mainly qualitative descriptions and lack quantitative models. The objective of this study was to establish an environment-suitability evaluation model that can be used to predict potential suitable areas of C. cathayensis. First, data for three factors (soil type, soil parent material, and soil thickness) were obtained from a class-2 forest resource survey, and the other factor data, including elevation, slope, aspect, surface curvature, humidity index, and solar radiation index, were extracted from a DEM (Digital Elevation Model). Next, the key affecting factors were identified by PCA (Principal Component Analysis), the weights of the evaluation factors were determined by AHP (Analytic Hierarchy Process), and the quantitative classification of each single factor was determined by membership functions from fuzzy mathematics. Finally, a comprehensive environment-suitability evaluation model was established and used to predict the potential suitable areas of C. cathayensis in Daoshi Town in the study region. The results showed that 85.6% of the actual distribution areas were in the most suitable and more suitable regions and 11.5% in the generally suitable regions.
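
As a sketch of the evaluation scheme just described (fuzzy membership functions for single-factor classification, AHP-derived weights for aggregation), the following example combines invented membership breakpoints and weights; none of these numbers come from the study.

```python
def triangular_membership(x, lo, peak, hi):
    """Triangular fuzzy membership: 0 outside [lo, hi], 1 at `peak`."""
    if x <= lo or x >= hi:
        return 0.0
    if x <= peak:
        return (x - lo) / (peak - lo)
    return (hi - x) / (hi - peak)

def suitability(items):
    """items: list of (value, membership_fn, ahp_weight), weights summing to 1.
    Returns the comprehensive suitability score in [0, 1]."""
    return sum(w * f(x) for x, f, w in items)
```

For instance, a unit at elevation 400 scored by triangular_membership(x, 0, 500, 1000) and slope 10 scored by triangular_membership(x, 0, 5, 30), with weights 0.6 and 0.4, gets a comprehensive score of 0.8; thresholds on such a score would then separate the "most suitable", "more suitable", and "generally suitable" regions.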

  3. Corporate Data Network (CDN) data requirements task. Enterprise Model. Volume 1

    International Nuclear Information System (INIS)

    1985-11-01

    The NRC has initiated a multi-year program to centralize its information processing in a Corporate Data Network (CDN). The new information processing environment will include shared databases, telecommunications, office automation tools, and state-of-the-art software. Touche Ross and Company was contracted to perform a general data requirements analysis for shared databases and to develop a preliminary plan for implementation of the CDN concept. The Enterprise Model (Vol. 1) provided the NRC with agency-wide information requirements in the form of data entities and organizational demand patterns as the basis for clustering the entities into logical groups. The Data Dictionary (Vol. 2) provided the NRC with definitions and example attributes and properties for each entity. The Data Model (Vol. 3) defined logical databases and entity relationships within and between databases. The Preliminary Strategic Data Plan (Vol. 4) prioritized the development of databases and included a workplan and approach for implementation of the shared database component of the Corporate Data Network.

  4. Defining climate modeling user needs: which data are actually required to support impact analysis and adaptation policy development?

    Science.gov (United States)

    Swart, R. J.; Pagé, C.

    2010-12-01

    Until recently, the policy applications of Earth System Models in general and climate models in particular focused mainly on potential future changes in the global and regional climate and on the attribution of observed changes to anthropogenic activities. Is climate change real? And if so, why do we have to worry about it? Following the broad acceptance of the reality of the risks by the majority of governments, particularly after the publication of IPCC’s 4th Assessment Report and the increasing number of observations of changes in ecological and socio-economic systems that are consistent with the observed climatic changes, governments, companies and other societal groups have started to evaluate their own vulnerability in more detail and to develop adaptation and mitigation strategies. After an early focus on the most vulnerable developing countries, an increasing number of industrialized countries have recently embarked on the design of adaptation and mitigation plans, or on studies to evaluate the climate resilience of their development plans and projects. Which climate data are actually required to effectively support these activities? This paper reports on the efforts of the IS-ENES project, the infrastructure project of the European Network for Earth System Modeling, to address this question. How do we define user needs, and can the existing gap between the climate modeling and impact research communities be bridged in support of the ENES long-term strategy? In contrast to the climate modeling community, which has a relatively long history of collaboration facilitated by a relatively uniform subject matter, commonly agreed definitions of key terminology and some level of harmonization of methods, the climate change impacts research community is very diverse and fragmented, using a wide variety of data sources, methods and tools. An additional complicating factor is that researchers working on adaptation usually closely collaborate with non …

  5. Design Concept Evaluation Using System Throughput Model

    International Nuclear Information System (INIS)

    Sequeira, G.; Nutt, W. M.

    2004-01-01

    The U.S. Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) is currently developing the technical bases to support the submittal of a license application for construction of a geologic repository at Yucca Mountain, Nevada to the U.S. Nuclear Regulatory Commission. The Office of Repository Development (ORD) is responsible for developing the design of the proposed repository surface facilities for the handling of spent nuclear fuel and high level nuclear waste. Preliminary design activities are underway to sufficiently develop the repository surface facilities design for inclusion in the license application. The design continues to evolve to meet mission needs and to satisfy both regulatory and program requirements. A system engineering approach is being used in the design process since the proposed repository facilities are dynamically linked by a series of sub-systems and complex operations. In addition, the proposed repository facility is a major system element of the overall waste management process being developed by the OCRWM. Such an approach includes iterative probabilistic dynamic simulation as an integral part of the design evolution process. A dynamic simulation tool helps to determine whether: (1) the mission and design requirements are complete, robust, and well integrated; (2) the design solutions under development meet the design requirements and mission goals; (3) opportunities exist where the system can be improved and/or optimized; and (4) proposed changes to the mission and design requirements have a positive or negative impact on overall system performance, and whether design changes may be necessary to accommodate them. This paper discusses the type of simulation employed to model the waste handling operations. It then discusses the process being used to develop the Yucca Mountain surface facilities model. The latest simulation model, the results of the simulation, and how the data were used in the design are also presented.

  6. Building beef cow nutritional programs with the 1996 NRC beef cattle requirements model.

    Science.gov (United States)

    Lardy, G P; Adams, D C; Klopfenstein, T J; Patterson, H H

    2004-01-01

    Designing a sound cow-calf nutritional program requires knowledge of nutrient requirements, diet quality, and intake. Effectively using the NRC (1996) beef cattle requirements model (1996NRC) also requires knowledge of dietary degradable intake protein (DIP) and microbial efficiency. The objectives of this paper are to 1) describe a framework in which 1996NRC-applicable data can be generated, 2) describe seasonal changes in nutrients on native range, 3) use the 1996NRC to predict nutrient balance for cattle grazing these forages, and 4) make recommendations for using the 1996NRC for forage-fed cattle. Extrusa samples were collected over 2 yr on native upland range and subirrigated meadow in the Nebraska Sandhills. Samples were analyzed for CP, in vitro OM digestibility (IVOMD), and DIP. Regression equations to predict nutrients were developed from these data. The 1996NRC was used to predict nutrient balances based on the dietary nutrient analyses, and recommendations for model users were developed. On subirrigated meadow, CP and IVOMD increased rapidly during March and April. On native range, CP and IVOMD increased from April through June but decreased rapidly from August through September. Degradable intake protein (DM basis) followed trends similar to CP for both native range and subirrigated meadow. Predicted nutrient balances for spring- and summer-calving cows agreed with reported values in the literature, provided that IVOMD values were converted to DE before use in the model (DE = 1.07 x IVOMD - 8.13). When the IVOMD-to-DE conversion was not used, the model gave unrealistically high NEm balances. To effectively use the 1996NRC to estimate protein requirements, users should focus on three key estimates: DIP, microbial efficiency, and TDN intake. Consequently, efforts should be focused on adequately describing seasonal changes in forage nutrient content. In order to increase use of the 1996NRC, research is needed in the following areas: 1) cost-effective and …
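
The conversion the abstract reports (DE = 1.07 x IVOMD - 8.13) is a one-line regression, and the sketch below simply transcribes it. Units follow the source data as reported, and the example input is invented.

```python
def ivomd_to_de(ivomd_percent):
    """Convert in vitro OM digestibility (IVOMD, %) to DE using the
    regression reported in the abstract: DE = 1.07 * IVOMD - 8.13."""
    return 1.07 * ivomd_percent - 8.13
```

For example, an IVOMD of 60 maps to a DE of 56.07; skipping this step and feeding IVOMD directly to the model is what the abstract says produced unrealistically high NEm balances.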

  7. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species

    OpenAIRE

    Matthew P. Adams; Catherine J. Collier; Sven Uthicke; Yan X. Ow; Lucas Langlois; Katherine R. O’Brien

    2017-01-01

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluate …

  8. Uranium resources evaluation model as an exploration tool

    International Nuclear Information System (INIS)

    Ruzicka, V.

    1976-01-01

    Evaluation of uranium resources, as conducted by the Uranium Resources Evaluation Section of the Geological Survey of Canada, comprises operations analogous to those performed during the preparatory stages of uranium exploration. The uranium resources evaluation model, which simulates the estimation process, can be divided into four steps. The first step includes the definition of major areas and ''unit subdivisions'' for which geological data are gathered, coded, computerized, and retrieved; selection of these areas and ''unit subdivisions'' is based on a preliminary appraisal of their favourability for uranium mineralization. The second step includes analysis of the data, definition of the factors controlling uranium mineralization, classification of uranium occurrences into genetic types, and final delineation of favourable areas; this step corresponds to the selection of targets for uranium exploration. The third step includes geological field work; it is equivalent to geological reconnaissance in exploration. The fourth step comprises computation of resources; the preliminary evaluation techniques in exploration are, as a rule, analogous to the simplest methods employed in resource evaluation. The uranium resources evaluation model can be applied conceptually for decision-making during exploration or for formulation of exploration strategy using the quantified data as weighting factors. (author)

  9. A Novel OBDD-Based Reliability Evaluation Algorithm for Wireless Sensor Networks on the Multicast Model

    Directory of Open Access Journals (Sweden)

    Zongshuai Yan

    2015-01-01

    The two-terminal reliability calculation for wireless sensor networks (WSNs) is a #P-hard problem. Reliability calculation for WSNs on the multicast model entails an even worse combinatorial explosion of node states than calculation on the unicast model, yet many real WSNs require the multicast model to deliver information. This research first provides a formal definition for the WSN on the multicast model. Next, a symbolic OBDD_Multicast algorithm is proposed to evaluate the reliability of WSNs on the multicast model. Furthermore, our construction of OBDD_Multicast avoids the problem of invalid expansion, reducing the number of subnetworks by identifying the redundant paths of two adjacent nodes and s-t unconnected paths. Experiments show that OBDD_Multicast both reduces the complexity of WSN reliability analysis and has a lower running time than Xing's OBDD (ordered binary decision diagram) based algorithm.
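
For intuition about the quantity the OBDD algorithm computes symbolically, here is a brute-force baseline: enumerate every up/down state of the links and sum the probabilities of the states in which the source reaches all multicast targets. The four-node diamond network and the 0.9 link reliabilities in the usage note are invented for illustration; a real WSN needs the symbolic approach precisely because this enumeration is exponential in the number of links.

```python
from itertools import product

def multicast_reliability(edges, source, targets):
    """edges: list of ((u, v), p_up) undirected links.
    Returns P(source reaches every node in `targets`)."""
    total = 0.0
    for state in product([True, False], repeat=len(edges)):
        prob = 1.0
        up = []
        for ((u, v), p), s in zip(edges, state):
            prob *= p if s else (1.0 - p)
            if s:
                up.append((u, v))
        # Reachability over the surviving links (simple graph search).
        seen = {source}
        frontier = [source]
        while frontier:
            n = frontier.pop()
            for u, v in up:
                for a, b in ((u, v), (v, u)):
                    if a == n and b not in seen:
                        seen.add(b)
                        frontier.append(b)
        if set(targets) <= seen:
            total += prob
    return total
```

On a diamond network (s-a-t and s-b-t, each link 0.9), the single-target value agrees with the closed form 1 - (1 - 0.81)^2 = 0.9639, and requiring the source to reach a larger target set can only lower the reliability, which is the multicast effect the abstract describes.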

  10. An evaluation model of protective function in forest management planning: slope stability in regard to shallow landslide events

    Directory of Open Access Journals (Sweden)

    2006-01-01

    The evaluation of the forest protective function has been divided into four main branches according to the different types of instability phenomena (landslides, erosion, floods, avalanches). This paper presents the first module, related to landslide events, and will be followed by others. Two opposing factors have been considered: the instability tendency of a selected land unit and the protective function of the vegetation. While the first factor tends to promote landslide events, the second opposes their occurrence. An expert evaluation of these aspects allows us to derive qualitative indexes. For each territorial unit, these indexes express the protection value of the vegetation, the degree of management restrictions, and suitable improvements to the forest cover. Both the instability propensity and the protective function are influenced by many different parameters, whose correct evaluation requires multidisciplinary competences. Such skills are not easily found in the technicians in charge of forest management. The present paper aims to provide a decision-making support tool; based on neural network models, it should be able to interpret and simulate the expert knowledge, extending to the "standard" forest technician the opportunity to perform this kind of evaluation. The neural network training required the identification and characterization of explanatory variables related to both aspects, and the subsequent expert definition of the respective datasets of real classification examples. The descriptive variables were chosen considering the availability of information and its compatibility with GIS techniques. Model performance has been validated by testing the whole procedure in two sites situated in Antrona valley (Piemonte) and Acqualagna (Marche). The results are discussed in detail and show good agreement with both field surveys and general conceptual assumptions. Future developments will involve similar analysis and …

  11. Employing Quality Principles in Evaluating Research Companies According to I.S.O Requirements 9001/2000 (AL-Melad Company Case Study

    Directory of Open Access Journals (Sweden)

    Abdulrahman A. Ibrahem

    2013-05-01

    Research companies are considered pioneers in their country, since they have advanced, highly qualified professional staff; thus they are regarded as a shining point in society. They should therefore be the first to apply quality requirements, which will be positively reflected in their performance. The aim of this research is to evaluate the quality management of the Al-Melad Company. In the present study, the current system of the Al-Melad Company is evaluated against the ISO 9001:2000 requirements. A computer program was prepared for this purpose to facilitate the evaluation process, relying on checklists of the quality requirements, on existing field conditions examined with experts, and on information presented to the program operator through interviews, documents, and registries. The program evaluates each element and pivot in the system and then produces a total evaluation of the applied quality system. The congruence percentage with the requirements was found to be 29.84%. This finding illustrates the need to construct a quality system based on combining the main management functions (planning and organizing) with the ISO 9001:2000 specification and the principles of total quality management.
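
The congruence percentage reported above (29.84%) comes from checklist scoring; a minimal sketch of that computation, with an invented checklist in place of the actual ISO 9001:2000 items, might look like this.

```python
def congruence_percentage(checklist):
    """checklist: dict mapping each requirement to whether it is satisfied.
    Returns the congruence with the requirements as a percentage."""
    met = sum(1 for ok in checklist.values() if ok)
    return 100.0 * met / len(checklist)
```

A weighted variant (per-element weights summing to 1, as in the element-then-pivot evaluation the abstract describes) would replace the simple count with a weighted sum, but the reported figure is consistent with this kind of satisfied-items ratio.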

  12. A QFD-Based Evaluation Method for Business Models of Product Service Systems

    Directory of Open Access Journals (Sweden)

    Tianyang Li

    2016-01-01

    Organizations have been approaching Product Service Systems (PSS) with unoptimized business models, partly because the evaluation of PSS business models has been ineffective. A more sufficient evaluation method could therefore advance the evaluation of PSS business models and assist organizations that are considering a servitisation strategy. In this paper, we develop a value-oriented method that uses the Quality Function Deployment (QFD) technique to employ correlations derived from the design information of PSS business models in order to evaluate those models. We describe the steps for applying the method and its practical application in a real-life case study. The method improves the formulation of an evaluation step within the design process of PSS business models, based on correlations between different dimensions of PSS value and PSS business models; it allows the balance between the customer value and the organization value delivered by PSS business models to be made quantitatively. Finally, it fosters the effective utilization of the design information accumulated earlier in the design process to evaluate quantitatively whether a PSS business model is optimized to provide appropriate value for customers and organizations.
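
A minimal sketch of the QFD mechanics implied above: value dimensions (rows) carry importance weights, a relationship matrix scores how strongly each business-model element (columns) supports each dimension, conventionally on a 0/1/3/9 scale, and the weighted column sums rank the elements. All numbers here are invented; the paper's actual dimensions and correlations are not reproduced.

```python
def qfd_scores(importance, relationship):
    """importance: weight per value dimension (rows).
    relationship: matrix [dimension][element] of QFD strengths (0/1/3/9).
    Returns one weighted score per business-model element (columns)."""
    n_elems = len(relationship[0])
    return [sum(importance[i] * relationship[i][j]
                for i in range(len(importance)))
            for j in range(n_elems)]
```

Splitting the importance vector into customer-value and organization-value dimensions and comparing the two resulting score sets is one way the quantitative customer/organization balance mentioned in the abstract could be expressed.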

  13. Evaluation of New Chemical Entities as Substrates of Liver Transporters in the Pharmaceutical Industry: Response to Regulatory Requirements and Future Steps.

    Science.gov (United States)

    Okudaira, Noriko

    2017-09-01

    This article discusses the evaluation of drug candidates as hepatic transporter substrates. Recently, research on the applications of hepatic transporters in the pharmaceutical industry has improved to meet the requirements of the regulatory guidelines for the evaluation of drug interactions. To identify the risk of transporter-mediated drug-drug interactions at an early stage of drug development, we used a strategy of reviewing the in vivo animal pharmacokinetics and tissue distribution data obtained in the discovery stage together with the in vitro data obtained for regulatory submission. In the context of nonclinical evaluation of new chemical entities as medicines, we believe that transporter studies are emerging as a key strategy to predict their pharmacological and toxicological effects. In combination with the recent progress in systems approaches, the estimation of effective concentrations in the target tissues, by using mathematical models to describe the transporter-mediated distribution and elimination, has enabled us to identify promising compounds for clinical development at the discovery stage. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  14. Report on a survey in fiscal 1999. Analysis of English literature related to unified evaluation models for global warming; 1999 nendo chikyu ondanka togo hyoka model kanren eibun shiryo no bunseki chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    This paper summarizes basic materials related to unified evaluation models for global warming. Unified evaluation is an interdisciplinary process that combines, interprets, and shares information from different scientific disciplines in a form that allows the whole cause-and-effect chain to be evaluated from a macroscopic view. The process is more useful than an evaluation that overemphasizes a single academic area, and it can provide decision makers with useful information. It is well suited to modeling the complex interactions and feedback mechanisms present in diverse settings such as climate change. Unified evaluation can identify policy criteria within a consistent measurement framework. The evaluation process is iterative and continuous: a science community conveys comprehensive knowledge and findings to a decision-making community, which in turn feeds back its experiences and lessons learned. Executing the evaluation requires different approaches, such as the judgement of specialists (including modeling methods and experience), discoveries resulting from applying policies, and survey methods. The paper also describes the gaming concept, scenario analysis, and unified evaluation methods. (NEDO)

  15. Delayed Hydride Cracking Mechanism in Zirconium Alloys and Technical Requirements for In-Service Evaluation of Zr-2.5Nb Tubes with Flaws

    International Nuclear Information System (INIS)

    Kim, Young Suk

    2007-01-01

    In association with the periodic inspection of CANDU nuclear power plant components, the Canadian Standards Association issued CSA N285.8 in 2005 as the technical requirements for in-service evaluation of zirconium alloy pressure tubes in CANDU reactors. This first version of CSA N285.8 contains procedures for, firstly, the evaluation of pressure tube flaws; secondly, the evaluation of pressure tube to calandria tube contact; and thirdly, the assessment of a reactor core, together with material properties and derived quantities. The evaluation of pressure tube flaws includes a delayed hydride cracking evaluation whose procedures are stipulated based on the existing delayed hydride cracking models. For example, the evaluation of flaw-tip hydride precipitation during reactor cooldown involves a procedure to calculate the equilibrium hydrogen equivalent concentration in solution at the flaw tip, H_tip, as follows: H_tip = H_f exp[-(V_H Δσ)/(RT)], where H_f is the total bulk hydrogen equivalent concentration, V_H the partial molar volume of hydrogen in zirconium, and Δσ the difference in hydrostatic stress between the bulk and the crack tip. When H_tip ≥ TSSP at temperature, flaw-tip hydride is predicted to precipitate. This relation implies that the hydrogen concentration at the crack tip increases because of the work done by the difference in the hydrostatic stress
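The flaw-tip concentration relation can be evaluated numerically, as in this sketch (all parameter values below are assumed for illustration and are not taken from CSA N285.8):

```python
import math

def flaw_tip_hydrogen(h_bulk, v_h, delta_sigma, temp_k, r_gas=8.314):
    """Equilibrium hydrogen equivalent concentration at a flaw tip.

    Implements H_tip = H_f * exp(-V_H * delta_sigma / (R * T)), where
    delta_sigma is the hydrostatic-stress difference between the bulk
    and the crack tip (negative when the tip is more tensile)."""
    return h_bulk * math.exp(-v_h * delta_sigma / (r_gas * temp_k))

# Assumed, purely illustrative values:
h_f = 60.0          # bulk hydrogen equivalent concentration, ppm
v_h = 1.67e-6       # partial molar volume of H in Zr, m^3/mol
d_sigma = -4.0e8    # bulk minus tip hydrostatic stress, Pa
t = 523.0           # temperature, K

h_tip = flaw_tip_hydrogen(h_f, v_h, d_sigma, t)
# A more tensile flaw tip (negative delta_sigma) raises H_tip above H_f,
# so hydride precipitation is predicted once H_tip >= TSSP at temperature.
```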

  16. An inverse problem strategy based on forward model evaluations: Gradient-based optimization without adjoint solves

    Energy Technology Data Exchange (ETDEWEB)

    Aguilo Valentin, Miguel Alejandro [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-07-01

    This study presents a new nonlinear programming formulation for the solution of inverse problems. First, a general inverse problem formulation based on the compliance error functional is presented. The proposed error functional enables the computation of the Lagrange multipliers, and thus the first order derivative information, at the expense of just one model evaluation. Therefore, the calculation of the Lagrange multipliers does not require the solution of the computationally intensive adjoint problem. This leads to significant speedups for large-scale, gradient-based inverse problems.

  17. Evaluating energy efficiency policies with energy-economy models

    NARCIS (Netherlands)

    Mundaca, L.; Neij, L.; Worrell, E.; McNeil, M.

    2010-01-01

    The growing complexities of energy systems, environmental problems, and technology markets are driving and testing most energy-economy models to their limits. To further advance bottom-up models from a multidisciplinary energy efficiency policy evaluation perspective, we review and critically

  18. Understanding the relationship between Kano model's customer satisfaction scores and self-stated requirements importance.

    Science.gov (United States)

    Mkpojiogu, Emmanuel O C; Hashim, Nor Laily

    2016-01-01

    Customer satisfaction is the result of product quality and viability. The perceived satisfaction of users/customers with a software product cannot be neglected, especially in today's competitive market environment, because it drives customer loyalty and promotes high profitability and return on investment. Understanding the importance of requirements, as it relates to the satisfaction of users/customers when those requirements are met, is therefore worth considering. It is necessary to know the relationship between customer satisfaction when requirements are met (or dissatisfaction when they are unmet) and the importance of those requirements. Many studies have examined customer satisfaction in connection with requirements importance, but the relationship between the customer satisfaction scores (coefficients) of the Kano model and users'/customers' self-stated requirements importance has not been sufficiently explored. In this study, an attempt is made to unravel the underlying relationship between the Kano model's customer satisfaction indexes and users'/customers' self-reported requirements importance. The results of the study indicate some interesting associations between the considered variables. These bivariate associations reveal that the customer satisfaction index (SI) and the average satisfaction coefficient (ASC), and the customer dissatisfaction index (DI) and the ASC, are highly correlated (r = 0.96), and thus the ASC can be used in place of either SI or DI to represent customer satisfaction scores. Also, these Kano model customer satisfaction variables (SI, DI, and ASC) are each associated with self-stated requirements importance (IMP). Further analysis indicates that the value customers or users place on requirements that are met, or on features that are incorporated into a product, influences the level of satisfaction such customers derive from the product.
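For readers unfamiliar with these indexes, one common formulation of SI and DI from Kano category counts is sketched below (the survey counts are hypothetical, and the ASC averaging rule shown is an assumption, not necessarily the coefficient used in this study):

```python
def kano_indexes(a, o, m, i):
    """Satisfaction (SI) and dissatisfaction (DI) coefficients from
    Kano category counts: Attractive, One-dimensional, Must-be,
    Indifferent. ASC is taken here as the mean of SI and |DI|
    (one possible averaging rule, an assumption)."""
    total = a + o + m + i
    si = (a + o) / total        # "better" coefficient, in [0, 1]
    di = -(o + m) / total       # "worse" coefficient, in [-1, 0]
    asc = (si + abs(di)) / 2
    return si, di, asc

# Hypothetical survey counts for one requirement:
si, di, asc = kano_indexes(a=20, o=30, m=25, i=25)
```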

  19. Bayesian uncertainty analysis for complex systems biology models: emulation, global parameter searches and evaluation of gene functions.

    Science.gov (United States)

    Vernon, Ian; Liu, Junli; Goldstein, Michael; Rowe, James; Topping, Jen; Lindsey, Keith

    2018-01-02

    Many mathematical models have now been employed across every area of systems biology. These models increasingly involve large numbers of unknown parameters, have complex structure that can result in substantial evaluation time relative to the needs of the analysis, and need to be compared to observed data of various forms. The correct analysis of such models usually requires a global parameter search, over a high-dimensional parameter space, that incorporates and respects the most important sources of uncertainty. This can be an extremely difficult task, but it is essential for any meaningful inference or prediction to be made about any biological system; it hence represents a fundamental challenge for the whole of systems biology. Bayesian statistical methodology for the uncertainty analysis of complex models is introduced, designed to address the high-dimensional global parameter search problem. Bayesian emulators that mimic the systems biology model, but which are extremely fast to evaluate, are embedded within an iterative history match: an efficient method to search high-dimensional spaces within a more formal statistical setting, while incorporating major sources of uncertainty. The approach is demonstrated via application to a model of hormonal crosstalk in Arabidopsis root development, which has 32 rate parameters, for which we identify the sets of rate parameter values that lead to acceptable matches between model output and observed trend data. The multiple insights into the model's structure that this analysis provides are discussed. The methodology is applied to a second related model, and the biological consequences of the resulting comparison, including the evaluation of gene functions, are described. Bayesian uncertainty analysis for complex models using both emulators and history matching is shown to be a powerful technique that can greatly aid the study of a large class of systems biology models. It both provides insight into model behaviour
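The emulation-plus-history-matching idea can be sketched in miniature: replace a slow model with a fast surrogate, then rule out parameter values whose implausibility exceeds a conventional cutoff. Everything below, including the toy model and the crude nearest-neighbour emulator, is an illustrative assumption; real applications use Gaussian-process emulators:

```python
import math

def slow_model(x):
    """Stand-in for an expensive systems-biology model (illustrative)."""
    return math.sin(3 * x) + 0.5 * x

# Train a crude emulator on a handful of expensive runs.
train_x = [i / 10 for i in range(11)]
train_y = [slow_model(x) for x in train_x]
EMU_SD = 0.1   # assumed emulator uncertainty

def emulate(x):
    """Nearest-neighbour surrogate: cheap to evaluate everywhere."""
    j = min(range(len(train_x)), key=lambda i: abs(train_x[i] - x))
    return train_y[j]

def implausibility(x, observed, obs_sd=0.05):
    """History-matching implausibility: distance between emulator output
    and data, in units of the combined standard deviation."""
    return abs(observed - emulate(x)) / math.sqrt(EMU_SD**2 + obs_sd**2)

observed = slow_model(0.42)   # pretend this is the measured trend datum
# Keep only parameter values not ruled out (conventional cutoff of 3):
not_ruled_out = [x / 100 for x in range(101)
                 if implausibility(x / 100, observed) < 3]
```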

  20. Emergency evacuation/transportation plan update: Traffic model development and evaluation of early closure procedures. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-10-28

    Prolonged delays in traffic experienced by Laboratory personnel during a recent early dismissal in inclement weather, coupled with reconstruction efforts along NM 502 east of the White Rock Wye for the next 1 to 2 years, have prompted Los Alamos National Laboratory (LANL) to re-evaluate and improve the present transportation plan and its integration with contingency plans maintained in other organizations. Facilities planners and emergency operations staff need to evaluate the transportation system's capability to efficiently and safely evacuate LANL under different low-level emergency conditions. A variety of potential procedures governing the release of employees from the different technical areas (TAs) requires evaluation, perhaps with regard to multiple emergency-condition scenarios, with one or more optimal procedures ultimately presented for adoption by Lab Management. The work undertaken in this project should lay a foundation for an ongoing, progressive transportation system analysis capability. It utilizes microscale simulation techniques to affirm, reassess, and validate the Laboratory's Early Dismissal/Closure/Delayed Opening Plan. The Laboratory is required by Federal guidelines, and compelled by prudent practice and conscientious regard for the welfare of employees and nearby residents, to maintain plans and operating procedures for evacuation if the need arises. The tools developed during this process can be used outside of contingency planning. It is anticipated that the traffic models developed will allow site planners to evaluate changes to the traffic network that could better serve normal traffic levels. Changes in roadway configuration, control strategies (signalization and signing), response strategies to traffic accidents, and patterns of demand can be modelled using the analysis tools developed during this project. Such scenarios typically are important considerations in master planning and facilities programming.

  1. Evaluation and Quantification of Uncertainty in the Modeling of Contaminant Transport and Exposure Assessment at a Radioactive Waste Disposal Site

    Science.gov (United States)

    Tauxe, J.; Black, P.; Carilli, J.; Catlett, K.; Crowe, B.; Hooten, M.; Rawlinson, S.; Schuh, A.; Stockton, T.; Yucel, V.

    2002-12-01

    The disposal of low-level radioactive waste (LLW) in the United States (U.S.) is a highly regulated undertaking. The U.S. Department of Energy (DOE), itself a large generator of such wastes, requires a substantial amount of analysis and assessment before permitting disposal of LLW at its facilities. One of the requirements that must be met in assessing the performance of a disposal site and technology is that a Performance Assessment (PA) demonstrate "reasonable expectation" that certain performance objectives, such as dose to a hypothetical future receptor, not be exceeded. The phrase "reasonable expectation" implies recognition of uncertainty in the assessment process. In order for this uncertainty to be quantified and communicated to decision makers, the PA computer model must accept probabilistic (uncertain) input (parameter values) and produce results that reflect that uncertainty as it is propagated through the model calculations. The GoldSim modeling software was selected for the task due to its unique facility with both probabilistic analysis and radioactive contaminant transport. Probabilistic model parameters range from the water content and other physical properties of alluvium, to the activity of the disposed radionuclides, to the amount of time a future resident might be expected to spend tending a garden. Although these parameters govern processes that, in isolation, are defined by rather simple differential equations, the complex interaction of coupled processes makes for a highly nonlinear system with often unanticipated results. The decision maker has the difficult job of evaluating the uncertainty of modeling results in the context of granting permission for LLW disposal. This job also involves the evaluation of alternatives, such as the selection of disposal technologies. Various scenarios can be evaluated in the model, so that the effects of, for example, using a thicker soil cap over the waste cell can be assessed. This ability to evaluate mitigation
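The kind of probabilistic propagation described above can be illustrated with a toy Monte Carlo sketch (the dose model, distributions, and parameter ranges below are invented for illustration and have no connection to the actual GoldSim PA model):

```python
import random

def dose_model(water_content, inventory, exposure_time):
    """Toy surrogate for a PA dose calculation (illustrative only):
    dose grows with inventory and exposure, damped by water content."""
    return inventory * exposure_time / (1.0 + 10.0 * water_content)

random.seed(42)  # reproducible sampling
doses = []
for _ in range(10_000):
    wc = random.uniform(0.05, 0.30)        # alluvium water content (assumed)
    inv = random.lognormvariate(0.0, 0.5)  # disposed activity, relative units
    t_exp = random.uniform(0.5, 2.0)       # time tending a garden, relative
    doses.append(dose_model(wc, inv, t_exp))

doses.sort()
median = doses[len(doses) // 2]
p95 = doses[int(0.95 * len(doses))]
# A decision maker compares, e.g., the 95th percentile of the dose
# distribution, not just a point estimate, against the objective.
```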

  2. Maintenance personnel performance simulation (MAPPS) model: overview and evaluation efforts

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.; Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Ryan, T.G.

    1984-01-01

    The development of the MAPPS model has been completed, and the model is currently undergoing evaluation. These efforts address a number of identified issues concerning practicality, acceptability, usefulness, and validity. Preliminary analysis of the evaluation data collected so far indicates that MAPPS will provide comprehensive and reliable data for PRA purposes and for a number of other applications. The MAPPS computer simulation model provides the user with a sophisticated tool for gaining insights into tasks performed by NPP maintenance personnel. Its wide variety of input parameters and output data makes it extremely flexible for application to a number of diverse problems. With the demonstration of favorable model evaluation results, the MAPPS model will represent a valuable source of NPP maintainer reliability data and provide PRA studies with a source of data on maintainers that has not previously existed

  3. Educational Program Evaluation Model, From the Perspective of the New Theories

    Directory of Open Access Journals (Sweden)

    Soleiman Ahmady

    2014-05-01

    Full Text Available Introduction: This study focuses on the common theories that have influenced the history of program evaluation and introduces an educational program evaluation proposal format based on updated theory. Methods: Literature searches were carried out in March-December 2010 with a combination of key words, MeSH terms, and other free-text terms as suitable for the purpose. A comprehensive search strategy was developed to search Medline through the PubMed interface, ERIC (Education Resources Information Center), and the main journals of medical education for current evaluation models and theories. We included all study designs. We found 810 articles related to our topic, of which 63 with full text were finally included. We compared documents and used expert consensus to select the best model. Results: We found that complexity theory, using the logic model, suggests compatible evaluation proposal formats, especially for new medical education programs. The common components of a logic model are situation, inputs, outputs, and outcomes, on which our proposal format is based. Its contents are: title page, cover letter, situation and background, introduction and rationale, project description, evaluation design, evaluation methodology, reporting, program evaluation management, timeline, an evaluation budget based on the best evidence, and supporting documents. Conclusion: We found that the logic model is used for evaluation program planning in many places, but more research is needed to see whether it is suitable for our context.

  4. The Context, Process, and Outcome Evaluation Model for Organisational Health Interventions

    Science.gov (United States)

    Fridrich, Annemarie; Jenny, Gregor J.; Bauer, Georg F.

    2015-01-01

    To facilitate evaluation of complex, organisational health interventions (OHIs), this paper aims at developing a context, process, and outcome (CPO) evaluation model. It builds on previous model developments in the field and advances them by clearly defining and relating generic evaluation categories for OHIs. Context is defined as the underlying frame that influences and is influenced by an OHI. It is further differentiated into the omnibus and discrete contexts. Process is differentiated into the implementation process, as the time-limited enactment of the original intervention plan, and the change process of individual and collective dynamics triggered by the implementation process. These processes lead to proximate, intermediate, and distal outcomes, as all results of the change process that are meaningful for various stakeholders. Research questions that might guide the evaluation of an OHI according to the CPO categories and a list of concrete themes/indicators and methods/sources applied within the evaluation of an OHI project at a hospital in Switzerland illustrate the model's applicability in structuring evaluations of complex OHIs. In conclusion, the model supplies a common language and a shared mental model for improving communication between researchers and company members and will improve the comparability and aggregation of evaluation study results. PMID:26557665

  5. The Context, Process, and Outcome Evaluation Model for Organisational Health Interventions.

    Science.gov (United States)

    Fridrich, Annemarie; Jenny, Gregor J; Bauer, Georg F

    2015-01-01

    To facilitate evaluation of complex, organisational health interventions (OHIs), this paper aims at developing a context, process, and outcome (CPO) evaluation model. It builds on previous model developments in the field and advances them by clearly defining and relating generic evaluation categories for OHIs. Context is defined as the underlying frame that influences and is influenced by an OHI. It is further differentiated into the omnibus and discrete contexts. Process is differentiated into the implementation process, as the time-limited enactment of the original intervention plan, and the change process of individual and collective dynamics triggered by the implementation process. These processes lead to proximate, intermediate, and distal outcomes, as all results of the change process that are meaningful for various stakeholders. Research questions that might guide the evaluation of an OHI according to the CPO categories and a list of concrete themes/indicators and methods/sources applied within the evaluation of an OHI project at a hospital in Switzerland illustrate the model's applicability in structuring evaluations of complex OHIs. In conclusion, the model supplies a common language and a shared mental model for improving communication between researchers and company members and will improve the comparability and aggregation of evaluation study results.

  6. A qualitative readiness-requirements assessment model for enterprise big-data infrastructure investment

    Science.gov (United States)

    Olama, Mohammed M.; McNair, Allen W.; Sukumar, Sreenivas R.; Nutaro, James J.

    2014-05-01

    In the last three decades, there has been exponential growth in information technology serving the information processing needs of data-driven businesses in government, science, and private industry, in the form of capturing, staging, integrating, conveying, analyzing, and transferring data that help knowledge workers and decision makers make sound business decisions. Data integration across enterprise warehouses is one of the most challenging steps in a big data analytics strategy. Several levels of data integration have been identified across enterprise warehouses: data accessibility, common data platform, and consolidated data model. Each level of integration has its own set of complexities and requires a certain amount of time, budget, and resources to implement. These levels of integration are designed to address the technical challenges inherent in consolidating disparate data sources. In this paper, we present a methodology based on industry best practices to measure the readiness of an organization and its data sets against the different levels of data integration. We introduce a new Integration Level Model (ILM) tool, used for quantifying an organization's and a data system's readiness to share data at a certain level of data integration. It is based largely on the established and accepted framework provided in the Data Management Association's Data Management Body of Knowledge (DAMA-DMBOK). It comprises several key data management functions and supporting activities, together with several environmental elements that describe and apply to each function. The proposed model scores the maturity of a system's data governance processes and provides a pragmatic methodology for evaluating integration risks. The higher the computed scores, the better managed the source data system and the greater the likelihood that the data system can be brought in at a higher level of integration.

  7. An Efficient Dynamic Trust Evaluation Model for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Zhengwang Ye

    2017-01-01

    Full Text Available Trust evaluation is an effective method to detect malicious nodes and ensure security in wireless sensor networks (WSNs). In this paper, an efficient dynamic trust evaluation model (DTEM) for WSNs is proposed, which achieves accurate, efficient, and dynamic trust evaluation by dynamically adjusting the weights of direct and indirect trust and the parameters of the update mechanism. To achieve accurate trust evaluation, the direct trust is calculated from multiple trust factors, including communication trust, data trust, and energy trust, together with a punishment factor and a regulating function. The indirect trust is evaluated conditionally from the trusted recommendations of a third party. The integrated trust is then measured by assigning dynamic weights to direct and indirect trust and combining them. Finally, we propose an update mechanism using a sliding window based on the induced ordered weighted averaging (IOWA) operator to enhance flexibility. The parameters and the length of the interaction-history window can be adapted dynamically to the actual needs of the network, realizing a dynamic update of the direct trust value. Simulation results indicate that the proposed model is an efficient, dynamic, and attack-resistant trust evaluation model. Compared with existing approaches, it performs better in defending against multiple malicious attacks.
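A minimal sketch of the weighted direct/indirect combination with a sliding-window update follows (the saturating weighting rule and its constants are assumptions for illustration, not the paper's actual DTEM formulas):

```python
def integrated_trust(direct, indirect, history):
    """Combine direct and indirect trust with a dynamic weight.

    The weight on direct trust grows with the number of first-hand
    interactions in the sliding window: a simple saturating rule
    (half-saturation at 5 interactions is an assumed constant)."""
    w = len(history) / (len(history) + 5.0)
    return w * direct + (1.0 - w) * indirect

def window_update(history, new_value, size=10):
    """Sliding-window update: keep only the most recent observations."""
    history.append(new_value)
    return history[-size:]

history = []
for obs in [0.9, 0.8, 0.95, 0.85]:     # hypothetical direct observations
    history = window_update(history, obs)

trust = integrated_trust(direct=0.88, indirect=0.60, history=history)
# With few first-hand interactions, the result leans toward the
# third-party recommendation; it shifts toward direct trust as the
# window fills.
```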

  8. The Galleria mellonella larvae as an in vivo model for evaluation of Shigella virulence.

    Science.gov (United States)

    Barnoy, Shoshana; Gancz, Hanan; Zhu, Yuewei; Honnold, Cary L; Zurawski, Daniel V; Venkatesan, Malabi M

    2017-07-04

    Shigella spp. are human enteroinvasive bacterial pathogens, transmitted orally through contaminated food and water, that cause bacillary dysentery. Although natural Shigella infections are restricted to humans and primates, several smaller animal models are used to analyze individual steps in pathogenesis. No animal model fully duplicates the human response, and sustaining the models requires expensive animals, costly maintenance of animal facilities, veterinary services, and approved animal protocols. This study proposes the development of the caterpillar larva of Galleria mellonella as a simple, inexpensive, informative, and rapid in vivo model for evaluating virulence and the interaction of Shigella with cells of the insect innate immune system. Virulent Shigella injected through the forelegs causes larval death. The mortality rates were dependent on the Shigella strain, the infectious dose, and the presence of the virulence plasmid. Wild-type S. flexneri 2a persisted and replicated within the larvae, resulting in haemocyte cell death, whereas plasmid-cured mutants were rapidly cleared. Histology of the infected larvae, in conjunction with fluorescence, immunofluorescence, and transmission electron microscopy, indicates that S. flexneri reside within a vacuole of the insect haemocytes that ultrastructurally resembles the vacuoles described in studies with mouse and human macrophage cell lines. Some of these bacteria-laden vacuoles had the double membranes characteristic of autophagosomes. These results suggest that G. mellonella larvae can be used as an easy-to-use animal model for understanding Shigella pathogenesis that requires none of the time- and labor-consuming procedures typical of other systems.

  9. Evaluation Of Supplemental Pre-Treatment Development Requirements To Meet TRL 6: Rotary Microfiltration

    International Nuclear Information System (INIS)

    Huber, H.J.

    2011-01-01

    In spring 2011, the Technology Maturation Plan (TMP) for the Supplemental Treatment Project (RPP-PLAN-49827, Rev. 0, Technology Maturation Plan for the Treatment Project (T4S01)) was developed. This plan contains all of the actions identified as required to reach technical maturity for a field-deployable waste feed pretreatment system. The supplemental pretreatment system has a filtration component and a Cs-removal component. Subsequent to issuance of the TMP, rotary microfiltration (RMF) was identified as the prime filtration technology for this application. The prime Cs-removal technology is small column ion exchange (ScIX) using spherical resorcinol formaldehyde (sRF) as the exchange resin. During fiscal year 2011 (FY2011) some of the tasks identified in the TMP were completed. As of September 2011, the conceptual design package had been submitted to DOE as part of the critical decision (CD-1) process. This document describes the remaining tasks identified in the TMP to reach technical maturity and evaluates the validity of the proposed tests to fill the gaps previously identified in the TMP. The potential vulnerabilities are presented, and the completed list of criteria for the different technology readiness levels of DOE guide DOE G 413.3-4 is added in an attachment. This evaluation was conducted from a technology development perspective; all programmatic and manufacturing aspects were excluded from this exercise. Compliance with the DOE G 413.3-4 programmatic and manufacturing requirements will be addressed directly by the Treatment Project during the course of engineering design. The results of this evaluation show that completion of the proposed development tasks in the TMP is sufficient to reach TRL 6 from a technological point of view. The tasks involve actual waste tests using the current baseline configuration (2nd generation disks, 40 psi differential pressure, 30 °C feed temperature) and three different simulants - the PEP, an AP-Farm and an S

  10. Learning analytics: Dataset for empirical evaluation of entry requirements into engineering undergraduate programs in a Nigerian university.

    Science.gov (United States)

    Odukoya, Jonathan A; Popoola, Segun I; Atayero, Aderemi A; Omole, David O; Badejo, Joke A; John, Temitope M; Olowo, Olalekan O

    2018-04-01

    In Nigerian universities, enrolment into any engineering undergraduate program requires that the minimum entry criteria established by the National Universities Commission (NUC) be satisfied. Candidates seeking admission to study an engineering discipline must have reached a predetermined entry age and met the cut-off marks set for the Senior School Certificate Examination (SSCE), the Unified Tertiary Matriculation Examination (UTME), and the post-UTME screening. However, limited effort has been made to show that these entry requirements eventually guarantee successful academic performance in engineering programs, because the data required for such validation are not readily available. In this data article, a comprehensive dataset for empirical evaluation of entry requirements into engineering undergraduate programs in a Nigerian university is presented and carefully analyzed. A total sample of 1445 undergraduates, admitted between 2005 and 2009 to study Chemical Engineering (CHE), Civil Engineering (CVE), Computer Engineering (CEN), Electrical and Electronics Engineering (EEE), Information and Communication Engineering (ICE), Mechanical Engineering (MEE), and Petroleum Engineering (PET) at Covenant University, Nigeria, was randomly selected. The entry age, SSCE aggregate, UTME score, Covenant University Scholastic Aptitude Screening (CUSAS) score, and Cumulative Grade Point Average (CGPA) of the undergraduates were obtained from the Student Records and Academic Affairs unit. To facilitate evidence-based evaluation, the robust dataset is made publicly available in a Microsoft Excel spreadsheet file. On a yearly basis, first-order descriptive statistics of the dataset are presented in tables. Box plots, frequency distribution plots, and scatter plots of the dataset are provided to enrich its value. Furthermore, correlation and linear regression analyses are performed to understand the relationship between the entry requirements and the
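The correlation and linear regression analyses mentioned can be reproduced on any such dataset with a few lines of code (the score/CGPA pairs below are hypothetical, not the published data):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def ols(xs, ys):
    """Least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical (UTME score, CGPA) pairs for illustration:
utme = [210, 245, 260, 280, 300, 230, 255, 290]
cgpa = [2.8, 3.1, 3.4, 3.9, 4.3, 2.9, 3.3, 4.1]

r = pearson_r(utme, cgpa)
slope, intercept = ols(utme, cgpa)
# A positive r and slope would support the entry criterion as a
# predictor of academic performance; a value near zero would not.
```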

  11. Study on process evaluation model of students' learning in practical course

    Science.gov (United States)

    Huang, Jie; Liang, Pei; Shen, Wei-min; Ye, Youxiang

    2017-08-01

    In practical course teaching based on the project-object method, the traditional evaluation methods of class attendance, assignments, and exams fail to give undergraduate students incentives to learn innovatively and autonomously. In this paper, elements such as creative innovation, teamwork, and documentation and reporting were incorporated into the process evaluation, and a process evaluation model was set up. Educational practice shows that this model makes the process evaluation of students' learning more comprehensive, accurate, and fair.

  12. Model for Quantitative Evaluation of Enzyme Replacement Treatment

    Directory of Open Access Journals (Sweden)

    Radeva B.

    2009-12-01

    Full Text Available Gaucher disease is the most frequent lysosomal disorder. Its enzyme replacement treatment (ERT) is a recent advance of modern biotechnology that has been used successfully in the last years. Evaluating the optimal dose for each patient is important for both health and economic reasons: enzyme replacement is the most expensive treatment, and it must be administered continuously and without interruption. Since 2001, enzyme replacement therapy with Cerezyme (Genzyme) has been formally available in Bulgaria, but after some time it was interrupted for 1-2 months, and the patients' doses were not optimal. The aim of our work is to find a mathematical model for the quantitative evaluation of ERT in Gaucher disease. The model is implemented in the software package "Statistika 6", using as input the individual data of 5-year-old children with Gaucher disease treated with Cerezyme. The output of the model allows quantitative evaluation of the individual trends in the development of each child's disease and their correlations. On the basis of these results, we can recommend suitable changes in ERT.

  13. Requirements for tokamak remote operation: Application to JT-60SA

    International Nuclear Information System (INIS)

    Innocente, Paolo; Barbato, Paolo; Farthing, Jonathan; Giruzzi, Gerardo; Ide, Shunsuke; Imbeaux, Frédéric; Joffrin, Emmanuel; Kamada, Yutaka; Kühner, Georg; Naito, Osamu; Urano, Hajime; Yoshida, Maiko

    2015-01-01

    Highlights: • We analyzed the data management system (DMS) appropriate for international collaboration. • We define the principal requirements for all components of the DMS. • We evaluated application of the DMS requirements to the JT-60SA experiment. • We evaluated the role of network bandwidth and time delay between the EU and Japan. - Abstract: Remote operation and data analysis are becoming key requirements of any fusion device. In this framework, a well-conceived data management system integrated with a suite of analysis and support tools is an essential component of the efficient remote exploitation of any fusion device. The following components must be considered: data archiving and data model architecture; remote data and computer access; pulse schedule, data analysis software and support tools; remote control room specifications; and security issues. The definition of a device-generic data model also plays an important role in improving the ability to share solutions and in reducing learning time. As for the remote control room, the implementation of an Operation Request Gateway has been identified as an answer to security issues while remotely providing all the features required to effectively operate a device. These requirements have been analyzed for the new JT-60SA tokamak device. Remote exploitation is paramount in the JT-60SA case, which is expected to be jointly operated by Japan and Europe. Due to the geographical distance between the two parties, optimal remote operation and remote data analysis are considered key requirements in the development of JT-60SA. In this case the effects of network speed and delay have also been evaluated, and tests have confirmed that performance can vary significantly depending on the technology used.

  14. Requirements for tokamak remote operation: Application to JT-60SA

    Energy Technology Data Exchange (ETDEWEB)

    Innocente, Paolo, E-mail: paolo.innocente@igi.cnr.it [Consorzio RFX, Corso Stati Uniti 4, 35127 Padova (Italy); Barbato, Paolo [Consorzio RFX, Corso Stati Uniti 4, 35127 Padova (Italy); Farthing, Jonathan [CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Giruzzi, Gerardo [CEA, IRFM, F-13108 Saint-Paul-lez-Durance (France); Ide, Shunsuke [Japan Atomic Energy Agency, Naka, Ibaraki-ken 311-0193 (Japan); Imbeaux, Frédéric; Joffrin, Emmanuel [CEA, IRFM, F-13108 Saint-Paul-lez-Durance (France); Kamada, Yutaka [Japan Atomic Energy Agency, Naka, Ibaraki-ken 311-0193 (Japan); Kühner, Georg [Max-Planck-Institute for Plasma Physics, EURATOM Association, Wendelsteinstr. 1, 17491 Greifswald (Germany); Naito, Osamu; Urano, Hajime; Yoshida, Maiko [Japan Atomic Energy Agency, Naka, Ibaraki-ken 311-0193 (Japan)

    2015-10-15

    Highlights: • We analyzed the data management system (DMS) appropriate for international collaboration. • We define the principal requirements for all components of the DMS. • We evaluated application of the DMS requirements to the JT-60SA experiment. • We evaluated the role of network bandwidth and time delay between the EU and Japan. - Abstract: Remote operation and data analysis are becoming key requirements of any fusion device. In this framework, a well-conceived data management system integrated with a suite of analysis and support tools is an essential component of the efficient remote exploitation of any fusion device. The following components must be considered: data archiving and data model architecture; remote data and computer access; pulse schedule, data analysis software and support tools; remote control room specifications; and security issues. The definition of a device-generic data model also plays an important role in improving the ability to share solutions and in reducing learning time. As for the remote control room, the implementation of an Operation Request Gateway has been identified as an answer to security issues while remotely providing all the features required to effectively operate a device. These requirements have been analyzed for the new JT-60SA tokamak device. Remote exploitation is paramount in the JT-60SA case, which is expected to be jointly operated by Japan and Europe. Due to the geographical distance between the two parties, optimal remote operation and remote data analysis are considered key requirements in the development of JT-60SA. In this case the effects of network speed and delay have also been evaluated, and tests have confirmed that performance can vary significantly depending on the technology used.

  15. Evaluation of methods to estimate the essential amino acids requirements of fish from the muscle amino acid profile

    Directory of Open Access Journals (Sweden)

    Álvaro José de Almeida Bicudo

    2014-03-01

    Full Text Available Many methods to estimate amino acid requirements based on the amino acid profile of fish have been proposed. This study evaluates the methodologies proposed by Meyer & Fracalossi (2005) and by Tacon (1989) to estimate the amino acid requirements of fish, which do not require prior knowledge of the requirement of a reference amino acid. Data on the amino acid requirements of pacu, Piaractus mesopotamicus, were used to validate the accuracy of those methods. The Meyer & Fracalossi and Tacon methodologies estimated the lysine requirement of pacu at 13 and 23% above the requirement determined by the dose-response method, respectively. The values estimated by both methods lie within the range of requirements determined for other omnivorous fish species, with the Meyer & Fracalossi (2005) method showing better accuracy.
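    The exact formulas of Meyer & Fracalossi (2005) and Tacon (1989) are not reproduced in the record; the sketch below only illustrates the shared idea behind profile-based methods: scale each essential amino acid (EAA) by its share of the muscle EAA profile. All numbers are hypothetical, not data for pacu.

```python
# Illustrative muscle A/E-ratio approach to estimating EAA requirements.
muscle_profile = {        # g EAA / 100 g of total muscle EAA (invented)
    "lysine": 20.0,
    "methionine": 7.0,
    "threonine": 11.0,
}
total_eaa_requirement = 14.0  # g EAA / 100 g dietary protein (assumed)

# Each EAA requirement is its muscle share applied to the total EAA need.
requirements = {
    aa: total_eaa_requirement * share / 100.0
    for aa, share in muscle_profile.items()
}
print(requirements)
```

    The appeal of such methods, as the record notes, is that no dose-response trial for a reference amino acid is needed; the trade-off is the 13-23% deviation reported for pacu lysine.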

  16. Evaluation of different models to segregate Pelibuey and Katahdin ewes into resistant or susceptible to gastrointestinal nematodes.

    Science.gov (United States)

    Palomo-Couoh, Jovanny Gaspar; Aguilar-Caballero, Armando Jacinto; Torres-Acosta, Juan Felipe de Jesús; Magaña-Monforte, Juan Gabriel

    2016-12-01

    This study evaluated four models based on the number of eggs per gram of faeces (EPG) to segregate Pelibuey and Katahdin ewes during the lactation period into resistant or susceptible to gastrointestinal nematodes (GIN) in tropical Mexico. Nine hundred and thirty EPG counts of Pelibuey ewes and 710 of Katahdin ewes were obtained during 10 weeks of lactation. Ewes were segregated into resistant, intermediate and susceptible using their individual EPG every week. The data of every ewe were then used to provide a reference classification, which included all the EPG values of each animal. Four models were then evaluated against that reference. Model 1 was based on the 10-week mean EPG count ± 2 SE. Models 2, 3 and 4 were based on the mean EPG count of 10, 5 and 2 weeks of lactation, respectively. The cutoff points for the segregation of ewes in those three models were the quartiles ≤Q1 (low elimination) and ≥Q3 (high elimination). In all the models evaluated, the ewes classified as resistant had lower EPG than intermediate and susceptible ewes (P < 0.05), and the ewes classified as susceptible had higher EPG than intermediate and resistant ewes (P < 0.05), with high agreement with the reference classification (>70 %). Model 3 tended to show higher sensitivity and specificity against the reference data, but no difference was found with the other models. The present study showed that the phenotypic marker EPG might serve to identify and segregate populations of adult ewes during the lactation period. All models served to segregate Pelibuey and Katahdin ewes into resistant, intermediate and susceptible. Model 3 (mean of 5 weeks) could be used because it required less sampling effort without losing sensitivity or specificity in the segregation of animals; model 4 (mean of 2 weeks) was even less labour-intensive.
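    The quartile rule used by models 2-4 can be sketched as follows; the ewe IDs, EPG means and the `classify` helper are invented for illustration:

```python
import statistics

# Ewes at or below Q1 of the mean-EPG distribution are "resistant", at or
# above Q3 "susceptible", and the rest "intermediate". EPG means invented.
mean_epg = {"ewe01": 120, "ewe02": 850, "ewe03": 300, "ewe04": 2200,
            "ewe05": 90, "ewe06": 640, "ewe07": 1500, "ewe08": 410}

q1, _, q3 = statistics.quantiles(mean_epg.values(), n=4)

def classify(epg):
    if epg <= q1:
        return "resistant"
    if epg >= q3:
        return "susceptible"
    return "intermediate"

status = {ewe: classify(epg) for ewe, epg in mean_epg.items()}
print(status)
```

    In the study the mean would be taken over 10, 5 or 2 weekly counts per ewe (models 2, 3 and 4) before applying the same cutoffs.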

  17. A COMPARISON OF SEMANTIC SIMILARITY MODELS IN EVALUATING CONCEPT SIMILARITY

    Directory of Open Access Journals (Sweden)

    Q. X. Xu

    2012-08-01

    Full Text Available Semantic similarities are important in concept definition, recognition, categorization, interpretation, and integration. Many semantic similarity models have been established to evaluate the semantic similarities of objects and/or concepts. To find out the suitability and performance of different models in evaluating concept similarities, we compare four main types of models in this paper: the geometric model, the feature model, the network model, and the transformational model. The fundamental principles and main characteristics of these models are first introduced and compared. Land use and land cover concepts of NLCD92 are employed as examples in the case study. The results demonstrate that the correlations between these models are very high, a possible reason being that all these models are designed to simulate the similarity judgement of the human mind.
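    The kind of comparison the paper reports can be sketched for two of the model families: a feature model (Jaccard over attribute sets) and a geometric model (cosine over binary attribute vectors), correlated over concept pairs. The land-cover concepts and attributes below are invented, not the NLCD92 definitions.

```python
import math
from itertools import combinations

concepts = {  # toy concepts with invented attribute sets
    "forest":    {"vegetated", "natural", "woody"},
    "shrubland": {"vegetated", "natural", "low"},
    "pasture":   {"vegetated", "managed", "low"},
    "urban":     {"built", "managed"},
}
attrs = sorted({a for s in concepts.values() for a in s})

def jaccard(a, b):  # feature model: shared vs. total attributes
    return len(a & b) / len(a | b)

def cosine(a, b):   # geometric model: angle between binary vectors
    va = [1 if x in a else 0 for x in attrs]
    vb = [1 if x in b else 0 for x in attrs]
    dot = sum(p * q for p, q in zip(va, vb))
    return dot / (math.sqrt(sum(va)) * math.sqrt(sum(vb)))

pairs = list(combinations(concepts, 2))
feat = [jaccard(concepts[x], concepts[y]) for x, y in pairs]
geom = [cosine(concepts[x], concepts[y]) for x, y in pairs]

# Pearson correlation between the two models' similarity scores.
n = len(pairs)
mf, mg = sum(feat) / n, sum(geom) / n
cov = sum((f - mf) * (g - mg) for f, g in zip(feat, geom))
r = cov / math.sqrt(sum((f - mf) ** 2 for f in feat)
                    * sum((g - mg) ** 2 for g in geom))
print(f"r = {r:.3f}")
```

    Even on this toy example the two models rank concept pairs almost identically, which is consistent with the paper's finding of very high inter-model correlations.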

  18. Modeling traceability information and functionality requirement in export-oriented tilapia chain.

    Science.gov (United States)

    Zhang, Xiaoshuan; Feng, Jianying; Xu, Mark; Hu, Jinyou

    2011-05-01

    Tilapia has been named the 'food fish of the 21st century' and has become the most important farmed fish. China is the world leader in tilapia production and export. Identifying information and functional requirements is critical in developing an efficient traceability system because traceability has become a fundamental prerequisite for exporting aquaculture products. This paper examines export-oriented tilapia chains and the information flow in those chains, and identifies the key actors, information requirements and information-capturing points. Unified Modeling Language (UML) technology is adopted to describe the information and functionality requirements for chain traceability. The barriers to traceability system adoption are also identified. The results show that the traceability data consist of four categories that must be recorded by each link in the chain. The functionality requirement is classified into four categories, from fundamental information recording to decisive quality control. The top three barriers to traceability system adoption are: high costs of implementing the system; lack of experienced and professional staff; and a low level of government involvement and support. Copyright © 2011 Society of Chemical Industry.

  19. Dynamic Model-Based Evaluation of Process Configurations for Integrated Operation of Hydrolysis and Co-Fermentation for Bioethanol Production from Lignocellulose

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Meyer, Anne S.; Gernaey, Krist

    2011-01-01

    In this study a number of different process flowsheets were generated and their feasibility evaluated using simulations of dynamic models. A dynamic modeling framework was used for the assessment of operational scenarios such as fed-batch, continuous and continuous-with-recycle configurations. Each configuration was evaluated against the following benchmark criteria: yield (kg ethanol/kg dry-biomass), final product concentration and the number of unit operations required in the different process configurations. The results show that simultaneous saccharification and co-fermentation (SSCF) operating in continuous mode with a recycle of the SSCF reactor effluent results in the best productivity of bioethanol among the proposed process configurations, with a yield of 0.18 kg ethanol/kg dry-biomass.

  20. A formative model for student nurse development and evaluation

    Directory of Open Access Journals (Sweden)

    A. S. van der Merwe

    1996-03-01

    Full Text Available Preparing student nurses for the profession is a complex task for nurse educators, especially when dealing with the development of the personal and interpersonal skills, qualities and values held in high esteem by the nursing profession and the community they serve. The researchers developed a model for the formative evaluation of students by using the principles of inductive and deductive reasoning. This model was implemented in clinical practice situations and evaluated for its usefulness. It seems that the model enhanced the standards of nursing care because it had a positive effect on the behaviour of students, who were better motivated; the model also improved interpersonal relationships and communication between practising nurses and students.

  1. Evaluating Predictive Uncertainty of Hyporheic Exchange Modelling

    Science.gov (United States)

    Chow, R.; Bennett, J.; Dugge, J.; Wöhling, T.; Nowak, W.

    2017-12-01

    Hyporheic exchange is the interaction of water between rivers and groundwater, and is difficult to predict. One of the largest contributions to predictive uncertainty for hyporheic fluxes has been attributed to the representation of heterogeneous subsurface properties. This research aims to evaluate which aspect of the subsurface representation - the spatial distribution of hydrofacies or the model for local-scale (within-facies) heterogeneity - most influences the predictive uncertainty. We also seek to identify the data types that best help reduce this uncertainty. For this investigation, we conduct a modelling study of the Steinlach River meander, in Southwest Germany. The Steinlach River meander is an experimental site established in 2010 to monitor hyporheic exchange at the meander scale. We use HydroGeoSphere, a fully integrated surface water-groundwater model, to model hyporheic exchange and to assess the predictive uncertainty of hyporheic exchange transit times (HETT). A highly parameterized complex model is built and treated as `virtual reality', which is in turn modelled with simpler subsurface parameterization schemes (Figure). Monte-Carlo simulations are then conducted with these models to estimate the predictive uncertainty. Results indicate that: uncertainty in HETT is relatively small for early times and increases with transit time; uncertainty from local-scale heterogeneity is negligible compared to uncertainty in the hydrofacies distribution; introducing more data to a poor model structure may reduce predictive variance, but does not reduce predictive bias; and hydraulic head observations alone cannot constrain the uncertainty of HETT, whereas an estimate of hyporheic exchange flux proves more effective at reducing it. Figure: Approach for evaluating predictive model uncertainty. A conceptual model is first developed from the field investigations. A complex model (`virtual reality') is then developed based on that conceptual model.
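    The distinction the abstract draws between predictive variance and predictive bias can be illustrated with a toy version of the virtual-reality approach: a "true" model supplies the reference transit time, and a simplified model with an uncertain parameter is sampled Monte-Carlo style. The models, parameter range and numbers below are invented, not the HydroGeoSphere setup.

```python
import random

random.seed(42)

def virtual_reality_transit_time():
    return 12.0  # "true" transit time in days (assumed reference value)

def simple_model(conductivity):
    # Simplified parameterization: transit time inversely proportional to
    # hydraulic conductivity (illustrative relationship only).
    return 120.0 / conductivity

truth = virtual_reality_transit_time()
predictions = [simple_model(random.uniform(8.0, 14.0)) for _ in range(5000)]

mean_pred = sum(predictions) / len(predictions)
variance = sum((p - mean_pred) ** 2 for p in predictions) / len(predictions)
bias = mean_pred - truth  # systematic error that more data cannot remove

print(f"bias={bias:.2f} d, variance={variance:.2f} d^2")
```

    Conditioning on more data would narrow `variance`, but as the study notes, a structurally poor simple model keeps a nonzero `bias` relative to the virtual reality.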

  2. Evaluation of COSMO-ART in the Framework of the Air Quality Model Evaluation International Initiative (AQMEII)

    Science.gov (United States)

    Giordano, Lea; Brunner, Dominik; Im, Ulas; Galmarini, Stefano

    2014-05-01

    The Air Quality Model Evaluation International Initiative (AQMEII), coordinated by the EC-JRC and US-EPA, has promoted research on regional air quality model evaluation across the atmospheric modelling communities of Europe and North America since 2008. AQMEII has now reached Phase 2, which is dedicated to the evaluation of on-line coupled chemistry-meteorology models, as opposed to Phase 1, where only off-line models were considered. At the European level, AQMEII collaborates with the COST Action "European framework for on-line integrated air quality and meteorology modelling" (EuMetChem). All European groups participating in AQMEII performed simulations over the same spatial domain (Europe at a resolution of about 20 km), using the same simulation strategy (e.g. no nudging allowed) and, as far as possible, the same input data. The initial and boundary conditions (IC/BC) were shared between all groups. Emissions were provided by the TNO-MACC database for anthropogenic emissions and the FMI database for biomass burning emissions. Chemical IC/BC data were taken from IFS-MOZART output, and meteorological IC/BC from the ECMWF global model. Evaluation data sets were collected by the Joint Research Centre (JRC) and include measurements from surface in situ networks (AirBase and EMEP), vertical profiles from ozone sondes and aircraft (MOZAIC), and remote sensing (AERONET, satellites). Since Phase 2 focuses on on-line coupled models, a special effort is devoted to the detailed speciation of particulate matter components, with the goal of studying feedback processes. For the AQMEII exercise, COSMO-ART has been run with 40 vertical levels and a chemical scheme that includes the SCAV module of Knote and Brunner (ACP 2013) for wet-phase chemistry and SOA treatment according to the VBS (volatility basis set) approach (Athanasopoulou et al., ACP 2013). The COSMO-ART evaluation shows that, next to a good performance in the meteorology, the gas phase chemistry is well

  3. Modeling and evaluating proliferation resistance of nuclear energy systems for strategy switching proliferation

    International Nuclear Information System (INIS)

    Yue, M.; Cheng, L.-Y.; Bari, R.A.

    2013-01-01

    This paper reports a Markov-model-based approach to systematically evaluating the proliferation resistance (PR) of nuclear energy systems (NESs). The focus of the study is the development of Markov models for a class of complex PR scenarios, i.e., mixed covert/overt strategy-switching proliferation, for NESs with two modes of material flow, batch and continuous. In particular, a set of diversion and/or breakout scenarios and covert/overt misuse scenarios are studied in detail for an Example Sodium Fast Reactor (ESFR) system. Both probabilistic and deterministic PR measures are calculated using a software tool that implements the proposed approach and can be used to quantitatively compare the proliferation resistance characteristics of different scenarios for a given NES, according to the computed PR measures.
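    The core of a Markov approach of this kind is an absorbing chain: a proliferation attempt moves between stages until it is absorbed either in a "detected" state (safeguards succeed) or an "acquired" state (the proliferator succeeds). The states and transition probabilities below are invented for illustration, not values from the ESFR study.

```python
# Minimal absorbing-Markov-chain sketch of a proliferation pathway.
states = ["diversion", "processing", "detected", "acquired"]
P = {  # one-step transition probabilities (illustrative)
    "diversion":  {"diversion": 0.2, "processing": 0.5, "detected": 0.3},
    "processing": {"processing": 0.3, "acquired": 0.4, "detected": 0.3},
    "detected":   {"detected": 1.0},   # absorbing: attempt fails
    "acquired":   {"acquired": 1.0},   # absorbing: attempt succeeds
}

dist = {s: 0.0 for s in states}
dist["diversion"] = 1.0           # attempt starts at the diversion stage
for _ in range(200):              # iterate to the long-run distribution
    new = {s: 0.0 for s in states}
    for s, mass in dist.items():
        for t, p in P[s].items():
            new[t] += mass * p
    dist = new

print(f"P(detected)={dist['detected']:.3f}, "
      f"P(acquired)={dist['acquired']:.3f}")
```

    The absorption probabilities are exactly the kind of probabilistic PR measure the paper computes; strategy switching would be modelled by letting the transition probabilities change when the proliferator shifts from covert to overt behaviour.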

  4. Evaluating spatial patterns in hydrological modelling

    DEFF Research Database (Denmark)

    Koch, Julian

    the contiguous United States (10^6 km2). To this end, the thesis at hand applies a set of spatial performance metrics on various hydrological variables, namely land-surface temperature (LST), evapotranspiration (ET) and soil moisture. The inspiration for the applied metrics is found in related fields...... is not fully exploited by current modelling frameworks due to the lack of suitable spatial performance metrics. Furthermore, the traditional model evaluation using discharge is found unsuitable to lay confidence on the predicted catchment-inherent spatial variability of hydrological processes in a fully...

  5. Pipe fracture evaluations for leak-rate detection: Probabilistic models

    International Nuclear Information System (INIS)

    Rahman, S.; Wilkowski, G.; Ghadiali, N.

    1993-01-01

    This is the second in a series of three papers generated from studies on nuclear pipe fracture evaluations for leak-rate detection. This paper focuses on the development of novel probabilistic models for the stochastic performance evaluation of degraded nuclear piping systems. This was accomplished in three distinct stages. First, a statistical analysis was conducted to characterize the various input variables for thermo-hydraulic analysis and elastic-plastic fracture mechanics, such as the material properties of pipe, crack morphology variables, and the locations of cracks found in nuclear piping. Second, a new stochastic model was developed to evaluate the performance of degraded piping systems. It is based on the accurate deterministic models for thermo-hydraulic and fracture mechanics analyses described in the first paper, the statistical characterization of the various input variables, and state-of-the-art methods of modern structural reliability theory. From this model, the conditional probability of failure as a function of the leak-rate detection capability of the piping system can be predicted. Third, a numerical example was presented to illustrate the proposed model for piping reliability analyses. The results clearly showed that the model provides satisfactory estimates of the conditional failure probability with much less computational effort than Monte Carlo simulation. The probabilistic model developed in this paper will be applied to various piping in boiling water reactor and pressurized water reactor plants for leak-rate detection applications.
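    The quantity of interest - the conditional probability of failure given that the leak rate stays below the detection limit - can be sketched with a brute-force Monte Carlo estimate. The limit state, distributions, leak-rate correlation and detection threshold below are invented for illustration; the actual model couples thermo-hydraulic and fracture mechanics analyses and uses reliability methods precisely to avoid this sampling cost.

```python
import random

random.seed(1)

def leak_rate(crack_length):
    return 0.8 * crack_length                 # gpm, illustrative correlation

def fails(crack_length, toughness):
    return crack_length > 10.0 * toughness    # illustrative limit state

DETECTION_LIMIT = 5.0  # gpm (assumed leak-rate detection capability)
undetected = failures = 0
for _ in range(100_000):
    crack = random.lognormvariate(1.0, 0.6)     # crack length (in), invented
    toughness = random.normalvariate(0.5, 0.1)  # normalized toughness
    if leak_rate(crack) < DETECTION_LIMIT:      # leak goes undetected
        undetected += 1
        if fails(crack, toughness):
            failures += 1

p_cond = failures / undetected
print(f"P(failure | leak undetected) ~= {p_cond:.4f}")
```

    Structural reliability methods (e.g. FORM/SORM) replace this sampling loop with an analytical approximation of the same conditional probability, which is the efficiency gain the paper reports over Monte Carlo simulation.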

  6. Automata learning algorithms and processes for providing more complete systems requirements specification by scenario generation, CSP-based syntax-oriented model construction, and R2D2C system requirements transformation

    Science.gov (United States)

    Hinchey, Michael G. (Inventor); Margaria, Tiziana (Inventor); Rash, James L. (Inventor); Rouff, Christopher A. (Inventor); Steffen, Bernard (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, automata learning algorithms and techniques are implemented to generate a more complete set of scenarios for requirements-based programming. More specifically, a CSP-based, syntax-oriented model construction, which requires the support of a theorem prover, is complemented by model extrapolation via automata learning. This may support the systematic completion of the requirements, the requirements being partial by nature, which provides focus on the most prominent scenarios. This may generalize requirement skeletons by extrapolation and may indicate, by way of automatically generated traces, where the requirement specification is too loose and additional information is required.

  7. Evaluation of the stomatal conductance formulation in the EMEP ozone deposition model for Picea abies

    Science.gov (United States)

    Wieser, G.; Emberson, L. D.

    It is widely acknowledged that the possible impacts of ozone on forest trees are more closely related to the ozone flux through the stomata than to external ozone exposure. However, the application of the flux approach on a European scale requires the availability of appropriate models, such as the European Monitoring and Evaluation Programme (EMEP) ozone deposition model, for estimating ozone flux and cumulative ozone uptake. Within this model, stomatal conductance is the key variable, since it determines the amount of ozone absorbed by the leaves. This paper assesses the suitability of the existing EMEP ozone deposition model parameterisation and formulation for representing stomatal behaviour determined from field measurements on adult Norway spruce (Picea abies (L.) Karst.) trees in the Central European Alps. Parameters affecting maximum stomatal conductance (e.g. seasonal phenology, needle position, needle age, nutrient deficiency and ozone itself) and stomatal response functions to temperature, irradiance, vapour pressure deficit, and soil water content are investigated. Finally, current limitations and possible alterations of the EMEP model are discussed with respect to the spatial scales of available input data for future flux modelling.
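    The stomatal conductance formulation the record evaluates is a multiplicative (Jarvis-type) scheme: a species-specific maximum conductance scaled by response functions for phenology, light, temperature, vapour pressure deficit and soil water. The sketch below shows that structure; the functional forms and every parameter value are illustrative, not the EMEP parameterisation for Picea abies.

```python
import math

G_MAX = 112.0   # mmol O3 m-2 s-1, assumed maximum stomatal conductance
F_MIN = 0.1     # assumed minimum relative conductance

def f_light(par, alpha=0.006):          # response to irradiance (PAR)
    return 1.0 - math.exp(-alpha * par)

def f_temp(t, t_min=0.0, t_opt=14.0, t_max=35.0):  # response to temperature
    if not (t_min < t < t_max):
        return 0.0
    bt = (t_max - t_opt) / (t_opt - t_min)
    return ((t - t_min) / (t_opt - t_min)) * (
        (t_max - t) / (t_max - t_opt)) ** bt

def f_vpd(vpd, vpd_min=0.6, vpd_max=3.3):  # response to vapour pressure deficit
    return min(1.0, max(0.0, (vpd_max - vpd) / (vpd_max - vpd_min)))

def g_sto(phen, par, t, vpd):
    # Multiplicative scheme with a floor at the minimum conductance.
    return G_MAX * phen * f_light(par) * max(F_MIN, f_temp(t) * f_vpd(vpd))

g = g_sto(phen=1.0, par=800.0, t=14.0, vpd=1.0)
print(f"g_sto = {g:.1f} mmol m-2 s-1")
```

    The paper's evaluation amounts to asking whether response functions of this shape, with the EMEP parameter values, reproduce the conductances measured on adult spruce in the field.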

  8. Evaluation of constitutive models for crushed salt

    International Nuclear Information System (INIS)

    Callahan, G.D.; Loken, M.C.; Hurtado, L.D.; Hansen, F.D.

    1996-01-01

    Three constitutive models are recommended as candidates for describing the deformation of crushed salt. These models are generalized to three-dimensional states of stress to include the effects of mean and deviatoric stress and modified to include effects of temperature, grain size, and moisture content. A database including hydrostatic consolidation and shear consolidation tests conducted on Waste Isolation Pilot Plant (WIPP) and southeastern New Mexico salt is used to determine material parameters for the models. To evaluate the capability of the models, parameter values obtained from fitting the complete database are used to predict the individual tests. Finite element calculations of a WIPP shaft with emplaced crushed salt demonstrate the model predictions

  9. The Norwegian Noark Model Requirements for EDRMS in the context of open government and access to governmental information

    Directory of Open Access Journals (Sweden)

    Olav Hagen Sataslåtten

    2017-11-01

    Full Text Available This article analyses the relationship between the Norwegian Noark standard and the concepts of Open Government and Freedom of Information. Noark is the Norwegian model requirements specification for Electronic Documents and Records Management Systems (EDRMS). Introduced in 1984, it was not only the world's first model requirements specification for EDRMS but is also, through the succession of versions from Noark 1 to the present Noark 5, the model requirements specification with the longest continuous implementation internationally.

  10. Fracture toughness requirements of reactor vessel material in evaluation of the safety analysis report of nuclear power plants

    International Nuclear Information System (INIS)

    Widia Lastana Istanto

    2011-01-01

    The fracture toughness requirements for reactor vessel material that must be met by applicants for a nuclear power plant construction permit are investigated in this paper. The fracture toughness should be described in the Safety Analysis Report (SAR) document that is evaluated by the Nuclear Energy Regulatory Agency (BAPETEN). Because BAPETEN does not have regulations or standards/codes regarding the material used for the reactor vessel, particularly fracture toughness requirements, the acceptance criteria applied to evaluate the fracture toughness of reactor vessel material refer to the regulations/provisions of countries experienced in the operation of nuclear power plants, such as the United States, Japan and Korea. The regulations and standards used are 10 CFR Part 50, ASME and ASTM. The fracture toughness of reactor vessel materials is evaluated to ensure compliance with the requirements and provisions of the regulatory body and the applicable standards, such as ASME or ASTM, in order to assure the reliability and integrity of the reactor vessel as well as to provide an adequate safety margin during operation, testing, maintenance, and postulated accident conditions over the reactor vessel lifetime. (author)

  11. An Evaluation Model of Quantitative and Qualitative Fuzzy Multi-Criteria Decision-Making Approach for Location Selection of Transshipment Ports

    Directory of Open Access Journals (Sweden)

    Ji-Feng Ding

    2013-01-01

    Full Text Available The role of container logistics centres as home bases for merchandise transportation has become increasingly important. Container carriers need to select a suitable transshipment port location to meet the requirements of container shipping logistics. In light of this, the main purpose of this paper is to develop a fuzzy multi-criteria decision-making (MCDM) model to evaluate the best selection of transshipment ports for container carriers. First, some concepts and methods used to develop the proposed model are briefly introduced. The performance values of quantitative and qualitative subcriteria are discussed to evaluate the fuzzy ratings. Then, the ideal and anti-ideal concepts and the modified distance measure method are used in the proposed model. Finally, a step-by-step example illustrates the computational process of the quantitative and qualitative fuzzy MCDM model. The proposed approach successfully accomplishes our goal. In addition, the proposed fuzzy MCDM model can be empirically employed to select the best transshipment port location for container carriers in future studies.
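    The ideal/anti-ideal comparison step can be sketched with triangular fuzzy ratings and a vertex distance, in the style of fuzzy TOPSIS. The ports, ratings and distance choice below are invented; the paper's full model additionally weights quantitative and qualitative subcriteria and uses its own modified distance measure.

```python
import math

ratings = {  # (low, mid, high) fuzzy scores per criterion, normalized to [0, 1]
    "port A": [(0.6, 0.8, 1.0), (0.4, 0.6, 0.8)],
    "port B": [(0.4, 0.6, 0.8), (0.6, 0.8, 1.0)],
    "port C": [(0.2, 0.4, 0.6), (0.2, 0.4, 0.6)],
}
IDEAL, ANTI = (1.0, 1.0, 1.0), (0.0, 0.0, 0.0)

def d(a, b):
    # Vertex distance between two triangular fuzzy numbers.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / 3.0)

def closeness(scores):
    # Closeness coefficient: near 1 means close to ideal, far from anti-ideal.
    d_pos = sum(d(s, IDEAL) for s in scores)
    d_neg = sum(d(s, ANTI) for s in scores)
    return d_neg / (d_pos + d_neg)

ranking = sorted(ratings, key=lambda p: closeness(ratings[p]), reverse=True)
print(ranking)
```

    The port with the highest closeness coefficient is the recommended transshipment location; the dominated candidate ranks last regardless of the distance measure used.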

  12. Modelling human resource requirements for the nuclear industry in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Roelofs, Ferry [Nuclear Research and Consultancy Group (NRG) (Netherlands); Flore, Massimo; Estorff, Ulrik von [Joint Research Center (JRC) (Netherlands)

    2017-11-15

    The European Human Resource Observatory for Nuclear (EHRO-N) provides the European Commission with essential data related to supply and demand for nuclear experts in the EU-28 and the enlargement and integration countries based on bottom-up information from the nuclear industry. The objective is to assess how the supply of experts for the nuclear industry responds to the needs for the same experts for present and future nuclear projects in the region. Complementary to the bottom-up approach taken by the EHRO-N team at JRC, a top-down modelling approach has been taken in a collaboration with NRG in the Netherlands. This top-down modelling approach focuses on the human resource requirements for operation, construction, decommissioning, and efforts for long term operation of nuclear power plants. This paper describes the top-down methodology, the model input, the main assumptions, and the results of the analyses.

  13. Modelling human resource requirements for the nuclear industry in Europe

    International Nuclear Information System (INIS)

    Roelofs, Ferry; Flore, Massimo; Estorff, Ulrik von

    2017-01-01

    The European Human Resource Observatory for Nuclear (EHRO-N) provides the European Commission with essential data related to supply and demand for nuclear experts in the EU-28 and the enlargement and integration countries based on bottom-up information from the nuclear industry. The objective is to assess how the supply of experts for the nuclear industry responds to the needs for the same experts for present and future nuclear projects in the region. Complementary to the bottom-up approach taken by the EHRO-N team at JRC, a top-down modelling approach has been taken in a collaboration with NRG in the Netherlands. This top-down modelling approach focuses on the human resource requirements for operation, construction, decommissioning, and efforts for long term operation of nuclear power plants. This paper describes the top-down methodology, the model input, the main assumptions, and the results of the analyses.

  14. Model of service-oriented catering supply chain performance evaluation

    OpenAIRE

    Gou, Juanqiong; Shen, Guguan; Chai, Rui

    2013-01-01

    Purpose: The aim of this paper is to construct a performance evaluation model for a service-oriented catering supply chain. Design/methodology/approach: Based on research into the current situation of the catering industry, this paper summarizes the characteristics of the catering supply chain and then presents a service-oriented catering supply chain model based on a logistics and information platform. Finally, the fuzzy AHP method is used to evaluate the performance of the service-oriented catering ...
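    The weighting step behind (fuzzy) AHP can be sketched with a crisp pairwise comparison matrix and the geometric-mean method; a full fuzzy AHP would use triangular fuzzy comparisons and defuzzify. The criteria and judgements below are invented, not the paper's evaluation hierarchy.

```python
import math

criteria = ["logistics", "information", "service", "cost"]
A = [  # A[i][j]: how much more important criterion i is than j (Saaty scale)
    [1.0,   3.0,   2.0,   5.0],
    [1 / 3, 1.0,   1 / 2, 2.0],
    [1 / 2, 2.0,   1.0,   3.0],
    [1 / 5, 1 / 2, 1 / 3, 1.0],
]

# Geometric mean of each row, normalized, gives the criterion weights.
geo = [math.prod(row) ** (1.0 / len(row)) for row in A]
total = sum(geo)
weights = {c: g / total for c, g in zip(criteria, geo)}
print(weights)
```

    The resulting weights would then multiply the performance scores of each supply chain link to produce the overall evaluation.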

  15. BALANCED SCORE CARD MODEL EVALUATION: THE CASE OF AD BARSKA PLOVIDBA

    Directory of Open Access Journals (Sweden)

    Jelena Jovanović

    2009-06-01

    Full Text Available The paper analyses the creation of a Balanced Scorecard that includes environmental protection elements in AD Barska Plovidba. First, the paper presents proposed models that include elements of the conventional Balanced Scorecard, and then proceeds to evaluate the proposed models. In fact, as implementation and evaluation of the model in AD Barska Plovidba takes a longer period of time, its evaluation and the final choice are based on ISO 14598 and ISO 9126, with use of the AHP method. Usually those standards are used for the quality evaluation of software products, computer programs and databases inside an organisation. Moreover, they serve as support for their development and acceptance because they provide quality evaluation during the phase when the software is not yet implemented inside the organisation, which we consider very important.

  16. Evaluating the value chain model for service organisational strategy: International hotels.

    OpenAIRE

    Choi, Keetag.

    2000-01-01

    Strategic models like Porter's (1985) value chain have not been fully evaluated in the strategy literature and applied to all industries. To theoretically redefine the value chain technique, this research evaluates the value chain's use with various strategic issues by applying it to a specific aspect in the service field, namely the hotel industry. The study defines five key questions by which to evaluate a strategic model and the value chain model is examined using them. This research is a ...

  17. Comparison of static model and dynamic model for the evaluation of station blackout sequences

    International Nuclear Information System (INIS)

    Lee, Kwang-Nam; Kang, Sun-Koo; Hong, Sung-Yull.

    1992-01-01

    Station blackout is one of the major contributors to the core damage frequency (CDF) in many PSA studies. Since the station blackout sequence exhibits dynamic features, accurate calculation of CDF for the station blackout sequence is not possible with the event tree/fault tree (ET/FT) method. Although the integral method can determine the CDF accurately, it is time consuming and makes it difficult to evaluate various alternative AC source configurations and sensitivities. In this study, a comparison is made between the static model and the dynamic model, and a new methodology that combines the static and dynamic models is provided for accurate quantification of CDF and evaluation of improvement alternatives. Results of several case studies show that accurate calculation of CDF is possible by introducing an equivalent mission time. (author)
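    The static-versus-dynamic distinction can be sketched numerically. In the toy calculation below, core damage requires that offsite AC power is not recovered before battery depletion; the recovery distribution, blackout frequency, battery life and static recovery rate are all hypothetical values chosen for illustration, not data from the paper.

    ```python
    import math

    def phi(x):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    # Hypothetical inputs (not values from the paper).
    f_sbo = 1e-5                     # station blackout frequency, /yr
    t_batt = 4.0                     # hours of battery (DC) power before core damage
    mu, sigma = math.log(1.0), 1.0   # lognormal offsite-power recovery, median 1 h

    # Dynamic view: core damage requires that AC power is NOT recovered
    # before the batteries deplete.
    p_no_rec = 1.0 - phi((math.log(t_batt) - mu) / sigma)
    cdf_dyn = f_sbo * p_no_rec

    # Static ET/FT view: with an exponential recovery model of rate lam, a fixed
    # "equivalent mission time" reproduces the dynamic non-recovery probability.
    lam = 0.5                        # /h, assumed static recovery rate
    t_equiv = -math.log(p_no_rec) / lam

    print(f"P(no recovery in {t_batt} h) = {p_no_rec:.3f}")
    print(f"dynamic CDF contribution    = {cdf_dyn:.2e} /yr")
    print(f"equivalent mission time     = {t_equiv:.2f} h")
    ```

    The point of the sketch is that a static model with a naive fixed mission time equal to the battery life would not reproduce the dynamic answer; the equivalent mission time absorbs the difference.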

  18. Radiation risk models for all solid cancers other than those types of cancer requiring individual assessments after a nuclear accident

    International Nuclear Information System (INIS)

    Walsh, Linda; Zhang, Wei

    2016-01-01

    In the assessment of health risks after nuclear accidents, some health consequences require special attention. For example, in their 2013 report on health risk assessment after the Fukushima nuclear accident, the World Health Organisation (WHO) panel of experts considered risks of breast cancer, thyroid cancer and leukaemia. For these specific cancer types, use was made of already published excess relative risk (ERR) and excess absolute risk (EAR) models for radiation-related cancer incidence fitted to the epidemiological data from the Japanese A-bomb Life Span Study (LSS). However, it was also considered important to assess all other types of solid cancer together and the WHO, in their above-mentioned report, stated "No model to calculate the risk for all other solid cancer excluding breast and thyroid cancer risks is available from the LSS data". Applying the LSS models for all solid cancers along with the models for the specific sites means that some cancers overlap in the risk evaluations. Thus, calculating the total solid cancer risk plus the breast cancer risk plus the thyroid cancer risk can overestimate the total risk by several per cent. Therefore, the purpose of this paper was to publish the required models for all other solid cancers, i.e. all solid cancers other than those types of cancer requiring special attention after a nuclear accident. The new models presented here have been fitted to the same LSS data set from which the risks provided by the WHO were derived. Although it is already known that the EAR and ERR effect modifications by sex are statistically significant for the outcome "all solid cancer", it is shown here that sex modification is not statistically significant for the outcome "all solid cancer other than thyroid and breast cancer". It is also shown here that the sex-averaged solid cancer risks with and without the sex modification are very similar once breast and thyroid cancers are factored out.
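    The general shape of such risk models can be illustrated with a generic LSS-style ERR function: linear in dose, with an exponential age-at-exposure modifier and a power-law attained-age modifier. This functional form is the one commonly used in LSS analyses, but the parameter values below are purely illustrative, not the fitted values from this paper.

    ```python
    import math

    def err_all_solid(dose_gy, age_at_exposure, attained_age,
                      beta=0.5, gamma=-0.3, eta=-1.4):
        """Generic LSS-style excess relative risk:
        ERR(d, e, a) = beta * d * exp(gamma * (e - 30) / 10) * (a / 70) ** eta
        Parameter values are illustrative, not the paper's fits."""
        return (beta * dose_gy
                * math.exp(gamma * (age_at_exposure - 30.0) / 10.0)
                * (attained_age / 70.0) ** eta)

    # Total incidence rate = baseline * (1 + ERR). Using an "all solid minus
    # breast/thyroid" model alongside the site-specific models avoids counting
    # those sites twice in the total.
    baseline = 1.0e-3   # assumed baseline incidence rate, /person-yr
    dose = 0.1          # Gy
    rate = baseline * (1.0 + err_all_solid(dose, age_at_exposure=20, attained_age=60))
    print(f"rate = {rate:.3e}")
    ```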
Some other notable model

  19. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level," modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0
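    The way an integrated model can combine per-technology results is easy to sketch: if each sensor technology detects an event independently, the system-level probability of detection follows from the complement of the joint miss probability. The numbers below are hypothetical and are not IVSEM outputs.

    ```python
    # Hypothetical per-technology detection probabilities for one event.
    p_det = {"seismic": 0.90, "infrasound": 0.40,
             "radionuclide": 0.55, "hydroacoustic": 0.10}

    # Assuming independent detections, the integrated system misses the event
    # only when every subsystem misses it.
    p_miss = 1.0
    for p in p_det.values():
        p_miss *= (1.0 - p)
    p_system = 1.0 - p_miss

    print(f"integrated probability of detection = {p_system:.4f}")
    ```

    The synergy the abstract mentions is visible here: the integrated detection probability exceeds that of any single technology.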

  20. Comparison of Allogeneic and Syngeneic Rat Glioma Models by Using MRI and Histopathologic Evaluation.

    Science.gov (United States)

    Biasibetti, Elena; Valazza, Alberto; Capucchio, Maria T; Annovazzi, Laura; Battaglia, Luigi; Chirio, Daniela; Gallarate, Marina; Mellai, Marta; Muntoni, Elisabetta; Peira, Elena; Riganti, Chiara; Schiffer, Davide; Panciani, Pierpaolo; Lanotte, Michele

    2017-03-01

    Research in neurooncology traditionally requires appropriate in vivo animal models, on which therapeutic strategies are tested before human trials are designed and proceed. Several reproducible animal experimental models, in which human physiologic conditions can be mimicked, are available for studying glioblastoma multiforme. In an ideal rat model, the tumor is of glial origin, grows in predictable and reproducible patterns, closely resembles human gliomas histopathologically, and is weakly or nonimmunogenic. In the current study, we used MRI and histopathologic evaluation to compare the most widely used allogeneic rat glioma model, C6-Wistar, with the F98-Fischer syngeneic rat glioma model in terms of percentage tumor growth or regression and growth rate. In vivo MRI demonstrated considerable variation in tumor volume and frequency between the 2 rat models despite the same stereotactic implantation technique. Faster and more reproducible glioma growth occurred in the immunoresponsive environment of the F98-Fischer model, because the immune response is minimized toward syngeneic cells. The marked inability of the C6-Wistar allogeneic system to generate a reproducible model and the episodes of spontaneous tumor regression with this system may have been due to the increased humoral and cellular immune responses after tumor implantation.

  1. Evaluation of burst pressure prediction models for line pipes

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Xian-Kui, E-mail: zhux@battelle.org [Battelle Memorial Institute, 505 King Avenue, Columbus, OH 43201 (United States); Leis, Brian N. [Battelle Memorial Institute, 505 King Avenue, Columbus, OH 43201 (United States)

    2012-01-15

    Accurate prediction of burst pressure plays a central role in engineering design and integrity assessment of oil and gas pipelines. Theoretical and empirical solutions for such prediction are evaluated in this paper relative to a burst pressure database comprising more than 100 tests covering a variety of pipeline steel grades and pipe sizes. Solutions considered include three based on plasticity theory for the end-capped, thin-walled, defect-free line pipe subjected to internal pressure in terms of the Tresca, von Mises, and ZL (or Zhu-Leis) criteria, one based on a cylindrical instability stress (CIS) concept, and a large group of analytical and empirical models previously evaluated by Law and Bowie (International Journal of Pressure Vessels and Piping, 84, 2007: 487-492). It is found that these models can be categorized into either a Tresca-family or a von Mises-family of solutions, except for those due to Margetson and Zhu-Leis models. The viability of predictions is measured via statistical analyses in terms of a mean error and its standard deviation. Consistent with an independent parallel evaluation using another large database, the Zhu-Leis solution is found best for predicting burst pressure, including consideration of strain hardening effects, while the Tresca strength solutions including Barlow, Maximum shear stress, Turner, and the ASME boiler code provide reasonably good predictions for the class of line-pipe steels with intermediate strain hardening response. - Highlights: ► This paper evaluates different burst pressure prediction models for line pipes. ► The existing models are categorized into two major groups of Tresca and von Mises solutions. ► Prediction quality of each model is assessed statistically using a large full-scale burst test database. ► The Zhu-Leis solution is identified as the best predictive model.

  2. Evaluation of burst pressure prediction models for line pipes

    International Nuclear Information System (INIS)

    Zhu, Xian-Kui; Leis, Brian N.

    2012-01-01

    Accurate prediction of burst pressure plays a central role in engineering design and integrity assessment of oil and gas pipelines. Theoretical and empirical solutions for such prediction are evaluated in this paper relative to a burst pressure database comprising more than 100 tests covering a variety of pipeline steel grades and pipe sizes. Solutions considered include three based on plasticity theory for the end-capped, thin-walled, defect-free line pipe subjected to internal pressure in terms of the Tresca, von Mises, and ZL (or Zhu-Leis) criteria, one based on a cylindrical instability stress (CIS) concept, and a large group of analytical and empirical models previously evaluated by Law and Bowie (International Journal of Pressure Vessels and Piping, 84, 2007: 487–492). It is found that these models can be categorized into either a Tresca-family or a von Mises-family of solutions, except for those due to Margetson and Zhu-Leis models. The viability of predictions is measured via statistical analyses in terms of a mean error and its standard deviation. Consistent with an independent parallel evaluation using another large database, the Zhu-Leis solution is found best for predicting burst pressure, including consideration of strain hardening effects, while the Tresca strength solutions including Barlow, Maximum shear stress, Turner, and the ASME boiler code provide reasonably good predictions for the class of line-pipe steels with intermediate strain hardening response. - Highlights: ► This paper evaluates different burst pressure prediction models for line pipes. ► The existing models are categorized into two major groups of Tresca and von Mises solutions. ► Prediction quality of each model is assessed statistically using a large full-scale burst test database. ► The Zhu-Leis solution is identified as the best predictive model.
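    The two solution families can be sketched for a defect-free pipe. The Tresca and von Mises strain-hardening forms below are the commonly cited textbook expressions, and the Zhu-Leis value is approximated here simply as the midpoint of the two families; all pipe parameters are illustrative, not entries from the test database.

    ```python
    def burst_pressure(d_out, t, sigma_uts, n):
        """Burst pressure of a defect-free, end-capped, thin-walled pipe.

        Commonly cited strain-hardening forms (a sketch, not the paper's fits):
          Tresca:    P_T = (4t/D) * (1/2)**(n+1)       * sigma_uts
          von Mises: P_M = (4t/D) * (1/3**0.5)**(n+1)  * sigma_uts
          Zhu-Leis:  approximated here as the midpoint of the two.
        """
        geo = 4.0 * t / d_out
        p_tresca = geo * 0.5 ** (n + 1) * sigma_uts
        p_mises = geo * (1.0 / 3.0 ** 0.5) ** (n + 1) * sigma_uts
        p_zl = 0.5 * (p_tresca + p_mises)
        return p_tresca, p_zl, p_mises

    # Illustrative X65-like pipe: D = 610 mm, t = 12.7 mm, UTS = 531 MPa, n = 0.10.
    pt, pzl, pm = burst_pressure(610.0, 12.7, 531.0, 0.10)
    print(f"Tresca {pt:.1f} MPa, Zhu-Leis {pzl:.1f} MPa, von Mises {pm:.1f} MPa")
    ```

    The ordering Tresca < Zhu-Leis < von Mises is the paper's two-family picture in miniature: the Tresca family gives lower-bound predictions and the von Mises family upper-bound ones.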

  3. Evaluation model applied to TRANSPETRO's Marine Terminals Standardization Program

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Maria Fatima Ludovico de; Mueller, Gabriela [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Garcia, Luciano Maldonado [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper describes an innovative evaluation model applied to TRANSPETRO's 'Marine Terminals Standardization Program', based on current approaches to program evaluation and organizational learning. Since the program was launched in 2004, the need has become evident for an evaluation model able to track its implementation progress, to measure the degree of standards compliance and to assess its potential economic, social and environmental impacts. Within a vision of safe and environmentally responsible operation of marine terminals, this evaluation model was jointly designed by TRANSPETRO and PUC-Rio to promote continuous improvement and learning in operational practices and in the standardization process itself. TRANSPETRO believes that standardization supports its services and management innovation capability by creating objective and internationally recognized parameters, targets and metrology for its business activities. The conceptual model and application guidelines for this important tool are presented in this paper, as well as the next steps towards its implementation. (author)

  4. RAMSI management model and evaluation criteria for Nordic offshore wind asset

    Energy Technology Data Exchange (ETDEWEB)

    Tiusanen, R.; Jaennes, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Liyanage, J. P. [Univ. of Stavanger, Center for Industrial Asset Management (Norway)

    2012-07-01

    The offshore wind energy sector is in the early stages of development, but it is growing fast. Due to the European Union's renewable-energy and climate goals along with national legislation, the offshore wind sector will develop strongly over the coming years in Europe. In the offshore wind energy sector, there are many different wind-turbine designs ranging from traditional monopile structures to floating platforms, depending on the water depth. Today, most offshore turbines are based on onshore turbine designs, and turbine technology continues to develop incrementally. At the same time, there is strong demand in the market for new, innovative designs for offshore wind turbines whose main focus is reliability and cost efficiency. Floating offshore wind turbine designs may involve new types of uncertainty and system risk compared with onshore wind turbines. Wind turbines in cold climates, such as those experienced in the Nordic countries, may be exposed to extreme conditions, such as ice formation or very low temperatures outside the design limits of standard wind turbines. In the offshore wind energy sector, the specification, implementation and verification of the so-called RAMSI (Reliability, Availability, Maintainability, Safety and Inspectability) requirements during development work are important for companies delivering wind turbines, from the perspective of system integrity. Decisions made before the formal design phase strongly determine the costs and benefits gained during the whole life cycle of a wind turbine. The benefits of implementing a RAMSI programme include support for investment decisions, cost management, improved management of resource requirements, systematic support for the development and implementation of products, and integration of dependability and safety requirements. This publication outlines a model for managing RAMSI factors during the conceptual design phase of an offshore wind turbine. The model

  5. Evaluating energy saving system of data centers based on AHP and fuzzy comprehensive evaluation model

    Science.gov (United States)

    Jiang, Yingni

    2018-03-01

    Due to the high energy consumption of communication infrastructure, energy saving in data centers must be enforced. But the lack of evaluation mechanisms has held back energy-saving construction in data centers. In this paper, an energy saving evaluation index system for data centers was constructed on the basis of clarifying the influence factors. Based on the evaluation index system, the analytic hierarchy process was used to determine the weights of the evaluation indexes. Subsequently, a three-grade fuzzy comprehensive evaluation model was constructed to evaluate the energy saving system of data centers.
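    The two-stage procedure (AHP weights, then fuzzy comprehensive evaluation) can be sketched as follows; the pairwise comparison matrix, the membership matrix and the grade set are all hypothetical stand-ins for the paper's index system.

    ```python
    # Saaty 1-9 pairwise comparisons for three hypothetical energy-saving indexes.
    A = [[1.0, 3.0, 5.0],
         [1/3, 1.0, 2.0],
         [1/5, 1/2, 1.0]]

    # Principal eigenvector by power iteration -> AHP index weights.
    w = [1.0, 1.0, 1.0]
    for _ in range(100):
        w = [sum(A[i][j] * w[j] for j in range(3)) for i in range(3)]
        s = sum(w)
        w = [x / s for x in w]

    # lambda_max and consistency ratio (random index RI = 0.58 for n = 3).
    Aw = [sum(A[i][j] * w[j] for j in range(3)) for i in range(3)]
    lam_max = sum(Aw[i] / w[i] for i in range(3)) / 3
    cr = ((lam_max - 3) / 2) / 0.58   # must be < 0.1 for acceptable consistency

    # Fuzzy comprehensive evaluation: B = w . R over grades (good, fair, poor),
    # where each row of R holds one index's grade memberships from expert scoring.
    R = [[0.6, 0.3, 0.1],
         [0.2, 0.5, 0.3],
         [0.1, 0.4, 0.5]]
    B = [sum(w[i] * R[i][g] for i in range(3)) for g in range(3)]
    grade = ["good", "fair", "poor"][B.index(max(B))]
    print(w, round(cr, 4), grade)
    ```

    A real three-grade model would repeat the B = w . R step at each level of the index hierarchy, feeding lower-level results upward as the next level's membership rows.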

  6. Observational uncertainty and regional climate model evaluation: A pan-European perspective

    Science.gov (United States)

    Kotlarski, Sven; Szabó, Péter; Herrera, Sixto; Räty, Olle; Keuler, Klaus; Soares, Pedro M.; Cardoso, Rita M.; Bosshard, Thomas; Pagé, Christian; Boberg, Fredrik; Gutiérrez, José M.; Jaczewski, Adam; Kreienkamp, Frank; Liniger, Mark. A.; Lussana, Cristian; Szepszo, Gabriella

    2017-04-01

    Local and regional climate change assessments based on downscaling methods crucially depend on the existence of accurate and reliable observational reference data. In dynamical downscaling via regional climate models (RCMs), observational data can influence model development itself and, later on, model evaluation, parameter calibration and added-value assessment. In empirical-statistical downscaling, observations serve as predictand data and directly influence model calibration, with corresponding effects on downscaled climate change projections. Focusing on the evaluation of RCMs, we here analyze the influence of uncertainties in observational reference data on evaluation results in a well-defined performance assessment framework and on a European scale. For this purpose we employ three different gridded observational reference datasets, namely (1) the well-established E-OBS dataset, (2) the recently developed EURO4M-MESAN regional re-analysis, and (3) several national high-resolution and quality-controlled gridded datasets that recently became available. In terms of climate models, five reanalysis-driven experiments carried out by five different RCMs within the EURO-CORDEX framework are used. Two variables (temperature and precipitation) and a range of evaluation metrics that reflect different aspects of RCM performance are considered. We furthermore include an illustrative model ranking exercise and relate observational spread to RCM spread. The results obtained indicate a varying influence of observational uncertainty on model evaluation depending on the variable, the season, the region and the specific performance metric considered. Over most parts of the continent, the influence of the choice of the reference dataset for temperature is rather small for seasonal mean values and inter-annual variability. Here, model uncertainty (as measured by the spread between the five RCM simulations considered) is typically much larger than reference data uncertainty. For

  7. Capital Cost Optimization for Prefabrication: A Factor Analysis Evaluation Model

    Directory of Open Access Journals (Sweden)

    Hong Xue

    2018-01-01

    Full Text Available High capital cost is a significant hindrance to the promotion of prefabrication. In order to optimize cost management and reduce capital cost, this study aims to explore the latent factors and a factor analysis evaluation model. Semi-structured interviews were conducted to explore potential variables, and then a questionnaire survey was employed to collect professionals’ views on their effects. After data collection, exploratory factor analysis was adopted to explore the latent factors. Seven latent factors were identified, including “Management Index”, “Construction Dissipation Index”, “Productivity Index”, “Design Efficiency Index”, “Transport Dissipation Index”, “Material Increment Index” and “Depreciation Amortization Index”. With these latent factors, a factor analysis evaluation model (FAEM), divided into a factor analysis model (FAM) and a comprehensive evaluation model (CEM), was established. The FAM was used to explore the effect of observed variables on the high capital cost of prefabrication, while the CEM was used to evaluate the comprehensive cost management level of prefabrication projects. Case studies were conducted to verify the models. The results revealed that collaborative management had a positive effect on the capital cost of prefabrication. Material increment costs and labor costs had significant impacts on production cost. This study demonstrated the potential of on-site management and standardization design to reduce capital cost. Hence, collaborative management is necessary for cost management of prefabrication. Innovation and detailed design are needed to improve cost performance. New forms of precast component factories can be explored to reduce transportation cost. Meanwhile, targeted strategies can be adopted for different prefabrication projects. The findings optimized the capital cost and improved the cost performance through providing an evaluation and optimization model, which helps managers to
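    The comprehensive evaluation step of such a model can be sketched as a variance-weighted sum of latent-factor scores. Only the seven factor names come from the study; the eigenvalues and the project's factor scores below are hypothetical.

    ```python
    # Sketch of the comprehensive evaluation model (CEM): a project's overall
    # score is the eigenvalue-weighted sum of its latent-factor scores.
    factors = ["Management", "Construction Dissipation", "Productivity",
               "Design Efficiency", "Transport Dissipation",
               "Material Increment", "Depreciation Amortization"]
    eigenvalues = [3.1, 2.4, 1.9, 1.6, 1.3, 1.2, 1.1]   # hypothetical EFA output
    scores = [0.8, -0.2, 0.5, 0.3, -0.1, 0.6, 0.2]      # one project's factor scores

    total = sum(eigenvalues)
    weights = [ev / total for ev in eigenvalues]         # variance contribution ratios
    overall = sum(w * s for w, s in zip(weights, scores))
    print(f"overall cost-management score = {overall:.3f}")
    ```

    Ranking several projects by this overall score is what lets managers compare cost-management levels across prefabrication projects.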

  8. Non-formal techniques for requirements elicitation, modeling, and early assessment for services

    NARCIS (Netherlands)

    van der Veer, Gerrit C.; Vyas, Dhaval; Dittmar, A.; Forbig, P.

    2011-01-01

    Designing systems for multiple stakeholders requires frequent collaboration with multiple stakeholders from the start. In many cases at least some stakeholders lack a professional habit of formal modeling. We report observations from two case studies of stakeholder involvement in early design where

  9. Modelling in Evaluating a Working Life Project in Higher Education

    Science.gov (United States)

    Sarja, Anneli; Janhonen, Sirpa; Havukainen, Pirjo; Vesterinen, Anne

    2012-01-01

    This article describes an evaluation method based on collaboration between higher education, a care home and a university in an R&D project. The aim of the project was to elaborate modelling as a tool of developmental evaluation for innovation and competence in project cooperation. The approach was based on activity theory. Modelling enabled a…

  10. Guidance to Risk-Informed Evaluation of Technical Specifications using PSA

    International Nuclear Information System (INIS)

    Baeckstroem, Ola; Haeggstroem, Anna; Maennistoe, Ilkka

    2010-04-01

    This report presents guidance for the evaluation of Technical Specification (TS) conditions with PSA. It covers quality in PSA, how to verify that the PSA model is sufficiently robust and sufficiently complete, and general requirements on methods. Acceptance criteria for the evaluation of changes in TS conditions are presented. As probabilistic safety assessment (PSA) has developed over the years, it has proven to be a useful tool for evaluating many aspects of the TS from a risk point of view, thereby improving both the PSAs and the decision tools. This also means that it is possible to take credit for safety system overcapacity as well as inherent safety features and the strength of non-safety-classed systems. However, PSA is only one of the tools that shall be used in an evaluation process of TS changes (strengthening/relaxation). PSA is an excellent tool for verifying the importance, and thereby possible relaxation, of TS requirements. But, since PSA is only one tool in the evaluation, it is not sufficient in itself for defining which equipment shall or shall not have TS requirements. The purpose of this guidance document is to provide general requirements, requirements on methods and acceptance criteria for risk-informed evaluation of TS changes based on PSA; the purpose is not to provide a single solution. As part of the review of the TS conditions, this guidance specifies requirements on: - quality verification of the PSA model; - verification that the PSA model is sufficiently robust with regard to SSCs for which requirements both are and are not defined by the TS; - verification that the SSCs for which TS demands are to be evaluated are modelled in a sufficient manner; - methods for performing the evaluation; - which evaluation criteria shall be used (and how they are verified to be correct); - acceptance criteria. This guidance also briefly discusses the documentation of the analysis of the TS changes. This guidance

  11. Guidance to risk-informed evaluation of technical specifications using PSA

    International Nuclear Information System (INIS)

    Baeckstroem, O.; Haeggstroem, A.; Maennistoe, I.

    2010-10-01

    This report presents guidance for the evaluation of Technical Specification (TS) conditions with PSA. It covers quality in PSA, how to verify that the PSA model is sufficiently robust and sufficiently complete, and general requirements on methods. Acceptance criteria for the evaluation of changes in TS conditions are presented. As probabilistic safety assessment (PSA) has developed over the years, it has proven to be a useful tool for evaluating many aspects of the TS from a risk point of view, thereby improving both the PSAs and the decision tools. This also means that it is possible to take credit for safety system overcapacity as well as inherent safety features and the strength of non-safety-classed systems. However, PSA is only one of the tools that shall be used in an evaluation process of TS changes (strengthening/relaxation). PSA is an excellent tool for verifying the importance, and thereby possible relaxation, of TS requirements. But, since PSA is only one tool in the evaluation, it is not sufficient in itself for defining which equipment shall or shall not have TS requirements. The purpose of this guidance document is to provide general requirements, requirements on methods and acceptance criteria for risk-informed evaluation of TS changes based on PSA; the purpose is not to provide a single solution. As part of the review of the TS conditions, this guidance specifies requirements on: - quality verification of the PSA model; - verification that the PSA model is sufficiently robust with regard to SSCs for which requirements both are and are not defined by the TS; - verification that the SSCs for which TS demands are to be evaluated are modelled in a sufficient manner; - methods for performing the evaluation; - which evaluation criteria shall be used (and how they are verified to be correct); - acceptance criteria. This guidance also briefly discusses the documentation of the analysis of the TS changes. This guidance

  12. Guidance to risk-informed evaluation of technical specifications using PSA

    Energy Technology Data Exchange (ETDEWEB)

    Baeckstroem, O.; Haeggstroem, A. (Scandpower AB, Stockholm (Sweden)); Maennistoe, I. (VTT, Helsingfors (Finland))

    2010-04-15

    This report presents guidance for the evaluation of Technical Specification (TS) conditions with PSA. It covers quality in PSA, how to verify that the PSA model is sufficiently robust and sufficiently complete, and general requirements on methods. Acceptance criteria for the evaluation of changes in TS conditions are presented. As probabilistic safety assessment (PSA) has developed over the years, it has proven to be a useful tool for evaluating many aspects of the TS from a risk point of view, thereby improving both the PSAs and the decision tools. This also means that it is possible to take credit for safety system overcapacity as well as inherent safety features and the strength of non-safety-classed systems. However, PSA is only one of the tools that shall be used in an evaluation process of TS changes (strengthening/relaxation). PSA is an excellent tool for verifying the importance, and thereby possible relaxation, of TS requirements. But, since PSA is only one tool in the evaluation, it is not sufficient in itself for defining which equipment shall or shall not have TS requirements. The purpose of this guidance document is to provide general requirements, requirements on methods and acceptance criteria for risk-informed evaluation of TS changes based on PSA; the purpose is not to provide a single solution. As part of the review of the TS conditions, this guidance specifies requirements on: - quality verification of the PSA model; - verification that the PSA model is sufficiently robust with regard to SSCs for which requirements both are and are not defined by the TS; - verification that the SSCs for which TS demands are to be evaluated are modelled in a sufficient manner; - methods for performing the evaluation; - which evaluation criteria shall be used (and how they are verified to be correct); - acceptance criteria. This guidance also briefly discusses the documentation of the analysis of the TS changes. This guidance

  13. Guidance to Risk-Informed Evaluation of Technical Specifications using PSA

    Energy Technology Data Exchange (ETDEWEB)

    Baeckstroem, Ola; Haeggstroem, Anna (Scandpower AB, Stockholm (Sweden)); Maennistoe, Ilkka (VTT, Helsingfors (Finland))

    2010-04-15

    This report presents guidance for the evaluation of Technical Specification (TS) conditions with PSA. It covers quality in PSA, how to verify that the PSA model is sufficiently robust and sufficiently complete, and general requirements on methods. Acceptance criteria for the evaluation of changes in TS conditions are presented. As probabilistic safety assessment (PSA) has developed over the years, it has proven to be a useful tool for evaluating many aspects of the TS from a risk point of view, thereby improving both the PSAs and the decision tools. This also means that it is possible to take credit for safety system overcapacity as well as inherent safety features and the strength of non-safety-classed systems. However, PSA is only one of the tools that shall be used in an evaluation process of TS changes (strengthening/relaxation). PSA is an excellent tool for verifying the importance, and thereby possible relaxation, of TS requirements. But, since PSA is only one tool in the evaluation, it is not sufficient in itself for defining which equipment shall or shall not have TS requirements. The purpose of this guidance document is to provide general requirements, requirements on methods and acceptance criteria for risk-informed evaluation of TS changes based on PSA; the purpose is not to provide a single solution. As part of the review of the TS conditions, this guidance specifies requirements on: - quality verification of the PSA model; - verification that the PSA model is sufficiently robust with regard to SSCs for which requirements both are and are not defined by the TS; - verification that the SSCs for which TS demands are to be evaluated are modelled in a sufficient manner; - methods for performing the evaluation; - which evaluation criteria shall be used (and how they are verified to be correct); - acceptance criteria. This guidance also briefly discusses the documentation of the analysis of the TS changes. This guidance
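    One typical quantitative check in a risk-informed TS evaluation can be sketched as an incremental conditional core damage probability (ICCDP) calculation for an allowed outage time (AOT) change. The CDF values are hypothetical, and the 5E-7 screening value is the commonly used guideline figure (e.g. from US NRC RG 1.177), quoted here as an assumption rather than a requirement of this report.

    ```python
    # Hypothetical risk-informed AOT evaluation with PSA results.
    cdf_base = 2.0e-5          # /yr, baseline core damage frequency
    cdf_conditional = 8.0e-5   # /yr, CDF with the component out of service
    aot_hours = 72.0           # requested allowed outage time

    # ICCDP: extra core damage probability incurred over one outage of length AOT.
    iccdp = (cdf_conditional - cdf_base) * aot_hours / 8760.0
    acceptable = iccdp < 5.0e-7   # commonly used "small risk increase" screen

    print(f"ICCDP = {iccdp:.2e}, acceptable = {acceptable}")
    ```

    In practice this PSA figure is only one input: as the report stresses, deterministic considerations still decide which equipment carries TS requirements at all.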

  14. Hydrology model evaluation at the Hanford Nuclear Waste Facility

    International Nuclear Information System (INIS)

    1977-04-01

    One- and two-dimensional flow and contaminant transport computer models have been developed at Hanford to assess the rate and direction of contaminant movement from waste disposal sites. The primary objective of this work was to evaluate the potential improvement in accuracy that a three-dimensional model might offer over the simpler one- and two-dimensional models. INTERA's hydrology contaminant transport model was used for this evaluation. Although this study was conceptual in nature, an attempt was made to relate it as closely as possible to Hanford conditions. Two-dimensional model runs were performed over the period 1968 to 1973 using estimates of waste discharge flows, tritium concentrations, vertically averaged values of aquifer properties, and boundary conditions. The well-test interpretation runs confirmed the applicability of the areal hydraulic conductivity distribution. Velocity fields and surface concentration profiles calculated by the two-dimensional and three-dimensional models show significant differences. Vertical concentration profiles calculated by the three-dimensional model show better qualitative agreement with the limited observed concentration profile data supplied by ARHCO
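    The kind of contaminant transport calculation these models perform can be sketched in one dimension with an explicit upwind finite-difference scheme for the advection-dispersion equation; grid size, velocity and dispersion coefficient are illustrative, not Hanford values.

    ```python
    # Minimal 1-D advection-dispersion sketch (illustrative parameters only).
    nx, dx, dt = 100, 10.0, 5.0   # grid cells, cell size (m), time step (days)
    v, D = 0.5, 5.0               # groundwater velocity (m/d), dispersion (m^2/d)

    c = [0.0] * nx
    c[0] = 1.0                    # constant-concentration source at the boundary

    for _ in range(200):          # march 1000 days
        new = c[:]
        for i in range(1, nx - 1):
            adv = -v * (c[i] - c[i - 1]) / dx                    # upwind advection
            disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx**2  # dispersion
            new[i] = c[i] + dt * (adv + disp)
        c = new
        c[0] = 1.0

    # Explicit-scheme stability: Courant number v*dt/dx = 0.25 and the diffusion
    # number D*dt/dx**2 = 0.25, so the update stays monotone.
    front = max(i for i in range(nx) if c[i] > 0.5)
    print(f"plume front (c > 0.5) reaches cell {front} after 1000 days")
    ```

    The 2-D and 3-D models compared in the study solve the same equation with additional spatial dimensions and spatially varying aquifer properties, which is where the velocity-field differences arise.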

  15. Intuitionistic fuzzy (IF) evaluations of multidimensional model

    International Nuclear Information System (INIS)

    Valova, I.

    2012-01-01

    There are different logical methods for structuring data, but none is perfect. The multidimensional (MD) data model presents data either as a cube (also referred to as an info-cube or hypercube) or as a 'star'-type scheme (referred to as a multidimensional scheme), using F-structures (Facts) and a set of D-structures (Dimensions), based on the notion of a hierarchy of D-structures. The data under analysis in a specific multidimensional model is located in a Cartesian space bounded by the D-structures. In practice, the data is either dispersed or 'concentrated', so the data cells are not distributed evenly within the respective space. The moment of occurrence of any event is difficult to predict, and the data is concentrated by time period, location of the performed business event, etc. Processing such dispersed or concentrated data requires various technical strategies, and basic methods for presenting such data must be selected. The approaches to data processing and the respective calculations are connected with different options for data representation. The use of intuitionistic fuzzy evaluations (IFE) provides new possibilities for alternative presentation and processing of the data under analysis in any OLAP application. Using IFE in the evaluation of multidimensional models yields the following advantages: analysts have more complete information for processing and analysing the respective data; managers benefit because the final decisions are more effective; and more functional multidimensional schemes can be designed. The purpose of this work is to apply intuitionistic fuzzy evaluations to a multidimensional data model. (authors)
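
    The core construct in this abstract, an intuitionistic fuzzy evaluation, is a pair (mu, nu) of membership and non-membership degrees with mu + nu <= 1; the remainder pi = 1 - mu - nu is the hesitation margin. The sketch below is a minimal illustration of such pairs and their standard union/intersection; the pair values and the cube "regions" are invented for the example and are not taken from the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IFPair:
    """Intuitionistic fuzzy evaluation: membership degree (mu) and
    non-membership degree (nu), constrained by mu + nu <= 1."""
    mu: float
    nu: float

    def __post_init__(self):
        assert 0.0 <= self.mu and 0.0 <= self.nu and self.mu + self.nu <= 1.0

    @property
    def hesitation(self) -> float:
        # pi = 1 - mu - nu: the part of the evaluation left undetermined
        return 1.0 - self.mu - self.nu

def if_union(a: IFPair, b: IFPair) -> IFPair:
    """Standard IF union: max of memberships, min of non-memberships."""
    return IFPair(max(a.mu, b.mu), min(a.nu, b.nu))

def if_intersection(a: IFPair, b: IFPair) -> IFPair:
    """Standard IF intersection: min of memberships, max of non-memberships."""
    return IFPair(min(a.mu, b.mu), max(a.nu, b.nu))

# Hypothetical evaluations of how densely two regions of a hypercube
# are populated with data cells
sparse_region = IFPair(mu=0.2, nu=0.7)   # mostly empty cells
dense_region = IFPair(mu=0.8, nu=0.1)    # mostly populated cells

combined = if_union(sparse_region, dense_region)
print(combined)               # IFPair(mu=0.8, nu=0.1)
print(combined.hesitation)    # ≈ 0.1
```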

  16. Use of an operational model evaluation system for model intercomparison

    Energy Technology Data Exchange (ETDEWEB)

    Foster, K. T., LLNL

    1998-03-01

    The Atmospheric Release Advisory Capability (ARAC) is a centralized emergency response system used to assess the impact from atmospheric releases of hazardous materials. As part of an on-going development program, new three-dimensional diagnostic windfield and Lagrangian particle dispersion models will soon replace ARAC's current operational windfield and dispersion codes. A prototype model performance evaluation system has been implemented to facilitate the study of the capabilities and performance of early development versions of these new models relative to ARAC's current operational codes. This system provides tools both for objective statistical analysis using common performance measures and for more subjective visualization of the temporal and spatial relationships of model results relative to field measurements. Supporting this system is a database of processed field experiment data (source terms and meteorological and tracer measurements) from over 100 individual tracer releases.
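
    The "common performance measures" used in dispersion-model evaluation of this kind typically include the fractional bias (FB), the normalized mean square error (NMSE), and FAC2 (the fraction of predictions within a factor of two of observations). The sketch below is illustrative only: the paired concentration values are hypothetical, and ARAC's actual system may compute a different set of statistics.

```python
def fractional_bias(obs, pred):
    """FB = (mean_obs - mean_pred) / (0.5 * (mean_obs + mean_pred))."""
    mo = sum(obs) / len(obs)
    mp = sum(pred) / len(pred)
    return (mo - mp) / (0.5 * (mo + mp))

def nmse(obs, pred):
    """NMSE = mean((obs - pred)^2) / (mean_obs * mean_pred)."""
    mo = sum(obs) / len(obs)
    mp = sum(pred) / len(pred)
    ms = sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs)
    return ms / (mo * mp)

def fac2(obs, pred):
    """Fraction of predictions within a factor of two of observations."""
    hits = sum(1 for o, p in zip(obs, pred) if 0.5 <= p / o <= 2.0)
    return hits / len(obs)

# Hypothetical paired tracer concentrations (same units, same samplers)
observed = [1.0, 2.0, 4.0, 8.0]
predicted = [1.2, 1.5, 5.0, 20.0]

print(fractional_bias(observed, predicted))   # negative: net over-prediction
print(fac2(observed, predicted))              # 0.75: three of four within 2x
```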

  17. Development and evaluation of a connective tissue phantom model for subsurface visualization of cancers requiring wide local excision

    Science.gov (United States)

    Samkoe, Kimberley S.; Bates, Brent D.; Tselepidakis, Niki N.; DSouza, Alisha V.; Gunn, Jason R.; Ramkumar, Dipak B.; Paulsen, Keith D.; Pogue, Brian W.; Henderson, Eric R.

    2017-12-01

    Wide local excision (WLE) of tumors with negative margins remains a challenge because surgeons cannot directly visualize the mass. Fluorescence-guided surgery (FGS) may improve surgical accuracy; however, conventional methods with direct surface tumor visualization are not immediately applicable, and properties of tissues surrounding the cancer must be considered. We developed a phantom model for sarcoma resection with the near-infrared fluorophore IRDye 800CW and used it to iteratively define the properties of connective tissues that typically surround sarcoma tumors. We then tested the ability of a blinded surgeon to resect fluorescent tumor-simulating inclusions with ~1-cm margins using predetermined target fluorescence intensities and a Solaris open-air fluorescence imaging system. In connective tissue-simulating phantoms, fluorescence intensity decreased with increasing blood concentration and increased with increasing intralipid concentrations. Fluorescent inclusions could be resolved at ≥1-cm depth in all inclusion concentrations and sizes tested. When inclusion depth was held constant, fluorescence intensity decreased with decreasing volume. Using targeted fluorescence intensities, a blinded surgeon was able to successfully excise inclusions with ~1-cm margins from fat- and muscle-simulating phantoms with inclusion-to-background contrast ratios as low as 2:1. Indirect, subsurface FGS is a promising tool for surgical resection of cancers requiring WLE.

  18. Evaluation of Usability Utilizing Markov Models

    Science.gov (United States)

    Penedo, Janaina Rodrigues; Diniz, Morganna; Ferreira, Simone Bacellar Leal; Silveira, Denis S.; Capra, Eliane

    2012-01-01

    Purpose: The purpose of this paper is to analyze the usability of a remote learning system in its initial development phase, using a quantitative usability evaluation method through Markov models. Design/methodology/approach: The paper opted for an exploratory study. The data of interest of the research correspond to the possible accesses of users…
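
    A usability evaluation with Markov models of this kind typically treats the interface screens as states, estimates transition probabilities from logged user accesses, and derives measures such as the expected number of steps to complete a task. The states and probabilities below are invented for illustration; the paper's actual model is not reproduced here.

```python
# Hypothetical screens of a remote-learning UI; "goal" is absorbing
# (task completed). Probabilities would come from observed access logs.
transitions = {
    "home":   {"home": 0.2, "lesson": 0.6, "goal": 0.2},
    "lesson": {"home": 0.3, "goal": 0.7},
}

def expected_steps(transitions, goal="goal", iters=10_000):
    """Expected number of clicks to reach `goal`, by fixed-point iteration
    of E[s] = 1 + sum_t P(s -> t) * E[t], with E[goal] = 0."""
    e = {s: 0.0 for s in transitions}
    for _ in range(iters):
        e = {s: 1.0 + sum(p * e.get(t, 0.0)
                          for t, p in row.items() if t != goal)
             for s, row in transitions.items()}
    return e

steps = expected_steps(transitions)
print(steps["home"])   # ≈ 2.58 clicks on average to finish the task
```

    A larger expected step count (or a high probability of looping back to "home") flags a screen as a usability bottleneck.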

  19. Development and application of a statistical methodology to evaluate the predictive accuracy of building energy baseline models

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.; Price, Phillip N. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.

    2014-03-01

    This paper documents the development and application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings. The methodology complements the principles addressed in resources such as ASHRAE Guideline 14 and the International Performance Measurement and Verification Protocol. It requires fitting a baseline model to data from a "training period" and using the model to predict total electricity consumption during a subsequent "prediction period." We illustrate the methodology by evaluating five baseline models using data from 29 buildings. The training period and prediction period were varied, and model predictions of daily, weekly, and monthly energy consumption were compared to meter data to determine model accuracy. Several metrics were used to characterize the accuracy of the predictions, and in some cases the best-performing model as judged by one metric was not the best performer when judged by another metric.
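
    Two accuracy metrics commonly used in this M&V context (e.g., in ASHRAE Guideline 14) are the coefficient of variation of the RMSE, CV(RMSE), and the normalized mean bias error, NMBE. The sketch below uses hypothetical meter and model values; the paper's own metrics and data are not reproduced.

```python
import math

def cv_rmse(measured, predicted):
    """Coefficient of variation of the RMSE: RMSE divided by mean load."""
    n = len(measured)
    rmse = math.sqrt(sum((m - p) ** 2 for m, p in zip(measured, predicted)) / n)
    return rmse / (sum(measured) / n)

def nmbe(measured, predicted):
    """Normalized mean bias error: net over/under-prediction as a fraction."""
    n = len(measured)
    bias = sum(m - p for m, p in zip(measured, predicted)) / n
    return bias / (sum(measured) / n)

# Hypothetical daily kWh: meter data vs. baseline-model predictions
meter = [100.0, 120.0, 110.0, 130.0]
model = [105.0, 115.0, 112.0, 125.0]

print(f"CV(RMSE) = {cv_rmse(meter, model):.3%}")
print(f"NMBE     = {nmbe(meter, model):.3%}")
```

    A model can look good on NMBE (errors cancel out) yet poor on CV(RMSE), which is one way "the best-performing model as judged by one metric" can fail to be the best by another.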

  20. Nonfunctional requirements in systems analysis and design

    CERN Document Server

    Adams, Kevin MacG

    2015-01-01

    This book will help readers gain a solid understanding of non-functional requirements inherent in systems design endeavors. It contains essential information for those who design, use, and maintain complex engineered systems, including experienced designers, teachers of design, system stakeholders, and practicing engineers. Coverage approaches non-functional requirements in a novel way by presenting a framework of four systems concerns into which the 27 major non-functional requirements fall: sustainment, design, adaptation, and viability. Within this model, the text proceeds to define each non-functional requirement, to specify how each is treated as an element of the system design process, and to develop an associated metric for their evaluation. Systems are designed to meet specific functional needs. Because non-functional requirements are not directly related to tasks that satisfy these proposed needs, designers and stakeholders often fail to recognize the importance of such attributes as availability, su...