WorldWideScience

Sample records for model requires evaluation

  1. How Many Model Evaluations Are Required To Predict The AEP Of A Wind Power Plant?

    DEFF Research Database (Denmark)

    Murcia Leon, Juan Pablo; Réthoré, Pierre-Elouan; Natarajan, Anand

    2015-01-01

    …(AEP) predictions expensive. The objective of the present paper is to minimize the number of model evaluations required to capture the wind power plant's AEP using stationary wind farm flow models. Polynomial chaos techniques are proposed based on arbitrary Weibull distributed wind speed and von Mises distributed wind direction. The correlation between wind direction and wind speed is captured by defining Weibull parameters as functions of wind direction. In order to evaluate the accuracy of these methods, the expectation and variance of the wind farm power distributions are compared against the traditional binning method with trapezoidal and Simpson's integration rules. The wind farm flow model used in this study is the semi-empirical wake model developed by Larsen [1]. Three test cases are studied: a single turbine, a simple and a real offshore wind power plant. A reduced number of model…

  2. How Many Model Evaluations Are Required To Predict The AEP Of A Wind Power Plant?

    International Nuclear Information System (INIS)

    Murcia, J P; Réthoré, P E; Natarajan, A; Sørensen, J D

    2015-01-01

    Wind farm flow models have advanced considerably with the use of large eddy simulations (LES) and Reynolds-averaged Navier-Stokes (RANS) computations. The main limitation of these techniques is their high computational time requirements, which makes their use for wind farm annual energy production (AEP) predictions expensive. The objective of the present paper is to minimize the number of model evaluations required to capture the wind power plant's AEP using stationary wind farm flow models. Polynomial chaos techniques are proposed based on arbitrary Weibull distributed wind speed and von Mises distributed wind direction. The correlation between wind direction and wind speed is captured by defining Weibull parameters as functions of wind direction. In order to evaluate the accuracy of these methods, the expectation and variance of the wind farm power distributions are compared against the traditional binning method with trapezoidal and Simpson's integration rules. The wind farm flow model used in this study is the semi-empirical wake model developed by Larsen [1]. Three test cases are studied: a single turbine, a simple and a real offshore wind power plant. A reduced number of model evaluations for a general wind power plant is proposed based on the convergence of the present method for each case. (paper)
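
    The baseline against which the polynomial chaos approach is compared is the binning method: numerically integrating a power model over the joint wind-speed/direction distribution. The sketch below (Python, with assumed direction-dependent Weibull parameters, an invented farm_power stand-in for the wake model, and SciPy's Simpson rule) illustrates that baseline; every grid point costs one model evaluation, which is exactly the cost the paper seeks to reduce.

    ```python
    # Minimal sketch (not the authors' code) of the "binning" AEP estimate:
    # E[P] = integral of P(u, theta) * f(u | theta) * f(theta) du dtheta,
    # with Weibull wind speed (parameters varying with direction) and a
    # von Mises wind direction, integrated with Simpson's rule.
    import numpy as np
    from scipy.stats import vonmises
    from scipy.integrate import simpson

    def weibull_pdf(u, k, A):
        return (k / A) * (u / A) ** (k - 1) * np.exp(-(u / A) ** k)

    def farm_power(u, theta):
        # Hypothetical stand-in for one stationary wake-model evaluation.
        return 5.0e6 * np.clip((u - 4.0) / 8.0, 0.0, 1.0) ** 3 * (1 - 0.1 * np.cos(theta))

    theta = np.linspace(-np.pi, np.pi, 73)      # direction grid
    u = np.linspace(0.0, 30.0, 121)             # wind-speed grid
    k_dir = 2.0 + 0.3 * np.cos(theta)           # Weibull shape vs. direction (assumed)
    A_dir = 9.0 + 1.5 * np.sin(theta)           # Weibull scale vs. direction (assumed)

    U, TH = np.meshgrid(u, theta, indexing="ij")
    joint = weibull_pdf(U, k_dir, A_dir) * vonmises.pdf(TH, kappa=0.8)
    mean_power = simpson(simpson(farm_power(U, TH) * joint, x=u, axis=0), x=theta)
    print(f"AEP ~ {mean_power * 8760 / 1e9:.2f} GWh")  # grid size = number of model evaluations
    ```

    Here the grid costs 121 x 73 model evaluations; the paper's polynomial chaos approach aims to reproduce the same expectation and variance with far fewer.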

  3. Evaluation of olive flowering at low latitude sites in Argentina using a chilling requirement model

    Energy Technology Data Exchange (ETDEWEB)

    Aybar, V.E.; Melo-Abreu, J.P. de; Searles, P.S.; Matias, A.G.; Del Rio, C.; Caballero, C. M.; Rousseaux, M.C.

    2015-07-01

    Olive production has expanded significantly from the Mediterranean Basin into the New World over the last two decades. In some cases, cultivars of European origin have been introduced at a large commercial scale with little previous evaluation of potential productivity. The objective of this study was to evaluate whether a temperature-driven simulation model developed in the Mediterranean Basin to predict normal flowering occurrence and flowering date using cultivar-specific thermal requirements was suitable for the low latitude areas of Northwest Argentina. The model was validated at eight sites over several years and a wide elevation range (350–1200 m above mean sea level) for three cultivars (‘Arbequina’, ‘Frantoio’, ‘Leccino’) with potentially different chilling requirements. In ‘Arbequina’, normal flowering was observed at almost all sites and in all years, while normal flowering events in ‘Frantoio’ and ‘Leccino’ were uncommon. The model successfully predicted whether flowering would be normal in 92% and 83% of the cases in ‘Arbequina’ and ‘Frantoio’, respectively, but was somewhat less successful in ‘Leccino’ (61%). When flowering occurred, the predicted flowering date was within ± 7 days of the observed date in 71% of the cases. Overall, the model results indicate that cultivar-specific simulation models may be used as an approximate tool to predict whether individual cultivars will be successful in new growing areas. In Northwest Argentina, the model could be used to identify cultivars to replace ‘Frantoio’ and ‘Leccino’ and to simulate global warming scenarios. (Author)
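
    Models of this class accumulate winter chill toward a cultivar-specific requirement and then thermal forcing toward flowering. The toy Python sketch below mirrors only that two-phase logic; the unit functions, thresholds and start date are invented for illustration and are not taken from the published model.

    ```python
    # Toy two-phase chilling/forcing model: accumulate chill units until a
    # cultivar-specific requirement is met (dormancy release), then accumulate
    # heat units (degree-days) until the forcing requirement predicts flowering.
    from datetime import date, timedelta

    def chill_units(t_mean, optimum=7.0, span=9.0):
        # 1 near the optimum chilling temperature, tapering to 0 (assumed shape).
        return max(0.0, 1.0 - abs(t_mean - optimum) / span)

    def degree_days(t_mean, base=10.0):
        return max(0.0, t_mean - base)

    def predict_flowering(daily_t, start, chill_req=60.0, heat_req=350.0):
        chill = heat = 0.0
        chilled = False
        for i, t in enumerate(daily_t):
            if not chilled:
                chill += chill_units(t)
                chilled = chill >= chill_req       # dormancy released
            else:
                heat += degree_days(t)
                if heat >= heat_req:               # forcing complete -> flowering
                    return start + timedelta(days=i)
        return None                                # requirement never met: abnormal flowering year

    # e.g. predict_flowering(series_of_daily_mean_temps, date(2014, 5, 1))
    ```

    A low-chill cultivar corresponds to a small chill_req, which is why 'Arbequina' flowers normally at warm low-latitude sites where 'Frantoio' and 'Leccino' often do not.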

  4. An evaluation model for the definition of regulatory requirements on spent fuel pool cooling systems

    International Nuclear Information System (INIS)

    Izquierdo, J.M.

    1979-01-01

    A calculation model is presented for establishing regulatory requirements in the SFPCS System. The major design factors, regulatory and design limits and key parameters are discussed. A regulatory position for internal use is proposed. Finally, associated problems and experience are presented. (author)

  5. Required spatial resolution of hydrological models to evaluate urban flood resilience measures

    Science.gov (United States)

    Gires, A.; Giangola-Murzyn, A.; Tchiguirinskaia, I.; Schertzer, D.; Lovejoy, S.

    2012-04-01

    During a flood in an urban area, several non-linear processes (rainfall, surface runoff, sewer flow, and sub-surface flow) interact. Fully distributed hydrological models are a useful tool to better understand these complex interactions between natural processes and the man-built environment. Developing an efficient model is a first step to improving the understanding of flood resilience in urban areas. Given that the previously mentioned underlying physical phenomena exhibit different relevant scales, determining the required spatial resolution of such a model is a tricky but necessary issue. For instance, such a model should be able to properly represent the large-scale effects of local-scale flood resilience measures such as stop logs. The model should also be as simple as possible without being simplistic. In this paper we test two types of model. First we use an operational semi-distributed model over a 3400 ha peri-urban area located in Seine-Saint-Denis (north-east of Paris). In this model, the area is divided into sub-catchments of average size 17 ha that are considered homogeneous, and only the sewer discharge is modelled. The rainfall data, whose resolution is 1 km in space and 5 min in time, come from the C-band radar of Trappes, located west of Paris and operated by Météo-France. It was shown that the spatial resolution of both the model and the rainfall field did not make it possible to fully grasp the small-scale rainfall variability. To achieve this, first an ensemble of realistic rainfall fields downscaled to a resolution of 100 m is generated with the help of multifractal space-time cascades whose characteristic exponents are estimated on the available radar data. Second, the corresponding ensemble of sewer hydrographs is simulated by inputting each rainfall realization to the model. It appears that the probability distribution of the simulated peak flow exhibits a power-law behaviour. This indicates that there is a great uncertainty associated with small scale…
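
    The power-law claim about the simulated peak flows can be checked with a simple log-log fit of the empirical survival function over the ensemble. The Python sketch below does this on a synthetic heavy-tailed sample; the tail fraction and the Pareto test data are assumptions for illustration, not the study's hydrographs.

    ```python
    # Estimate a power-law tail exponent from an ensemble of peak flows:
    # sort descending, use rank/N as the empirical survival probability, and
    # fit the slope of log(survival) vs. log(value) over the largest values.
    import numpy as np

    def tail_exponent(peaks, tail_fraction=0.3):
        x = np.sort(np.asarray(peaks))[::-1]           # descending order
        n = max(3, int(len(x) * tail_fraction))        # keep the largest values only
        ranks = np.arange(1, n + 1)                    # empirical survival ~ rank / N
        slope, _ = np.polyfit(np.log(x[:n]), np.log(ranks / len(x)), 1)
        return -slope                                  # P(X > x) ~ x ** (-alpha)

    rng = np.random.default_rng(1)
    peaks = rng.pareto(2.5, 1000) + 1.0                # synthetic heavy-tailed ensemble
    print(f"estimated tail exponent ~ {tail_exponent(peaks):.2f}")
    ```

    A small exponent means a heavy tail, i.e. rare but very large peak flows remain probable, which is the source of the uncertainty the abstract points to.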

  6. Improving the Enterprise Requirements and Acquisition Model’s Developmental Test and Evaluation Process Fidelity

    Science.gov (United States)

    2014-03-27

    program office. Many of the individuals were identified through snowball sampling. … SME Discussion Summary: The SME discussions brought to light … adequate sample sizes are collected. With computer modeling and simulation, time is less of a limiting factor as it can be manipulated. Data representing … The DAMS suffers from diverse problems of which only a very small sample were discussed here. However, as diverse as the problems encountered were, a…

  7. Development and application of emission models for verification and evaluation of user requirements in the sector of environmental planning

    International Nuclear Information System (INIS)

    Fister, G.

    2001-04-01

    In chapter I, two basic emission models for the calculation of emission inventories are presented: the bottom-up and the top-down approach. Their characteristics, typical international and national fields of application, and the requirements of these approaches are discussed. A separate chapter describes a detailed comparison between two different emission balances. These characterize the same regional area but are based on different emission models (top-down and bottom-up approach). The structures of these approaches are analyzed, and emission sectors are adjusted for a comparison of detailed emission data. Differences are pointed out and reasons for discrepancies are discussed. Based on the results of this investigation, limits for the fields of application of the two approaches are set and substantiated. An application of the results of the above-mentioned comparison is shown in the following part. Following the Kyoto Protocol commitment and the Lower Austria Climate Protection Program, the current and future emission situation of Lower Austria is discussed. Other types of emission inventories are included in the discussion, and a top-down-based approach for a local splitting of Austrian reduction potentials of greenhouse gases is developed. Another step in the Lower Austria Climate Protection Program is an investigation of all funding schemes in Lower Austria with respect to their ozone and climate relevance. The survey and evaluation of this funding are described in detail. Further analyses are made of housing grants, including quantitative aspects. Taking all aspects into consideration, the current situation regarding ozone- and climate-related emissions is presented. Changes in the requirements of emission inventories over the last decade are discussed, and experiences in applying emission approaches are mentioned. Concluding this work, an outlook on calculating emission inventories is given. (author)

  8. Value-Focused Thinking Model to Evaluate SHM System Alternatives From Military end User Requirements Point of View

    Directory of Open Access Journals (Sweden)

    Klimaszewski Sławomir

    2016-12-01

    The article describes a Value-Focused Thinking (VFT) model developed in order to evaluate various alternatives for the implementation of a Structural Health Monitoring (SHM) system on a military aircraft. Four SHM system alternatives are considered, based on: visual inspection (the current approach), piezoelectric (PZT) sensors, Fiber Bragg Grating (FBG) sensors and Comparative Vacuum Monitoring (CVM) sensors. A numerical example is shown to illustrate the model's capability. Sensitivity analyses are performed on values such as Cost, Performance, Aircraft Availability and Technology Readiness Level in order to examine the influence of these values on the overall value of structural state of awareness provided by each SHM system alternative.
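
    VFT evaluations of this kind typically reduce to an additive value model whose weights are then varied in the sensitivity analysis. The Python sketch below shows that pattern; the weights and single-dimensional scores for the four alternatives are invented placeholders, not figures from the article.

    ```python
    # Additive value model: score(alternative) = sum_i w_i * v_i, followed by a
    # one-way sensitivity sweep over each value's weight (renormalizing the rest).
    import numpy as np

    values = ["Cost", "Performance", "Availability", "TRL"]
    weights = np.array([0.30, 0.35, 0.20, 0.15])          # assumed baseline weights
    names = ["visual", "PZT", "FBG", "CVM"]
    # rows: alternatives; columns: single-dimensional value scores in [0, 1] (invented)
    scores = np.array([[0.9, 0.3, 0.6, 1.0],
                       [0.5, 0.8, 0.6, 0.6],
                       [0.4, 0.9, 0.7, 0.5],
                       [0.6, 0.7, 0.8, 0.7]])

    def overall(w):
        return scores @ (w / w.sum())                     # renormalized weighted sum

    for i, value in enumerate(values):                    # one-way sensitivity analysis
        for wi in (0.1, weights[i], 0.6):
            w = weights.copy()
            w[i] = wi
            best = names[int(np.argmax(overall(w)))]
            print(f"{value} weight={wi:.2f} -> best alternative: {best}")
    ```

    If the best alternative changes as a weight sweeps its range, the decision is sensitive to that value, which is what the article's sensitivity analyses probe.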

  9. Development and evaluation of a connective tissue phantom model for subsurface visualization of cancers requiring wide local excision

    Science.gov (United States)

    Samkoe, Kimberley S.; Bates, Brent D.; Tselepidakis, Niki N.; DSouza, Alisha V.; Gunn, Jason R.; Ramkumar, Dipak B.; Paulsen, Keith D.; Pogue, Brian W.; Henderson, Eric R.

    2017-12-01

    Wide local excision (WLE) of tumors with negative margins remains a challenge because surgeons cannot directly visualize the mass. Fluorescence-guided surgery (FGS) may improve surgical accuracy; however, conventional methods with direct surface tumor visualization are not immediately applicable, and properties of tissues surrounding the cancer must be considered. We developed a phantom model for sarcoma resection with the near-infrared fluorophore IRDye 800CW and used it to iteratively define the properties of connective tissues that typically surround sarcoma tumors. We then tested the ability of a blinded surgeon to resect fluorescent tumor-simulating inclusions with ~1-cm margins using predetermined target fluorescence intensities and a Solaris open-air fluorescence imaging system. In connective tissue-simulating phantoms, fluorescence intensity decreased with increasing blood concentration and increased with increasing intralipid concentrations. Fluorescent inclusions could be resolved at ≥1-cm depth in all inclusion concentrations and sizes tested. When inclusion depth was held constant, fluorescence intensity decreased with decreasing volume. Using targeted fluorescence intensities, a blinded surgeon was able to successfully excise inclusions with ~1-cm margins from fat- and muscle-simulating phantoms with inclusion-to-background contrast ratios as low as 2:1. Indirect, subsurface FGS is a promising tool for surgical resection of cancers requiring WLE.

  10. Requirements Modeling with Agent Programming

    Science.gov (United States)

    Dasgupta, Aniruddha; Krishna, Aneesh; Ghose, Aditya K.

    Agent-oriented conceptual modeling notations are highly effective in representing requirements from an intentional stance and answering questions such as what goals exist, how key actors depend on each other, and what alternatives must be considered. In this chapter, we review an approach to executing i* models by translating these into a set of interacting agents implemented in the CASO language, and we suggest how reasoning can be performed with the requirements modeled (both functional and non-functional) using i* models. We particularly incorporate deliberation into the agent design. This allows us to benefit from the complementary representational capabilities of the two frameworks.

  11. Evaluation models and evaluation use

    Science.gov (United States)

    Contandriopoulos, Damien; Brousselle, Astrid

    2012-01-01

    The use of evaluation results is at the core of evaluation theory and practice. Major debates in the field have emphasized the importance of both the evaluator’s role and the evaluation process itself in fostering evaluation use. A recent systematic review of interventions aimed at influencing policy-making or organizational behavior through knowledge exchange offers a new perspective on evaluation use. We propose here a framework for better understanding the embedded relations between evaluation context, choice of an evaluation model and use of results. The article argues that the evaluation context presents conditions that affect both the appropriateness of the evaluation model implemented and the use of results. PMID:23526460

  12. Historical model evaluation data requirements

    International Nuclear Information System (INIS)

    Simpson, B.C.; McCain, D.J.

    1995-01-01

    Several studies about tank waste contents have been published using historical records of tank transactions and various analytical measurements. While these records offer a wealth of information, the results are questionable until error estimates associated with the results can be established. However, they do provide a direction for investigation. Two principal observations from the studies are: (1) Large quantities of individual waste types from the various separations processes were widely distributed throughout the tank farms, and (2) The compositions of many of these waste types are quite distinct from one another. A key assumption associated with these observations is that the effects of time and location on the tank wastes are either nominal or not discernible. Since each waste type has a distinct composition, it would benefit all programs to better quantify that composition and establish an uncertainty for each element of that composition. Various process, disposal, or other decisions could then be made based on current information, reducing the need for extended sampling and analysis.

  13. A fuzzy model for exploiting customer requirements

    Directory of Open Access Journals (Sweden)

    Zahra Javadirad

    2016-09-01

    Nowadays, quality function deployment (QFD) is one of the total quality management tools in which customers' views and requirements are captured and, using various techniques, translated into improved production requirements and operations. The QFD department, after identification and analysis of the competitors, uses customers' feedback to meet customers' demands for the products relative to those competitors. In this study, a comprehensive model for assessing the importance of customer requirements in the products or services of an organization is proposed. The proposed study uses linguistic variables, as a more comprehensive approach, to increase the precision of the expressed evaluations. The importance of these requirements specifies the strengths and weaknesses of the organization in meeting the requirements relative to competitors. The results of these experiments show that the proposed method performs better than the other methods.
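
    A common way to operationalize linguistic variables in such fuzzy QFD models is to map each word to a triangular fuzzy number, aggregate the experts' ratings fuzzily, and defuzzify to a crisp importance weight. The Python sketch below shows that pipeline under assumed membership functions; the term set and ratings are invented for illustration.

    ```python
    # Linguistic ratings -> triangular fuzzy numbers (a, b, c) -> fuzzy mean ->
    # centroid defuzzification, yielding a crisp importance weight per requirement.
    import numpy as np

    TFN = {"very low": (0.0, 0.0, 0.25), "low": (0.0, 0.25, 0.5),
           "medium": (0.25, 0.5, 0.75), "high": (0.5, 0.75, 1.0),
           "very high": (0.75, 1.0, 1.0)}

    def fuzzy_mean(terms):
        abc = np.array([TFN[t] for t in terms])
        return tuple(abc.mean(axis=0))            # arithmetic mean of the (a, b, c) corners

    def defuzzify(tfn):
        a, b, c = tfn
        return (a + b + c) / 3.0                  # centroid of a triangular number

    ratings = {"easy maintenance": ["high", "very high", "medium"],
               "low price": ["very high", "high", "very high"]}
    for requirement, terms in ratings.items():
        print(requirement, "->", round(defuzzify(fuzzy_mean(terms)), 3))
    ```

    Using words rather than forcing experts to pick single numbers is precisely what the abstract means by increasing the precision of the expressed evaluations.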

  14. Site evaluation for nuclear installations. Safety requirements

    International Nuclear Information System (INIS)

    2003-01-01

    This Safety Requirements publication supersedes the Code on the Safety of Nuclear Power Plants: Siting, which was issued in 1988 as Safety Series No. 50-C-S (Rev. 1). It takes account of developments relating to site evaluations for nuclear installations since the Code on Siting was last revised. These developments include the issuing of the Safety Fundamentals publication on The Safety of Nuclear Installations, and the revision of various safety standards and other publications relating to safety. Requirements for site evaluation are intended to ensure adequate protection of site personnel, the public and the environment from the effects of ionizing radiation arising from nuclear installations. It is recognized that there are steady advances in technology and scientific knowledge, in nuclear safety and in what is considered adequate protection. Safety requirements change with these advances and this publication reflects the present consensus among States. This Safety Requirements publication was prepared under the IAEA programme on safety standards for nuclear installations. It establishes requirements and provides criteria for ensuring safety in site evaluation for nuclear installations. The Safety Guides on site evaluation listed in the references provide recommendations on how to meet the requirements established in this Safety Requirements publication. The objective of this publication is to establish the requirements for the elements of a site evaluation for a nuclear installation so as to characterize fully the site-specific conditions pertinent to the safety of a nuclear installation. The purpose is to establish requirements for criteria, to be applied as appropriate to site and site-installation interaction in operational states and accident conditions, including those that could lead to emergency measures, for: (a) defining the extent of information on a proposed site to be presented by the applicant; (b) evaluating a proposed site to ensure that the site…

  15. The IIR evaluation model

    DEFF Research Database (Denmark)

    Borlund, Pia

    2003-01-01

    An alternative approach to evaluation of interactive information retrieval (IIR) systems, referred to as the IIR evaluation model, is proposed. The model provides a framework for the collection and analysis of IR interaction data. The aim of the model is two-fold: 1) to facilitate the evaluation…

  16. Visual soil evaluation - future research requirements

    Science.gov (United States)

    Emmet-Booth, Jeremy; Forristal, Dermot; Fenton, Owen; Ball, Bruce; Holden, Nick

    2017-04-01

    A review of Visual Soil Evaluation (VSE) techniques (Emmet-Booth et al., 2016) highlighted their established utility for soil quality assessment, though some limitations were identified. (1) The examination of aggregate size, visible intra-porosity and shape forms a key assessment criterion in almost all methods, thus limiting evaluation to structural form. The addition of criteria that holistically examine structure may be desirable. For example, structural stability can be indicated using dispersion tests or by examining soil surface crusting, while the assessment of soil colour may indirectly indicate soil organic matter content, a contributor to stability. Organic matter assessment may also indicate structural resilience, along with rooting, earthworm numbers or shrinkage cracking. (2) Soil texture may influence results or impede method deployment. Modification of procedures to account for extreme texture variation is desirable. For example, evidence of compaction in sandy or single-grain soils differs greatly from that in clayey soils. Some procedures incorporate separate classification systems or adjust deployment based on texture. (3) Research into the impacts of soil moisture content on VSE evaluation criteria is required. Criteria such as rupture resistance and shape may be affected by moisture content. It is generally recommended that methods are deployed on moist soils, and quantification of the influence of moisture variation on results is necessary. (4) Robust sampling strategies for method deployment are required. Dealing with spatial variation differs between methods, but where methods can be deployed over large areas, clear instruction on sampling is required. Additionally, as emphasis has so far been placed on soils under agricultural production, the ability of VSE to explore structural quality in terms of carbon storage, water purification and biodiversity support also requires research. References: Emmet-Booth, J.P., Forristal, P.D., Fenton, O., Ball, B…

  17. The EMEFS model evaluation

    International Nuclear Information System (INIS)

    Barchet, W.R.; Dennis, R.L.; Seilkop, S.K.; Banic, C.M.; Davies, D.; Hoff, R.M.; Macdonald, A.M.; Mickle, R.E.; Padro, J.; Puckett, K.; Byun, D.; McHenry, J.N.; Karamchandani, P.; Venkatram, A.; Fung, C.; Misra, P.K.; Hansen, D.A.; Chang, J.S.

    1991-12-01

    The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs
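
    The protocol's quantitative core — difference statistics and correlations between paired observed and predicted values — is straightforward to make concrete. The Python sketch below computes the usual quantities on made-up deposition numbers; the variable names and data are illustrative, not EMEFS values.

    ```python
    # Paired observed/predicted comparison: mean bias, RMSE, normalized bias,
    # and Pearson correlation, the staple statistics of model evaluation protocols.
    import numpy as np

    def evaluation_stats(obs, pred):
        obs, pred = np.asarray(obs, float), np.asarray(pred, float)
        diff = pred - obs
        return {"mean bias": diff.mean(),
                "rmse": np.sqrt((diff ** 2).mean()),
                "normalized bias": diff.mean() / obs.mean(),
                "correlation": np.corrcoef(obs, pred)[0, 1]}

    obs = [2.1, 3.4, 1.8, 4.0, 2.9]      # e.g. weekly sulfate deposition (made-up numbers)
    pred = [1.9, 3.9, 1.5, 4.4, 3.2]
    print(evaluation_stats(obs, pred))
    ```

    Scatter plots and time series then present the same pairs graphically, as the report describes.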

  18. The EMEFS model evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Barchet, W.R. (Pacific Northwest Lab., Richland, WA (United States)); Dennis, R.L. (Environmental Protection Agency, Research Triangle Park, NC (United States)); Seilkop, S.K. (Analytical Sciences, Inc., Durham, NC (United States)); Banic, C.M.; Davies, D.; Hoff, R.M.; Macdonald, A.M.; Mickle, R.E.; Padro, J.; Puckett, K. (Atmospheric Environment Service, Downsview, ON (Canada)); Byun, D.; McHenry, J.N.

    1991-12-01

    The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs.

  19. Introducing Program Evaluation Models

    Directory of Open Access Journals (Sweden)

    Raluca GÂRBOAN

    2008-02-01

    Program and project evaluation models can be extremely useful in project planning and management. The aim is to ask the right questions as soon as possible in order to detect unwanted program effects in time and deal with them, as well as to encourage the positive elements of the project's impact. In short, different evaluation models are used in order to minimize losses and maximize the benefits of interventions upon small or large social groups. This article introduces some of the most recently used evaluation models.

  20. Data requirements for integrated near field models

    International Nuclear Information System (INIS)

    Wilems, R.E.; Pearson, F.J. Jr.; Faust, C.R.; Brecher, A.

    1981-01-01

    The coupled nature of the various processes in the near field requires that integrated models be employed to assess the long term performance of the waste package and repository. The nature of the integrated near field models being compiled under the SCEPTER program is discussed. The interfaces between these near field models and far field models are described. Finally, near field data requirements are outlined in sufficient detail to indicate overall programmatic guidance for data gathering activities.

  1. Data modeling and evaluation

    International Nuclear Information System (INIS)

    Bauge, E.; Hilaire, S.

    2006-01-01

    This lecture is devoted to the nuclear data evaluation process, during which the current knowledge (experimental or theoretical) of nuclear reactions is condensed and synthesised into a computer file (the evaluated data file) that application codes can process and use for simulation calculations. After an overview of the content of evaluated nuclear data files, we describe the different methods used for evaluating nuclear data. We specifically focus on the model-based approach, which we use to evaluate data in the continuum region. A few examples, drawn from the day-to-day practice of data evaluation, illustrate this lecture. Finally, we discuss the most likely perspectives for improvement of the evaluation process in the next decade. (author)

  2. Integrated Assessment Model Evaluation

    Science.gov (United States)

    Smith, S. J.; Clarke, L.; Edmonds, J. A.; Weyant, J. P.

    2012-12-01

    Integrated assessment models of climate change (IAMs) are widely used to provide insights into the dynamics of the coupled human and socio-economic system, including emission mitigation analysis and the generation of future emission scenarios. Similar to the climate modeling community, the integrated assessment community has a two-decade history of model inter-comparison, which has served as one of the primary venues for model evaluation and confirmation. While analysis of historical trends in the socio-economic system has long played a key role in diagnostics of future scenarios from IAMs, formal hindcast experiments are just now being contemplated as evaluation exercises. Some initial thoughts on setting up such IAM evaluation experiments are discussed. Socio-economic systems do not follow strict physical laws, which means that evaluation needs to take place in a context, unlike that of physical system models, in which there are few fixed, unchanging relationships. Of course strict validation of even earth system models is not possible (Oreskes et al. 2004), a fact borne out by the inability of models to constrain the climate sensitivity. Energy-system models have also been grappling with some of the same questions over the last quarter century. For example, one of "the many questions in the energy field that are waiting for answers in the next 20 years" identified by Hans Landsberg in 1985 was "Will the price of oil resume its upward movement?" Of course we are still asking this question today. While, arguably, even fewer constraints apply to socio-economic systems, numerous historical trends and patterns have been identified, although often only in broad terms, that are used to guide the development of model components, parameter ranges, and scenario assumptions. IAM evaluation exercises are expected to provide useful information for interpreting model results and improving model behavior. A key step is the recognition of model boundaries, that is, what is inside…

  3. Training effectiveness evaluation model

    International Nuclear Information System (INIS)

    Penrose, J.B.

    1993-01-01

    NAESCO's Training Effectiveness Evaluation Model (TEEM) integrates existing evaluation procedures with new procedures. The new procedures are designed to measure training impact on organizational productivity. TEEM seeks to enhance organizational productivity through proactive training focused on operational results. These results can be identified and measured by establishing and tracking performance indicators. Relating training to organizational productivity is not easy. TEEM is a team process. It offers strategies to assess the organizational costs and benefits of training more effectively. TEEM is one organization's attempt to refine, manage and extend its training evaluation program.

  4. The EU model evaluation group

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1999-01-01

    The model evaluation group (MEG) was launched in 1992, growing out of the Major Technological Hazards Programme with EU/DG XII. The goal of MEG was to improve the culture in which models were developed, particularly by encouraging voluntary model evaluation procedures based on a formalised and consensus protocol. The evaluation was intended to assess the fitness-for-purpose of the models being used as a measure of their quality. The approach adopted focused on developing a generic model evaluation protocol and subsequently targeting it onto specific areas of application. Five such developments have been initiated, on heavy gas dispersion, liquid pool fires, gas explosions, human factors and momentum fires. The quality of models is an important element when complying with the 'Seveso Directive', which requires that the safety reports submitted to the authorities comprise an assessment of the extent and severity of the consequences of identified major accidents. Further, the quality of models becomes important in the land use planning process, where the proximity of industrial sites to vulnerable areas may be critical. (Copyright (c) 1999 Elsevier Science B.V., Amsterdam. All rights reserved.)

  5. Developmental Education Evaluation Model.

    Science.gov (United States)

    Perry-Miller, Mitzi; And Others

    A developmental education evaluation model designed to be used at a multi-unit urban community college is described. The purpose of the design was to determine the cost effectiveness/worth of programs in order to initiate self-improvement. A needs assessment was conducted by interviewing and taping the responses of students, faculty, staff, and…

  6. CMAQ Model Evaluation Framework

    Science.gov (United States)

    CMAQ is tested to establish the modeling system’s credibility in predicting pollutants such as ozone and particulate matter. Evaluation of CMAQ has been designed to assess the model’s performance for specific time periods and for specific uses.

  7. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting it and regarding it as an important innovation. However, there is already evidence of high failure risk in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports the requirements analysis for ERP projects.

  8. 48 CFR 652.219-73 - Mentor Requirements and Evaluation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Mentor Requirements and... Mentor Requirements and Evaluation. As prescribed in 619.202-70(o)(2), insert the following clause: Mentor Requirements and Evaluation (APR 2004) (a) Mentor and protégé firms shall submit an evaluation to...

  9. 48 CFR 1052.219-75 - Mentor Requirements and Evaluation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Mentor Requirements and... Mentor Requirements and Evaluation. As prescribed in DTAR 1019.202-70, insert the following clause: Mentor Requirements and Evaluation (JAN 2000) (a) Mentor and protégé firms shall submit an evaluation to...

  10. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

    An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. … This paper addresses the need for new techniques that enable the modeling and consistency checking for legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers…

  11. DECISION MAKING MODELING OF CONCRETE REQUIREMENTS

    Directory of Open Access Journals (Sweden)

    Suhartono Irawan

    2001-01-01

    This paper presents the results of an experimental comparison between predicted and actual concrete strength. The scope of the evaluation is the optimisation of the cement content for different concrete grades as a result of bringing the target mean value of test cubes closer to the required characteristic strength value by reducing the standard deviation. Keywords: concrete mix design, acceptance control, optimisation, cement content.
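
    The optimisation rests on a standard mix-design relation: the target mean strength equals the required characteristic strength plus a margin proportional to the standard deviation of the test cubes. The short Python sketch below works that arithmetic through with an assumed 5% defective rate; the grade and sigma values are illustrative, not the paper's data.

    ```python
    # Target mean strength = characteristic strength + k * sigma, where k ~ 1.64
    # is the one-sided normal quantile for a 5% chance of falling below spec.
    # Reducing sigma (better quality control) lowers the target mean and hence
    # the cement content needed to reach it.
    K_5_PERCENT = 1.64

    def target_mean(f_ck, sigma):
        return f_ck + K_5_PERCENT * sigma   # MPa

    for sigma in (8.0, 6.0, 4.0):           # tighter production control -> smaller sigma
        print(f"sigma={sigma} MPa -> target mean for C25: {target_mean(25.0, sigma):.1f} MPa")
    ```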

  12. Pragmatic geometric model evaluation

    Science.gov (United States)

    Pamer, Robert

    2015-04-01

    Quantification of subsurface model reliability is mathematically and technically demanding, as there are many different sources of uncertainty and some of the factors can be assessed merely in a subjective way. For many practical applications in industry or risk assessment (e.g. geothermal drilling), a quantitative estimation of possible geometric variations in depth units is preferred over relative numbers because of cost calculations for different scenarios. The talk gives an overview of several factors that affect the geometry of structural subsurface models that are based upon typical geological survey organization (GSO) data like geological maps, borehole data and conceptually driven construction of subsurface elements (e.g. fault network). Within the context of the trans-European project "GeoMol", uncertainty analysis has to be very pragmatic, not least because of differing data rights, data policies and modelling software between the project partners. In a case study, a two-step evaluation methodology for geometric subsurface model uncertainty is being developed. In a first step, several models of the same volume of interest have been calculated by omitting successively more and more input data types (seismic constraints, fault network, outcrop data). The positions of the various horizon surfaces are then compared. The procedure is equivalent to comparing data of various levels of detail and therefore structural complexity. This gives a measure of the structural significance of each data set in space, and as a consequence areas of geometric complexity are identified. These areas are usually very data sensitive; hence geometric variability in between individual data points in these areas is higher than in areas of low structural complexity. Instead of calculating a multitude of different models by varying some input data or parameters as is done by Monte-Carlo simulations, the aim of the second step of the evaluation procedure (which is part of the ongoing work) is to…
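
    Step one of the described procedure can be sketched in a few lines: stack the same horizon surface from model variants built with successively fewer input data types, and map the per-cell spread in depth as a proxy for structural sensitivity. In the Python sketch below the gridded horizon depths are synthetic placeholders, not GeoMol data.

    ```python
    # Compare horizon surfaces from model variants built on reduced input data;
    # the per-cell depth range across variants flags structurally sensitive areas.
    import numpy as np

    rng = np.random.default_rng(0)
    base = rng.normal(-1500.0, 50.0, size=(200, 200))            # full-data model, depth in m
    variants = np.stack([base,
                         base + rng.normal(0, 15, base.shape),   # e.g. no seismic constraints
                         base + rng.normal(0, 30, base.shape)])  # e.g. no fault network either

    depth_range = variants.max(axis=0) - variants.min(axis=0)    # per-cell spread in m
    print("median spread:", np.median(depth_range), "m;",
          "95th percentile:", np.percentile(depth_range, 95), "m")
    ```

    Mapping depth_range over the grid highlights where the model geometry depends strongly on individual data types, i.e. the areas of geometric complexity the abstract mentions.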

  13. 12 CFR 614.4260 - Evaluation requirements.

    Science.gov (United States)

    2010-01-01

    ... evaluated on the five basic credit factors, without considering the subject real estate, would support the credit decision that was based on other sources of repayment or collateral; (4) A lien on real estate is..., even with the advancement of new loan funds; (6) A Farm Credit System institution purchases a loan or...

  14. Evaluating the Impact of Design-Driven Requirements Using SysML

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research will develop SysML requirements modeling patterns and scripts to automate the evaluation of the impact of design-driven requirements…

  15. Designer's requirements for evaluation of sustainability

    DEFF Research Database (Denmark)

    Bey, Niki; Lenau, Torben Anker

    1998-01-01

    Today, the sustainability of products is often evaluated on the basis of assessments of their environmental performance. Established means for this purpose are formal Life Cycle Assessment (LCA) methods. Designers have an essential influence on product design and are therefore one target group for life cycle-based evaluation methods. However, the application of LCA in the design process, when for example different materials and manufacturing processes have to be selected, is difficult. This is, among other things, because only a few designers have a deeper background in this area, and even simplified LCAs involve calculations with a relatively high accuracy. Most LCA methods therefore do not qualify as hands-on tools for typical designers. In this context, the authors raise the question of whether a largely simplified LCA method based exclusively on energy considerations can…

  16. 48 CFR 552.219-76 - Mentor Requirements and Evaluation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Mentor Requirements and....219-76 Mentor Requirements and Evaluation. As prescribed in 519.7017(b), insert the following clause: Mentor Requirements and Evaluation (SEP 2009) (a) The purpose of the GSA Mentor-Protégé Program is for a...

  17. 48 CFR 752.219-71 - Mentor requirements and evaluation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Mentor requirements and....219-71 Mentor requirements and evaluation. As prescribed in AIDAR 719.273-11(b), insert the following clause: Mentor Requirements and Evaluation (July 13, 2007) (a) Mentor and Protégé firms shall submit an...

  18. 48 CFR 1852.219-79 - Mentor requirements and evaluation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Mentor requirements and... and Clauses 1852.219-79 Mentor requirements and evaluation. As prescribed in 1819.7215, insert the following clause: Mentor Requirements and Evaluation (MAY 2009) (a) The purpose of the NASA Mentor-Protégé...

  19. 38 CFR 21.8030 - Requirement for evaluation of child.

    Science.gov (United States)

    2010-07-01

    ... evaluation of child. 21.8030 Section 21.8030 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS... Certain Children of Vietnam Veterans-Spina Bifida and Covered Birth Defects Evaluation § 21.8030 Requirement for evaluation of child. (a) Children to be evaluated. The VR&E Division will evaluate each child...

  20. Modeling requirements for in situ vitrification

    International Nuclear Information System (INIS)

    MacKinnon, R.J.; Mecham, D.C.; Hagrman, D.L.; Johnson, R.W.; Murray, P.E.; Slater, C.E.; Marwil, E.S.; Weaver, R.A.; Argyle, M.D.

    1991-11-01

    This document outlines the requirements for the model being developed at the INEL which will provide analytical support for the ISV technology assessment program. The model includes representations of the electric potential field, thermal transport with melting, gas and particulate release, vapor migration, off-gas combustion and process chemistry. The modeling objectives are to (1) help determine the safety of the process by assessing the air and surrounding soil radionuclide and chemical pollution hazards, the nuclear criticality hazard, and the explosion and fire hazards, (2) help determine the suitability of the ISV process for stabilizing the buried wastes involved, and (3) help design laboratory and field tests and interpret results therefrom

  1. Requirements Modeling with the Aspect-oriented User Requirements Notation (AoURN): A Case Study

    Science.gov (United States)

    Mussbacher, Gunter; Amyot, Daniel; Araújo, João; Moreira, Ana

    The User Requirements Notation (URN) is a recent ITU-T standard that supports requirements engineering activities. The Aspect-oriented URN (AoURN) adds aspect-oriented concepts to URN, creating a unified framework that allows for scenario-based, goal-oriented, and aspect-oriented modeling. AoURN is applied to the car crash crisis management system (CCCMS), modeling its functional and non-functional requirements (NFRs). AoURN generally models all use cases, NFRs, and stakeholders as individual concerns and provides general guidelines for concern identification. AoURN handles interactions between concerns, capturing their dependencies and conflicts as well as the resolutions. We present a qualitative comparison of aspect-oriented techniques for scenario-based and goal-oriented requirements engineering. An evaluation carried out based on metrics adapted from the literature and a task-based evaluation suggests that AoURN models are more scalable than URN models and exhibit better modularity, reusability, and maintainability.

  2. Evaluating topic models with stability

    CSIR Research Space (South Africa)

    De Waal, A

    2008-11-01

    Topic models are unsupervised techniques that extract likely topics from text corpora, by creating probabilistic word-topic and topic-document associations. Evaluation of topic models is a challenge because (a) topic models are often employed…
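
    One stability-based evaluation consistent with the abstract's premise is to train the same model twice (different random seeds) and measure how well the two topic sets match. The Python sketch below matches topics across runs with the Hungarian algorithm on cosine similarities; the Dirichlet-sampled topic-word matrices stand in for real fitted models.

    ```python
    # Topic stability: optimally match topics across two runs by cosine similarity
    # of their word distributions, then average the matched similarities.
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def topic_stability(phi_a, phi_b):
        # phi_*: (n_topics, n_words) word distributions from two model runs
        a = phi_a / np.linalg.norm(phi_a, axis=1, keepdims=True)
        b = phi_b / np.linalg.norm(phi_b, axis=1, keepdims=True)
        sim = a @ b.T                                   # pairwise cosine similarities
        row, col = linear_sum_assignment(-sim)          # optimal one-to-one topic matching
        return sim[row, col].mean()                     # 1.0 = perfectly reproducible topics

    rng = np.random.default_rng(0)
    phi1 = rng.dirichlet(np.full(500, 0.1), size=20)    # stand-in for a fitted model
    phi2 = phi1[rng.permutation(20)] + rng.normal(0, 1e-4, (20, 500)).clip(min=0)
    print(f"stability ~ {topic_stability(phi1, phi2):.3f}")
    ```

    A model whose topics reappear across reruns scores near 1.0; unstable topic sets score lower regardless of how plausible any single run looks.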

  3. A nutrition mathematical model to account for dietary supply and requirements of energy and nutrients for domesticated small ruminants: the development and evaluation of the Small Ruminant Nutrition System

    Directory of Open Access Journals (Sweden)

    Luis Orlindo Tedeschi

    2008-07-01

    A mechanistic model that predicts nutrient requirements and biological values of feeds for sheep (the Cornell Net Carbohydrate and Protein System; CNCPS-S) was expanded to include goats, and the name was changed to the Small Ruminant Nutrition System (SRNS). The SRNS uses animal and environmental factors to predict metabolizable energy (ME) and protein, and Ca and P requirements. Requirements for goats in the SRNS are predicted based on the equations developed for CNCPS-S, modified to account for specific requirements of goats, including maintenance, lactation, and pregnancy requirements, and body reserves. Feed biological values are predicted based on carbohydrate and protein fractions and their ruminal fermentation rates, forage, concentrate and liquid passage rates, and microbial growth. The evaluation of the SRNS for sheep using published papers (19 treatment means) indicated no mean bias (MB; 1.1 g/100 g) and low root mean square prediction error (RMSPE; 3.6 g/100 g) when predicting dietary organic matter digestibility for diets not deficient in ruminal nitrogen. The SRNS accurately predicted gains and losses of shrunk body weight (SBW) of adult sheep (15 treatment means; MB = 5.8 g/d and RMSPE = 30 g/d) when diets were not deficient in ruminal nitrogen. The SRNS for sheep had MB varying from -34 to 1 g/d and RMSE varying from 37 to 56 g/d when predicting average daily gain (ADG) of growing lambs (42 treatment means). The evaluation of the SRNS for goats based on literature data showed accurate predictions for ADG of kids (31 treatment means; RMSEP = 32.5 g/d; r2 = 0.85; concordance correlation coefficient (CCC) = 0.91), daily ME intake (21 treatment means; RMSEP = 0.24 Mcal/d; r2 = 0.99; CCC = 0.99), and energy balance (21 treatment means; RMSEP = 0.20 Mcal/d; r2 = 0.87; CCC = 0.90) of goats. In conclusion, the SRNS for sheep can accurately predict dietary organic matter digestibility, ADG of growing lambs and changes in SBW of mature sheep. The SRNS…
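
    The agreement statistics quoted above (MB, RMSEP, r2 and Lin's concordance correlation coefficient, CCC) can be reproduced in a few lines. The Python sketch below computes them on invented paired values; the CCC expression is the standard formula, but the example numbers are not from the paper.

    ```python
    # Model-evaluation statistics on paired observed/predicted data:
    # mean bias (MB), RMSEP, r^2, and Lin's concordance correlation coefficient
    # CCC = 2*r*sx*sy / (sx^2 + sy^2 + (mx - my)^2).
    import numpy as np

    def agreement_stats(obs, pred):
        obs, pred = np.asarray(obs, float), np.asarray(pred, float)
        mb = (pred - obs).mean()
        rmsep = np.sqrt(((pred - obs) ** 2).mean())
        r = np.corrcoef(obs, pred)[0, 1]
        ccc = (2 * r * obs.std() * pred.std()
               / (obs.var() + pred.var() + (obs.mean() - pred.mean()) ** 2))
        return mb, rmsep, r ** 2, ccc

    obs = [120, 150, 90, 200, 170]       # e.g. observed ADG of kids, g/d (illustrative)
    pred = [110, 160, 95, 190, 180]
    print("MB=%.1f RMSEP=%.1f r2=%.2f CCC=%.2f" % agreement_stats(obs, pred))
    ```

    Unlike r2, the CCC penalizes systematic offsets between predictions and observations, which is why evaluations of this kind report both.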

  4. Gamified Requirements Engineering: Model and Experimentation

    NARCIS (Netherlands)

    Lombriser, Philipp; Dalpiaz, Fabiano; Lucassen, Garm; Brinkkemper, Sjaak

    2016-01-01

    [Context & Motivation] Engaging stakeholders in requirements engineering (RE) influences the quality of the requirements and ultimately of the system to-be. Unfortunately, stakeholder engagement is often insufficient, leading to too few, low-quality requirements. [Question/problem] We aim to

  5. Nuclear models relevant to evaluation

    International Nuclear Information System (INIS)

    Arthur, E.D.; Chadwick, M.B.; Hale, G.M.; Young, P.G.

    1991-01-01

    The widespread use of nuclear models continues in the creation of data evaluations. The reasons include the extension of data evaluations to higher energies, the creation of data libraries for isotopic components of natural materials, and the production of evaluations for radioactive target species. In these cases, experimental data are often sparse or nonexistent. As this trend continues, the nuclear models employed in evaluation work move towards more microscopically-based theoretical methods, prompted in part by the availability of increasingly powerful computational resources. Advances in nuclear models applicable to evaluation will be reviewed. These include advances in optical model theory, microscopic and phenomenological state and level density theory, unified models that consistently describe both equilibrium and nonequilibrium reaction mechanisms, and improved methodologies for the calculation of prompt radiation from fission. 84 refs., 8 figs

  6. Quick pace of property acquisitions requires two-stage evaluations

    International Nuclear Information System (INIS)

    Hollo, R.; Lockwood, S.

    1994-01-01

    The traditional method of evaluating oil and gas reserves may be too cumbersome for the quick pace of oil and gas property acquisition. An acquisition evaluator must decide quickly if a property meets basic purchase criteria. The current business climate requires a two-stage approach. First, the evaluator makes a quick assessment of the property and submits a bid. If the bid is accepted, the evaluator then proceeds with a detailed analysis, which represents the second stage. Acquisition of producing properties has become an important activity for many independent oil and gas producers, who must be able to evaluate reserves quickly enough to make effective business decisions yet accurately enough to avoid costly mistakes. Independents thus must be familiar with how transactions usually progress as well as with the basic methods of property evaluation. The paper discusses acquisition activity, the initial offer, the final offer, property evaluation, and fair market value.

  7. Evaluation and decision of products conceptual design schemes based on customer requirements

    International Nuclear Information System (INIS)

    Huang, Hong Zhong; Li, Yan Feng; Liu, Yu; Wang, Zhonglai; Liu, Wenhai

    2011-01-01

    Within the competitive market environment, understanding customer requirements is crucial for all corporations to obtain market share and survive competition. Only the products exactly meeting customer requirements can win in the market place. Therefore, customer requirements play a very important role in the evaluation and decision process of conceptual design schemes of products. In this paper, an evaluation and decision method based on customer requirements is presented. It utilizes the importance of customer requirements, the satisfaction degree of each evaluation metric to the specification, and an evaluation metric which models customer requirements to evaluate the satisfaction degree of each design scheme to specific customer requirements via the proposed BP neural networks. In the evaluation and decision process, fuzzy sets are used to describe the importance of customer requirements, the relationship between customer requirements and evaluation metrics, the satisfaction degree of each scheme to customer requirements, and the crisp set is used to describe the satisfaction degree of each metric to specifications. The effectiveness of the proposed method is demonstrated by an example of front suspension fork design of mountain bikes
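
    A minimal illustration of the BP (backpropagation) neural network idea in this abstract — mapping per-metric satisfaction degrees to an overall satisfaction score for a design scheme — is sketched below in plain numpy. The network size, training data and hidden "true" weighting are all invented; this is not the authors' implementation.

    ```python
    # Tiny MLP trained by backpropagation: inputs are satisfaction degrees of five
    # evaluation metrics for a design scheme; the output approximates an overall
    # satisfaction score learned from (invented) example evaluations.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1, (200, 5))                            # 5 metrics per scheme
    y = (X @ np.array([0.3, 0.1, 0.25, 0.2, 0.15]))[:, None]   # hidden "true" weighting

    W1, b1 = rng.normal(0, 0.5, (5, 8)), np.zeros(8)
    W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for _ in range(3000):                                      # batch gradient descent
        h = sigmoid(X @ W1 + b1)
        out = h @ W2 + b2
        delta2 = (out - y) / len(X)                            # grad of 0.5*MSE w.r.t. output
        delta1 = (delta2 @ W2.T) * h * (1 - h)                 # backprop through the sigmoid
        W2 -= 0.5 * (h.T @ delta2); b2 -= 0.5 * delta2.sum(0)
        W1 -= 0.5 * (X.T @ delta1); b1 -= 0.5 * delta1.sum(0)

    x_new = np.array([[0.9, 0.4, 0.7, 0.8, 0.6]])              # a new design scheme
    h_new = sigmoid(x_new @ W1 + b1)
    print("predicted satisfaction:", (h_new @ W2 + b2).item())
    ```

    In the paper's setting, the fuzzy importance of customer requirements would shape the training targets; here a fixed linear weighting plays that role purely for demonstration.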

  8. Evaluation and decision of products conceptual design schemes based on customer requirements

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Hong Zhong; Li, Yan Feng; Liu, Yu; Wang, Zhonglai [University of Electronic Science and Technology of China, Sichuan (China)]; Liu, Wenhai [China Science Patent Trademark Agents Ltd., Beijing (China)]

    2011-09-15

    Within the competitive market environment, understanding customer requirements is crucial for all corporations to obtain market share and survive competition. Only the products exactly meeting customer requirements can win in the market place. Therefore, customer requirements play a very important role in the evaluation and decision process of conceptual design schemes of products. In this paper, an evaluation and decision method based on customer requirements is presented. It utilizes the importance of customer requirements, the satisfaction degree of each evaluation metric to the specification, and an evaluation metric which models customer requirements to evaluate the satisfaction degree of each design scheme to specific customer requirements via the proposed BP neural networks. In the evaluation and decision process, fuzzy sets are used to describe the importance of customer requirements, the relationship between customer requirements and evaluation metrics, the satisfaction degree of each scheme to customer requirements, and the crisp set is used to describe the satisfaction degree of each metric to specifications. The effectiveness of the proposed method is demonstrated by an example of front suspension fork design of mountain bikes.

  9. Vehicle systems and payload requirements evaluation. [computer programs for identifying launch vehicle system requirements

    Science.gov (United States)

    Rea, F. G.; Pittenger, J. L.; Conlon, R. J.; Allen, J. D.

    1975-01-01

    Techniques developed for identifying launch vehicle system requirements for NASA automated space missions are discussed. Emphasis is placed on the development of computer programs and the investigation of astrionics for OSS missions and Scout. The Earth Orbit Mission Program - 1, which performs linear error analysis of launch vehicle dispersions for both vehicle and navigation system factors, is described along with the Interactive Graphic Orbit Selection program, which allows the user to select orbits that satisfy mission requirements and to evaluate the necessary injection accuracy.

  10. Site descriptive modelling - strategy for integrated evaluation

    International Nuclear Information System (INIS)

    Andersson, Johan

    2003-02-01

    The current document establishes the strategy to be used for achieving sufficient integration between disciplines in producing Site Descriptive Models during the Site Investigation stage. The Site Descriptive Model should be a multidisciplinary interpretation of geology, rock mechanics, thermal properties, hydrogeology, hydrogeochemistry, transport properties and ecosystems, using site investigation data from deep boreholes and from the surface as input. The modelling comprises the following iterative steps: evaluation of primary data, descriptive and quantitative modelling (in 3D), and overall confidence evaluation. Data are first evaluated within each discipline and then the evaluations are checked between the disciplines. Three-dimensional modelling (i.e. estimating the distribution of parameter values in space and its uncertainty) is carried out in a sequence, where the geometrical framework is taken from the geological model and in turn used by the rock mechanics, thermal and hydrogeological modelling, etc. The three-dimensional description should present the parameters with their spatial variability over a relevant and specified scale, with the uncertainty included in this description. Different alternative descriptions may be required. After the individual discipline modelling and uncertainty assessment, a phase of overall confidence evaluation follows. Relevant members of the different modelling teams assess the suggested uncertainties and evaluate the feedback. These discussions should assess overall confidence by checking that all relevant data are used, checking that information in past model versions is considered, checking that the different kinds of uncertainty are addressed, checking whether suggested alternatives make sense and whether there is potential for additional alternatives, and by discussing, if appropriate, how additional measurements (i.e. more data) would affect confidence. The findings as well as the modelling results are to be documented in a Site Description…

  11. From requirements to Java in a snap model-driven requirements engineering in practice

    CERN Document Server

    Smialek, Michal

    2015-01-01

    This book provides a coherent methodology for Model-Driven Requirements Engineering which stresses the systematic treatment of requirements within the realm of modelling and model transformations. The underlying basic assumption is that detailed requirements models are used as first-class artefacts playing a direct role in constructing software. To this end, the book presents the Requirements Specification Language (RSL) that allows precision and formality, which eventually permits automation of the process of turning requirements into a working system by applying model transformations and co…

  12. Modeling uncertainty in requirements engineering decision support

    Science.gov (United States)

    Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.

    2005-01-01

    One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support the elicitation of uncertain requirements and to deal with the combination of such information from multiple stakeholders.

  13. A goal-oriented requirements modelling language for enterprise architecture

    NARCIS (Netherlands)

    Quartel, Dick; Engelsman, W.; Jonkers, Henk; van Sinderen, Marten J.

    2009-01-01

    Methods for enterprise architecture, such as TOGAF, acknowledge the importance of requirements engineering in the development of enterprise architectures. Modelling support is needed to specify, document, communicate and reason about goals and requirements. Current modelling techniques for

  14. Requirements for an evaluation infrastructure for reliable pervasive healthcare research

    DEFF Research Database (Denmark)

    Wagner, Stefan Rahr; Toftegaard, Thomas Skjødeberg; Bertelsen, Olav W.

    2012-01-01

    The need for a non-intrusive evaluation infrastructure platform to support research on reliable pervasive healthcare in the unsupervised setting is analyzed and challenges and possibilities are identified. A list of requirements is presented and a solution is suggested that would allow researchers...

  15. Requirements for the evaluation of computational speech segregation systems

    DEFF Research Database (Denmark)

    May, Tobias; Dau, Torsten

    2014-01-01

    Recent studies on computational speech segregation reported improved speech intelligibility in noise when estimating and applying an ideal binary mask with supervised learning algorithms. However, an important requirement for such systems in technical applications is their robustness to acoustic...... associated with perceptual attributes in speech segregation. The results could help establish a framework for a systematic evaluation of future segregation systems....

  16. 13 CFR 303.3 - Application requirements and evaluation criteria.

    Science.gov (United States)

    2010-01-01

    Title 13 (Business Credit and Assistance), Economic Development Administration — § 303.3 Application requirements and evaluation criteria. The criteria include the involvement of the Region's business leadership at each stage of the preparation of the CEDS, short-term...

  17. Rock mechanics models evaluation report

    International Nuclear Information System (INIS)

    1987-08-01

    This report documents the evaluation of the thermal and thermomechanical models and codes for repository subsurface design and for design constraint analysis. The evaluation was based on a survey of the thermal and thermomechanical codes and models that are applicable to subsurface design, followed by a Kepner-Tregoe (KT) structured decision analysis of the codes and models. The primary recommendations of the analysis are that the DOT code be used for two-dimensional thermal analysis and that the STEALTH and HEATING 5/6 codes be used for three-dimensional and complicated two-dimensional thermal analysis. STEALTH and SPECTROM 32 are recommended for thermomechanical analyses. The other evaluated codes should be considered for use in certain applications. A separate review of salt creep models indicates that the commonly used exponential time law model is appropriate for use in repository design studies. 38 refs., 1 fig., 7 tabs

  18. Conceptual modelling of human resource evaluation process

    Directory of Open Access Journals (Sweden)

    Negoiţă Doina Olivia

    2017-01-01

    Taking into account the highly diverse tasks which employees have to fulfil due to the complex requirements of today's consumers, the human resource within an enterprise has become a strategic element for developing and exploiting products which meet market expectations. Therefore, organizations encounter difficulties when approaching the human resource evaluation process. Hence, the aim of the current paper is to design a conceptual model of the aforementioned process, which allows enterprises to develop a specific methodology. In order to design the conceptual model, Business Process Modelling instruments were employed - the Adonis Community Edition Business Process Management Toolkit using the ADONIS BPMS Notation. The conceptual model was developed based on in-depth secondary research regarding the human resource evaluation process. The proposed conceptual model represents a generic workflow (sequential and/or simultaneous activities), which can be extended considering the enterprise's needs regarding their requirements when conducting a human resource evaluation process. Enterprises can benefit from using software instruments for business process modelling as they enable process analysis and evaluation (predefined/specific queries) and also model optimization (simulations).

  19. Evaluation of animal models of neurobehavioral disorders

    Directory of Open Access Journals (Sweden)

    Nordquist Rebecca E

    2009-02-01

    Animal models play a central role in all areas of biomedical research. The process of animal model building, development and evaluation has rarely been addressed systematically, despite the long history of using animal models in the investigation of neuropsychiatric disorders and behavioral dysfunctions. An iterative, multi-stage trajectory for developing animal models and assessing their quality is proposed. The process starts with defining the purpose(s) of the model, preferentially based on hypotheses about brain-behavior relationships. Then, the model is developed and tested. The evaluation of the model takes scientific and ethical criteria into consideration. Model development requires a multidisciplinary approach. Preclinical and clinical experts should establish a set of scientific criteria which a model must meet. The scientific evaluation consists of assessing the replicability/reliability, predictive, construct and external validity/generalizability, and relevance of the model. We emphasize the role of (systematic and extended) replications in the course of the validation process. One may apply a multiple-tiered 'replication battery' to estimate the reliability/replicability, validity, and generalizability of results. Compromised welfare is inherent in many deficiency models in animals. Unfortunately, 'animal welfare' is a vaguely defined concept, making it difficult to establish exact evaluation criteria. Weighing the animal's welfare and considering whether action is indicated to reduce the discomfort must accompany the scientific evaluation at any stage of the model building and evaluation process. Animal model building should be discontinued if the model does not meet the preset scientific criteria, or when animal welfare is severely compromised. The application of the evaluation procedure is exemplified using the rat with neonatal hippocampal lesion as a proposed model of schizophrenia. In a manner congruent to

  20. Requirements model generation to support requirements elicitation: The Secure Tropos experience

    NARCIS (Netherlands)

    Kiyavitskaya, N.; Zannone, N.

    2008-01-01

    In recent years several efforts have been devoted by researchers in the Requirements Engineering community to the development of methodologies for supporting designers during requirements elicitation, modeling, and analysis. However, these methodologies often lack tool support to facilitate their

  1. Overview of contaminant arrival distributions as general evaluation requirements

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    The environmental consequences of subsurface contamination problems can be completely and effectively evaluated by fulfilling the following five requirements: Determine each present or future outflow boundary of contaminated groundwater; provide the location/arrival-time distributions; provide the location/outflow-quantity distributions; provide these distributions for each individual chemical or biological constituent of environmental importance; and use the arrival distributions to determine the quantity and concentration of each contaminant that will interface with the environment as time passes. The arrival distributions on which these requirements are based provide a reference point for communication among scientists and public decision makers by enabling complicated scientific analyses to be presented as simple summary relationships

  2. Mobility Models for Systems Evaluation

    Science.gov (United States)

    Musolesi, Mirco; Mascolo, Cecilia

    Mobility models are used to simulate and evaluate the performance of mobile wireless systems and of the algorithms and protocols that underlie them. The definition of realistic mobility models is one of the most critical and, at the same time, difficult aspects of the simulation of applications and systems designed for mobile environments. There are essentially two possible types of mobility patterns that can be used to evaluate mobile network protocols and algorithms by means of simulations: traces and synthetic models [130]. Traces are obtained by means of measurements of deployed systems and usually consist of logs of connectivity or location information, whereas synthetic models are mathematical models, such as sets of equations, which try to capture the movement of the devices.
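
    As a concrete illustration of the synthetic-model category, the sketch below generates a trace with the classic random-waypoint pattern. It is a minimal sketch, not a validated simulator; the area size, speed range and pause length are invented for the example.

```python
import math
import random

def random_waypoint(n_steps, area=(1000.0, 1000.0), speed=(0.5, 1.5), pause=5):
    """Toy random-waypoint trace: the node repeatedly picks a random
    destination and speed, walks there in unit-time steps, then pauses."""
    x, y = random.uniform(0, area[0]), random.uniform(0, area[1])
    trace = []
    while len(trace) < n_steps:
        gx, gy = random.uniform(0, area[0]), random.uniform(0, area[1])
        v = random.uniform(*speed)
        steps = max(1, int(math.hypot(gx - x, gy - y) / v))
        for i in range(1, steps + 1):
            trace.append((x + (gx - x) * i / steps, y + (gy - y) * i / steps))
        x, y = gx, gy
        trace.extend([(x, y)] * pause)   # dwell at the waypoint
    return trace[:n_steps]

positions = random_waypoint(100)
print(positions[:3])
```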

  3. Evaluation of a decontamination model

    International Nuclear Information System (INIS)

    Rippin, D.W.T.; Hanulik, J.; Schenker, E.; Ullrich, G.

    1981-02-01

    In the scale-up of a laboratory decontamination process difficulties arise due to the limited understanding of the mechanisms controlling the process. This paper contains some initial proposals which may contribute to the quantitative understanding of the chemical and physical factors which influence decontamination operations. General features required in a mathematical model to describe a fluid-solid reaction are discussed, and initial work is presented with a simple model which has had some success in describing the observed laboratory behaviour. (Auth.)

  4. Evaluation and qualification of novel control techniques with safety requirements

    International Nuclear Information System (INIS)

    Gossner, S.; Wach, D.

    1985-01-01

    The paper discusses questions related to the assessment and qualification of new I and C systems. The tasks of nuclear power plant I and C as well as the efficiency of the new techniques are reviewed. Problems with the application of new I and C and the state of application in Germany and abroad are addressed. Starting from the essential differences between conventional and new I and C systems, it is evaluated whether and in what way existing safety requirements can be met and to what extent new requirements need to be formulated. An overall concept has to be developed comprising the definition of graded requirement profiles for design and qualification. Associated qualification procedures and tools have to be adapted, developed and tuned to each other. (orig./HP) [de

  5. Experimental and theoretical requirements for fuel modelling

    International Nuclear Information System (INIS)

    Gatesoupe, J.P.

    1979-01-01

    From a scientific point of view it may be considered that any event in the life of a fuel pin under irradiation should be perfectly well understood and foreseen. From that deterministic point of view, the whole behaviour of the pin may be analysed and dismantled, with a specific function for every component part and each component part related to one basic phenomenon that can be independently studied on pure physical grounds. When extracted from the code structure, a subroutine is studied for itself by specialists who try to keep as close as possible to the physics involved in the phenomenon; that often leads to an impressive luxury of detail and a subsequent need for many unavailable input data. It might seem more secure to follow that approach, since it tries to be firmly based on theoretical grounds, if the phenomenological situation in the pin were less complex than it is. Such codes would not be adequate for off-normal operating conditions, since in accidental transient conditions the key phenomena are not the same as in steady-state or slow transient conditions. The orientation given to fuel modelling is based on our main technological constraints, which are: no fuel melting; no cladding failure; no excessive cladding deformation. In this context, the only relevant models are those which have a significant influence on the maximum temperatures in the fuel or on the cladding damage; hence the selection between key models and irrelevant models that is made next. A rather pragmatic view is kept on codification, with a special focus on a few determinant aspects of fuel behaviour and no attention to models which are nothing but decorative. Fuel modelling is considered as a link between items of experimental knowledge; it serves as a guide for further improvements in fuel design and so happens to be quite useful. On this basis, the main gaps in the understanding of fuel behaviour are described. These mainly concern: thermal transfer through

  6. Spreadsheet Decision Support Model for Training Exercise Material Requirements Planning

    National Research Council Canada - National Science Library

    Tringali, Arthur

    1997-01-01

    ... associated with military training exercises. The model combines the business practice of Material Requirements Planning and the commercial spreadsheet software capabilities of Lotus 1-2-3 to calculate the requirements for food, consumable...

  7. Extending enterprise architecture modelling with business goals and requirements

    NARCIS (Netherlands)

    Engelsman, W.; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten J.

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling

  8. Responsive and Open Learning Environments (ROLE): Requirements, Evaluation and Reflection

    Directory of Open Access Journals (Sweden)

    Effie Lai-Chong Law

    2013-02-01

    Coordinating requirements engineering (RE) and evaluation studies across heterogeneous technology-enhanced learning (TEL) environments is deemed challenging, because each of them is situated in a specific organizational, technical and socio-cultural context. We have dealt with such challenges in the ROLE project (http://www.role-project.eu/), in which five test-beds are involved in deploying and evaluating Personal Learning Environments (PLEs). They include Higher Education Institutions (HEIs) and global enterprises in and beyond Europe, representing a range of values and assumptions. While the diversity provides fertile grounds for validating our research ideas, it poses many challenges for conducting comparison studies. In the paper, we first provide an overview of the ROLE project, focusing on its missions and aims. Next we present a Web 2.0-inspired RE approach called Social Requirements Engineering (SRE). Then we depict our initial attempts to evaluate the ROLE framework and report some preliminary findings. One major outcome is that the technology adoption process must work on the basis of existing LMSs, extending them with ROLE functionality rather than embracing LMS functionality in ROLE.

  9. Extending enterprise architecture modelling with business goals and requirements

    Science.gov (United States)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  10. Ethics Requirement Score: new tool for evaluating ethics in publications.

    Science.gov (United States)

    Santos, Lígia Gabrielle dos; Costa e Fonseca, Ana Carolina da; Bica, Claudia Giuliano

    2014-01-01

    To analyze ethical standards considered by health-related scientific journals, and to prepare the Ethics Requirement Score, a bibliometric index to be applied to scientific healthcare journals in order to evaluate criteria for ethics in scientific publication. Journals related to healthcare selected from the Journal of Citation Reports™ 2010 database were considered as experimental units. Parameters related to publication ethics were analyzed for each journal. These parameters were acquired by analyzing the author's guidelines or instructions on each journal's website. The parameters considered were approval by an Internal Review Board, Declaration of Helsinki or Resolution 196/96, recommendations on plagiarism, need for application of Informed Consent Forms with the volunteers, declaration of confidentiality of patients, record in the database for clinical trials (if applicable), conflict of interest disclosure, and funding sources statement. Each item was analyzed considering its presence or absence. The foreign journals had a significantly higher Impact Factor than the Brazilian journals; however, no significant differences were observed in relation to the Ethics Requirement Score. There was no correlation between the Ethics Requirement Score and the Impact Factor. Although the Impact Factor of foreign journals was considerably higher than that of the Brazilian publications, the results showed that the Impact Factor has no correlation with the proposed score. This allows us to state that the ethical requirements for publication in biomedical journals are not related to the comprehensiveness or scope of the journal.

  11. Ethics Requirement Score: new tool for evaluating ethics in publications

    Science.gov (United States)

    dos Santos, Lígia Gabrielle; Fonseca, Ana Carolina da Costa e; Bica, Claudia Giuliano

    2014-01-01

    Objective To analyze ethical standards considered by health-related scientific journals, and to prepare the Ethics Requirement Score, a bibliometric index to be applied to scientific healthcare journals in order to evaluate criteria for ethics in scientific publication. Methods Journals related to healthcare selected from the Journal of Citation Reports™ 2010 database were considered as experimental units. Parameters related to publication ethics were analyzed for each journal. These parameters were acquired by analyzing the author's guidelines or instructions on each journal's website. The parameters considered were approval by an Internal Review Board, Declaration of Helsinki or Resolution 196/96, recommendations on plagiarism, need for application of Informed Consent Forms with the volunteers, declaration of confidentiality of patients, record in the database for clinical trials (if applicable), conflict of interest disclosure, and funding sources statement. Each item was analyzed considering its presence or absence. Result The foreign journals had a significantly higher Impact Factor than the Brazilian journals; however, no significant differences were observed in relation to the Ethics Requirement Score. There was no correlation between the Ethics Requirement Score and the Impact Factor. Conclusion Although the Impact Factor of foreign journals was considerably higher than that of the Brazilian publications, the results showed that the Impact Factor has no correlation with the proposed score. This allows us to state that the ethical requirements for publication in biomedical journals are not related to the comprehensiveness or scope of the journal. PMID:25628189

  12. Systems and context modeling approach to requirements analysis

    Science.gov (United States)

    Ahuja, Amrit; Muralikrishna, G.; Patwari, Puneet; Subhrojyoti, C.; Swaminathan, N.; Vin, Harrick

    2014-08-01

    Ensuring completeness and correctness of the requirements for a complex system such as the SKA is challenging. Current system engineering practice includes developing a stakeholder needs definition, a concept of operations, and defining system requirements in terms of use cases and requirements statements. We present a method that enhances this current practice into a collection of system models with mutual consistency relationships. These include stakeholder goals, needs definition and system-of-interest models, together with a context model that participates in the consistency relationships among these models. We illustrate this approach by using it to analyze the SKA system requirements.

  13. Evaluation of the effect of torsemide on warfarin dosage requirements.

    Science.gov (United States)

    Lai, Sophia; Momper, Jeremiah D; Yam, Felix K

    2017-08-01

    Background According to drug interaction databases, torsemide may potentiate the effects of warfarin. Evidence for this drug-drug interaction, however, is conflicting and the clinical significance is unknown. Objective The aim of this study is to evaluate the impact of torsemide initiation on warfarin dosage requirements. Setting This study was conducted at the Veterans Affairs Healthcare System in San Diego, California. Method A retrospective cohort study was conducted using Veterans Affairs data from patients who were converted from bumetanide to torsemide between March 2014 and July 2014. Patients were also prescribed and taking warfarin during the observation period. Warfarin dosage requirements were evaluated to determine if any changes occurred within the first 3 months of starting torsemide. Main outcome measure The primary outcome was the average weekly warfarin dose before and after torsemide initiation. Results Eighteen patients met study inclusion criteria. The weekly warfarin dose before and after initiation of torsemide was not significantly different (34 ± 15 and 34 ± 13 mg, p > 0.05). Of those eighteen patients, only two experienced elevations in INR that required a decrease in warfarin dosage after torsemide initiation. Between those two patients, dosage reductions ranged from 5.3 to 18%. Conclusion These results indicated that most patients did not require any warfarin dosage adjustments after torsemide was initiated. The potential for interaction, however, still exists. While empiric warfarin dosage adjustments are not recommended when initiating torsemide, increased monitoring is warranted to minimize the risk of adverse effects.

  14. Customer requirement modeling and mapping of numerical control machine

    Directory of Open Access Journals (Sweden)

    Zhongqi Sheng

    2015-10-01

    In order to better obtain information about customer requirements and develop products that meet them, it is necessary to systematically analyze and handle the customer requirements. This article takes the product service system of a numerical control machine as its research object and studies customer requirement modeling and mapping oriented toward configuration design. It introduces the concept of the requirement unit, expounds the customer requirement decomposition rules, and establishes a customer requirement model; it builds the house of quality using quality function deployment and determines the weights of the technical features of the product and service; it explores the relevance rules between data using rough set theory, establishes a rule database, and solves for the target values of the technical features of the product. Using an economical turning center series numerical control machine as an example, it verifies the rationality of the proposed customer requirement model.

  15. Using cognitive modeling for requirements engineering in anesthesiology

    NARCIS (Netherlands)

    Pott, C; le Feber, J

    2005-01-01

    Cognitive modeling is a complexity reducing method to describe significant cognitive processes under a specified research focus. Here, a cognitive process model for decision making in anesthesiology is presented and applied in requirements engineering. Three decision making situations of

  16. Resident Evaluation of a Required Telepsychiatry Clinical Experience.

    Science.gov (United States)

    Teshima, John; Hodgins, Michael; Boydell, Katherine M; Pignatiello, Antonio

    2016-04-01

    The authors explored resident experiences of telepsychiatry clinical training. This paper describes an analysis of evaluation forms completed by psychiatry residents following a required training experience in telepsychiatry. Retrospective numeric and narrative data were collected from 2005 to 2012. Using a five-point Likert-type scale (1 = strongly disagree and 5 = strongly agree), residents rated the session on the following characteristics: the overall experience, interest in participating in telepsychiatry in the future, understanding of service provision to underserved areas, telepsychiatry as a mode of service delivery, and the unique aspects of telepsychiatry work. The authors also conducted a content analysis of narrative comments in response to open-ended questions about the positive and negative aspects of the training experience. In all, 88% of residents (n = 335) completed an anonymous evaluation following their participation in telepsychiatry consultation sessions. Numeric results were mostly positive and indicated that the experience was interesting and enjoyable, enhanced interest in participating in telepsychiatry in the future, and increased understanding of providing psychiatric services to underserved communities. Narrative data demonstrated that the most valuable aspects of training included the knowledge acquired in terms of establishing rapport and engaging with patients, using the technology, working collaboratively, identifying different approaches used, and awareness of the complexity of cases. Resident desire for more training of this nature was prevalent, specifically a wish for more detail, additional time for discussion and debriefing, and further explanation of the unique aspects of telepsychiatry as a mode of delivery. More evaluation of telepsychiatry training, elective or required, is needed. The context of this training offered potential side benefits of learning about interprofessional and collaborative care for the

  17. Capabilities and requirements for modelling radionuclide transport in the geosphere

    International Nuclear Information System (INIS)

    Paige, R.W.; Piper, D.

    1989-02-01

    This report gives an overview of geosphere flow and transport models suitable for use by the Department of the Environment in the performance assessment of radioactive waste disposal sites. An outline methodology for geosphere modelling is proposed, consisting of a number of different types of model. A brief description of each of the component models is given, indicating the purpose of the model, the processes being modelled and the methodologies adopted. Areas requiring development are noted. (author)

  18. Multi-criteria evaluation of hydrological models

    Science.gov (United States)

    Rakovec, Oldrich; Clark, Martyn; Weerts, Albrecht; Hill, Mary; Teuling, Ryan; Uijlenhoet, Remko

    2013-04-01

    Over the last years, there has been a tendency in the hydrological community to move from simple conceptual models towards more complex, physically/process-based hydrological models. This is because conceptual models often fail to simulate the dynamics of the observations. However, there is little agreement on how much complexity needs to be considered within the complex process-based models. One way to proceed is to improve understanding of what is important and unimportant in the models considered. The aim of this ongoing study is to evaluate structural model adequacy using alternative conceptual and process-based models of hydrological systems, with an emphasis on understanding how model complexity relates to observed hydrological processes. Some of the models require considerable execution time, and computationally frugal sensitivity analysis, model calibration and uncertainty quantification methods are well-suited to providing important insights for models with lengthy execution times. The current experiment evaluates two versions of the Framework for Understanding Structural Errors (FUSE), both of which enable model inter-comparison experiments. One supports computationally efficient conceptual models, and the second supports more process-based models that tend to have longer execution times. The conceptual FUSE combines components of 4 existing conceptual hydrological models. The process-based framework consists of different forms of Richards' equation, numerical solutions, groundwater parameterizations and hydraulic conductivity distributions. The hydrological analysis of the model processes has evolved from focusing only on simulated runoff (the final model output) to also including other criteria such as soil moisture and groundwater levels. Parameter importance and associated structural importance are evaluated using different types of sensitivity analysis techniques, making use of both robust global methods (e.g. Sobol') as well as several
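
    To make the global sensitivity idea concrete, the sketch below estimates first-order Sobol' indices for the standard Ishigami benchmark with a pick-and-freeze estimator. The test function, bounds and sample size are illustrative and unrelated to the FUSE experiment itself.

```python
import numpy as np

def ishigami(x):
    """Standard Ishigami test function, a common sensitivity-analysis benchmark."""
    return np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1])**2 + 0.1 * x[:, 2]**4 * np.sin(x[:, 0])

rng = np.random.default_rng(0)
n, d = 100_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))      # two independent sample matrices
B = rng.uniform(-np.pi, np.pi, (n, d))
fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                     # re-sample only input i
    Si = np.mean(fB * (ishigami(ABi) - fA)) / var   # first-order index
    print(f"S{i + 1} ~= {Si:.3f}")          # expect roughly 0.31, 0.44, 0.00
```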

  19. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Eve...... requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach....

  20. Model evaluation methodology applicable to environmental assessment models

    International Nuclear Information System (INIS)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes
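
    The abstract singles out multiplicative chain models with independent lognormal inputs and Latin hypercube sampling. Below is a minimal sketch of both ideas, assuming SciPy's qmc module is available; the parameter values are invented for illustration and are not from the report.

```python
import numpy as np
from scipy.stats import lognorm, qmc

# Toy multiplicative chain: y = x1 * x2 * x3, each factor independent and
# lognormal. The log-scale means and sigmas below are invented.
mu = np.array([0.0, -1.6, 3.9])
sigma = np.array([0.5, 0.8, 0.3])

u = qmc.LatinHypercube(d=3, seed=1).random(n=1000)   # stratified uniforms in [0, 1)
x = lognorm.ppf(u, s=sigma, scale=np.exp(mu))        # inverse-CDF transform per column
y = x.prod(axis=1)

# A product of independent lognormals is itself lognormal, so the sample
# median of y should approach exp(mu.sum()).
print(np.median(y), np.exp(mu.sum()))
```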

  1. Assessment of compliance with regulatory requirements for a best estimate methodology for evaluation of ECCS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Un Chul; Jang, Jin Wook; Lim, Ho Gon; Jeong, Ik [Seoul National Univ., Seoul (Korea, Republic of); Sim, Suk Ku [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2000-03-15

    The best-estimate methodology for evaluation of ECCS proposed by KEPCO (KREM) uses a thermal-hydraulic best-estimate code, and the topical report for the methodology states that it meets the regulatory requirements of the USNRC regulatory guide. In this research, the assessment of compliance with regulatory requirements for the methodology is performed. The state of licensing procedures in other countries and the best-estimate evaluation methodologies of Europe are also investigated. The applicability of the models and the propriety of the uncertainty analysis procedure of KREM are appraised, and compliance with the USNRC regulatory guide is assessed.

  2. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  3. GENERAL REQUIREMENTS FOR SIMULATION MODELS IN WASTE MANAGEMENT

    International Nuclear Information System (INIS)

    Miller, Ian; Kossik, Rick; Voss, Charlie

    2003-01-01

    Most waste management activities are decided upon and carried out in a public or semi-public arena, typically involving the waste management organization, one or more regulators, and often other stakeholders and members of the public. In these environments, simulation modeling can be a powerful tool in reaching a consensus on the best path forward, but only if the models that are developed are understood and accepted by all of the parties involved. These requirements for understanding and acceptance of the models constrain the appropriate software and model development procedures that are employed. This paper discusses requirements for both simulation software and for the models that are developed using the software. Requirements for the software include transparency, accessibility, flexibility, extensibility, quality assurance, ability to do discrete and/or continuous simulation, and efficiency. Requirements for the models that are developed include traceability, transparency, credibility/validity, and quality control. The paper discusses these requirements with specific reference to the requirements for performance assessment models that are used for predicting the long-term safety of waste disposal facilities, such as the proposed Yucca Mountain repository

  4. Spreadsheet Decision Support Model for Training Exercise Material Requirements Planning

    National Research Council Canada - National Science Library

    Tringali, Arthur

    1997-01-01

    This thesis focuses on developing a spreadsheet decision support model that can be used by combat engineer platoon and company commanders in determining the material requirements and estimated costs...

  5. Commonsense Psychology and the Functional Requirements of Cognitive Models

    National Research Council Canada - National Science Library

    Gordon, Andrew S

    2005-01-01

    In this paper we argue that previous models of cognitive abilities (e.g. memory, analogy) have been constructed to satisfy functional requirements of implicit commonsense psychological theories held by researchers and nonresearchers alike...

  6. Evaluation Model for Sentient Cities

    Directory of Open Access Journals (Sweden)

    Mª Florencia Fergnani Brion

    2016-11-01

    In this article we present research on Sentient Cities and propose an assessment model for analysing whether a city is, or could potentially be considered, one. It can be used to evaluate the current situation of a city before introducing urban policies based on citizen participation in hybrid (physical and digital) environments. To that effect, we have developed evaluation grids with the main elements that form a Sentient City and their measurement values. The Sentient City is a variation of the Smart City, also based on technological progress and innovation, but where the citizens are the principal agents. In this model, governments aim to have a participatory and sustainable system for achieving the Knowledge Society and the development of Collective Intelligence, as well as the city's efficiency. They also increase communication channels between the Administration and citizens. In this new context, citizens are empowered because they have the opportunity to create a Local Identity and transform their surroundings through open and horizontal initiatives.

  7. A Compositional Knowledge Level Process Model of Requirements Engineering

    NARCIS (Netherlands)

    Herlea, D.E.; Jonker, C.M.; Treur, J.; Wijngaards, W.C.A.

    2002-01-01

    In current literature few detailed process models for Requirements Engineering are presented: usually high-level activities are distinguished, without a more precise specification of each activity. In this paper the process of Requirements Engineering has been analyzed using knowledge-level

  8. The Spiral-Interactive Program Evaluation Model.

    Science.gov (United States)

    Khaleel, Ibrahim Adamu

    1988-01-01

    Describes the spiral interactive program evaluation model, which is designed to evaluate vocational-technical education programs in secondary schools in Nigeria. Program evaluation is defined; utility oriented and process oriented models for evaluation are described; and internal and external evaluative factors and variables that define each…

  9. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against an independent dataset to that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. (3) Model structural inadequacies, whereby model structure may inadequately represent
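
    For reference, the Nash-Sutcliffe efficiency criticized above is straightforward to compute; this is the standard textbook formula, not code from the study.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model is
    no better than predicting the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim)**2) / np.sum((obs - obs.mean())**2)

print(nash_sutcliffe([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]))  # ~0.98
```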

  10. Simulations, evaluations and models. Vol. 1

    International Nuclear Information System (INIS)

    Brehmer, B.; Leplat, J.

    1992-01-01

    Papers presented at the Fourth MOHAWC (Models of Human Activities in Work Context) workshop. The general theme was simulations, evaluations and models. The emphasis was on time in relation to the modelling of human activities in modern high-tech work. Such work often requires people to control dynamic systems, and the behaviour and misbehaviour of these systems in time is a principal focus of work in, for example, a modern process plant. The papers report on microworlds and on their innovative uses, both in the form of experiments and in the form of a new kind of use, that of testing a program which performs diagnostic reasoning. They present new perspectives on the problem of time in process control, showing the importance of considering the time scales of dynamic tasks, both in individual decision making and in distributed decision making, and provide new formalisms, both for the representation of time and for reasoning involving time in diagnosis. (AB)

  11. Evaluation of onset of nucleate boiling models

    Energy Technology Data Exchange (ETDEWEB)

    Huang, LiDong [Heat Transfer Research, Inc., College Station, TX (United States)], e-mail: lh@htri.net

    2009-07-01

    This article discusses available models and correlations for predicting the required heat flux or wall superheat for the Onset of Nucleate Boiling (ONB) on plain surfaces. It reviews ONB data in the open literature and discusses the continuing efforts of Heat Transfer Research, Inc. in this area. Our ONB database contains ten individual sources for ten test fluids and a wide range of operating conditions for different geometries, e.g., tube side and shell side flow boiling and falling film evaporation. The article also evaluates literature models and correlations based on the data: no single model in the open literature predicts all data well. The prediction uncertainty is especially higher in vacuum conditions. Surface roughness is another critical criterion in determining which model should be used. However, most models do not directly account for surface roughness, and most investigators do not provide surface roughness information in their published findings. Additional experimental research is needed to improve confidence in predicting the required wall superheats for nucleation boiling for engineering design purposes. (author)
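
    As an illustration of the kind of model under discussion, the sketch below evaluates the classical Davis-Anderson relation for ONB wall superheat. The water property values are rough 1 atm estimates supplied for the example; they are not taken from the article or the HTRI database.

```python
import math

def onb_superheat(q_flux, sigma=0.0589, t_sat=373.15, rho_v=0.598,
                  h_fg=2.257e6, k_l=0.679):
    """Davis-Anderson estimate of the wall superheat (K) at onset of
    nucleate boiling for a given heat flux q_flux (W/m^2). Property
    defaults approximate saturated water at 1 atm, for illustration only."""
    return math.sqrt(8.0 * sigma * t_sat * q_flux / (rho_v * h_fg * k_l))

print(onb_superheat(1.0e5))  # roughly 4 K of superheat at 100 kW/m^2
```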

  12. Evaluation of onset of nucleate boiling models

    International Nuclear Information System (INIS)

    Huang, LiDong

    2009-01-01

    This article discusses available models and correlations for predicting the required heat flux or wall superheat for the Onset of Nucleate Boiling (ONB) on plain surfaces. It reviews ONB data in the open literature and discusses the continuing efforts of Heat Transfer Research, Inc. in this area. Our ONB database contains ten individual sources for ten test fluids and a wide range of operating conditions for different geometries, e.g., tube side and shell side flow boiling and falling film evaporation. The article also evaluates literature models and correlations based on the data: no single model in the open literature predicts all data well. The prediction uncertainty is especially higher in vacuum conditions. Surface roughness is another critical criterion in determining which model should be used. However, most models do not directly account for surface roughness, and most investigators do not provide surface roughness information in their published findings. Additional experimental research is needed to improve confidence in predicting the required wall superheats for nucleation boiling for engineering design purposes. (author)

  13. Irrigation Requirement Estimation Using Vegetation Indices and Inverse Biophysical Modeling

    Science.gov (United States)

    Bounoua, Lahouari; Imhoff, Marc L.; Franks, Shannon

    2010-01-01

    We explore an inverse biophysical modeling process forced by satellite and climatological data to quantify irrigation requirements in semi-arid agricultural areas. We constrain the carbon and water cycles modeled under both equilibrium (balance between vegetation and climate) and non-equilibrium (water added through irrigation) conditions. We postulate that the degree to which irrigated dry lands vary from equilibrium climate conditions is related to the amount of irrigation. The amount of water required over and above precipitation is considered the irrigation requirement. For July, results show that spray irrigation resulted in an additional amount of water of 1.3 mm per occurrence with a frequency of 24.6 hours. In contrast, drip irrigation required only 0.6 mm every 45.6 hours, or 46% of that simulated for spray irrigation. The modeled estimates account for 87% of the total reported irrigation water use where soil salinity is not important, and 66% in saline lands.
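
    The core bookkeeping behind an irrigation requirement is simple even though the paper's inverse biophysical model is not: whatever modeled water demand exceeds precipitation must be supplied as irrigation. The toy sketch below shows only that bookkeeping, with invented monthly numbers; it does not reimplement the authors' approach.

```python
# Toy monthly bookkeeping: modeled crop water demand in excess of
# precipitation is treated as the irrigation requirement.
et_demand_mm = [95.0, 110.0, 140.0, 160.0, 150.0, 120.0]  # modeled ET demand
precip_mm    = [30.0,  25.0,  10.0,   5.0,  12.0,  28.0]  # observed precipitation

irrigation_mm = [max(0.0, et - p) for et, p in zip(et_demand_mm, precip_mm)]
print(irrigation_mm)          # per-month requirement
print(sum(irrigation_mm))     # seasonal total
```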

  14. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    . There are however many advantages that could be harvested from such knowledge like size, cost and efficiency improvements. In this paper a recently proposed power requirement model for active loudspeakers is experimentally validated and the model is expanded to include the closed and vented type enclosures...

  15. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN-model-based business process simulation problems. The paper formulates requirements for business process and resource models to enable their use for business process simulation.

  16. Ohio River navigation investment model: Requirements and model design

    Energy Technology Data Exchange (ETDEWEB)

    Bronzini, M.S.; Curlee, T.R.; Leiby, P.N.; Southworth, F.; Summers, M.S.

    1998-01-01

    Oak Ridge National Laboratory is assisting the US Army Corps of Engineers in improving its economic analysis procedures for evaluation of inland waterway investment projects along the Ohio River System. This paper describes the context and design of an integrated approach to calculating the system-wide benefits from alternative combinations of lock and channel improvements, providing an ability to project the cost savings from proposed waterway improvements in capacity and reliability for up to fifty years into the future. The design contains an in-depth treatment of the levels of risk and uncertainty associated with different multi-year lock and channel improvement plans, including the uncertainty that results from a high degree of interaction between the many different waterway system components.

  17. A Functional Model of Quality Assurance for Psychiatric Hospitals and Corresponding Staffing Requirements.

    Science.gov (United States)

    Kamis-Gould, Edna; And Others

    1991-01-01

    A model for quality assurance (QA) in psychiatric hospitals is described. Its functions (general QA, utilization review, clinical records, evaluation, management information systems, risk management, and infection control), subfunctions, and corresponding staffing requirements are reviewed. This model was designed to foster standardization in QA…

  18. Evaluation of Models of the Reading Process.

    Science.gov (United States)

    Balajthy, Ernest

    A variety of reading process models have been proposed and evaluated in reading research. Traditional approaches to model evaluation specify the workings of a system in a simplified fashion to enable organized, systematic study of the system's components. Following are several statistical methods of model evaluation: (1) empirical research on…

  19. Models of Human Information Requirements: "When Reasonable Aiding Systems Disagree"

    Science.gov (United States)

    Corker, Kevin; Pisanich, Gregory; Shafto, Michael (Technical Monitor)

    1994-01-01

    Aircraft flight management and Air Traffic Control (ATC) automation are under development to maximize the economy of flight and to increase the capacity of the terminal area airspace while maintaining levels of flight safety equal to or better than current system performance. These goals are being realized by the introduction of flight management automation aiding and operations support systems on the flight deck and by new developments of ATC aiding systems that seek to optimize scheduling of aircraft while potentially reducing required separation and accounting for weather and wake vortex turbulence. Aiding systems on both the flight deck and the ground operate through algorithmic functions on models of the aircraft and of the airspace. These models may differ from each other as a result of variations in their models of the immediate environment. The resultant flight operations or ATC commands may differ in their response requirements (e.g. different preferred descent speeds or descent initiation points). The human operators in the system must then interact with the automation to reconcile differences and resolve conflicts. We have developed a model of human performance including cognitive functions (decision-making, rule-based reasoning, procedural interruption recovery and forgetting) that supports analysis of the information requirements for resolution of flight aiding and ATC conflicts. The model represents multiple individuals in the flight crew and in ATC. The model is supported in simulation on a Silicon Graphics' workstation using Allegro Lisp. Design guidelines for aviation automation aiding systems have been developed using the model's specification of information and team procedural requirements. Empirical data on flight deck operations from full-mission flight simulation are provided to support the model's predictions. The paper describes the model, its development and implementation, the simulation test of the model predictions, and the empirical

  20. Evaluating Predictive Models of Software Quality

    Science.gov (United States)

    Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.

    2014-06-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to deliver only software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and finally we concluded by suggesting directions for further studies.
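
    One way to picture the release-gating idea is a toy risk classifier that holds back any module whose predicted fault probability exceeds an agreed threshold. Everything below (features, data, threshold) is invented for illustration; it is not one of the EMI models evaluated in the article.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy defect-risk gate: predict fault-proneness from two code metrics
# (say, churn and cyclomatic complexity) on synthetic data.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 2))                        # standardized metrics
y = (X @ np.array([1.2, 0.8]) + rng.normal(scale=0.7, size=200) > 0).astype(int)

clf = LogisticRegression().fit(X, y)
risk = clf.predict_proba(X[:5])[:, 1]                # per-module fault probability
THRESHOLD = 0.3                                      # agreed release threshold
print(["hold" if r > THRESHOLD else "release" for r in risk])
```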

  1. Evaluating predictive models of software quality

    International Nuclear Information System (INIS)

    Ciaschini, V; Canaparo, M; Ronchieri, E; Salomoni, D

    2014-01-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to deliver only software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and finally we concluded by suggesting directions for further studies.

  2. Model Performance Evaluation and Scenario Analysis (MPESA)

    Science.gov (United States)

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM)

  3. Sizing and scaling requirements of a large-scale physical model for code validation

    International Nuclear Information System (INIS)

    Khaleel, R.; Legore, T.

    1990-01-01

    Model validation is an important consideration in application of a code for performance assessment and therefore in assessing the long-term behavior of the engineered and natural barriers of a geologic repository. Scaling considerations relevant to porous media flow are reviewed. An analysis approach is presented for determining the sizing requirements of a large-scale, hydrology physical model. The physical model will be used to validate performance assessment codes that evaluate the long-term behavior of the repository isolation system. Numerical simulation results for sizing requirements are presented for a porous medium model in which the media properties are spatially uncorrelated

  4. Formal Requirements Modeling for Reactive Systems with Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon

    This dissertation presents the contributions of seven publications all concerned with the application of Coloured Petri Nets (CPN) to requirements modeling for reactive systems. The publications are introduced along with relevant background material and related work, and their contributions...... to take into concern that the behavior of human actors is less likely to be predictable than the behavior of e.g. mechanical components.   In the second approach, the CPN model is parameterized and utilizes a generic and reusable CPN module operating as an SD interpreter. In addition to distinguishing...... and events. A tool is presented that allows automated validation of the structure of CPN models with respect to the guidelines. Next, three publications on integrating Jackson's Problem Frames with CPN requirements models are presented: The first publication introduces a method for systematically structuring...

  5. Model-based human reliability analysis: prospects and requirements

    International Nuclear Information System (INIS)

    Mosleh, A.; Chang, Y.H.

    2004-01-01

    Major limitations of the conventional methods for human reliability analysis (HRA), particularly those developed for operator response analysis in probabilistic safety assessments (PSA) of nuclear power plants, are summarized as a motivation for, and a basis for developing requirements for, the next generation of HRA methods. It is argued that a model-based approach that provides explicit cognitive causal links between operator behaviors and directly or indirectly measurable causal factors should be at the core of the advanced methods. An example of such a causal model is briefly reviewed which, due to its complexity and input requirements, can currently be implemented only in a dynamic PSA environment. The computer simulation code developed for this purpose is also described briefly, together with current limitations in the models, data, and the computer implementation.

  6. 45 CFR 2519.800 - What are the evaluation requirements for Higher Education programs?

    Science.gov (United States)

    2010-10-01

    45 Public Welfare 4, 2010-10-01 edition: (Continued) Corporation for National and Community Service, Higher Education Innovative Programs for Community Service, Evaluation Requirements, § 2519.800 What are the evaluation requirements for Higher Education programs?

  7. Critical Business Requirements Model and Metrics for Intranet ROI

    OpenAIRE

    Luqi; Jacoby, Grant A.

    2005-01-01

    Journal of Electronic Commerce Research, Vol. 6, No. 1, pp. 1-30. This research provides the first theoretical model, the Intranet Efficiency and Effectiveness Model (IEEM), to measure intranet overall value contributions based on a corporation’s critical business requirements by applying a balanced baseline of metrics and conversion ratios linked to key business processes of knowledge workers, IT managers and business decision makers -- in effect, closing the gap of understanding...

  8. NASA Standard for Models and Simulations: Philosophy and Requirements Overview

    Science.gov (United States)

    Blattnig, Steve R.; Luckring, James M.; Morrison, Joseph H.; Sylvester, Andre J.; Tripathi, Ram K.; Zang, Thomas A.

    2013-01-01

    Following the Columbia Accident Investigation Board report, the NASA Administrator chartered an executive team (known as the Diaz Team) to identify those CAIB report elements with NASA-wide applicability and to develop corrective measures to address each element. One such measure was the development of a standard for the development, documentation, and operation of models and simulations. This report describes the philosophy and requirements overview of the resulting NASA Standard for Models and Simulations.

  9. A Hybrid Parallel Execution Model for Logic Based Requirement Specifications (Invited Paper)

    Directory of Open Access Journals (Sweden)

    Jeffrey J. P. Tsai

    1999-05-01

    Full Text Available It is well known that undiscovered errors in a requirements specification are extremely expensive to fix when discovered in the software maintenance phase. Errors in the requirements phase can be reduced through validation and verification of the requirements specification. Many logic-based requirements specification languages have been developed to achieve these goals. However, the execution and reasoning of a logic-based requirements specification can be very slow. An effective way to improve performance is to execute and reason over the logic-based requirements specification in parallel. In this paper, we present a hybrid model to facilitate the parallel execution of a logic-based requirements specification language. A data dependency analysis technique is first applied to the logic-based specification to find all the mode combinations that exist within a specification clause. This mode information is used to support a novel hybrid parallel execution model, which combines both top-down and bottom-up evaluation strategies. The new execution model can find the failure in the deepest node of the search tree at an early stage of the evaluation, and can therefore reduce the total number of nodes searched in the tree, the total number of processes that need to be generated, and the total number of communication channels needed in the search process. A simulator has been implemented to analyze the execution behavior of the new model. Experiments show significant improvement based on several criteria.

  10. Generic skills requirements (KSA model) towards future mechanical ...

    African Journals Online (AJOL)

    ... Statistics and Discriminant Analysis (DA) as required to achieve the objective of the study. This study will guide all future engineers, especially in the field of Mechanical Engineering in Malaysia to penetrate the job market according to the current market needs. Keywords: generic skills; KSA model; mechanical engineers; ...

  11. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  12. Models of protein and amino acid requirements for cattle

    Directory of Open Access Journals (Sweden)

    Luis Orlindo Tedeschi

    2015-03-01

    Full Text Available Protein supply and requirements of ruminants have been studied for more than a century. These studies led to the accumulation of a wealth of scientific information about the digestion and metabolism of protein by ruminants, as well as the characterization of dietary protein, in order to maximize animal performance. During the 1980s and 1990s, when computers became more accessible and powerful, scientists began to conceptualize and develop mathematical nutrition models and to program them into computers to assist with ration balancing and formulation for domesticated ruminants, specifically dairy and beef cattle. The most commonly known nutrition models developed during this period were those of the National Research Council (NRC) in the United States, the Agricultural Research Council (ARC) in the United Kingdom, the Institut National de la Recherche Agronomique (INRA) in France, and the Commonwealth Scientific and Industrial Research Organization (CSIRO) in Australia. Others were derivative works from these models with different degrees of modification in the supply or requirement calculations and in the modeling nature (e.g., static or dynamic, mechanistic or deterministic). Around the 1990s, most models adopted the metabolizable protein (MP) system over the crude protein (CP) and digestible CP systems to estimate the supply of MP, and the factorial system to calculate the MP required by the animal. The MP system included two portions of protein (i.e., the rumen-undegraded dietary CP, RUP, and the contributions of microbial CP, MCP) as the main sources of MP for the animal. Some models would explicitly account for the impact of dry matter intake (DMI) on the MP required for maintenance (MPm; e.g., the Cornell Net Carbohydrate and Protein System, CNCPS, and the Dutch system, DVE/OEB), while others would simply account for scurf, urinary, metabolic fecal, and endogenous contributions independently of DMI. All models included milk yield and its components in estimating MP required for lactation
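
    The MP logic described above amounts to a simple balance: supply from digestible RUP and microbial CP against a factorial requirement for maintenance and lactation. The sketch below illustrates that arithmetic; all coefficients are illustrative placeholders rather than the published NRC, ARC, INRA, or CSIRO values.

      # Simplified metabolizable-protein (MP) balance for a dairy cow.
      # Coefficients are illustrative placeholders, not published model values.

      def mp_supply(rup_kg, microbial_cp_kg, dig_rup=0.80, dig_mcp=0.64):
          """MP supplied by rumen-undegraded protein (RUP) and microbial CP (MCP)."""
          return dig_rup * rup_kg + dig_mcp * microbial_cp_kg

      def mp_required(milk_kg, milk_protein_frac=0.032, eff_lactation=0.67,
                      maintenance_kg=0.4):
          """Factorial requirement: maintenance (scurf, urinary, metabolic fecal,
          endogenous) plus lactation, with an efficiency of MP use for milk."""
          lactation = milk_kg * milk_protein_frac / eff_lactation
          return maintenance_kg + lactation

      supply = mp_supply(rup_kg=1.2, microbial_cp_kg=1.5)
      required = mp_required(milk_kg=30)
      print(f"MP supply {supply:.2f} kg/d, requirement {required:.2f} kg/d, "
            f"balance {supply - required:+.2f} kg/d")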

  13. Requirements of Integrated Design Teams While Evaluating Advanced Energy Retrofit Design Options in Immersive Virtual Environments

    Directory of Open Access Journals (Sweden)

    Xue Yang

    2015-12-01

    Full Text Available One of the significant ways to save energy in buildings is to implement advanced energy retrofits in existing buildings. Improving the energy performance of buildings through advanced energy retrofitting requires a clear understanding of the cost and energy implications of design alternatives from various engineering disciplines when different retrofit options are considered. The communication of retrofit design alternatives and their energy implications is essential in the decision-making process, as it affects the final retrofit selections and hence the energy efficiency of the retrofitted buildings. The objective of the research presented here was to identify a generic list of information requirements that need to be shared and collectively analyzed by integrated design teams during advanced energy retrofit design review meetings held in immersive settings. While identifying such requirements, the authors used an immersive-environment-based iterative requirements elicitation approach. The technology was used as a means to better identify the information requirements of integrated design teams to be analyzed as a group. This paper provides findings on the information requirements of integrated design teams when evaluating retrofit options in immersive virtual environments. The information requirements were identified through interactions with sixteen experts in the design and energy modeling domains, and validated with another group of participants consisting of six design experts who were experienced in integrated design processes. Industry practitioners can use the findings in deciding what information to share with integrated design team members during design review meetings that utilize immersive virtual environments.

  14. Required experimental accuracy to select between supersymmetrical models

    Science.gov (United States)

    Grellscheid, David

    2004-03-01

    We will present a method to decide a priori whether various supersymmetrical scenarios can be distinguished based on sparticle mass data alone. For each model, a scan over all free SUSY breaking parameters reveals the extent of that model's physically allowed region of sparticle-mass-space. Based on the geometrical configuration of these regions in mass-space, it is possible to obtain an estimate of the required accuracy of future sparticle mass measurements to distinguish between the models. We will illustrate this algorithm with an example. This talk is based on work done in collaboration with B C Allanach (LAPTH, Annecy) and F Quevedo (DAMTP, Cambridge).
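
    The geometric idea can be caricatured in a few lines: represent each model's physically allowed region by points sampled from its parameter scan, and take the closest approach of the two point clouds as a rough scale for the mass accuracy needed to tell the models apart. This toy sketch assumes Gaussian point clouds in a two-mass space and is far simpler than the actual analysis.

      # Toy sketch: estimate the measurement accuracy needed to distinguish two
      # models whose allowed regions in sparticle-mass space are given as samples.
      import numpy as np

      rng = np.random.default_rng(0)
      # Hypothetical allowed regions (two masses, in GeV), from parameter scans.
      model_a = rng.normal([500, 300], [20, 15], size=(200, 2))
      model_b = rng.normal([560, 350], [20, 15], size=(200, 2))

      # Minimal distance between the regions: if measurement errors are much
      # smaller than this gap, a measured point cannot lie in both regions.
      dists = np.linalg.norm(model_a[:, None, :] - model_b[None, :, :], axis=-1)
      gap = dists.min()
      print(f"closest approach of the regions: {gap:.1f} GeV")
      print(f"rough required mass accuracy: ~{gap / 2:.1f} GeV per mass")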

  15. Cognition and procedure representational requirements for predictive human performance models

    Science.gov (United States)

    Corker, K.

    1992-01-01

    Models and modeling environments for human performance are becoming significant contributors to early system design and analysis procedures. Issues of levels of automation, physical environment, informational environment, and manning requirements are being addressed by such man/machine analysis systems. The research reported here investigates the close interaction between models of human cognition and models that describe procedural performance. We describe a methodology for the decomposition of aircrew procedures that supports interaction with models of cognition on the basis of procedures observed; that serves to identify cockpit/avionics information sources and crew information requirements; and that provides the structure to support methods for function allocation among crew and aiding systems. Our approach is to develop an object-oriented, modular, executable software representation of the aircrew, the aircraft, and the procedures necessary to satisfy flight-phase goals. We then encode, in a time-based language, taxonomies of the conceptual, relational, and procedural constraints among the cockpit avionics and control system and the aircrew. We have designed and implemented a goals/procedures hierarchic representation sufficient to describe procedural flow in the cockpit. We then execute the procedural representation in simulation software and calculate the values of the flight instruments, aircraft state variables and crew resources using the constraints available from the relationship taxonomies. The system provides a flexible, extensible, manipulable and executable representation of aircrew and procedures that is generally applicable to crew/procedure task-analysis. The representation supports developed methods of intent inference, and is extensible to include issues of information requirements and functional allocation. We are attempting to link the procedural representation to models of cognitive functions to establish several intent inference methods

  16. A Novel Model for Security Evaluation for Compliance

    DEFF Research Database (Denmark)

    Hald, Sara Ligaard; Pedersen, Jens Myrup; Prasad, Neeli R.

    2011-01-01

    The Security Evaluation for Compliance (SEC) model offers a lightweight alternative for use by decision makers to get a quick overview of the security attributes of different technologies for easy comparison and requirement compliance evaluation. The scientific contribution is this new approach to security modelling as well...

  17. Design Of Measurements For Evaluating Readiness Of Technoware Components To Meet The Required Standard Of Products

    Science.gov (United States)

    Fauzi, Ilham; Muharram Hasby, Fariz; Irianto, Dradjad

    2018-03-01

    Although the government is able to issue mandatory standards that must be obeyed by industry, the respective industries themselves often have difficulty fulfilling the requirements described in those standards. This is especially true in many small and medium-sized enterprises that lack the capital required to invest in standard-compliant equipment and machinery. This study aims to develop a set of measurement tools for evaluating the level of readiness of production technology with respect to the requirements of a product standard, based on the quality function deployment (QFD) method. By combining the QFD methodology, the UNESCAP technometric model [9], and the Analytic Hierarchy Process (AHP), the model is used to measure a firm’s capability to fulfill a government standard in the toy-making industry. Expert opinions from both the governmental officers responsible for setting and implementing standards and the industry practitioners responsible for managing manufacturing processes were collected and processed to find out which technological capabilities should be improved by the firm to fulfill the existing standard. This study showed that the proposed model can be used successfully to measure the gap between the requirements of the standard and the readiness of the technoware technological component in a particular firm.
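
    A minimal sketch of the combined AHP and gap-measurement step might look as follows: derive criteria weights from a pairwise-comparison matrix via its principal eigenvector, then weight the shortfall between the readiness the standard demands and the firm's assessed readiness. The matrix and scores are hypothetical, not taken from the study.

      # Sketch: AHP priority weights from a pairwise-comparison matrix, then a
      # weighted gap between required and actual readiness of technology components.
      import numpy as np

      # Pairwise comparisons of three criteria (Saaty 1-9 scale), reciprocal matrix.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      # The principal eigenvector gives the criteria weights.
      eigvals, eigvecs = np.linalg.eig(A)
      w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
      w = w / w.sum()

      required = np.array([0.9, 0.8, 0.7])   # readiness demanded by the standard
      actual = np.array([0.6, 0.8, 0.5])     # firm's assessed readiness
      gap = w @ np.clip(required - actual, 0, None)
      print(f"weights {np.round(w, 3)}, weighted readiness gap {gap:.3f}")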

  18. Information Models, Data Requirements, and Agile Data Curation

    Science.gov (United States)

    Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron

    2015-04-01

    The Planetary Data System's next generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements or statements about what is necessary for the system are collected and analyzed for input into the design stage of systems development. With the advent of big data, the requirements associated with data have begun to dominate, and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition, the use of these data requirements during the system's development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example, customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principal parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.

  19. Evaluation of motion management strategies based on required margins

    International Nuclear Information System (INIS)

    Sawkey, D; Svatos, M; Zankowski, C

    2012-01-01

    Strategies for delivering radiation to a moving lesion each require a margin to compensate for uncertainties in treatment. These motion margins have been determined here by separating the total uncertainty into components. Probability density functions for the individual sources of uncertainty were calculated for ten motion traces obtained from the literature. Motion margins required to compensate for the center of mass motion of the clinical treatment volume were found by convolving the individual sources of uncertainty. For measurements of position at a frequency of 33 Hz, system latency was the dominant source of positional uncertainty. Averaged over the ten motion traces, the motion margin for tracking with a latency of 200 ms was 4.6 mm. Gating with a duty cycle of 33% required a mean motion margin of 3.2–3.4 mm, and tracking with a latency of 100 ms required a motion margin of 3.1 mm. Feasible reductions in the effects of the sources of uncertainty, for example by using a simple prediction algorithm to anticipate the lesion position at the end of the latency period, resulted in a mean motion margin of 1.7 mm for tracking with a latency of 100 ms, 2.4 mm for tracking with a latency of 200 ms, and 2.1–2.2 mm for the gating strategies with duty cycles of 33%. A crossover tracking latency of 150 ms was found, below which tracking strategies could take advantage of narrower motion margins than gating strategies. The methods described here provide a means to guide selection of a motion management strategy for a given patient. (paper)
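
    The margin computation described above can be sketched numerically: discretize the probability density function of each independent source of positional uncertainty, convolve them, and read off the interval containing the desired fraction of positions. The two Gaussian sources and the 95% coverage level below are illustrative assumptions, not the ten literature traces used in the paper.

      # Sketch: combine independent positional-uncertainty sources by convolving
      # their discretized probability density functions, then read off the margin
      # that covers a chosen fraction of positions. Values are illustrative.
      import numpy as np

      dx = 0.1                                   # grid spacing, mm
      x = np.arange(-15, 15 + dx, dx)

      def gaussian_pdf(sigma):
          p = np.exp(-0.5 * (x / sigma) ** 2)
          return p / (p.sum() * dx)

      latency = gaussian_pdf(1.5)                # residual motion over system latency
      targeting = gaussian_pdf(1.0)              # beam/leaf positioning uncertainty

      total = np.convolve(latency, targeting, mode="same") * dx  # combined PDF

      # Symmetric margin containing 95% of the combined distribution.
      cdf = np.cumsum(total) * dx
      lo, hi = x[np.searchsorted(cdf, 0.025)], x[np.searchsorted(cdf, 0.975)]
      print(f"95% motion margin: +/- {max(abs(lo), hi):.1f} mm")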

  20. Functional requirements for an Exercise Evaluation and Simulation Facility

    International Nuclear Information System (INIS)

    1983-04-01

    The Exercise Evaluation and Simulation Facility (EESF) is a computer-based resource that will improve FEMA's capabilities for evaluating radiological emergency plans and preparedness around commercial nuclear sites. The EESF is being designed from the perspective of the organizations involved (i.e., FEMA Regional and National, and state and local response teams) and takes into account the evolution of radiological and other emergency preparedness activities. Like radiological emergency planning, EESF will evolve to suit FEMA (National and Regional) needs and interests and will be increasingly useful as a resource for radiological emergency planning and evaluation. Table ES-1 briefly describes seven functions for which EESF is currently being designed. They are listed in the approximate order in which they will be designed, developed, and implemented. The only exception is the data base function, which will be developed in parallel with the other six functions and enhanced to support these functions, as well as be a source of information on sites, plans, exercises, and evaluations.

  1. Evaluation of Information Requirements of Reliability Methods in Engineering Design

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster; Restrepo-Giraldo, John Dairo; Ahmed-Kristensen, Saeema

    2010-01-01

    This paper aims to characterize the information needed to perform methods for robustness and reliability, and verify their applicability to early design stages. Several methods were evaluated on their support to synthesis in engineering design. Of those methods, FMEA, FTA and HAZOP were selected...

  2. Reactor core modeling practice: Operational requirements, model characteristics, and model validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1997-01-01

    The physical models implemented in power plant simulators have greatly increased in performance and complexity in recent years. This process has been enabled by the ever increasing computing power available at affordable prices. This paper describes this process from several angles. First, it considers the operational requirements that are most critical from the point of view of model performance, for both normal and off-normal operating conditions. A second section discusses core model characteristics in the light of the solutions implemented by Thomson Training and Simulation (TT and S) in several full-scope simulators recently built and delivered for Dutch, German, and French nuclear power plants. Finally, we consider the model validation procedures, which are of course an integral part of model development and which are becoming more and more severe as performance expectations increase. As a conclusion, it may be asserted that in the core modeling field, as in other areas, the general improvement in the quality of simulation codes has resulted in a fairly rapid convergence towards mainstream engineering-grade calculations. This is remarkable performance in view of the stringent real-time requirements which the simulation codes must satisfy, as well as the extremely wide range of operating conditions that they are called upon to cover with good accuracy. (author)

  3. An evaluation framework for participatory modelling

    Science.gov (United States)

    Krueger, T.; Inman, A.; Chilvers, J.

    2012-04-01

    Strong arguments for participatory modelling in hydrology can be made on substantive, instrumental and normative grounds. These arguments have led to increasingly diverse groups of stakeholders (here anyone affecting or affected by an issue) getting involved in hydrological research and the management of water resources. In fact, participation has become a requirement of many research grants, programs, plans and policies. However, evidence of beneficial outcomes of participation as suggested by the arguments is difficult to generate and therefore rare. This is because outcomes are diverse, distributed, often tacit, and take time to emerge. In this paper we develop an evaluation framework for participatory modelling focussed on learning outcomes. Learning encompasses many of the potential benefits of participation, such as better models through diversity of knowledge and scrutiny, stakeholder empowerment, greater trust in models and ownership of subsequent decisions, individual moral development, reflexivity, relationships, social capital, institutional change, resilience and sustainability. Based on the theories of experiential, transformative and social learning, complemented by practitioner experience, our framework examines if, when and how learning has occurred. Special emphasis is placed on the role of models as learning catalysts. We map the distribution of learning between stakeholders, scientists (as a subgroup of stakeholders) and models. And we analyse what type of learning has occurred: instrumental learning (broadly cognitive enhancement) and/or communicative learning (change in interpreting meanings, intentions and values associated with actions and activities; group dynamics). We demonstrate how our framework can be translated into a questionnaire-based survey conducted with stakeholders and scientists at key stages of the participatory process, and show preliminary insights from applying the framework within a rural pollution management situation in

  4. IDENTIFYING OPERATIONAL REQUIREMENTS TO SELECT SUITABLE DECISION MODELS FOR A PUBLIC SECTOR EPROCUREMENT DECISION SUPPORT SYSTEM

    Directory of Open Access Journals (Sweden)

    Mohamed Adil

    2014-10-01

    Full Text Available Public sector procurement should be a transparent and fair process. Strict legal requirements are enforced on public sector procurement to make it a standardised process. To make fair decisions on selecting suppliers, a practical method that adheres to the legal requirements is important. The research on which this paper is based aimed at identifying a suitable Multi-Criteria Decision Analysis (MCDA) method for the specific legal and functional needs of the Maldivian public sector. To identify these operational requirements, a set of focus group interviews was conducted in the Maldives with public officials responsible for procurement decision making. Based on the operational requirements identified through the focus groups, a criteria-based evaluation of published MCDA methods was carried out to identify the methods suitable for e-procurement decision making. This paper describes the identification of the operational requirements and the results of the evaluation to select suitable decision models for the Maldivian context.
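
    Once criteria weights are fixed by the legal and operational requirements, the simplest MCDA baseline is a weighted sum over normalized criterion scores. The sketch below shows that baseline on hypothetical bids; the paper's evaluation considers a range of published MCDA methods rather than prescribing this one.

      # Minimal weighted-sum MCDA sketch for scoring supplier bids.
      # Criteria, weights, and scores are hypothetical examples.
      import numpy as np

      criteria = ["price", "delivery time", "technical compliance"]
      weights = np.array([0.5, 0.2, 0.3])        # must sum to 1
      # Rows: bidders; columns: criteria, each normalized to [0, 1] (1 = best).
      scores = np.array([[0.9, 0.6, 0.7],
                         [0.7, 0.9, 0.8],
                         [0.8, 0.7, 0.9]])

      totals = scores @ weights
      best = int(np.argmax(totals))
      for i, total in enumerate(totals):
          print(f"bidder {i + 1}: {total:.2f}")
      print(f"recommended award: bidder {best + 1}")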

  5. Ergonomic requirements to control room design - evaluation method

    International Nuclear Information System (INIS)

    Hinz, W.

    1985-01-01

    The method of evaluation introduced here is the result of work carried out by the sub-committee 'Control Room Design' of the Engineering Standards Committee in DIN Standards, Ergonomy. This committee compiles standards for the design of control rooms (instrumentation and control) for the monitoring and operation of process engineering cycles. With the agreement of the committee, whom we take the opportunity of thanking at this point for their constructive collaboration, a planned partial standard is introduced thematically in the following, so that knowledge gained from the discussion can be included in further work on the subject. The subject in question is a procedure for the qualitative evaluation of the duties performed under the control of operators, so that existing control concepts, or concepts still in the draft phase, can be assessed. (orig./GL)

  6. Educational game models: conceptualization and evaluation ...

    African Journals Online (AJOL)

    Educational game models: conceptualization and evaluation. ... The Game Object Model (GOM), which marries educational theory and game design, forms the basis for the development of the Persona Outlining ...

  7. The Benefit of Ambiguity in Understanding Goals in Requirements Modelling

    DEFF Research Database (Denmark)

    Paay, Jeni; Pedell, Sonja; Sterling, Leon

    2011-01-01

    This paper examines the benefit of ambiguity in describing goals in requirements modelling for the design of socio-technical systems using concepts from Agent-Oriented Software Engineering (AOSE) and ethnographic and cultural probe methods from Human Computer Interaction (HCI). The authors’ aim...... ambiguity in the process of elicitation and analysis through the use of empirically informed quality goals attached to functional goals. The authors demonstrate the benefit of articulating a quality goal without turning it into a functional goal. Their study shows that quality goals kept at a high level...... of abstraction, ambiguous and open for conversations through the modelling process add richness to goal models, and communicate quality attributes of the interaction being modelled to the design phase, where this ambiguity is regarded as a resource for design....

  8. Medical evaluation of the pregnant patient requiring nonobstetric surgery

    International Nuclear Information System (INIS)

    Barron, W.M.

    1985-01-01

    This article provides a summary of currently available information from a broad range of disciplines aimed at guiding the physician caring for the pregnant patient who requires nonobstetric surgery. An understanding of the anatomic and physiologic alterations that occur during pregnancy will allow such procedures to be accomplished with morbidity and mortality approaching those of nonpregnant surgical patients. The presence of the fetus does impose some restraint; however, this should rarely impair appropriate diagnosis and treatment of maternal disease. This follows from the broad range of diagnostic and therapeutic alternatives available, and from the fact that what is beneficial for maternal health is generally best for the fetus. 64 references.

  9. Specification of advanced safety modeling requirements (Rev. 0)

    International Nuclear Information System (INIS)

    Fanning, T. H.; Tautges, T. J.

    2008-01-01

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represent an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models will

  10. Modelling requirements for future assessments based on FEP analysis

    International Nuclear Information System (INIS)

    Locke, J.; Bailey, L.

    1998-01-01

    This report forms part of a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed safety assessment. A scenario approach to performance assessment has been adopted. It is proposed that potential evolutions of a deep geological radioactive waste repository can be represented by a base scenario and a number of variant scenarios. The base scenario is chosen to be broad-ranging and to represent the natural evolution of the repository system and its surrounding environment. The base scenario is defined to include all those FEPs that are certain to occur and those which are judged likely to occur for a significant period of the assessment timescale. The structuring of FEPs on a Master Directed Diagram (MDD) provides a systematic framework for identifying those FEPs that form part of the natural evolution of the system and those, which may define alternative potential evolutions of the repository system. In order to construct a description of the base scenario, FEPs have been grouped into a series of conceptual models. Conceptual models are groups of FEPs, identified from the MDD, representing a specific component or process within the disposal system. It has been found appropriate to define conceptual models in terms of the three main components of the disposal system: the repository engineered system, the surrounding geosphere and the biosphere. For each of these components, conceptual models provide a description of the relevant subsystem in terms of its initial characteristics, subsequent evolution and the processes affecting radionuclide transport for the groundwater and gas pathways. The aim of this document is to present the methodology that has been developed for deriving modelling requirements and to illustrate the application of the methodology by

  11. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. This model takes from the SERVQUAL method the five dimensions of requirements and three of the characteristics, and from the QFD method the application methodology. The originality of the SQ model consists in computing a global index that reflects the level to which the quality characteristics accomplish the customers’ requirements. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.
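
    A minimal sketch of such a global index, assuming it is a weighted sum of how far each customer requirement is accomplished by the quality characteristics, might look as follows. The weights and accomplishment levels are hypothetical; the actual SQ computation follows the QFD application methodology.

      # Sketch of a global service-quality index: requirement weights times the
      # degree to which each requirement is met by the quality characteristics.
      import numpy as np

      requirements = ["reliability", "assurance", "tangibles",
                      "empathy", "responsiveness"]       # SERVQUAL dimensions
      weights = np.array([0.30, 0.20, 0.10, 0.15, 0.25]) # importance, sums to 1
      accomplishment = np.array([0.8, 0.9, 0.6, 0.7, 0.75])  # fraction achieved

      global_index = float(weights @ accomplishment)
      print(f"global SQ index: {global_index:.2f}")  # 1.0 = all requirements met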

  12. Performance Evaluation and Requirements Assessment for Gravity Gradient Referenced Navigation

    Directory of Open Access Journals (Sweden)

    Jisun Lee

    2015-07-01

    Full Text Available In this study, simulation tests for gravity gradient referenced navigation (GGRN) are conducted to verify the effects of various factors such as database (DB) and sensor errors, flight altitude, DB resolution, initial errors, and measurement update rates on the navigation performance. Based on the simulation results, requirements for GGRN are established for position determination with certain target accuracies. It is found that DB and sensor errors and flight altitude have strong effects on the navigation performance. In particular, a DB and sensor with accuracies of 0.1 E and 0.01 E, respectively, are required to determine the position more accurately than or at a level similar to the navigation performance of terrain referenced navigation (TRN). In most cases, the horizontal position error of GGRN is less than 100 m. However, the navigation performance of GGRN is similar to or worse than that of a pure inertial navigation system when the DB and sensor errors are 3 E or 5 E each and the flight altitude is 3000 m. Considering that the accuracy of currently available gradiometers is about 3 E or 5 E, GGRN does not show much advantage over TRN at present. However, GGRN is expected to exhibit much better performance in the near future when accurate DBs and gravity gradiometers are available.

  13. The EMEFS model evaluation. An interim report

    Energy Technology Data Exchange (ETDEWEB)

    Barchet, W.R. [Pacific Northwest Lab., Richland, WA (United States); Dennis, R.L. [Environmental Protection Agency, Research Triangle Park, NC (United States); Seilkop, S.K. [Analytical Sciences, Inc., Durham, NC (United States); Banic, C.M.; Davies, D.; Hoff, R.M.; Macdonald, A.M.; Mickle, R.E.; Padro, J.; Puckett, K. [Atmospheric Environment Service, Downsview, ON (Canada); Byun, D.; McHenry, J.N. [Computer Sciences Corp., Research Triangle Park, NC (United States); Karamchandani, P.; Venkatram, A. [ENSR Consulting and Engineering, Camarillo, CA (United States); Fung, C.; Misra, P.K. [Ontario Ministry of the Environment, Toronto, ON (Canada); Hansen, D.A. [Electric Power Research Inst., Palo Alto, CA (United States); Chang, J.S. [State Univ. of New York, Albany, NY (United States). Atmospheric Sciences Research Center

    1991-12-01

    The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs.

  14. Implicit misattribution of evaluative responses: contingency-unaware evaluative conditioning requires simultaneous stimulus presentations.

    Science.gov (United States)

    Hütter, Mandy; Sweldens, Steven

    2013-08-01

    Recent research has shown that evaluative conditioning (EC) procedures can change attitudes without participants' awareness of the contingencies between conditioned and unconditioned stimuli (Hütter, Sweldens, Stahl, Unkelbach, & Klauer, 2012). We present a theoretical explanation and boundary condition for the emergence of unaware EC effects based on the implicit misattribution of evaluative responses from unconditioned to conditioned stimuli. We hypothesize that such misattribution is only possible when conditioned and unconditioned stimuli are perceived simultaneously. Therefore we manipulate the simultaneity of the stimulus presentations and apply a process dissociation procedure to distinguish contingency-aware from contingency-unaware EC effects. A multinomial model indicates that with sequential presentations, EC effects do not occur without contingency awareness. However, unaware EC effects do occur with simultaneous presentations. The findings support dual-process theories of learning. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  15. 49 CFR 40.285 - When is a SAP evaluation required?

    Science.gov (United States)

    2010-10-01

    49 Transportation 1, 2010-10-01 edition, § 40.285 When is a SAP evaluation required? (a) As an employee, when you have violated DOT drug and... unless you complete the SAP evaluation, referral, and education/treatment process set forth in this...

  16. Requirements for High Level Models Supporting Design Space Exploration in Model-based Systems Engineering

    OpenAIRE

    Haveman, Steven P.; Bonnema, G. Maarten

    2013-01-01

    Most formal models are used in detailed design and focus on a single domain. Few approaches exist that can effectively tie these lower level models to a high level system model during design space exploration. This complicates the validation of high level system requirements during detailed design. In this paper, we define requirements for a high level model that is firstly driven by key systems engineering challenges present in industry and secondly connects to several formal and d...

  17. A required course in the development, implementation, and evaluation of clinical pharmacy services.

    Science.gov (United States)

    Skomo, Monica L; Kamal, Khalid M; Berdine, Hildegarde J

    2008-10-15

    To develop, implement, and assess a required pharmacy practice course to prepare pharmacy students to develop, implement, and evaluate clinical pharmacy services using a business plan model. Course content centered around the process of business planning and pharmacoeconomic evaluations. Selected business planning topics included literature evaluation, mission statement development, market evaluation, policy and procedure development, and marketing strategy. Selected pharmacoeconomic topics included cost-minimization analysis, cost-benefit analysis, cost-effectiveness analysis, cost-utility analysis, and health-related quality of life (HRQoL). Assessment methods included objective examinations, student participation, performance on a group project, and peer evaluation. One hundred fifty-three students were enrolled in the course. The mean scores on the objective examinations (100 points per examination) ranged from 82 to 85 points, with 25%-35% of students in the class scoring over 90, and 40%-50% of students scoring from 80 to 89. The mean scores on the group project (200 points) and classroom participation (50 points) were 183.5 and 46.1, respectively. The mean score on the peer evaluation was 30.8, with scores ranging from 27.5 to 31.7. The course provided pharmacy students with the framework necessary to develop and implement evidence-based disease management programs and to assure efficient, cost-effective utilization of pertinent resources in the provision of patient care.

  18. Evaluation of the magnetic field requirements for nanomagnetic gene transfection

    Science.gov (United States)

    Fouriki, A.; Farrow, N.; Clements, M.A.; Dobson, J.

    2010-01-01

    The objective of this work was to examine the effects of magnet distance (and by proxy, field strength) on nanomagnetic transfection efficiency. Methods: Non-viral magnetic nanoparticle-based transfection was evaluated using both static and oscillating magnet arrays. Results: Fluorescence intensity (firefly luciferase) of transfected H292 cells showed no increase using a 96-well NdFeB magnet array when the magnets were 5 mm from the cell culture plate or nearer. At 6 mm and higher, fluorescence intensity decreased systematically. Conclusion: In all cases, fluorescence intensity was higher when using an oscillating array compared to a static array. For distances closer than 5 mm, the oscillating system also outperformed Lipofectamine 2000™. PMID:22110859

  19. Evaluation of the magnetic field requirements for nanomagnetic gene transfection

    Directory of Open Access Journals (Sweden)

    A. Fouriki

    2010-07-01

    Full Text Available The objective of this work was to examine the effects of magnet distance (and by proxy, field strength) on nanomagnetic transfection efficiency. Methods: Non-viral magnetic nanoparticle-based transfection was evaluated using both static and oscillating magnet arrays. Results: Fluorescence intensity (firefly luciferase) of transfected H292 cells showed no increase using a 96-well NdFeB magnet array when the magnets were 5 mm from the cell culture plate or nearer. At 6 mm and higher, fluorescence intensity decreased systematically. Conclusion: In all cases, fluorescence intensity was higher when using an oscillating array compared to a static array. For distances closer than 5 mm, the oscillating system also outperformed Lipofectamine 2000™.

  20. Modelling human resource requirements for the nuclear industry in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Roelofs, Ferry [Nuclear Research and Consultancy Group (NRG) (Netherlands); Flore, Massimo; Estorff, Ulrik von [Joint Research Center (JRC) (Netherlands)

    2017-11-15

    The European Human Resource Observatory for Nuclear (EHRO-N) provides the European Commission with essential data related to supply and demand for nuclear experts in the EU-28 and the enlargement and integration countries based on bottom-up information from the nuclear industry. The objective is to assess how the supply of experts for the nuclear industry responds to the needs for the same experts for present and future nuclear projects in the region. Complementary to the bottom-up approach taken by the EHRO-N team at JRC, a top-down modelling approach has been taken in a collaboration with NRG in the Netherlands. This top-down modelling approach focuses on the human resource requirements for operation, construction, decommissioning, and efforts for long term operation of nuclear power plants. This paper describes the top-down methodology, the model input, the main assumptions, and the results of the analyses.

  1. Modelling human resource requirements for the nuclear industry in Europe

    International Nuclear Information System (INIS)

    Roelofs, Ferry; Flore, Massimo; Estorff, Ulrik von

    2017-01-01

    The European Human Resource Observatory for Nuclear (EHRO-N) provides the European Commission with essential data related to supply and demand for nuclear experts in the EU-28 and the enlargement and integration countries based on bottom-up information from the nuclear industry. The objective is to assess how the supply of experts for the nuclear industry responds to the needs for the same experts for present and future nuclear projects in the region. Complementary to the bottom-up approach taken by the EHRO-N team at JRC, a top-down modelling approach has been taken in a collaboration with NRG in the Netherlands. This top-down modelling approach focuses on the human resource requirements for operation, construction, decommissioning, and efforts for long term operation of nuclear power plants. This paper describes the top-down methodology, the model input, the main assumptions, and the results of the analyses.

  2. Empirically evaluating decision-analytic models.

    Science.gov (United States)

    Goldhaber-Fiebert, Jeremy D; Stout, Natasha K; Goldie, Sue J

    2010-08-01

    Model-based cost-effectiveness analyses support decision-making. To augment model credibility, evaluation via comparison to independent, empirical studies is recommended. We developed a structured reporting format for model evaluation and conducted a structured literature review to characterize current model evaluation recommendations and practices. As an illustration, we applied the reporting format to evaluate a microsimulation of human papillomavirus and cervical cancer. The model's outputs and uncertainty ranges were compared with multiple outcomes from a study of long-term progression from high-grade precancer (cervical intraepithelial neoplasia [CIN]) to cancer. Outcomes included 5 to 30-year cumulative cancer risk among women with and without appropriate CIN treatment. Consistency was measured by model ranges overlapping study confidence intervals. The structured reporting format included: matching baseline characteristics and follow-up, reporting model and study uncertainty, and stating metrics of consistency for model and study results. Structured searches yielded 2963 articles with 67 meeting inclusion criteria and found variation in how current model evaluations are reported. Evaluation of the cervical cancer microsimulation, reported using the proposed format, showed a modeled cumulative risk of invasive cancer for inadequately treated women of 39.6% (30.9-49.7) at 30 years, compared with the study: 37.5% (28.4-48.3). For appropriately treated women, modeled risks were 1.0% (0.7-1.3) at 30 years, study: 1.5% (0.4-3.3). To support external and projective validity, cost-effectiveness models should be iteratively evaluated as new studies become available, with reporting standardized to facilitate assessment. Such evaluations are particularly relevant for models used to conduct comparative effectiveness analyses.
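
    The consistency metric described above, model uncertainty ranges overlapping study confidence intervals, reduces to an interval-overlap test. The sketch below applies it to the cumulative-risk figures quoted in the abstract.

      # Sketch: the consistency check described above. Does the model's
      # uncertainty range overlap the study's confidence interval?

      def consistent(model_range, study_ci):
          """True if two (low, high) intervals overlap."""
          (m_lo, m_hi), (s_lo, s_hi) = model_range, study_ci
          return m_lo <= s_hi and s_lo <= m_hi

      # 30-year cumulative cancer risk, inadequately treated women
      # (percent values from the abstract above).
      print(consistent((30.9, 49.7), (28.4, 48.3)))  # True -> consistent
      # Appropriately treated women.
      print(consistent((0.7, 1.3), (0.4, 3.3)))      # True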

  3. World Integrated Nuclear Evaluation System: Model documentation

    International Nuclear Information System (INIS)

    1991-12-01

    The World Integrated Nuclear Evaluation System (WINES) is an aggregate-demand-based partial equilibrium model used by the Energy Information Administration (EIA) to project long-term domestic and international nuclear energy requirements. WINES follows a top-down approach in which economic growth rates, delivered energy demand growth rates, and electricity demand are projected successively to ultimately forecast total nuclear generation and nuclear capacity. WINES could potentially be used to produce forecasts for any country or region in the world. Presently, WINES is being used to generate long-term forecasts for the United States and for all countries with commercial nuclear programs in the world, excluding countries located in centrally planned economic areas. Projections for the United States are developed for the period from 2010 through 2030, and for other countries for the period starting in 2000 or 2005 (depending on the country) through 2010. EIA uses a pipeline approach to project nuclear capacity for the period between 1990 and the starting year for which the WINES model is used. This approach involves a detailed accounting of existing nuclear generating units and units under construction, their capacities, their actual or estimated time of completion, and their estimated dates of retirement. Further detail on this approach can be found in Appendix B of Commercial Nuclear Power 1991: Prospects for the United States and the World.
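
    The top-down cascade can be sketched as a chain of growth assumptions, with each stage feeding the next: economic growth drives energy demand, a share of which is electricity, a share of which is nuclear. All rates and conversion factors below are invented for illustration and are not EIA assumptions.

      # Toy top-down projection in the spirit of the WINES approach: successive
      # growth assumptions cascade from the economy down to nuclear capacity.

      years = 20
      gdp_growth = 0.025               # annual economic growth (hypothetical)
      energy_elasticity = 0.6          # energy demand growth per unit GDP growth
      electrification_trend = 0.01     # annual growth of the electricity share

      energy_demand = 100.0            # delivered energy index, base year = 100
      electricity_share = 0.35
      nuclear_share = 0.20             # nuclear fraction of electricity
      gw_per_index_unit = 0.5          # GW of capacity per unit of nuclear index

      for _ in range(years):
          energy_demand *= 1 + gdp_growth * energy_elasticity
          electricity_share = min(1.0, electricity_share * (1 + electrification_trend))

      nuclear_generation = energy_demand * electricity_share * nuclear_share
      print(f"nuclear generation index: {nuclear_generation:.1f}")
      print(f"implied capacity: {nuclear_generation * gw_per_index_unit:.1f} GW")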

  4. Design requirements for SRB production control system. Volume 3: Package evaluation, modification and hardware

    Science.gov (United States)

    1981-01-01

    The software package evaluation was designed to analyze commercially available, field-proven, production control or manufacturing resource planning management technology and software packages. The analysis was conducted by comparing SRB production control software requirements and conceptual system design to software package capabilities. The methodology of evaluation and the findings at each stage of evaluation are described. Topics covered include: vendor listing; request for information (RFI) document; RFI response rate and quality; RFI evaluation process; and capabilities versus requirements.

  5. Rock mechanics models evaluation report: Draft report

    International Nuclear Information System (INIS)

    1985-10-01

    This report documents the evaluation of the thermal and thermomechanical models and codes for repository subsurface design and for design constraint analysis. The evaluation was based on a survey of the thermal and thermomechanical codes and models that are applicable to subsurface design, followed by a Kepner-Tregoe (KT) structured decision analysis of the codes and models. The end result of the KT analysis is a balanced, documented recommendation of the codes and models which are best suited to conceptual subsurface design for the salt repository. The various laws for modeling the creep of rock salt are also reviewed in this report. 37 refs., 1 fig., 7 tabs

  6. Specification of advanced safety modeling requirements (Rev. 0).

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H.; Tautges, T. J.

    2008-06-30

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represent an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models

  7. Life sciences research in space: The requirement for animal models

    Science.gov (United States)

    Fuller, C. A.; Philips, R. W.; Ballard, R. W.

    1987-01-01

    Use of animals in NASA space programs is reviewed. Animals are needed because life science experimentation frequently requires long-term controlled exposure to environments, statistical validation, invasive instrumentation or biological tissue sampling, tissue destruction, exposure to dangerous or unknown agents, or sacrifice of the subject. The availability and use of human subjects inflight is complicated by the multiple needs and demands upon crew time. Because only living organisms can sense, integrate and respond to the environment around them, the sole use of tissue culture and computer models is insufficient for understanding the influence of the space environment on intact organisms. Equipment for spaceborne experiments with animals is described.

  8. Modeling of the global carbon cycle - isotopic data requirements

    International Nuclear Information System (INIS)

    Ciais, P.

    1994-01-01

    Isotopes are powerful tools to constrain carbon cycle models. For example, combining the CO2 budget with the 13C budget allows one to calculate the net carbon fluxes between atmosphere, ocean, and biosphere. Observations of natural and bomb-produced radiocarbon allow one to estimate gross carbon exchange fluxes between different reservoirs and to deduce the time scales of carbon overturning in important reservoirs. 18O in CO2 is potentially a tool for deconvolving C fluxes within the land biosphere (assimilation vs. respiration). The scope of this article is to identify gaps in our present knowledge about isotopes in the light of their use as constraints for the global carbon cycle. In the following we present a list of some future data requirements for carbon cycle models. (authors)

  9. A Utility-Based Model for Determining Functional Requirement Target Values

    Directory of Open Access Journals (Sweden)

    Cucuk Nur Rosyidi

    2012-01-01

    Full Text Available In a product design and development process, a designer faces the problem of deciding functional requirement (FR) target values. That decision is made under risk, since it is taken in the early design phase using incomplete information. A utility function can be used to reflect the decision maker's attitude towards risk when making such a decision. In this research, we develop a utility-based model to determine FR target values using a quadratic utility function and information from Quality Function Deployment (QFD). A pencil design is used as a numerical example, with a quadratic utility function for each FR. The model can be applied to balance customer and designer interests in determining FR target values.
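
    A sketch of the core decision, under the assumption that the achieved FR value scatters randomly around the chosen target, is to pick the target that maximizes expected quadratic utility. The ideal value, utility scale, and variability below are hypothetical, not the pencil example from the paper.

      # Sketch: choosing an FR target value that maximizes expected quadratic
      # utility when the achieved value is uncertain. Numbers are hypothetical.
      import numpy as np

      rng = np.random.default_rng(1)

      def utility(x, ideal=10.0, scale=4.0):
          """Quadratic utility: 1 at the ideal FR value, falling off quadratically."""
          return 1.0 - ((x - ideal) / scale) ** 2

      candidates = np.linspace(8.0, 12.0, 9)   # candidate FR target values
      sigma = 0.8                              # manufacturing scatter around target

      expected = []
      for t in candidates:
          achieved = rng.normal(t, sigma, size=10_000)  # simulated achieved values
          expected.append(utility(achieved).mean())

      best = candidates[int(np.argmax(expected))]
      print(f"target with highest expected utility: {best:.1f}")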

  10. Requirements for high level models supporting design space exploration in model-based systems engineering

    NARCIS (Netherlands)

    Haveman, Steven; Bonnema, Gerrit Maarten

    2013-01-01

    Most formal models are used in detailed design and focus on a single domain. Few approaches exist that can effectively tie these lower level models to a high level system model during design space exploration. This complicates the validation of high level system requirements during

  11. A GENERALIZATION OF TRADITIONAL KANO MODEL FOR CUSTOMER REQUIREMENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Renáta Turisová

    2015-07-01

    Full Text Available Purpose: The theory of attractiveness determines the relationship between the technically achieved and the customer-perceived quality of product attributes. The most frequently used approach in the theory of attractiveness is the implementation of Kano's model. Many generalizations of that model exist which take into consideration various aspects and approaches focused on understanding customer preferences and identifying the customer's priorities for a selling product. The aim of this article is to outline another possible generalization of Kano's model. Methodology/Approach: The traditional Kano model captures the nonlinear relationship between the achieved quality attributes and customer requirements. The individual quality attributes are divided into three main categories: must-be, one-dimensional, and attractive quality, and into two side categories: indifferent and reverse quality. A well-selling product has to contain the must-be attributes. It should contain as many one-dimensional attributes as possible. If there are also supplementary attractive attributes, the attractiveness of the entire product, from the viewpoint of the customer, rises sharply and nonlinearly, which has a direct positive impact on the decision of a potential customer when purchasing the product. In this article, we show that the assignment of the individual quality attributes of a product to the mentioned categories depends, among other things, on the costs over the life cycle of the product, or on the price of the product on the market. Findings: In practice, we often encounter the assignment of products to different price categories: lower, middle, and upper class. For a certain type of product the category is either directly declared by the producer (especially in the automotive industry) or is determined by the customer by means of an assessment of available market prices. Different customer expectations can be assigned to each of those groups of products
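
    The categories above are conventionally assigned from a pair of survey answers using the standard Kano evaluation table. The sketch below encodes that table; the generalization proposed in the article, which makes the assignment depend on life-cycle cost and market price, is not captured here.

      # Sketch: classifying an attribute with the standard Kano evaluation table,
      # from answers to the functional ("How do you feel if the product has X?")
      # and dysfunctional ("...if it does not have X?") questions.

      ANSWERS = ["like", "must-be", "neutral", "live-with", "dislike"]

      def kano_category(functional: str, dysfunctional: str) -> str:
          f, d = ANSWERS.index(functional), ANSWERS.index(dysfunctional)
          if (f, d) in ((0, 0), (4, 4)):
              return "questionable"          # contradictory answers
          if f == 0:
              return "one-dimensional" if d == 4 else "attractive"
          if f == 4 or d == 0:
              return "reverse"               # presence is disliked
          return "must-be" if d == 4 else "indifferent"

      print(kano_category("like", "dislike"))       # one-dimensional
      print(kano_category("like", "neutral"))       # attractive
      print(kano_category("neutral", "dislike"))    # must-be
      print(kano_category("neutral", "live-with"))  # indifferent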

  12. Decision-tree approach to evaluating inactive uranium-processing sites for liner requirements

    International Nuclear Information System (INIS)

    Relyea, J.F.

    1983-03-01

    Recently, concern has been expressed about potential toxic effects of both radon emission and release of toxic elements in leachate from inactive uranium mill tailings piles. Remedial action may be required to meet disposal standards set by the states and the US Environmental Protection Agency (EPA). In some cases, a possible disposal option is the exhumation and reburial (either on site or at a new location) of tailings and reliance on engineered barriers to satisfy the objectives established for remedial actions. Liners under disposal pits are the major engineered barrier for preventing contaminant release to ground and surface water. The purpose of this report is to provide a logical sequence of action, in the form of a decision tree, which could be followed to show whether a selected tailings disposal design meets the objectives for subsurface contaminant release without a liner. This information can be used to determine the need for, and the type of, a liner for sites exhibiting a potential groundwater problem. The decision tree is based on the capability of hydrologic and mass transport models to predict the movement of water and contaminants with time. The types of modeling capabilities and data needed for those models are described, and the steps required to predict water and contaminant movement are discussed. A demonstration of the decision tree procedure is given to aid the reader in evaluating the need for, and the adequacy of, a liner

  13. Individual model evaluation and probabilistic weighting of models

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-01-01

    This note stresses the importance of trying to assess the accuracy of each model individually. Putting a Bayesian probability distribution on a population of models faces conceptual and practical complications, and apparently can come only after the work of evaluating the individual models. Moreover, the primary issue is “How good is this model?” Therefore, the individual evaluations are first in both chronology and importance. They are not easy, but some ideas are given here on how to perform them

  14. Evaluation of green house gas emissions models.

    Science.gov (United States)

    2014-11-01

    The objective of the project is to evaluate the GHG emissions models used by transportation agencies and industry leaders. Factors in the vehicle operating environment that may affect modal emissions, such as external conditions, vehicle fleet c...

  15. Evaluation of CASP8 model quality predictions

    KAUST Repository

    Cozzetto, Domenico; Kryshtafovych, Andriy; Tramontano, Anna

    2009-01-01

    established a prediction category to evaluate their performance in 2006. In 2008 the experiment was repeated and its results are reported here. Participants were invited to infer the correctness of the protein models submitted by the registered automatic

  16. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-01-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a "living document" that will be modified over the course of the execution of this work

  17. The case for applying an early-lifecycle technology evaluation methodology to comparative evaluation of requirements engineering research

    Science.gov (United States)

    Feather, Martin S.

    2003-01-01

    The premise of this paper is that there is a useful analogy between evaluation of proposed problem solutions and evaluation of requirements engineering research itself. Both of these application areas face the challenges of evaluation early in the lifecycle, of the need to consider a wide variety of factors, and of the need to combine inputs from multiple stakeholders in making these evaluations and subsequent decisions.

  18. Modeling the minimum enzymatic requirements for optimal cellulose conversion

    International Nuclear Information System (INIS)

    Den Haan, R; Van Zyl, W H; Van Zyl, J M; Harms, T M

    2013-01-01

    Hydrolysis of cellulose is achieved by the synergistic action of endoglucanases, exoglucanases and β-glucosidases. Most cellulolytic microorganisms produce a varied array of these enzymes and the relative roles of the components are not easily defined or quantified. In this study we have used partially purified cellulases produced heterologously in the yeast Saccharomyces cerevisiae to increase our understanding of the roles of some of these components. CBH1 (Cel7), CBH2 (Cel6) and EG2 (Cel5) were separately produced in recombinant yeast strains, allowing their isolation free of any contaminating cellulolytic activity. Binary and ternary mixtures of the enzymes at loadings ranging between 3 and 100 mg g⁻¹ Avicel allowed us to illustrate the relative roles of the enzymes and their levels of synergy. A mathematical model was created to simulate the interactions of these enzymes on crystalline cellulose, under both isolated and synergistic conditions. Laboratory results from the various mixtures at a range of loadings of recombinant enzymes allowed refinement of the mathematical model. The model can further be used to predict the optimal synergistic mixes of the enzymes. This information can subsequently be applied to help to determine the minimum protein requirement for complete hydrolysis of cellulose. Such knowledge will be greatly informative for the design of better enzymatic cocktails or processing organisms for the conversion of cellulosic biomass to commodity products. (letter)
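
    A small sketch of how the degree of synergy in such mixtures is commonly quantified (this is the standard additive-expectation definition, not the paper's specific model; all rates are hypothetical):

        # Degree of synergy: observed activity of the mixture divided by the
        # sum of the activities of the individual enzymes at the same loadings.
        def degree_of_synergy(mixture_rate: float, individual_rates: list[float]) -> float:
            theoretical = sum(individual_rates)
            return mixture_rate / theoretical

        # Hypothetical hydrolysis rates (e.g., g glucose released per hour):
        cbh1_alone, cbh2_alone, eg2_alone = 0.8, 0.6, 0.4
        ternary_mixture = 3.1
        print(degree_of_synergy(ternary_mixture, [cbh1_alone, cbh2_alone, eg2_alone]))
        # -> ~1.72, i.e., the mixture outperforms the additive expectation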

  19. Metrics for Evaluation of Student Models

    Science.gov (United States)

    Pelanek, Radek

    2015-01-01

    Researchers use many different metrics for evaluation of performance of student models. The aim of this paper is to provide an overview of commonly used metrics, to discuss properties, advantages, and disadvantages of different metrics, to summarize current practice in educational data mining, and to provide guidance for evaluation of student…

  20. Evaluation of clinical information modeling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives identified that are developing tools for clinical information modeling, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.

  1. An evaluation of BPMN modeling tools

    NARCIS (Netherlands)

    Yan, Z.; Reijers, H.A.; Dijkman, R.M.; Mendling, J.; Weidlich, M.

    2010-01-01

    Various BPMN modeling tools are available and it is close to impossible to understand their functional differences without simply trying them out. This paper presents an evaluation framework and presents the outcomes of its application to a set of five BPMN modeling tools. We report on various

  2. 42 CFR 418.205 - Special requirements for hospice pre-election evaluation and counseling services.

    Science.gov (United States)

    2010-10-01

    ... evaluation and counseling services. 418.205 Section 418.205 Public Health CENTERS FOR MEDICARE & MEDICAID... Services § 418.205 Special requirements for hospice pre-election evaluation and counseling services. (a... evaluation and counseling services as specified in § 418.304(d) may be made to a hospice on behalf of a...

  3. Evaluation model development for sprinkler irrigation uniformity ...

    African Journals Online (AJOL)

    A new evaluation method with accompanying software was developed to precisely calculate uniformity from catch-can test data, assuming sprinkler distribution data to be a continuous variable. Two interpolation steps are required to compute unknown water application depths at grid distribution points from radial ...

  4. Model-Based Requirements Management in Gear Systems Design Based On Graph-Based Design Languages

    Directory of Open Access Journals (Sweden)

    Kevin Holder

    2017-10-01

    Full Text Available For several decades, a wide-spread consensus concerning the enormous importance of an in-depth clarification of the specifications of a product has been observed. A weak clarification of specifications is repeatedly listed as a main cause for the failure of product development projects. Requirements, which can be defined as the purpose, goals, constraints, and criteria associated with a product development project, play a central role in the clarification of specifications. The collection of activities which ensure that requirements are identified, documented, maintained, communicated, and traced throughout the life cycle of a system, product, or service can be referred to as “requirements engineering”. These activities can be supported by a collection and combination of strategies, methods, and tools which are appropriate for the clarification of specifications. Numerous publications describe the strategy and the components of requirements management. Furthermore, recent research investigates its industrial application. Simultaneously, promising developments of graph-based design languages for a holistic digital representation of the product life cycle are presented. Current developments realize graph-based languages by the diagrams of the Unified Modelling Language (UML), and allow the automatic generation and evaluation of multiple product variants. The research presented in this paper seeks to present a method that combines the advantages of a conscious requirements management process and graph-based design languages. Consequently, the main objective of this paper is the investigation of a model-based integration of requirements in a product development process by means of graph-based design languages. The research method is based on an in-depth analysis of an exemplary industrial product development, a gear system for so-called “Electrical Multiple Units” (EMU). Important requirements were abstracted from a gear system

  5. Evaluation of constitutive models for crushed salt

    International Nuclear Information System (INIS)

    Callahan, G.D.; Loken, M.C.; Hurtado, L.D.; Hansen, F.D.

    1996-01-01

    Three constitutive models are recommended as candidates for describing the deformation of crushed salt. These models are generalized to three-dimensional states of stress to include the effects of mean and deviatoric stress and modified to include effects of temperature, grain size, and moisture content. A database including hydrostatic consolidation and shear consolidation tests conducted on Waste Isolation Pilot Plant (WIPP) and southeastern New Mexico salt is used to determine material parameters for the models. To evaluate the capability of the models, parameter values obtained from fitting the complete database are used to predict the individual tests. Finite element calculations of a WIPP shaft with emplaced crushed salt demonstrate the model predictions

  6. Modeling for Green Supply Chain Evaluation

    Directory of Open Access Journals (Sweden)

    Elham Falatoonitoosi

    2013-01-01

    Full Text Available Green supply chain management (GSCM) has become a practical approach to develop environmental performance. Under strict regulations and stakeholder pressures, enterprises need to enhance and improve GSCM practices, which are influenced by both traditional and green factors. This study developed a causal evaluation model to guide selection of qualified suppliers by prioritizing various criteria and mapping causal relationships to find effective criteria to improve the green supply chain. The aim of the case study was to model and examine the influential and important main GSCM practices, namely, green logistics, organizational performance, green organizational activities, environmental protection, and green supplier evaluation. In the case study, the decision-making trial and evaluation laboratory technique is applied to test the developed model. The result of the case study shows that only the “green supplier evaluation” and “green organizational activities” criteria of the model are in the cause group, and the other criteria are in the effect group.

  7. Large scale Bayesian nuclear data evaluation with consistent model defects

    International Nuclear Information System (INIS)

    Schnabel, G

    2015-01-01

    The aim of nuclear data evaluation is the reliable determination of cross sections and related quantities of the atomic nuclei. To this end, evaluation methods are applied which combine the information of experiments with the results of model calculations. The evaluated observables with their associated uncertainties and correlations are assembled into data sets, which are required for the development of novel nuclear facilities, such as fusion reactors for energy supply, and accelerator driven systems for nuclear waste incineration. The efficiency and safety of such future facilities is dependent on the quality of these data sets and thus also on the reliability of the applied evaluation methods. This work investigated the performance of the majority of available evaluation methods in two scenarios. The study indicated the importance of an essential component in these methods, which is the frequently ignored deficiency of nuclear models. Usually, nuclear models are based on approximations and thus their predictions may deviate from reliable experimental data. As demonstrated in this thesis, neglecting this possibility in evaluation methods can lead to estimates of observables which are inconsistent with experimental data. Due to this finding, an extension of Bayesian evaluation methods is proposed to take into account the deficiency of the nuclear models. The deficiency is modeled as a random function in terms of a Gaussian process and combined with the model prediction. This novel formulation conserves sum rules and allows the magnitude of the model deficiency to be estimated explicitly. Both features have so far been missing in available evaluation methods. Furthermore, two improvements of existing methods have been developed in the course of this thesis. The first improvement concerns methods relying on Monte Carlo sampling. A Metropolis-Hastings scheme with a specific proposal distribution is suggested, which proved to be more efficient in the studied scenarios than the
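
    A minimal sketch of the general idea of adding a Gaussian-process defect term to a deterministic model prediction (a generic illustration, not the thesis's actual formulation; the kernel and all data are hypothetical):

        import numpy as np

        def rbf_kernel(x1, x2, amp=0.1, length=1.0):
            """Squared-exponential covariance for the model-defect process."""
            d = x1[:, None] - x2[None, :]
            return amp**2 * np.exp(-0.5 * (d / length) ** 2)

        # Energies with experimental cross sections (hypothetical values).
        x_obs = np.array([1.0, 2.0, 3.0, 4.0])
        y_obs = np.array([1.05, 0.92, 0.80, 0.75])    # measured values
        y_model = np.array([1.00, 1.00, 0.90, 0.85])  # nuclear-model prediction
        noise = 0.02                                   # experimental std. dev.

        # GP posterior mean of the defect, conditioned on the residuals:
        K = rbf_kernel(x_obs, x_obs) + noise**2 * np.eye(len(x_obs))
        residual = y_obs - y_model
        defect_at_obs = rbf_kernel(x_obs, x_obs) @ np.linalg.solve(K, residual)

        # Corrected evaluation = model prediction + estimated defect.
        print(y_model + defect_at_obs)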

  8. A website evaluation model by integration of previous evaluation models using a quantitative approach

    Directory of Open Access Journals (Sweden)

    Ali Moeini

    2015-01-01

    Full Text Available Given the growth of e-commerce, websites play an essential role in business success. Therefore, many authors have offered website evaluation models since 1995. However, the multiplicity and diversity of these models makes it difficult to integrate them into a single comprehensive model. In this paper a quantitative method is used to integrate previous models into a comprehensive model that is compatible with them. In this approach the researcher's judgment plays no role in the integration, and the new model derives its validity from the 93 previous models and the systematic quantitative approach.

  9. Evaluating Extensions to Coherent Mortality Forecasting Models

    Directory of Open Access Journals (Sweden)

    Syazreen Shair

    2017-03-01

    Full Text Available Coherent models were developed recently to forecast the mortality of two or more sub-populations simultaneously and to ensure long-term non-divergent mortality forecasts of sub-populations. This paper evaluates the forecast accuracy of two recently-published coherent mortality models, the Poisson common factor and the product-ratio functional models. These models are compared to each other and the corresponding independent models, as well as the original Lee–Carter model. All models are applied to age-gender-specific mortality data for Australia and Malaysia and age-gender-ethnicity-specific data for Malaysia. The out-of-sample forecast error of log death rates, male-to-female death rate ratios and life expectancy at birth from each model are compared and examined across groups. The results show that, in terms of overall accuracy, the forecasts of both coherent models are consistently more accurate than those of the independent models for Australia and for Malaysia, but the relative performance differs by forecast horizon. Although the product-ratio functional model outperforms the Poisson common factor model for Australia, the Poisson common factor is more accurate for Malaysia. For the ethnic groups application, ethnic-coherence gives better results than gender-coherence. The results provide evidence that coherent models are preferable to independent models for forecasting sub-populations’ mortality.
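
    For reference, the baseline Lee–Carter structure and the product-ratio decomposition compared in this record can be written as follows (standard formulations from the mortality-forecasting literature, stated here only for orientation):

        % Lee-Carter for one population: age pattern a_x, age response b_x,
        % period mortality index k_t
        \ln m_{x,t} = a_x + b_x k_t + \varepsilon_{x,t}

        % Product-ratio coherent model for two sub-populations F and M:
        % the product p is forecast freely, the ratio r with stationary
        % dynamics, which prevents long-run divergence between the groups.
        p_{x,t} = \sqrt{m_{x,t}^{F}\, m_{x,t}^{M}}, \qquad
        r_{x,t} = \sqrt{m_{x,t}^{F} / m_{x,t}^{M}}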

  10. A hierarchical modeling methodology for the definition and selection of requirements

    Science.gov (United States)

    Dufresne, Stephane

    This dissertation describes the development of a requirements analysis methodology that takes into account the concept of operations and the hierarchical decomposition of aerospace systems. At the core of the methodology, the Analytic Network Process (ANP) is used to ensure the traceability between the qualitative and quantitative information present in the hierarchical model. The proposed methodology is applied to the requirements definition of a hurricane tracker Unmanned Aerial Vehicle. Three research objectives are identified in this work: (1) improve the requirements mapping process by matching the stakeholder expectations with the concept of operations, systems and available resources; (2) reduce the epistemic uncertainty surrounding the requirements and requirements mapping; and (3) improve the requirements down-selection process by taking into account the level of importance of the criteria and the available resources. Several challenges are associated with the identification and definition of requirements. The complexity of the system implies that a large number of requirements are needed to define the systems. These requirements are defined early in the conceptual design, where the level of knowledge is relatively low and the level of uncertainty is large. The proposed methodology intends to increase the level of knowledge and reduce the level of uncertainty by guiding the design team through a structured process. To address these challenges, a new methodology is created to flow down the requirements from the stakeholder expectations to the systems alternatives. A taxonomy of requirements is created to classify the information gathered during the problem definition. Subsequently, the operational and systems functions and measures of effectiveness are integrated to a hierarchical model to allow the traceability of the information. Monte Carlo methods are used to evaluate the variations of the hierarchical model elements and consequently reduce the
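
    A minimal sketch of the ANP mechanics at the core of such a hierarchical model: priorities emerge as the limit of powers of a column-stochastic supermatrix (generic ANP computation with a hypothetical matrix, not the dissertation's model):

        import numpy as np

        # Column-stochastic supermatrix linking criteria and alternatives
        # (hypothetical pairwise-comparison results).
        W = np.array([
            [0.0, 0.6, 0.3],
            [0.5, 0.0, 0.7],
            [0.5, 0.4, 0.0],
        ])

        def limit_priorities(W: np.ndarray, tol: float = 1e-10) -> np.ndarray:
            """Raise the supermatrix to powers until its columns stabilize."""
            P = W.copy()
            while True:
                P_next = P @ W
                if np.max(np.abs(P_next - P)) < tol:
                    return P_next[:, 0]  # any column of the limit matrix
                P = P_next

        print(limit_priorities(W))  # stable priority vector across elements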

  11. A 2nd generation static model of greenhouse energy requirements (horticern) : a comparison with dynamic models

    CERN Document Server

    Jolliet, O; Munday, G L

    1989-01-01

    Optimisation of a greenhouse and its components requires a suitable model permitting precise determination of its energy requirements. Existing static models are simple but lack precision; dynamic models, though more precise, are unsuitable for use over long periods and difficult to handle in practice. A theoretical study and measurements from the CERN trial greenhouse have allowed the development of a new static model named "HORTICERN", which is precise and easy to use for predicting energy consumption and which takes into account the effects of solar energy, wind and radiative loss to the sky. This paper compares the HORTICERN model with the dynamic models of Bot, Takakura, Van Bavel and Gembloux, and demonstrates that its precision is comparable: differences are on average less than 5%, independent of the type of greenhouse (e.g. single or double glazing, Hortiplus, etc.) and climate. The HORTICERN method has been developed for PC use and is proving to be a powerful tool for greenhouse optimisation by research work...
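
    A minimal sketch of the kind of static energy balance such a model computes (a generic greenhouse heat balance under assumed coefficients, not the published HORTICERN equations):

        def heating_requirement(t_in, t_out, wind, solar, area,
                                u0=6.0, u_wind=0.3, tau=0.7, sky_loss=20.0):
            """Static hourly heat demand in W for a glazed area. All
            coefficients are illustrative assumptions, not HORTICERN values:
            u0 = base heat-loss coefficient [W/m2K], u_wind = wind correction,
            tau = solar transmittance, sky_loss = net radiative loss [W/m2]."""
            u = u0 + u_wind * wind                  # wind-dependent losses
            conduction = u * area * (t_in - t_out)  # conduction/convection
            solar_gain = tau * solar * area         # transmitted solar energy
            radiation = sky_loss * area             # radiative loss to the sky
            return max(conduction + radiation - solar_gain, 0.0)

        # Example: a clear night (no solar gain), light wind, 10 K difference.
        print(heating_requirement(t_in=18, t_out=8, wind=3, solar=0.0, area=100))
        # -> 8900.0 W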

  12. Saphire models and software for ASP evaluations

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The Idaho National Engineering Laboratory (INEL) over the past three years has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented

  13. Defining Requirements and Applying Information Modeling for Protecting Enterprise Assets

    Science.gov (United States)

    Fortier, Stephen C.; Volk, Jennifer H.

    The advent of terrorist threats has heightened local, regional, and national governments' interest in emergency response and disaster preparedness. The threat of natural disasters also challenges emergency responders to act swiftly and in a coordinated fashion. When a disaster occurs, an ad hoc coalition of pre-planned groups usually forms to respond to the incident. History has shown that these “systems of systems” do not interoperate very well. Communications between fire, police and rescue components either do not work or are inefficient. Government agencies, non-governmental organizations (NGOs), and private industry use a wide array of software platforms for managing data about emergency conditions, resources and response activities. Most of these are stand-alone systems with very limited capability for data sharing with other agencies or other levels of government. Information technology advances have facilitated the movement towards an integrated and coordinated approach to emergency management. Other communication mechanisms, such as video teleconferencing, digital television and radio broadcasting, are being utilized to combat the challenges of emergency information exchange. Recent disasters, such as Hurricane Katrina and the tsunami in Indonesia, have illuminated the weaknesses in emergency response. This paper will discuss the need for defining requirements for components of ad hoc coalitions which are formed to respond to disasters. A goal of our effort was to develop a proof of concept showing that it is feasible to apply information modeling to the business processes used to protect an enterprise and mitigate its potential losses. These activities would be modeled both pre- and post-incident.

  14. Modeling Energy and Development : An Evaluation of Models and Concepts

    NARCIS (Netherlands)

    Ruijven, Bas van; Urban, Frauke; Benders, René M.J.; Moll, Henri C.; Sluijs, Jeroen P. van der; Vries, Bert de; Vuuren, Detlef P. van

    2008-01-01

    Most global energy models are developed by institutes from developed countries, focusing primarily on issues that are important in industrialized countries. Evaluation of the results for Asia of the IPCC/SRES models shows that broad concepts of energy and development, the energy ladder and the

  15. Model visualization for evaluation of biocatalytic processes

    DEFF Research Database (Denmark)

    Law, HEM; Lewis, DJ; McRobbie, I

    2008-01-01

    Biocatalysis offers great potential as an additional, and in some cases as an alternative, synthetic tool for organic chemists, especially as a route to introduce chirality. However, the implementation of scalable biocatalytic processes nearly always requires the introduction of process and/or bi......,S-EDDS), a biodegradable chelant, and is characterised by the use of model visualization using "windows of operation".

  16. Evaluating Predictive Uncertainty of Hyporheic Exchange Modelling

    Science.gov (United States)

    Chow, R.; Bennett, J.; Dugge, J.; Wöhling, T.; Nowak, W.

    2017-12-01

    Hyporheic exchange is the interaction of water between rivers and groundwater, and is difficult to predict. One of the largest contributions to predictive uncertainty for hyporheic fluxes has been attributed to the representation of heterogeneous subsurface properties. This research aims to evaluate which aspect of the subsurface representation - the spatial distribution of hydrofacies or the model for local-scale (within-facies) heterogeneity - most influences the predictive uncertainty. Also, we seek to identify data types that help reduce this uncertainty best. For this investigation, we conduct a modelling study of the Steinlach River meander, in Southwest Germany. The Steinlach River meander is an experimental site established in 2010 to monitor hyporheic exchange at the meander scale. We use HydroGeoSphere, a fully integrated surface water-groundwater model, to model hyporheic exchange and to assess the predictive uncertainty of hyporheic exchange transit times (HETT). A highly parameterized complex model is built and treated as 'virtual reality', which is in turn modelled with simpler subsurface parameterization schemes (Figure). Then, we conduct Monte-Carlo simulations with these models to estimate the predictive uncertainty. Results indicate that: Uncertainty in HETT is relatively small for early times and increases with transit times. Uncertainty from local-scale heterogeneity is negligible compared to uncertainty in the hydrofacies distribution. Introducing more data to a poor model structure may reduce predictive variance, but does not reduce predictive bias. Hydraulic head observations alone cannot constrain the uncertainty of HETT; however, an estimate of hyporheic exchange flux proves to be more effective at reducing this uncertainty. Figure: Approach for evaluating predictive model uncertainty. A conceptual model is first developed from the field investigations. A complex model ('virtual reality') is then developed based on that conceptual model
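
    A minimal sketch of the Monte-Carlo comparison described here: draw parameter sets for a simplified model, run it, and compare predictive spread and bias against the 'virtual reality' output (purely illustrative stand-in functions, not HydroGeoSphere):

        import numpy as np

        rng = np.random.default_rng(0)

        def virtual_reality_transit_time():
            """Stand-in for the complex 'virtual truth' model output [days]."""
            return 12.0

        def simplified_model(k_facies: float) -> float:
            """Stand-in for a simpler parameterization: transit time scales
            inversely with an effective hydrofacies conductivity."""
            return 10.0 / k_facies

        # Monte-Carlo over uncertain effective conductivity (lognormal prior).
        samples = rng.lognormal(mean=0.0, sigma=0.4, size=5000)
        predictions = np.array([simplified_model(k) for k in samples])

        truth = virtual_reality_transit_time()
        print("predictive mean:", predictions.mean())
        print("predictive std :", predictions.std())          # spread = uncertainty
        print("bias           :", predictions.mean() - truth)  # structural error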

  17. Evaluation of data requirements for computerized constructability analysis of pavement rehabilitation projects.

    Science.gov (United States)

    2013-08-01

    This research aimed to evaluate the data requirements for computer assisted construction planning and staging methods that can be implemented in pavement rehabilitation projects in the state of Georgia. Results showed that two main issues for the...

  18. Functional Fit Evaluation to Determine Optimal Ease Requirements in Canadian Forces Chemical Protective Gloves

    National Research Council Canada - National Science Library

    Tremblay-Lutter, Julie

    1995-01-01

    A functional fit evaluation of the Canadian Forces (CF) chemical protective lightweight glove was undertaken in order to quantify the amount of ease required within the glove for optimal functional fit...

  19. The Modeling of Factors That Influence Coast Guard Manpower Requirements

    Science.gov (United States)

    2014-12-01

    applications, and common data warehouses needed to fully develop an effective and efficient manpower requirements engineering and management program. The... manpower requirements determination ensures a ready force, and safe and effective mission execution. Shortage or excess of manpower is the catalyst...

  20. Models and data requirements for human reliability analysis

    International Nuclear Information System (INIS)

    1989-03-01

    It has been widely recognised for many years that the safety of the nuclear power generation depends heavily on the human factors related to plant operation. This has been confirmed by the accidents at Three Mile Island and Chernobyl. Both these cases revealed how human actions can defeat engineered safeguards and the need for special operator training to cover the possibility of unexpected plant conditions. The importance of the human factor also stands out in the analysis of abnormal events and insights from probabilistic safety assessments (PSA's), which reveal a large proportion of cases having their origin in faulty operator performance. A consultants' meeting, organized jointly by the International Atomic Energy Agency (IAEA) and the International Institute for Applied Systems Analysis (IIASA) was held at IIASA in Laxenburg, Austria, December 7-11, 1987, with the aim of reviewing existing models used in Probabilistic Safety Assessment (PSA) for Human Reliability Analysis (HRA) and of identifying the data required. The report collects both the contributions offered by the members of the Expert Task Force and the findings of the extensive discussions that took place during the meeting. Refs, figs and tabs

  1. CRITICAL ANALYSIS OF EVALUATION MODEL LOMCE

    Directory of Open Access Journals (Sweden)

    José Luis Bernal Agudo

    2015-06-01

    Full Text Available The evaluation model that the LOMCE proposes is rooted in neoliberal beliefs, reflecting a specific way of understanding the world. What matters is not the process but the results, with evaluation at the center of the teaching-learning processes. Its planning is flawed, since the theory that justifies the model is not developed into coherent proposals; there is an excessive concern for excellence, and diversity is left out. A comprehensive way of understanding education should be recovered.

  2. Adapting Evaluations of Alternative Payment Models to a Changing Environment.

    Science.gov (United States)

    Grannemann, Thomas W; Brown, Randall S

    2018-04-01

    To identify the most robust methods for evaluating alternative payment models (APMs) in the emerging health care delivery system environment. We assess the impact of widespread testing of alternative payment models on the ability to find credible comparison groups. We consider the applicability of factorial research designs for assessing the effects of these models. The widespread adoption of alternative payment models could effectively eliminate the possibility of comparing APM results with a "pure" control or comparison group unaffected by other interventions. In this new environment, factorial experiments have distinct advantages over the single-model experimental or quasi-experimental designs that have been the mainstay of recent tests of Medicare payment and delivery models. The best prospects for producing definitive evidence of the effects of payment incentives for APMs include fractional factorial experiments that systematically vary requirements and payment provisions within a payment model. © Health Research and Educational Trust.
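
    A small sketch of what a factorial test of payment provisions looks like in practice: every combination of provision levels becomes a study arm, so each provision's main effect can be estimated from all arms at once (the factors are hypothetical, and a full rather than fractional design is shown for brevity):

        from itertools import product

        # Two payment-model provisions, each tested at two levels (hypothetical).
        factors = {
            "quality_bonus": ["low", "high"],
            "risk_sharing":  ["upside_only", "two_sided"],
        }

        # Full 2x2 factorial: every combination becomes a study arm.
        arms = [dict(zip(factors, combo)) for combo in product(*factors.values())]
        for i, arm in enumerate(arms, 1):
            print(f"arm {i}: {arm}")
        # A fractional design would run a structured subset of these arms,
        # trading some interaction information for fewer required groups.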

  3. Study on team evaluation. Team process model for team evaluation

    International Nuclear Information System (INIS)

    Sasou Kunihide; Ebisu, Mitsuhiro; Hirose, Ayako

    2004-01-01

    Several studies have been done to evaluate or improve team performance in the nuclear and aviation industries. Crew resource management is a typical example. In addition, team evaluation has recently gathered interest for other teams of lawyers, medical staff, accountants, psychiatrists, executives, etc. However, most evaluation methods focus on the results of team behavior that can be observed through training or actual business situations. What is expected of a team is not only resolving problems but also training younger members destined to lead the next generation. Therefore, the authors set the final goal of this study as establishing a series of methods to evaluate and improve teams inclusively in such respects as decision making, motivation, staffing, etc. As the first step, this study develops a team process model describing viewpoints for the evaluation. The team process is defined as the kinds of power that activate or inactivate the competencies of individuals, which are the components of the team's competency. To find the team processes, the authors discussed the merits of team behavior with experienced training instructors and shift supervisors of nuclear/thermal power plants. The discussion identified four team merits and many components that realize those merits. Classifying those components into eight groups of team processes, namely 'Orientation', 'Decision Making', 'Power and Responsibility', 'Workload Management', 'Professional Trust', 'Motivation', 'Training' and 'Staffing', the authors propose a Team Process Model with two to four sub-processes in each team process. In the future, the authors will develop methods to evaluate some of the team processes for nuclear/thermal power plant operation teams. (author)

  4. Evaluating significance in linear mixed-effects models in R.

    Science.gov (United States)

    Luke, Steven G

    2017-08-01

    Mixed-effects models are being used ever more frequently in the analysis of experimental data. However, in the lme4 package in R the standards for evaluating significance of fixed effects in these models (i.e., obtaining p-values) are somewhat vague. There are good reasons for this, but as researchers who are using these models are required in many cases to report p-values, some method for evaluating the significance of the model output is needed. This paper reports the results of simulations showing that the two most common methods for evaluating significance, using likelihood ratio tests and applying the z distribution to the Wald t values from the model output (t-as-z), are somewhat anti-conservative, especially for smaller sample sizes. Other methods for evaluating significance, including parametric bootstrapping and the Kenward-Roger and Satterthwaite approximations for degrees of freedom, were also evaluated. The results of these simulations suggest that Type 1 error rates are closest to .05 when models are fitted using REML and p-values are derived using the Kenward-Roger or Satterthwaite approximations, as these approximations both produced acceptable Type 1 error rates even for smaller samples.
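
    A minimal sketch of the likelihood-ratio approach discussed in this record, transposed to Python's statsmodels (the original work concerns R's lme4; this is only an equivalent illustration, and the simulated data frame is hypothetical):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf
        from scipy import stats

        # Hypothetical data: 40 subjects, 10 observations each, with a
        # subject-level random intercept and a fixed effect of x.
        rng = np.random.default_rng(1)
        n_subj, n_obs = 40, 10
        subj = np.repeat(np.arange(n_subj), n_obs)
        x = rng.normal(size=n_subj * n_obs)
        y = 0.3 * x + rng.normal(size=n_subj)[subj] + rng.normal(size=n_subj * n_obs)
        df = pd.DataFrame({"y": y, "x": x, "subject": subj})

        # Models must be fitted with ML (not REML) for a likelihood-ratio
        # test of fixed effects.
        full = smf.mixedlm("y ~ x", df, groups=df["subject"]).fit(reml=False)
        null = smf.mixedlm("y ~ 1", df, groups=df["subject"]).fit(reml=False)

        lr = 2 * (full.llf - null.llf)
        p_value = stats.chi2.sf(lr, df=1)   # one fixed effect removed
        print(lr, p_value)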

  5. Evaluation-Function-based Model-free Adaptive Fuzzy Control

    Directory of Open Access Journals (Sweden)

    Agus Naba

    2016-12-01

    Full Text Available Designs of adaptive fuzzy controllers (AFC) are commonly based on the Lyapunov approach, which requires a known model of the controlled plant. They need to consider a Lyapunov function candidate as an evaluation function to be minimized. In this study these drawbacks were handled by designing a model-free adaptive fuzzy controller (MFAFC) using an approximate evaluation function defined in terms of the current state, the next state, and the control action. MFAFC considers the approximate evaluation function as an evaluative control performance measure similar to the state-action value function in reinforcement learning. The simulation results of applying MFAFC to the inverted pendulum benchmark verified the proposed scheme’s efficacy.
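
    A minimal sketch of the evaluative idea described: score a transition with an evaluation function of (state, next state, action) and adjust the controller parameter in the direction that improves the score (a generic gradient-free stand-in, not the paper's actual MFAFC update law):

        import numpy as np

        rng = np.random.default_rng(2)

        def plant(state: float, u: float) -> float:
            """Toy unknown plant: we only observe transitions, not the model."""
            return 0.9 * state + 0.5 * u

        def evaluation(state, next_state, u) -> float:
            """Approximate evaluation of one transition: penalize distance
            from the setpoint (0) and control effort, like a cost-to-go."""
            return next_state**2 + 0.01 * u**2

        theta = 0.0   # single controller gain: u = -theta * state
        state = 5.0
        for _ in range(200):
            # Perturb the gain and keep the perturbation if it evaluates better.
            trial = theta + rng.normal(scale=0.05)
            u_now, u_try = -theta * state, -trial * state
            if evaluation(state, plant(state, u_try), u_try) < \
               evaluation(state, plant(state, u_now), u_now):
                theta = trial
            state = plant(state, -theta * state)

        print(theta, state)  # gain adapts; state is driven toward the setpoint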

  6. 45 CFR 2516.810 - What types of evaluations are grantees and subgrantees required to perform?

    Science.gov (United States)

    2010-10-01

    ... Welfare (Continued) CORPORATION FOR NATIONAL AND COMMUNITY SERVICE SCHOOL-BASED SERVICE-LEARNING PROGRAMS...? All grantees and subgrantees are required to perform internal evaluations which are ongoing efforts to assess performance and improve quality. Grantees and subgrantees may, but are not required to, arrange...

  7. Spatial resolution requirements for traffic-related air pollutant exposure evaluations

    Science.gov (United States)

    Batterman, Stuart; Chambliss, Sarah; Isakov, Vlad

    2014-09-01

    Vehicle emissions represent one of the most important air pollution sources in most urban areas, and elevated concentrations of pollutants found near major roads have been associated with many adverse health impacts. To understand these impacts, exposure estimates should reflect the spatial and temporal patterns observed for traffic-related air pollutants. This paper evaluates the spatial resolution and zonal systems required to estimate accurately intraurban and near-road exposures of traffic-related air pollutants. The analyses use the detailed information assembled for a large (800 km2) area centered on Detroit, Michigan, USA. Concentrations of nitrogen oxides (NOx) due to vehicle emissions were estimated using hourly traffic volumes and speeds on 9700 links representing all but minor roads in the city, the MOVES2010 emission model, the RLINE dispersion model, local meteorological data, a temporal resolution of 1 h, and spatial resolution as low as 10 m. Model estimates were joined with the corresponding shape files to estimate residential exposures for 700,000 individuals at property parcel, census block, census tract, and ZIP code levels. We evaluate joining methods, the spatial resolution needed to meet specific error criteria, and the extent of exposure misclassification. To portray traffic-related air pollutant exposure, raster or inverse distance-weighted interpolations are superior to nearest neighbor approaches, and interpolations between receptors and points of interest should not exceed about 40 m near major roads, and 100 m at larger distances. For census tracts and ZIP codes, average exposures are overestimated since few individuals live very near major roads, the range of concentrations is compressed, most exposures are misclassified, and high concentrations near roads are entirely omitted. While smaller zones improve performance considerably, even block-level data can misclassify many individuals. To estimate exposures and impacts of traffic
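
    A minimal sketch of the inverse-distance-weighted interpolation recommended above over nearest-neighbor joins (generic IDW, with hypothetical receptor data; the 40 m cutoff mirrors the guidance in the record):

        import numpy as np

        def idw(xy_receptors, values, xy_targets, power=2.0, max_dist=40.0):
            """Inverse-distance-weighted interpolation from model receptors
            to points of interest (e.g., residences)."""
            out = np.full(len(xy_targets), np.nan)
            for i, p in enumerate(xy_targets):
                d = np.linalg.norm(xy_receptors - p, axis=1)
                if d.min() > max_dist:
                    continue                      # too far to interpolate safely
                if d.min() < 1e-9:
                    out[i] = values[d.argmin()]   # exactly on a receptor
                    continue
                w = 1.0 / d**power
                out[i] = np.sum(w * values) / np.sum(w)
            return out

        # Hypothetical NOx receptors (x, y in meters) and one residence.
        receptors = np.array([[0, 0], [30, 0], [0, 30], [30, 30]], float)
        nox = np.array([80.0, 60.0, 55.0, 40.0])   # ppb at each receptor
        print(idw(receptors, nox, np.array([[15.0, 15.0]])))  # -> ~58.75 ppb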

  8. 45 CFR 2516.820 - What types of internal evaluation activities are required of programs?

    Science.gov (United States)

    2010-10-01

    ... required to: (a) Continuously assess management effectiveness, the quality of services provided, and the satisfaction of both participants and service recipients. Internal evaluations should seek frequent feedback... 45 Public Welfare 4 2010-10-01 2010-10-01 false What types of internal evaluation activities are...

  9. Non-parametric probabilistic forecasts of wind power: required properties and evaluation

    DEFF Research Database (Denmark)

    Pinson, Pierre; Nielsen, Henrik Aalborg; Møller, Jan Kloppenborg

    2007-01-01

    of a single or a set of quantile forecasts. The required and desirable properties of such probabilistic forecasts are defined and a framework for their evaluation is proposed. This framework is applied for evaluating the quality of two statistical methods producing full predictive distributions from point...
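
    A minimal sketch of the standard check (pinball) loss used to evaluate quantile forecasts of this kind (the scoring rule is the conventional one; all numbers are hypothetical):

        import numpy as np

        def pinball_loss(y_true: np.ndarray, y_quantile: np.ndarray, tau: float) -> float:
            """Average check loss for forecasts of the tau-quantile; the true
            quantile minimizes the expected loss, making the score proper."""
            diff = y_true - y_quantile
            return float(np.mean(np.maximum(tau * diff, (tau - 1) * diff)))

        # Hypothetical observed wind power and 90%-quantile forecasts (MW):
        observed = np.array([12.0, 18.5, 7.2, 22.1])
        q90 = np.array([15.0, 20.0, 10.0, 21.0])
        print(pinball_loss(observed, q90, tau=0.90))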

  10. Evaluating Performances of Traffic Noise Models

    African Journals Online (AJOL)

    Traffic noise levels in decibels dB(A) were measured at six locations using a 407780A Integrating Sound Level Meter, while spot speed and traffic volume were collected with a cine-camera. The predicted sound exposure level (SEL) was evaluated using the Burgess, British and FWHA models. The average noise levels obtained are 77.64 ...

  11. Performance Evaluation Model for Application Layer Firewalls.

    Science.gov (United States)

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
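
    A minimal sketch of the Erlang-style queue computation underlying such a model: treat one layer as an M/M/c service station and derive throughput-related metrics from the offered load (textbook M/M/c formulas; the arrival and service rates are hypothetical):

        from math import factorial

        def mmc_metrics(lam: float, mu: float, c: int):
            """Classic M/M/c results: probability of waiting (Erlang C),
            mean time in queue, and mean number in system."""
            rho = lam / (c * mu)
            assert rho < 1, "queue must be stable"
            a = lam / mu                      # offered load in Erlangs
            p0 = 1.0 / (sum(a**k / factorial(k) for k in range(c))
                        + a**c / (factorial(c) * (1 - rho)))
            erlang_c = (a**c / (factorial(c) * (1 - rho))) * p0
            wq = erlang_c / (c * mu - lam)    # mean waiting time in queue
            n = lam * (wq + 1.0 / mu)         # Little's law: number in system
            return erlang_c, wq, n

        # Hypothetical: 120 requests/s arriving at 3 service desks of 50 req/s.
        print(mmc_metrics(lam=120.0, mu=50.0, c=3))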

  12. Evaluation of Usability Utilizing Markov Models

    Science.gov (United States)

    Penedo, Janaina Rodrigues; Diniz, Morganna; Ferreira, Simone Bacellar Leal; Silveira, Denis S.; Capra, Eliane

    2012-01-01

    Purpose: The purpose of this paper is to analyze the usability of a remote learning system in its initial development phase, using a quantitative usability evaluation method through Markov models. Design/methodology/approach: The paper opted for an exploratory study. The data of interest of the research correspond to the possible accesses of users…

  13. Performance Evaluation Model for Application Layer Firewalls.

    Directory of Open Access Journals (Sweden)

    Shichang Xuan

    Full Text Available Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.

  14. Credit Risk Evaluation : Modeling - Analysis - Management

    OpenAIRE

    Wehrspohn, Uwe

    2002-01-01

    An analysis and further development of the building blocks of modern credit risk management: -Definitions of default -Estimation of default probabilities -Exposures -Recovery Rates -Pricing -Concepts of portfolio dependence -Time horizons for risk calculations -Quantification of portfolio risk -Estimation of risk measures -Portfolio analysis and portfolio improvement -Evaluation and comparison of credit risk models -Analytic portfolio loss distributions The thesis contributes to the evaluatio...

  15. Orbiter data reduction complex data processing requirements for the OFT mission evaluation team (level C)

    Science.gov (United States)

    1979-01-01

    This document addresses requirements for post-test data reduction in support of the Orbital Flight Tests (OFT) mission evaluation team, specifically those which are planned to be implemented in the ODRC (Orbiter Data Reduction Complex). Only those requirements which have been previously baselined by the Data Systems and Analysis Directorate configuration control board are included. This document serves as the control document between Institutional Data Systems Division and the Integration Division for OFT mission evaluation data processing requirements, and shall be the basis for detailed design of ODRC data processing systems.

  16. A model to evaluate quality and effectiveness of disease management.

    Science.gov (United States)

    Lemmens, K M M; Nieboer, A P; van Schayck, C P; Asin, J D; Huijsman, R

    2008-12-01

    Disease management has emerged as a new strategy to enhance quality of care for patients suffering from chronic conditions, and to control healthcare costs. So far, however, the effects of this strategy remain unclear. Although current models define the concept of disease management, they do not provide a systematic development or an explanatory theory of how disease management affects the outcomes of care. The objective of this paper is to present a framework for valid evaluation of disease-management initiatives. The evaluation model is built on two pillars of disease management: patient-related and professional-directed interventions. The effectiveness of these interventions is thought to be affected by the organisational design of the healthcare system. Disease management requires a multifaceted approach; hence disease-management programme evaluations should focus on the effects of multiple interventions, namely patient-related, professional-directed and organisational interventions. The framework has been built upon the conceptualisation of these disease-management interventions. Analysis of the underlying mechanisms of these interventions revealed that learning and behavioural theories support the core assumptions of disease management. The evaluation model can be used to identify the components of disease-management programmes and the mechanisms behind them, making valid comparison feasible. In addition, this model links the programme interventions to indicators that can be used to evaluate the disease-management programme. Consistent use of this framework will enable comparisons among disease-management programmes and outcomes in evaluation research.

  17. Estimating the incremental net health benefit of requirements for cardiovascular risk evaluation for diabetes therapies.

    Science.gov (United States)

    Chawla, Anita J; Mytelka, Daniel S; McBride, Stephan D; Nellesen, Dave; Elkins, Benjamin R; Ball, Daniel E; Kalsekar, Anupama; Towse, Adrian; Garrison, Louis P

    2014-03-01

    To evaluate the advantages and disadvantages of pre-approval requirements for safety data to detect cardiovascular (CV) risk contained in the December 2008 U.S. Food and Drug Administration (FDA) guidance for developing type 2 diabetes drugs compared with the February 2008 FDA draft guidance from the perspective of diabetes population health. We applied the incremental net health benefit (INHB) framework to quantify the benefits and risks of investigational diabetes drugs using a common survival metric (life-years [LYs]). We constructed a decision analytic model for clinical program development consistent with the requirements of each guidance and simulated diabetes drugs, some of which had elevated CV risk. Assuming constant research budgets, we estimate the impact of increased trial size on drugs investigated. We aggregate treatment benefit and CV risks for each approved drug over a 35-year horizon under each guidance. The quantitative analysis suggests that the December 2008 guidance adversely impacts diabetes population health. INHB was -1.80 million LYs, attributable to delayed access to diabetes therapies (-0.18 million LYs) and fewer drugs (-1.64 million LYs), but partially offset by reduced CV risk exposure (0.02 million LYs). Results were robust in sensitivity analyses. The health outcomes impact of all potential benefits and risks should be evaluated in a common survival measure, including health gain from avoided adverse events, lost health benefits from delayed or forgone efficacious products, and the impact of alternative policy approaches. Quantitative analysis of the December 2008 FDA guidance for diabetes therapies indicates that negative impact on patient health will result. Copyright © 2014 The Authors. Pharmacoepidemiology and Drug Safety published by John Wiley & Sons, Ltd.
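
    A small worked restatement of the record's arithmetic in its common survival metric (the three components below are the figures quoted in the abstract itself):

        # Components of incremental net health benefit (million life-years),
        # as reported in the abstract for the December 2008 guidance:
        delayed_access  = -0.18   # later availability of diabetes therapies
        fewer_drugs     = -1.64   # fixed budgets fund fewer investigational drugs
        avoided_cv_risk = +0.02   # benefit of screening out elevated CV risk

        inhb = delayed_access + fewer_drugs + avoided_cv_risk
        print(f"INHB = {inhb:+.2f} million life-years")  # -> -1.80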

  18. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    Full Text Available The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction to the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product specific meta models can be generalized to meta reference models, which help to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users in specifying their requirements for a workflow management system.

  19. Dependencies among Architectural Views Got from Software Requirements Based on a Formal Model

    Directory of Open Access Journals (Sweden)

    Osis Janis

    2014-12-01

    Full Text Available A system architect has software requirements and some unspecified knowledge about a problem domain (e.g., an enterprise) as source information for the assessment and evaluation of possible solutions and for reaching the target point, a preliminary software design. The deciding factor is the architect's experience and expertise in the problem domain (“AS-IS”). The proposed approach is dedicated to assisting a system architect in making an appropriate decision on the solution (“TO-BE”). It is based on a formal mathematical model, the Topological Functioning Model (TFM). Compliant TFMs can be transformed into software architectural views. The paper demonstrates and discusses tracing dependency links from the requirements to and between the architectural views.

  20. Department of Defense Enterprise Requirements and Acquisition Model

    Science.gov (United States)

    2011-06-01

    ...collected through a series of interviews with space requirements and acquisition personnel from the AFSPC Requirements directorate (AFSPC/A5), the Under... of the many ExtendSim® icons are described and illustrated in Figure 11. The “Event/Activity” icon is implemented with a time duration allowing a

  1. Modelling and evaluation of surgical performance using hidden Markov models.

    Science.gov (United States)

    Megali, Giuseppe; Sinigaglia, Stefano; Tonet, Oliver; Dario, Paolo

    2006-10-01

    Minimally invasive surgery has become very widespread in the last ten years. Since surgeons experience difficulties in learning and mastering minimally invasive techniques, the development of training methods is of great importance. While the introduction of virtual reality-based simulators has introduced a new paradigm in surgical training, skill evaluation methods are far from being objective. This paper proposes a method for defining a model of surgical expertise and an objective metric to evaluate performance in laparoscopic surgery. Our approach is based on the processing of kinematic data describing movements of surgical instruments. We use hidden Markov model theory to define an expert model that describes expert surgical gesture. The model is trained on kinematic data related to exercises performed on a surgical simulator by experienced surgeons. Subsequently, we use this expert model as a reference model in the definition of an objective metric to evaluate performance of surgeons with different abilities. Preliminary results show that, using different topologies for the expert model, the method can be efficiently used both for the discrimination between experienced and novice surgeons, and for the quantitative assessment of surgical ability.
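
    A minimal sketch of the expert-model idea using the hmmlearn library (assuming it is installed; the kinematic features and dimensions are hypothetical stand-ins for instrument-motion data, not the paper's dataset):

        import numpy as np
        from hmmlearn import hmm

        rng = np.random.default_rng(3)

        # Hypothetical kinematic sequences: rows = time steps, cols = features
        # (e.g., tool-tip velocity components). One concatenated training
        # record from expert surgeons, split into per-trial lengths.
        expert_data = rng.normal(size=(600, 3))
        lengths = [200, 200, 200]                 # three expert trials

        # Train the expert model: a Gaussian HMM over surgical gestures.
        expert_model = hmm.GaussianHMM(n_components=5, covariance_type="diag",
                                       n_iter=50, random_state=0)
        expert_model.fit(expert_data, lengths)

        # Evaluate a new trial: a higher per-frame log-likelihood under the
        # expert model indicates gestures closer to expert behaviour.
        trial = rng.normal(size=(180, 3))
        score = expert_model.score(trial) / len(trial)
        print(score)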

  2. A Model for Telestroke Network Evaluation

    DEFF Research Database (Denmark)

    Storm, Anna; Günzel, Franziska; Theiss, Stephan

    2011-01-01

    analysis lacking, current telestroke reimbursement by third-party payers is limited to special contracts and not included in the regular billing system. Based on a systematic literature review and expert interviews with health care economists, third-party payers and neurologists, a Markov model...... was developed from the third-party payer perspective. In principle, it enables telestroke networks to conduct cost-effectiveness studies, because the majority of the required data can be extracted from health insurance companies’ databases and the telestroke network itself. The model presents a basis...
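
    A minimal sketch of a third-party-payer Markov cohort model of the kind described (the states, transition probabilities, and costs are hypothetical placeholders, not values from the study):

        import numpy as np

        # States: 0=independent, 1=dependent, 2=dead (annual cycle).
        P = np.array([
            [0.85, 0.10, 0.05],   # transition probabilities from 'independent'
            [0.05, 0.80, 0.15],   # from 'dependent'
            [0.00, 0.00, 1.00],   # death is absorbing
        ])
        annual_cost = np.array([2_000.0, 25_000.0, 0.0])   # payer cost per state
        discount = 0.03

        cohort = np.array([1.0, 0.0, 0.0])   # everyone starts independent
        total_cost = 0.0
        for year in range(20):
            total_cost += (cohort @ annual_cost) / (1 + discount) ** year
            cohort = cohort @ P                # advance the cohort one cycle

        print(f"expected discounted 20-year cost per patient: {total_cost:,.0f}")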

  3. Evaluation of the BPMN According to the Requirements of the Enterprise Architecture Methodology

    Directory of Open Access Journals (Sweden)

    Václav Řepa

    2012-04-01

    Full Text Available This article evaluates some characteristics of the Business Process Modelling Notation from the perspective of the business system modelling methodology. Firstly the enterprise architecture context of the business process management as well as the importance of standards are discussed. Then the Business System Modelling Methodology is introduced with special attention paid to the Business Process Meta-model as a basis for the evaluation of the BPMN features. Particular basic concepts from the Business Process Meta-model are mapped to the usable constructs of the BPMN and related issues are analysed. Finally the basic conclusions are made and the general context is discussed.

  4. A New Rapid Simplified Model for Urban Rainstorm Inundation with Low Data Requirements

    Directory of Open Access Journals (Sweden)

    Ji Shen

    2016-11-01

    Full Text Available This paper proposes a new rapid simplified inundation model (NRSIM) for flood inundation caused by rainstorms in an urban setting that can simulate the urban rainstorm inundation extent and depth in a data-scarce area. Drainage basins delineated from a floodplain map according to the distribution of the inundation sources serve as the calculation cells of NRSIM. To reduce data requirements and computational costs of the model, the internal topography of each calculation cell is simplified to a circular cone, and a mass conservation equation based on a volume spreading algorithm is established to simulate the interior water filling process. Moreover, an improved D8 algorithm is outlined for the simulation of water spilling between different cells. The performance of NRSIM is evaluated by comparing the simulated results with those from a traditional rapid flood spreading model (TRFSM) for various resolutions of digital elevation model (DEM) data. The results are as follows: (1) given high-resolution DEM data input, the TRFSM model has better performance in terms of precision than NRSIM; (2) the results from TRFSM are seriously affected by the decrease in DEM data resolution, whereas those from NRSIM are not; and (3) NRSIM always requires less computational time than TRFSM. Apparently, compared with the complex hydrodynamic or traditional rapid flood spreading model, NRSIM has much better applicability and cost-efficiency in real-time urban inundation forecasting for data-sparse areas.
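
    The cone simplification makes the interior filling step closed-form: for a conical depression, the ponded depth follows directly from the stored volume. The sketch below shows this relation under an assumed cell geometry (rim radius R, depth H); it is a reading of the paper's idea, not its code.

      # Cone-shaped cell filling step (assumed geometry): water volume V in an
      # inverted-cone depression of rim radius R and depth H gives depth h via
      # V = pi * R^2 * h^3 / (3 * H^2), so h = (3 V H^2 / (pi R^2))^(1/3).
      import math

      def water_depth(V, R, H):
          """Depth (m) of ponded water for stored volume V (m3)."""
          if V <= 0.0:
              return 0.0
          h = (3.0 * V * H * H / (math.pi * R * R)) ** (1.0 / 3.0)
          return min(h, H)  # water above the rim spills to neighbours (D8 step)

      def spill_volume(V, R, H):
          """Volume exceeding cell capacity, routed to the steepest neighbour."""
          capacity = math.pi * R * R * H / 3.0
          return max(0.0, V - capacity)

      print(water_depth(500.0, 30.0, 2.0))  # ~1.3 m for these assumed values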

  5. Model description and evaluation of model performance: DOSDIM model

    International Nuclear Information System (INIS)

    Lewyckyj, N.; Zeevaert, T.

    1996-01-01

    DOSDIM was developed to assess the impact on man from routine and accidental atmospheric releases. It is a compartmental, deterministic, radiological model. For an accidental release, dynamic transfer factors are used, as opposed to a routine release, for which equilibrium transfer factors are used. Parameter values were chosen to be conservative. Transfers between compartments are described by first-order differential equations. 2 figs
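
    A compartmental model with first-order transfers reduces to a linear ODE system dC/dt = KC. The sketch below solves a toy three-compartment chain for an acute release; the compartments and rate constants are assumed for illustration and are not DOSDIM's parameters.

      # Toy first-order compartment chain (illustrative rates, not DOSDIM's):
      # dC/dt = K @ C, solved for a unit acute release into the air compartment.
      import numpy as np
      from scipy.integrate import solve_ivp

      # Compartments: 0 air, 1 soil, 2 vegetation
      k_as, k_av, k_vs, k_dec = 0.5, 0.2, 0.1, 0.01   # 1/day, assumed
      K = np.array([
          [-(k_as + k_av + k_dec), 0.0,    0.0],
          [k_as,                  -k_dec,  k_vs],
          [k_av,                   0.0,   -(k_vs + k_dec)],
      ])

      sol = solve_ivp(lambda t, c: K @ c, (0.0, 60.0), y0=[1.0, 0.0, 0.0],
                      t_eval=np.linspace(0.0, 60.0, 7))
      print(sol.y.round(4))  # activity fraction in each compartment over time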

  6. Virtual reality technology as a tool for human factors requirements evaluation in design of the nuclear reactors control desks

    International Nuclear Information System (INIS)

    Grecco, Claudio H.S.; Santos, Isaac J.A.L.; Mol, Antonio C.A.; Carvalho, Paulo V.R.; Silva, Antonio C.F.; Ferreira, Francisco J.O.; Dutra, Marco A.M.

    2007-01-01

    Virtual Reality (VR) is an advanced computer interface technology that allows the user to interact with and explore a three-dimensional environment through the computer, as if part of the virtual world. This technology has great applicability in the most diverse areas of human knowledge. This paper presents a study on the use of VR as a tool for human factors requirements evaluation in the design of nuclear reactor control desks. Moreover, this paper presents a case study: a virtual model of the control desk, developed using virtual reality technology, to be used in the human factors requirements evaluation. This case study was developed in the Virtual Reality Laboratory at IEN and comprises the stereo visualization of the Argonauta research nuclear reactor control desk for a static ergonomic evaluation using check-lists, in accordance with the standards and international nuclear human factors guides (IEC 1771, NUREG-0700). (author)

  7. Virtual reality technology as a tool for human factors requirements evaluation in design of the nuclear reactors control desks

    Energy Technology Data Exchange (ETDEWEB)

    Grecco, Claudio H.S.; Santos, Isaac J.A.L.; Mol, Antonio C.A.; Carvalho, Paulo V.R.; Silva, Antonio C.F.; Ferreira, Francisco J.O.; Dutra, Marco A.M. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]. E-mail: grecco@ien.gov.br; luquetti@ien.gov.br; mol@ien.gov.br; paulov@ien.gov.br; tonico@ien.gov.br; fferreira@ien.gov.br; dutra@ien.gov.br

    2007-07-01

    Virtual Reality (VR) is an advanced computer interface technology that allows the user to interact with and explore a three-dimensional environment through the computer, as if part of the virtual world. This technology has great applicability in the most diverse areas of human knowledge. This paper presents a study on the use of VR as a tool for human factors requirements evaluation in the design of nuclear reactor control desks. Moreover, this paper presents a case study: a virtual model of the control desk, developed using virtual reality technology, to be used in the human factors requirements evaluation. This case study was developed in the Virtual Reality Laboratory at IEN and comprises the stereo visualization of the Argonauta research nuclear reactor control desk for a static ergonomic evaluation using check-lists, in accordance with the standards and international nuclear human factors guides (IEC 1771, NUREG-0700). (author)

  8. Performance Evaluation and Modelling of Container Terminals

    Science.gov (United States)

    Venkatasubbaiah, K.; Rao, K. Narayana; Rao, M. Malleswara; Challa, Suresh

    2018-02-01

    The present paper evaluates and analyzes the performance of 28 container terminals of South East Asia through data envelopment analysis (DEA), principal component analysis (PCA) and a hybrid DEA-PCA method. The DEA technique is utilized to identify efficient decision making units (DMUs) and to rank DMUs in a peer appraisal mode. PCA is a multivariate statistical method to evaluate the performance of container terminals. In the hybrid method, DEA is integrated with PCA to arrive at the ranking of container terminals. Based on the composite ranking, performance modelling and optimization of container terminals is carried out through response surface methodology (RSM).
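
    For reference, the input-oriented CCR DEA score used in such studies can be computed with one linear program per DMU. The sketch below uses a standard envelopment formulation; the single input/output and the terminal data are invented, not the paper's variables.

      # Input-oriented CCR DEA efficiency via linear programming (a standard
      # formulation; the terminal data below are illustrative only).
      import numpy as np
      from scipy.optimize import linprog

      def dea_efficiency(X, Y, j0):
          """X: (m inputs, n DMUs); Y: (s outputs, n DMUs). Theta of DMU j0."""
          m, n = X.shape
          s = Y.shape[0]
          c = np.zeros(n + 1)
          c[0] = 1.0                            # minimise theta
          A_ub = np.zeros((m + s, n + 1))
          b_ub = np.zeros(m + s)
          A_ub[:m, 0] = -X[:, j0]               # X @ lam <= theta * x0
          A_ub[:m, 1:] = X
          A_ub[m:, 1:] = -Y                     # Y @ lam >= y0
          b_ub[m:] = -Y[:, j0]
          res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                        bounds=[(0, None)] * (n + 1), method="highs")
          return res.x[0]

      X = np.array([[120.0, 90.0, 150.0]])      # e.g. quay cranes (assumed)
      Y = np.array([[1.8, 1.1, 2.4]])           # e.g. million TEU (assumed)
      print([round(dea_efficiency(X, Y, j), 3) for j in range(3)])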

  9. User Requirements Model for University Timetable Management System

    OpenAIRE

    Ahmad Althunibat; Mohammad I. Muhairat

    2016-01-01

    Automated timetables are used to schedule courses, lectures and rooms in universities by considering some constraints. Inconvenient and ineffective timetables often waste time and money. Therefore, it is important to investigate the requirements and potential needs of users. Thus, eliciting user requirements of University Timetable Management System (TMS) and their implication becomes an important process for the implementation of TMS. On this note, this paper seeks to propose a m...

  10. Evaluation of upper-shelf toughness requirements for reactor pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Gamble, R.M.; Zahoor, A. (NOVETECH Corp., Rockville, MD (USA)); Hiser, A. (Materials Engineering Associates, Inc., Lanham, MD (USA)); Ernst, H.A.; Pollitz, E.T. (Georgia Inst. of Tech., Atlanta, GA (USA))

    1990-04-01

    This work assesses and applies the criteria recommended by the ASME Subgroup on Evaluation Standards for the evaluation of reactor pressure vessel beltline materials having upper shelf Charpy energies less than 50 ft-lbs. The assessment included comparison of the upper shelf energies required by the criteria recommended for Service Level A and B conditions and criteria proposed for evaluation of postulated Service Level C and D events. The criteria recommended for Service Level A and B conditions were used to evaluate Linde 80 weld material. 9 refs., 4 figs.

  11. Evaluation of upper-shelf toughness requirements for reactor pressure vessels

    International Nuclear Information System (INIS)

    Gamble, R.M.; Zahoor, A.; Hiser, A.; Ernst, H.A.; Pollitz, E.T.

    1990-04-01

    This work assesses and applies the criteria recommended by the ASME Subgroup on Evaluation Standards for the evaluation of reactor pressure vessel beltline materials having upper shelf Charpy energies less than 50 ft-lbs. The assessment included comparison of the upper shelf energies required by the criteria recommended for Service Level A and B conditions and criteria proposed for evaluation of postulated Service Level C and D events. The criteria recommended for Service Level A and B conditions were used to evaluate Linde 80 weld material. 9 refs., 4 figs

  12. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level," modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
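
    The abstract does not spell out IVSEM's integration algorithm, but the synergy idea can be sketched with the complement rule for independent subsystems, where the system detects an event if at least one technology does. The per-technology probabilities below are invented, and the fusion rule is a generic choice, not necessarily IVSEM's.

      # Generic fusion of per-technology detection probabilities into a
      # system-level estimate (illustrative values; not IVSEM's algorithm).
      p = {"seismic": 0.80, "infrasound": 0.35,
           "radionuclide": 0.50, "hydroacoustic": 0.10}   # assumed per event

      p_miss = 1.0
      for prob in p.values():
          p_miss *= (1.0 - prob)       # event missed by every technology
      print(f"P(detect by >= 1 technology) = {1.0 - p_miss:.3f}")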

  13. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level," modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0

  14. Evaluating spatial patterns in hydrological modelling

    DEFF Research Database (Denmark)

    Koch, Julian

    the contiguous United States (10^6 km2). To this end, the thesis at hand applies a set of spatial performance metrics on various hydrological variables, namely land-surface-temperature (LST), evapotranspiration (ET) and soil moisture. The inspiration for the applied metrics is found in related fields...... is not fully exploited by current modelling frameworks due to the lack of suitable spatial performance metrics. Furthermore, the traditional model evaluation using discharge is found unsuitable to lay confidence on the predicted catchment inherent spatial variability of hydrological processes in a fully...

  15. An effective quality model for evaluating mobile websites

    International Nuclear Information System (INIS)

    Hassan, W.U.; Nawaz, M.T.; Syed, T.H.; Naseem, A.

    2015-01-01

    The evolution of Web development in recent years has led to the emergence of the new area of mobile computing. The mobile phone has been transformed into a high-speed processing device capable of running processes that were previously supposed to run only on computers. Modern mobile phones can now process data faster than some desktop systems, and with the inclusion of 3G and 4G networks, the mobile became a prime choice for users to send and receive data from any device. As a result, there has been a major increase in the need for and development of mobile websites, but because mobile website usage is unique compared to desktop websites, there is a need to focus on the quality aspect of mobile websites. To increase and preserve the quality of mobile websites, a quality model is required that is designed specifically to evaluate mobile website quality. To design such a model, a survey-based methodology was used to gather information from different users regarding the unique usage of websites on mobile devices. On the basis of this information, a mobile website quality model is presented which aims to evaluate the quality of mobile websites. In the proposed model, some sub-characteristics are designed to evaluate mobile websites in particular. The result is a proposed model that aims to evaluate features of a website which are important in the context of its deployment and usability on the mobile platform. (author)

  16. Metrics and Evaluation Models for Accessible Television

    DEFF Research Database (Denmark)

    Li, Dongxiao; Looms, Peter Olaf

    2014-01-01

    The adoption of the UN Convention on the Rights of Persons with Disabilities (UN CRPD) in 2006 has provided a global framework for work on accessibility, including information and communication technologies and audiovisual content. One of the challenges facing the application of the UN CRPD...... number of platforms on which audiovisual content needs to be distributed, requiring very clear multiplatform architectures to facilitate interworking and assure interoperability. As a consequence, the regular evaluations of progress being made by signatories to the UN CRPD protocol are difficult...

  17. Bioenergy crop models: Descriptions, data requirements and future challenges

    Energy Technology Data Exchange (ETDEWEB)

    Nair, S. Surendran [University of Tennessee, Knoxville (UTK); Kang, Shujiang [ORNL; Zhang, Xuesong [Pacific Northwest National Laboratory (PNNL); Miguez, Fernando [Iowa State University; Izaurralde, Dr. R. Cesar [Pacific Northwest National Laboratory (PNNL); Post, Wilfred M [ORNL; Dietze, Michael [University of Illinois, Urbana-Champaign; Lynd, L. [Dartmouth College; Wullschleger, Stan D [ORNL

    2012-01-01

    Field studies that address the production of lignocellulosic biomass as a source of renewable energy provide critical data for the development of bioenergy crop models. A literature survey revealed that 14 models have been used for simulating bioenergy crops including herbaceous and woody bioenergy crops, and for crassulacean acid metabolism (CAM) crops. These models simulate field-scale production of biomass for switchgrass (ALMANAC, EPIC, and Agro-BGC), miscanthus (MISCANFOR, MISCANMOD, and WIMOVAC), sugarcane (APSIM, AUSCANE, and CANEGRO), and poplar and willow (SECRETS and 3PG). Two models are adaptations of dynamic global vegetation models and simulate biomass yields of miscanthus and sugarcane at regional scales (Agro-IBIS and LPJmL). Although it lacks the complexity of other bioenergy crop models, the environmental productivity index (EPI) is the only model used to estimate biomass production of CAM (Agave and Opuntia) plants. Except for the EPI model, all models include representations of leaf area dynamics, phenology, radiation interception and utilization, biomass production, and partitioning of biomass to roots and shoots. A few models simulate soil water, nutrient, and carbon cycle dynamics, making them especially useful for assessing the environmental consequences (e.g., erosion and nutrient losses) associated with the large-scale deployment of bioenergy crops. The rapid increase in use of models for energy crop simulation is encouraging; however, detailed information on the influence of climate, soils, and crop management practices on biomass production is scarce. Thus considerable work remains regarding the parameterization and validation of process-based models for bioenergy crops; generation and distribution of high-quality field data for model development and validation; and implementation of an integrated framework for efficient, high-resolution simulations of biomass production for use in planning sustainable bioenergy systems.

  18. Transport properties site descriptive model. Guidelines for evaluation and modelling

    International Nuclear Information System (INIS)

    Berglund, Sten; Selroos, Jan-Olof

    2004-04-01

    This report describes a strategy for the development of Transport Properties Site Descriptive Models within the SKB Site Investigation programme. Similar reports have been produced for the other disciplines in the site descriptive modelling (Geology, Hydrogeology, Hydrogeochemistry, Rock mechanics, Thermal properties, and Surface ecosystems). These reports are intended to guide the site descriptive modelling, but also to provide the authorities with an overview of modelling work that will be performed. The site descriptive modelling of transport properties is presented in this report and in the associated 'Strategy for the use of laboratory methods in the site investigations programme for the transport properties of the rock', which describes laboratory measurements and data evaluations. Specifically, the objectives of the present report are to: Present a description that gives an overview of the strategy for developing Site Descriptive Models, and which sets the transport modelling into this general context. Provide a structure for developing Transport Properties Site Descriptive Models that facilitates efficient modelling and comparisons between different sites. Provide guidelines on specific modelling issues where methodological consistency is judged to be of special importance, or where there is no general consensus on the modelling approach. The objectives of the site descriptive modelling process and the resulting Transport Properties Site Descriptive Models are to: Provide transport parameters for Safety Assessment. Describe the geoscientific basis for the transport model, including the qualitative and quantitative data that are of importance for the assessment of uncertainties and confidence in the transport description, and for the understanding of the processes at the sites. Provide transport parameters for use within other discipline-specific programmes. Contribute to the integrated evaluation of the investigated sites. The site descriptive modelling of

  19. Evaluation of CNN as anthropomorphic model observer

    Science.gov (United States)

    Massanes, Francesc; Brankov, Jovan G.

    2017-03-01

    Model observers (MO) are widely used in medical imaging to act as surrogates of human observers in task-based image quality evaluation, frequently towards optimization of reconstruction algorithms. In this paper, we explore the use of convolutional neural networks (CNN) as MO. We compare the CNN MO to alternative MOs currently being proposed and used, such as the relevance vector machine based MO and the channelized Hotelling observer (CHO). As the success of CNNs and other deep learning approaches is rooted in the availability of large data sets, which is rarely the case in the task-performance evaluation of medical imaging systems, we evaluate CNN performance on both large and small training data sets.
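
    A minimal PyTorch sketch of a CNN model observer for a binary signal-present/absent detection task is given below; the architecture, input size, and training recipe are assumptions made here rather than the paper's design.

      # Small CNN model observer for a detection task (assumed architecture).
      import torch
      import torch.nn as nn

      class CNNObserver(nn.Module):
          def __init__(self):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 8, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(8, 16, kernel_size=3), nn.ReLU(), nn.MaxPool2d(2),
              )
              self.head = nn.Linear(16 * 14 * 14, 1)   # for 64x64 inputs

          def forward(self, x):
              z = self.features(x).flatten(1)
              return self.head(z)                      # decision variable (logit)

      # Train with nn.BCEWithLogitsLoss on signal-present/absent labels; the
      # observer's ROC/AUC on held-out images can then be compared with a CHO.
      model = CNNObserver()
      print(model(torch.randn(4, 1, 64, 64)).shape)    # torch.Size([4, 1])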

  20. Small Animal Models for Evaluating Filovirus Countermeasures.

    Science.gov (United States)

    Banadyga, Logan; Wong, Gary; Qiu, Xiangguo

    2018-05-11

    The development of novel therapeutics and vaccines to treat or prevent disease caused by filoviruses, such as Ebola and Marburg viruses, depends on the availability of animal models that faithfully recapitulate clinical hallmarks of disease as it is observed in humans. In particular, small animal models (such as mice and guinea pigs) are historically and frequently used for the primary evaluation of antiviral countermeasures, prior to testing in nonhuman primates, which represent the gold-standard filovirus animal model. In the past several years, however, the filovirus field has witnessed the continued refinement of the mouse and guinea pig models of disease, as well as the introduction of the hamster and ferret models. We now have small animal models for most human-pathogenic filoviruses, many of which are susceptible to wild type virus and demonstrate key features of disease, including robust virus replication, coagulopathy, and immune system dysfunction. Although none of these small animal model systems perfectly recapitulates Ebola virus disease or Marburg virus disease on its own, collectively they offer a nearly complete set of tools in which to carry out the preclinical development of novel antiviral drugs.

  1. Implicit moral evaluations: A multinomial modeling approach.

    Science.gov (United States)

    Cameron, C Daryl; Payne, B Keith; Sinnott-Armstrong, Walter; Scheffer, Julian A; Inzlicht, Michael

    2017-01-01

    Implicit moral evaluations (i.e., immediate, unintentional assessments of the wrongness of actions or persons) play a central role in supporting moral behavior in everyday life. Yet little research has employed methods that rigorously measure individual differences in implicit moral evaluations. In five experiments, we develop a new sequential priming measure, the Moral Categorization Task, and a multinomial model that decomposes judgment on this task into multiple component processes. These include implicit moral evaluations of moral transgression primes (Unintentional Judgment), accurate moral judgments about target actions (Intentional Judgment), and a directional tendency to judge actions as morally wrong (Response Bias). Speeded response deadlines reduced Intentional Judgment but not Unintentional Judgment (Experiment 1). Unintentional Judgment was stronger toward moral transgression primes than non-moral negative primes (Experiments 2-4). Intentional Judgment was associated with increased error-related negativity, a neurophysiological indicator of behavioral control (Experiment 4). Finally, people who voted for an anti-gay marriage amendment had stronger Unintentional Judgment toward gay marriage primes (Experiment 5). Across Experiments 1-4, implicit moral evaluations converged with moral personality: Unintentional Judgment about wrong primes, but not negative primes, was negatively associated with psychopathic tendencies and positively associated with moral identity and guilt proneness. Theoretical and practical applications of formal modeling for moral psychology are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
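
    The decomposition can be sketched as a multinomial processing tree fitted by maximum likelihood. The parameterisation below (I = Intentional Judgment, U = Unintentional Judgment, B = Response Bias) is a generic reading of such trees, not necessarily the paper's exact equations, and the response counts are invented.

      # Generic process-tree sketch in the spirit of the Moral Categorization
      # Task (parameterisation assumed; the paper's equations may differ).
      import numpy as np
      from scipy.optimize import minimize

      def p_wrong(I, U, B, target_wrong):
          prime_driven = U + (1.0 - U) * B      # response when control fails
          return I * (1.0 if target_wrong else 0.0) + (1.0 - I) * prime_driven

      def neg_log_lik(params, data):
          I, U, B = params
          ll = 0.0
          for target_wrong, n_wrong, n_total in data:
              p = np.clip(p_wrong(I, U, B, target_wrong), 1e-6, 1.0 - 1e-6)
              ll += n_wrong * np.log(p) + (n_total - n_wrong) * np.log(1.0 - p)
          return -ll

      # (target_wrong, "wrong" responses, trials) after transgression primes;
      # counts invented. Real designs cross prime type and response deadline
      # to provide enough cells to identify all three parameters.
      data = [(True, 180, 200), (False, 60, 200)]
      fit = minimize(neg_log_lik, x0=[0.5, 0.3, 0.5], args=(data,),
                     bounds=[(0.0, 1.0)] * 3)
      print(fit.x.round(3))   # estimated I, U, B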

  2. Evaluation procedure of software requirements specification for digital I and C of KNGR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Park, Jong Kyun; Lee, Ki Young; Kim, Jang Yeol; Cheon, Se Woo

    2001-06-01

    The accuracy of the specification of requirements of a digital system is of prime importance to the acceptance and success of the system. The development, use, and regulation of computer systems in nuclear reactor Instrumentation and Control (I and C) systems to enhance reliability and safety is a complex issue. This report is one of a series of reports from the Korean Next Generation Reactor (KNGR) Software Safety Verification and Validation (SSVV) Task, Korea Atomic Energy Research Institute, which investigates different aspects of computer software in reactor I and C systems and describes the engineering procedures for developing such software. The purpose of this guideline is to give the software safety evaluator the trail map between the codes-and-standards layer and the design-methodology-and-documents layer for software important to safety in nuclear power plants. Recently, the requirements specification of safety-critical software systems and their safety analysis have been recognized as important issues in the software life cycle, and new regulatory positions and standards are being developed by regulatory and standardization organizations such as the IAEA, IEC, and IEEE. We present the procedure for evaluating the software requirements specifications of the KNGR protection systems. We believe it can be useful for both licenser and licensee to conduct an evaluation of safety in the requirements phase of software development. The guideline consists of the requirements engineering for software of the KNGR protection systems in chapter 1, the evaluation checklist of the software requirements specification in chapter 2.3, and the safety evaluation procedure of the KNGR software requirements specification in chapter 2.4

  3. Commonsense Psychology and the Functional Requirements of Cognitive Models

    National Research Council Canada - National Science Library

    Gordon, Andrew S

    2005-01-01

    .... Rather than working to avoid the influence of commonsense psychology in cognitive modeling research, we propose to capitalize on progress in developing formal theories of commonsense psychology...

  4. Requirements for data integration platforms in biomedical research networks: a reference model.

    Science.gov (United States)

    Ganzinger, Matthias; Knaup, Petra

    2015-01-01

    Biomedical research networks need to integrate research data among their members and with external partners. To support such data sharing activities, an adequate information technology infrastructure is necessary. To facilitate the establishment of such an infrastructure, we developed a reference model for the requirements. The reference model consists of five reference goals and 15 reference requirements. Using the Unified Modeling Language, the goals and requirements are set into relation to each other. In addition, all goals and requirements are described textually in tables. This reference model can be used by research networks as a basis for a resource efficient acquisition of their project specific requirements. Furthermore, a concrete instance of the reference model is described for a research network on liver cancer. The reference model is transferred into a requirements model of the specific network. Based on this concrete requirements model, a service-oriented information technology architecture is derived and also described in this paper.

  5. Cleanliness Policy Implementation: Evaluating Retribution Model to Rise Public Satisfaction

    Science.gov (United States)

    Dailiati, Surya; Hernimawati; Prihati; Chintia Utami, Bunga

    2018-05-01

    This research is based on the principal issues concerning the evaluation of the cleanliness retribution policy, which has not been able to optimally improve the Local Revenue (PAD) of Pekanbaru City and has not improved the cleanliness of Pekanbaru City. This was estimated to be caused by the performance of the Garden and Sanitation Department not being in accordance with the requirements of the society of Pekanbaru City. The research method used in this study is a mixed method with a sequential exploratory strategy. The data collection methods used are observation, interview and documentation for the qualitative research, as well as questionnaires for the quantitative research. The collected data were analyzed with the interactive model of Miles and Huberman for the qualitative research and multiple regression analysis for the quantitative research. The research results indicated that the model of cleanliness policy implementation that can increase the PAD of Pekanbaru City and improve people's satisfaction divides into two (2) parts: the evaluation model and the society satisfaction model. The evaluation model is influenced by the criteria/variables of effectiveness, efficiency, adequacy, equity, responsiveness, and appropriateness, while the society satisfaction model is influenced by the variables of society satisfaction, intentions, goals, plans, programs, and appropriateness of the cleanliness retribution collection policy.

  6. EVALUATION OF RAINFALL-RUNOFF MODELS FOR MEDITERRANEAN SUBCATCHMENTS

    Directory of Open Access Journals (Sweden)

    A. Cilek

    2016-06-01

    Full Text Available The development and application of rainfall-runoff models have been a corner-stone of hydrological research for many decades. The amount of rainfall and its intensity and variability control the generation of runoff and the erosional processes operating at different scales. These interactions can be greatly variable in Mediterranean catchments with marked hydrological fluctuations. The aim of the study was to evaluate the performance of a rainfall-runoff model for rainfall-runoff simulation in a Mediterranean subcatchment. The Pan-European Soil Erosion Risk Assessment (PESERA), a simplified hydrological process-based approach, was used in this study to combine hydrological surface runoff factors. In total, 128 input layers are required to run the model, derived from a data set that includes climate, topography, land use, crop type, planting date, and soil characteristics. Initial ground cover was estimated from the Landsat ETM data provided by ESA. This hydrological model was evaluated in terms of its performance in the Goksu River Watershed, Turkey, located in the Central Eastern Mediterranean Basin of Turkey. The area is approximately 2000 km2. The landscape is dominated by bare ground, agricultural land and forests. The average annual rainfall is 636.4 mm. This study is of significant importance for evaluating different model performances in a complex Mediterranean basin. The results provided comprehensive insight, including the advantages and limitations of modelling approaches in the Mediterranean environment.

  7. Modelling and Simulation for Requirements Engineering and Options Analysis

    Science.gov (United States)

    2010-05-01

    should be performed to work successfully in the domain; and process-based techniques model the processes that occur in the work domain. There is a crisp ... Can the current technique for developing simulation models for assessments

  8. Single High Fidelity Geometric Data Sets for LCM - Model Requirements

    Science.gov (United States)

    2006-11-01

    designed specifically to withstand severe underwater explosion (UNDEX) loading caused by the detonation of weapons such as bombs, missiles, mines and... Explosions (BLEVEs): The energy from a BLEVE is from a sudden change of phase of stored material. Tanks of liquids immersed in pool fires BLEVE when the...

  9. Requirements engineering for trust management: Model, methodology, and reasoning

    NARCIS (Netherlands)

    Giorgini, P.; Massacci, F.; Mylopoulos, J.; Zannone, N.

    2006-01-01

    A number of recent proposals aim to incorporate security engineering into mainstream software engineering. Yet, capturing trust and security requirements at an organizational level, as opposed to an IT system level, and mapping these into security and trust management policies is still an open

  10. An evaluation of Tsyganenko magnetic field model

    International Nuclear Information System (INIS)

    Fairfield, D.H.

    1991-01-01

    A long-standing goal of magnetospheric physics has been to produce a model of the Earth's magnetic field that can accurately predict the field vector at all locations within the magnetosphere for all dipole tilt angles and for various solar wind or magnetic activity conditions. A number of models make such predictions, but some only for limited spatial regions, some only for zero tilt angle, and some only for arbitrary conditions. No models depend explicitly on solar wind conditions. A data set of more than 22,000 vector averages of the magnetospheric magnetic field over 0.5 RE regions is used to evaluate Tsyganenko's 1982 and 1987 magnetospheric magnetic field models. The magnetic field predicted by the models in various regions is compared to observations to find systematic discrepancies which future models might address. While agreement is generally good, discrepancies are noted which include: (1) a lack of adequate field line stretching in the tail and ring current regions; (2) an inability to predict weak enough fields in the polar cusps; and (3) a deficiency of Kp as a predictor of the field configuration

  11. A methodology for spectral wave model evaluation

    Science.gov (United States)

    Siqueira, S. A.; Edwards, K. L.; Rogers, W. E.

    2017-12-01

    Model evaluation is accomplished by comparing bulk parameters (e.g., significant wave height, energy period, and mean square slope (MSS)) calculated from the model energy spectra with those calculated from buoy energy spectra. Quality control of the observed data and choice of the frequency range from which the bulk parameters are calculated are critical steps in ensuring the validity of the model-data comparison. The compared frequency range of each observation and the analogous model output must be identical, and the optimal frequency range depends in part on the reliability of the observed spectra. National Data Buoy Center 3-m discus buoy spectra are unreliable above 0.3 Hz due to a non-optimal buoy response function correction. As such, the upper end of the spectrum should not be included when comparing a model to these data. Biofouling of Waverider buoys must be detected, as it can harm the hydrodynamic response of the buoy at high frequencies, thereby rendering the upper part of the spectrum unsuitable for comparison. An important consideration is that the intentional exclusion of high frequency energy from a validation due to data quality concerns (above) can have major implications for validation exercises, especially for parameters such as the third and fourth moments of the spectrum (related to Stokes drift and MSS, respectively); final conclusions can be strongly altered. We demonstrate this by comparing outcomes with and without the exclusion, in a case where a Waverider buoy is believed to be free of biofouling. Determination of the appropriate frequency range is not limited to the observed spectra. Model evaluation involves considering whether all relevant frequencies are included. Guidance to make this decision is based on analysis of observed spectra. Two model frequency lower limits were considered. Energy in the observed spectrum below the model lower limit was calculated for each. For locations where long swell is a component of the wave
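
    The bulk parameters in question are spectral moments, so the effect of the integration cutoff can be shown in a few lines. The toy spectrum and the two cutoffs below are illustrative, and the deep-water relation is assumed for mean square slope.

      # Bulk parameters from a 1-D frequency spectrum E(f) in m^2/Hz, showing
      # the sensitivity to the upper integration cutoff (toy spectrum assumed).
      import numpy as np

      g = 9.81

      def bulk_params(f, E, f_max):
          m = f <= f_max
          m0 = np.trapz(E[m], f[m])
          m_neg1 = np.trapz(E[m] / f[m], f[m])
          hs = 4.0 * np.sqrt(m0)                   # significant wave height
          te = m_neg1 / m0                         # energy period
          # Deep water: k = (2*pi*f)^2 / g, so mss = (2*pi)^4/g^2 * int f^4 E df
          mss = (2.0 * np.pi) ** 4 / g**2 * np.trapz(f[m] ** 4 * E[m], f[m])
          return hs, te, mss

      f = np.linspace(0.05, 0.6, 200)
      E = 0.5 * np.exp(-0.5 * ((f - 0.1) / 0.02) ** 2) \
          + 0.02 / f**4 * (f > 0.15)               # swell peak + wind-sea tail
      for cutoff in (0.3, 0.6):                    # e.g. 3-m discus vs Waverider
          print(cutoff, np.round(bulk_params(f, E, cutoff), 4))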

  12. Evaluation of image collection requirements for 3D reconstruction using phototourism techniques on sparse overhead data

    Science.gov (United States)

    Ontiveros, Erin; Salvaggio, Carl; Nilosek, David; Raqueño, Nina; Faulring, Jason

    2012-06-01

    Phototourism is a burgeoning field that uses collections of ground-based photographs to construct a three-dimensional model of a tourist site, using computer vision techniques. These techniques capitalize on the extensive overlap generated by the various visitor-acquired images from which a three-dimensional point cloud can be generated. From there, a facetized version of the structure can be created. Remotely sensed data tends to focus on nadir or near nadir imagery while trying to minimize overlap in order to achieve the greatest ground coverage possible during a data collection. A workflow is being developed at Digital Imaging and Remote Sensing (DIRS) Group at the Rochester Institute of Technology (RIT) that utilizes these phototourism techniques, which typically use dense coverage of a small object or region, and applies them to remotely sensed imagery, which involves sparse data coverage of a large area. In addition to this, RIT has planned and executed a high-overlap image collection, using the RIT WASP system, to study the requirements needed for such three-dimensional reconstruction efforts. While the collection was extensive, the intention was to find the minimum number of images and frame overlap needed to generate quality point clouds. This paper will discuss the image data collection effort and what it means to generate and evaluate a quality point cloud for reconstruction purposes.

  13. Intuitionistic fuzzy (IF) evaluations of multidimensional model

    International Nuclear Information System (INIS)

    Valova, I.

    2012-01-01

    There are different logical methods for data structuring, but none is perfect enough. The multidimensional model (MD) of data is a presentation of data in the form of a cube (referred to also as an info-cube or hypercube), or in the form of a 'star'-type scheme (referred to as a multidimensional scheme), by use of F-structures (Facts) and a set of D-structures (Dimensions), based on the notion of a hierarchy of D-structures. The data being subject of analysis in a specific multidimensional model are located in a Cartesian space restricted by the D-structures. In fact, the data are either dispersed or 'concentrated', and therefore the data cells are not distributed evenly within the respective space. The moment of occurrence of any event is difficult to predict, and the data are concentrated by time periods, location of the performed business event, etc. To process such dispersed or concentrated data, various technical strategies are needed, and the basic methods for presentation of such data should be selected. The approaches to data processing and the respective calculations are connected with different options for data representation. The use of intuitionistic fuzzy evaluations (IFE) provides new possibilities for alternative presentation and processing of data subject to analysis in any OLAP application. The use of IFE in the evaluation of multidimensional models will result in the following advantages: analysts will have more complete information for processing and analysis of the respective data; the benefit for managers is that final decisions will be more effective; and the design of more functional multidimensional schemes is enabled. The purpose of this work is to apply intuitionistic fuzzy evaluations to a multidimensional model of data. (authors)
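
    To make the idea concrete, the sketch below aggregates intuitionistic fuzzy evaluations <mu, nu> over several data cells with the standard IFWA operator; the cell values and weights are invented, and the choice of operator is an assumption here, not necessarily the author's.

      # Aggregating intuitionistic fuzzy pairs <mu, nu> (mu + nu <= 1) over
      # cube cells with the intuitionistic fuzzy weighted average (IFWA).
      import numpy as np

      def ifwa(pairs, weights):
          """pairs: list of (mu, nu); weights sum to 1."""
          mu = np.array([p[0] for p in pairs])
          nu = np.array([p[1] for p in pairs])
          w = np.asarray(weights)
          mu_agg = 1.0 - np.prod((1.0 - mu) ** w)
          nu_agg = np.prod(nu ** w)
          return mu_agg, nu_agg, 1.0 - mu_agg - nu_agg  # incl. hesitancy degree

      cells = [(0.7, 0.2), (0.5, 0.3), (0.8, 0.1)]      # IF evaluations, assumed
      print(np.round(ifwa(cells, [0.5, 0.3, 0.2]), 3))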

  14. Conducting requirements analyses for research using routinely collected health data: a model driven approach.

    Science.gov (United States)

    de Lusignan, Simon; Cashman, Josephine; Poh, Norman; Michalakidis, Georgios; Mason, Aaron; Desombre, Terry; Krause, Paul

    2012-01-01

    Medical research increasingly requires the linkage of data from different sources. Conducting a requirements analysis for a new application is an established part of software engineering, but is rarely reported in the biomedical literature; and no generic approaches have been published as to how to link heterogeneous health data. Methods: literature review, followed by a consensus process to define how requirements for research using multiple data sources might be modeled. We have developed a requirements analysis method, i-ScheDULEs. The first components of the modeling process are indexing and creating a rich picture of the research study. Secondly, we developed a series of reference models of progressive complexity: data flow diagrams (DFD) to define data requirements; unified modeling language (UML) use case diagrams to capture study-specific and governance requirements; and finally, business process models, using business process modeling notation (BPMN). These requirements and their associated models should become part of research study protocols.

  15. Design Concept Evaluation Using System Throughput Model

    International Nuclear Information System (INIS)

    Sequeira, G.; Nutt, W. M.

    2004-01-01

    The U.S. Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) is currently developing the technical bases to support the submittal of a license application for construction of a geologic repository at Yucca Mountain, Nevada to the U.S. Nuclear Regulatory Commission. The Office of Repository Development (ORD) is responsible for developing the design of the proposed repository surface facilities for the handling of spent nuclear fuel and high level nuclear waste. Preliminary design activities are underway to sufficiently develop the repository surface facilities design for inclusion in the license application. The design continues to evolve to meet mission needs and to satisfy both regulatory and program requirements. A system engineering approach is being used in the design process since the proposed repository facilities are dynamically linked by a series of sub-systems and complex operations. In addition, the proposed repository facility is a major system element of the overall waste management process being developed by the OCRWM. Such an approach includes iterative probabilistic dynamic simulation as an integral part of the design evolution process. A dynamic simulation tool helps to determine if: (1) the mission and design requirements are complete, robust, and well integrated; (2) the design solutions under development meet the design requirements and mission goals; (3) opportunities exist where the system can be improved and/or optimized; and (4) proposed changes to the mission, and design requirements have a positive or negative impact on overall system performance and if design changes may be necessary to satisfy these changes. This paper will discuss the type of simulation employed to model the waste handling operations. It will then discuss the process being used to develop the Yucca Mountain surface facilities model. The latest simulation model and the results of the simulation and how the data were used in the design
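
    The kind of probabilistic dynamic simulation described here can be sketched as a discrete-event queueing model. The SimPy fragment below models casks moving through a crane and a transfer cell; the stage times, resource counts, and arrival rates are placeholders, not ORD design values.

      # Discrete-event sketch of a waste-handling line (SimPy; all numbers
      # are placeholders for illustration, not repository design values).
      import random
      import simpy

      def cask(env, crane, cell, turnaround):
          arrive = env.now
          with crane.request() as req:          # wait for the cask crane
              yield req
              yield env.timeout(random.triangular(2, 6, 3))    # unload, hours
          with cell.request() as req:           # wait for a transfer cell
              yield req
              yield env.timeout(random.triangular(8, 20, 12))  # repackage
          turnaround.append(env.now - arrive)

      def arrivals(env, crane, cell, turnaround):
          for _ in range(200):
              env.process(cask(env, crane, cell, turnaround))
              yield env.timeout(random.expovariate(1.0 / 10.0))  # hours apart

      env = simpy.Environment()
      crane = simpy.Resource(env, capacity=1)
      cell = simpy.Resource(env, capacity=2)
      turnaround = []
      env.process(arrivals(env, crane, cell, turnaround))
      env.run()
      print(f"mean cask turnaround: {sum(turnaround)/len(turnaround):.1f} h")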

  16. Risk assessment and remedial policy evaluation using predictive modeling

    International Nuclear Information System (INIS)

    Linkov, L.; Schell, W.R.

    1996-01-01

    As a result of nuclear industry operation and accidents, large areas of natural ecosystems have been contaminated by radionuclides and toxic metals. Extensive societal pressure has been exerted to decrease the radiation dose to the population and to the environment. Thus, in making abatement and remediation policy decisions, not only economic costs but also human and environmental risk assessments are desired. This paper introduces a general framework for risk assessment and remedial policy evaluation using predictive modeling. Ecological risk assessment requires evaluation of the radionuclide distribution in ecosystems. The FORESTPATH model is used for predicting the radionuclide fate in forest compartments after deposition as well as for evaluating the efficiency of remedial policies. Time of intervention and radionuclide deposition profile were predicted to be crucial for remediation efficiency. A risk assessment conducted for a critical group of forest users in Belarus shows that consumption of forest products (berries and mushrooms) leads to about a 0.004% risk of a fatal cancer annually. Cost-benefit analysis for forest cleanup suggests that complete removal of the organic layer is too expensive for application in Belarus and a better methodology is required. In conclusion, the FORESTPATH modeling framework could have wide applications in the environmental remediation of radionuclides and toxic metals, as well as in dose reconstruction and risk assessment

  17. Four Reference Models for Transparency Requirements in Information Systems

    OpenAIRE

    Hosseini, Mahmoud; Shahri, Alimohammad; Phalp, Keith T.; Ali, Ra

    2017-01-01

    Transparency is a key emerging requirement in modern businesses and their information systems. Transparency refers to the information which flows amongst stakeholders for the purpose of informed decision-making and taking the right action. Transparency is generally associated with positive connotations such as trust and accountability. However, it has been shown that it could have adverse effects such as information overload and affecting decisions objectiveness. This calls for systematic app...

  18. Use of an operational model evaluation system for model intercomparison

    Energy Technology Data Exchange (ETDEWEB)

    Foster, K. T., LLNL

    1998-03-01

    The Atmospheric Release Advisory Capability (ARAC) is a centralized emergency response system used to assess the impact from atmospheric releases of hazardous materials. As part of an on-going development program, new three-dimensional diagnostic windfield and Lagrangian particle dispersion models will soon replace ARAC's current operational windfield and dispersion codes. A prototype model performance evaluation system has been implemented to facilitate the study of the capabilities and performance of early development versions of these new models relative to ARAC's current operational codes. This system provides tools for both objective statistical analysis using common performance measures and for more subjective visualization of the temporal and spatial relationships of model results relative to field measurements. Supporting this system is a database of processed field experiment data (source terms and meteorological and tracer measurements) from over 100 individual tracer releases.
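
    The abstract mentions "common performance measures" without naming ARAC's specific choices; two measures widely used for dispersion-model evaluation are fractional bias and the factor-of-two fraction (FAC2), sketched below on invented tracer data.

      # Two common dispersion-model performance measures (illustrative data;
      # not necessarily the metrics implemented in the ARAC system).
      import numpy as np

      def fractional_bias(obs, pred):
          obs, pred = np.asarray(obs, float), np.asarray(pred, float)
          return 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

      def fac2(obs, pred):
          ratio = np.asarray(pred, float) / np.asarray(obs, float)
          return np.mean((ratio >= 0.5) & (ratio <= 2.0))

      obs = [1.2, 3.4, 0.8, 5.1]     # tracer concentrations, arbitrary units
      pred = [0.9, 4.0, 2.0, 4.4]
      print(fractional_bias(obs, pred), fac2(obs, pred))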

  19. Evaluation of models of waste glass durability

    International Nuclear Information System (INIS)

    Ellison, A.

    1995-01-01

    The main variable under the control of the waste glass producer is the composition of the glass; thus a need exists to establish functional relationships between the composition of a waste glass and measures of processability, product consistency, and durability. Many years of research show that the structure and properties of a glass depend on its composition, so it seems reasonable to assume that there is also a relationship between the composition of a waste glass and its resistance to attack by an aqueous solution. Several models have been developed to describe this dependence, and an evaluation of their predictive capabilities is the subject of this paper. The objective is to determine whether any of these models describe the 'correct' functional relationship between composition and corrosion rate. A more thorough treatment of the relationships between glass composition and durability has been presented elsewhere, and the reader is encouraged to consult it for a more detailed discussion. The models examined in this study are the free energy of hydration model, developed at the Savannah River Laboratory, the structural bond strength model, developed at the Vitreous State Laboratory at the Catholic University of America, and the Composition Variation Study, developed at Pacific Northwest Laboratory

  20. Evaluation of CASP8 model quality predictions

    KAUST Repository

    Cozzetto, Domenico

    2009-01-01

    The model quality assessment problem consists in the a priori estimation of the overall and per-residue accuracy of protein structure predictions. Over the past years, a number of methods have been developed to address this issue and CASP established a prediction category to evaluate their performance in 2006. In 2008 the experiment was repeated and its results are reported here. Participants were invited to infer the correctness of the protein models submitted by the registered automatic servers. Estimates could apply to both whole models and individual amino acids. Groups involved in the tertiary structure prediction categories were also asked to assign local error estimates to each predicted residue in their own models and their results are also discussed here. The correlation between the predicted and observed correctness measures was the basis of the assessment of the results. We observe that consensus-based methods still perform significantly better than those accepting single models, similarly to what was concluded in the previous edition of the experiment. © 2009 WILEY-LISS, INC.

  1. Data assimilation and model evaluation experiment datasets

    Science.gov (United States)

    Lai, Chung-Cheng A.; Qian, Wen; Glenn, Scott M.

    1994-01-01

    The Institute for Naval Oceanography, in cooperation with Naval Research Laboratories and universities, executed the Data Assimilation and Model Evaluation Experiment (DAMEE) for the Gulf Stream region during fiscal years 1991-1993. Enormous effort has gone into the preparation of several high-quality and consistent datasets for model initialization and verification. This paper describes the preparation process, the temporal and spatial scopes, the contents, the structure, etc., of these datasets. The goal of DAMEE and the need of data for the four phases of experiment are briefly stated. The preparation of DAMEE datasets consisted of a series of processes: (1) collection of observational data; (2) analysis and interpretation; (3) interpolation using the Optimum Thermal Interpolation System package; (4) quality control and re-analysis; and (5) data archiving and software documentation. The data products from these processes included a time series of 3D fields of temperature and salinity, 2D fields of surface dynamic height and mixed-layer depth, analysis of the Gulf Stream and rings system, and bathythermograph profiles. To date, these are the most detailed and high-quality data for mesoscale ocean modeling, data assimilation, and forecasting research. Feedback from ocean modeling groups who tested this data was incorporated into its refinement. Suggestions for DAMEE data usages include (1) ocean modeling and data assimilation studies, (2) diagnosis and theoretical studies, and (3) comparisons with locally detailed observations.

  2. HIGH RESOLUTION RESISTIVITY LEAK DETECTION DATA PROCESSING and EVALUATION METHODS and REQUIREMENTS

    International Nuclear Information System (INIS)

    SCHOFIELD JS

    2007-01-01

    This document has two purposes: (1) describe how data generated by High Resolution Resistivity (HRR) leak detection (LD) systems deployed during single-shell tank (SST) waste retrieval operations are processed and evaluated; and (2) provide the basic review requirements for HRR data when HRR is deployed as a leak detection method during SST waste retrievals

  3. Evaluation of the ARCHITECT urine NGAL assay: Assay performance, specimen handling requirements and biological variability

    NARCIS (Netherlands)

    Grenier, F.C.; Ali, S.; Syed, H.; Workman, R.; Martens, F.; Liao, M.; Wang, Y.; Wong, P.Y.

    2010-01-01

    Objectives: NGAL (Neutrophil Gelatinase-Associated Lipocalin) has emerged as a new biomarker for the identification of acute kidney injury. Reliable clinical evaluations require a simple, robust test method for NGAL, and knowledge of specimen handling and specimen stability characteristics. We

  4. EVALUATION OF RISKS AND WASTE CHARACTERIZATION REQUIREMENTS FOR THE TRANSURANIC WASTE EMPLACED IN WIPP DURING 1999

    International Nuclear Information System (INIS)

    Channell, J.K.; Walker, B.A.

    2000-01-01

    Specifically, this report: 1. Compares requirements of the WAP that are pertinent from a technical viewpoint with the WIPP pre-Permit waste characterization program; 2. Presents the results of a risk analysis of the currently emplaced wastes, in which expected and bounding risks from routine operations and possible accidents are evaluated; and 3. Provides conclusions and recommendations

  5. 34 CFR 300.305 - Additional requirements for evaluations and reevaluations.

    Science.gov (United States)

    2010-07-01

    ... evaluation (if appropriate) and as part of any reevaluation under this part, the IEP Team and other qualified... the measurable annual goals set out in the IEP of the child and to participate, as appropriate, in the...) of this section. (d) Requirements if additional data are not needed. (1) If the IEP Team and other...

  6. Evaluating Translational Research: A Process Marker Model

    Science.gov (United States)

    Trochim, William; Kane, Cathleen; Graham, Mark J.; Pincus, Harold A.

    2011-01-01

    Objective: We examine the concept of translational research from the perspective of evaluators charged with assessing translational efforts. One of the major tasks for evaluators involved in translational research is to help assess efforts that aim to reduce the time it takes to move research to practice and health impacts. Another is to assess efforts that are intended to increase the rate and volume of translation. Methods: We offer an alternative to the dominant contemporary tendency to define translational research in terms of a series of discrete “phases.” Results: We contend that this phased approach has been confusing and that it is insufficient as a basis for evaluation. Instead, we argue for the identification of key operational and measurable markers along a generalized process pathway from research to practice. Conclusions: This model provides a foundation for the evaluation of interventions designed to improve translational research and the integration of these findings into a field of translational studies. Clin Trans Sci 2011; Volume 4: 153–162 PMID:21707944

  7. Mathematical Formulation Requirements and Specifications for the Process Models

    International Nuclear Information System (INIS)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-01-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments naturally

  8. Licensing evaluation of CANDU-PHW nuclear power plants relative to U.S. regulatory requirements

    International Nuclear Information System (INIS)

    Erp, J.B. van

    1978-01-01

    Differences between the U.S. and Canadian approach to safety and licensing are discussed. U.S. regulatory requirements are evaluated as regards their applicability to CANDU-PHW reactors; vice versa, the CANDU-PHW reactor is evaluated with respect to current Regulatory Requirements and Guides. A number of design modifications are proposed to be incorporated into the CANDU-PHW reactor in order to facilitate its introduction into the U.S. These modifications are proposed solely for the purpose of maintaining consistency within the current U.S. regulatory system and not out of a need to improve the safety of current-design CANDU-PHW nuclear power plants. A number of issues are identified which still require resolution. Most of these issues are concerned with design areas not (yet) covered by the ASME code. (author)

  9. Modelling of radon control and air cleaning requirements in underground uranium mines

    International Nuclear Information System (INIS)

    El Fawal, M.; Gadalla, A.

    2014-01-01

    As part of a comprehensive study concerned with controlling workplace short-lived radon daughter concentrations in underground uranium mines to safe levels, a computer program has been developed and verified to calculate ventilation parameters, e.g. local pressures, flow rates and radon daughter concentration levels. The computer program is composed of two parts, one for mine ventilation calculations and the other for radon daughter level calculations. The program has been validated in an actual case study to calculate the radon concentration levels, pressures and flow rates required to maintain acceptable levels of radon concentration at each point of the mine. The required fan static pressure and the approximate energy consumption were also estimated. The results of the calculations have been evaluated and compared with a similar investigation. It was found that the calculated values are in good agreement with the corresponding values obtained using the 'REDES' standard ventilation modelling software. The developed computer model can be used as a readily available tool to help in the evaluation of ventilation systems proposed by the mining authority, and to assist the uranium mining industry in maintaining the health and safety of workers underground while efficiently achieving economic production targets. It could also be used for regulatory inspection and radiation protection assessments of workers in underground mining. With this model, one can effectively design, assess and manage underground mine ventilation systems. Values of radon decay product concentrations in working level units, the pressure drops and flow rates required to reach acceptable radon concentrations relative to the recommended levels at different extraction points in the mine, and the fan static pressure can all be estimated; these outputs are not available in other software. (author)
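
    The core of such a ventilation calculation is a simple activity balance. The sketch below shows, in Python, the kind of steady-state estimate such a program automates for a single well-mixed airway; the source term, volume, equilibrium factor and target level are illustrative assumptions, not values from the paper.

    ```python
    # Minimal steady-state radon balance for one well-mixed airway (illustrative).
    RN222_DECAY = 2.1e-6           # Rn-222 decay constant, 1/s
    WL_PER_BQ_M3_EEC = 1 / 3700.0  # ~1 WL per 3700 Bq/m^3 equilibrium-equivalent Rn-222

    def steady_state_radon(source_bq_per_s, volume_m3, airflow_m3_per_s):
        """Radon concentration (Bq/m^3) when emanation balances decay + ventilation."""
        removal_rate = RN222_DECAY + airflow_m3_per_s / volume_m3   # 1/s
        return source_bq_per_s / (volume_m3 * removal_rate)

    def working_level(radon_bq_m3, equilibrium_factor=0.3):
        """Approximate radon daughter concentration in working levels (WL)."""
        return radon_bq_m3 * equilibrium_factor * WL_PER_BQ_M3_EEC

    def required_airflow(source_bq_per_s, volume_m3, target_bq_m3):
        """Airflow (m^3/s) that holds the steady-state concentration at the target."""
        return volume_m3 * (source_bq_per_s / (volume_m3 * target_bq_m3) - RN222_DECAY)

    # Example: 50 Bq/s emanating into a 2000 m^3 drift ventilated at 10 m^3/s.
    c = steady_state_radon(50.0, 2000.0, 10.0)
    print(f"radon: {c:.1f} Bq/m^3, daughters: {working_level(c):.4f} WL")
    print(f"airflow for 100 Bq/m^3: {required_airflow(50.0, 2000.0, 100.0):.2f} m^3/s")
    ```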

  10. Price/Earnings Ratio Model through Dividend Yield and Required Yield Above Expected Inflation

    Directory of Open Access Journals (Sweden)

    Emil Mihalina

    2010-07-01

    Full Text Available Price/earnings ratio is the most popular and most widespread evaluation model used to assess relative capital asset value on financial markets. In functional terms, company earnings in the very long term can be described with high significance. Empirically, it is visible from long-term statistics that the demanded (required) yield on capital markets has certain regularity. Thus, investors first require a yield above the stable inflation rate, and then a dividend yield and a capital increase caused by the growth of earnings that influence the price, with the assumption that the P/E ratio is stable. By combining the Gordon model for current dividend value, the model of market capitalization of earnings (price/earnings ratio), and bearing in mind the influence of the general price levels on company earnings, it is possible to adjust the price/earnings ratio by deriving a function of the required yield on capital markets measured by a market index through dividend yield and the inflation rate above the stable inflation rate increased by profit growth. The S&P 500 index, for example, has in the last 100 years grown by exactly the inflation rate above the stable inflation rate increased by profit growth. The comparison of two series of price/earnings ratios, a modelled one and an average 7-year ratio, shows a notable correlation in the movement of the two series of variables, with a three-year deviation. Therefore, it could be hypothesized that three years of the expected inflation level, dividend yield and profit growth rate of the market index are discounted in the current market prices. The conclusion is that, at the present time, the relationship between the adjusted average price/earnings ratio and its effect on the market index on one hand, and the modelled price/earnings ratio on the other, can clearly show the expected dynamics and course in the following period.
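
    As a rough illustration of the mechanics described above, the sketch below combines a Gordon-type capitalization with a required yield built up from stable inflation, expected inflation above it, and a premium; the decomposition and the numbers are illustrative assumptions, not the author's calibration.

    ```python
    # Illustrative adjusted P/E from a Gordon-type model (assumed decomposition).
    def required_yield(stable_inflation, excess_inflation, premium):
        """Required market yield: stable inflation + inflation above it + premium."""
        return stable_inflation + excess_inflation + premium

    def modelled_pe(payout_ratio, req_yield, growth):
        """Forward P/E from the Gordon model: P/E = payout / (r - g), with r > g."""
        if req_yield <= growth:
            raise ValueError("required yield must exceed growth")
        return payout_ratio / (req_yield - growth)

    # Example: 50% payout, 2.5% stable inflation, 1% excess, 4% premium, 3% growth.
    r = required_yield(0.025, 0.01, 0.04)
    print(f"modelled P/E = {modelled_pe(0.5, r, 0.03):.1f}")   # 0.5 / 0.045 ~ 11.1
    ```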

  11. Three Tier Unified Process Model for Requirement Negotiations and Stakeholder Collaborations

    Science.gov (United States)

    Niazi, Muhammad Ashraf Khan; Abbas, Muhammad; Shahzad, Muhammad

    2012-11-01

    This research paper is focused towards carrying out a pragmatic qualitative analysis of various models and approaches of requirements negotiations (a sub-process of the requirements management plan, which is an output of scope management's collect requirements process) and studies stakeholder collaboration methodologies (i.e., from within the communication management knowledge area). The experiential analysis encompasses two tiers; the first tier refers to the weighted scoring model, while the second tier focuses on the development of SWOT matrices on the basis of the findings of the weighted scoring model for selecting an appropriate requirements negotiation model. Finally, the results are simulated and presented with the help of statistical pie charts. On the basis of the simulated results of prevalent models and approaches of negotiations, a unified approach for requirements negotiations and stakeholder collaborations is proposed, where the collaboration methodologies are embedded into the selected requirements negotiation model as internal parameters of the proposed process, alongside some external required parameters like MBTI, opportunity analysis, etc.
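
    The tier-one weighted scoring model reduces to a weighted sum per candidate, which the tier-two SWOT matrices then elaborate. A minimal sketch follows; the criteria, weights, scores and candidate names are hypothetical placeholders, not the paper's data:

    ```python
    # Tier-one weighted scoring for candidate negotiation models (hypothetical data).
    weights = {"stakeholder_coverage": 0.40, "tool_support": 0.25,
               "scalability": 0.20, "learning_curve": 0.15}

    # Scores on a 1-5 scale for each candidate approach (illustrative).
    candidates = {
        "model_A": {"stakeholder_coverage": 5, "tool_support": 4,
                    "scalability": 3, "learning_curve": 3},
        "model_B": {"stakeholder_coverage": 4, "tool_support": 5,
                    "scalability": 4, "learning_curve": 4},
        "model_C": {"stakeholder_coverage": 3, "tool_support": 3,
                    "scalability": 4, "learning_curve": 5},
    }

    def weighted_score(scores):
        return sum(weights[criterion] * s for criterion, s in scores.items())

    ranked = sorted(candidates.items(), key=lambda kv: weighted_score(kv[1]),
                    reverse=True)
    for name, scores in ranked:
        print(f"{name}: {weighted_score(scores):.2f}")
    # The top-ranked candidate then goes into the tier-two SWOT analysis.
    ```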

  12. Obs4MIPS: Satellite Observations for Model Evaluation

    Science.gov (United States)

    Ferraro, R.; Waliser, D. E.; Gleckler, P. J.

    2017-12-01

    This poster will review the current status of the obs4MIPs project, whose purpose is to provide a limited collection of well-established and documented datasets for comparison with Earth system models (https://www.earthsystemcog.org/projects/obs4mips/). These datasets have been reformatted to correspond with the CMIP5 model output requirements, and include technical documentation specifically targeted for their use in model output evaluation. The project holdings now exceed 120 datasets with observations that directly correspond to CMIP5 model output variables, with new additions in response to the CMIP6 experiments. With the growth in climate model output data volume, it is increasingly difficult to bring the model output and the observations together to do evaluations. The positioning of the obs4MIPs datasets within the Earth System Grid Federation (ESGF) allows for the use of currently available and planned online tools within the ESGF to perform analysis using model output and observational datasets without necessarily downloading everything to a local workstation. This past year, obs4MIPs has updated its submission guidelines to closely align with changes in the CMIP6 experiments, and is implementing additional indicators and ancillary data to allow users to more easily determine the efficacy of an obs4MIPs dataset for specific evaluation purposes. This poster will present the new guidelines and indicators, and update the list of current obs4MIPs holdings and their connection to the ESGF evaluation and analysis tools currently available and being developed for the CMIP6 experiments.

  13. REPFLO model evaluation, physical and numerical consistency

    International Nuclear Information System (INIS)

    Wilson, R.N.; Holland, D.H.

    1978-11-01

    This report contains a description of some suggested changes and an evaluation of the REPFLO computer code, which models ground-water flow and nuclear-waste migration in and about a nuclear-waste repository. The discussion contained in the main body of the report is supplemented by a flow chart, presented in the Appendix of this report. The suggested changes are of four kinds: (1) technical changes to make the code compatible with a wider variety of digital computer systems; (2) changes to fill gaps in the computer code, due to missing proprietary subroutines; (3) changes to (a) correct programming errors, (b) correct logical flaws, and (c) remove unnecessary complexity; and (4) changes in the computer code logical structure to make REPFLO a more viable model from the physical point of view

  14. FEM-model of the Naesliden Mine: requirements and limitations at the outset of the project. [Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Krauland, N.

    1980-05-15

    The design of any instrument depends entirely on the application planned for the instrument. This applies also to the FEM-representation of the Naesliden Mine. With reference to the aims of the project, the requirements on the model are outlined with regard to: simulation of the mining process; modelling with special reference to the aims of the project; and comparison of FEM-results with in situ observations to determine the validity of the model. The proposed model is two-dimensional and incorporates joint elements to simulate the weak alteration zone between orebody and sidewall rock. The remainder of the model exhibits linear elastic behaviour. This model is evaluated with respect to the given requirements. The limitations of the chosen model are outlined.

  15. Mathematical Formulation Requirements and Specifications for the Process Models

    Energy Technology Data Exchange (ETDEWEB)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-11-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments

  16. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.
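
    One way to picture the synergy calculation: if the sensor technologies were independent, the integrated detection probability would follow from the complement rule. A minimal sketch under that independence assumption (IVSEM's internal models are more detailed), with invented subsystem probabilities:

    ```python
    # Integrated detection probability across CTBT sensor technologies,
    # assuming independent subsystems (a simplification; IVSEM is more detailed).
    def integrated_detection(p_by_technology):
        """P(at least one subsystem detects) = 1 - prod(1 - p_i)."""
        p_miss = 1.0
        for p in p_by_technology.values():
            p_miss *= (1.0 - p)
        return 1.0 - p_miss

    # Illustrative subsystem probabilities for one hypothetical event.
    probs = {"seismic": 0.70, "infrasound": 0.30,
             "radionuclide": 0.50, "hydroacoustic": 0.10}
    print(f"integrated P(detect) = {integrated_detection(probs):.4f}")
    # 1 - 0.3 * 0.7 * 0.5 * 0.9 = 0.9055
    ```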

  17. Functional requirements of a mathematical model of the heart.

    Science.gov (United States)

    Palladino, Joseph L; Noordergraaf, Abraham

    2009-01-01

    Functional descriptions of the heart, especially the left ventricle, are often based on the measured variables pressure and ventricular outflow, embodied as a time-varying elastance. The fundamental difficulty of describing the mechanical properties of the heart with a time-varying elastance function that is set a priori is described. As an alternative, a new functional model of the heart is presented, which characterizes the ventricle's contractile state with parameters, rather than variables. Each chamber is treated as a pressure generator that is time and volume dependent. The heart's complex dynamics develop from a single equation based on the formation and relaxation of crossbridge bonds. This equation permits the calculation of ventricular elastance via E_v = ∂p_v/∂V_v. This heart model is defined independently from load properties, and ventricular elastance is dynamic and reflects changing numbers of crossbridge bonds. In this paper, the functionality of this new heart model is presented via computed work loops that demonstrate the Frank-Starling mechanism and the effects of preload, the effects of afterload, inotropic changes, and varied heart rate, as well as the interdependence of these effects. Results suggest the origin of the equivalent of Hill's force-velocity relation in the ventricle.
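
    The elastance definition above can be evaluated numerically for any pressure generator p_v(V, t). The sketch below uses a placeholder pressure function, not the authors' crossbridge formulation, purely to illustrate the partial-derivative computation:

    ```python
    # Numerical ventricular elastance E_v = dp_v/dV_v via central differences.
    # The pressure function here is a placeholder, not the paper's crossbridge model.
    def p_v(volume_ml, t_s):
        """Hypothetical ventricular pressure (mmHg), volume and time dependent."""
        activation = max(0.0, 1.0 - abs(t_s - 0.3) / 0.3)   # crude systolic bump
        return 2.0 * (volume_ml - 10.0) + 120.0 * activation * (volume_ml / 120.0)

    def elastance(volume_ml, t_s, dv=0.01):
        """Central-difference estimate of E_v(V, t) in mmHg/ml."""
        return (p_v(volume_ml + dv, t_s) - p_v(volume_ml - dv, t_s)) / (2 * dv)

    for t in (0.0, 0.3, 0.6):   # diastole, peak systole, relaxation
        print(f"t = {t:.1f} s: E_v = {elastance(100.0, t):.2f} mmHg/ml")
    ```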

  18. Modelling of capital requirements in the energy sector: capital market access. Final memorandum

    Energy Technology Data Exchange (ETDEWEB)

    1978-04-01

    Formal modelling techniques for analyzing the capital requirements of energy industries have been employed at DOE. A survey has been undertaken of a number of models which forecast energy-sector capital requirements or which detail the interactions of the energy sector and the economy. Models are identified which can be useful as prototypes for some portion of DOE's modelling needs. The models are examined to determine any useful data bases which could serve as inputs to an original DOE model. A selected group of models are examined which can comply with the stated capabilities. The data sources being used by these models are covered and a catalog of the relevant data bases is provided. The models covered are: capital markets and capital availability models (Fossil 1, Bankers Trust Co., DRI Macro Model); models of physical capital requirements (Bechtel Supply Planning Model, ICF Oil and Gas Model and Coal Model, Stanford Research Institute National Energy Model); macroeconomic forecasting models with input-output analysis capabilities (Wharton Annual Long-Term Forecasting Model, Brookhaven/University of Illinois Model, Hudson-Jorgenson/Brookhaven Model); utility models (MIT Regional Electricity Model-Baughman Joskow, Teknekron Electric Utility Simulation Model); and others (DRI Energy Model, DRI/Zimmerman Coal Model, and Oak Ridge Residential Energy Use Model).

  19. "Open Access" Requires Clarification: Medical Journal Publication Models Evolve.

    Science.gov (United States)

    Lubowitz, James H; Brand, Jefferson C; Rossi, Michael J; Provencher, Matthew T

    2017-03-01

    While Arthroscopy journal is a traditional subscription model journal, our companion journal Arthroscopy Techniques is "open access." We used to believe open access simply meant online and free of charge. However, while open-access journals are free to readers, in 2017 authors must make a greater sacrifice in the form of an article-processing charge (APC). Again, while this does not apply to Arthroscopy, the APC will apply to Arthroscopy Techniques. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  20. Minimization of required model runs in the Random Mixing approach to inverse groundwater flow and transport modeling

    Science.gov (United States)

    Hoerning, Sebastian; Bardossy, Andras; du Plessis, Jaco

    2017-04-01

    Most geostatistical inverse groundwater flow and transport modelling approaches utilize a numerical solver to minimize the discrepancy between observed and simulated hydraulic heads and/or concentration values. The optimization procedure often requires many model runs, which for complex models lead to long run times. Random Mixing is a promising new geostatistical technique for inverse modelling. The method is an extension of the gradual deformation approach. It works by finding a field which preserves the covariance structure and maintains observed hydraulic conductivities. This field is perturbed by mixing it with new fields that fulfill the homogeneous conditions. This mixing is expressed as an optimization problem which aims to minimize the difference between the observed and simulated hydraulic heads and/or concentration values. To preserve the spatial structure, the mixing weights must lie on the unit hyper-sphere. We present a modification to the Random Mixing algorithm which significantly reduces the number of model runs required. The approach involves taking n equally spaced points on the unit circle as weights for mixing conditional random fields. Each of these mixtures provides a solution to the forward model at the conditioning locations. For each of the locations the solutions are then interpolated around the circle to provide solutions for additional mixing weights at very low computational cost. The interpolated solutions are used to search for a mixture which maximally reduces the objective function. This is in contrast to other approaches which evaluate the objective function for the n mixtures and then interpolate the obtained values. Keeping the mixture on the unit circle makes it easy to generate equidistant sampling points in the space; however, this means that only two fields are mixed at a time. Once the optimal mixture for two fields has been found, they are combined to form the input to the next iteration of the algorithm. This
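
    The circle-interpolation idea can be illustrated with a toy linear forward model. The sketch below is a schematic of the described trick, not the authors' implementation: two conditional fields are mixed with weights (cos θ, sin θ), the forward model is run at n coarse angles only, and the responses at each observation location are interpolated around the circle before searching for the best mixture.

    ```python
    # Toy illustration of Random Mixing's circle interpolation (schematic only).
    import numpy as np

    rng = np.random.default_rng(0)
    n_cells, n_obs = 200, 5
    A = rng.normal(size=(n_obs, n_cells)) / n_cells   # toy linear forward model
    h_obs = rng.normal(size=n_obs)                    # "observed heads"

    f1 = rng.normal(size=n_cells)   # two fields honouring the same covariance
    f2 = rng.normal(size=n_cells)

    def forward(field):
        return A @ field            # the expensive model run in the real method

    # Run the forward model only at n coarse, equally spaced angles...
    n_coarse = 8
    coarse = np.linspace(0.0, 2 * np.pi, n_coarse, endpoint=False)
    responses = np.array([forward(np.cos(t) * f1 + np.sin(t) * f2) for t in coarse])

    # ...then interpolate each observation's response around the circle.
    fine = np.linspace(0.0, 2 * np.pi, 720, endpoint=False)
    interp = np.empty((fine.size, n_obs))
    for k in range(n_obs):
        # periodic interpolation: repeat the first point at 2*pi
        interp[:, k] = np.interp(fine, np.append(coarse, 2 * np.pi),
                                 np.append(responses[:, k], responses[0, k]))

    objective = np.sum((interp - h_obs) ** 2, axis=1)
    best = fine[np.argmin(objective)]
    print(f"best mixing angle ~ {best:.3f} rad, objective {objective.min():.4f}")
    # The mixture cos(best)*f1 + sin(best)*f2 seeds the next iteration.
    ```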

  1. Requirements for modeling airborne microbial contamination in space stations

    Science.gov (United States)

    Van Houdt, Rob; Kokkonen, Eero; Lehtimäki, Matti; Pasanen, Pertti; Leys, Natalie; Kulmala, Ilpo

    2018-03-01

    Exposure to bioaerosols is one of the facets that affect indoor air quality, especially for people living in densely populated or confined habitats, and is associated with a wide range of health effects. Good indoor air quality is thus vital and a prerequisite for fully confined environments such as space habitats. Bioaerosols and microbial contamination in these confined space stations can have significant health impacts, considering the unique prevailing conditions and constraints of such habitats. Therefore, biocontamination in space stations is strictly monitored and controlled to ensure crew and mission safety. However, efficient bioaerosol control measures rely on a solid understanding and knowledge of how these bioaerosols are created and dispersed, and which factors affect the survivability of the associated microorganisms. Here we review the current knowledge gained from relevant studies in this wide and multidisciplinary area of bioaerosol dispersion modeling and biological indoor air quality control, taking into account the specific conditions prevailing in space habitats.

  2. [Requirements imposed on model objects in microevolutionary investigations].

    Science.gov (United States)

    Mina, M V

    2015-01-01

    Extrapolation of results of investigations of a model object is justified only within the limits of a set of objects that have essential properties in common with the model object. Which properties are essential depends on the aim of a study. Similarity of objects that emerged in the process of their independent evolution does not prove similarity of the ways and mechanisms of their evolution. If the objects differ in their essential properties, then extrapolation of the results of investigation from one object to another is risky, because it may lead to wrong decisions and, moreover, to the loss of interest in alternative hypotheses. The positions formulated above are considered with reference to species flocks of fishes, the large African Barbus in particular.

  3. Evaluation Methodologies for Information Management Systems; Building Digital Tobacco Industry Document Libraries at the University of California, San Francisco Library/Center for Knowledge Management; Experiments with the IFLA Functional Requirements for Bibliographic Records (FRBR); Coming to Term: Designing the Texas Email Repository Model.

    Science.gov (United States)

    Morse, Emile L.; Schmidt, Heidi; Butter, Karen; Rider, Cynthia; Hickey, Thomas B.; O'Neill, Edward T.; Toves, Jenny; Green, Marlan; Soy, Sue; Gunn, Stan; Galloway, Patricia

    2002-01-01

    Includes four articles that discuss evaluation methods for information management systems under the Defense Advanced Research Projects Agency; building digital libraries at the University of California San Francisco's Tobacco Control Archives; IFLA's Functional Requirements for Bibliographic Records; and designing the Texas email repository model…

  4. Ex-vessel core catcher design requirements and preliminary concepts evaluation

    International Nuclear Information System (INIS)

    Friedland, A.J.; Tilbrook, R.W.

    1974-01-01

    As part of the overall study of the consequences of a hypothetical failure to scram following loss of pumping power, design requirements and a preliminary concepts evaluation of an ex-vessel core catcher (EVCC) were performed. EVCC is the term applied to a class of devices whose primary objective is to provide a stable, subcritical and coolable configuration within containment following a postulated accident in which it is assumed that core debris has penetrated the Reactor Vessel and Guard Vessel. Under these assumed conditions, a set of functional requirements was developed for an EVCC and several concepts were evaluated. The studies were specifically directed toward the FFTF design, considering the constraints imposed by the physical design and construction of the FFTF plant

  5. Economics of the specification 6M safety re-evaluation and regulatory requirements

    International Nuclear Information System (INIS)

    Hopper, C.M.

    1985-01-01

    The objective of this work was to examine the potential economic impact of the DOT Specification 6M criticality safety re-evaluation and regulatory requirements. The examination was based upon comparative analyses of current authorized fissile material load limits for the 6M, current Federal regulations (and interpretations) limiting the contents of Type B fissile material packages, limiting aggregates of fissile material packages, and recent proposed fissile material mass limits derived from specialized criticality safety analyses of the 6M package. The work examines influences on cost in the transportation, handling, and storage of fissile materials. Depending upon facility throughput requirements (and assumed incremental costs of fissile material packaging, storage, and transport), operating, storage, and transportation costs can be reduced significantly. As an example of the pricing algorithm application based upon reasonable cost influences, the magnitude of the first-year cost reductions could extend beyond four times the cost of the packaging nuclear criticality safety re-evaluation. 1 tab

  6. Quantitative and Functional Requirements for Bioluminescent Cancer Models.

    Science.gov (United States)

    Feys, Lynn; Descamps, Benedicte; Vanhove, Christian; Vermeulen, Stefan; Vandesompele, J O; Vanderheyden, Katrien; Messens, Kathy; Bracke, Marc; De Wever, Olivier

    2016-01-01

    Bioluminescent cancer models are widely used, but detailed quantification of the luciferase signal and functional comparison with a non-transfected control cell line are generally lacking. In the present study, we provide quantitative and functional tests for luciferase-transfected cells. We quantified the luciferase expression in BLM and HCT8/E11 transfected cancer cells, and examined the effect of long-term luciferin exposure. The present study also investigated functional differences between parental and transfected cancer cells. Our results showed that quantification of different single-cell-derived populations is superior with droplet digital polymerase chain reaction. Quantification of luciferase protein level and luciferase bioluminescent activity is only useful when there is a significant difference in copy number. Continuous exposure of cell cultures to luciferin leads to inhibitory effects on mitochondrial activity, cell growth and bioluminescence. These inhibitory effects correlate with luciferase copy number. Cell culture and mouse xenograft assays showed no significant functional differences between luciferase-transfected and parental cells. Luciferase-transfected cells should be validated by quantitative and functional assays before starting large-scale experiments. Copyright © 2016 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

  7. Evaluation of Student's Environment by DEA Models

    Directory of Open Access Journals (Sweden)

    F. Moradi

    2016-11-01

    Full Text Available The important question here is: is there real evaluation of educational advance? In other words, if a student has been successful or unsuccessful in mathematics, is it possible to find the reasons behind his advance or weakness? To respond to this significant question, the factors of educational advance must be divided into five main groups: (1) family, (2) teacher, (3) student, (4) school and (5) school management. It can then be said that a student's score does not depend on just one factor, as people have imagined. From this, it can be concluded that by using the DEA and SBM models, each student's efficiency can be investigated and the factors behind the student's strengths and weaknesses can be analyzed.
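
    For readers unfamiliar with the mechanics, an input-oriented CCR efficiency score is obtained from a small linear program. The sketch below uses hypothetical student data (study hours and tutoring as inputs, an exam score as the output), not data from the study:

    ```python
    # Minimal input-oriented CCR DEA via linear programming (hypothetical data).
    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, j0):
        """Efficiency of unit j0. X: (m inputs, n units); Y: (s outputs, n units)."""
        m, n = X.shape
        s = Y.shape[0]
        # Decision variables: [theta, lambda_1 .. lambda_n]; minimize theta.
        c = np.zeros(n + 1)
        c[0] = 1.0
        # Inputs: sum_j lambda_j * x_ij - theta * x_i,j0 <= 0
        A_in = np.hstack([-X[:, [j0]], X])
        # Outputs: -sum_j lambda_j * y_rj <= -y_r,j0
        A_out = np.hstack([np.zeros((s, 1)), -Y])
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([np.zeros(m), -Y[:, j0]]),
                      bounds=[(0, None)] * (n + 1), method="highs")
        return res.fun

    # Four students: inputs = weekly study hours, tutoring hours; output = score.
    X = np.array([[20.0, 15.0, 30.0, 25.0],
                  [ 2.0,  1.0,  4.0,  2.0]])
    Y = np.array([[80.0, 70.0, 85.0, 90.0]])
    for j in range(4):
        print(f"student {j}: efficiency = {ccr_efficiency(X, Y, j):.3f}")
    ```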

  8. RTMOD: Real-Time MODel evaluation

    International Nuclear Information System (INIS)

    Graziani, G; Galmarini, S.; Mikkelsen, T.

    2000-01-01

    The 1998-1999 RTMOD project developed a system based on an automated statistical evaluation for the inter-comparison of real-time forecasts produced by long-range atmospheric dispersion models for national nuclear emergency predictions of cross-boundary consequences. The background of RTMOD was the 1994 ETEX project that involved about 50 models run in several institutes around the world to simulate two real tracer releases involving a large part of the European territory. In the preliminary phase of ETEX, three dry runs (i.e. simulations in real-time of fictitious releases) were carried out. At that time, the World Wide Web was not available to all the exercise participants, and plume predictions were therefore submitted to JRC-Ispra by fax and regular mail for subsequent processing. The rapid development of the World Wide Web in the second half of the nineties, together with the experience gained during the ETEX exercises, suggested the development of this project. RTMOD featured a web-based user-friendly interface for data submission and an interactive program module for displaying, intercomparison and analysis of the forecasts. RTMOD has focussed on model intercomparison of concentration predictions at the nodes of a regular grid with 0.5 degrees of resolution both in latitude and in longitude, the domain grid extending from 5W to 40E and 40N to 65N. Hypothetical releases were notified around the world to the 28 model forecasters via the web on a one-day warning in advance. They then accessed the RTMOD web page for detailed information on the actual release, and as soon as possible they then uploaded their predictions to the RTMOD server and could soon after start their inter-comparison analysis with other modelers. When additional forecast data arrived, already existing statistical results would be recalculated to include the influence of all available predictions. The new web-based RTMOD concept has proven useful as a practical decision-making tool for realtime
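
    The automated evaluation rests on straightforward gridded statistics. The sketch below illustrates the sort of intercomparison measures involved (bias, RMSE and a figure of merit in space) on synthetic fields over the 0.5-degree domain; the exact metric set used by RTMOD is not reproduced here.

    ```python
    # Simple gridded forecast-intercomparison statistics (illustrative metric set).
    import numpy as np

    def bias(forecast, reference):
        return float(np.mean(forecast - reference))

    def rmse(forecast, reference):
        return float(np.sqrt(np.mean((forecast - reference) ** 2)))

    def figure_of_merit(forecast, reference, threshold):
        """Overlap of areas above threshold: |A & B| / |A | B|."""
        a, b = forecast > threshold, reference > threshold
        union = np.logical_or(a, b).sum()
        return float(np.logical_and(a, b).sum() / union) if union else 1.0

    # Two synthetic plumes on the 0.5-degree grid (40N-65N x 5W-40E: 51 x 91 nodes).
    rng = np.random.default_rng(1)
    ref = rng.gamma(2.0, 1.0, size=(51, 91))
    fc = ref * rng.lognormal(0.0, 0.3, size=ref.shape)

    print(f"bias {bias(fc, ref):+.3f}, rmse {rmse(fc, ref):.3f}, "
          f"FMS {figure_of_merit(fc, ref, threshold=3.0):.2f}")
    ```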

  9. Application of Multiple Evaluation Models in Brazil

    Directory of Open Access Journals (Sweden)

    Rafael Victal Saliba

    2008-07-01

    Full Text Available Based on two different samples, this article tests the performance of a number of Value Drivers commonly used for evaluating companies by finance practitioners, through simple regression models of cross-section type which estimate the parameters associated with each Value Driver, denominated Market Multiples. We are able to diagnose the behavior of several multiples in the period 1994-2004, with an outlook also on the particularities of the economic activities performed by the sample companies (and their impacts on performance) through a subsequent analysis segregating the sample companies by sector. Extrapolating simple multiples evaluation standards from analysts of the main financial institutions in Brazil, we find that adjusting the ratio formulation to allow for an intercept does not provide satisfactory results in terms of pricing-error reduction. The results found, in spite of evidencing certain relative and absolute superiority among the multiples, may not be generically representative, given sample limitations.
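
    The cross-section regressions in question are one-driver models. The sketch below contrasts the two specifications (a pure multiple, and one with an intercept) on invented data and compares their pricing errors:

    ```python
    # Cross-section multiple estimation: P = beta*E vs P = alpha + beta*E (toy data).
    import numpy as np

    rng = np.random.default_rng(7)
    earnings = rng.uniform(1.0, 10.0, size=40)                 # value driver
    prices = 12.0 * earnings * rng.lognormal(0.0, 0.15, 40)    # "market" prices

    # Without intercept: the classic multiple, beta = implied P/E.
    beta = np.linalg.lstsq(earnings[:, None], prices, rcond=None)[0][0]
    err0 = prices - beta * earnings

    # With intercept.
    X = np.column_stack([np.ones_like(earnings), earnings])
    alpha, beta1 = np.linalg.lstsq(X, prices, rcond=None)[0]
    err1 = prices - (alpha + beta1 * earnings)

    for label, e in (("no intercept", err0), ("with intercept", err1)):
        print(f"{label}: mean abs pricing error {np.mean(np.abs(e)):.2f}")
    ```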

  10. Bioprocesses: Modelling needs for process evaluation and sustainability assessment

    DEFF Research Database (Denmark)

    Jiménez-Gonzaléz, Concepcion; Woodley, John

    2010-01-01

    development such that they can also be used to evaluate processes against sustainability metrics, as well as economics as an integral part of assessments. Finally, property models will also be required based on compounds not currently present in existing databases. It is clear that many new opportunities......The next generation of process engineers will face a new set of challenges, with the need to devise new bioprocesses, with high selectivity for pharmaceutical manufacture, and for lower value chemicals manufacture based on renewable feedstocks. In this paper the current and predicted future roles...... of process system engineering and life cycle inventory and assessment in the design, development and improvement of sustainable bioprocesses are explored. The existing process systems engineering software tools will prove essential to assist this work. However, the existing tools will also require further...

  11. Model-based Assessment for Balancing Privacy Requirements and Operational Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Knirsch, Fabian [Salzburg Univ. (Austria); Engel, Dominik [Salzburg Univ. (Austria); Frincu, Marc [Univ. of Southern California, Los Angeles, CA (United States); Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States)

    2015-02-17

    The smart grid changes the way energy is produced and distributed. In addition, both energy and information are exchanged bidirectionally among participating parties. Therefore, heterogeneous systems have to cooperate effectively in order to achieve a common high-level use case, such as smart metering for billing or demand response for load curtailment. Furthermore, a substantial amount of personal data is often needed for achieving that goal. Capturing and processing personal data in the smart grid increases customer concerns about privacy, and in addition, certain statutory and operational requirements regarding privacy-aware data processing and storage have to be met. An increase of privacy constraints, however, often limits the operational capabilities of the system. In this paper, we present an approach that automates the process of finding an optimal balance between privacy requirements and operational requirements in a smart grid use case and application scenario. This is achieved by formally describing use cases in an abstract model and by finding an algorithm that determines the optimum balance by forward mapping privacy and operational impacts. For this optimal balancing algorithm, both a numeric approximation and, if feasible, an analytic assessment are presented and investigated. The system is evaluated by applying the tool to a real-world use case from the University of Southern California (USC) microgrid.
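
    A numeric approximation of such a balance can be pictured as a search over a configuration knob that drives privacy impact one way and operational capability the other. The sketch below is a generic illustration with assumed impact functions and weights, not the paper's formal model:

    ```python
    # Generic numeric balancing of privacy vs. operational impact (assumed shapes).
    import numpy as np

    # Configuration knob: metering interval in minutes (finer = more capability,
    # coarser = more privacy). The impact functions are illustrative assumptions.
    intervals = np.linspace(1.0, 60.0, 60)

    privacy_impact = 1.0 / intervals    # finer data reveals more behaviour
    # Demand-response value needs reasonably fine data; it decays past ~30 min.
    operational_value = 1.0 / (1.0 + np.exp((intervals - 30.0) / 5.0))

    # Forward-map both impacts into one score and pick the optimum trade-off.
    score = operational_value - 2.0 * privacy_impact   # weight privacy twice
    best = intervals[np.argmax(score)]
    print(f"optimal metering interval ~ {best:.1f} min")
    ```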

  12. Evaluating to Solve Educational Problems: An Alternative Model.

    Science.gov (United States)

    Friedman, Myles I.; Anderson, Lorin W.

    1979-01-01

    A 19-step general evaluation model is described through its four stages: identifying problems, prescribing program solutions, evaluating the operation of the program, and evaluating the effectiveness of the model. The role of the evaluator in decision making is also explored. (RAO)

  13. Critical evaluation of paradigms for modelling integrated supply chains

    NARCIS (Netherlands)

    Van Dam, K.H.; Adhitya, A.; Srinivasan, R.; Lukszo, Z.

    2009-01-01

    Contemporary problems in process systems engineering often require model-based decision support tools. Among the various modelling paradigms, equation-based models and agent-based models are widely used to develop dynamic models of systems. Which is the most appropriate modelling paradigm for a

  14. A Model for Evaluating Student Clinical Psychomotor Skills.

    Science.gov (United States)

    And Others; Fiel, Nicholas J.

    1979-01-01

    A long-range plan to evaluate medical students' physical examination skills was undertaken at the Ingham Family Medical Clinic at Michigan State University. The development of the psychomotor skills evaluation model to evaluate the skill of blood pressure measurement, tests of the model's reliability, and the use of the model are described. (JMD)

  15. Study on a visibility evaluation model considering field factors; Field factor wo koryoshita shininsei hyoka model ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    Nakanishi, M; Hagiwara, T [Hokkaido University, Sapporo (Japan)

    1997-10-01

    The present study proposes a model to evaluate the visual performance of road traffic facilities required for drivers. Two factors were employed to obtain the suitable contrast for drivers under driving situations. One factor is a suitable luminance range, which is derived from the minimum required luminance and the glare luminance. The other is a field factor. The model showed the capability of providing a visibility range in some cases. 8 refs., 4 figs., 2 tabs.

  16. On Early Conflict Identification by Requirements Modeling of Energy System Control Structures

    DEFF Research Database (Denmark)

    Heussen, Kai; Gehrke, Oliver; Niemann, Hans Henrik

    2015-01-01

    issues early. For requirements formulation of control structures, cyber and physical aspects need to be jointly represented to express interdependencies, check for consistency and discover potentially conflicting requirements. Early identification of potential conflicts may prevent larger problems...... at later design stages. However, languages employed for requirements modeling today do not offer the expressiveness necessary to represent control purposes in relation to domain level interactions and therefore miss several types of interdependencies. This paper introduces the idea of control structure...... modeling for early requirements checking using a suitable modeling language, and illustrates how this approach enables the identification of several classes of controller conflict....

  17. Requirements-level semantics and model checking of object-oriented statecharts

    NARCIS (Netherlands)

    Eshuis, H.; Jansen, D.N.; Wieringa, Roelf J.

    2002-01-01

    In this paper we define a requirements-level execution semantics for object-oriented statecharts and show how properties of a system specified by these statecharts can be model checked using tool support for model checkers. Our execution semantics is requirements-level because it uses the perfect

  18. Evaluation of uncertainty in geological framework models at Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Bagtzoglou, A.C.; Stirewalt, G.L.; Henderson, D.B.; Seida, S.B.

    1995-01-01

    The first step towards determining compliance with the performance objectives for both the repository system and the geologic setting at Yucca Mountain requires the development of detailed geostratigraphic models. This paper proposes an approach for the evaluation of the degree of uncertainty inherent in geologic maps and associated three-dimensional geological models. Following this approach, an assessment of accuracy and completeness of the data and evaluation of conceptual uncertainties in the geological framework models can be performed

  19. A Statistical Evaluation of Atmosphere-Ocean General Circulation Models: Complexity vs. Simplicity

    OpenAIRE

    Robert K. Kaufmann; David I. Stern

    2004-01-01

    The principal tools used to model future climate change are General Circulation Models which are deterministic high resolution bottom-up models of the global atmosphere-ocean system that require large amounts of supercomputer time to generate results. But are these models a cost-effective way of predicting future climate change at the global level? In this paper we use modern econometric techniques to evaluate the statistical adequacy of three general circulation models (GCMs) by testing thre...

  20. Evaluating and reducing a model of radiocaesium soil-plant uptake

    Energy Technology Data Exchange (ETDEWEB)

    Tarsitano, D.; Young, S.D. [School of Biosciences, University of Nottingham, University Park, Nottingham, NG7 2RD (United Kingdom); Crout, N.M.J., E-mail: neil.crout@nottingham.ac.u [School of Biosciences, University of Nottingham, University Park, Nottingham, NG7 2RD (United Kingdom)

    2011-03-15

    An existing model of radiocaesium transfer to grasses was extended to include wheat and barley and parameterised using data from a wide range of soils and contact times. The model structure was revised and evaluated using a subset of the available data which was not used for model parameterisation. The resulting model was then used as a basis for systematic model reduction to test the utility of the model components. This analysis suggested that the use of 4 model variables (relating to radiocaesium adsorption on organic matter and the pH sensitivity of soil solution potassium concentration) and 1 model input (pH) are not required. The results of this analysis were used to develop a reduced model which was further evaluated in terms of comparisons to observations. The reduced model had an improved empirical performance and fewer adjustable parameters and soil characteristic inputs. Research highlights: • A model of plant radiocesium uptake is evaluated and re-parameterised. • The representation of time dependent changes in plant uptake is improved. • Model reduction is applied to evaluate the model structure. • A reduced model is identified which outperforms the previously reported model. • The reduced model requires fewer soil-specific inputs.

  1. Building a transnational biosurveillance network using semantic web technologies: requirements, design, and preliminary evaluation.

    Science.gov (United States)

    Teodoro, Douglas; Pasche, Emilie; Gobeill, Julien; Emonet, Stéphane; Ruch, Patrick; Lovis, Christian

    2012-05-29

    Antimicrobial resistance has reached globally alarming levels and is becoming a major public health threat. Lack of efficacious antimicrobial resistance surveillance systems was identified as one of the causes of increasing resistance, due to the lag time between new resistances and alerts to care providers. Several initiatives to track drug resistance evolution have been developed. However, no effective real-time and source-independent antimicrobial resistance monitoring system is available publicly. To design and implement an architecture that can provide real-time and source-independent antimicrobial resistance monitoring to support transnational resistance surveillance. In particular, we investigated the use of a Semantic Web-based model to foster integration and interoperability of interinstitutional and cross-border microbiology laboratory databases. Following the agile software development methodology, we derived the main requirements needed for effective antimicrobial resistance monitoring, from which we proposed a decentralized monitoring architecture based on the Semantic Web stack. The architecture uses an ontology-driven approach to promote the integration of a network of sentinel hospitals or laboratories. Local databases are wrapped into semantic data repositories that automatically expose local computing-formalized laboratory information in the Web. A central source mediator, based on local reasoning, coordinates the access to the semantic end points. On the user side, a user-friendly Web interface provides access and graphical visualization to the integrated views. We designed and implemented the online Antimicrobial Resistance Trend Monitoring System (ARTEMIS) in a pilot network of seven European health care institutions sharing 70+ million triples of information about drug resistance and consumption. Evaluation of the computing performance of the mediator demonstrated that, on average, query response time was a few seconds (mean 4.3, SD 0.1 × 10

  2. Diverse Secreted Effectors Are Required for Salmonella Persistence in a Mouse Infection Model

    Energy Technology Data Exchange (ETDEWEB)

    Kidwai, Afshan S.; Mushamiri, Ivy T.; Niemann, George; Brown, Roslyn N.; Adkins, Joshua N.; Heffron, Fred

    2013-08-12

    Salmonella enterica serovar Typhimurium causes typhoid-like disease in mice and is a model of typhoid fever in humans. One of the hallmarks of typhoid is persistence, the ability of the bacteria to survive in the host weeks after infection. Virulence factors called effectors facilitate this process by direct transfer to the cytoplasm of infected cells, thereby subverting cellular processes. Secretion of effectors to the cell cytoplasm takes place through multiple routes, including two separate type III secretion (T3SS) apparatuses as well as outer membrane vesicles. The two T3SS are encoded on separate pathogenicity islands, SPI-1 and -2, with SPI-1 more strongly associated with the intestinal phase of infection, and SPI-2 with the systemic phase. Both T3SS are required for persistence, but the effectors required have not been systematically evaluated. In this study, mutations in 48 described effectors were tested for persistence. We replaced each effector with a specific DNA barcode sequence by allelic exchange and co-infected with a wild-type reference to calculate the ratio of wild-type parent to mutant at different times after infection. The competitive index (CI) was determined by quantitative PCR in which primers that correspond to the barcode were used for amplification. Mutations in all but seven effectors reduced persistence, demonstrating that most effectors were required. One exception was CigR, a recently discovered effector that is widely conserved throughout enteric bacteria. Deletion of cigR increased lethality, suggesting that it may be an anti-virulence factor. The fact that almost all Salmonella effectors are required for persistence argues against redundant functions. This is different from effector repertoires in other intracellular pathogens such as Legionella.
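
    The competitive-index arithmetic itself is compact. A minimal sketch, assuming barcode (mutant) and reference (wild-type) quantities taken from qPCR at inoculation and at harvest; the numbers are invented:

    ```python
    # Competitive index from qPCR quantities (illustrative numbers).
    def competitive_index(mutant_out, wt_out, mutant_in, wt_in):
        """CI = (mutant/wild-type at output) / (mutant/wild-type at input).
        CI < 1 suggests the deleted effector contributes to persistence."""
        return (mutant_out / wt_out) / (mutant_in / wt_in)

    # Example: equal inoculum; mutant recovered 8-fold below wild type at day 28.
    ci = competitive_index(mutant_out=1.0e3, wt_out=8.0e3,
                           mutant_in=5.0e5, wt_in=5.0e5)
    print(f"CI = {ci:.3f}")   # 0.125 -> attenuated persistence
    ```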

  3. Towards systematic evaluation of crop model outputs for global land-use models

    Science.gov (United States)

    Leclere, David; Azevedo, Ligia B.; Skalský, Rastislav; Balkovič, Juraj; Havlík, Petr

    2016-04-01

    Land provides vital socioeconomic resources to society, however at the cost of large environmental degradation. Global integrated models combining high resolution global gridded crop models (GGCMs) and global economic models (GEMs) are increasingly being used to inform sustainable solutions for agricultural land-use. However, little effort has yet been made to evaluate and compare the accuracy of GGCM outputs. In addition, GGCM datasets require a large amount of parameters whose values and variability across space are weakly constrained: increasing the accuracy of such datasets has a very high computing cost. Innovative evaluation methods are required both to lend credibility to the global integrated models, and to allow efficient parameter specification of GGCMs. We propose an evaluation strategy for GGCM datasets in the perspective of use in GEMs, illustrated with preliminary results from a novel dataset (the Hypercube) generated by the EPIC GGCM and used in the GLOBIOM land use GEM to inform on present-day crop yield, water and nutrient input needs for 16 crops x 15 management intensities, at a spatial resolution of 5 arc-minutes. We adopt the following principle: evaluation should provide a transparent diagnosis of model adequacy for its intended use. We briefly describe how the Hypercube data is generated and how it articulates with GLOBIOM in order to transparently identify the performances to be evaluated, as well as the main assumptions and data processing involved. Expected performances include adequately representing the sub-national heterogeneity in crop yield and input needs: i) in space, ii) across crop species, and iii) across management intensities. We will present and discuss measures of these expected performances and weigh the relative contribution of crop model, input data and data processing steps in performances. We will also compare obtained yield gaps and main yield-limiting factors against the M3 dataset. Next steps include

  4. Quantitative evaluation of the requirements for the promotion as associate professor at German medical faculties.

    Science.gov (United States)

    Sorg, Heiko; Knobloch, Karsten

    2012-01-01

    First quantitative evaluation of the requirements for promotion as associate professor (AP) at German medical faculties. Analysis of the AP-regulations of German medical faculties according to a validated scoring system, which has been adapted to this study. The overall scoring for the AP-requirements at 35 German medical faculties was 13.5±0.6 of 20 possible scoring points (95% confidence interval 12.2-14.7). More than 88% of the AP-regulations demand sufficient performance in teaching and research with adequate scientific publication. Furthermore, 83% of the faculties expect an expert review of the candidate's performance. Conference presentations required as an assistant professor, as well as a reduction of the minimum time as an assistant professor, play only minor roles. The requirements for assistant professors to be nominated as associate professor at German medical faculties are high, with only a small range. In detail, however, it can be seen that large heterogeneity still exists, which hinders equal opportunities and career possibilities. These data might be used for the ongoing objective discussion.

  5. Evaluation of safety, an unavoidable requirement in the applications of ionizing radiations

    International Nuclear Information System (INIS)

    Jova Sed, Luis Andres

    2013-01-01

    Safety assessments should be conducted as a means to evaluate compliance with safety requirements (and thus the application of the fundamental safety principles) for all facilities and activities, in order to determine the measures to be taken to ensure safety. They are an essential tool in decision making. For a long time we have linked safety assessment to nuclear facilities and not to all practices involving the use of ionizing radiation in daily life. However, the main purpose of a safety assessment is to determine whether an appropriate level of safety has been reached for an installation or activity, and whether the objectives of safety and the basic safety criteria set by the designer, operating organization and regulatory body have been fulfilled, under the protection and safety requirements set out in the International Basic Safety Standards for Protection against Ionizing Radiation and for the Safety of Radiation Sources. This paper presents some criteria and personal experiences with the new international recommendations on this subject and their practical application in the region, and demonstrates the importance of this requirement. It reflects the need to train the personnel of the operator and the regulatory body in the proportional application of this requirement in practices with ionizing radiation

  6. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    Science.gov (United States)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficiently to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  7. Modelling and evaluating against the violent insider

    International Nuclear Information System (INIS)

    Fortney, D.S.; Al-Ayat, R.A.; Saleh, R.A.

    1991-01-01

    The violent insider threat poses a special challenge to facilities protecting special nuclear material from theft or diversion. These insiders could potentially behave as nonviolent insiders to deceitfully defeat certain safeguards elements and use violence to forcefully defeat hardware or personnel. While several vulnerability assessment tools are available to deal with the nonviolent insider, very limited effort has been directed to developing analysis tools for the violent threat. In this paper, the authors present an approach using the results of a vulnerability assessment for nonviolent insiders to evaluate certain violent insider scenarios. Since existing tools do not explicitly consider violent insiders, the approach is intended for experienced safeguards analysts and relies on the analyst to brainstorm possible violent actions, to assign detection probabilities, and to ensure consistency. The authors then discuss our efforts in developing an automated tool for assessing the vulnerability against those violent insiders who are willing to use force against barriers, but who are unwilling to kill or be killed. Specifically, the authors discuss our efforts in developing databases for violent insiders penetrating barriers, algorithms for considering the entry of contraband, and modelling issues in considering the use of violence

  8. Modelling and evaluating against the violent insider

    International Nuclear Information System (INIS)

    Fortney, D.S.; Al-Ayat, R.A.; Saleh, R.A.

    1991-07-01

    The violent insider threat poses a special challenge to facilities protecting special nuclear material from theft or diversion. These insiders could potentially behave as nonviolent insiders to deceitfully defeat certain safeguards elements and use violence to forcefully defeat hardware or personnel. While several vulnerability assessment tools are available to deal with the nonviolent insider, very limited effort has been directed to developing analysis tools for the violent threat. In this paper, we present an approach using the results of a vulnerability assessment for nonviolent insiders to evaluate certain violent insider scenarios. Since existing tools do not explicitly consider violent insiders, the approach is intended for experienced safeguards analysts and relies on the analyst to brainstorm possible violent actions, to assign detection probabilities, and to ensure consistency. We then discuss our efforts in developing an automated tool for assessing the vulnerability against those violent insiders who are willing to use force against barriers, but who are unwilling to kill or be killed. Specifically, we discuss our efforts in developing databases for violent insiders penetrating barriers, algorithms for considering the entry of contraband, and modelling issues in considering the use of violence

  9. Developing a conceptual model for selecting and evaluating online markets

    Directory of Open Access Journals (Sweden)

    Sadegh Feizollahi

    2013-04-01

    Full Text Available There is much evidence emphasizing the benefits of using new information and communication technologies in international business, and many believe that e-commerce can help satisfy customers' explicit and implicit requirements. Internet shopping is a concept developed after the introduction of electronic commerce. Information technology (IT) and its applications, specifically in the realm of the internet and e-mail, promoted the development of e-commerce in terms of advertising, motivating and information. However, with the development of new technologies, credit and financial exchange on internet websites were constructed to facilitate e-commerce. The proposed study sent a total of 200 questionnaires to the target group (teachers, students, professionals and managers of commercial web sites) and managed to collect 130 questionnaires for final evaluation. Cronbach's alpha test is used for measuring the reliability of the measurement instruments (questionnaires), and to assure construct validity, confirmatory factor analysis is employed. In addition, the research questions are analyzed based on the path analysis method in order to determine market selection models. In the present study, after examining different aspects of e-commerce, we provide a conceptual model for selecting and evaluating online marketing in Iran. These findings provide a consistent, targeted and holistic framework for the development of the internet market in the country.

  10. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    Science.gov (United States)

    The model performance evaluation consists of metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude and sequence errors.
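
    One hedged illustration of separating magnitude and sequence errors: comparing the sorted simulated and observed series removes timing (sequence) mismatch and leaves a magnitude-only error, while the error on the raw series combines both. The decomposition and the data below are illustrative assumptions, not the tutorial's actual metric definitions.

        import numpy as np

        def rmse(a, b):
            return float(np.sqrt(np.mean((np.asarray(a) - np.asarray(b)) ** 2)))

        obs = np.array([2.0, 5.0, 3.0, 8.0, 4.0])
        sim = np.array([3.0, 4.0, 5.0, 7.0, 2.0])

        combined = rmse(sim, obs)                          # magnitude + sequence errors
        magnitude_only = rmse(np.sort(sim), np.sort(obs))  # compares distributions only
        print(f"combined RMSE = {combined:.2f}, magnitude-only RMSE = {magnitude_only:.2f}")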

  11. Input data requirements for performance modelling and monitoring of photovoltaic plants

    DEFF Research Database (Denmark)

    Gavriluta, Anamaria Florina; Spataru, Sergiu; Sera, Dezso

    2018-01-01

    This work investigates the input data requirements in the context of performance modeling of thin-film photovoltaic (PV) systems. The analysis focuses on the PVWatts performance model, well suited for on-line performance monitoring of PV strings due to its low number of parameters and high... Modelling the performance of the PV modules at high irradiances requires a dataset of only a few hundred samples in order to obtain a power estimation accuracy of ~1-2%.
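
    For orientation, the PVWatts module model referred to above scales the nameplate DC rating by plane-of-array irradiance and applies a linear cell-temperature correction. The sketch below uses illustrative parameter values (the temperature coefficient in particular varies by module technology).

        def pvwatts_dc_power(g_poa, t_cell, p_dc0, gamma=-0.0035, t_ref=25.0):
            # g_poa: plane-of-array irradiance (W/m^2); t_cell: cell temperature (degC)
            # p_dc0: nameplate DC rating (W); gamma: power temperature coefficient (1/degC)
            return p_dc0 * (g_poa / 1000.0) * (1.0 + gamma * (t_cell - t_ref))

        print(pvwatts_dc_power(g_poa=800.0, t_cell=45.0, p_dc0=3000.0))  # ~2232 W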

  12. Hybrid supply chain model for material requirement planning under financial constraints: A case study

    Science.gov (United States)

    Curci, Vita; Dassisti, Michele; Josefa, Mula Bru; Manuel, Díaz Madroñero

    2014-10-01

    Supply chain models (SCM) are potentially capable of integrating different aspects of decision support for enterprise management tasks. The aim of this paper is to propose a hybrid mathematical programming model for the optimization of production requirements resource planning. The preliminary model was conceived bottom-up from a real industrial case and oriented to maximizing cash flow. Despite the intense computational effort required to converge to a solution, the optimisation gave good results for the objective function.

  13. Flood control design requirements and flood evaluation methods of inland nuclear power plant

    International Nuclear Information System (INIS)

    Zhang Ailing; Wang Ping; Zhu Jingxing

    2011-01-01

    The effect of flooding is one of the key safety and environmental factors in siting inland nuclear power plants. To date, laws and standard systems have been established in China for the selection of nuclear power plant locations and for flood control requirements. In this paper, the flood control standards of China and other countries are introduced. Several inland nuclear power plants are taken as examples to discuss the related flood evaluation methods in detail. Suggestions are also put forward in the paper. (authors)

  14. Airline service quality evaluation: A review on concepts and models

    OpenAIRE

    Navid Haghighat

    2017-01-01

    This paper reviews the major service quality concepts and models that led to great developments in evaluating service quality, focusing on the improvement process of the models through discussion of the criticisms of each model. Criticisms of these models are discussed to clarify the development steps of newer models, which led to the improvement of airline service quality models. The precise and accurate evaluation of service quality needs a reliable concept with comprehensive crite...

  15. A preliminary evaluation of the simulation requirements of accident situations in a PWR training simulator

    International Nuclear Information System (INIS)

    Bhattacharya, A.S.

    1978-07-01

    An attempt was made to establish the need to incorporate "accident situations" in training simulators, and to indicate that the modelling of such otherwise complex situations admits a great deal of simplification. In most cases a "semi-dynamic" model to display time-dependent flows or tank levels would suffice, while ESF-actuated start/stop or open/close of equipment could be modelled merely "live". Transport delays and time constants are in most cases of no significance, as are the accuracy considerations of most variables. The systems that need complete dynamic simulation are the ones that are always running, such as the pressuriser, RCS, reactor and steam generator, with the added advantages of modest accuracy requirements and the plant being shut down. (author)

  16. Finite Element Models Development of Car Seats With Passive Head Restraints to Study Their Meeting Requirements for EURO NCAP

    Directory of Open Access Journals (Sweden)

    D. Yu. Solopov

    2014-01-01

    In performing calculations to evaluate the passive safety of car seats by computer modelling methods, it is desirable to use finite element models (FEM) that provide the greatest accuracy of calculation results. It is also expedient to use FEMs that can be computed in a short time to give preliminary results at short notice. The paper describes the features of evaluating the passive safety ensured by the developed FEMs of seats with passive head restraints according to the requirements of EURO NCAP. In addition, the accuracy of the calculated results provided by the developed FEMs was evaluated, in relation to results obtained by specialists of an organization conducting similar research (LSTC). This work was performed within the framework of a technique that allows car seat designs with both passive and active head restraints to be developed effectively while meeting passive safety requirements. The calculations and experiments showed that, when evaluating by the EURO NCAP technique, the "rough" FEMs (1st and 2nd levels) can be considered rational (in terms of the labour costs of their creation and problem solving, as well as result errors), and it is expedient to use them for preliminary and multivariate calculations. Detailed models (3rd level) provide the greatest accuracy, reached for an evaluated impact speed of 16 km/h under the "moderate impact" loading conditions; the relative error of full head acceleration is 12%. In an evaluation by EURO NCAP using the NIC criterion, it can be concluded that the seat models of the 2nd level (467,936 elements) and 3rd level (1,255,358 elements) meet the passive safety requirements under "light", "moderate" and "heavy" impacts. For preliminary and multivariate EURO NCAP calculations, a model of the middle level (consisting of 467

  17. Evaluation of weather-based rice yield models in India

    Science.gov (United States)

    Sudharsan, D.; Adinarayana, J.; Reddy, D. Raji; Sreenivas, G.; Ninomiya, S.; Hirafuji, M.; Kiura, T.; Tanaka, K.; Desai, U. B.; Merchant, S. N.

    2013-01-01

    The objective of this study was to compare two different rice simulation models—standalone (Decision Support System for Agrotechnology Transfer [DSSAT]) and web based (SImulation Model for RIce-Weather relations [SIMRIW])—with agrometeorological data and agronomic parameters for estimation of rice crop production in the southern semi-arid tropics of India. Studies were carried out on the BPT5204 rice variety to evaluate the two crop simulation models. Long-term experiments were conducted in a research farm of Acharya N G Ranga Agricultural University (ANGRAU), Hyderabad, India. Initially, results were obtained using 4 years (1994-1997) of data with weather parameters from a local weather station to evaluate DSSAT simulated results against observed values. Linear regression models used for the purpose showed a close relationship between DSSAT and observed yield. Subsequently, yield comparisons were also carried out with SIMRIW and DSSAT, and validated with actual observed values. Since the correlation coefficients of the SIMRIW simulations were within acceptable limits, further rice experiments in the monsoon (Kharif) and post-monsoon (Rabi) agricultural seasons (2009, 2010 and 2011) were carried out with a location-specific distributed sensor network system. These proximal systems help to simulate dry weight, leaf area index and potential yield with the Java-based SIMRIW on a daily/weekly/monthly/seasonal basis. These dynamic parameters are useful to the farming community for necessary decision making in a ubiquitous manner. However, SIMRIW requires fine tuning for better results/decision making.

  18. Communicating Sustainability: An Operational Model for Evaluating Corporate Websites

    Directory of Open Access Journals (Sweden)

    Alfonso Siano

    2016-09-01

    The interest in corporate sustainability has increased rapidly in recent years and has encouraged organizations to adopt appropriate digital communication strategies, in which the corporate website plays a key role. Despite this growing attention in both the academic and business communities, models for the analysis and evaluation of online sustainability communication have not been developed to date. This paper aims to develop an operational model to identify and assess the requirements of sustainability communication in corporate websites. It has been developed from a literature review on corporate sustainability and digital communication and from an analysis of the websites of the organizations included in the “Global CSR RepTrak 2015” by the Reputation Institute. The model identifies the core dimensions of online sustainability communication (orientation, structure, ergonomics, content—OSEC), sub-dimensions (such as stakeholder engagement and governance tools), communication principles, and measurable items (e.g., presence of the materiality matrix, interactive graphs). A pilot study on the websites of the energy and utilities companies included in the Dow Jones Sustainability World Index 2015 confirms the applicability of the OSEC framework. Thus, the model can provide managers and digital communication consultants with an operational tool that is useful for developing an industry ranking and assessing best practices. The model can also help practitioners to identify corrective actions in the critical areas of digital sustainability communication and avoid greenwashing.

  19. Scenario Evaluator for Electrical Resistivity survey pre-modeling tool

    Science.gov (United States)

    Terry, Neil; Day-Lewis, Frederick D.; Robinson, Judith L.; Slater, Lee D.; Halford, Keith J.; Binley, Andrew; Lane, John W.; Werkema, Dale D.

    2017-01-01

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, such as the appropriate depth intervals or resolution of the methods. The relationship between ERI data and resistivity is nonlinear; therefore, these limitations depend on site conditions and survey design and are best assessed through forward and inverse modeling exercises prior to field investigations. In this approach, proposed field surveys are first numerically simulated given the expected electrical properties of the site, and the resulting hypothetical data are then analyzed using inverse models. Performing ERI forward/inverse modeling, however, requires substantial expertise and can take many hours to implement. We present a new spreadsheet-based tool, the Scenario Evaluator for Electrical Resistivity (SEER), which features a graphical user interface that allows users to manipulate a resistivity model and instantly view how that model would likely be interpreted by an ERI survey. The SEER tool is intended for use by those who wish to determine the value of including ERI to achieve project goals, and is designed to have broad utility in industry, teaching, and research.

  20. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species.

    Science.gov (United States)

    Adams, Matthew P; Collier, Catherine J; Uthicke, Sven; Ow, Yan X; Langlois, Lucas; O'Brien, Katherine R

    2017-01-04

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluated twelve published empirical models for temperature-dependent tropical seagrass photosynthesis, based on two criteria: (1) goodness of fit, and (2) how easily biologically-meaningful parameters can be obtained. All models were formulated in terms of parameters characterising the thermal optimum (Topt) for maximum photosynthetic rate (Pmax). These parameters indicate the upper thermal limits of seagrass photosynthetic capacity, and hence can be used to assess the vulnerability of seagrass to temperature change. Our study exemplifies an approach to model selection which optimises the usefulness of empirical models for both modellers and ecologists alike.
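
    As a sketch of the fitting exercise the study describes, the code below fits one simple functional form parameterised directly by Pmax and Topt (a Gaussian response, chosen here for brevity and not necessarily one of the twelve models the authors evaluated) to synthetic data.

        import numpy as np
        from scipy.optimize import curve_fit

        def pt_curve(T, p_max, t_opt, width):
            # Gaussian-shaped photosynthesis-temperature response.
            return p_max * np.exp(-((T - t_opt) / width) ** 2)

        # Synthetic observations: temperature (degC) vs photosynthetic rate.
        T = np.array([15.0, 20.0, 25.0, 30.0, 35.0, 40.0])
        P = np.array([2.1, 4.0, 5.8, 6.2, 4.9, 2.4])

        (p_max, t_opt, width), _ = curve_fit(pt_curve, T, P, p0=[6.0, 30.0, 8.0])
        print(f"Pmax = {p_max:.2f}, Topt = {t_opt:.1f} degC")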

  1. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species

    Science.gov (United States)

    Adams, Matthew P.; Collier, Catherine J.; Uthicke, Sven; Ow, Yan X.; Langlois, Lucas; O'Brien, Katherine R.

    2017-01-01

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluated twelve published empirical models for temperature-dependent tropical seagrass photosynthesis, based on two criteria: (1) goodness of fit, and (2) how easily biologically-meaningful parameters can be obtained. All models were formulated in terms of parameters characterising the thermal optimum (Topt) for maximum photosynthetic rate (Pmax). These parameters indicate the upper thermal limits of seagrass photosynthetic capacity, and hence can be used to assess the vulnerability of seagrass to temperature change. Our study exemplifies an approach to model selection which optimises the usefulness of empirical models for both modellers and ecologists alike.

  2. Pathogen reduction requirements for direct potable reuse in Antarctica: evaluating human health risks in small communities.

    Science.gov (United States)

    Barker, S Fiona; Packer, Michael; Scales, Peter J; Gray, Stephen; Snape, Ian; Hamilton, Andrew J

    2013-09-01

    Small, remote communities often have limited access to energy and water. Direct potable reuse of treated wastewater has recently gained attention as a potential solution for water-stressed regions, but requires further evaluation specific to small communities. The pathogen reduction required for safe implementation of direct potable reuse of treated sewage is an important consideration, but such values are typically quantified for larger communities and cities. A quantitative microbial risk assessment (QMRA) was conducted, using norovirus, Giardia and Campylobacter as reference pathogens, to determine the level of treatment required to meet the tolerable disease burden of 10^-6 DALYs per person per year, using Davis Station in Antarctica as an example of a small remote community. Two scenarios were compared: published municipal sewage pathogen loads, and estimated pathogen loads during a gastroenteritis outbreak. For the municipal sewage scenario, the estimated required log10 reductions were 6.9, 8.0 and 7.4 for norovirus, Giardia and Campylobacter respectively, while for the outbreak scenario the values were 12.1, 10.4 and 12.3 (95th percentiles). Pathogen concentrations are higher under outbreak conditions as a function of the relatively greater degree of contact between community members in a small population, compared with interactions in a large city, resulting in a higher proportion of the population being at risk of infection and illness. While the estimates of outbreak conditions may overestimate sewage concentration to some degree, the results suggest that additional treatment barriers would be required to achieve regulatory compliance for safe drinking water in small communities. Copyright © 2013 Elsevier B.V. All rights reserved.
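
    The core arithmetic of such a log-reduction target can be sketched as follows: the required log10 reduction value (LRV) is the log-ratio of the raw-sewage pathogen concentration to a tolerable drinking-water concentration back-calculated from the 10^-6 DALY target. Both concentrations below are invented placeholders, and the dose-response step that would produce the tolerable value is omitted.

        import math

        def required_log_reduction(c_raw, c_tolerable):
            # log10 reduction needed to bring a raw-sewage pathogen
            # concentration down to a tolerable drinking-water level.
            return math.log10(c_raw / c_tolerable)

        # Hypothetical norovirus concentrations (organisms per litre).
        print(f"LRV = {required_log_reduction(c_raw=1.0e7, c_tolerable=1.0e-5):.1f}")  # 12.0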

  3. Evaluation and hydrological modelization in the natural hazard prevention

    International Nuclear Information System (INIS)

    Pla Sentis, Ildefonso

    2011-01-01

    Soil degradation negatively affects the soil's functions as a base for producing food, regulating the hydrological cycle and maintaining environmental quality. All over the world soil degradation is increasing, partly due to gaps or deficiencies in the evaluation of the processes and causes of this degradation in each specific situation. The processes of soil physical degradation manifest through several problems such as compaction, runoff, water and wind erosion, and landslides, with collateral effects in situ and at a distance, often with disastrous consequences such as floods, landslides, sedimentation and droughts. These processes are frequently associated with unfavorable changes in the hydrologic processes responsible for the water balance and soil water regimes, mainly driven by changes in land use and management practices and by climatic changes. The evaluation of these processes using simple simulation models, under several scenarios of climatic change, soil properties and land use and management, would allow prediction of the occurrence of these disastrous processes and consequently the selection and application of appropriate soil conservation practices to eliminate or reduce their effects. These simulation models require, as a base, detailed climatic information and hydrologic soil property data. Despite the existence of methodologies and commercial equipment (ever more sophisticated and precise) to measure the different physical and hydrological soil properties related to degradation processes, most are only applicable under very specific or laboratory conditions. Often indirect methodologies are used, based on relations or empirical indexes without adequate validation, which often leads to expensive mistakes in the evaluation of soil degradation processes and their effects on natural disasters. Simple field methodologies may be preferred, direct and adaptable to different soil types and climates and to the sample size and the spatial variability of the

  4. Probabilistic evaluation of process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2016-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  5. A systems evaluation model for selecting spent nuclear fuel storage concepts

    International Nuclear Information System (INIS)

    Postula, F.D.; Finch, W.C.; Morissette, R.P.

    1982-01-01

    This paper describes a system evaluation approach used to identify and evaluate monitored, retrievable fuel storage concepts that fulfill ten key criteria for meeting the functional requirements and system objectives of the National Nuclear Waste Management Program. The selection criteria include health and safety, schedules, costs, socio-economic factors and environmental factors. The methodology used to establish the selection criteria, develop a weight of importance for each criterion and assess the relative merit of each storage system is discussed. The impact of cost relative to technical criteria is examined, along with experience in obtaining relative merit data and its application in the model. Topics considered include spent fuel storage requirements, functional requirements, preliminary screening, and Monitored Retrievable Storage (MRS) system evaluation. It is concluded that the proposed system evaluation model is universally applicable when many concepts in various stages of design and cost development need to be evaluated.

  6. Sustainable BECCS pathways evaluated by an integrated assessment model

    Science.gov (United States)

    Kato, E.

    2017-12-01

    Negative emissions technologies, particularly Bioenergy with Carbon Capture and Storage (BECCS), are key components of mitigation strategies in the ambitious future socioeconomic scenarios analysed by integrated assessment models. Generally, scenarios aiming to keep the mean global temperature rise below 2°C above pre-industrial levels would require net negative carbon emissions by the end of the 21st century. Also, in the context of the Paris Agreement, which acknowledges "a balance between anthropogenic emissions by sources and removals by sinks of greenhouse gases in the second half of this century", RD&D for negative emissions technologies in this decade plays a crucial role in enabling early deployment of the technology. Because producing the bioenergy feedstock needed for the anticipated level of gross negative emissions potentially requires extensive use of land and water, research on how to develop sustainable BECCS scenarios is needed. Here, we present BECCS deployment scenarios that consider an economically viable flow of the bioenergy system, including power generation and conversion processes to liquid and gaseous fuels for transportation and heat, with consideration of sustainable global biomass use. In the modelling process, detailed bioenergy representations, i.e. various feedstocks and conversion technologies with and without CCS, are implemented in the integrated assessment (IA) model GRAPE (Global Relationship Assessment to Protect the Environment). Also, to overcome a general discrepancy in assumed future agricultural yields between 'top-down' IA models and 'bottom-up' estimates, which crucially affects the land-use pattern, we applied yield changes of food and energy crops consistent with process-based biophysical crop models in consideration of changing climate conditions. Using the framework, economically viable strategies for implementing a sustainable bioenergy and BECCS flow are evaluated in the scenarios targeting to keep global average

  7. Nuclear safety culture evaluation model based on SSE-CMM

    International Nuclear Information System (INIS)

    Yang Xiaohua; Liu Zhenghai; Liu Zhiming; Wan Yaping; Peng Guojian

    2012-01-01

    Safety culture, which is of great significance in establishing safety objectives, characterizes the level of enterprise safety production and development. Traditional safety culture evaluation models emphasize the thinking and behavior of individuals and organizations, and pay attention to evaluation results while ignoring process. Moreover, the determination of evaluation indicators lacks objective evidence. A novel multidimensional safety culture evaluation model, which is both scientific and complete, is addressed by building a preliminary mapping between safety culture and the SSE-CMM's (Systems Security Engineering Capability Maturity Model) process areas and generic practices. The model focuses on evaluating the enterprise system security engineering process and provides new ideas and scientific evidence for the study of safety culture. (authors)

  8. Presenting an Evaluation Model for the Cancer Registry Software.

    Science.gov (United States)

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer is increasingly prevalent, cancer registries are of great importance as the main core of cancer control programs, and many different software systems have been designed for this purpose. Establishing a comprehensive evaluation model is therefore essential to evaluate and compare a wide range of such software. In this study, the criteria for cancer registry software were determined by studying the documentation and two functional software systems in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the validation results, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, once the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises an evaluation tool and an evaluation method. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. Based on the findings, a criteria-based evaluation method was chosen. The model of this study encompasses the various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation of general criteria from specific ones, while preserving the comprehensiveness of the criteria. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.

  9. Evaluation of the Current State of Integrated Water Quality Modelling

    Science.gov (United States)

    Arhonditsis, G. B.; Wellen, C. C.; Ecological Modelling Laboratory

    2010-12-01

    Environmental policy and management implementation require robust methods for assessing the contribution of various point and non-point pollution sources to water quality problems, as well as methods for estimating the expected and achieved compliance with water quality goals. Water quality models have been widely used to create the scientific basis for management decisions by providing a predictive link between restoration actions and ecosystem response. Modelling water quality and nutrient transport is challenging due to a number of constraints associated with the input data and existing knowledge gaps related to the mathematical description of landscape and in-stream biogeochemical processes. While enormous effort has been invested in making watershed models process-based and spatially distributed, there has not been a comprehensive meta-analysis of model credibility in the watershed modelling literature. In this study, we evaluate the current state of integrated water quality modeling across the range of temporal and spatial scales typically utilized. We address several common modeling questions by providing a quantitative assessment of model performance and by assessing how model performance depends on model development. The data compiled represent a heterogeneous group of modeling studies, especially with respect to complexity, spatial and temporal scales and model development objectives. Beginning from 1992, the year when Beven and Binley published their seminal paper on uncertainty analysis in hydrological modelling, and ending in 2009, we selected over 150 papers fitting a number of criteria. These criteria involved publications that: (i) employed distributed or semi-distributed modelling approaches; (ii) provided predictions on flow and nutrient concentration state variables; and (iii) reported fit to measured data. Model performance was quantified with the Nash-Sutcliffe Efficiency, the relative error, and the coefficient of determination. Further, our
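
    For reference, the three fit statistics named at the end of the abstract can be computed as in this small sketch (the observed/simulated series are invented, and the relative-error definition is one common variant).

        import numpy as np

        def nash_sutcliffe(obs, sim):
            # NSE = 1 - SSE / variance-scaled spread of the observations.
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        obs = np.array([10.0, 14.0, 9.0, 20.0, 16.0])
        sim = np.array([11.0, 13.0, 10.0, 18.0, 17.0])

        nse = nash_sutcliffe(obs, sim)
        rel_err = float(np.mean((sim - obs) / obs))      # mean relative error
        r2 = float(np.corrcoef(obs, sim)[0, 1] ** 2)     # coefficient of determination
        print(f"NSE = {nse:.2f}, relative error = {rel_err:.2%}, R^2 = {r2:.2f}")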

  10. A mechanism for revising accreditation standards: a study of the process, resources required and evaluation outcomes.

    Science.gov (United States)

    Greenfield, David; Civil, Mike; Donnison, Andrew; Hogden, Anne; Hinchcliff, Reece; Westbrook, Johanna; Braithwaite, Jeffrey

    2014-11-21

    The study objective was to identify and describe the process, resources and expertise required for the revision of accreditation standards, and to report outcomes arising from such activities. A secondary document analysis was conducted of materials from an accreditation standards development agency. The Royal Australian College of General Practitioners' (RACGP) documents, minutes and reports related to the revision of the accreditation standards were examined. The RACGP revision of the accreditation standards was conducted over a 12 month period and comprised six phases with multiple tasks, including: review methodology planning; review of the evidence base and each standard; new material development; constructing field trial methodology; drafting, trialling and refining new standards; and production of the new standards. Over 100 individuals participated, with an additional 30 providing periodic input and feedback. Participants were drawn from healthcare professional associations, primary healthcare services, accreditation agencies, government agencies and public health organisations. Their expertise spanned: project management; standards development and writing; primary healthcare practice; quality and safety improvement methodologies; accreditation implementation and surveying; and research. The review and development process was shaped by five issues: project expectations; resource and time requirements; a collaborative approach; stakeholder engagement; and the product produced. The RACGP evaluation found that participants were positive about their experience and the standards produced, and considered them relevant for the sector. The revision of accreditation standards requires considerable resources and expertise, drawn from a broad range of stakeholders. Collaborative, inclusive processes that engage key stakeholders help promote greater industry acceptance of the standards.

  11. Modeling a support system for the evaluator

    International Nuclear Information System (INIS)

    Lozano Lima, B.; Ilizastegui Perez, F; Barnet Izquierdo, B.

    1998-01-01

    This work gives evaluators a tool they can employ to make their reviews of operational limits and conditions more sound. The system establishes the most adequate method for carrying out the evaluation, as well as for evaluating the basis of technical operational specifications. It also generates alternative questions to be supplied to the operating entity to support it in decision-making activities.

  12. Challenges for the registration of vaccines in emerging countries: Differences in dossier requirements, application and evaluation processes.

    Science.gov (United States)

    Dellepiane, Nora; Pagliusi, Sonia

    2018-06-07

    The divergence of regulatory requirements and processes in developing and emerging countries contributes to hampering vaccine registration, and therefore delays access to high-quality, safe and efficacious vaccines for the respective populations. This report focuses on providing insights into the heterogeneity of registration requirements, in terms of the numbering structure and overall content of dossiers, for marketing authorisation applications for vaccines in different areas of the world. While it also illustrates the divergence of regulatory processes in general, as well as the need to avoid redundant reviews, it does not claim to provide a comprehensive view of all processes or existing facilitating mechanisms, nor is it intended to touch upon the differences in assessments made by different regulatory authorities. This report describes the work of regulatory experts from vaccine manufacturing companies, during a meeting held in Geneva in May 2017, in identifying and quantifying differences in the requirements for vaccine registration across three aspects: the dossier numbering structure and contents, the application forms, and the evaluation procedures, in different countries and regions. Module 1 of the Common Technical Document (CTD) was compared across 10 countries. Modules 2-5 of the CTDs of two regions and three countries were compared to the CTD of the US FDA. The application forms of eight countries were compared, as were the registration procedures of 134 importing countries. The analysis indicates a high degree of divergence in numbering structure and content requirements. Possible interventions that would lead to significant improvements in registration efficiency include alignment of the CTD numbering structure, a standardised model application form, and better convergence of evaluation procedures. Copyright © 2018.

  13. Statistical models of shape optimisation and evaluation

    CERN Document Server

    Davies, Rhodri; Taylor, Chris

    2014-01-01

    Deformable shape models have wide application in computer vision and biomedical image analysis. This book addresses a key issue in shape modelling: establishment of a meaningful correspondence between a set of shapes. Full implementation details are provided.

  14. A model for evaluating beef cattle rations considering effects of ruminal fiber mass

    OpenAIRE

    Henrique,Douglas Sampaio; Lana,Rogério de Paula; Vieira,Ricardo Augusto Mendonça; Fontes,Carlos Augusto de Alencar; Botelho,Mosar Faria

    2011-01-01

    A mathematical model based on the Cornell Net Carbohydrate and Protein System (CNCPS) was developed and adapted to evaluate beef cattle rations under tropical climate conditions. The presented system differs from the CNCPS in its modeling of insoluble particles' digestion and passage kinetics, which enabled the estimation of fiber mass in the rumen and its effects on animal performance. The equations used to estimate metabolizable protein and net energy requirements for gain, net energy requirement...

  15. Evaluation of EOR Processes Using Network Models

    DEFF Research Database (Denmark)

    Winter, Anatol; Larsen, Jens Kjell; Krogsbøll, Anette

    1998-01-01

    The report consists of the following parts: 1) Studies of wetting properties of model fluids and fluid mixtures aimed at an optimal selection of candidates for micromodel experiments. 2) Experimental studies of multiphase transport properties using physical models of porous networks (micromodels......) including estimation of their "petrophysical" properties (e.g. absolute permeability). 3) Mathematical modelling and computer studies of multiphase transport through pore space using mathematical network models. 4) Investigation of link between pore-scale and macroscopic recovery mechanisms....

  16. The Use of AMET and Automated Scripts for Model Evaluation

    Science.gov (United States)

    The Atmospheric Model Evaluation Tool (AMET) is a suite of software designed to facilitate the analysis and evaluation of meteorological and air quality models. AMET matches the model output for particular locations to the corresponding observed values from one or more networks ...

  17. An evaluation of sex-age-kill (SAK) model performance

    Science.gov (United States)

    Millspaugh, Joshua J.; Skalski, John R.; Townsend, Richard L.; Diefenbach, Duane R.; Boyce, Mark S.; Hansen, Lonnie P.; Kammermeyer, Kent

    2009-01-01

    The sex-age-kill (SAK) model is widely used to estimate abundance of harvested large mammals, including white-tailed deer (Odocoileus virginianus). Despite a long history of use, few formal evaluations of SAK performance exist. We investigated how violations of the stable age distribution and stationary population assumption, changes to male or female harvest, stochastic effects (i.e., random fluctuations in recruitment and survival), and sampling efforts influenced SAK estimation. When the simulated population had a stable age distribution and λ > 1, the SAK model underestimated abundance. Conversely, when λ < 1, the SAK overestimated abundance. When changes to male harvest were introduced, SAK estimates were opposite the true population trend. In contrast, SAK estimates were robust to changes in female harvest rates. Stochastic effects caused SAK estimates to fluctuate about their equilibrium abundance, but the effect dampened as the size of the surveyed population increased. When we considered both stochastic effects and sampling error at a deer management unit scale the resultant abundance estimates were within ±121.9% of the true population level 95% of the time. These combined results demonstrate extreme sensitivity to model violations and scale of analysis. Without changes to model formulation, the SAK model will be biased when λ ≠ 1. Furthermore, any factor that alters the male harvest rate, such as changes to regulations or changes in hunter attitudes, will bias population estimates. Sex-age-kill estimates may be precise at large spatial scales, such as the state level, but less so at the individual management unit level. Alternative models, such as statistical age-at-harvest models, which require similar data types, might allow for more robust, broad-scale demographic assessments.
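
    To make the estimator concrete, a heavily simplified SAK-style reconstruction is sketched below: the male harvest is expanded by an assumed male harvest rate, then scaled up by sex and recruitment ratios. In the actual SAK model these rates and ratios are derived from harvest age structure; the inputs here are hypothetical.

        def sak_abundance(male_harvest, male_harvest_rate,
                          females_per_male, fawns_per_female):
            # Expand harvested males to total males, then add females and fawns.
            males = male_harvest / male_harvest_rate
            females = males * females_per_male
            fawns = females * fawns_per_female
            return males + females + fawns

        # Hypothetical management-unit inputs.
        print(f"N = {sak_abundance(1200, 0.4, 2.5, 0.9):,.0f}")  # 17,250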

  18. Evaluation of global climate models for Indian monsoon climatology

    International Nuclear Information System (INIS)

    Kodra, Evan; Ganguly, Auroop R; Ghosh, Subimal

    2012-01-01

    The viability of global climate models for forecasting the Indian monsoon is explored. Evaluation and intercomparison of model skills are employed to assess the reliability of individual models and to guide model selection strategies. Two dominant and unique patterns of Indian monsoon climatology are trends in maximum temperature and periodicity in total rainfall observed after 30 yr averaging over India. An examination of seven models and their ensembles reveals that no single model or model selection strategy outperforms the rest. The single-best model for the periodicity of Indian monsoon rainfall is the only model that captures a low-frequency natural climate oscillator thought to dictate the periodicity. The trend in maximum temperature, which most models are thought to handle relatively better, is best captured through a multimodel average compared to individual models. The results suggest a need to carefully evaluate individual models and model combinations, in addition to physical drivers where possible, for regional projections from global climate models. (letter)

  19. Personal recommender systems for learners in lifelong learning: requirements, techniques and model

    NARCIS (Netherlands)

    Drachsler, Hendrik; Hummel, Hans; Koper, Rob

    2007-01-01

    Drachsler, H., Hummel, H. G. K., & Koper, R. (2008). Personal recommender systems for learners in lifelong learning: requirements, techniques and model. International Journal of Learning Technology, 3(4), 404-423.

  20. Developing a Predictive Model for Unscheduled Maintenance Requirements on United States Air Force Installations

    National Research Council Canada - National Science Library

    Kovich, Matthew D; Norton, J. D

    2008-01-01

    .... This paper develops one such method by using linear regression and time series analysis to develop a predictive model to forecast future year man-hour and funding requirements for unscheduled maintenance...

  1. A modeling ontology for integrating vulnerabilities into security requirements conceptual foundations

    NARCIS (Netherlands)

    Elahi, G.; Yu, E.; Zannone, N.; Laender, A.H.F.; Castano, S.; Dayal, U.; Casati, F.; Palazzo Moreira de Oliveira, J.

    2009-01-01

    Vulnerabilities are weaknesses in the requirements, design, and implementation, which attackers exploit to compromise the system. This paper proposes a vulnerability-centric modeling ontology, which aims to integrate empirical knowledge of vulnerabilities into the system development process. In

  2. Issues in Value-at-Risk Modeling and Evaluation

    NARCIS (Netherlands)

    J. Daníelsson (Jón); C.G. de Vries (Casper); B.N. Jorgensen (Bjørn); P.F. Christoffersen (Peter); F.X. Diebold (Francis); T. Schuermann (Til); J.A. Lopez (Jose); B. Hirtle (Beverly)

    1998-01-01

    Discusses the issues in value-at-risk modeling and evaluation. Value of value at risk; Horizon problems and extreme events in financial risk management; Methods of evaluating value-at-risk estimates.
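
    As a pointer to what is being evaluated, the simplest value-at-risk estimator discussed in this literature is historical simulation: the VaR is a loss quantile of the empirical return distribution. The sketch below uses synthetic returns and is not drawn from the work's own examples.

        import numpy as np

        def historical_var(returns, confidence=0.99):
            # Loss quantile of the empirical return distribution.
            return -np.percentile(np.asarray(returns, float), (1.0 - confidence) * 100.0)

        rng = np.random.default_rng(0)
        daily_returns = rng.normal(0.0, 0.01, 1000)  # synthetic daily P&L history
        print(f"1-day 99% VaR = {historical_var(daily_returns):.4f}")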

  3. Whether regional lymph nodes evaluation should be equally required for both right and left colon cancer.

    Science.gov (United States)

    Guan, Xu; Chen, Wei; Liu, Zheng; Jiang, Zheng; Hu, Hanqing; Zhao, Zhixun; Wang, Song; Chen, Yinggang; Wang, Guiyu; Wang, Xishan

    2016-09-13

    Although the adequacy of nodal evaluation has gradually improved for colon cancer, the disparity in nodal examination between right colon cancer (RCC) and left colon cancer (LCC) still begs the question of whether 12 nodes is an appropriate threshold for both RCC and LCC. From the Surveillance, Epidemiology, and End Results (SEER) database, we identified 53,897 RCC patients and 11,822 LCC patients. Compared with LCC patients, RCC patients had more lymph nodes examined (18.7 vs 16.3) and were more likely to have ≥12 nodes examined (P < 0.001). Cancer specific survival (CSS) was calculated according to the optimal node number in RCC and LCC patients, and Cox regression models were used to further assess the prognostic value of this revised nodal evaluation. The results showed that 5-year CSS was significantly improved for RCC patients with ≥15 lymph nodes examined and for LCC patients with ≥11 lymph nodes examined (P < 0.05), rather than a single 12-node threshold for colon cancer as a whole.

  4. Model-Based Requirements Analysis for Reactive Systems with UML Sequence Diagrams and Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon; Lassen, Kristian Bisgaard

    2008-01-01

    In this paper, we describe a formal foundation for a specialized approach to automatically checking traces against real-time requirements. The traces are obtained from simulation of Coloured Petri Net (CPN) models of reactive systems. The real-time requirements are expressed in terms of a derivat...
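
    The flavour of checking a simulated trace against a bounded-response requirement can be illustrated without CPN machinery: every stimulus event must be followed by a response within a deadline. The event names, trace format and checking logic below are a hypothetical simplification, not the paper's formal derivation.

        def check_response_deadline(trace, stimulus, response, deadline):
            # trace: list of (time, event) pairs in time order.
            pending = []
            for t, event in trace:
                if event == stimulus:
                    pending.append(t)
                elif event == response and pending:
                    if t - pending.pop(0) > deadline:
                        return False
            return not pending  # unanswered stimuli also violate the requirement

        trace = [(0, "req"), (3, "ack"), (10, "req"), (18, "ack")]
        print(check_response_deadline(trace, "req", "ack", deadline=5))  # False: 18-10 > 5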

  5. Fuel requirements for experimental devices in MTR reactors. A perturbation model for reactor core analysis

    International Nuclear Information System (INIS)

    Beeckmans de West-Meerbeeck, A.

    1991-01-01

    Irradiation in neutron absorbing devices, requiring high fast neutron fluxes in the core or high thermal fluxes in the reflector and flux traps, leads to higher density fuel and larger core dimensions. A perturbation model of the reactor core helps to estimate the fuel requirements. (orig.)

  6. Evaluation Of Supplemental Pre-Treatment Development Requirements To Meet TRL 6: Rotary Microfiltration

    International Nuclear Information System (INIS)

    Huber, H.J.

    2011-01-01

    In spring 2011, the Technology Maturation Plan (TMP) for the Supplemental Treatment Project (RPP-PLAN-49827, Rev. 0, Technology Maturation Plan for the Treatment Project (T4S01)) was developed. This plan contains all identified actions required to reach technical maturity for a field-deployable waste feed pretreatment system. The supplemental pretreatment system has a filtration component and a Cs-removal component. Subsequent to issuance of the TMP, rotary microfiltration (RMF) was identified as the prime filtration technology for this application. The prime Cs-removal technology is small column ion exchange (ScIX) using spherical resorcinol formaldehyde (sRF) as the exchange resin. During fiscal year 2011 (FY2011) some of the tasks identified in the TMP were completed. As of September 2011, the conceptual design package has been submitted to DOE as part of the critical decision (CD-1) process. This document describes the remaining tasks identified in the TMP to reach technical maturity and evaluates the validity of the proposed tests to fill the gaps previously identified in the TMP. The potential vulnerabilities are presented, and the completed list of criteria for the different technology readiness levels of DOE guide DOE G 413.3-4 is added in an attachment. This evaluation was conducted from a technology development perspective; all programmatic and manufacturing aspects were excluded from this exercise. Compliance with the DOE G 413.3-4 programmatic and manufacturing requirements will be addressed directly by the Treatment Project during the course of engineering design. The results of this evaluation show that completion of the proposed development tasks in the TMP is sufficient to reach TRL 6 from a technological point of view. The tasks involve actual waste tests using the current baseline configuration (2nd generation disks, 40 psi differential pressure, 30 °C feed temperature) and three different simulants - the PEP, an AP-Farm and an S

  7. Evaluation of energy requirements for all-electric range of plug-in hybrid electric two-wheeler

    International Nuclear Information System (INIS)

    Amjad, Shaik; Rudramoorthy, R.; Neelakrishnan, S.; Sri Raja Varman, K.; Arjunan, T.V.

    2011-01-01

    Recently, plug-in hybrid electric vehicles (PHEVs) have been emerging as one of the promising alternatives to improve the sustainability of transportation energy and air quality, especially in urban areas. The all-electric range in PHEV design plays a significant role in the sizing and cost of the battery pack. This paper presents the evaluation of battery energy and power requirements for a plug-in hybrid electric two-wheeler for different all-electric ranges. An analytical vehicle model and MATLAB simulation analysis are discussed. The MATLAB simulation results estimate the impact of driving cycle and all-electric range on the energy capacity, additional mass and initial cost of lead-acid, nickel-metal hydride and lithium-ion batteries. This paper also focuses on the influence of cycle life on the annual cost of the battery pack, and recommends a suitable battery pack for implementation in plug-in hybrid electric two-wheelers. -- Research highlights: → Evaluates the battery energy and power requirements for a plug-in hybrid electric two-wheeler. → Simulation results reveal that the IDC demands more energy and battery cost than the ECE R40. → If cycle life is considered, the annual cost of a Ni-MH battery pack is lower than that of lead-acid and Li-ion.
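
    The basic sizing arithmetic linking all-electric range to battery energy and mass can be sketched as below; the consumption, depth-of-discharge and specific-energy figures are illustrative assumptions rather than values from the paper.

        def battery_pack_size(aer_km, wh_per_km, usable_dod=0.8,
                              specific_energy_wh_per_kg=100.0):
            # Energy (Wh) and mass (kg) needed for a given all-electric range,
            # allowing for the usable depth of discharge of the pack.
            energy = aer_km * wh_per_km / usable_dod
            mass = energy / specific_energy_wh_per_kg
            return energy, mass

        # Hypothetical two-wheeler: 30 Wh/km over a 40 km all-electric range.
        energy, mass = battery_pack_size(aer_km=40, wh_per_km=30)
        print(f"{energy:.0f} Wh, {mass:.1f} kg")  # 1500 Wh, 15.0 kg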

  8. State of the Art : Integrated Management of Requirements in Model-Based Software Engineering

    OpenAIRE

    Thörn, Christer

    2006-01-01

    This report describes the background and future of research concerning integrated management of requirements in model-based software engineering. The focus is on describing the relevant topics and existing theoretical backgrounds that form the basis for the research. The report describes the fundamental difficulties of requirements engineering for software projects, and proposes that the results and methods of models in software engineering can help alleviate those problems. Taking inspiration...

  9. MARKET EVALUATION MODEL: TOOL FOR BUSINESS DECISIONS

    OpenAIRE

    Porlles Loarte, José; Yenque Dedios, Julio; Lavado Soto, Aurelio

    2014-01-01

    In the present work, the concepts of potential market and global market are analyzed as the basis for strategic market decisions with long-term perspectives, when the establishment of a business in a certain geographic area is evaluated. On this conceptual frame, a methodological tool is proposed to evaluate a commercial decision, taking as reference the case of the brewing industry in Peru, considering that this industry faces entrepreneurial reorderings in the region withi...

  10. A Regional Climate Model Evaluation System

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop a packaged data management infrastructure for the comparison of generated climate model output to existing observational datasets that includes capabilities...

  11. Assessment of reflood heat transfer model of COBRA-TF against ABB-CE evaluation model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S. I.; Lee, S. Y.; Park, C. E.; Choi, H. R.; Choi, C. J. [Korea Power Engineering Company Inc., Taejon (Korea, Republic of)

    2000-05-01

    According to 10 CFR 50 Appendix K, the ECCS performance evaluation model should be based on the experimental data of FLECHT and be conservative compared with the experimental data. To meet this requirement, ABB-CE has a complicated code structure, as follows: COMPERC-II calculates three reflood rates, FLELAPC and HTCOF calculate the reflood heat transfer coefficients, and finally STRIKIN-II calculates the cladding temperature using the reflood heat transfer coefficients calculated in the previous stage. In this paper, to investigate whether COBRA-TF satisfies the requirement of Appendix K, the reflood heat transfer coefficient of COBRA-TF was assessed against the ABB-CE MOD-2C model. It was found that COBRA-TF properly predicts the experimental data and is more conservative than the ABB-CE MOD-2C model. Based on these results, it can be concluded that the reflood heat transfer coefficients calculated by COBRA-TF meet the requirement of Appendix K.

  12. QUALITY OF AN ACADEMIC STUDY PROGRAMME - EVALUATION MODEL

    Directory of Open Access Journals (Sweden)

    Mirna Macur

    2016-01-01

    The quality of an academic study programme is evaluated by many: by employees (internal evaluation) and by external evaluators: experts, agencies and organisations. Internal and external evaluations of an academic programme follow a written structure that resembles one of the quality models. We believe the quality models (mostly derived from the EFQM excellence model) do not fit well into non-profit activities, policies and programmes, because these are much more complex than the environments from which the quality models derive (for example, the assembly line). The quality of an academic study programme is very complex and understood differently by various stakeholders, so we present dimensional evaluation in this article. Dimensional evaluation, as opposed to component and holistic evaluation, is a form of analytical evaluation in which the quality or value of the evaluand is determined by looking at its performance on multiple dimensions of merit or evaluation criteria. First, the stakeholders of a study programme and their views, expectations and interests are presented, followed by the evaluation criteria. Both are then joined into an evaluation model revealing which evaluation criteria can and should be evaluated by which stakeholder. Main research questions are posed and a research method for each dimension is listed.

  13. Modeling of requirement specification for safety critical real time computer system using formal mathematical specifications

    International Nuclear Information System (INIS)

    Sankar, Bindu; Sasidhar Rao, B.; Ilango Sambasivam, S.; Swaminathan, P.

    2002-01-01

    Real time computer systems are increasingly used for safety critical supervision and control of nuclear reactors. Typical application areas are supervision of the reactor core against coolant flow blockage, supervision of clad hot spots, supervision of undesirable power excursions, power control and control logic for fuel handling systems. The most frequent cause of faults in safety critical real time computer systems is traced to fuzziness in the requirement specification. To ensure the specified safety, it is necessary to model the requirement specification of safety critical real time computer systems using formal mathematical methods. Modeling eliminates the fuzziness in the requirement specification and also helps to prepare the verification and validation schemes. Test data can be easily designed from the model of the requirement specification. Z and B are popular languages used for modeling requirement specifications. A typical safety critical real time computer system for supervising the reactor core of the prototype fast breeder reactor (PFBR) against flow blockage is taken as a case study. Modeling techniques and the actual model are explained in detail. The advantages of modeling for ensuring safety are summarized.

  14. Evaluation of safety requirements of erbium laser equipment used in dentistry

    International Nuclear Information System (INIS)

    Braga, Flavio Hamilton

    2002-01-01

    The erbium laser (Er:YAG) has been used in several therapeutic processes. Erbium lasers, however, operate with energies capable of producing lesions in biological tissues. To promote safe use, the commercialization of therapeutic laser equipment is controlled in Brazil, where equipment should comply with quality and safety requirements prescribed in technical regulations. The objective of this work is to evaluate the quality and safety requirements of a commercial therapeutic erbium laser according to Brazilian regulations, and to discuss a risk control program intended to minimize accidental exposure to dangerous laser radiation levels. It was verified that the analyzed laser can produce lesions in the skin and eyes when they are exposed to laser radiation at distances smaller than 80 cm for 10 s or more. In these conditions, the use of protection glasses is recommended for personnel with access to the laser operation environment. It was verified that user training and the presence of a target indicator are fundamental to avoiding damage to the skin and buccal cavity. It was also verified that knowledge and correct use of the equipment's safety devices, together with the application of technical and administrative measures, are efficient in minimizing the risk of dangerous exposure to laser radiation. (author)

  15. Evaluation of the NASA Arc Jet Capabilities to Support Mission Requirements

    Science.gov (United States)

    Calomino, Anthony; Bruce, Walt; Gage, Peter; Horn, Dennis; Mastaler, Mike; Rigali, Don; Robey, Judee; Voss, Linda; Wahlberg, Jerry; Williams, Calvin

    2010-01-01

    NASA accomplishes its strategic goals through human and robotic exploration missions. Many of these missions require launching and landing or returning spacecraft with human or return samples through Earth's and other planetary atmospheres. Spacecraft entering an atmosphere are subjected to extreme aerothermal loads. Protecting against these extreme loads is a critical element of spacecraft design. The safety and success of the planned mission is a prime concern for the Agency, and risk mitigation requires the knowledgeable use of thermal protection systems to successfully withstand the high-energy states imposed on the vehicle. Arc jets provide ground-based testing for development and flight validation of re-entry vehicle thermal protection materials and are a critical capability and core competency of NASA. The Agency's primary hypersonic thermal testing capability resides at the Ames Research Center and the Johnson Space Center and was developed and built in the 1960s and 1970s. This capability was critical to the success of Apollo, Shuttle, Pioneer, Galileo, Mars Pathfinder, and Orion. But the capability and the infrastructure are beyond their design lives. The complexes urgently need strategic attention and investment to meet the future needs of the Agency. The Office of Chief Engineer (OCE) chartered the Arc Jet Evaluation Working Group (AJEWG), a team of experienced individuals from across the Nation, to capture perspectives and requirements from the arc jet user community and from the community that operates and maintains this capability and capacity. This report offers the AJEWG's findings and conclusions that are intended to inform the discussion surrounding potential strategic technical and investment strategies. The AJEWG was directed to employ a 30-year Agency-level view so that near-term issues did not cloud the findings and conclusions and did not dominate or limit any of the strategic options.

  16. Modeling and Evaluation of Multimodal Perceptual Quality

    DEFF Research Database (Denmark)

    Petersen, Kim T; Hansen, Steffen Duus; Sørensen, John Aasted

    1997-01-01

    The increasing performance requirements of multimedia modalities, carrying speech, audio, video, image, and graphics emphasize the need for assessment methods of the total quality of a multimedia system and methods for simultaneous analysis of the system components. It is important to take...

  17. A qualitative readiness-requirements assessment model for enterprise big-data infrastructure investment

    Science.gov (United States)

    Olama, Mohammed M.; McNair, Allen W.; Sukumar, Sreenivas R.; Nutaro, James J.

    2014-05-01

    In the last three decades, there has been exponential growth in the area of information technology providing for the information processing needs of data-driven businesses in government, science, and private industry, in the form of capturing, staging, integrating, conveying, analyzing, and transferring data that help knowledge workers and decision makers make sound business decisions. Data integration across enterprise warehouses is one of the most challenging steps in a big data analytics strategy. Several levels of data integration have been identified across enterprise warehouses: data accessibility, common data platform, and consolidated data model. Each level of integration has its own set of complexities that requires a certain amount of time, budget, and resources to implement. Such levels of integration are designed to address the technical challenges inherent in consolidating disparate data sources. In this paper, we present a methodology based on industry best practices to measure the readiness of an organization and its data sets against the different levels of data integration. We introduce a new Integration Level Model (ILM) tool, which is used for quantifying an organization and data system's readiness to share data at a certain level of data integration. It is based largely on the established and accepted framework provided in the Data Management Association's Body of Knowledge (DAMA-DMBOK). It comprises several key data management functions and supporting activities, together with several environmental elements that describe and apply to each function. The proposed model scores the maturity of a system's data governance processes and provides a pragmatic methodology for evaluating integration risks. The higher the computed scores, the better managed the source data system and the greater the likelihood that the data system can be brought in at a higher level of integration.
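
    A scoring step of this kind typically reduces to a weighted average of per-function maturity ratings. The sketch below shows that arithmetic with invented functions, weights and ratings; it is not the ILM tool's actual rubric.

        def readiness_score(ratings, weights):
            # Weighted average of 1-5 maturity ratings across data-management functions.
            total_weight = sum(weights.values())
            return sum(ratings[f] * w for f, w in weights.items()) / total_weight

        weights = {"data governance": 3, "data quality": 2, "metadata": 1}
        ratings = {"data governance": 2, "data quality": 4, "metadata": 3}
        print(f"readiness = {readiness_score(ratings, weights):.2f} / 5")  # 2.83 / 5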

  18. Evaluation of Required Water Sources during Extended Loss of All AC Power for CANDU NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Jeon, Woo Jae; Lee, Kyung Jin; Kim, Min Ki; Kim, Keon Yeop; Park, Da Hee; Oh, Seo Bin [FNC Technology Co., Yongin (Korea, Republic of); Chang, Young Jin; Byun, Choong Seop [KHNP, Daejeon (Korea, Republic of)

    2016-10-15

    The Fukushima accident was caused by a prolonged Station Blackout (SBO) triggered by a natural disaster, and it resulted in reactor core damage. The purpose of this study is to evaluate the water sources required to maintain hot standby conditions for 72 hours during an extended loss of all AC power (ELAP). The analysis was performed with the CATHENA code, which has been developed for best-estimate transient simulation of CANDU plants. This study evaluated the strategy for maintaining hot standby conditions during an ELAP in CANDU reactors. In this analysis, water was supplied to the steam generators (SGs) by opening the main steam safety valves (MSSVs) and by gravity feed. This can cool the core without damage until the dousing tank is depleted. Before dousing tank depletion, the emergency water supply pump became available through emergency power restoration and continuously fed water to the SGs. It is therefore expected that the reactor core can be cooled without damage for 72 hours if the water source is sufficient. This result is useful for developing a strategy against SBO, including ELAP situations.
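
    The scale of the water-source question can be checked with back-of-envelope boil-off arithmetic: integrate decay heat over 72 hours and divide by the latent heat of vaporization. The sketch below does this under loudly labeled assumptions (a 2000 MWt core, the simple Way-Wigner decay-heat fraction, saturated steam venting through the MSSVs); it is only an order-of-magnitude illustration, not the CATHENA analysis.

```python
# Back-of-envelope boil-off estimate; all values are illustrative assumptions,
# not taken from the paper's CATHENA model.
P_RATED = 2000e6      # W, assumed rated thermal power
H_FG = 2.0e6          # J/kg, approximate latent heat at SG pressure
HORIZON = 72 * 3600   # s

def decay_fraction(t_s):
    """Way-Wigner approximation of decay power as a fraction of rated power."""
    return 0.066 * t_s ** -0.2

# A simple running sum with 60 s steps is plenty for an estimate
dt = 60
energy = sum(P_RATED * decay_fraction(t) * dt for t in range(dt, HORIZON + 1, dt))

water_tonnes = energy / H_FG / 1000.0
print(f"decay energy ~ {energy / 1e12:.1f} TJ "
      f"-> ~ {water_tonnes:,.0f} t of water boiled off over 72 h")
```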

  19. Estimates of nutritional requirements and use of Small Ruminant Nutrition System model for hair sheep in semiarid conditions

    Directory of Open Access Journals (Sweden)

    Alessandra Pinto de Oliveira

    2014-09-01

    The objective was to determine the efficiency of utilization of metabolizable energy for maintenance (km) and weight gain (kf) and the dietary requirements of total digestible nutrients (TDN) and metabolizable protein (MP), as well as to evaluate the Small Ruminant Nutrition System (SRNS) model for predicting the dry matter intake (DMI) and average daily gain (ADG) of Santa Ines lambs fed diets containing different levels of metabolizable energy (ME). Thirty-five non-castrated lambs, with an initial body weight (BW) of 14.77 ± 1.26 kg at approximately two months of age, were used. At the beginning of the experiment, five animals were slaughtered to serve as a reference for estimating the empty body weight (EBW) and initial body composition of the 30 remaining animals, which were distributed in a randomized block design with five treatments (1.13, 1.40, 1.73, 2.22, and 2.60 Mcal/kg DM) and six replicates. The metabolizable energy requirement for maintenance was 78.53 kcal/kg EBW^0.75/day, with a utilization efficiency of 66%. The average efficiency of metabolizable energy utilization for weight gain was 48%. The dietary requirements of TDN and MP increased with the BW and ADG of the animals. The SRNS model underestimated the DMI and ADG of the animals by 6.2% and 24.6%, respectively. We conclude that the values of km and kf are consistent with those observed in several studies with lambs raised in the tropics. The dietary requirements of TDN and MP of Santa Ines lambs for different BW and ADG are approximately 42% and 24% lower, respectively, than those suggested by the American system for evaluating food and nutrient requirements of small ruminants. The SRNS model predicted DMI in Santa Ines lambs adequately; however, for ADG, more studies are needed, since the model underestimated the response of the animals in this study.

  20. Evaluating Energy Efficiency Policies with Energy-Economy Models

    Energy Technology Data Exchange (ETDEWEB)

    Mundaca, Luis; Neij, Lena; Worrell, Ernst; McNeil, Michael A.

    2010-08-01

    The growing complexities of energy systems, environmental problems, and technology markets are driving and testing most energy-economy models to their limits. To further advance bottom-up models from a multidisciplinary energy efficiency policy evaluation perspective, we review and critically analyse bottom-up energy-economy models and corresponding evaluation studies on energy efficiency policies to induce technological change. We use the household sector as a case study. Our analysis focuses on decision frameworks for technology choice, the type of evaluation being carried out, the treatment of market and behavioural failures, the policy instruments evaluated, and the key determinants used to mimic policy instruments. Although the review confirms criticism of energy-economy models (e.g. unrealistic representation of decision-making by consumers when choosing technologies), these models provide valuable guidance for policy evaluation related to energy efficiency. Different areas to further advance the models remain open, particularly related to modelling issues, techno-economic and environmental aspects, behavioural determinants, and policy considerations.

  1. Capital Regulation, Liquidity Requirements and Taxation in a Dynamic Model of Banking

    NARCIS (Netherlands)

    Di Nicolo, G.; Gamba, A.; Lucchetta, M.

    2011-01-01

    This paper formulates a dynamic model of a bank exposed to both credit and liquidity risk, which can resolve financial distress in three costly forms: fire sales, bond issuance and equity issuance. We use the model to analyze the impact of capital regulation, liquidity requirements and taxation on

  3. Evaluation of the Trade Space Between UAS Maneuver Performance and SAA System Performance Requirements

    Science.gov (United States)

    Jack, Devin P.; Hoffler, Keith D.; Johnson, Sally C.

    2014-01-01

    A need exists to safely integrate Unmanned Aircraft Systems (UAS) into the National Airspace System. Replacing a manned aircraft's see-and-avoid capability in the absence of an onboard pilot is one of the key challenges associated with safe integration. Sense-and-avoid (SAA) systems will have to achieve yet-to-be-determined required separation distances for a wide range of encounters. They will also need to account for the maneuver performance of the UAS they are paired with. The work described in this paper is aimed at developing an understanding of the trade space between UAS maneuver performance and SAA system performance requirements. An assessment of current manned and unmanned aircraft performance was used to establish potential UAS performance test matrix bounds, and near-term UAS integration work was then used to narrow down the scope. A simulator was developed with sufficient fidelity to assess SAA system performance requirements for a wide range of encounters. The simulator generates closest-point-of-approach (CPA) data from a wide range of UAS performance models maneuvering against a single intruder with various encounter geometries. The simulator is described herein and has both a graphical user interface and a batch interface to support detailed analysis of individual UAS encounters and macro analysis of a very large set of UAS and encounter models, respectively. Results from the simulator using approximate performance data from a well-known manned aircraft are presented to provide insight into the problem and as verification and validation of the simulator. Analysis of climb, descent, and level turn maneuvers to avoid a collision is presented. Noting the diversity of backgrounds in the UAS community, a description of the UAS aerodynamic and propulsive design and performance parameters is included. Initial attempts to model the results made it clear that developing maneuver performance groups is required. Discussion of the performance groups developed and how

  4. evaluation of models for assessing groundwater vulnerability

    African Journals Online (AJOL)

    DR. AMINU

    applied models for groundwater vulnerability assessment mapping. The approaches .... The overall 'pollution potential' or DRASTIC index is established by applying the formula: DRASTIC Index: ... affected by the structure of the soil surface.
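
    The DRASTIC formula elided in the snippet above is, in its standard form, a weighted sum of seven site ratings. The sketch below uses the conventional DRASTIC weights; the example site ratings are invented.

```python
# Standard DRASTIC index: sum of rating (1-10) x weight over seven parameters.
# The weights are the conventional DRASTIC weights; the ratings are invented.
DRASTIC_WEIGHTS = {
    "Depth to water": 5,
    "net Recharge": 4,
    "Aquifer media": 3,
    "Soil media": 2,
    "Topography": 1,
    "Impact of vadose zone": 5,
    "hydraulic Conductivity": 3,
}

def drastic_index(ratings):
    """DRASTIC index = sum over the seven parameters of rating x weight."""
    return sum(ratings[p] * w for p, w in DRASTIC_WEIGHTS.items())

example_ratings = {p: 5 for p in DRASTIC_WEIGHTS}  # invented site ratings
print(drastic_index(example_ratings))  # possible range: 23 (min) to 230 (max)
```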

  5. Models for Evaluating and Improving Architecture Competence

    National Research Council Canada - National Science Library

    Bass, Len; Clements, Paul; Kazman, Rick; Klein, Mark

    2008-01-01

    ... producing high-quality architectures. This report lays out the basic concepts of software architecture competence and describes four models for explaining, measuring, and improving the architecture competence of an individual...

  6. The CREATIVE Decontamination Performance Evaluation Model

    National Research Council Canada - National Science Library

    Shelly, Erin E

    2008-01-01

    The project objective is to develop a semi-empirical, deterministic model to characterize and predict laboratory-scale decontaminant efficacy and hazards for a range of: chemical agents (current focus on HD...

  7. Evaluation of Model Wheat/Hemp Composites

    Directory of Open Access Journals (Sweden)

    Ivan Švec

    2014-02-01

    Model cereal blends were prepared from commercial fine wheat flour and 5 samples of hemp flour (HF), including fine flour (2 conventional, 1 organic) and wholemeal flour (2 conventional). Wheat flour was substituted at 4 levels (5, 10, 15, 20%). HF addition increased protein content independently of the hemp flour form or type tested. Partial model cereal blends could be distinguished according to protein quality (Zeleny test values), especially between the fine and wholemeal HF types. Both flour types also affected amylolytic activity, for which a relationship between hemp addition and the Falling Number was confirmed for all five model cereal blends. Solvent retention capacity (SRC) profiles of the partial models were influenced by HF form and type as well as by the addition level. Between the two groups of quality features, significant correlations were found; relationships between protein content/quality and lactic acid SRC were significant at p < 0.01 (r = -0.58 and 0.91, respectively). ANOVA demonstrated that the HF form used in a model cereal blend can be distinguished according to the lactic acid SRC and the water SRC. Comparing partial cereal models containing the fine and wholemeal hemp types, the HF addition level affected the sodium carbonate SRC and the water SRC.

  8. Evaluation model development for sprinkler irrigation uniformity ...

    African Journals Online (AJOL)

    use

    Sprinkle and trickle irrigation. The Blackburn Press, New Jersey, USA. Li JS, Rao MJ (1999). Evaluation method of sprinkler irrigation nonuniformity. Trans. CSAE 15(4): 78-82. Lin Z, Merkley GP (2011). Relationships between common irrigation application uniformity indicators. Irrig. Sci., Online First, 27 January 2011.

  9. Evaluation of attenuation and scatter correction requirements in small animal PET and SPECT imaging

    Science.gov (United States)

    Konik, Arda Bekir

    Positron emission tomography (PET) and single photon emission computed tomography (SPECT) are two nuclear emission-imaging modalities that rely on the detection of high-energy photons emitted from radiotracers administered to the subject. The majority of these photons are attenuated (absorbed or scattered) in the body, resulting in count losses or deviations from true detection, which in turn degrades the accuracy of images. In clinical emission tomography, sophisticated correction methods are often required, employing additional x-ray CT or radionuclide transmission scans. Having proven their potential in both clinical and research areas, both PET and SPECT are being adapted for small animal imaging. However, despite the growing interest in small animal emission tomography, little scientific information exists about the accuracy of these correction methods on smaller size objects, and what level of correction is required. The purpose of this work is to determine the role of attenuation and scatter corrections as a function of object size through simulations. The simulations were performed using Interactive Data Language (IDL) and a Monte Carlo based package, the Geant4 Application for Tomographic Emission (GATE). In IDL simulations, PET and SPECT data acquisition were modeled in the presence of attenuation. A mathematical emission and attenuation phantom approximating a thorax slice and slices from real PET/CT data were scaled to 5 different sizes (i.e., human, dog, rabbit, rat and mouse). The simulated emission data collected from these objects were reconstructed. The reconstructed images, with and without attenuation correction, were compared to the ideal (i.e., non-attenuated) reconstruction. Next, using GATE, scatter fraction values (the ratio of the scatter counts to the total counts) of PET and SPECT scanners were measured for various sizes of NEMA (cylindrical phantoms representing small animals and human), MOBY (realistic mouse/rat model) and XCAT (realistic human model
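
    The size dependence that motivates the study can be previewed with simple Beer-Lambert arithmetic: the fraction of photons escaping unattenuated falls off exponentially with path length. The sketch below uses textbook water attenuation coefficients and rough stand-in diameters for the five object sizes; it is only an illustration, not the IDL/GATE simulation.

```python
import math

# Beer-Lambert illustration of why attenuation-correction requirements shrink
# with object size. Attenuation coefficients are textbook values for water
# (~0.096 /cm at 511 keV, ~0.154 /cm at 140 keV); diameters are rough
# stand-ins, not the study's phantom dimensions.
MU = {"PET (511 keV)": 0.096, "SPECT (140 keV)": 0.154}  # 1/cm in water
DIAMETERS = {"mouse": 3.0, "rat": 5.0, "rabbit": 12.0, "dog": 20.0, "human": 30.0}  # cm

for modality, mu in MU.items():
    for subject, d in DIAMETERS.items():
        surviving = math.exp(-mu * d)  # fraction escaping along a full diameter
        print(f"{modality:15s} {subject:6s} D={d:4.1f} cm -> {surviving:6.1%} survive")
```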

  10. Connecting Requirements to Architecture and Analysis via Model-Based Systems Engineering

    Science.gov (United States)

    Cole, Bjorn F.; Jenkins, J. Steven

    2015-01-01

    In traditional systems engineering practice, architecture, concept development, and requirements development are related but still separate activities. Concepts for operation, key technical approaches, and related proofs of concept are developed. These inform the formulation of an architecture at multiple levels, starting with the overall system composition and functionality and progressing into more detail. As this formulation is done, a parallel activity develops a set of English statements that constrain solutions. These requirements are often called "shall statements" since they are formulated to use "shall." The separation of requirements from design is exacerbated by well-meaning tools like the Dynamic Object-Oriented Requirements System (DOORS) that remain separate from engineering design tools. With the Europa Clipper project, efforts are being taken to change the requirements development approach from a separate activity to one intimately embedded in the formulation effort. This paper presents a modeling approach and related tooling to generate English requirement statements from constraints embedded in architecture definition.

  11. The Nuremberg Code subverts human health and safety by requiring animal modeling

    Directory of Open Access Journals (Sweden)

    Greek Ray

    2012-07-01

    Background: The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion: We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive value of animal models when used as test subjects for human response to drugs and disease, and we explore the use of animals as models in toxicity testing as an example of the problem with using animal models. Summary: We conclude that the requirements for animal testing found in the Nuremberg Code were based on scientifically outdated principles, were compromised by people with a vested interest in animal experimentation, serve no useful function, increase the cost of drug development, and prevent otherwise safe and efficacious drugs and therapies from being implemented.

  12. [Decision modeling for economic evaluation of health technologies].

    Science.gov (United States)

    de Soárez, Patrícia Coelho; Soares, Marta Oliveira; Novaes, Hillegonda Maria Dutilh

    2014-10-01

    Most economic evaluations that participate in decision-making processes for incorporation and financing of technologies of health systems use decision models to assess the costs and benefits of the compared strategies. Despite the large number of economic evaluations conducted in Brazil, there is a pressing need to conduct an in-depth methodological study of the types of decision models and their applicability in our setting. The objective of this literature review is to contribute to the knowledge and use of decision models in the national context of economic evaluations of health technologies. This article presents general definitions about models and concerns with their use; it describes the main models: decision trees, Markov chains, micro-simulation, simulation of discrete and dynamic events; it discusses the elements involved in the choice of model; and exemplifies the models addressed in national economic evaluation studies of diagnostic and therapeutic preventive technologies and health programs.

  13. A qualitative evaluation approach for energy system modelling frameworks

    DEFF Research Database (Denmark)

    Wiese, Frauke; Hilpert, Simon; Kaldemeyer, Cord

    2018-01-01

    properties define how useful it is in regard to the existing challenges. For energy system models, evaluation methods exist, but we argue that many decisions upon properties are rather made on the model generator or framework level. Thus, this paper presents a qualitative approach to evaluate frameworks...

  14. Simulation of electric power conservation strategies: model of economic evaluation

    International Nuclear Information System (INIS)

    Pinhel, A.C.C.

    1992-01-01

    A methodology is presented for the economic evaluation of energy conservation programs to be executed by the National Program of Electric Power Conservation. From data such as forecasts of conserved energy, tariffs, energy costs, and budget, the model calculates the economic indexes for the programs, allowing the evaluation of economic impacts on the electric sector. (C.G.C.)

  15. The Use of AMET & Automated Scripts for Model Evaluation

    Science.gov (United States)

    Brief overview of EPA's new CMAQ website to be launched publicly in June 2017. Details on the upcoming release of the Atmospheric Model Evaluation Tool (AMET) and the creation of automated scripts for post-processing and evaluating air quality model data.

  16. Modelling in Evaluating a Working Life Project in Higher Education

    Science.gov (United States)

    Sarja, Anneli; Janhonen, Sirpa; Havukainen, Pirjo; Vesterinen, Anne

    2012-01-01

    This article describes an evaluation method based on collaboration between a higher education institution, a care home, and a university in an R&D project. The aim of the project was to elaborate modelling as a tool of developmental evaluation for innovation and competence in project cooperation. The approach was based on activity theory. Modelling enabled a…

  17. Evaluation of the Cable Types for Safety Requirements during Fire Conditions in Nuclear Facility

    International Nuclear Information System (INIS)

    Al-kattan, W.A.

    2015-01-01

    In Nuclear Power Plants (NPPs), cable fires can cause many dangerous events in electrical or mechanical operations, up to and including reactor core meltdown. The Main Control Room (MCR) in nuclear power plants therefore receives special attention in fire protection systems. The International Atomic Energy Agency (IAEA) has promoted the use of risk-informed and performance-based methods for fire protection. These methods affirm the need to develop realistic methods for quantifying the risk that fire poses to NPP safety. Recent electrical cable testing has been carried out to provide empirical data on the failure modes and likelihood of fire-induced damage. This thesis uses fire modeling to develop a fire probabilistic safety assessment that estimates the likelihood of fire-induced cable damage for a specified fire profile. The objective of this work is to provide a comprehensive evaluation, by simulation using fire modeling, of recent fire-induced circuit failures for the different cable types used inside NPPs. Part of the scope is to suggest a suitable cable insulation material, especially with respect to thermal failure thresholds, to develop electrical cable thermal fragility distributions, and to evaluate parameters that influence fire-induced circuit failure modes. The main control room is simulated using CFAST (a fire simulation package). The simulation results represent the fully developed fire temperature and heat flux as well as the output gases, and they serve as the basic parameters for comparing and then evaluating the cables. These results are important not only for evaluating cables; they can also be used to improve overall NPP safety levels. The gas outputs of the fire simulation inside the room are oxygen, carbon monoxide, carbon dioxide, and hydrogen chloride, which are used in assessing health protection in NPPs. Finally, one can measure the healthy environment

  18. Evaluation of Satellite and Model Precipitation Products Over Turkey

    Science.gov (United States)

    Yilmaz, M. T.; Amjad, M.

    2017-12-01

    Satellite-based remote sensing, gauge stations, and models are the three major platforms for acquiring precipitation datasets. Among them, satellites and models have the advantage of retrieving spatially and temporally continuous and consistent datasets, while uncertainty estimates of these retrievals are often required in hydrological studies to understand the source and magnitude of uncertainty in hydrological response parameters. In this study, satellite and model precipitation products are validated over various temporal scales (daily, 3-daily, 7-daily, 10-daily, and monthly) using in-situ precipitation observations from a network of 733 gauges all over Turkey. Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) 3B42 version 7 and European Centre for Medium-Range Weather Forecasts (ECMWF) model estimates (daily, 3-daily, 7-daily, and 10-daily accumulated forecasts) are used in this study. Retrievals are evaluated for their mean and standard deviation, and their accuracies are evaluated via bias, root mean square error, error standard deviation, and correlation coefficient statistics. Intensity-frequency analysis and contingency-table statistics such as percent correct, probability of detection, false alarm ratio, and critical success index are determined using daily time series. Both ECMWF forecasts and TRMM observations, on average, overestimate precipitation compared to gauge estimates; the wet biases are 10.26 mm/month and 8.65 mm/month for ECMWF and TRMM, respectively. RMSE values of ECMWF forecasts and TRMM estimates are 39.69 mm/month and 41.55 mm/month, respectively. Monthly correlations between Gauges-ECMWF, Gauges-TRMM, and ECMWF-TRMM are 0.76, 0.73, and 0.81, respectively. The model and satellite error statistics are further compared against the gauge error statistics based on inverse distance weighting (IDW) analysis. Both the model and satellite data have smaller IDW errors (14
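
    A minimal sketch of the evaluation statistics quoted above (bias, RMSE, correlation, plus two contingency-table scores), applied to synthetic gauge/retrieval series; the data, the wet bias magnitude, and the 1 mm wet/dry threshold are invented for illustration.

```python
import numpy as np

# Synthetic gauge/retrieval pair; the retrieval is given an artificial wet bias.
rng = np.random.default_rng(0)
gauge = rng.gamma(2.0, 20.0, size=120)            # "observed" precip, mm
satellite = gauge + rng.normal(8.65, 40.0, 120)   # retrieval with a wet bias

bias = np.mean(satellite - gauge)
rmse = np.sqrt(np.mean((satellite - gauge) ** 2))
corr = np.corrcoef(satellite, gauge)[0, 1]

# Contingency-table scores for a wet/dry threshold (here 1 mm)
wet_obs, wet_est = gauge > 1.0, satellite > 1.0
hits = np.sum(wet_obs & wet_est)
misses = np.sum(wet_obs & ~wet_est)
false_alarms = np.sum(~wet_obs & wet_est)
pod = hits / (hits + misses)                 # probability of detection
far = false_alarms / (hits + false_alarms)   # false alarm ratio

print(f"bias={bias:.2f} mm, RMSE={rmse:.2f} mm, r={corr:.2f}, "
      f"POD={pod:.2f}, FAR={far:.2f}")
```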

  19. Reuse-centric Requirements Analysis with Task Models, Scenarios, and Critical Parameters

    Directory of Open Access Journals (Sweden)

    Cyril Montabert

    2007-02-01

    This paper outlines a requirements-analysis process that unites task models, scenarios, and critical parameters to exploit and generate reusable knowledge at the requirements phase. Through the deployment of a critical-parameter-based approach to task modeling, the process yields an integrative and formalized model, derived from scenarios, that can be used for requirements characterization. Furthermore, not only can this entity serve as an interface to a knowledge repository relying on a critical-parameter-based taxonomy to support reuse, but its characterization in terms of critical parameters also allows the model to constitute a broader reuse solution. We discuss our vision for a user-centric and reuse-centric approach to requirements analysis, present previous efforts associated with this line of work, and describe the revisions made to extend the reuse potential and effectiveness of a previous iteration of a requirements tool implementing this process. Finally, the paper describes the sequence and nature of the activities involved in conducting our proposed requirements-analysis technique, concluding by previewing ongoing work that will explore the feasibility of designers using our approach.

  20. Evaluation of biological models using Spacelab

    Science.gov (United States)

    Tollinger, D.; Williams, B. A.

    1980-01-01

    Biological models of hypogravity effects are described, including the cardiovascular-fluid shift, musculoskeletal, embryological and space sickness models. These models predict such effects as loss of extracellular fluid and electrolytes, decrease in red blood cell mass, and the loss of muscle and bone mass in weight-bearing portions of the body. Experimentation in Spacelab by the use of implanted electromagnetic flow probes, by fertilizing frog eggs in hypogravity and fixing the eggs at various stages of early development and by assessing the role of the vestibulo-ocular reflex arc in space sickness is suggested. It is concluded that the use of small animals eliminates the uncertainties caused by corrective or preventive measures employed with human subjects.

  1. Shock circle model for ejector performance evaluation

    International Nuclear Information System (INIS)

    Zhu, Yinhai; Cai, Wenjian; Wen, Changyun; Li, Yanzhong

    2007-01-01

    In this paper, a novel shock circle model for predicting ejector performance at critical mode operation is proposed. By introducing the 'shock circle' at the entrance of the constant-area chamber, a 2D exponential expression for the velocity distribution is adopted to approximate the viscous flow near the ejector inner wall. The advantage of the 'shock circle' analysis is that the calculation of ejector performance is independent of the flows in the constant-area chamber and diffuser. Consequently, the calculation is even simpler than many 1D modeling methods and can predict the performance of critical mode operation ejectors much more accurately. The effectiveness of the method is validated against two experimental results reported earlier. The proposed modeling method, using two coefficients, is shown to produce entrainment ratio, efficiency, and coefficient of performance (COP) values that are accurate and much closer to experimental results than those of 1D analysis methods

  2. p-values for model evaluation

    International Nuclear Information System (INIS)

    Beaujean, F.; Caldwell, A.; Kollar, D.; Kroeninger, K.

    2011-01-01

    Deciding whether a model provides a good description of data is often based on a goodness-of-fit criterion summarized by a p-value. Although there is considerable confusion concerning the meaning of p-values, leading to their misuse, they are nevertheless of practical importance in common data analysis tasks. We motivate their application using a Bayesian argumentation. We then describe commonly and less commonly known discrepancy variables and how they are used to define p-values. The distributions of these variables are then extracted for examples modeled on typical data analysis tasks, and comments on their usefulness for determining goodness-of-fit are given.
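
    As a concrete instance of a discrepancy variable, the sketch below computes the familiar chi-square statistic for binned counts and its p-value, i.e. the probability of a discrepancy at least this large if the model is correct. The bin counts are invented, and this is the standard frequentist construction rather than any Bayesian-motivated variant discussed in the paper.

```python
import numpy as np
from scipy import stats

# Pearson's chi-square goodness-of-fit for binned counts (invented data);
# no parameters are fitted here, so ndf equals the number of bins.
observed = np.array([12, 19, 28, 41, 32, 18])
expected = np.array([10, 20, 30, 40, 30, 20])  # model prediction per bin

chi2 = np.sum((observed - expected) ** 2 / expected)
ndf = len(observed)
p_value = stats.chi2.sf(chi2, ndf)  # P(discrepancy >= observed | model true)
print(f"chi2 = {chi2:.2f}, ndf = {ndf}, p = {p_value:.3f}")
```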

  3. Center for Integrated Nanotechnologies (CINT) Chemical Release Modeling Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Stirrup, Timothy Scott [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-12-20

    This evaluation documents the methodology and results of chemical release modeling for operations at Building 518, Center for Integrated Nanotechnologies (CINT) Core Facility. This evaluation is intended to supplement an update to the CINT [Standalone] Hazards Analysis (SHA). This evaluation also updates the original [Design] Hazards Analysis (DHA) completed in 2003 during the design and construction of the facility; since the original DHA, additional toxic materials have been evaluated and modeled to confirm the continued low hazard classification of the CINT facility and operations. This evaluation addresses the potential catastrophic release of the current inventory of toxic chemicals at Building 518 based on a standard query in the Chemical Information System (CIS).

  4. Statistical modeling for visualization evaluation through data fusion.

    Science.gov (United States)

    Chen, Xiaoyu; Jin, Ran

    2017-11-01

    There is a high demand for data visualization that provides insights to users in various applications. However, a consistent, online visualization evaluation method to quantify mental workload or user preference has been lacking, which leads to an inefficient visualization and user interface design process. Recently, the advancement of interactive and sensing technologies has made electroencephalogram (EEG) signals, eye movements, and visualization logs available for user-centered evaluation. This paper proposes a data fusion model and an application procedure for quantitative and online visualization evaluation. Fifteen participants joined the study, based on three different visualization designs. The results provide a regularized regression model which can accurately predict the user's evaluation of task complexity, and they indicate the significance of all three types of sensing data for visualization evaluation. This model can be widely applied to data visualization evaluation, and to other user-centered design evaluation and data analysis in human factors and ergonomics.
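
    A sketch of the fusion idea: concatenate features from the three sensing streams and fit a regularized (here ridge) regression against rated task complexity. All features, dimensions, and data below are synthetic; the paper's actual feature extraction and regularizer may differ.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins for the three sensing streams.
rng = np.random.default_rng(1)
n = 45  # e.g., 15 participants x 3 visualization designs
eeg = rng.normal(size=(n, 8))    # e.g., band-power features
eye = rng.normal(size=(n, 4))    # e.g., fixation/saccade statistics
logs = rng.normal(size=(n, 3))   # e.g., interaction-log counts
X = np.hstack([eeg, eye, logs])
y = X @ rng.normal(size=X.shape[1]) + rng.normal(scale=0.5, size=n)  # toy target

model = Ridge(alpha=1.0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f} +/- {scores.std():.2f}")
```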

  5. Evaluation Model of Tea Industry Information Service Quality

    OpenAIRE

    Shi , Xiaohui; Chen , Tian’en

    2015-01-01

    According to the characteristics of tea industry information services, this paper builds a service quality evaluation index system for tea industry information service quality; R-cluster analysis and multiple regression are used together to construct an evaluation model with high practicality and credibility. As demonstrated by experiment, the information service quality evaluation model has good precision, which provides guidance to a certain extent to e...

  6. Evaluating Models of Human Performance: Safety-Critical Systems Applications

    Science.gov (United States)

    Feary, Michael S.

    2012-01-01

    This presentation is part of a panel discussion on Evaluating Models of Human Performance. The purpose of this panel is to discuss the increasing use of models in the world today and specifically to focus on how to describe and evaluate models of human performance. My presentation will focus on generating distributions of performance and on the evaluation of different strategies for humans performing tasks with mixed-initiative (human-automation) systems. I will also discuss issues in providing human performance modeling data to support decisions on acceptability and tradeoffs in the design of safety-critical systems. I will conclude with challenges for the future.

  7. Evaluating a Model of Youth Physical Activity

    Science.gov (United States)

    Heitzler, Carrie D.; Lytle, Leslie A.; Erickson, Darin J.; Barr-Anderson, Daheia; Sirard, John R.; Story, Mary

    2010-01-01

    Objective: To explore the relationship between social influences, self-efficacy, enjoyment, and barriers and physical activity. Methods: Structural equation modeling examined relationships between parent and peer support, parent physical activity, individual perceptions, and objectively measured physical activity using accelerometers among a…

  8. An evaluation of uncertainties in radioecological models

    International Nuclear Information System (INIS)

    Hoffmann, F.O.; Little, C.A.; Miller, C.W.; Dunning, D.E. Jr.; Rupp, E.M.; Shor, R.W.; Schaeffer, D.L.; Baes, C.F. III

    1978-01-01

    The paper presents results of analyses for seven selected parameters commonly used in environmental radiological assessment models, assuming that the available data are representative of the true distribution of parameter values and that their respective distributions are lognormal. Estimates of the most probable, median, mean, and 99th percentile for each parameter are given and compared to U.S. NRC default values. The regulatory default values are generally greater than the median values for the selected parameters, but some are associated with percentiles significantly less than the 50th. The largest uncertainties appear to be associated with aquatic bioaccumulation factors for freshwater fish: approximately one order of magnitude separates median values and values of the 99th percentile. The uncertainty is also estimated for the annual dose rate predicted by a multiplicative chain model for the transport of molecular iodine-131 via the air-pasture-cow-milk-child's thyroid pathway. The value for the 99th percentile is ten times larger than the median value of the predicted dose normalized for a given air concentration of ¹³¹I₂. About 72% of the uncertainty in this model is contributed by the dose conversion factor and the milk transfer coefficient. Considering the difficulties in obtaining a reliable quantification of the true uncertainties in model predictions, methods for taking these uncertainties into account when determining compliance with regulatory statutes are discussed. (orig./HP)
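
    The "one order of magnitude" statement maps directly onto lognormal arithmetic: the ratio of the 99th percentile to the median is exp(z99 · σ), so a ratio of 10 pins down the log-space σ and the geometric standard deviation. A minimal sketch of that arithmetic:

```python
import math
from statistics import NormalDist

# For a lognormal quantity, p99 / median = exp(z99 * sigma).
z99 = NormalDist().inv_cdf(0.99)  # ~2.326

sigma = math.log(10) / z99        # sigma implied by a factor-10 ratio
gsd = math.exp(sigma)             # geometric standard deviation
print(f"sigma = {sigma:.2f}, GSD = {gsd:.2f}")
print(f"check: p99/median = {math.exp(z99 * sigma):.1f}")
```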

  9. Evaluation of Uncertainties in hydrogeological modeling and groundwater flow analyses. Model calibration

    International Nuclear Information System (INIS)

    Ijiri, Yuji; Ono, Makoto; Sugihara, Yutaka; Shimo, Michito; Yamamoto, Hajime; Fumimura, Kenichi

    2003-03-01

    This study evaluates uncertainty in hydrogeological modeling and groundwater flow analysis. Three-dimensional groundwater flow at the Shobasama site in Tono was analyzed using two continuum models and one discontinuum model. The study domain covered an area four kilometers east-west by six kilometers north-south. Moreover, to evaluate how the uncertainties in the hydrogeological structure models and in the groundwater simulation results decreased as the investigation progressed, the models were updated and calibrated for several hydrogeological-structure modeling techniques and groundwater flow analysis techniques, based on newly acquired information and knowledge. The findings are as follows. When parameters and structures were updated to reflect the previous year's conditions, there was no major difference in handling between the modeling methods. Model calibration was performed by matching numerical simulations to observations of the pressure response caused by opening and closing a packer in the MIU-2 borehole. Each analysis technique reduced the residual sum of squares between observations and numerical simulation results by adjusting hydrogeological parameters. However, each model adjusted different parameters, such as hydraulic conductivity, effective porosity, specific storage, and anisotropy. When calibrating models, it is sometimes impossible to explain the phenomena by adjusting parameters alone; in such cases, further investigation may be required to clarify the details of the hydrogeological structure. Comparing the research from its beginning to this year, the following conclusions about the investigation are obtained. (1) Transient hydraulic data are an effective means of reducing the uncertainty of the hydrogeological structure. (2) Effective porosity for calculating pore water velocity of

  10. A COMPARISON OF SEMANTIC SIMILARITY MODELS IN EVALUATING CONCEPT SIMILARITY

    Directory of Open Access Journals (Sweden)

    Q. X. Xu

    2012-08-01

    Semantic similarities are important in concept definition, recognition, categorization, interpretation, and integration. Many semantic similarity models have been established to evaluate the semantic similarities of objects and/or concepts. To find out the suitability and performance of different models in evaluating concept similarities, we compare four main types of models in this paper: the geometric model, the feature model, the network model, and the transformational model. The fundamental principles and main characteristics of these models are first introduced and compared. Land use and land cover concepts of NLCD92 are employed as examples in the case study. The results demonstrate that the correlations between these models are very high, possibly because all these models are designed to simulate the similarity judgment of the human mind.
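
    As a concrete instance of one of the four model families, the sketch below implements the feature-based Tversky ratio model, in which similarity depends on shared versus distinctive features. The land-cover feature sets and the alpha/beta weights are invented for illustration and are not the NLCD92 definitions.

```python
# Tversky ratio model: sim(A, B) = |A&B| / (|A&B| + alpha*|A-B| + beta*|B-A|).
def tversky(a, b, alpha=0.5, beta=0.5):
    common = len(a & b)
    return common / (common + alpha * len(a - b) + beta * len(b - a))

# Invented feature sets for three land-cover concepts.
deciduous = {"vegetated", "woody", "seasonal_leaf_loss"}
evergreen = {"vegetated", "woody", "year_round_leaves"}
row_crops = {"vegetated", "cultivated", "seasonal"}

print(tversky(deciduous, evergreen))  # higher: two shared features
print(tversky(deciduous, row_crops))  # lower: only one shared feature
```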

  11. A MULTILAYER BIOCHEMICAL DRY DEPOSITION MODEL 2. MODEL EVALUATION

    Science.gov (United States)

    The multilayer biochemical dry deposition model (MLBC) described in the accompanying paper was tested against half-hourly eddy correlation data from six field sites under a wide range of climate conditions with various plant types. Modeled CO2, O3, SO2 ...

  12. RTMOD: Real-Time MODel evaluation

    DEFF Research Database (Denmark)

    Graziani, G.; Galmarini, S.; Mikkelsen, Torben

    2000-01-01

    the RTMOD web page for detailed information on the actual release, and as soon as possible they then uploaded their predictions to the RTMOD server and could soon after start their inter-comparison analysis with other modellers. When additional forecast data arrived, already existing statistical results.... At that time, the World Wide Web was not available to all the exercise participants, and plume predictions were therefore submitted to JRC-Ispra by fax and regular mail for subsequent processing. The rapid development of the World Wide Web in the second half of the nineties, together with the experience gained during the ETEX exercises, suggested the development of this project. RTMOD featured a web-based user-friendly interface for data submission and an interactive program module for displaying, intercomparison and analysis of the forecasts. RTMOD has focussed on model intercomparison of concentration

  13. An Overview of Westinghouse Realistic Large Break LOCA Evaluation Model

    Directory of Open Access Journals (Sweden)

    Cesare Frepoli

    2008-01-01

    Since the 1988 amendment of the 10 CFR 50.46 rule, Westinghouse has been developing and applying realistic or best-estimate methods to perform LOCA safety analyses. A realistic analysis requires the execution of various realistic LOCA transient simulations where the effects of both model and input uncertainties are ranged and propagated throughout the transients. The outcome is typically a range of results with associated probabilities. The thermal-hydraulic code is the engine of the methodology, and a procedure is developed to assess the code and determine its biases and uncertainties. In addition, inputs to the simulation are also affected by uncertainty, and these uncertainties are incorporated into the process. Several approaches have been proposed and applied in the industry in the framework of best-estimate methods. Most of the implementations, including Westinghouse's, follow the Code Scaling, Applicability and Uncertainty (CSAU) methodology. The Westinghouse methodology is based on the use of the WCOBRA/TRAC thermal-hydraulic code. The paper starts with an overview of the regulations and their interpretation in the context of realistic analysis. The CSAU roadmap is reviewed in the context of its implementation in the Westinghouse evaluation model. An overview of the code (WCOBRA/TRAC) and methodology is provided. Finally, the recent evolution to nonparametric statistics in the current edition of the Westinghouse methodology is discussed. Sample results of a typical large break LOCA analysis for a PWR are provided.
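
    The move to nonparametric statistics in best-estimate LOCA methods is commonly built on order-statistics (Wilks-type) tolerance limits: the number of code runs n is the smallest value for which the maximum of n runs bounds the γ-quantile with confidence β, i.e. 1 − γⁿ ≥ β. The sketch below illustrates that general rule; it is not necessarily Westinghouse's exact implementation.

```python
import math

# First-order one-sided Wilks rule: smallest n with 1 - gamma**n >= beta.
def wilks_runs(gamma=0.95, beta=0.95):
    return math.ceil(math.log(1.0 - beta) / math.log(gamma))

print(wilks_runs())  # 59 runs for a one-sided 95%/95% tolerance limit
```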

  14. A Descriptive Evaluation of Software Sizing Models

    Science.gov (United States)

    1987-09-01

    [The record excerpt is table-of-contents residue; the recoverable content indicates the report describes and evaluates software sizing approaches including SPQR SIZER/FP, QSM Size Planner (Function Points), and ASSET-R, with an SPQR function point estimate applied to the CATSS Sensitivity Model.]

  15. Automated expert modeling for automated student evaluation.

    Energy Technology Data Exchange (ETDEWEB)

    Abbott, Robert G.

    2006-01-01

    The 8th International Conference on Intelligent Tutoring Systems provides a leading international forum for the dissemination of original results in the design, implementation, and evaluation of intelligent tutoring systems and related areas. The conference draws researchers from a broad spectrum of disciplines ranging from artificial intelligence and cognitive science to pedagogy and educational psychology. The conference explores the increasing real-world impact of intelligent tutoring systems on an increasingly global scale. Improved authoring tools and learning object standards enable fielding systems and curricula in real-world settings on an unprecedented scale. Researchers deploy ITSs in ever larger studies and increasingly use data from real students, tasks, and settings to guide new research. With high volumes of student interaction data, data mining, and machine learning, tutoring systems can learn from experience and improve their teaching performance. The increasing number of realistic evaluation studies also broadens researchers' knowledge about the educational contexts for which ITSs are best suited. At the same time, researchers explore how to expand and improve ITS-student communications, for example, how to achieve more flexible and responsive discourse with students, help students integrate Web resources into learning, use mobile technologies and games to enhance student motivation and learning, and address multicultural perspectives.

  16. Global climate targets and future consumption level: an evaluation of the required GHG intensity

    International Nuclear Information System (INIS)

    Girod, Bastien; Van Vuuren, Detlef Peter; Hertwich, Edgar G

    2013-01-01

    Discussion and analysis of international climate policy often focuses on the rather abstract level of total national and regional greenhouse gas (GHG) emissions. At some point, however, emission reductions need to be translated to the consumption level. In this article, we evaluate the implications of the strictest IPCC representative concentration pathway for key consumption categories (food, travel, shelter, goods, services). We use IPAT-style identities to account for possible growth in global consumption levels and indicate the required change in GHG emission intensity for each category (i.e., GHG emissions per calorie, person-kilometer, square meter, kilogram, or US dollar). The proposed concept provides guidance for product developers, consumers, and policymakers. To reach the 2 °C climate target (2.1 tCO2-eq. per capita in 2050), the GHG emission intensity of consumption has to be reduced by a factor of 5 by 2050. Climate targets expressed at the consumption level allow the feasibility of this climate target to be discussed at the product and consumption level. In most consumption categories, products in line with this climate target are available. For animal food and air travel, reaching the GHG intensity targets with product modifications alone will be challenging, and structural changes in consumption patterns might therefore be needed. The concept opens up possibilities for further research on potential solutions at the consumption and product level for global climate mitigation. (letter)
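
    The "factor of 5" follows from IPAT-style arithmetic: the required intensity change equals (target per-capita emissions / current per-capita emissions) divided by the growth factor of per-capita consumption. The sketch below reproduces the order of magnitude under stated assumptions; the current-emissions value and the consumption-growth factor are illustrative, while the paper derives category-specific values.

```python
# IPAT-style back-of-envelope; two of the three inputs are assumptions.
current_emissions = 6.8   # tCO2-eq per capita, illustrative present-day value
target_emissions = 2.1    # tCO2-eq per capita in 2050, from the pathway above
consumption_growth = 1.6  # assumed growth of per-capita consumption by 2050

intensity_ratio = (target_emissions / current_emissions) / consumption_growth
print(f"GHG intensity must fall to {intensity_ratio:.2f} of today's level, "
      f"i.e. roughly a factor {1 / intensity_ratio:.1f} reduction")
```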

  17. Coalbed methane : evaluating pipeline and infrastructure requirements to get gas to market

    International Nuclear Information System (INIS)

    Murray, B.

    2005-01-01

    This PowerPoint presentation evaluated pipeline and infrastructure requirements for the economic production of coalbed methane (CBM) gas. Reports have suggested that capital costs for CBM production can be minimized by leveraging existing oil and gas infrastructure. By using existing plant facilities, CBM producers can then tie in to existing gathering systems and negotiate third party fees, which are less costly than building new pipelines. Many CBM wells can be spaced at an equal distance to third party gathering systems and regulated transmission meter stations and pipelines. Facility cost sharing, and contracts with pipeline companies for compression, can also lower initial infrastructure costs. However, transmission pressures and direct connect options for local distribution should always be considered during negotiations. The use of carbon dioxide (CO2) commingling services was also recommended. A map of the North American gas network was provided, as well as details of Alberta gas transmission and coal pipeline overlays. Maps of various coal zones in Alberta were provided, as well as a map of North American pipelines. refs., tabs., figs.

  18. The ACCENT-protocol: a framework for benchmarking and model evaluation

    Directory of Open Access Journals (Sweden)

    V. Grewe

    2012-05-01

    We summarise results from a workshop on "Model Benchmarking and Quality Assurance" of the EU Network of Excellence ACCENT, including results from other activities (e.g. COST Action 732) and publications. A formalised evaluation protocol is presented, i.e. a generic formalism describing the procedure of how to perform a model evaluation. This includes eight steps, and examples from global model applications are given for illustration. The first and most important step concerns the purpose of the model application, i.e. the addressed underlying scientific or political question. We give examples to demonstrate that there is no model evaluation per se, i.e. without a focused purpose. Model evaluation is testing whether a model is fit for its purpose. The following steps are deduced from the purpose and include model requirements, input data, key processes and quantities, benchmark data, quality indicators, sensitivities, as well as benchmarking and grading. We define "benchmarking" as the process of comparing the model output against either observational data or high fidelity model data, i.e. benchmark data. Special focus is given to the uncertainties, e.g. in observational data, which have the potential to lead to wrong conclusions in the model evaluation if not considered carefully.

  20. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-11-01

    The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply them to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: the Generalized Likelihood Uncertainty Estimation (GLUE) and the Shuffled Complex Evolution Metropolis (SCEM) algorithms. Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.
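
    One of the simplest probabilistic verification tools referred to above is the rank histogram, which checks whether observations are statistically indistinguishable from ensemble members (a flat histogram indicates reliable ensemble spread). A sketch on synthetic data:

```python
import numpy as np

# Rank histogram on synthetic data: observations drawn from the same
# distribution as the ensemble, so the histogram should be near-uniform.
rng = np.random.default_rng(2)
n_times, n_members = 500, 20
ensemble = rng.normal(0.0, 1.0, size=(n_times, n_members))
obs = rng.normal(0.0, 1.0, size=n_times)

# Rank of each observation within its ensemble (0 .. n_members)
ranks = np.sum(ensemble < obs[:, None], axis=1)
hist = np.bincount(ranks, minlength=n_members + 1)
print(hist / n_times)  # approximately 1/(n_members+1) per bin
```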

  1. Spectral evaluation of Earth geopotential models and an experiment ...

    Indian Academy of Sciences (India)

    the models and monitoring the improvements in gravity field recovery are required. This study assesses ... group from the International Gravity Field Service (IGFS) and the ..... the method, the process may therefore be iterated until the ...

  2. Local fit evaluation of structural equation models using graphical criteria.

    Science.gov (United States)

    Thoemmes, Felix; Rosseel, Yves; Textor, Johannes

    2018-03-01

    Evaluation of model fit is critically important for every structural equation model (SEM), and sophisticated methods have been developed for this task. Among them are the χ² goodness-of-fit test, decomposition of the χ², derived measures like the popular root mean square error of approximation (RMSEA) or comparative fit index (CFI), or inspection of residuals or modification indices. Many of these methods provide a global approach to model fit evaluation: A single index is computed that quantifies the fit of the entire SEM to the data. In contrast, graphical criteria like d-separation or trek-separation allow derivation of implications that can be used for local fit evaluation, an approach that is hardly ever applied. We provide an overview of local fit evaluation from the viewpoint of SEM practitioners. In the presence of model misfit, local fit evaluation can potentially help in pinpointing where the problem with the model lies. For models that do fit the data, local tests can identify the parts of the model that are corroborated by the data. Local tests can also be conducted before a model is fitted at all, and they can be used even for models that are globally underidentified. We discuss appropriate statistical local tests, and provide applied examples. We also present novel software in R that automates this type of local fit evaluation.
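
    A concrete local test of the kind described: a d-separation claim X independent of Y given Z implies a vanishing partial correlation, which can be tested with Fisher's z. The sketch below generates data from an invented chain model X to Z to Y, so the implication should hold and the test should not reject. (The paper's software is in R; this illustration uses Python.)

```python
import numpy as np
from scipy import stats

# Invented chain model X -> Z -> Y, which implies X _||_ Y | Z.
rng = np.random.default_rng(3)
n = 300
x = rng.normal(size=n)
z = 0.8 * x + rng.normal(size=n)
y = 0.7 * z + rng.normal(size=n)

def partial_corr(a, b, c):
    """Correlation of a and b after linearly regressing out c."""
    ra = a - np.polyval(np.polyfit(c, a, 1), c)
    rb = b - np.polyval(np.polyfit(c, b, 1), c)
    return np.corrcoef(ra, rb)[0, 1]

r = partial_corr(x, y, z)
fisher_z = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(n - 1 - 3)  # |Z| = 1
p = 2 * stats.norm.sf(abs(fisher_z))
print(f"partial r = {r:.3f}, p = {p:.3f}")  # large p: claim not rejected
```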

  3. 14 CFR 382.133 - What are the requirements concerning the evaluation and use of passenger-supplied electronic...

    Science.gov (United States)

    2010-01-01

    ... evaluation and use of passenger-supplied electronic devices that assist passengers with respiration in the... What are the requirements concerning the evaluation and use of passenger-supplied electronic devices... to use in the passenger cabin during air transportation, a ventilator, respirator, continuous...

  4. A model for photothermal responses of flowering in rice. II. Model evaluation.

    NARCIS (Netherlands)

    Yin, X.; Kropff, M.J.; Nakagawa, H.; Horie, T.; Goudriaan, J.

    1997-01-01

    A detailed nonlinear model, the 3s-Beta model, for photothermal responses of flowering in rice (Oryza sativa L.) was evaluated for predicting rice flowering date in field conditions. This model was compared with three other models: a three-plane linear model and two nonlinear models, viz., the

  5. Evaluating the double Poisson generalized linear model.

    Science.gov (United States)

    Zou, Yaotian; Geedipally, Srinivas Reddy; Lord, Dominique

    2013-10-01

    The objectives of this study are to: (1) examine the applicability of the double Poisson (DP) generalized linear model (GLM) for analyzing motor vehicle crash data characterized by over- and under-dispersion and (2) compare the performance of the DP GLM with the Conway-Maxwell-Poisson (COM-Poisson) GLM in terms of goodness-of-fit and theoretical soundness. The DP distribution has seldom been investigated and applied since its first introduction two decades ago. The hurdle for applying the DP is related to its normalizing constant (or multiplicative constant) which is not available in closed form. This study proposed a new method to approximate the normalizing constant of the DP with high accuracy and reliability. The DP GLM and COM-Poisson GLM were developed using two observed over-dispersed datasets and one observed under-dispersed dataset. The modeling results indicate that the DP GLM with its normalizing constant approximated by the new method can handle crash data characterized by over- and under-dispersion. Its performance is comparable to the COM-Poisson GLM in terms of goodness-of-fit (GOF), although COM-Poisson GLM provides a slightly better fit. For the over-dispersed data, the DP GLM performs similar to the NB GLM. Considering the fact that the DP GLM can be easily estimated with inexpensive computation and that it is simpler to interpret coefficients, it offers a flexible and efficient alternative for researchers to model count data.
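
    The normalizing-constant issue can be made concrete by brute force: sum Efron's (1986) unnormalized double Poisson kernel over a long grid of counts and compare with his closed-form approximation. This illustrates the problem the paper addresses; it is not the paper's own approximation method.

```python
import math

# Efron's double Poisson kernel, normalized numerically and compared with his
# approximation; theta < 1 gives over-dispersion, theta > 1 under-dispersion.
def dp_log_kernel(y, mu, theta):
    """log of the unnormalized double Poisson mass at count y."""
    if y == 0:
        log_base = 0.0
    else:
        log_base = (-y + y * math.log(y) - math.lgamma(y + 1)
                    + theta * y * (1.0 + math.log(mu) - math.log(y)))
    return 0.5 * math.log(theta) - theta * mu + log_base

def summed_inverse_c(mu, theta, ymax=400):
    """1/c obtained by summing the kernel over a long grid of counts."""
    return sum(math.exp(dp_log_kernel(y, mu, theta)) for y in range(ymax + 1))

def efron_inverse_c(mu, theta):
    """Efron's closed-form approximation of 1/c."""
    return 1.0 + (1.0 - theta) / (12.0 * mu * theta) * (1.0 + 1.0 / (mu * theta))

mu, theta = 4.0, 0.6
print(f"summed 1/c = {summed_inverse_c(mu, theta):.6f}, "
      f"Efron approx = {efron_inverse_c(mu, theta):.6f}")
```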

  6. Definition of common support equipment and space station interface requirements for IOC model technology experiments

    Science.gov (United States)

    Russell, Richard A.; Waiss, Richard D.

    1988-01-01

    A study was conducted to identify the common support equipment and Space Station interface requirements for the IOC (initial operating capabilities) model technology experiments. In particular, each principal investigator for the proposed model technology experiment was contacted and visited for technical understanding and support for the generation of the detailed technical backup data required for completion of this study. Based on the data generated, a strong case can be made for a dedicated technology experiment command and control work station consisting of a command keyboard, cathode ray tube, data processing and storage, and an alert/annunciator panel located in the pressurized laboratory.

  7. Improved Traceability of a Small Satellite Mission Concept to Requirements Using Model Based System Engineering

    Science.gov (United States)

    Reil, Robin L.

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the "traditional" document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This paper presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to demonstrate the completeness and consistency of the requirements to the mission concept. Anecdotal information and process-duration metrics are presented for both the MBSE and original DBSE design efforts of SporeSat.

  8. Improved Traceability of Mission Concept to Requirements Using Model Based Systems Engineering

    Science.gov (United States)

    Reil, Robin

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the traditional document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g., time and cost savings, increased product quality). This thesis presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to analyze the completeness and consistency of the requirements to the mission concept. Overall experience and methodology are presented for both the MBSE and original DBSE design efforts of SporeSat.

  9. Fractured rock modeling in the National Waste Terminal Storage Program: a review of requirements and status

    International Nuclear Information System (INIS)

    St John, C.; Krug, A.; Key, S.; Monsees, J.

    1983-05-01

    Generalized computer codes capable of forming the basis for numerical models of fractured rock masses are being used within the NWTS program. Little additional development of these codes is considered justifiable, except in the area of representation of discrete fractures. On the other hand, model preparation requires definition of medium-specific constitutive descriptions and site characteristics and is therefore legitimately conducted by each of the media-oriented projects within the National Waste Terminal Storage program. However, it is essential that a uniform approach to the role of numerical modeling be adopted, including agreement upon the contribution of modeling to the design and licensing process and the need for, and means of, model qualification for particular purposes. This report discusses the role of numerical modeling, reviews the capabilities of several computer codes that are being used to support design or performance assessment, and proposes a framework for future numerical modeling activities within the NWTS program

  10. Evaluation of methods to estimate the essential amino acids requirements of fish from the muscle amino acid profile

    Directory of Open Access Journals (Sweden)

    Álvaro José de Almeida Bicudo

    2014-03-01

    Full Text Available Many methods to estimate amino acid requirements based on the amino acid profile of fish have been proposed. This study evaluates the methodologies proposed by Meyer & Fracalossi (2005) and by Tacon (1989) to estimate the amino acid requirements of fish, which do not require prior knowledge of the nutritional requirement of a reference amino acid. Data on the amino acid requirements of pacu, Piaractus mesopotamicus, were used to validate the accuracy of those methods. Meyer & Fracalossi's and Tacon's methodologies estimated the lysine requirement of pacu at, respectively, 13 and 23% above the requirement determined using the dose-response method. The values estimated by both methods lie within the range of requirements determined for other omnivorous fish species, with the Meyer & Fracalossi (2005) method showing better accuracy.

  11. A random walk model to evaluate autism

    Science.gov (United States)

    Moura, T. R. S.; Fulco, U. L.; Albuquerque, E. L.

    2018-02-01

    A common test administered during neurological examination in children is the analysis of their social communication and interaction across multiple contexts, including repetitive patterns of behavior. Poor performance may be associated with neurological conditions characterized by impairments in executive function, such as the so-called pervasive developmental disorders (PDDs), a particular condition of the autism spectrum disorders (ASDs). Inspired by these diagnostic tools, mainly those related to repetitive movements and behaviors, we studied here how the diffusion regimes of two discrete-time random walkers, mimicking the lack of social interaction and the restricted interests developed by children with PDDs, are affected. Our model, which is based on the so-called elephant random walk (ERW) approach, considers that one of the random walkers can learn and imitate the microscopic behavior of the other with probability f (1 - f otherwise). The diffusion regimes, measured by the Hurst exponent (H), are then obtained, whose changes may indicate a different degree of autism.
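    The paper's exact two-walker formulation is not given in the abstract. The sketch below, a minimal illustration in the same spirit, simulates a standard elephant random walk plus a follower that imitates the leader's current step with probability f, and estimates a Hurst-type exponent from the growth of the ensemble mean squared displacement; the memory parameter p and all other settings are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def two_walker_erw(n_steps, f, p=0.8, n_walkers=2000):
    """Ensemble of (leader, follower) pairs; returns follower positions over time."""
    lead_steps = np.zeros((n_walkers, n_steps))
    foll_pos = np.zeros((n_walkers, n_steps))
    pos = np.zeros(n_walkers)
    for t in range(n_steps):
        if t == 0:
            step = rng.choice([-1, 1], size=n_walkers)
        else:
            # ERW memory: repeat a uniformly chosen past step with prob p, else reverse it.
            past = lead_steps[np.arange(n_walkers), rng.integers(0, t, n_walkers)]
            step = np.where(rng.random(n_walkers) < p, past, -past)
        lead_steps[:, t] = step
        # Follower imitates the leader's current step with probability f.
        own = rng.choice([-1, 1], size=n_walkers)
        foll_step = np.where(rng.random(n_walkers) < f, step, own)
        pos += foll_step
        foll_pos[:, t] = pos
    return foll_pos

def hurst_exponent(positions):
    """Fit <x^2(t)> ~ t^(2H) on the ensemble mean squared displacement."""
    t = np.arange(1, positions.shape[1] + 1)
    msd = (positions ** 2).mean(axis=0)
    slope, _ = np.polyfit(np.log(t[10:]), np.log(msd[10:]), 1)
    return slope / 2.0

for f in (0.1, 0.9):
    H = hurst_exponent(two_walker_erw(500, f))
    print(f"f = {f}: H ≈ {H:.2f}")
```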

  12. Evaluation of the autoregression time-series model for analysis of a noisy signal

    International Nuclear Information System (INIS)

    Allen, J.W.

    1977-01-01

    The autoregression (AR) time-series model of a continuous noisy signal was statistically evaluated to determine quantitatively the uncertainties of the model order, the model parameters, and the model's power spectral density (PSD). The result of such a statistical evaluation enables an experimenter to decide whether an AR model can adequately represent a continuous noisy signal and be consistent with the signal's frequency spectrum, and whether it can be used for on-line monitoring. Although evaluations of other types of signals have been reported in the literature, no direct reference has been found to AR model's uncertainties for continuous noisy signals; yet the evaluation is necessary to decide the usefulness of AR models of typical reactor signals (e.g., neutron detector output or thermocouple output) and the potential of AR models for on-line monitoring applications. AR and other time-series models for noisy data representation are being investigated by others since such models require fewer parameters than the traditional PSD model. For this study, the AR model was selected for its simplicity and conduciveness to uncertainty analysis, and controlled laboratory bench signals were used for continuous noisy data. (author)
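    As a self-contained illustration of the kind of analysis the record describes, the sketch below fits an AR model to a synthetic noisy signal via the Yule-Walker equations and computes the implied parametric PSD; it is a generic textbook procedure, not the study's own code.

```python
import numpy as np

rng = np.random.default_rng(0)

def yule_walker(x, order):
    """Estimate AR coefficients and noise variance via the Yule-Walker equations."""
    x = x - x.mean()
    n = len(x)
    r = np.array([np.dot(x[: n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])  # Toeplitz autocovariance
    a = np.linalg.solve(R, r[1:])
    sigma2 = r[0] - np.dot(a, r[1:])
    return a, sigma2

def ar_psd(a, sigma2, freqs):
    """Parametric PSD: S(f) = sigma^2 / |1 - sum_k a_k e^(-i 2 pi f k)|^2."""
    k = np.arange(1, len(a) + 1)
    denom = np.abs(1.0 - (a * np.exp(-2j * np.pi * np.outer(freqs, k))).sum(axis=1)) ** 2
    return sigma2 / denom

# Synthetic noisy signal: a stable AR(2) process driven by white noise.
n = 4096
x = np.zeros(n)
for t in range(2, n):
    x[t] = 1.5 * x[t - 1] - 0.75 * x[t - 2] + rng.standard_normal()

a, sigma2 = yule_walker(x, order=2)
print("AR coefficients:", a, "noise variance:", sigma2)  # expect roughly [1.5, -0.75]
S = ar_psd(a, sigma2, np.linspace(0.0, 0.5, 256))
```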

  13. Promoting Excellence in Nursing Education (PENE): Pross evaluation model.

    Science.gov (United States)

    Pross, Elizabeth A

    2010-08-01

    The purpose of this article is to examine the Promoting Excellence in Nursing Education (PENE) Pross evaluation model. A conceptual evaluation model, such as the one described here, may be useful to nurse academicians in the ongoing evaluation of educational programs, especially those with goals of excellence. Frameworks for evaluating nursing programs are necessary because they offer a way to systematically assess the educational effectiveness of complex nursing programs. This article describes the conceptual framework and its tenets of excellence. Copyright 2009 Elsevier Ltd. All rights reserved.

  14. Safety evaluations required in the safety regulations for Monju and the validity confirmation of safety evaluation methods

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    The purposes of this study are to perform the safety evaluations of the fast breeder reactor 'Monju' and to confirm the validity of the safety evaluation methods. In JFY 2012, the following results were obtained. As for the development of the safety evaluation methods needed in the safety examination for the reactor establishment permission, the analysis codes, such as a core damage analysis code, were developed according to the plan. As for the development of the safety evaluation method needed for risk-informed safety regulation, the quantification technique for the event tree using the Continuous Markov chain Monte Carlo method (CMMC method) was studied. (author)

  15. ECOPATH: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Bergstroem, U.; Nordlinder, S.

    1996-01-01

    The model is based upon compartment theory and it is run in combination with a statistical error propagation method (PRISM, Gardner et al. 1983). It is intended to be generic for application to other sites by simply changing parameter values. It was constructed especially for this scenario. However, it is based upon an earlier model for calculating relations between released amounts of radioactivity and doses to critical groups (used for Swedish regulations concerning annual reports of released radioactivity from routine operation of Swedish nuclear power plants (Bergstroem and Nordlinder, 1991)). The model handles exposure from deposition on terrestrial areas as well as deposition on lakes, starting with deposition values. 14 refs, 16 figs, 7 tabs
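    Neither ECOPATH nor PRISM is reproduced in the abstract. As a generic sketch of the combination they describe, the code below runs a two-compartment linear transfer chain under Monte Carlo-sampled transfer coefficients and reports the spread of a dose-like output; all rates, distributions and the deposition value are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def two_compartment(deposition, k_soil_plant, k_plant_loss, t_end=100.0, dt=0.1):
    """Linear soil -> plant transfer chain; returns time-integrated plant activity."""
    soil, plant, integral = deposition, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        flux = k_soil_plant * soil
        soil += -flux * dt
        plant += (flux - k_plant_loss * plant) * dt
        integral += plant * dt
    return integral

# Error propagation: sample uncertain transfer parameters (lognormal, invented spreads).
n_trials = 2000
k1 = rng.lognormal(mean=np.log(0.01), sigma=0.5, size=n_trials)
k2 = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=n_trials)
doses = np.array([two_compartment(1.0, a, b) for a, b in zip(k1, k2)])
print(f"dose proxy: median {np.median(doses):.2f}, 95% interval "
      f"[{np.percentile(doses, 2.5):.2f}, {np.percentile(doses, 97.5):.2f}]")
```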

  16. Evaluation of atmospheric dispersion/consequence models supporting safety analysis

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Lazaro, M.A.; Woodard, K.

    1996-01-01

    Two DOE Working Groups have completed evaluation of accident phenomenology and consequence methodologies used to support DOE facility safety documentation. The independent evaluations each concluded that no one computer model adequately addresses all accident and atmospheric release conditions. MACCS2, MATHEW/ADPIC, TRAC RA/HA, and COSYMA are adequate for most radiological dispersion and consequence needs. ALOHA, DEGADIS, HGSYSTEM, TSCREEN, and SLAB are recommended for chemical dispersion and consequence applications. Additional work is suggested, principally in evaluation of new models, targeting certain models for continued development, training, and establishing a Web page for guidance to safety analysts

  17. Modeling requirements for full-scope reactor simulators of fission-product transport during severe accidents

    International Nuclear Information System (INIS)

    Ellison, P.G.; Monson, P.R.; Mitchell, H.A.

    1990-01-01

    This paper describes the needs and requirements to properly and efficiently model fission product transport on full scope reactor simulators. Current LWR simulators can be easily adapted to model severe accident phenomena and the transport of radionuclides. Once adapted, these simulators can be used as a training tool for operator exercises on severe accident guidelines and containment venting procedures, or during site-wide emergency training exercises

  18. System Design Description and Requirements for Modeling the Off-Gas Systems for Fuel Recycling Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Daryl R. Haefner; Jack D. Law; Troy J. Tranter

    2010-08-01

    This document provides descriptions of the off-gases evolved during spent nuclear fuel processing and the systems used to capture the gases of concern. Two reprocessing techniques are discussed, namely aqueous separations and electrochemical (pyrochemical) processing. The unit operations associated with each process are described in enough detail so that computer models to mimic their behavior can be developed. The document also lists the general requirements for the desired computer models.

  19. A funding model for health visiting: baseline requirements--part 1.

    Science.gov (United States)

    Cowley, Sarah

    2007-11-01

    A funding model proposed in two papers will outline the health visiting resource, including team skill mix, required to deliver the recommended approach of 'progressive universalism,' taking account of health inequalities, best evidence and impact on outcomes that might be anticipated. The model has been discussed as far as possible across the professional networks of both the Community Practitioners' and Health Visitors' Association (CPHVA) and United Kingdom Public Health Association (UKPHA), and is a consensus statement agreed by all who have participated.

  20. Evaluation of hydrodynamic ocean models as a first step in larval dispersal modelling

    Science.gov (United States)

    Vasile, Roxana; Hartmann, Klaas; Hobday, Alistair J.; Oliver, Eric; Tracey, Sean

    2018-01-01

    Larval dispersal modelling, a powerful tool in studying population connectivity and species distribution, requires accurate estimates of the ocean state, on a high-resolution grid in both space (e.g. 0.5-1 km horizontal grid) and time (e.g. hourly outputs), particularly of current velocities and water temperature. These estimates are usually provided by hydrodynamic models based on which larval trajectories and survival are computed. In this study we assessed the accuracy of two hydrodynamic models around Australia - Bluelink ReANalysis (BRAN) and Hybrid Coordinate Ocean Model (HYCOM) - through comparison with empirical data from the Australian National Moorings Network (ANMN). We evaluated the models' predictions of seawater parameters most relevant to larval dispersal - temperature, u and v velocities and current speed and direction - on the continental shelf where spawning and nursery areas for major fishery species are located. The performance of each model in estimating ocean parameters was found to depend on the parameter investigated and to vary from one geographical region to another. Both BRAN and HYCOM models systematically overestimated the mean water temperature, particularly in the top 140 m of water column, with over 2 °C bias at some of the mooring stations. HYCOM model was more accurate than BRAN for water temperature predictions in the Great Australian Bight and along the east coast of Australia. Skill scores between each model and the in situ observations showed lower accuracy in the models' predictions of u and v ocean current velocities compared to water temperature predictions. For both models, the lowest accuracy in predicting ocean current velocities, speed and direction was observed at 200 m depth. Low accuracy of both model predictions was also observed in the top 10 m of the water column. BRAN had more accurate predictions of both u and v velocities in the upper 50 m of water column at all mooring station locations. While HYCOM
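    The abstract does not name the skill score used. As a hedged sketch of this kind of model-observation comparison, the code below computes bias, RMSE and Willmott's index of agreement for a synthetic mooring temperature series in which the model runs about 2 °C warm, as described above.

```python
import numpy as np

def bias(model, obs):
    return np.mean(model - obs)

def rmse(model, obs):
    return np.sqrt(np.mean((model - obs) ** 2))

def willmott_skill(model, obs):
    """Willmott's index of agreement (1 = perfect, 0 = no skill)."""
    obs_mean = obs.mean()
    num = np.sum((model - obs) ** 2)
    den = np.sum((np.abs(model - obs_mean) + np.abs(obs - obs_mean)) ** 2)
    return 1.0 - num / den

# Synthetic mooring comparison: one year of daily temperatures, model ~2 degC warm.
rng = np.random.default_rng(7)
obs = 15.0 + np.sin(np.linspace(0, 4 * np.pi, 365)) + rng.normal(0, 0.3, 365)
mod = obs + 2.0 + rng.normal(0, 0.5, 365)
print(f"bias={bias(mod, obs):.2f} degC  rmse={rmse(mod, obs):.2f}  "
      f"skill={willmott_skill(mod, obs):.2f}")
```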

  1. Experimental evaluation of the power balance model of speed skating.

    Science.gov (United States)

    de Koning, Jos J; Foster, Carl; Lampen, Joanne; Hettinga, Floor; Bobbert, Maarten F

    2005-01-01

    Prediction of speed skating performance with a power balance model requires assumptions about the kinetics of energy production, skating efficiency, and skating technique. The purpose of this study was to evaluate these parameters during competitive imitations for the purpose of improving model predictions. Elite speed skaters (n = 8) performed races and submaximal efficiency tests. External power output (P(o)) was calculated from movement analysis and aerodynamic models and ice friction measurements. Aerobic kinetics was calculated from breath-by-breath oxygen uptake (Vo(2)). Aerobic power (P(aer)) was calculated from measured skating efficiency. Anaerobic power (P(an)) kinetics was determined by subtracting P(aer) from P(o). We found gross skating efficiency to be 15.8% (1.8%). In the 1,500-m event, the kinetics of P(an) was characterized by a first-order system as P(an) = 88 + 556e^(-0.0494t) (in W, where t is time). The rate constant for the increase in P(aer) was -0.153 s^(-1), the time delay was 8.7 s, and the peak P(aer) was 234 W; P(aer) was equal to 234[1 - e^(-0.153(t-8.7))] (in W). Skating position changed with preextension knee angle increasing and trunk angle decreasing throughout the event. We concluded the pattern of P(aer) to be quite similar to that reported during other competitive imitations, with the exception that the increase in P(aer) was more rapid. The pattern of P(an) does not appear to fit an "all-out" pattern, with near zero values during the last portion of the event, as assumed in our previous model (De Koning JJ, de Groot G, and van Ingen Schenau GJ. J Biomech 25: 573-580, 1992). Skating position changed in ways different from those assumed in our previous model. In addition to allowing improved predictions, the results demonstrate the importance of observations in unique subjects to the process of model construction.
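    The two fitted equations above can be evaluated directly. The sketch below reproduces P(an) and P(aer) as stated and integrates their sum over an assumed ~110 s race duration (the abstract does not give the skaters' actual times).

```python
import numpy as np

def p_an(t):
    # Anaerobic power fit quoted in the abstract: P(an) = 88 + 556 e^(-0.0494 t)  [W]
    return 88.0 + 556.0 * np.exp(-0.0494 * t)

def p_aer(t, peak=234.0, rate=0.153, delay=8.7):
    # Aerobic power fit: P(aer) = 234 [1 - e^(-0.153 (t - 8.7))]  [W]; taken as zero before the delay
    return np.where(t > delay, peak * (1.0 - np.exp(-rate * (t - delay))), 0.0)

t = np.linspace(0.0, 110.0, 1101)      # ~110 s race duration is an assumption, not from the abstract
total = p_an(t) + p_aer(t)
work_kj = np.trapz(total, t) / 1000.0  # external work implied by the two fits
print(f"peak total power {total.max():.0f} W, implied external work {work_kj:.0f} kJ")
```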

  2. FARMLAND: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Attwood, C.; Fayers, C.; Mayall, A.; Brown, J.; Simmonds, J.R.

    1996-01-01

    The FARMLAND model was originally developed for use in connection with continuous, routine releases of radionuclides, but because it has many time-dependent features it has been developed further for a single accidental release. The most recent version of FARMLAND is flexible and can be used to predict activity concentrations in food as a function of time after both accidental and routine releases of radionuclides. The effect of deposition at different times of the year can be taken into account. FARMLAND contains a suite of models which simulate radionuclide transfer through different parts of the foodchain. The models can be used in different combinations and offer the flexibility to assess a variety of radiological situations. The main foods considered are green vegetables, grain products, root vegetables, milk, meat and offal from cattle, and meat and offal from sheep. A large variety of elements can be considered although the degree of complexity with which some are modelled is greater than others; isotopes of caesium, strontium and iodine are treated in greatest detail. 22 refs, 12 figs, 10 tabs

  3. FARMLAND: Model description and evaluation of model performance

    Energy Technology Data Exchange (ETDEWEB)

    Attwood, C; Fayers, C; Mayall, A; Brown, J; Simmonds, J R [National Radiological Protection Board, Chilton (United Kingdom)

    1996-09-01

    The FARMLAND model was originally developed for use in connection with continuous, routine releases of radionuclides, but because it has many time-dependent features it has been developed further for a single accidental release. The most recent version of FARMLAND is flexible and can be used to predict activity concentrations in food as a function of time after both accidental and routine releases of radionuclides. The effect of deposition at different times of the year can be taken into account. FARMLAND contains a suite of models which simulate radionuclide transfer through different parts of the foodchain. The models can be used in different combinations and offer the flexibility to assess a variety of radiological situations. The main foods considered are green vegetables, grain products, root vegetables, milk, meat and offal from cattle, and meat and offal from sheep. A large variety of elements can be considered although the degree of complexity with which some are modelled is greater than others; isotopes of caesium, strontium and iodine are treated in greatest detail. 22 refs, 12 figs, 10 tabs.

  4. Modeling the allocation and economic evaluation of PV panels and wind turbines in urban areas

    NARCIS (Netherlands)

    Mohammadi, S.; Vries, de B.; Schaefer, W.F.; Timmermans, H.

    2014-01-01

    A model for allocating PV panels and wind turbines in urban areas is developed. Firstly, it examines the spatial and technical requirements for the installation of PV panels and wind turbines and then evaluates their economic feasibilities in order to generate the cost effective electricity neutral

  5. Preparation and Evaluation of Newly Developed Chitosan Salt Coating Dispersions for Colon Delivery without Requiring Overcoating.

    Science.gov (United States)

    Yamada, Kyohei; Iwao, Yasunori; Bani-Jaber, Ahmad; Noguchi, Shuji; Itai, Shigeru

    2015-01-01

    Although chitosan (CS) has been recognized as a good material for colon-specific drug delivery systems, an overcoating with an enteric coating polymer on the surface of CS is absolutely necessary because CS is soluble in acidic conditions before reaching the colon. In the present study, to improve its stability in the presence of acid, a newly developed CS-laurate (CS-LA) material was evaluated as a coating dispersion for the development of colon-specific drug delivery systems. Two types of CS with different molecular weights, CS250 and CS600, were used to prepare CS-LA films by the casting method. The CS250-LA films had smooth surfaces, whereas the surfaces of the CS600-LA films were rough, indicating that the CS250-LA dispersion could form a denser film than CS600-LA. Both of these CS-LA films maintained a constant shape over 22 h in a pH 1.2 HCl/NaCl buffer, where the corresponding CS films rapidly disintegrated. In addition, the CS250-LA film showed specific colon degradability in a pH 6.0 phosphate buffered solution containing 1.0% (w/v) β-glucosidase. Tensile strength and elongation-at-break measurements showed that both CS-LA films had flexible film properties. Finally, the release of acetaminophen from disks coated with CS250-LA dispersions was significantly suppressed in fluids at pH 1.2 and 6.8, whereas disks coated with CS solution rapidly released the drug in pH 1.2 fluids. Taken together, this study shows that LA modification could be a useful approach in preparing CS films with acid stability and colonic degradability properties without requiring overcoating.

  6. Evaluation of a clinical dehydration scale in children requiring intravenous rehydration.

    Science.gov (United States)

    Kinlin, Laura M; Freedman, Stephen B

    2012-05-01

    To evaluate the reliability and validity of a previously derived clinical dehydration scale (CDS) in a cohort of children with gastroenteritis and evidence of dehydration. Participants were 226 children older than 3 months who presented to a tertiary care emergency department and required intravenous rehydration. Reliability was assessed at treatment initiation, by comparing the scores assigned independently by a trained research nurse and a physician. Validity was assessed by using parameters reflective of disease severity: weight gain, baseline laboratory results, willingness of the physician to discharge the patient, hospitalization, and length of stay. Interobserver reliability was moderate, with a weighted κ of 0.52 (95% confidence interval [CI] 0.41, 0.63). There was no correlation between CDS score and percent weight gain, a proxy measure of fluid deficit (Spearman correlation coefficient = -0.03; 95% CI -0.18, 0.12). There were, however, modest and statistically significant correlations between CDS score and several other parameters, including serum bicarbonate (Pearson correlation coefficient = -0.35; 95% CI -0.46, -0.22) and length of stay (Pearson correlation coefficient = 0.24; 95% CI 0.11, 0.36). The scale's discriminative ability was assessed for the outcome of hospitalization, yielding an area under the receiver operating characteristic curve of 0.65 (95% CI 0.57, 0.73). In children administered intravenous rehydration, the CDS was characterized by moderate interobserver reliability and weak associations with objective measures of disease severity. These data do not support its use as a tool to dictate the need for intravenous rehydration or to predict clinical course.
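    As a hedged sketch of the two headline statistics, the code below computes a linearly weighted kappa and a ROC AUC with scikit-learn on synthetic paired scores; the 0-8 score range and all data are invented and do not reproduce the study's results.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, roc_auc_score

rng = np.random.default_rng(3)

# Synthetic paired CDS scores (0-8; assumed range) from a nurse and a physician
# for 226 children, with some disagreement between raters.
nurse = rng.integers(0, 9, size=226)
physician = np.clip(nurse + rng.integers(-2, 3, size=226), 0, 8)
kappa = cohen_kappa_score(nurse, physician, weights="linear")

# Synthetic hospitalization outcome loosely increasing with CDS score.
hospitalized = (rng.random(226) < (0.1 + 0.05 * nurse)).astype(int)
auc = roc_auc_score(hospitalized, nurse)

print(f"weighted kappa = {kappa:.2f}, AUC = {auc:.2f}")
```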

  7. 3D Core Model for simulation of nuclear power plants: Simulation requirements, model features, and validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1999-01-01

    In 1994-1996, Thomson Training and Simulation (TT and S) carried out the D50 Project, which involved the design and construction of optimized replica simulators for one Dutch and three German Nuclear Power Plants. It was recognized early on that the faithful reproduction of the Siemens reactor control and protection systems would impose extremely stringent demands on the simulation models, particularly the Core physics and the RCS thermohydraulics. The quality of the models, and their thorough validation, were thus essential. The present paper describes the main features of the fully 3D Core model implemented by TT and S, and its extensive validation campaign, which was defined in extremely positive collaboration with the Customer and the Core Data suppliers. (author)

  8. Quantification of Dynamic Model Validation Metrics Using Uncertainty Propagation from Requirements

    Science.gov (United States)

    Brown, Andrew M.; Peck, Jeffrey A.; Stewart, Eric C.

    2018-01-01

    The Space Launch System, NASA's new large launch vehicle for long range space exploration, is presently in the final design and construction phases, with the first launch scheduled for 2019. A dynamic model of the system has been created and is critical for calculation of interface loads and natural frequencies and mode shapes for guidance, navigation, and control (GNC). Because of the program and schedule constraints, a single modal test of the SLS will be performed while bolted down to the Mobile Launch Pad just before the first launch. A Monte Carlo and optimization scheme will be performed to create thousands of possible models based on given dispersions in model properties and to determine which model best fits the natural frequencies and mode shapes from modal test. However, the question still remains as to whether this model is acceptable for the loads and GNC requirements. An uncertainty propagation and quantification (UP and UQ) technique to develop a quantitative set of validation metrics that is based on the flight requirements has therefore been developed and is discussed in this paper. There has been considerable research on UQ and UP and validation in the literature, but very little on propagating the uncertainties from requirements, so most validation metrics are "rules-of-thumb;" this research seeks to come up with more reason-based metrics. One of the main assumptions used to achieve this task is that the uncertainty in the modeling of the fixed boundary condition is accurate, so therefore that same uncertainty can be used in propagating the fixed-test configuration to the free-free actual configuration. The second main technique applied here is the usage of the limit-state formulation to quantify the final probabilistic parameters and to compare them with the requirements. These techniques are explored with a simple lumped spring-mass system and a simplified SLS model. When completed, it is anticipated that this requirements-based validation
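    The SLS model itself is not public in this abstract. As a toy version of the propagation-plus-limit-state idea, the sketch below pushes invented stiffness and mass dispersions of a lumped spring-mass system through its natural frequency and evaluates the limit state g = f_n - f_req as a probability.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy lumped model: f_n = (1 / 2 pi) sqrt(k / m). All dispersions are invented.
n = 100_000
k = rng.normal(1.0e6, 0.08e6, n)   # stiffness [N/m], 8% dispersion (assumption)
m = rng.normal(250.0, 10.0, n)     # mass [kg], 4% dispersion (assumption)
f_n = np.sqrt(k / m) / (2.0 * np.pi)

# Limit-state formulation: g = f_n - f_req >= 0 means the requirement is met.
f_req = 9.0                        # hypothetical minimum frequency requirement [Hz]
g = f_n - f_req
print(f"P(requirement met) = {np.mean(g >= 0):.4f}")
print(f"f_n mean {f_n.mean():.2f} Hz, 1st percentile {np.percentile(f_n, 1):.2f} Hz")
```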

  9. Geochemical modelling of CO2-water-rock interactions for carbon storage : data requirements and outputs

    International Nuclear Information System (INIS)

    Kirste, D.

    2008-01-01

    A geochemical model was used to predict the short-term and long-term behaviour of carbon dioxide (CO2), formation water, and reservoir mineralogy at a carbon sequestration site. Data requirements for the geochemical model included detailed mineral petrography; formation water chemistry; thermodynamic and kinetic data for mineral phases; and rock and reservoir physical characteristics. The model was used to determine the types of outputs expected for potential CO2 storage sites and natural analogues. Reaction path modelling was conducted to determine the total reactivity or CO2 storage capability of the rock by applying static equilibrium and kinetic simulations. Potential product phases were identified using the modelling technique, which also enabled the identification of the chemical evolution of the system. Results of the modelling study demonstrated that changes in porosity and permeability over time should be considered during the site selection process.

  10. Projected irrigation requirements for upland crops using soil moisture model under climate change in South Korea

    Science.gov (United States)

    An increase in abnormal climate change patterns and unsustainable irrigation in uplands cause drought and affect agricultural water security, crop productivity, and price fluctuations. In this study, we developed a soil moisture model to project irrigation requirements (IR) for upland crops under cl...

  11. Non-formal techniques for requirements elicitation, modeling, and early assessment for services

    NARCIS (Netherlands)

    van der Veer, Gerrit C.; Vyas, Dhaval; Dittmar, A.; Forbig, P.

    2011-01-01

    Designing systems for multiple stakeholders requires frequent collaboration with multiple stakeholders from the start. In many cases at least some stakeholders lack a professional habit of formal modeling. We report observations from two case studies of stakeholder involvement in early design where

  12. A Proposal to Elicit Usability Requirements within a Model-Driven Development Environment

    NARCIS (Netherlands)

    Isela Ormeno, Y; Panach, I; Condori-Fernandez, O.N.; Pastor, O.

    2014-01-01

    Nowadays there are sound Model-Driven Development (MDD) methods that deal with functional requirements, but in general, usability is not considered from the early stages of the development. Analysts that work with MDD implement usability features manually once the code has been generated. This

  13. Teaching Tip: Using Rapid Game Prototyping for Exploring Requirements Discovery and Modeling

    Science.gov (United States)

    Dalal, Nikunj

    2012-01-01

    We describe the use of rapid game prototyping as a pedagogic technique to experientially explore and learn requirements discovery, modeling, and specification in systems analysis and design courses. Students have a natural interest in gaming that transcends age, gender, and background. Rapid digital game creation is used to build computer games…

  14. INTEGRATED DATA CAPTURING REQUIREMENTS FOR 3D SEMANTIC MODELLING OF CULTURAL HERITAGE: THE INCEPTION PROTOCOL

    Directory of Open Access Journals (Sweden)

    R. Di Giulio

    2017-02-01

    In order to face these challenges and to start solving the issue of the large amount of captured data and time-consuming processes in the production of 3D digital models, an Optimized Data Acquisition Protocol (DAP) has been set up. The purpose is to guide the processes of digitization of cultural heritage, respecting the needs, requirements and specificities of cultural assets.

  15. Evaluating energy saving system of data centers based on AHP and fuzzy comprehensive evaluation model

    Science.gov (United States)

    Jiang, Yingni

    2018-03-01

    Due to the high energy consumption of communication infrastructure, energy saving in data centers must be enforced. But the lack of evaluation mechanisms has restrained progress on the energy-saving construction of data centers. In this paper, an energy-saving evaluation index system for data centers was constructed on the basis of clarifying the influencing factors. Based on the evaluation index system, the analytic hierarchy process (AHP) was used to determine the weights of the evaluation indexes. Subsequently, a three-grade fuzzy comprehensive evaluation model was constructed to evaluate the energy saving system of data centers.
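    As a sketch of the AHP step described above, the code below derives criterion weights from an invented Saaty-scale pairwise comparison matrix via the principal eigenvector and checks Saaty's consistency ratio; the criteria and judgments are illustrative only.

```python
import numpy as np

# Invented pairwise comparison matrix for three energy-saving criteria
# (e.g., cooling, IT load, power distribution) on Saaty's 1-9 scale.
A = np.array([[1.0,   3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, i].real)
w /= w.sum()                                  # criterion weights

# Saaty consistency ratio: CI = (lambda_max - n)/(n - 1), random index RI(3) = 0.58.
n = A.shape[0]
ci = (eigvals[i].real - n) / (n - 1)
cr = ci / 0.58
print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```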

  16. Understanding the relationship between Kano model's customer satisfaction scores and self-stated requirements importance.

    Science.gov (United States)

    Mkpojiogu, Emmanuel O C; Hashim, Nor Laily

    2016-01-01

    Customer satisfaction is the result of product quality and viability. The perceived satisfaction of users/customers with a software product cannot be neglected, especially in today's competitive market environment, as it drives customer loyalty and promotes high profitability and return on investment. Therefore, understanding the importance of requirements as it relates to the satisfaction of users/customers when their requirements are met is worth considering. It is necessary to know the relationship between customer satisfaction when requirements are met (or dissatisfaction when they are unmet) and the importance of such requirements. Many studies have examined customer satisfaction in connection with the importance of requirements, but the relationship between the customer satisfaction scores (coefficients) of the Kano model and users'/customers' self-stated requirements importance has not been sufficiently explored. In this study, an attempt is made to unravel the underlying relationship existing between Kano model customer satisfaction indexes and users'/customers' self-reported requirements importance. The results of the study indicate some interesting associations between these variables. These bivariate associations reveal that the customer satisfaction index (SI) and the average satisfaction coefficient (ASC), and the customer dissatisfaction index (DI) and the ASC, are highly correlated (r = 0.96), and thus the ASC can be used in place of either SI or DI in representing customer satisfaction scores. Also, these Kano model customer satisfaction variables (SI, DI, and ASC) are each associated with self-stated requirements importance (IMP). Further analysis indicates that the value customers or users place on requirements that are met, or on features that are incorporated into a product, influences the level of satisfaction such customers derive from the product. The
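    The SI and DI referred to above are conventionally computed from Kano category counts (Berger et al.); the sketch below applies those standard formulas to invented counts. Definitions of the ASC vary, so the one used here is an assumption.

```python
# Kano category counts for one requirement (invented): Attractive, One-dimensional,
# Must-be, Indifferent.
counts = {"A": 18, "O": 32, "M": 25, "I": 10}
A, O, M, I = counts["A"], counts["O"], counts["M"], counts["I"]

SI = (A + O) / (A + O + M + I)      # satisfaction index (Berger et al. convention)
DI = -(O + M) / (A + O + M + I)     # dissatisfaction index (negative by convention)
ASC = (SI + abs(DI)) / 2.0          # one possible ASC definition (assumption)

print(f"SI = {SI:.2f}, DI = {DI:.2f}, ASC = {ASC:.2f}")
```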

  17. Biology learning evaluation model in Senior High Schools

    Directory of Open Access Journals (Sweden)

    Sri Utari

    2017-06-01

    Full Text Available The study aimed to develop a Biology learning evaluation model in senior high schools that referred to the research and development model by Borg & Gall and the logic model. The evaluation model included the components of input, activities, output and outcomes. The developing procedures involved a preliminary study in the form of observation and theoretical review regarding Biology learning evaluation in senior high schools. The product development was carried out by designing an evaluation model, designing an instrument, performing an instrument experiment and performing implementation. The instrument experiment involved teachers and students from Grade XII in senior high schools located in the City of Yogyakarta. For the data gathering techniques and instruments, the researchers implemented an observation sheet, a questionnaire and a test. The questionnaire was applied in order to attain information regarding teacher performance, learning performance, classroom atmosphere and scientific attitude; on the other hand, the test was applied in order to attain information regarding Biology concept mastery. Then, for the analysis of the instrument construct, the researchers performed confirmatory factor analysis by means of the Lisrel 0.80 software, and the results of this analysis showed that the evaluation instrument was valid and reliable. The construct validity was between 0.43-0.79 while the reliability of the measurement model was between 0.88-0.94. Finally, the model feasibility test showed that the theoretical model was supported by the empirical data.

  18. Using Models of Cognition in HRI Evaluation and Design

    National Research Council Canada - National Science Library

    Goodrich, Michael A

    2004-01-01

    ...) guide the construction of experiments. In this paper, we present an information processing model of cognition that we have used extensively in designing and evaluating interfaces and autonomy modes...

  19. A model for evaluating beef cattle rations considering effects of ruminal fiber mass

    Directory of Open Access Journals (Sweden)

    Douglas Sampaio Henrique

    2011-11-01

    Full Text Available A mathematical model based on the Cornell Net Carbohydrate and Protein System (CNCPS) was developed and adapted to evaluate beef cattle rations under tropical climate conditions. The presented system differs from the CNCPS in the modeling of insoluble particles' digestion and passage kinetics, which enabled the estimation of fiber mass in the rumen and its effects on animal performance. The equations used to estimate metabolizable protein and net energy requirements for gain, the net energy requirement for maintenance and the total efficiency of metabolizable energy utilization were obtained from scientific articles published in Brazil. The parameters of the regression equations in these papers were estimated using data from Bos indicus purebred and crossbred animals reared under tropical conditions. The model was evaluated using a database of 368 records originally published in 11 doctoral theses, 14 master's dissertations and four scientific articles. Outputs of the model can be considered adequate.

  20. Evaluation of one dimensional analytical models for vegetation canopies

    Science.gov (United States)

    Goel, Narendra S.; Kuusk, Andres

    1992-01-01

    The SAIL model for one-dimensional homogeneous vegetation canopies has been modified to include the specular reflectance and hot spot effects. This modified model and the Nilson-Kuusk model are evaluated by comparing the reflectances given by them against those given by a radiosity-based computer model, Diana, for a set of canopies, characterized by different leaf area index (LAI) and leaf angle distribution (LAD). It is shown that for homogeneous canopies, the analytical models are generally quite accurate in the visible region, but not in the infrared region. For architecturally realistic heterogeneous canopies of the type found in nature, these models fall short. These shortcomings are quantified.

  1. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    Energy Technology Data Exchange (ETDEWEB)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering and physics oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model

  2. Model of service-oriented catering supply chain performance evaluation

    OpenAIRE

    Gou, Juanqiong; Shen, Guguan; Chai, Rui

    2013-01-01

    Purpose: The aim of this paper is to construct a performance evaluation model for the service-oriented catering supply chain. Design/methodology/approach: Based on research into the current situation of the catering industry, this paper summarizes the characteristics of the catering supply chain and then presents a service-oriented catering supply chain model based on a platform of logistics and information. Finally, the fuzzy AHP method is used to evaluate the performance of the service-oriented catering ...

  3. FBI fingerprint identification automation study: AIDS 3 evaluation report. Volume 9: Functional requirements

    Science.gov (United States)

    1980-01-01

    The current system and subsystem used by the Identification Division are described. System constraints that dictate the system environment are discussed and boundaries within which solutions must be found are described. The functional requirements were related to the performance requirements. These performance requirements were then related to their applicable subsystems. The flow of data, documents, or other pieces of information from one subsystem to another or from the external world into the identification system is described. Requirements and design standards for a computer based system are presented.

  4. A Universal Model for the Normative Evaluation of Internet Information.

    NARCIS (Netherlands)

    Spence, E.H.

    2009-01-01

    Beginning with the initial premise that as the Internet has a global character, the paper will argue that the normative evaluation of digital information on the Internet necessitates an evaluative model that is itself universal and global in character (I agree, therefore, with Gorniak-Kocikowska’s

  5. Staying in the Light: Evaluating Sustainability Models for Brokering Software

    Science.gov (United States)

    Powers, L. A.; Benedict, K. K.; Best, M.; Fyfe, S.; Jacobs, C. A.; Michener, W. K.; Pearlman, J.; Turner, A.; Nativi, S.

    2015-12-01

    The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding source, implementation frameworks and obstacles, and policy and legal considerations. The issue of sustainability is not unique to brokering software and these models may be relevant to many applications. Results of this comprehensive analysis highlight advantages and disadvantages of the various models in respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis while recognizing that all software is part of an evolutionary process and has a lifespan.

  6. DETRA: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Suolanen, V.

    1996-01-01

    The computer code DETRA is a generic tool for environmental transfer analyses of radioactive or stable substances. The code has been applied for various purposes, mainly problems related to the biospheric transfer of radionuclides both in safety analyses of disposal of nuclear wastes and in consideration of foodchain exposure pathways in the analyses of off-site consequences of reactor accidents. For each specific application an individually tailored conceptual model can be developed. The biospheric transfer analyses performed by the code are typically carried out for terrestrial, aquatic and food chain applications. 21 refs, 35 figs, 15 tabs

  7. A Convergent Participation Model for Evaluation of Learning Objects

    Directory of Open Access Journals (Sweden)

    John Nesbit

    2002-10-01

    Full Text Available The properties that distinguish learning objects from other forms of educational software - global accessibility, metadata standards, finer granularity and reusability - have implications for evaluation. This article proposes a convergent participation model for learning object evaluation in which representatives from stakeholder groups (e.g., students, instructors, subject matter experts, instructional designers, and media developers) converge toward more similar descriptions and ratings through a two-stage process supported by online collaboration tools. The article reviews evaluation models that have been applied to educational software and media, considers models for gathering and meta-evaluating individual user reviews that have recently emerged on the Web, and describes the peer review model adopted for the MERLOT repository. The convergent participation model is assessed in relation to other models and with respect to its support for eight goals of learning object evaluation: (1) aid for searching and selecting, (2) guidance for use, (3) formative evaluation, (4) influence on design practices, (5) professional development and student learning, (6) community building, (7) social recognition, and (8) economic exchange.

  8. Maintenance personnel performance simulation (MAPPS) model: overview and evaluation efforts

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.; Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Ryan, T.G.

    1984-01-01

    The development of the MAPPS model has been completed and the model is currently undergoing evaluation. These efforts are addressing a number of identified issues concerning practicality, acceptability, usefulness, and validity. Preliminary analysis of the evaluation data that has been collected indicates that MAPPS will provide comprehensive and reliable data for PRA purposes and for a number of other applications. The MAPPS computer simulation model provides the user with a sophisticated tool for gaining insights into tasks performed by NPP maintenance personnel. Its wide variety of input parameters and output data makes it extremely flexible for application to a number of diverse applications. With the demonstration of favorable model evaluation results, the MAPPS model will represent a valuable source of NPP maintainer reliability data and provide PRA studies with a source of data on maintainers that has previously not existed

  9. Baseline requirements of the proposed action for the Transportation Management Division routing models

    International Nuclear Information System (INIS)

    Johnson, P.E.; Joy, D.S.

    1995-02-01

    The potential impacts associated with the transportation of hazardous materials are important to shippers, carriers, and the general public. This is particularly true for shipments of radioactive material. The shippers are primarily concerned with safety, security, efficiency, and equipment requirements. The carriers are concerned with the potential impact that radioactive shipments may have on their operations--particularly if such materials are involved in an accident. The general public has also expressed concerns regarding the safety of transporting radioactive and other hazardous materials through their communities. Because transportation routes are a central concern in hazardous material transport, the prediction of likely routes is the first step toward resolution of these issues. In response to these routing needs, several models have been developed over the past fifteen years at Oak Ridge National Laboratory (ORNL). The HIGHWAY routing model is used to predict routes for truck transportation, the INTERLINE routing model is used to predict both rail and barge routes, and the AIRPORT locator model is used to determine airports with specified criteria near a specific location. As part of the ongoing improvement of the US Department of Energy's (DOE) Environmental Management Transportation Management Division's (EM-261) computer systems and development efforts, a Baseline Requirements Assessment Session on the HIGHWAY, INTERLINE, and AIRPORT models was held at ORNL on April 27, 1994. The purpose of this meeting was to discuss the existing capabilities of the models and data bases and to review enhancements of the models and data bases to expand their usefulness. The results of the Baseline Requirements Assessment Session will be discussed in this report. The discussions pertaining to the different models are contained in separate sections

  10. Biomechanical modelling and evaluation of construction jobs for performance improvement.

    Science.gov (United States)

    Parida, Ratri; Ray, Pradip Kumar

    2012-01-01

    Occupational risk factors related to construction manual materials handling (MMH) activities, such as awkward posture, repetition, lack of rest, insufficient illumination and heavy workload, may cause musculoskeletal disorders and poor worker performance; ergonomic design of construction worksystems was therefore a critical need for improving worker health and safety, for which dynamic biomechanical models had to be empirically developed and tested at a construction site of Tata Steel, the largest private-sector steel maker in India. In this study, a comprehensive framework is proposed for biomechanical evaluation of shovelling and grinding under diverse work environments. The benefit of such an analysis lies in its usefulness in setting guidelines for designing such jobs to minimize the risk of musculoskeletal disorders (MSDs) and in reinforcing correct methods of carrying out the jobs, leading to reduced fatigue and physical stress. Data based on direct observations and videography were collected for the shovellers and grinders over a number of work cycles. Compressive forces and moments for a number of segments and joints are computed with respect to joint flexion and extension. The results indicate that moments and compressive forces at the L5/S1 link are significant for shovellers while moments at the elbow and wrist are significant for grinders.

  11. A navigational evaluation model for content management systems

    International Nuclear Information System (INIS)

    Gilani, S.; Majeed, A.

    2016-01-01

    Web applications are widely used worldwide; however, it is important that the navigation of these websites is effective in order to enhance usability. Navigation is not limited to links between pages; it is also how we complete a task. Navigational structure presented as hypertext is one of the most important components of a Web application besides content and presentation. The main objective of this paper is to explore the navigational structure of various open source Content Management Systems (CMSs) from the developer's perspective. For this purpose three CMSs are chosen: WordPress, Joomla, and Drupal. The objective of the research is to identify the important navigational aspects present in these CMSs, and to carry out a comparative study of the CMSs in terms of navigational support. For this purpose an industrial survey was conducted based on our proposed navigational evaluation model. The results show that there exist correlations between the identified factors and that these CMSs provide helpful and effective navigational support to their users. (author)

  12. A Linguistic Multigranular Sensory Evaluation Model for Olive Oil

    Directory of Open Access Journals (Sweden)

    Luis Martinez

    2008-06-01

    Full Text Available Evaluation is a process that analyzes elements in order to achieve different objectives, such as quality inspection and marketing, in industrial companies. This paper focuses on sensory evaluation, where the evaluated items are assessed by a panel of experts according to knowledge acquired via the human senses. In these evaluation processes the information provided by the experts involves uncertainty, vagueness and imprecision. The use of the Fuzzy Linguistic Approach has provided successful results in modelling such information. In sensory evaluation it may happen that the experts on the panel have differing degrees of knowledge about the evaluated items or indicators, so it seems suitable that each expert could express their preferences in different linguistic term sets based on their own knowledge. In this paper, we present a sensory evaluation model that manages a multigranular linguistic evaluation framework based on a decision analysis scheme. This model is applied to the sensory evaluation process of olive oil.

  13. Requirements on mechanistic NPP models used in CSS for diagnostics and predictions

    International Nuclear Information System (INIS)

    Juslin, K.

    1996-01-01

    Mechanistic models have been used successfully for several years for operator support in electric power dispatching centres. Some models of limited scope have already been in use at nuclear power plants. It is considered that advanced mechanistic models, in combination with present computer technology, could preferably also be used in Computerized Support Systems (CSS) for the assistance of Nuclear Power Plant (NPP) operators. Requirements with respect to accuracy, validity range, speed, flexibility and level of detail for the models used for such purposes are discussed. Quality Assurance, Verification and Validation efforts are considered. A long-term commitment in the field of mechanistic modelling and real-time simulation is considered the key to successful implementations. The Advanced PROcess Simulation (APROS) code system and simulation environment developed at the Technical Research Centre of Finland (VTT) is intended also for CSS applications in NPP control rooms. (author). 4 refs

  14. On meeting capital requirements with a chance-constrained optimization model.

    Science.gov (United States)

    Atta Mills, Ebenezer Fiifi Emire; Yu, Bo; Gu, Lanlan

    2016-01-01

    This paper deals with a capital to risk asset ratio chance-constrained optimization model in the presence of loans, treasury bills, fixed assets and non-interest-earning assets. To model the dynamics of loans, we introduce a modified CreditMetrics approach. This leads to the development of a deterministic convex counterpart of the capital to risk asset ratio chance constraint. We analyze our model under the worst-case scenario, i.e., loan default. The theoretical model is analyzed by applying numerical procedures, in order to derive valuable insights from a financial outlook. Our results suggest that our capital to risk asset ratio chance-constrained optimization model guarantees banks of meeting the capital requirements of Basel III with a likelihood of 95% irrespective of changes in the future market value of assets.
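    The paper's convex counterpart is built from a modified CreditMetrics model, which the abstract does not detail. As a simplified stand-in, the sketch below applies the textbook normal-approximation deterministic equivalent of the 95% chance constraint; all figures are invented.

```python
from scipy.stats import norm

# P(ratio >= k_min) >= alpha, with ratio ~ Normal(mu, sigma) (simplifying assumption;
# the paper derives its convex counterpart from a modified CreditMetrics model instead).
mu_ratio, sigma_ratio = 0.115, 0.012   # invented mean / std. dev. of the capital ratio
alpha, k_min = 0.95, 0.08              # illustrative Basel III-style capital floor

z = norm.ppf(alpha)
worst_tolerated = mu_ratio - z * sigma_ratio   # deterministic equivalent of the constraint
satisfied = worst_tolerated >= k_min
print(f"worst-tolerated ratio at {alpha:.0%}: {worst_tolerated:.4f} -> "
      f"{'meets' if satisfied else 'violates'} the {k_min:.0%} floor")
```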

  15. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    The amount of data available on the Internet is continuously increasing, and consequently there is a growing need for tools that help to analyse the data. Testing of consistency among data received from different sources is made difficult by the number of different languages and schemas being used... In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling...; an important part of this technique is the attachment of OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiating the whole integration...

  16. Modeling Requirements for Simulating the Effects of Extreme Acts of Terrorism: A White Paper

    Energy Technology Data Exchange (ETDEWEB)

    Allen, M.; Hiebert-Dodd, K.; Marozas, D.; Paananen, O.; Pryor, R.J.; Reinert, R.K.

    1998-10-01

    This white paper presents the initial requirements for developing a new computer model for simulating the effects of extreme acts of terrorism in the United States. General characteristics of the model are proposed and the level of effort to prepare a complete written description of the model, prior to coding, is detailed. The model would simulate the decision processes and interactions of complex U. S. systems engaged in responding to and recovering from four types of terrorist incidents. The incident scenarios span the space of extreme acts of terrorism that have the potential to affect not only the impacted area, but also the entire nation. The model would be useful to decision-makers in assessing and analyzing the vulnerability of the nation's complex infrastructures, in prioritizing resources to reduce risk, and in planning strategies for immediate response and for subsequent recovery from terrorist incidents.

  17. LINDOZ model for Finland environment: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Galeriu, D.; Apostoaie, A.I.; Mocanu, N.; Paunescu, N.

    1996-01-01

    The LINDOZ model was developed as a realistic assessment tool for radioactive contamination of the environment. It was designed to produce estimates of the pollutant concentration in different compartments of the terrestrial ecosystem (soil, vegetation, animal tissue, and animal products) and to evaluate human exposure to the contaminant (concentration in the whole human body, and dose to humans) from inhalation, ingestion and external irradiation. The user can apply LINDOZ to both routine and accidental types of releases. 2 figs, 2 tabs

  18. Three new models for evaluation of standard involute spur gear mesh stiffness

    Science.gov (United States)

    Liang, Xihui; Zhang, Hongsheng; Zuo, Ming J.; Qin, Yong

    2018-02-01

    Time-varying mesh stiffness is one of the main internal excitation sources of gear dynamics. Accurate evaluation of gear mesh stiffness is crucial for gear dynamic analysis. This study is devoted to developing new models for spur gear mesh stiffness evaluation. Three models are proposed. The proposed model 1 gives very accurate mesh stiffness results, but the gear bore surface must be assumed to be rigid. Building on model 1, our research finds that the angular deflection pattern of the gear bore surface of a pair of meshing gears under a constant torque basically follows a cosine curve. Based on this finding, two other models are proposed. The proposed model 2 evaluates gear mesh stiffness by using angular deflections at different circumferential angles of an end-surface circle of the gear bore. The proposed model 3 requires only the angular deflection at an arbitrary circumferential angle of an end-surface circle of the gear bore, but it can only be used for a gear whose teeth all share the same tooth profile. The proposed models are accurate in gear mesh stiffness evaluation and easy to use. Finite element analysis is used to validate the accuracy of the proposed models.
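
    The cosine-pattern finding lends itself to a small numerical illustration. The sketch below fits delta(phi) = a + b*cos(phi - phi0) to synthetic angular deflections of the bore end surface by linear least squares; the data, the torque value and the crude stiffness estimate are all invented and are not the authors' models.

```python
import numpy as np

# Hypothetical bore-surface deflection samples following a cosine pattern.
phi = np.linspace(0.0, 2 * np.pi, 36, endpoint=False)
rng = np.random.default_rng(0)
delta = 2e-4 + 5e-5 * np.cos(phi - 0.3) + rng.normal(0, 1e-6, phi.size)

# Linear least squares on delta = a + c1*cos(phi) + c2*sin(phi).
A = np.column_stack([np.ones_like(phi), np.cos(phi), np.sin(phi)])
a, c1, c2 = np.linalg.lstsq(A, delta, rcond=None)[0]
b, phi0 = np.hypot(c1, c2), np.arctan2(c2, c1)  # amplitude and phase

torque = 100.0                       # N*m, assumed constant input torque
k_mesh = torque / a                  # simplistic stiffness proxy (N*m/rad)
print(f"amplitude={b:.2e} rad, phase={phi0:.2f} rad, k~{k_mesh:.2e}")
```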

  19. Knowledge representation requirements for model sharing between model-based reasoning and simulation in process flow domains

    Science.gov (United States)

    Throop, David R.

    1992-01-01

    The paper examines the requirements for the reuse of computational models employed in model-based reasoning (MBR) to support automated inference about mechanisms. Areas in which the theory of MBR is not yet completely adequate for using the information that simulations can yield are identified, and recent work in these areas is reviewed. It is argued that using MBR along with simulations forces the use of specific fault models. Fault models are used so that a particular fault can be instantiated into the model and run. This in turn implies that the component specification language needs to be capable of encoding any fault that might need to be sensed or diagnosed. It also means that the simulation code must anticipate all these faults at the component level.

  20. Using a Mixed Model to Explore Evaluation Criteria for Bank Supervision: A Banking Supervision Law Perspective

    Science.gov (United States)

    Tsai, Sang-Bing; Chen, Kuan-Yu; Zhao, Hongrui; Wei, Yu-Min; Wang, Cheng-Kuang; Zheng, Yuxiang; Chang, Li-Chung; Wang, Jiangtao

    2016-01-01

    Financial supervision means that monetary authorities have the power to supervise and manage financial institutions according to laws. Monetary authorities have this power because of the requirements of improving financial services, protecting the rights of depositors, adapting to industrial development, ensuring financial fair trade, and maintaining stable financial order. To establish evaluation criteria for bank supervision in China, this study integrated fuzzy theory and the decision-making trial and evaluation laboratory (DEMATEL) method and proposed a fuzzy-DEMATEL model. First, fuzzy theory was applied to examine bank supervision criteria and analyze fuzzy semantics. Second, the fuzzy-DEMATEL model was used to calculate the degree to which financial supervision criteria mutually influence one another and their causal relationships. Finally, an evaluation criteria model for evaluating bank and financial supervision was established. PMID:27992449

  1. Using a Mixed Model to Explore Evaluation Criteria for Bank Supervision: A Banking Supervision Law Perspective.

    Directory of Open Access Journals (Sweden)

    Sang-Bing Tsai

    Full Text Available Financial supervision means that monetary authorities have the power to supervise and manage financial institutions according to laws. Monetary authorities have this power because of the requirements of improving financial services, protecting the rights of depositors, adapting to industrial development, ensuring financial fair trade, and maintaining stable financial order. To establish evaluation criteria for bank supervision in China, this study integrated fuzzy theory and the decision-making trial and evaluation laboratory (DEMATEL) method and proposed a fuzzy-DEMATEL model. First, fuzzy theory was applied to examine bank supervision criteria and analyze fuzzy semantics. Second, the fuzzy-DEMATEL model was used to calculate the degree to which financial supervision criteria mutually influence one another and their causal relationships. Finally, an evaluation criteria model for evaluating bank and financial supervision was established.
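
    For readers unfamiliar with DEMATEL, the sketch below shows the crisp core of the method (the fuzzy-semantics step is omitted): normalize a direct-influence matrix, compute the total-relation matrix T = D(I - D)^-1, and derive the prominence (r+c) and relation (r-c) indicators. The 3x3 matrix of expert ratings is hypothetical.

```python
import numpy as np

# Direct-influence ratings among three hypothetical supervision criteria,
# on a 0-4 scale (rows influence columns).
X = np.array([[0, 3, 2],
              [1, 0, 3],
              [2, 1, 0]], dtype=float)

D = X / max(X.sum(axis=1).max(), X.sum(axis=0).max())  # normalize
T = D @ np.linalg.inv(np.eye(len(D)) - D)              # total relation
r, c = T.sum(axis=1), T.sum(axis=0)
print("prominence r+c:", r + c)   # overall importance of each criterion
print("relation   r-c:", r - c)   # >0 cause group, <0 effect group
```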

  2. Modeling atmospheric dispersion for reactor accident consequence evaluation

    International Nuclear Information System (INIS)

    Alpert, D.J.; Gudiksen, P.H.; Woodard, K.

    1982-01-01

    Atmospheric dispersion models are a central part of computer codes for the evaluation of potential reactor accident consequences. A variety of approaches exist that treat, to varying degrees, the many physical processes that can affect the predicted consequences. The currently available models are reviewed, and their capabilities and limitations, as applied to reactor accident consequence analyses, are discussed.
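
    One of the simplest dispersion treatments reviewed in such codes is the Gaussian plume. The sketch below is a generic textbook implementation with ground reflection, not the model of any specific consequence code; the dispersion-coefficient formulas approximate Briggs-style rural class-D curves, and all input values are illustrative.

```python
import numpy as np

def plume_concentration(q, u, x, y, z, h):
    """Gaussian plume with ground reflection.
    q: release rate (Bq/s), u: wind speed (m/s), h: release height (m),
    x: downwind, y: crosswind, z: vertical coordinates (m)."""
    sigma_y = 0.08 * x * (1 + 0.0001 * x) ** -0.5   # illustrative spread
    sigma_z = 0.06 * x * (1 + 0.0015 * x) ** -0.5
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - h)**2 / (2 * sigma_z**2)) +
                np.exp(-(z + h)**2 / (2 * sigma_z**2)))  # image source
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level concentration 1 km downwind of an elevated release:
print(plume_concentration(1e9, 5.0, 1000.0, 0.0, 0.0, 50.0))  # Bq/m^3
```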

  3. Multi-criteria comparative evaluation of spallation reaction models

    Science.gov (United States)

    Andrianov, Andrey; Andrianova, Olga; Konobeev, Alexandr; Korovin, Yury; Kuptsov, Ilya

    2017-09-01

    This paper presents an approach to a comparative evaluation of the predictive ability of spallation reaction models based on widely used, well-proven multiple-criteria decision analysis methods (MAVT/MAUT, AHP, TOPSIS, PROMETHEE), and the results of such a comparison for 17 spallation reaction models describing the interaction of high-energy protons with natPb.
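
    To give a flavor of the MCDA machinery involved, the sketch below implements plain TOPSIS, one of the methods named above, for a hypothetical score matrix in which rows are spallation models and columns are benefit-type agreement criteria; the scores and weights are invented.

```python
import numpy as np

def topsis(scores, weights):
    """Rank alternatives by closeness to the ideal solution (benefit criteria)."""
    norm = scores / np.linalg.norm(scores, axis=0)   # vector-normalize columns
    v = norm * weights
    ideal, anti = v.max(axis=0), v.min(axis=0)       # best / worst per criterion
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                   # closeness in [0, 1]

scores = np.array([[0.82, 0.70, 0.90],   # rows: models, cols: criteria
                   [0.78, 0.88, 0.85],
                   [0.90, 0.60, 0.70]])
print(topsis(scores, np.array([0.5, 0.3, 0.2])))     # higher = better model
```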

  4. Evaluating energy efficiency policies with energy-economy models

    NARCIS (Netherlands)

    Mundaca, L.; Neij, L.; Worrell, E.; McNeil, M.

    2010-01-01

    The growing complexities of energy systems, environmental problems, and technology markets are driving and testing most energy-economy models to their limits. To further advance bottom-up models from a multidisciplinary energy efficiency policy evaluation perspective, we review and critically...

  5. The fence experiment - a first evaluation of shelter models

    DEFF Research Database (Denmark)

    Peña, Alfredo; Bechmann, Andreas; Conti, Davide

    2016-01-01

    We present a preliminary evaluation of shelter models of different degrees of complexity using full-scale lidar measurements of the shelter on a vertical plane behind and orthogonal to a fence. Model results accounting for the distribution of the relative wind direction within the observed direct...

  6. Evaluating Emulation-based Models of Distributed Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Stephen T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Gabert, Kasimir G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Tarman, Thomas D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Emulytics Initiatives

    2017-08-01

    Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The various uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  7. Boussinesq Modeling of Wave Propagation and Runup over Fringing Coral Reefs, Model Evaluation Report

    National Research Council Canada - National Science Library

    Demirbilek, Zeki; Nwogu, Okey G

    2007-01-01

    ..., for waves propagating over fringing reefs. The model evaluation had two goals: (a) investigate differences between laboratory and field characteristics of wave transformation processes over reefs, and (b...

  8. Applicability evaluation on the conservative metal-water reaction(MWR) model implemented into the SPACE code

    International Nuclear Information System (INIS)

    Lee, Suk Ho; You, Sung Chang; Kim, Han Gon

    2011-01-01

    The SBLOCA (Small Break Loss-of-Coolant Accident) evaluation methodology for the APR1400 (Advanced Power Reactor 1400) is under development using the SPACE code. The goal of this development is to set up a conservative evaluation methodology in accordance with Appendix K of 10CFR50 by the end of 2012. In order to develop the Appendix K version of the SPACE code, modification of the code is considered through implementation of the required evaluation models. The conservative models required in the SPACE code are the metal-water reaction (MWR) model, the critical flow model, the Critical Heat Flux (CHF) model and the post-CHF model. At present, the integration of these models to generate the Appendix K version of SPACE is in its preliminary stage. Among them, the conservative MWR model and its applicability within the code are introduced in this paper
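
    Appendix K prescribes a conservative parabolic rate law of the Baker-Just type for the zirconium-water reaction. The sketch below shows such a parabolic update; the rate constants are the commonly quoted Baker-Just values and should be treated as illustrative, not as the coefficients actually coded in SPACE.

```python
import numpy as np

# Commonly quoted Baker-Just constants (check against code documentation):
A  = 33.3e6      # (mg Zr/cm^2)^2 / s, rate constant
EA = 45_500.0    # cal/mol, activation energy
R  = 1.987       # cal/(mol K), gas constant

def zr_reacted(temperature_k, dt, w0=0.0):
    """Advance the reacted-zirconium mass w (mg/cm^2) over one step dt (s)
    using the parabolic law w^2 = w0^2 + K(T) * dt."""
    k = A * np.exp(-EA / (R * temperature_k))
    return np.sqrt(w0**2 + k * dt)

print(zr_reacted(1500.0, 10.0))   # mg Zr/cm^2 reacted after 10 s at 1500 K
```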

  9. Ergonomic evaluation model of operational room based on team performance

    Directory of Open Access Journals (Sweden)

    YANG Zhiyi

    2017-05-01

    Full Text Available A theoretical calculation model based on the ergonomic evaluation of team performance was proposed in order to carry out the ergonomic evaluation of layout design schemes for the action stations in a multitasking operational room. The model calculates and compares the theoretical team performance of multiple layout schemes by considering such substantial influential factors as frequency of communication, distance, angle, importance, and human cognitive characteristics. An experiment was finally conducted to verify the proposed model under the criteria of completion time and accuracy rating. As illustrated by the experimental results, the proposed approach is conducive to the prediction and ergonomic evaluation of layout design schemes for action stations during early design stages, and provides a new theoretical method for the ergonomic evaluation, selection and optimization of layout design schemes.

  10. Human Thermal Model Evaluation Using the JSC Human Thermal Database

    Science.gov (United States)

    Bue, Grant; Makinen, Janice; Cognata, Thomas

    2012-01-01

    Human thermal modeling has considerable long-term utility to human space flight. Such models provide a tool to predict crew survivability in support of vehicle design and to evaluate crew response in untested space environments. It is to the benefit of any such model not only to collect relevant experimental data to correlate against, but also to maintain an experimental standard or benchmark for future development in a readily and rapidly searchable, software-accessible format. The human thermal database project is intended to do just that: to collect relevant data from literature and experimentation and to store the data in a database structure for immediate and future use as a benchmark against which human thermal models can be judged, in identifying model strengths and weaknesses, supporting model development and improved correlation, and statistically quantifying a model's predictive quality. The human thermal database developed at the Johnson Space Center (JSC) is intended to evaluate a set of widely used human thermal models. This set includes the Wissler human thermal model, a model that has been widely used to predict the human thermoregulatory response to a variety of cold and hot environments. These models are statistically compared to the current database, which contains experiments on human subjects primarily in air, drawn from a literature survey ranging between 1953 and 2004 and from a suited experiment recently performed by the authors, for a quantitative study of the relative strength and predictive quality of the models.

  11. A smart growth evaluation model based on data envelopment analysis

    Science.gov (United States)

    Zhang, Xiaokun; Guan, Yongyi

    2018-04-01

    With the rapid spread of urbanization, smart growth (SG) has attracted plenty of attention from all over the world. In this paper, through the establishment of an index system for smart growth, a data envelopment analysis (DEA) model is suggested to evaluate the SG level of the current growth situation in cities. In order to further improve the information in both the radial and non-radial directions of detection, we introduce the non-Archimedean infinitesimal to form the C2GS2 control model. Finally, we evaluated the SG level in Canberra and identified a series of problems, which verifies the applicability of the model and provides further information for improvement.
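
    To make the DEA step concrete, the sketch below solves the classical input-oriented CCR efficiency LP for each decision-making unit with scipy; it is a generic textbook formulation, not the paper's C2GS2 variant, and the city input/output data are invented.

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 6.0, 5.0],     # inputs  (rows) per city (columns)
              [3.0, 2.0, 4.0]])
Y = np.array([[6.0, 7.0, 5.0]])    # outputs (rows) per city (columns)

def ccr_efficiency(o):
    """Input-oriented CCR: min theta s.t. X@lam <= theta*x_o, Y@lam >= y_o."""
    n = X.shape[1]
    c = np.r_[1.0, np.zeros(n)]                   # variables: [theta, lam]
    A_in  = np.c_[-X[:, [o]], X]                  # X@lam - theta*x_o <= 0
    A_out = np.c_[np.zeros((Y.shape[0], 1)), -Y]  # -Y@lam <= -y_o
    A_ub  = np.vstack([A_in, A_out])
    b_ub  = np.r_[np.zeros(X.shape[0]), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun                                # theta = 1 means efficient

print([round(ccr_efficiency(o), 3) for o in range(X.shape[1])])
```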

  12. Literature Review on Modeling Cyber Networks and Evaluating Cyber Risks.

    Energy Technology Data Exchange (ETDEWEB)

    Kelic, Andjelka; Campbell, Philip L

    2018-04-01

    The National Infrastructure Simulations and Analysis Center (NISAC) conducted a literature review on modeling cyber networks and evaluating cyber risks. The literature review explores where modeling is used in the cyber regime and ways that consequence and risk are evaluated. The relevant literature clusters in three different spaces: network security, cyber-physical, and mission assurance. In all approaches, some form of modeling is utilized at varying levels of detail, while the ability to understand consequence varies, as do interpretations of risk. This document summarizes the different literature viewpoints and explores their applicability to securing enterprise networks.

  13. Anticonvulsive evaluation of THIP in the murine pentylenetetrazole kindling model

    DEFF Research Database (Denmark)

    Simonsen, Charlotte; Boddum, Kim; von Schoubye, Nadia L

    2017-01-01

    ... Evaluation of THIP as a potential anticonvulsant has given contradictory results in different animal models; for this reason, we reevaluated the anticonvulsive properties of THIP in the murine pentylenetetrazole (PTZ) kindling model. As loss of δ-GABAA R in the dentate gyrus has been associated...... the observed upregulation of δ-GABAA Rs. Even in the demonstrated presence of functional δ-GABAA Rs, THIP (0.5-4 mg/kg) showed no anticonvulsive effect in the PTZ kindling model in a comprehensive in vivo evaluation of its anticonvulsive properties....

  14. Evaluation of Roof Bolting Requirements Based on In-Mine Roof Bolter Drilling

    Energy Technology Data Exchange (ETDEWEB)

    Syd S. Peng

    2005-10-01

    Roof bolting is the most popular support method for underground openings in the mining industry, especially in bedded deposits such as coal. In fact, all U.S. underground coal mine entries are roof-bolted as required by law. However, roof falls still occur frequently in roof-bolted entries. The two likely reasons are the lack of knowledge of, and technology to detect, roof geological conditions in advance of mining, and the lack of roof bolting design criteria for modern roof bolting systems. This research develops a method for predicting the roof geology and stability condition in real time during the roof bolting operation. Based on this information, roof bolting design criteria for modern roof bolting systems will be developed for implementation in real time. For the prediction of roof geology and stability condition in real time, a microprocessor was used and a program developed to monitor and record the drilling parameters of the roof bolter. These parameters include feed pressure, feed flow (penetration rate), rotation pressure, rotation rate, vacuum pressure, oil temperature of the hydraulic circuit, and machine control signals. From the results of a series of laboratory and underground tests so far, feed pressure is found to be a good indicator for identifying voids/fractures and estimating roof rock strength. A method was developed for quantitatively determining the location and size of voids/fractures and for estimating roof rock strength from the drilling parameters of the roof bolter. Also, a set of computational rules has been developed for characterizing in-mine roof from measured drilling parameters and implemented in MRGIS (Mine Roof Geology Information System), a software package that allows mine engineers to make use of the large volume of drilling parameters for predicting roof geology properties automatically. For the development of roof bolting criteria, finite element models were developed for tensioned and fully grouted bolting
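
    The feed-pressure indicator can be illustrated with a simple threshold rule: flag depth intervals where the logged feed pressure drops well below a local moving-average baseline. The sketch below is a hypothetical illustration, not the rule set implemented in MRGIS; the data and the 0.6 threshold are invented.

```python
import numpy as np

depth = np.arange(0, 1.2, 0.01)            # m, from the penetration-rate log
feed = 9.0 + 0.5 * np.sin(depth * 8)       # MPa, synthetic baseline drilling
feed[45:52] = 3.0                          # simulated response to a void

baseline = np.convolve(feed, np.ones(15) / 15, mode="same")  # local average
void_mask = feed < 0.6 * baseline          # sustained pressure drop

# Convert the boolean mask into contiguous [start, stop) depth intervals.
edges = np.flatnonzero(np.diff(np.r_[0, void_mask.astype(int), 0]))
for start, stop in edges.reshape(-1, 2):
    print(f"void/fracture from {depth[start]:.2f} m to {depth[stop - 1]:.2f} m")
```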

  15. Application of Learning Curves for Didactic Model Evaluation: Case Studies

    Directory of Open Access Journals (Sweden)

    Felix Mödritscher

    2013-01-01

    Full Text Available The success of (online) courses depends, among other factors, on the underlying didactical models, which have traditionally been evaluated with qualitative and quantitative research methods. Several new evaluation techniques have been developed and established in recent years. One of them is 'learning curves', which aim at measuring the error rates of users as they interact with adaptive educational systems, thereby enabling the underlying models to be evaluated and improved. In this paper, we report how we applied this new method in two case studies to show that learning curves are useful for evaluating didactical models and their implementation in educational platforms. Results show that the error rates follow a power-law distribution with each additional attempt if the didactical model of an instructional unit is valid. Furthermore, the initial error rate, the slope of the curve and the goodness of fit of the curve are valid indicators for the difficulty level of a course and the quality of its didactical model. As a conclusion, the idea of applying learning curves for evaluating didactical models on the basis of usage data is considered valuable for supporting teachers and learning content providers in improving their online courses.
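
    A learning curve of this kind can be characterized with a log-log linear fit. The sketch below recovers the initial error rate, the slope and a goodness-of-fit measure from made-up attempt/error data; it illustrates the indicators discussed in the abstract rather than the authors' exact procedure.

```python
import numpy as np

# Synthetic error rates over successive attempts, roughly power-law shaped.
attempts = np.arange(1, 11)
error_rate = np.array([0.52, 0.34, 0.27, 0.23, 0.20,
                       0.18, 0.17, 0.155, 0.147, 0.14])

# Fit E(n) = a * n**(-b) by linear regression in log-log space.
slope, intercept = np.polyfit(np.log(attempts), np.log(error_rate), 1)
a, b = np.exp(intercept), -slope
pred = a * attempts ** (-b)
r2 = 1 - np.sum((error_rate - pred) ** 2) / np.sum(
    (error_rate - error_rate.mean()) ** 2)
print(f"initial error a={a:.2f}, slope b={b:.2f}, goodness of fit R^2={r2:.3f}")
```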

  16. Integrating Usability Evaluation into Model-Driven Video Game Development

    OpenAIRE

    Fernandez , Adrian; Insfran , Emilio; Abrahão , Silvia; Carsí , José ,; Montero , Emanuel

    2012-01-01

    Part 3: Short Papers; International audience; The increasing complexity of video game development highlights the need for design and evaluation methods that enhance quality and reduce time and cost. In this context, Model-Driven Development approaches seem very promising, since a video game can be obtained by transforming platform-independent models into platform-specific models that can in turn be transformed into code. Although this approach is starting to be used for video game de...

  17. Standardizing the performance evaluation of short-term wind prediction models

    DEFF Research Database (Denmark)

    Madsen, Henrik; Pinson, Pierre; Kariniotakis, G.

    2005-01-01

    Short-term wind power prediction is a primary requirement for efficient large-scale integration of wind generation in power systems and electricity markets. The choice of an appropriate prediction model among the numerous available models is not trivial, and has to be based on an objective evaluation of model performance. This paper proposes a standardized protocol for the evaluation of short-term wind-power prediction systems. A number of reference prediction models are also described, and their use for performance comparison is analysed. The use of the protocol is demonstrated using results from both onshore and offshore wind farms. The work was developed in the frame of the Anemos project (EU R&D project), in which the protocol has been used to evaluate more than 10 prediction systems.
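
    Two headline scores in evaluation protocols of this kind are the mean absolute and root-mean-square errors normalized by installed capacity, typically reported per look-ahead time against a simple reference such as persistence. The sketch below computes both for made-up data; it is an illustration, not the Anemos reference implementation.

```python
import numpy as np

def nmae(p_pred, p_meas, capacity):
    """Mean absolute error normalized by installed capacity."""
    return np.mean(np.abs(p_pred - p_meas)) / capacity

def nrmse(p_pred, p_meas, capacity):
    """Root-mean-square error normalized by installed capacity."""
    return np.sqrt(np.mean((p_pred - p_meas) ** 2)) / capacity

p_meas = np.array([12.0, 15.5, 9.8, 20.1])   # MW, measured production
p_pers = np.array([11.0, 12.0, 15.5, 9.8])   # MW, persistence reference
cap = 30.0                                    # MW, installed capacity
print(nmae(p_pers, p_meas, cap), nrmse(p_pers, p_meas, cap))
```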

  18. Animal models for evaluation of oral delivery of biopharmaceuticals

    DEFF Research Database (Denmark)

    Harloff-Helleberg, Stine; Nielsen, Line Hagner; Nielsen, Hanne Mørck

    2017-01-01

    of systems for oral delivery of biopharmaceuticals may result in new treatment modalities that increase patient compliance and reduce product cost. In the preclinical development phase, the use of experimental animal models is essential for the evaluation of new formulation designs. In general, a limited oral bioavailability of biopharmaceuticals, of just a few percent, is expected, and therefore the animal models and the experimental settings must be chosen with utmost care. More knowledge of and focus on this topic is highly needed, despite the experience from numerous studies evaluating animal models for oral delivery of small-molecule drugs. This review highlights and discusses the pros and cons of the most commonly used animal models and settings. Additionally, it also looks into the influence of anesthetics and sampling methods on the evaluation of drug delivery systems for oral delivery of biopharmaceuticals...

  19. Evaluation of Shipbuilding CAD/CAM/CIM Systems - Phase II (Requirements for Future Systems)

    National Research Council Canada - National Science Library

    Horvath, John; Ross, Jonathan M

    1997-01-01

    .... The purpose of the analysis was twofold: 1. To describe the requirements of a competitive, future-oriented computer-aided design/computer-aided manufacturing/computer-integrated management (CAD/CAM/CIM...

  20. Evaluating performances of simplified physically based landslide susceptibility models.

    Science.gov (United States)

    Capparelli, Giovanna; Formetta, Giuseppe; Versace, Pasquale

    2015-04-01

    Rainfall-induced shallow landslides cause significant damage, including loss of life and property. Predicting locations susceptible to shallow landslides is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Usually, two main approaches are used to accomplish this task: statistical or physically based models. This paper presents a package of GIS-based models for landslide susceptibility analysis, integrated into the NewAge-JGrass hydrological model using the Object Modeling System (OMS) modeling framework. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit (GOF) indices by comparing model results and measurement data pixel by pixel. Moreover, the integration in NewAge-JGrass allows the use of other components, such as geographic information system tools to manage input-output processes, and automatic calibration algorithms to estimate model parameters. The system offers the possibility to investigate and fairly compare the quality and robustness of models and model parameters, according to a procedure that includes: i) model parameter estimation by optimizing each GOF index separately, ii) model evaluation in the ROC plane using each optimal parameter set, and iii) GOF robustness evaluation by assessing sensitivity to input parameter variation. This procedure was repeated for all three models. The system was applied to a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, Average Index (AI) optimization coupled with model M3 is the best modeling solution for our test case. This research was funded by PON Project No. 01_01503 "Integrated Systems for Hydrogeological Risk
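
    Pixel-by-pixel GOF indices of this kind reduce to a confusion matrix between the model map and a landslide inventory map. The sketch below computes the hit rate and false-alarm rate that locate a model in the ROC plane, plus simple accuracy; the arrays are invented, and the paper's specific Average Index is not reproduced here.

```python
import numpy as np

model = np.array([1, 1, 0, 0, 1, 0, 1, 0])      # 1 = predicted unstable pixel
observed = np.array([1, 0, 0, 0, 1, 1, 1, 0])   # 1 = mapped landslide pixel

tp = np.sum((model == 1) & (observed == 1))     # hits
fp = np.sum((model == 1) & (observed == 0))     # false alarms
fn = np.sum((model == 0) & (observed == 1))     # misses
tn = np.sum((model == 0) & (observed == 0))     # correct negatives

tpr = tp / (tp + fn)            # hit rate (y-axis of the ROC plane)
fpr = fp / (fp + tn)            # false-alarm rate (x-axis of the ROC plane)
acc = (tp + tn) / model.size    # one simple aggregate index
print(tpr, fpr, acc)
```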

  1. On global and regional spectral evaluation of global geopotential models

    International Nuclear Information System (INIS)

    Ustun, A; Abbak, R A

    2010-01-01

    Spectral evaluation of global geopotential models (GGMs) is necessary to understand the behaviour of the gravity signal and its error as recorded in the spherical harmonic coefficients and associated standard deviations. Results obtained in this way explain the overall contribution of gravity data of different kinds, representing various sections of the gravity spectrum. This method is more informative than accuracy assessment methods that use external data such as GPS-levelling. Comparative spectral evaluation of more than one model can be performed both globally and locally using many spectral tools. The number of GGMs has grown with the increasing volume of data collected by the dedicated satellite gravity missions CHAMP, GRACE and GOCE. This fact makes it necessary to measure the differences between models and to monitor the improvements in gravity field recovery. In this paper, some of the satellite-only and combined models are examined at different scales, globally and regionally, in order to observe the advances in GGM modelling and the models' strengths at various expansion degrees for geodetic and geophysical applications. The validation of the published errors of model coefficients is part of this evaluation. All spectral tools explicitly reveal the superiority of the GRACE-based models when compared against models that comprise conventional satellite tracking data. The disagreement between models is large in local/regional areas if the underlying data sets differ, as seen from the example of the Turkish territory
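
    A basic spectral tool used in such evaluations is the degree variance. The sketch below accumulates signal and error degree variances from spherical harmonic coefficients and their standard deviations; the array layout and the random stand-in values are assumptions for illustration only.

```python
import numpy as np

def degree_variances(C, S, sC, sS, n_max):
    """Per-degree signal and error variances from coefficient arrays
    indexed [n, m] (entries with m > n are zero)."""
    signal = np.array([np.sum(C[n, :n + 1]**2 + S[n, :n + 1]**2)
                       for n in range(2, n_max + 1)])
    error = np.array([np.sum(sC[n, :n + 1]**2 + sS[n, :n + 1]**2)
                      for n in range(2, n_max + 1)])
    # Where error exceeds signal, the model adds mostly noise at that degree.
    return signal, error

# Toy usage with random arrays standing in for a real model's coefficients:
n_max = 60
rng = np.random.default_rng(1)
C = rng.normal(0, 1e-8, (n_max + 1, n_max + 1))
S = rng.normal(0, 1e-8, (n_max + 1, n_max + 1))
sC = np.abs(rng.normal(0, 1e-9, C.shape))
sS = np.abs(rng.normal(0, 1e-9, S.shape))
print(degree_variances(C, S, sC, sS, n_max)[0][:3])
```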

  2. 14 CFR Special Federal Aviation... - Fuel Tank System Fault Tolerance Evaluation Requirements

    Science.gov (United States)

    2010-01-01

    14 Aeronautics and Space (edition of 2010-01-01): SFAR No. 88, Special Federal Aviation Regulation No. 88, Fuel Tank System Fault Tolerance Evaluation... certificates that may affect the airplane fuel tank system, for turbine-powered transport category airplanes...

  3. Evaluation of recent quantitative magnetospheric magnetic field models

    International Nuclear Information System (INIS)

    Walker, R.J.

    1976-01-01

    Recent quantitative magnetospheric field models contain many features not found in earlier models. Magnetopause models which include the effects of dipole tilt have been presented. More realistic models of the tail field include tail currents which close on the magnetopause, cross-tail currents of finite thickness, and cross-tail current models which track the position of the neutral sheet as a function of tilt. Finally, models have attempted to calculate the field of currents distributed in the inner magnetosphere. As the purpose of a magnetospheric model is to provide a mathematical description of the field that reasonably reproduces the observed magnetospheric field, several recent models were compared with the observed ΔB (B_observed − B_main-field) contours. Models containing only contributions from magnetopause and tail current systems are able to reproduce the observed quiet-time field only in an extremely qualitative way. The best quantitative agreement between models and observations occurs when currents distributed in the inner magnetosphere are added to the magnetopause and tail current systems. However, the distributed-current models are valid only for zero tilt. Even the models which reproduce the average observed field reasonably well may not give physically reasonable field gradients. Three of the models evaluated contain regions in the near tail in which the field gradient reverses direction. One region in which all the models fall short is that around the polar cusp, though most can be used to calculate the position of the last closed field line reasonably well

  4. Systematic Review of Health Economic Impact Evaluations of Risk Prediction Models : Stop Developing, Start Evaluating

    NARCIS (Netherlands)

    van Giessen, Anoukh; Peters, Jaime; Wilcher, Britni; Hyde, Chris; Moons, Carl; de Wit, Ardine; Koffijberg, Erik

    2017-01-01

    Background: Although health economic evaluations (HEEs) are increasingly common for therapeutic interventions, they appear to be rare for the use of risk prediction models (PMs). Objectives: To evaluate the current state of HEEs of PMs by performing a comprehensive systematic review. Methods: Four

  5. Airline service quality evaluation: A review on concepts and models

    Directory of Open Access Journals (Sweden)

    Navid Haghighat

    2017-12-01

    Full Text Available This paper reviews the major service quality concepts and models that have led to great developments in evaluating service quality, focusing on the improvement process of the models through a discussion of the criticisms of each. Criticisms of these models are discussed to clarify the development steps of the newer models that improved airline service quality evaluation. Precise and accurate evaluation of service quality requires a reliable concept with comprehensive criteria and effective measurement techniques as the foundations of a valuable framework. In this paper, the improvement of service quality models is described on the basis of three major service quality concepts, the disconfirmation, performance and hierarchical concepts, which were developed in succession. Reviewing the various criteria and different measurement techniques, such as statistical analysis and multi-criteria decision making, helps researchers gain a clear understanding of the development of evaluation frameworks in the airline industry. This study aims at promoting reliable frameworks for evaluating airline service quality in different countries and societies, taking account of the economic, cultural and social aspects of each society.

  6. Patient-centered care requires a patient-oriented workflow model.

    Science.gov (United States)

    Ozkaynak, Mustafa; Brennan, Patricia Flatley; Hanauer, David A; Johnson, Sharon; Aarts, Jos; Zheng, Kai; Haque, Saira N

    2013-06-01

    Effective design of health information technology (HIT) for patient-centered care requires consideration of workflow from the patient's perspective, termed 'patient-oriented workflow.' This approach organizes the building blocks of work around the patients who are moving through the care system. Patient-oriented workflow complements the more familiar clinician-oriented workflow approaches, and offers several advantages, including the ability to capture simultaneous, cooperative work, which is essential in care delivery. Patient-oriented workflow models can also provide an understanding of healthcare work taking place in various formal and informal health settings in an integrated manner. We present two cases demonstrating the potential value of patient-oriented workflow models. Significant theoretical, methodological, and practical challenges must be met to ensure adoption of patient-oriented workflow models. Patient-oriented workflow models define meaningful system boundaries and can lead to HIT implementations that are more consistent with cooperative work and its emergent features.

  7. Fraud Risk Modelling: Requirements Elicitation in the Case of Telecom Services

    DEFF Research Database (Denmark)

    Yesuf, Ahmed; Wolos, Lars Peter; Rannenberg, Kai

    2017-01-01

    Telecom providers are losing tremendous amounts of money due to fraud risks posed to Telecom services and products. Currently, they mainly focus on fraud detection approaches to reduce the impact of fraud risks against their services. However, fraud prevention approaches should also be investigated in order to further reduce fraud risks and improve the revenue of Telecom providers. Fraud risk modelling is a fraud prevention approach that aims at identifying potential fraud risks, estimating the damage, and setting up preventive mechanisms before the fraud risks lead to actual losses.... In this paper, we highlight the important requirements for a usable and context-aware fraud risk modelling approach for Telecom services. To do so, we conducted two workshops with experts from a Telecom provider and experts from multi-disciplinary areas. In order to show and document the requirements, we...

  8. Classification and moral evaluation of uncertainties in engineering modeling.

    Science.gov (United States)

    Murphy, Colleen; Gardoni, Paolo; Harris, Charles E

    2011-09-01

    Engineers must deal with risks and uncertainties as a part of their professional work and, in particular, uncertainties are inherent to engineering models. Models play a central role in engineering. Models often represent an abstract and idealized version of the mathematical properties of a target. Using models, engineers can investigate and acquire understanding of how an object or phenomenon will perform under specified conditions. This paper defines the different stages of the modeling process in engineering, classifies the various sources of uncertainty that arise in each stage, and discusses the categories into which these uncertainties fall. The paper then considers the way uncertainty and modeling are approached in science and the criteria for evaluating scientific hypotheses, in order to highlight the very different criteria appropriate for the development of models and the treatment of the inherent uncertainties in engineering. Finally, the paper puts forward nine guidelines for the treatment of uncertainty in engineering modeling.

  9. Evaluation of Artificial Intelligence Based Models for Chemical Biodegradability Prediction

    Directory of Open Access Journals (Sweden)

    Aleksandar Sabljic

    2004-12-01

    Full Text Available This study presents a review of biodegradability modeling efforts, including a detailed assessment of two models developed using an artificial-intelligence-based methodology. Validation results for these models, using an independent, quality-reviewed database, demonstrate that the models perform well when compared to another commonly used biodegradability model against the same data. The ability of models induced by an artificial intelligence methodology to accommodate complex interactions in detailed systems, and the demonstrated reliability of the approach evaluated by this study, indicate that the methodology may have application in broadening the scope of biodegradability models. Given adequate data on the biodegradability of chemicals under environmental conditions, this may allow for the development of future models that include, for example, surface-interface impacts on biodegradability.

  10. Modeling Multi-Reservoir Hydropower Systems in the Sierra Nevada with Environmental Requirements and Climate Warming

    Science.gov (United States)

    Rheinheimer, David Emmanuel

    Hydropower systems and other river regulation often harm instream ecosystems, partly by altering the natural flow and temperature regimes that ecosystems have historically depended on. These effects are compounded at regional scales. As hydropower and ecosystems are increasingly valued globally, due to growing demand for clean energy and native species as well as new threats from climate warming, it is important to understand how climate warming might affect these systems, to identify tradeoffs between different water uses under different climate conditions, and to identify promising water management solutions. This research uses traditional simulation and optimization to explore these issues in the upper west slope of California's Sierra Nevada mountains. The Sierra Nevada provides most of the water for California's vast water supply system, supporting high-elevation hydropower generation, ecosystems, recreation, and some local municipal and agricultural water supply along the way. However, regional climate warming is expected to reduce snowmelt and shift runoff to earlier in the year, affecting all water uses. This dissertation begins by reviewing literature related to the broader motivations of this study, including river regulation, freshwater conservation, and climate change. It then describes three substantial studies. First, a weekly time-step water resources management model spanning the Feather River watershed in the north to the Kern River watershed in the south is developed. The model, which uses the Water Evaluation And Planning System (WEAP), includes reservoirs, run-of-river hydropower, variable-head hydropower, water supply demand, and instream flow requirements. The model is applied with a runoff dataset that considers regional air temperature increases of 0, 2, 4 and 6 °C to represent historical, near-term, mid-term and far-term (end-of-century) warming. Most major hydropower turbine flows are simulated well. Reservoir storage is also

  11. REQUIREMENTS FOR SYSTEMS DEVELOPMENT LIFE CYCLE MODELS FOR LARGE-SCALE DEFENSE SYSTEMS

    Directory of Open Access Journals (Sweden)

    Kadir Alpaslan DEMIR

    2015-10-01

    Full Text Available Large-scale defense system projects are strategic for maintaining and increasing the national defense capability. Therefore, governments spend billions of dollars on the acquisition and development of large-scale defense systems. The scale of defense systems is always increasing and the costs to build them are skyrocketing. Today, defense systems are software intensive and they are either a system of systems or a part of one. Historically, the project performances observed in the development of these systems have been significantly poor when compared to other types of projects. It is obvious that the currently used systems development life cycle models are insufficient to address today's challenges of building these systems. Using a systems development life cycle model that is specifically designed for large-scale defense system developments and is effective in dealing with today's and near-future challenges will help to improve project performances. The first step in the development of a large-scale defense systems development life cycle model is the identification of requirements for such a model. This paper contributes to the body of literature in the field by providing a set of requirements for systems development life cycle models for large-scale defense systems. Furthermore, a research agenda is proposed.

  12. Evaluation of two ozone air quality modelling systems

    Directory of Open Access Journals (Sweden)

    S. Ortega

    2004-01-01

    Full Text Available The aim of this paper is to compare two different modelling systems and to evaluate their ability to simulate the high ozone concentrations typical of summer episodes in the north of Spain near the metropolitan area of Barcelona. As the focus of the paper is the comparison of the two systems, we do not attempt to improve the agreement by adjusting the emission inventory or model parameters. The first model, or forecasting system, is made up of three modules. The first module is a mesoscale model (MASS), which provides the initial conditions for the second module, a nonlocal boundary layer model based on the transilient turbulence scheme. The third module is a photochemical box model (OZIPR), which is applied in Eulerian and Lagrangian modes and receives suitable information from the two previous modules. The model forecast is evaluated against ground-based stations during summer 2001. The second model is MM5/UAM-V, a grid model designed to predict hourly three-dimensional ozone concentration fields. The model is applied to an ozone episode that occurred between 21 and 23 June 2001. Our results reflect the good performance of the two modelling systems when they are used for a specific episode.

  13. Data requirements of GREAT-ER: Modelling and validation using LAS in four UK catchments

    International Nuclear Information System (INIS)

    Price, Oliver R.; Munday, Dawn K.; Whelan, Mick J.; Holt, Martin S.; Fox, Katharine K.; Morris, Gerard; Young, Andrew R.

    2009-01-01

    Higher-tier environmental risk assessments on 'down-the-drain' chemicals in river networks can be conducted using models such as GREAT-ER (Geography-referenced Regional Exposure Assessment Tool for European Rivers). It is important that these models are evaluated and their sensitivities to input variables understood. This study had two primary objectives: to evaluate GREAT-ER model performance, comparing simulated predictions for LAS (linear alkylbenzene sulphonate) with measured concentrations for four rivers in the UK, and to investigate model sensitivity to input variables. We demonstrate that the GREAT-ER model is very sensitive to variability in river discharges. However, it is insensitive to the form of the distributions used to describe chemical usage and removal rates in sewage treatment plants (STPs). It is concluded that more effort should be directed towards improving empirical estimates of effluent load and reducing the uncertainty associated with usage and removal rates in STPs. Simulations could be improved by incorporating the effect of river depth on dissipation rates. - Validation of GREAT-ER.

  14. Data requirements of GREAT-ER: Modelling and validation using LAS in four UK catchments

    Energy Technology Data Exchange (ETDEWEB)

    Price, Oliver R., E-mail: oliver.price@unilever.co [Safety and Environmental Assurance Centre, Unilever, Colworth Science Park, Sharnbrook, Bedfordshire MK44 1LQ (United Kingdom); Munday, Dawn K. [Safety and Environmental Assurance Centre, Unilever, Colworth Science Park, Sharnbrook, Bedfordshire MK44 1LQ (United Kingdom); Whelan, Mick J. [Department of Natural Resources, School of Applied Sciences, Cranfield University, College Road, Cranfield, Bedfordshire MK43 0AL (United Kingdom); Holt, Martin S. [ECETOC, Ave van Nieuwenhuyse 4, Box 6, B-1160 Brussels (Belgium); Fox, Katharine K. [85 Park Road West, Birkenhead, Merseyside CH43 8SQ (United Kingdom); Morris, Gerard [Environment Agency, Phoenix House, Global Avenue, Leeds LS11 8PG (United Kingdom); Young, Andrew R. [Wallingford HydroSolutions Ltd, Maclean building, Crowmarsh Gifford, Wallingford, Oxon OX10 8BB (United Kingdom)

    2009-10-15

    Higher-tier environmental risk assessments on 'down-the-drain' chemicals in river networks can be conducted using models such as GREAT-ER (Geography-referenced Regional Exposure Assessment Tool for European Rivers). It is important that these models are evaluated and their sensitivities to input variables understood. This study had two primary objectives: to evaluate GREAT-ER model performance, comparing simulated predictions for LAS (linear alkylbenzene sulphonate) with measured concentrations for four rivers in the UK, and to investigate model sensitivity to input variables. We demonstrate that the GREAT-ER model is very sensitive to variability in river discharges. However, it is insensitive to the form of the distributions used to describe chemical usage and removal rates in sewage treatment plants (STPs). It is concluded that more effort should be directed towards improving empirical estimates of effluent load and reducing the uncertainty associated with usage and removal rates in STPs. Simulations could be improved by incorporating the effect of river depth on dissipation rates. - Validation of GREAT-ER.

  15. Evaluation of potential crushed-salt constitutive models

    International Nuclear Information System (INIS)

    Callahan, G.D.; Loken, M.C.; Sambeek, L.L. Van; Chen, R.; Pfeifle, T.W.; Nieland, J.D.; Hansen, F.D.

    1995-12-01

    Constitutive models describing the deformation of crushed salt are presented in this report. Ten constitutive models with potential to describe the phenomenological and micromechanical processes of crushed salt were selected from a literature search. Three of these ten, termed the Sjaardema-Krieg, Zeuch, and Spiers models, were adopted as candidate constitutive models. The candidate models were generalized in a consistent manner to three-dimensional states of stress and modified to include the effects of temperature, grain size, and moisture content. A database including hydrostatic consolidation and shear consolidation tests conducted on Waste Isolation Pilot Plant and southeastern New Mexico salt was used to determine material parameters for the candidate models. Nonlinear least-squares model fitting to data from the hydrostatic consolidation tests, the shear consolidation tests, and a combination of the two produced three sets of material parameter values for the candidate models. The change in material parameter values from test group to test group indicates the empirical nature of the models. To evaluate the predictive capability of the candidate models, each parameter value set was used to predict each of the tests in the database. Based on the fitting statistics and the ability of the models to predict the test data, the Spiers model appeared to perform slightly better than the other two candidates. The work reported here is a first-of-its-kind evaluation of constitutive models for the reconsolidation of crushed salt. Questions remain to be answered. Deficiencies in models and databases are identified and recommendations for future work are made. 85 refs

  16. Evidence used in model-based economic evaluations for evaluating pharmacogenetic and pharmacogenomic tests: a systematic review protocol.

    Science.gov (United States)

    Peters, Jaime L; Cooper, Chris; Buchanan, James

    2015-11-11

    Decision models can be used to conduct economic evaluations of new pharmacogenetic and pharmacogenomic tests to ensure they offer value for money to healthcare systems. These models require a great deal of evidence, yet research suggests the evidence used is diverse and of uncertain quality. By conducting a systematic review, we aim to investigate the test-related evidence used to inform decision models developed for the economic evaluation of genetic tests. We will search electronic databases including MEDLINE, EMBASE and NHS EED to identify model-based economic evaluations of pharmacogenetic and pharmacogenomic tests. The search will not be limited by language or date. Title and abstract screening will be conducted independently by two reviewers, with screening of full texts and data extraction conducted by one reviewer and checked by another. Characteristics of the decision problem, the decision model and the test evidence used to inform the model will be extracted. Specifically, we will identify the reported sources of the test-related evidence used, describe the study designs and how the evidence was identified. A checklist developed specifically for decision-analytic models will be used to critically appraise the models described in these studies. Variations in the test evidence used in the decision models will be explored across the included studies, and we will identify gaps in the evidence in terms of both quantity and quality. The findings of this work will be disseminated via a peer-reviewed journal publication and at national and international conferences.

  17. Evaluation of expert systems - An approach and case study. [of determining software functional requirements for command management of satellites

    Science.gov (United States)

    Liebowitz, J.

    1985-01-01

    Techniques that were applied in defining an expert system prototype for first-cut evaluations of the software functional requirements of NASA satellite command management activities are described. The prototype was developed using the Knowledge Engineering System. Criteria were selected for evaluating the satellite software before defining the expert system prototype. Application of the prototype system is illustrated in terms of the evaluation procedures used with the COBE satellite to be launched in 1988. The limited number of options which can be considered by the program mandates that biases in the system output must be well understood by the users.

  18. An Efficient Dynamic Trust Evaluation Model for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Zhengwang Ye

    2017-01-01

    Full Text Available Trust evaluation is an effective method to detect malicious nodes and ensure security in wireless sensor networks (WSNs). In this paper, an efficient dynamic trust evaluation model (DTEM) for WSNs is proposed, which implements accurate, efficient, and dynamic trust evaluation by dynamically adjusting the weights of direct and indirect trust and the parameters of the update mechanism. To achieve accurate trust evaluation, the direct trust is calculated considering multitrust, including communication trust, data trust, and energy trust, with a punishment factor and regulating function. The indirect trust is evaluated conditionally, using trusted recommendations from a third party. The integrated trust is then measured by assigning dynamic weights to direct and indirect trust and combining them. Finally, we propose an update mechanism based on a sliding window and an induced ordered weighted averaging (IOWA) operator to enhance flexibility. The parameters and the number of interaction-history windows can be adapted dynamically according to the actual needs of the network, realizing a dynamic update of the direct trust value. Simulation results indicate that the proposed model is an efficient, dynamic, and attack-resistant trust evaluation model. Compared with existing approaches, it performs better in defending against multiple malicious attacks.
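
    A much-simplified sketch of the ingredients above: direct trust as a weighted mix of communication, data and energy trust; integration with indirect trust through a dynamic weight; and a sliding-window update in which recent values weigh more (a linear stand-in for the IOWA operator). All weights and trust values are invented, not the paper's calibrated parameters.

```python
import numpy as np

def direct_trust(t_comm, t_data, t_energy, w=(0.5, 0.3, 0.2)):
    """Weighted mix of the three multitrust components, each in [0, 1]."""
    return w[0] * t_comm + w[1] * t_data + w[2] * t_energy

def integrated_trust(direct, indirect, w_direct):
    """Combine direct and recommended (indirect) trust with a dynamic weight."""
    return w_direct * direct + (1 - w_direct) * indirect

history = [0.82, 0.85, 0.80, 0.78]      # sliding window, oldest value first
d = direct_trust(0.9, 0.7, 0.8)
t = integrated_trust(d, indirect=0.75, w_direct=0.8)

weights = np.linspace(0.5, 1.0, len(history) + 1)   # recent values weigh more
weights /= weights.sum()
updated = float(np.dot(weights, history + [t]))     # smoothed trust value
print(round(d, 3), round(t, 3), round(updated, 3))
```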

  19. Experiences in Using Practitioner’s Checklists to Evaluate the Relevance of Experiments Reported in Requirements Engineering

    NARCIS (Netherlands)

    Daneva, Maia; Sikkel, Nicolaas; Condori-Fernandez, Nelly; Herrmann, Andrea

    Background: Requirements Engineering (RE) researchers recognize that for RE methods to be adopted in industry, practitioners should be able to evaluate the relevance of a study to their practice. Kitchenham et al. proposed a set of perspective-based checklists, which were demonstrated to be a useful...

  20. Evaluation of Cost Models and Needs & Gaps Analysis

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad

    2014-01-01

    This report, 'D3.1—Evaluation of Cost Models and Needs & Gaps Analysis', provides an analysis of existing research related to the economics of digital curation and cost & benefit modelling. It reports upon the investigation of how well current models and tools meet stakeholders' needs for calculating and comparing financial information. Based on this evaluation, it aims to point out gaps that need to be bridged in order to increase the uptake of cost & benefit modelling and good practices that will enable costing and comparison of the costs of alternative scenarios, which in turn provides a starting point... how they break down costs. This is followed by an in-depth analysis of stakeholders' needs for financial information, derived from the 4C project stakeholder consultation. The stakeholders' needs analysis indicated that models should: • support accounting, but more importantly enable budgeting; • be able...

  1. Using measurements for evaluation of black carbon modeling

    Directory of Open Access Journals (Sweden)

    S. Gilardoni

    2011-01-01

    Full Text Available The ever-increasing use of air quality and climate model assessments to underpin economic, public health, and environmental policy decisions makes effective model evaluation critical. This paper discusses the properties of black carbon and of light attenuation and absorption observations that are key to a reliable evaluation of black carbon models, and compares parametric and nonparametric statistical tools for quantifying the agreement between models and observations. Black carbon concentrations are simulated with the TM5/M7 global model from July 2002 to June 2003 at four remote sites (Alert, Jungfraujoch, Mace Head, and Trinidad Head) and two regional background sites (Bondville and Ispra). Equivalent black carbon (EBC) concentrations are calculated using light attenuation measurements from January 2000 to December 2005. Seasonal trends in the measurements are determined by fitting sinusoidal functions, and the representativeness of the period simulated by the model is verified based on the scatter of the experimental values relative to the fitted curves. When the resolution of the model grid is larger than 1° × 1°, it is recommended to verify that the measurement site is representative of the grid cell. For this purpose, equivalent black carbon measurements at Alert, Bondville and Trinidad Head are compared to light absorption and elemental carbon measurements performed at different sites inside the same model grid cells. Comparison of these equivalent black carbon and elemental carbon measurements indicates that uncertainties in black carbon optical properties can compromise the comparison between model and observations. During model evaluation it is important to examine the extent to which a model is able to simulate the variability in the observations over different integration periods, as this helps to identify the most appropriate timescales. The agreement between model and observations is accurately described by the overlap of
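
    The seasonal-fit step translates directly into code: fit a sinusoid to a multi-year EBC series and use the residual scatter to judge how representative a single simulated year is. The sketch below uses synthetic data and scipy's curve_fit; none of the parameter values are from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def seasonal(t, mean, amp, phase):
    """Annual sinusoid: mean concentration plus a seasonal cycle."""
    return mean + amp * np.sin(2 * np.pi * t / 365.0 + phase)

t = np.arange(0, 6 * 365, 7.0)                # weekly samples over six years
rng = np.random.default_rng(2)
ebc = seasonal(t, 80.0, 30.0, 1.2) + rng.normal(0, 10.0, t.size)  # ng/m^3

popt, _ = curve_fit(seasonal, t, ebc, p0=[70.0, 20.0, 0.0])
residual_sd = np.std(ebc - seasonal(t, *popt))
print(popt, residual_sd)   # large scatter => a single year is less representative
```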

  2. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species

    OpenAIRE

    Matthew P. Adams; Catherine J. Collier; Sven Uthicke; Yan X. Ow; Lucas Langlois; Katherine R. O’Brien

    2017-01-01

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluat...

  3. Evaluation of multivariate calibration models transferred between spectroscopic instruments

    DEFF Research Database (Denmark)

    Eskildsen, Carl Emil Aae; Hansen, Per W.; Skov, Thomas

    2016-01-01

    In a setting where multiple spectroscopic instruments are used for the same measurements, it may be convenient to develop the calibration model on a single instrument and then transfer this model to the other instruments. In the ideal scenario, all instruments provide the same predictions for the same samples using the transferred model. However, sometimes the success of a model transfer is evaluated by comparing the transferred model predictions with the reference values. This is not optimal, as uncertainties in the reference method will impact the evaluation. This paper proposes a new method for calibration model transfer evaluation. The new method is based on comparing predictions from different instruments, rather than comparing predictions and reference values. A total of 75 flour samples were available for the study. All samples were measured on ten near infrared (NIR) instruments from two...

  4. High resolution weather data for urban hydrological modelling and impact assessment, ICT requirements and future challenges

    Science.gov (United States)

    ten Veldhuis, Marie-claire; van Riemsdijk, Birna

    2013-04-01

    Hydrological analysis of urban catchments requires high resolution rainfall and catchment information because of the small size of these catchments, the high spatial variability of the urban fabric, fast runoff processes and related short response times. Rainfall information available from traditional radar and rain gauge networks does not meet the relevant scales of urban hydrology. A new type of weather radar, based on X-band frequency and equipped with Doppler and dual polarimetry capabilities, promises to provide more accurate rainfall estimates at the spatial and temporal scales that are required for urban hydrological analysis. Recently, the RAINGAIN project was started to analyse the applicability of this new type of radar in the context of urban hydrological modelling. In this project, meteorologists and hydrologists work closely together in several stages of urban hydrological analysis: from the acquisition procedure of novel and high-end radar products to data acquisition and processing, rainfall data retrieval, hydrological event analysis and forecasting. The project comprises four pilot locations with various characteristics of weather radar equipment, ground stations, urban hydrological systems, modelling approaches and requirements. Access to data processing and modelling software is handled in different ways in the pilots, depending on ownership and user context. Sharing of data and software among pilots and with the outside world is an ongoing topic of discussion. The availability of high resolution weather data raises requirements with respect to the resolution of hydrological models and input data. This has led to the development of fully distributed hydrological models, the implementation of which remains limited by the unavailability of hydrological input data. On the other hand, if models are to be used in flood forecasting, hydrological models need to be computationally efficient to enable fast responses to extreme event conditions. This...

  5. Experiments with data assimilation in comprehensive air quality models: Impacts on model predictions and observation requirements (Invited)

    Science.gov (United States)

    Mathur, R.

    2009-12-01

    Emerging regional scale atmospheric simulation models must address the increasing complexity arising from new model applications that treat multi-pollutant interactions. Sophisticated air quality modeling systems are needed to develop effective abatement strategies that focus on simultaneously controlling multiple criteria pollutants, as well as for use in providing short term air quality forecasts. In recent years the application of such models has been continuously extended to address atmospheric pollution phenomena from local to hemispheric spatial scales, over time scales ranging from episodic to annual. The need to represent interactions between physical and chemical atmospheric processes occurring at these disparate spatial and temporal scales requires the use of observation data beyond traditional in-situ networks so that the model simulations can be reasonably constrained. Preliminary applications of assimilation of remote sensing and aloft observations within a comprehensive regional scale atmospheric chemistry-transport modeling system will be presented: (1) A methodology is developed to assimilate MODIS aerosol optical depths in the model to represent the impacts of long-range transport associated with the summer 2004 Alaskan fires on surface-level regional fine particulate matter (PM2.5) concentrations across the Eastern U.S. The episodic impact of this pollution transport event on PM2.5 concentrations over the eastern U.S. during mid-July 2004 is quantified through the complementary use of the model with remotely-sensed, aloft, and surface measurements; (2) Simple nudging experiments with limited aloft measurements are performed to identify uncertainties in model representations of physical processes and assess the potential use of such measurements in improving the predictive capability of atmospheric chemistry-transport models. The results from these early applications will be discussed in the context of uncertainties in the model and in the remote sensing...
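A minimal sketch of the "simple nudging" idea mentioned in (2), assuming a single scalar tracer and a fixed relaxation timescale; the operational chemistry-transport model applies the same relaxation term per grid cell and species, and all values here are toy numbers.

```python
# Sketch: relax (nudge) a model state toward an observation with a
# relaxation timescale tau. One scalar tracer stands in for one grid cell.
DT, TAU = 600.0, 3600.0              # time step and nudging timescale (s)
OBS = 28.0                           # observed concentration (ug/m3)

state = 20.0                         # modeled concentration (ug/m3)
for step in range(12):               # two hours of integration
    state += DT * (-state / 7200.0 + 0.004)  # toy chemistry/transport tendency
    if step % 6 == 0:                         # observation available hourly
        state += (DT / TAU) * (OBS - state)   # Newtonian relaxation (nudging)
    print(f"t={(step + 1) * DT:6.0f} s  state={state:.2f}")
```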

  6. Model of requirements for embedded systems

    Directory of Open Access Journals (Sweden)

    Liliana González Palacio

    2008-07-01

    Full Text Available In this paper a model of requirements for supporting the construction of embedded systems is presented. Currently, the requirements engineering methodologies proposed for this domain do not establish continuity in the development process, since they are strongly oriented towards the design stage and place a weaker emphasis on the analysis stage. Furthermore, such methodologies provide guidelines for treating requirements after they have been elicited, but do not propose tools, such as a model of requirements, for obtaining them. This work is part of a research project whose objective is to propose a requirements engineering (RE) methodology for the analysis of embedded systems (ES). The proposed model of requirements and its use are illustrated through an application case consisting of eliciting the requirements for a motion-sensing system embedded in a home alarm system.

  7. Building beef cow nutritional programs with the 1996 NRC beef cattle requirements model.

    Science.gov (United States)

    Lardy, G P; Adams, D C; Klopfenstein, T J; Patterson, H H

    2004-01-01

    Designing a sound cow-calf nutritional program requires knowledge of nutrient requirements, diet quality, and intake. Effectively using the NRC (1996) beef cattle requirements model (1996NRC) also requires knowledge of dietary degradable intake protein (DIP) and microbial efficiency. Objectives of this paper are to 1) describe a framework in which 1996NRC-applicable data can be generated, 2) describe seasonal changes in nutrients on native range, 3) use the 1996NRC to predict nutrient balance for cattle grazing these forages, and 4) make recommendations for using the 1996NRC for forage-fed cattle. Extrusa samples were collected over 2 yr on native upland range and subirrigated meadow in the Nebraska Sandhills. Samples were analyzed for CP, in vitro OM digestibility (IVOMD), and DIP. Regression equations to predict nutrients were developed from these data. The 1996NRC was used to predict nutrient balances based on the dietary nutrient analyses. Recommendations for model users were also developed. On subirrigated meadow, CP and IVOMD increased rapidly during March and April. On native range, CP and IVOMD increased from April through June but decreased rapidly from August through September. Degradable intake protein (DM basis) followed trends similar to CP for both native range and subirrigated meadow. Predicted nutrient balances for spring- and summer-calving cows agreed with reported values in the literature, provided that IVOMD values were converted to DE before use in the model (1.07 x IVOMD - 8.13). When the IVOMD-to-DE conversion was not used, the model gave unrealistically high NE(m) balances. To effectively use the 1996NRC to estimate protein requirements, users should focus on three key estimates: DIP, microbial efficiency, and TDN intake. Consequently, efforts should be focused on adequately describing seasonal changes in forage nutrient content. In order to increase use of the 1996NRC, research is needed in the following areas: 1) cost-effective and...
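A minimal sketch of the recommended IVOMD-to-DE conversion, using the regression reported in the abstract (DE = 1.07 x IVOMD - 8.13); the unit handling and the example IVOMD values are assumptions for illustration.

```python
# Sketch: convert in vitro OM digestibility (IVOMD, %) to digestible energy
# before feeding it to the 1996 NRC model, per the reported regression.
def ivomd_to_de_percent(ivomd_percent: float) -> float:
    """Digestible energy (% of DM) from IVOMD via DE = 1.07 * IVOMD - 8.13."""
    return 1.07 * ivomd_percent - 8.13

for ivomd in (55.0, 62.0, 70.0):     # assumed seasonal range on native range
    print(f"IVOMD {ivomd:.0f}% -> DE approx {ivomd_to_de_percent(ivomd):.1f}%")
```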

  8. Evaluation of an objective plan-evaluation model in the three dimensional treatment of nonsmall cell lung cancer

    International Nuclear Information System (INIS)

    Graham, Mary V.; Jain, Nilesh L.; Kahn, Michael G.; Drzymala, Robert E.; Purdy, James A.

    1996-01-01

    Purpose: Evaluation of three dimensional (3D) radiotherapy plans is difficult because it requires the review of vast amounts of data. Selecting the optimal plan from a set of competing plans involves making trade-offs among the doses delivered to the target volumes and normal tissues. The purpose of this study was to test an objective plan-evaluation model and evaluate its clinical usefulness in 3D treatment planning for nonsmall cell lung cancer. Methods and Materials: Twenty patients with inoperable nonsmall cell lung cancer treated with definitive radiotherapy were studied using full 3D techniques for treatment design and implementation. For each patient, the evaluator (the treating radiation oncologist) initially ranked three plans using room-view dose-surface displays and dose-volume histograms, and identified the issues that needed to be improved. The three plans were then ranked by the objective plan-evaluation model. A figure of merit (FOM) was computed for each plan by combining the numerical score (utility in decision-theoretic terms) for each clinical issue. The utility was computed from a probability of occurrence of the issue and a physician-specific weight indicating its clinical relevance. The FOM was used to rank the competing plans for a patient, and the utility was used to identify issues that needed to be improved. These were compared with the initial evaluations of the physician and discrepancies were analyzed. The issues identified in the best treatment plan were then used to attempt further manual optimization of this plan. Results: For the 20 patients (60 plans) in the study, the final plan ranking produced by the plan-evaluation model had an initial 73% agreement with the ranking provided by the evaluator. After discrepant cases were reviewed by the physician, the model was usually judged more objective or 'correct'. In most cases the model was also able to correctly identify the issues that needed improvement in each plan. Subsequent...
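A minimal sketch of the figure-of-merit idea, assuming each clinical issue contributes a utility formed from its probability of occurrence and a physician-specific weight; the issue names, the values, and the weighted-sum combination rule are illustrative assumptions, as the abstract does not spell out the exact formula.

```python
# Sketch: combine per-issue utilities (probability * physician weight) into
# a single figure of merit used to rank competing treatment plans.
issues = {                  # issue: (probability of occurrence, weight 0-1)
    "lung_pneumonitis": (0.15, 0.9),
    "cord_myelopathy":  (0.01, 1.0),
    "target_underdose": (0.10, 0.8),
}

def figure_of_merit(plan_issues):
    """Higher is better: start from 1 and subtract weighted expected harm."""
    return 1.0 - sum(p * w for p, w in plan_issues.values())

print(f"FOM = {figure_of_merit(issues):.3f}")
# Ranking plans = computing this FOM for each plan and sorting descending.
```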

  9. Sound insulation of dwellings - Legal requirements in Europe and subjective evaluation of acoustical comfort

    DEFF Research Database (Denmark)

    Rasmussen, B.; Rindel, Jens Holger

    2003-01-01

    Acoustical comfort is a concept that can be characterised by absence of unwanted sound and by opportunities for acoustic activities without annoying other people. In order to achieve acoustical comfort in dwellings certain requirements have to be fulfilled concerning the airborne sound insulation, the impact sound insulation and the noise level from traffic and building services. For road traffic noise it is well established that an outdoor noise level LAeq, 24 h below 55 dB in a housing area means that approximately 15-20% of the occupants are annoyed by the noise. However, for sound insulation ... requirement for sound insulation. The findings can also be used as a guide to specify acoustic requirements for dwellings in the future.

  10. Evaluating pharmacological models of high and low anxiety in sheep

    Directory of Open Access Journals (Sweden)

    Rebecca E. Doyle

    2015-12-01

    Full Text Available New tests of animal affect and welfare require validation in subjects experiencing putatively different states. Pharmacological manipulations of affective state are advantageous because they can be administered in a standardised fashion, and the duration of their action can be established and tailored to suit the length of a particular test. To this end, the current study aimed to evaluate a pharmacological model of high and low anxiety in an important agricultural and laboratory species, the sheep. Thirty-five 8-month-old female sheep received either an intramuscular injection of the putatively anxiogenic drug 1-(m-chlorophenyl)piperazine (mCPP; 1 mg/kg; n = 12), an intravenous injection of the putatively anxiolytic drug diazepam (0.1 mg/kg; n = 12), or acted as a control (saline intramuscular injection; n = 11). Thirty minutes after the treatments, sheep were individually exposed to a variety of tests assessing their general movement, performance in a 'runway task' (moving down a raceway for a food reward), response to startle, and behaviour in isolation. A test to assess feeding motivation was performed 2 days later following administration of the drugs to the same animals in the same manner. The mCPP sheep had poorer performance in the two runway tasks (6.8 and 7.7 × slower, respectively, than the control group; p < 0.001), a greater startle response (1.4 vs. 0.6; p = 0.02), a higher level of movement during isolation (9.1 steps vs. 5.4; p < 0.001), and a lower feeding motivation (1.8 × slower; p < 0.001) than the control group, all of which act as indicators of anxiety. These results show that mCPP is an effective pharmacological model of high anxiety in sheep. Comparatively, the sheep treated with diazepam did not display any differences compared to the control sheep. Thus we suggest that mCPP is an effective treatment to validate future tests aimed at assessing anxiety in sheep, and that future studies should include other subtle indicators of...

  11. Designing the model for evaluating business quality in Croatia

    Directory of Open Access Journals (Sweden)

    Ana Ježovita

    2015-01-01

    Full Text Available The main objective of the paper is to design a model for evaluating the financial quality of business operations. In that context, for the purposes of the paper, the financial quality of business operations is defined as the ability to achieve adequate values of individual financial ratios for evaluating financial position and performance. The objective of the model is to obtain a comprehensive conclusion about the financial quality of business operations using only the value of the function. Data used for designing the model are limited to the financial data available from the annual balance sheet and income statement. These limitations allow companies of all sizes from the non-financial business economy sector to use the designed model for evaluation purposes. The statistical methods used for designing the model are multivariate discriminant analysis and logistic regression. Discriminant analysis resulted in a function which includes the five individual financial ratios with the best discriminant power. Given the results obtained in the classification matrix, with a classification accuracy of 95.92% for the original sample and 96.06% for the independent sample, it can be concluded that it is possible to evaluate the financial quality of business operations of companies in Croatia by using a model composed of individual financial ratios. The logistic regression confirms the results obtained using discriminant analysis.
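A minimal sketch of the two modelling steps described, using scikit-learn on synthetic ratios; the five-ratio setup mirrors the study only in shape, and the reported 95.92%/96.06% accuracies are properties of the Croatian dataset, not of this sketch.

```python
# Sketch: linear discriminant analysis and logistic regression on financial
# ratios, evaluated by classification accuracy on a held-out sample.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 400
X = rng.normal(0, 1, (n, 5))                      # five financial ratios
y = (X @ np.array([0.8, -0.5, 0.6, 0.3, -0.4]) + rng.normal(0, 0.7, n)) > 0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (LinearDiscriminantAnalysis(), LogisticRegression()):
    acc = model.fit(X_tr, y_tr).score(X_te, y_te)
    print(type(model).__name__, f"accuracy={acc:.3f}")
```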

  12. Evaluation and comparison of models and modelling tools simulating nitrogen processes in treatment wetlands

    DEFF Research Database (Denmark)

    Edelfeldt, Stina; Fritzson, Peter

    2008-01-01

    In this paper, two ecological models of nitrogen processes in treatment wetlands have been evaluated and compared. These models were implemented, simulated, and visualized using the Modelica modelling and simulation language [P. Fritzson, Principles of Object-Oriented Modelling and Simulation with Modelica 2.1 (Wiley-IEEE Press, USA, 2004)] and an associated tool. The differences and similarities between the MathModelica Model Editor and three other ecological modelling tools have also been evaluated. The results show that the models can be modelled and simulated well in the MathModelica Model Editor, and that nitrogen decrease in a constructed treatment wetland should be described and simulated using the Nitrification/Denitrification model, as this model has the highest overall quality score and provides a more variable environment.
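Although the cited models are written in Modelica, the nitrogen chain they describe can be sketched in a few lines as first-order ODEs (NH4 -> NO3 -> N2); the rate constants and initial concentrations below are invented, and the real Nitrification/Denitrification model is considerably richer.

```python
# Sketch: a nitrification/denitrification chain as first-order ODEs,
# the kind of structure such wetland models encode. Rates are assumed.
import numpy as np
from scipy.integrate import solve_ivp

K_NIT, K_DENIT = 0.35, 0.20        # first-order rates (1/day), assumed

def nitrogen(t, y):
    nh4, no3 = y
    return [-K_NIT * nh4, K_NIT * nh4 - K_DENIT * no3]

sol = solve_ivp(nitrogen, (0, 20), [10.0, 2.0], t_eval=np.linspace(0, 20, 5))
for t, nh4, no3 in zip(sol.t, *sol.y):
    print(f"day {t:4.1f}: NH4={nh4:5.2f}  NO3={no3:5.2f} mg/L")
```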

  13. Mathematical modeling and evaluation of radionuclide transport parameters from the ANL Laboratory Analog Program

    International Nuclear Information System (INIS)

    Chen, B.C.J.; Hull, J.R.; Seitz, M.G.; Sha, W.T.; Shah, V.L.; Soo, S.L.

    1984-07-01

    Computer model simulation is required to evaluate the performance of proposed or future high-level radioactive waste geological repositories. However, the accuracy of a model in predicting the real situation depends on how well the values of the transport properties are prescribed as input parameters. Knowledge of transport parameters is therefore essential. We have modeled ANL's Experiment Analog Program which was designed to simulate long-term radwaste migration process by groundwater flowing through a high-level radioactive waste repository. Using this model and experimental measurements, we have evaluated neptunium (actinide) deposition velocity and analyzed the complex phenomena of simultaneous deposition, erosion, and reentrainment of bentonite when groundwater is flowing through a narrow crack in a basalt rock. The present modeling demonstrates that we can obtain the values of transport parameters, as added information without any additional cost, from the available measurements of laboratory analog experiments. 8 figures, 3 tables

  14. A critical review of the data requirements for fluid flow models through fractured rock

    International Nuclear Information System (INIS)

    Priest, S.D.

    1986-01-01

    The report is a comprehensive critical review of the data requirements for ten models of fluid flow through fractured rock, developed in Europe and North America. The first part of the report contains a detailed review of rock discontinuities and how their important geometrical properties can be quantified. This is followed by a brief summary of the fundamental principles in the analysis of fluid flow through two-dimensional discontinuity networks and an explanation of a new approach to the incorporation of variability and uncertainty into geotechnical models. The report also contains a review of the geological and geotechnical properties of anhydrite and granite. Of the ten fluid flow models reviewed, only three offer a realistic fracture network model for which it is feasible to obtain the input data. Although some of the other models have some valuable or novel features, there is a tendency to concentrate on the simulation of contaminant transport processes, at the expense of providing a realistic fracture network model. Only two of the models reviewed, neither of them developed in Europe, have seriously addressed the problem of analysing fluid flow in three-dimensional networks. (author)

  15. A Fuzzy Comprehensive Evaluation Model for Sustainability Risk Evaluation of PPP Projects

    Directory of Open Access Journals (Sweden)

    Libiao Bai

    2017-10-01

    Full Text Available Evaluating the sustainability risk level of public–private partnership (PPP) projects can reduce project risk incidents and achieve the sustainable development of the organization. However, existing studies on PPP project risk management mainly focus on exploring the impact of financial and revenue risks but ignore sustainability risks, causing the concept of “sustainability” to be missing when evaluating the risk level of PPP projects. To evaluate the sustainability risk level, and to achieve the most important objective of providing a reference for the public and private sectors when making decisions on PPP project management, this paper constructs a factor system of sustainability risk of PPP projects based on an extensive literature review and develops a mathematical model based on the fuzzy comprehensive evaluation model (FCEM) and failure mode, effects and criticality analysis (FMECA) for evaluating the sustainability risk level of PPP projects. In addition, this paper conducts a computational experiment based on a questionnaire survey to verify the effectiveness and feasibility of the proposed model. The results suggest that this model is reasonable for evaluating the sustainability risk level of PPP projects. To our knowledge, this paper is the first study to evaluate the sustainability risk of PPP projects, which would not only enrich the theories of project risk management, but also serve as a reference for the public and private sectors for sustainable planning and development. Keywords: sustainability risk eva...
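A minimal sketch of the FCEM step, assuming a weight vector over four risk factors and a fuzzy membership matrix over three rating grades; the weights, memberships, and grade labels are invented, and the paper's factor system is far larger.

```python
# Sketch: fuzzy comprehensive evaluation. Combine factor weights W with a
# fuzzy membership matrix R (rows = risk factors, columns = rating grades)
# to get the project's membership in each risk grade, B = W . R.
import numpy as np

W = np.array([0.35, 0.25, 0.25, 0.15])          # factor weights, sum to 1
R = np.array([                                   # memberships per grade
    # low   medium  high
    [0.50, 0.30, 0.20],
    [0.20, 0.50, 0.30],
    [0.10, 0.40, 0.50],
    [0.60, 0.30, 0.10],
])
B = W @ R
grades = ["low", "medium", "high"]
print(dict(zip(grades, B.round(3))))
print("overall sustainability risk grade:", grades[int(B.argmax())])
```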

  16. Analysis and Evaluation of Statistical Models for Integrated Circuits Design

    Directory of Open Access Journals (Sweden)

    Sáenz-Noval J.J.

    2011-10-01

    Full Text Available Statistical models for integrated circuits (IC) allow us to estimate the percentage of acceptable devices in a batch before fabrication. Currently, Pelgrom's is the statistical model most widely accepted in the industry; however, it was derived from a micrometer technology, which does not guarantee reliability for nanometric manufacturing processes. This work considers three of the most relevant statistical models in the industry and evaluates their limitations and advantages in analog design, so that the designer has a better criterion for making a choice. Moreover, it shows how several statistical models can be used for each of the stages and design purposes.
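For context, the Pelgrom model referred to here is the area-scaling law sigma(dP) = A_P / sqrt(W * L); the sketch below evaluates it for a threshold-voltage matching coefficient of textbook magnitude, which is an assumption rather than a value from the cited paper.

```python
# Sketch: Pelgrom's area-scaling law for device mismatch,
# sigma(delta_P) = A_P / sqrt(W * L), with A_P a process-dependent
# matching coefficient. A_VT below is an assumed, typical magnitude.
import math

def pelgrom_sigma(a_p_mV_um: float, w_um: float, l_um: float) -> float:
    """Std. dev. of threshold-voltage mismatch (mV) for a device pair."""
    return a_p_mV_um / math.sqrt(w_um * l_um)

A_VT = 3.0                                   # mV*um, assumed coefficient
for w, l in [(1.0, 1.0), (4.0, 4.0), (10.0, 10.0)]:
    print(f"W x L = {w} x {l} um: sigma(dVt) = {pelgrom_sigma(A_VT, w, l):.2f} mV")
```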

  17. Development and evaluation of thermal model reduction algorithms for spacecraft

    Science.gov (United States)

    Deiml, Michael; Suderland, Martin; Reiss, Philipp; Czupalla, Markus

    2015-05-01

    This paper is concerned with the topic of the reduction of thermal models of spacecraft. The work presented here has been conducted in cooperation with the company OHB AG, formerly Kayser-Threde GmbH, and the Institute of Astronautics at Technische Universität München, with the goal of shortening and automating the time-consuming and manual process of thermal model reduction. The reduction of thermal models can be divided into the simplification of the geometry model, for calculation of external heat flows and radiative couplings, and the reduction of the underlying mathematical model. For the simplification, a method has been developed which approximates the reduced geometry model with the help of an optimization algorithm. Different linear and nonlinear model reduction techniques have been evaluated for their applicability to the reduction of the mathematical model. A major concern here is compatibility with the thermal analysis tool ESATAN-TMS, which restricts the useful application of these methods. Additional model reduction methods have been developed which account for these constraints. The Matrix Reduction method allows the approximation of the differential equation to reference values exactly, except for numerical errors. The summation method enables a useful, applicable reduction of thermal models that can be used in industry. In this work a framework for the reduction of thermal models has been created, which can be used together with a newly developed graphical user interface for the reduction of thermal models in industry.
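As one concrete example of reducing the underlying mathematical model, the sketch below applies static (Guyan-type) condensation to a small linear thermal network, eliminating internal nodes via a Schur complement; this is a generic technique chosen for illustration, not necessarily the Matrix Reduction or summation method developed in the paper.

```python
# Sketch: static condensation of a linear thermal network. G is the
# conductance matrix partitioned into retained (r) and internal (i) nodes;
# internal nodes are eliminated via the Schur complement.
import numpy as np

G = np.array([                  # small symmetric conductance matrix (W/K)
    [ 2.0, -1.0, -1.0,  0.0],
    [-1.0,  3.0, -1.0, -1.0],
    [-1.0, -1.0,  3.0, -1.0],
    [ 0.0, -1.0, -1.0,  2.0],
])
retained, internal = [0, 3], [1, 2]
Grr = G[np.ix_(retained, retained)]
Gri = G[np.ix_(retained, internal)]
Gii = G[np.ix_(internal, internal)]

G_reduced = Grr - Gri @ np.linalg.solve(Gii, Gri.T)   # Schur complement
print("reduced conductance matrix:\n", G_reduced.round(3))
```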

  18. Presenting an evaluation model of the trauma registry software.

    Science.gov (United States)

    Asadi, Farkhondeh; Paydar, Somayeh

    2018-04-01

    Trauma causes about 10% of deaths worldwide and is considered a global concern. This problem has led healthcare policy makers and managers to adopt a basic strategy in this context. A trauma registry plays an important and fundamental role in decreasing mortality and the disabilities caused by injuries resulting from trauma. Today, different software packages are designed for trauma registries. Evaluation of this software improves management and increases the efficiency and effectiveness of these systems. Therefore, the aim of this study is to present an evaluation model for trauma registry software. The present study is applied research. In this study, general and specific criteria of trauma registry software were identified by reviewing the literature, including books, articles, scientific documents, valid websites and related software in this domain. Based on the general and specific criteria and the related software, a model for evaluating trauma registry software was proposed. Based on the proposed model, a checklist was designed and its validity and reliability evaluated. The model was presented to 12 experts and specialists using the Delphi technique. To analyze the results, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, when the model was approved by the experts and professionals, the final version of the evaluation model for trauma registry software was presented. For evaluating the criteria of trauma registry software, two groups were presented: 1- general criteria, 2- specific criteria. General criteria of trauma registry software were classified into four main categories: 1- usability, 2- security, 3- maintainability, and 4- interoperability. Specific criteria were divided into four main categories: 1- data submission and entry, 2- reporting, 3- quality control, 4- decision and research support. The model presented in this research introduces important general and specific criteria of trauma registry software...
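A minimal sketch of the Delphi screening step, assuming each criterion is kept when at least 75% of the 12 experts agree; the criterion names and vote counts are invented.

```python
# Sketch: screen checklist criteria against a 75% expert-agreement threshold,
# as in a Delphi round with 12 panellists.
votes = {                        # criterion: approvals out of 12 experts
    "usability":              11,
    "security":               12,
    "data submission/entry":  10,
    "ad-hoc colour themes":    5,
}
N_EXPERTS, THRESHOLD = 12, 0.75

for criterion, approvals in votes.items():
    agreement = approvals / N_EXPERTS
    verdict = "keep" if agreement >= THRESHOLD else "revise/drop"
    print(f"{criterion:24s} {agreement:5.0%}  -> {verdict}")
```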

  19. Mesoscale to microscale wind farm flow modeling and evaluation

    DEFF Research Database (Denmark)

    Sanz Rodrigo, Javier; Chávez Arroyo, Roberto Aurelio; Moriarty, Patrick

    2017-01-01

    The increasing size of wind turbines, with rotors already spanning more than 150 m diameter and hub heights above 100 m, requires proper modeling of the atmospheric boundary layer (ABL) from the surface to the free atmosphere. Furthermore, large wind farm arrays create their own boundary layer stru...

  20. Evaluation of Irrigation Methods for Highbush Blueberry. I. Growth and Water Requirements of Young Plants

    Science.gov (United States)

    A study was conducted in a new field of northern highbush blueberry (Vaccinium corymbosum L. 'Elliott') to determine the effects of different irrigation methods on growth and water requirements of uncropped plants during the first 2 years after planting. The plants were grown on mulched, raised beds...