WorldWideScience

Sample records for thermocouple evaluation model

  1. Thermocouple

    International Nuclear Information System (INIS)

    Charlesworth, F.D.W.

    1983-01-01

    A thermocouple is provided by a cable of coaxial form with inner and outer conductors of thermocouple forming materials and with the conductors electrically joined together at one end of the cable to form the thermocouple junction. The inner and outer conductors are preferably of chromel and stainless steel respectively. (author)

  2. Evaluation of RTD and thermocouple for PID temperature control in ...

    African Journals Online (AJOL)

    Evaluation of RTD and thermocouple for PID temperature control in distributed control system laboratory. D. A. A. Nazarudin, M. K. Nordin, A. Ahmad, M. Masrie, M. F. Saaid, N. M. Thamrin, M. S. A. M. Ali ...

  3. An Explicit Approach Toward Modeling Thermo-Coupled Deformation Behaviors of SMPs

    Directory of Open Access Journals (Sweden)

    Hao Li

    2017-03-01

    A new elastoplastic J2-flow model with thermal effects is proposed for simulating thermo-coupled finite deformation behaviors of shape memory polymers. In this model, an elastic potential that evolves with the development of plastic flow is incorporated to characterize the stress-softening effect at unloading; moreover, thermo-induced plastic flow is introduced to represent the strain-recovery effect on heating. It is shown that any given test data for both effects may be accurately simulated by means of direct and explicit procedures. Numerical examples of model predictions compare well with test data in the literature.

  4. Structural evaluation of thermocouple probes for 241-AZ-101 waste tank

    International Nuclear Information System (INIS)

    Kanjilal, S.K.

    1994-01-01

    This document reports on the structural analysis of the thermocouple probe to be installed in the 241-AZ-101 waste tank. The thermocouple probe is analyzed for normal pump mixing operation and for the potential earthquake-induced loads required by the Hanford Site Design Criteria SDC-4.1.

  5. Simulation of a Shielded Thermocouple

    African Journals Online (AJOL)

    performance of shielded thermocouple designs. A mathematical model of the thermocouple is obtained by derivation of the heat propagation equation in cylindrical coordinates and by considering the ... Here k is the thermal conductivity, c_p is the specific heat capacity, ρ is the density, T∞ is the ambient temperature, and μ ...
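For reference, the heat propagation equation the abstract refers to plausibly takes the standard axisymmetric form in cylindrical coordinates, written here with the abstract's symbols (this is an assumption about the exact equation the paper derives):

```latex
\rho c_p \frac{\partial T}{\partial t}
  = \frac{1}{r}\frac{\partial}{\partial r}\!\left(k\,r\,\frac{\partial T}{\partial r}\right)
  + \frac{\partial}{\partial z}\!\left(k\,\frac{\partial T}{\partial z}\right)
```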

  6. A Simple Test to Evaluate the Calibration Stability and Accuracy of Infrared Thermocouple Sensors

    OpenAIRE

    Pinnock, Derek R.; Bugbee, Bruce

    2002-01-01

    Accurately measuring surface temperature is not difficult when the surface, the sensor, and the air temperatures are similar, but it is challenging when the surface temperature differs significantly from the air and sensor temperatures. We tested three infrared thermocouple sensors (IRTs) that had been used for two years in a greenhouse environment. The importance of the correction for sensor body temperature was also examined.
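The body-temperature correction examined here can be sketched with the Stefan-Boltzmann law: the detector signal depends on the difference of fourth powers of the target and sensor-body temperatures, so recovering the target temperature requires knowing the body temperature. A minimal sketch; the function names, emissivity value, and signal model are illustrative assumptions, not the authors' method:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def detector_signal(t_target_k, t_body_k, emissivity=0.95):
    """Net radiometric signal seen by an IRT detector (idealized gray-body model)."""
    return emissivity * SIGMA * (t_target_k**4 - t_body_k**4)

def corrected_target_temp(signal, t_body_k, emissivity=0.95):
    """Invert the signal model: without the body-temperature term the result is biased."""
    return (signal / (emissivity * SIGMA) + t_body_k**4) ** 0.25
```

Note that when target and body temperatures are equal the net signal is zero, which is why an uncorrected reading drifts with sensor body temperature.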

  7. Evaluation of Disinfection Techniques for, and Their Effects on, Rectal Thermocouple Catheters

    Science.gov (United States)

    Maher, J. T.; Rogers, M. R.; Peterson, D. W.

    1961-01-01

    The antibacterial activities of an iodophor (Wescodyne G), a quaternary ammonium compound (Roccal), and an iodine tincture as agents for the cold disinfection of rectal catheters contaminated in vitro were determined. Following thorough cleaning with an alcoholic solution of soft soap, each of the three disinfectants tested showed satisfactory results (100% kill) in 5 min against the enteric test bacteria (Escherichia coli and Salmonella typhosa) as well as a test species of the genus Pseudomonas, among the bacteria most resistant to surface-active agents. An aqueous solution of Wescodyne G containing 75 ppm available iodine was used both as a wiping solution and for subsequent disinfection of rectal catheters contaminated in vivo. Total bacterial destruction was found to follow a 60-min soak preceded by the wiping procedure. Rectal catheters subjected to prolonged immersion in each of the test disinfectants were found to be essentially unaffected, retaining their initial calibrations within a permissible tolerance. Neither Roccal nor Wescodyne G solutions were found to measurably attack bare thermocouples. Alcoholic iodine 0.5% did, however, exert a deteriorating effect on bare thermocouples in a short time, as measured by change in resistance characteristics. The results of this study have led to the recommendation that Wescodyne G containing 75 ppm available iodine be used in standing operating procedures for the initial cleaning and subsequent disinfection of rectal thermocouple catheters. PMID:13765378

  8. Beyond "fire temperatures": calibrating thermocouple probes and modeling their response to surface fires in hardwood fuels

    Science.gov (United States)

    Anthony S. Bova; Matthew B. Dickinson

    2008-01-01

    The maximum temperatures of thermocouples, temperature-sensitive paints, and calorimeters exposed to flames in wildland fires are often called "fire temperatures" but are determined as much by the properties and deployment of the measurement devices as by the fires themselves. Rather than report device temperatures that are not generally comparable among...

  9. Comparative evaluation of corrosion behaviour of type K thin film thermocouple and its bulk counterpart

    International Nuclear Information System (INIS)

    Mukherjee, S.K.; Barhai, P.K.; Srikanth, S.

    2011-01-01

    Highlights: → Anodic vacuum arc deposited chromel and alumel films are more 'noble' in 5% NaCl solution than their respective wires. → Chromel undergoes localised corrosion while alumel shows uniform corrosion. → Virgin samples of chromel-alumel TFTCs exhibit good thermoelectric response. → Their thermoelectric outputs remain largely unaffected when shelved under normal atmospheric conditions. → After 288 h of exposure in salt spray environment, their thermoelectric outputs show noticeable change due to size effects. - Abstract: This paper investigates the corrosion behaviour of type K thermoelements and their thin films, and compares the performance of chromel-alumel thin film thermocouple with its wire counterpart before and after exposure to 5% NaCl medium. Potentiodynamic polarisation tests reveal that chromel and alumel films are more 'noble' than their respective wires. Alumel corrodes faster when coupled with chromel in films than as wires. Secondary electron micrographs and electrochemical impedance spectroscopy measurements suggest that chromel shows localised corrosion while alumel undergoes uniform corrosion. Corrosion adversely affects the thermocouple output and introduces an uncertainty in the measurement.

  10. Travelling gradient thermocouple calibration

    International Nuclear Information System (INIS)

    Broomfield, G.H.

    1975-01-01

    A short discussion of the origins of the thermocouple EMF is used to re-introduce the idea that the Peltier and Thomson effects are indistinguishable from one another. Thermocouples may be viewed either as devices that generate an EMF at junctions or as integrators of EMFs developed in thermal gradients. The thermal-gradient view is considered the more appropriate: it accords better with theory and observed behaviour, and it makes the correct approach to calibration, and to the investigation of service effects, immediately obvious. Inhomogeneities arise in thermocouples during manufacture and in service. The results of travelling-gradient measurements are used to show that such effects are revealed with a resolution that depends on the length of the gradient, although they may be masked during simple immersion calibration. Proposed tests on thermocouples irradiated in a nuclear reactor are discussed
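The gradient view described in this abstract has a compact mathematical statement: the EMF is the integral of the Seebeck coefficients over the temperature difference, or equivalently a line integral of the local Seebeck coefficient against the temperature gradient around the circuit, so a homogeneous wire in a gradient-free region contributes nothing. Writing S_A, S_B for the two legs:

```latex
E \;=\; \int_{T_{\mathrm{ref}}}^{T_{\mathrm{meas}}} \bigl(S_A(T) - S_B(T)\bigr)\, dT
  \;=\; \oint S(x)\,\frac{dT}{dx}\, dx
```

The second form makes the paper's point explicit: an inhomogeneity (S varying along x) only produces a spurious EMF where it coincides with a thermal gradient, which is why a travelling gradient reveals defects that immersion calibration masks.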

  11. Identification model of an accidental drop of a control rod in PWR reactors using thermocouple readings and radial basis function neural networks

    International Nuclear Information System (INIS)

    Souza, T.J.; Medeiros, J.A.C.C.; Gonçalves, A.C.

    2017-01-01

    Highlights: • An alternative model capable of identifying the control rod that has accidentally dropped. • The identification model is based on readings of the thermocouples. • A radial basis function neural network is applied to predict the temperatures at control rod positions. - Abstract: The accidental dropping of a control rod may cause the reactor to operate unsafely. In this type of event, the power distribution is distorted and the temperature in the core may exceed safe reactor operating limits. This work aims to develop an alternative model capable of identifying, at any time of the cycle, the control rod that has accidentally dropped in the core of a PWR reactor, using the readings of the thermocouples, in order to minimize possible losses. The model assumes that in a drop of a control rod, the largest temperature change occurs at the position where the control rod is inserted. Because there are no temperature gauges at all control rod positions, the proposed model uses radial basis function (RBF) neural networks to reconstruct the temperatures at these positions from the measurements of the thermocouples at the time of the accidental drop. The study found that the temperature predictions made by the RBF neural networks showed good results, which enables identification of the accidentally dropped control rod by simply inferring the fuel assembly with the lowest reconstructed temperature.
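The reconstruction step described here can be sketched with a generic Gaussian radial basis function interpolator: fit the thermocouple readings over core (x, y) positions, then evaluate at uninstrumented positions. This is not the authors' trained network; the probe positions, temperatures, and shape parameter eps are illustrative, and deviations are interpolated about the mean (a design choice so estimates relax to the mean far from probes):

```python
import math

def solve(a, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def rbf_fit(points, values, eps=1.0):
    """Exact Gaussian-RBF interpolant through (point, value) pairs."""
    mean = sum(values) / len(values)
    dev = [v - mean for v in values]
    phi = lambda r: math.exp(-(eps * r) ** 2)
    a = [[phi(math.dist(p, q)) for q in points] for p in points]
    w = solve(a, dev)
    return lambda p: mean + sum(wi * phi(math.dist(p, q)) for wi, q in zip(w, points))

# Illustrative thermocouple (x, y) positions and outlet temperatures (deg C):
probes = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
temps = [318.0, 321.5, 320.0, 316.2]
estimate = rbf_fit(probes, temps)
t_center = estimate((0.5, 0.5))  # reconstruction at an uninstrumented position
```

In the paper's scheme the reconstructed field would be evaluated at every control-rod position and the assembly with the lowest temperature identified as the dropped rod.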

  12. Temperature measurements by thermocouples

    International Nuclear Information System (INIS)

    Liermann, J.

    1975-01-01

    The measurement of a temperature (whatever the type of transducer used) raises three problems: the choice of transducer; where it should be placed; how it should be fixed and protected. These are the three main points examined, after a brief description of the most commonly used thermocouples.

  13. Simulation of a welding process in polyduct pipelines resolved with a finite elements computational model. Comparison with analytical solutions and tests with thermocouples

    International Nuclear Information System (INIS)

    Sanzi, H; Elvira, G; Kloster, M; Asta, E; Zalazar, M

    2006-01-01

    All welding processes induce deformations and thermal stresses, which must be evaluated correctly since they can influence a component's structural integrity. This work determines the distribution of temperatures that develops during a manual welding process with shielded electrodes (SMAW) on the circumferential seam of a pipe for use in polyducts. A simplified finite element (FEA) model using three-dimensional solids is proposed for the study. The analysis considers that while the welding process is underway no heat is lost to the environment (adiabatic conditions), and that the phase transformations produced in the material do not modify the properties of the base or supporting materials. The results of the simulation are compared with those obtained by recent analytical studies developed by different investigators, such as Nguyen, Ohta, Matsuoka, Suzuki and Taeda, where a continuously moving three-dimensional double-ellipsoidal source was used. The results are then compared with experimental measurements taken with thermocouples. This study establishes the sensitivity and validity of the proposed computer model and, in a second stage, reduces the engineering time needed to solve problems like the one presented when designing the corresponding welding procedure. (CW)

  14. Construction and evaluation of an aspirated thermocouple psychrometer [Construção e avaliação de psicrômetro aspirado de termopar]

    Directory of Open Access Journals (Sweden)

    Fábio Ricardo Marin

    2001-12-01

    The construction of a low-cost aspirated thermocouple psychrometer made of PVC tubes is described. The instrument can easily be connected to dataloggers. Aspiration is provided by fans of the type used in microcomputers, and temperatures are measured with copper-constantan thermocouple junctions; a cotton string is used to wet the wet-bulb junction. Its performance was evaluated by comparison with an Assmann aspirated psychrometer and a Vaisala Inc. capacitive hygrometer, in both natural and controlled environments. The results show very good precision and accuracy, so the psychrometer described here can be used to determine actual vapour pressure and relative humidity without loss of data quality, as well as in studies involving gradients of temperature and specific humidity.

  15. Accounting for the inertia of the thermocouples' measurements by modelling of a NPP Kalinin-3 transient with the coupled system code ATHLET-BIPR-VVER

    International Nuclear Information System (INIS)

    Nikonov, S.; Velkov, K.

    2008-01-01

    The ATHLET-BIPR-VVER coupled system code is applied to safety analyses for different WWER reactors. Over the last years its validation matrix has been continuously enlarged. The measurements performed during the commissioning phase of NPP Kalinin Unit 3 for the transient 'Switching-off of one Main Circulation Pump at nominal power' are very well documented and include a variety of recorded integral and local thermal-hydraulic and neutron-physics parameters, together with the measurement errors. These data are being used for further validation of the coupled code system ATHLET-BIPR-VVER. The paper discusses the problems, and our solutions, involved in correctly interpreting the measured thermocouple records at NPP Kalinin-3 and in comparing them with the results predicted by the coupled thermal-hydraulic/neutron-kinetic code ATHLET-BIPR-VVER. Of primary importance in such comparisons are the correct accounting of the fluid mixing that takes place around the measuring sensors and the consideration of the time delay (inertia term) of the measuring devices. On the basis of previous experience and many simulations of the defined transient, a method is proposed to correctly account for the inertia term of the thermocouple measurements. The new modelling is implemented in the coupled system code ATHLET-BIPR-VVER for further validation. (Author)
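The inertia term discussed in this abstract is commonly modeled as a first-order lag, tau * dTm/dt = Tf - Tm. A minimal explicit-Euler sketch of forward-modeling and inverting such a lag; the time constant, step size, and temperature profile are illustrative, and this is not the ATHLET-BIPR-VVER implementation:

```python
def thermocouple_lag(fluid_temps, dt, tau):
    """Forward-model a thermocouple with time constant tau (first-order lag)."""
    tm = fluid_temps[0]
    measured = [tm]
    for tf in fluid_temps[1:]:
        tm += (dt / tau) * (tf - tm)   # explicit Euler step of tau*dTm/dt = Tf - Tm
        measured.append(tm)
    return measured

def recover_fluid_temp(measured, dt, tau):
    """Invert the lag model: Tf ~ Tm + tau * dTm/dt (forward difference)."""
    return [measured[i] + tau * (measured[i + 1] - measured[i]) / dt
            for i in range(len(measured) - 1)]
```

For a sudden coolant-temperature step the measured signal approaches the fluid temperature exponentially, while the inverted signal recovers the step, which is the kind of correction needed before comparing thermocouple records with code predictions.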

  16. Noncontacting Measurement With a Thermocouple

    Science.gov (United States)

    Weatherill, W. T.; Schoreder, C. J.; Freitag, H. J.

    1986-01-01

    Tentlike covering brings thermocouple to within a few degrees of surface temperature. Technique originally developed for measuring surface temperature of quartz fabric under radiant heating requires no direct contact with heated surface. Technique particularly useful for measuring surface temperatures of materials that would be damaged if a thermocouple or other temperature sensor were attached.

  17. High-temperature thermocouples and related methods

    Science.gov (United States)

    Rempe, Joy L [Idaho Falls, ID; Knudson, Darrell L [Firth, ID; Condie, Keith G [Idaho Falls, ID; Wilkins, S Curt [Idaho Falls, ID

    2011-01-18

    A high-temperature thermocouple and methods for fabricating a thermocouple capable of long-term operation in high-temperature, hostile environments without significant signal degradation or shortened thermocouple lifetime due to heat induced brittleness.

  18. AGR-1 Thermocouple Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jeff Einerson

    2012-05-01

    This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, the AGR-1 test configuration and test procedure, an overview of AGR-1 measured data, and an overview of the physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of the statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties that emerged from the combined analysis is identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of

  19. AGR-1 Thermocouple Data Analysis

    International Nuclear Information System (INIS)

    Einerson, Jeff

    2012-01-01

    This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, the AGR-1 test configuration and test procedure, an overview of AGR-1 measured data, and an overview of the physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of the statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties that emerged from the combined analysis is identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of

  20. Development of a micro-thermal flow sensor with thin-film thermocouples

    Science.gov (United States)

    Kim, Tae Hoon; Kim, Sung Jin

    2006-11-01

    A micro-thermal flow sensor is developed using thin-film thermocouples as temperature sensors. A micro-thermal flow sensor consists of a heater and thin-film thermocouples which are deposited on a quartz wafer using stainless steel masks. Thin-film thermocouples are made of standard K-type thermocouple materials. The mass flow rate is measured by detecting the temperature difference of the thin-film thermocouples located in the upstream and downstream sections relative to a heater. The performance of the micro-thermal flow sensor is experimentally evaluated. In addition, a numerical model is presented and verified by experimental results. The effects of mass flow rate, input power, and position of temperature sensors on the performance of the micro-thermal flow sensor are experimentally investigated. At low values, the mass flow rate varies linearly with the temperature difference. The linearity of the micro-thermal flow sensor is shown to be independent of the input power. Finally, the position of the temperature sensors is shown to affect both the sensitivity and the linearity of the micro-thermal flow sensor.
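The linear low-flow relationship reported in this abstract implies a simple calibration: fit a sensitivity slope s in dT ~ s * mdot from known flow rates, then invert a measured temperature difference to a flow rate. A sketch under that linearity assumption; the function names and data are illustrative:

```python
def fit_sensitivity(flows, delta_ts):
    """Least-squares slope through the origin for dT ~ s * mdot."""
    return sum(m * d for m, d in zip(flows, delta_ts)) / sum(m * m for m in flows)

def flow_from_delta_t(delta_t, sensitivity):
    """Invert the calibration, valid only in the linear (low-flow) regime."""
    return delta_t / sensitivity
```

Because the abstract reports that linearity is independent of heater power, a single slope per operating point suffices; outside the linear regime a full calibration curve would be needed.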

  1. Heat-Conducting Anchors for Thermocouples

    Science.gov (United States)

    Macdavid, Kenton S.

    1987-01-01

    Metal particles in adhesive aid heat transfer. Aluminum caps containing silver-filled epoxy used as high-thermal-conductance anchors for thermocouples, epoxy providing thermal path between mounting surfaces and thermocouple measuring junctions. Normally, epoxy-filled aluminum caps used when measuring steady-state temperatures. Silver-filled epoxy used when thermocouple not isolated electrically from surface measured.

  2. PWR-PSMS benchmarking results using thermocouple data from the summer-1 plant

    International Nuclear Information System (INIS)

    Peng, C.M.; Ipakchi, A.; Kim, J.H.

    1986-01-01

    In large pressurized water reactor (PWR) power plants, estimating the in-core power distribution from off-line predictions is based on data from global measurements with conservative assumptions. The off-line predictions are too independent of the actual process to reflect the true state of the reactor. The on-line core monitoring systems tend to balance between measurements and theoretical calculations, better utilizing information coming from measurements. The hybrid system, which incorporates measurements in predictions along with frequent model adaptations, will closely track the actual operating state of the plant. Since the detailed core flux mapping is performed with large time intervals for those PWRs without fixed in-core detectors, the on-line signals from thermocouples located at the top of selected fuel assemblies offer an alternative means of monitoring. The in-core thermocouples give a good indication of the average coolant temperature at the outlet of the instrumented assemblies and potentially can provide continuous information on the radial power distribution between flux maps. The PWR Power Shape Monitoring System (PWR-PSMS) has implemented this on-line monitoring feature based on thermocouple readings to evaluate the core performance and to improve core monitoring. The purpose of this paper is to present the benchmark results of PWR-PSMS using thermocouple data from the Summer-1 plant of a Westinghouse PWR.

  3. Relative humidity measurements with thermocouple psychrometer and capacitance sensors

    International Nuclear Information System (INIS)

    Mao, Naihsien.

    1991-01-01

    The relative humidity is one of the important hydrological parameters affecting waste package performance. Water potential of a system is defined as the amount of work required to reversibly and isothermally move an infinitesimal quantity of water from a pool of pure water to that system at the same elevation. The thermocouple psychrometer, which acts as a wet-dry bulb instrument based on the Peltier effect, is used to measure water potential. The thermocouple psychrometer works only for relative humidity greater than 94 percent. Other sensors must be used for drier conditions. Hence, the author also uses a Vaisala Humicap, which measures the capacitance change due to relative humidity change. The operation range of the Humicap (Model HMP 135Y) is from 0 to 100 percent relative humidity and up to 160C (320F) in temperature. A psychrometer has three thermocouple junctions. Two copper-constantan junctions serve as reference temperature junctions and the constantan-chromel junction is the sensing junction. Current is passed through the thermocouple causing cooling of the sensing junction by the Peltier effect. When the temperature of the junction is below the dew point, water will condense upon the junction from the air. The Peltier current is discontinued and the thermocouple output is recorded as the temperature of the thermocouple returns to ambient. The temperature changes rapidly toward the ambient temperature until it reaches the wet bulb depression temperature. At this point, evaporation of the water from the junction produces a cooling effect upon the junction that offsets the heat absorbed from the ambient surroundings. This continues until the water is depleted and the thermocouple temperature returns to the ambient temperature (Briscoe, 1984). The datalogger starts to take data roughly at the wet bulb depression temperature
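The wet-bulb depression such a psychrometer measures converts to vapor pressure and relative humidity through the psychrometer equation. A sketch using the Magnus/Tetens saturation formula and a nominal aspirated-psychrometer coefficient; both are common textbook approximations assumed here, not values from this report:

```python
import math

def saturation_vp(t_c):
    """Saturation vapour pressure in kPa (Magnus/Tetens approximation)."""
    return 0.6108 * math.exp(17.27 * t_c / (t_c + 237.3))

def relative_humidity(t_dry_c, t_wet_c, pressure_kpa=101.325):
    """Psychrometer equation: e = es(Twet) - gamma * (Tdry - Twet)."""
    gamma = 0.000662 * pressure_kpa  # psychrometer coefficient, kPa per deg C (assumed)
    e = saturation_vp(t_wet_c) - gamma * (t_dry_c - t_wet_c)
    return 100.0 * e / saturation_vp(t_dry_c)
```

A zero depression gives 100% relative humidity, and larger depressions give drier air, which is why the wet-bulb depression temperature recorded by the datalogger is the key quantity.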

  4. LOFT small break test thermocouple installation

    International Nuclear Information System (INIS)

    Fors, R.M.

    1980-01-01

    The subject thermocouple design has been analyzed for the maximum expected hydraulic loading and found to be adequate. The natural frequency of the thermocouple was found to lie between the vortex-shedding frequencies for the gas and liquid phases, so a tendency for resonance will exist. However, since the thermocouple support will have a restricted displacement, the stresses found are below the endurance limit and are thus acceptable with respect to fatigue life as well as primary stress due to pressure loading.
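The vortex-shedding comparison described here follows from the Strouhal relation f = St * U / d, with St roughly 0.2 for circular cylinders over a wide Reynolds-number range; comparing the shedding frequency against the probe's natural frequency flags resonance risk. A sketch; the velocity, diameter, and margin threshold are illustrative assumptions:

```python
def vortex_shedding_freq(velocity, diameter, strouhal=0.2):
    """Shedding frequency for flow past a cylinder: f = St * U / d."""
    return strouhal * velocity / diameter

def resonance_risk(f_shed, f_natural, margin=0.3):
    """Flag designs whose shedding frequency falls within `margin` of the natural frequency."""
    return abs(f_shed - f_natural) / f_natural < margin
```

Because gas and liquid phases have different velocities, each phase yields its own shedding frequency, and a natural frequency between the two is what produced the resonance tendency noted in the analysis.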

  5. Novel thermocouples for automotive applications

    Directory of Open Access Journals (Sweden)

    P. Gierth

    2018-02-01

    Measurement of temperatures in engine and exhaust systems in automotive applications is necessary for thermal protection of the parts and optimizing of the combustion process. State-of-the-art temperature sensors are very limited in their response characteristic and installation space requirement. Miniaturized sensor concepts with a customizable geometry are needed. The basic idea of this novel sensor concept is to use thick-film technology on component surfaces. Different standardized and especially nonstandard material combinations of thermocouples have been produced for the validation of this technology concept. Application-oriented measurements took place in the exhaust system of a test vehicle and were compared to standard laboratory conditions.

  6. INITIAL RESULTS FROM INVESTIGATIONS TO ENHANCE THE PERFORMANCE OF HIGH TEMPERATURE IRRADIATION-RESISTANT THERMOCOUPLES

    International Nuclear Information System (INIS)

    Crepeau, John; Rempe, Joy; Wilkins, S. Curtis; Knudson, Darrell L.; Condie, Keith G.; Daw, Joshua

    2007-01-01

    New fuel, cladding, and structural materials offer the potential for safer and more economic energy from existing reactor and advanced nuclear reactor designs. However, insufficient data are available to characterize these materials in high temperature, radiation conditions. To evaluate candidate material performance, robust instrumentation is needed that can survive these conditions. However, traditional thermocouples either drift due to degradation at high temperatures (above 1100 C) or due to transmutation of thermocouple components. Thermocouples are needed which can withstand both high temperature and high radiation environments. To address this instrumentation need, the Idaho National Laboratory (INL) recently developed the design and evaluated the performance of a high temperature radiation-resistant thermocouple that contains commercially-available alloys of molybdenum and niobium (Rempe, 2006). Candidate thermocouple component materials were first identified based on their ability to withstand high temperature and radiation. Then, components were selected based on data obtained from materials interaction tests, ductility investigations, and resolution evaluations. Results from long duration (over 4000 hours) tests at high temperatures (up to 1400 C) and thermal cycling tests demonstrate the stability and reliability of the INL-developed design. Tests in INL's Advanced Test Reactor (ATR) are underway to demonstrate the in-pile performance of these thermocouples. However, several options have been identified that could further enhance the lifetime and reliability of the INL-developed thermocouples, allowing their use in higher temperature applications (up to at least 1700 C). A joint University of Idaho (UI) and INL University Nuclear Energy Research Initiative (UNERI) is underway to investigate these options and ultimately, provide recommendations for an enhanced thermocouple design. This paper presents preliminary results from this UI/INL effort. 

  7. Characteristics of metal sheathed thermocouples in thermowell

    International Nuclear Information System (INIS)

    Okuda, Takehiro; Nakase, Tsuyoshi; Tanabe, Yutaka; Yamada, Kunitaka; Yoshizaki, Akio; Roko, Kiyokazu

    1987-01-01

    Static and dynamic characteristics of thermowell-type thermocouples planned for use in the High-Temperature engineering Test Reactor (HTTR) have been investigated. A mock-up test section was installed in Kawasaki's Helium Test Loop (KH-200), and thermal characteristics tests were carried out at temperatures of 600 ∼ 1000 deg C. The test section was equipped with four types of sheathed thermocouples: the well type and the non-well type, each with and without a thermal radiation shielding plate. The temperature measured by the well-type thermocouple with the shielding plate was only about 1.3 deg C higher than that without the shielding plate at a gas temperature of 990 deg C. The measured time constant of the well-type thermocouples was about 7 seconds at a heat transfer coefficient of 1600 kcal/m²·h·deg C on the well surface, and coincided with the value calculated by the 'TRUMP' code. (author)
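The quoted time constant can be sanity-checked with a lumped-capacitance estimate, tau = rho * cp * V / (h * A), after converting the quoted film coefficient from kcal/m²·h·°C to W/m²·K (factor 1.163). The junction geometry and material properties below are illustrative assumptions, so only the order of magnitude is meaningful:

```python
import math

KCAL_M2H_C_TO_W_M2K = 1.163  # unit conversion for heat transfer coefficients

def lumped_time_constant(rho, cp, volume, h, area):
    """tau = rho*cp*V/(h*A); valid when the Biot number h*Lc/k << 1."""
    return rho * cp * volume / (h * area)

# Hypothetical sheathed-junction geometry: 4.8 mm diameter, 10 mm active length.
d, length = 0.0048, 0.010
lateral_area = math.pi * d * length
volume = math.pi * (d / 2) ** 2 * length
h = 1600.0 * KCAL_M2H_C_TO_W_M2K             # ~1861 W/m^2 K
tau = lumped_time_constant(8000.0, 500.0, volume, h, lateral_area)  # stainless-like rho, cp
```

With these assumed numbers the estimate comes out at a few seconds, the same order of magnitude as the ~7 s reported for the well-type thermocouple (the well itself adds thermal mass that a bare-junction estimate omits).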

  8. Self-adapted thermocouple-diagnostic complex

    International Nuclear Information System (INIS)

    Alekseev, S.V.; Grankovskij, K.Eh.; Olejnikov, P.P.; Prijmak, S.V.; Shikalov, V.F.

    2003-01-01

    A self-adapted thermocouple-diagnostic complex (STDC) for obtaining reliable data on the coolant temperature in NPP reactors is described. The STDC is based on thermal-pulse monitoring of a thermocouple in the measuring channel of a reactor. The measurement method and the STDC composition are substantiated. It is shown that introduction of the developed STDC ensures precise and reliable temperature monitoring in reactors of all types.

  9. The transient response for different types of erodable surface thermocouples using finite element analysis

    Directory of Open Access Journals (Sweden)

    Mohammed Hussein

    2007-01-01

    The transient response of erodable surface thermocouples has been assessed numerically using a two-dimensional finite element analysis. Four types of base-metal erodable surface thermocouples were examined in this study: type-K (alumel-chromel), type-E (chromel-constantan), type-T (copper-constantan), and type-J (iron-constantan), each of 50 mm thickness. These thermocouples are of practical importance in internal combustion engine studies and aerodynamics experiments. A step heat flux was applied at the surface of the thermocouple model. The heat flux is commonly inferred from surface temperature measurements by assuming that heat transfer within these devices is one-dimensional. The surface temperature histories at different positions along the thermocouple are presented, and the normalized surface temperature histories at the center of the thermocouple are depicted for the different types at different response times. The thermocouple response to different heat flux variations was considered using a square heat flux of 2 ms width, a sinusoidal surface heat flux variation of 10 ms period, and a repeated heat flux variation of 2 ms width. The present results demonstrate that two-dimensional transient heat conduction effects have a significant influence on surface temperature history measurements made with these devices. The surface temperature history and transient response of the type-E thermocouple were higher than those of the other types because of its thermal properties. It was concluded that the thermal properties of the surrounding material have an impact, but the properties of the thermocouple and the insulation materials also make an important contribution to the net response.
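    The one-dimensional assumption mentioned above can be sketched with a simple explicit finite-difference model of a solid heated by a step surface heat flux (this is not the paper's finite element code; the material properties and flux value below are illustrative):

```python
import math

# Explicit 1-D finite-difference model of a semi-infinite solid under a
# constant surface heat flux q. Property values are illustrative only.
k, rho, cp = 20.0, 8000.0, 500.0        # W/m.K, kg/m3, J/kg.K
alpha = k / (rho * cp)                  # thermal diffusivity, m2/s
q = 1.0e6                               # step surface heat flux, W/m2
dx, nx = 5.0e-6, 300                    # grid spacing (m), node count
dt = 0.4 * dx * dx / alpha              # stable explicit time step
T = [0.0] * nx                          # temperature rise field

t, t_end = 0.0, 2.0e-3                  # simulate 2 ms
while t < t_end:
    Tn = T[:]
    # surface node: flux boundary condition via a ghost node
    Tn[0] = T[0] + 2.0 * alpha * dt / dx**2 * (T[1] - T[0] + q * dx / k)
    for i in range(1, nx - 1):
        Tn[i] = T[i] + alpha * dt / dx**2 * (T[i+1] - 2.0*T[i] + T[i-1])
    T = Tn
    t += dt

# Analytic semi-infinite-solid surface rise for comparison
exact = 2.0 * q * math.sqrt(alpha * t / math.pi) / k
print(T[0], exact)   # numerical vs analytic surface temperature rise
```

Agreement between the numerical surface node and the analytic solution is what validates a 1-D heat-flux inference of the kind the abstract describes; the paper's point is that real devices depart from this 1-D idealization.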

  10. Updated uncertainty budgets for NIST thermocouple calibrations

    Science.gov (United States)

    Meyer, C. W.; Garrity, K. M.

    2013-09-01

    We have recently updated the uncertainty budgets for calibrations in the NIST Thermocouple Calibration Laboratory. The purpose of the updates was to 1) revise the estimated values of the relevant uncertainty elements to reflect the current calibration facilities and methods, 2) provide uncertainty budgets for every standard calibration service offered, and 3) make the uncertainty budgets more understandable to customers by expressing all uncertainties in units of temperature (°C) rather than emf. We have updated the uncertainty budgets for fixed-point calibrations of type S, R, and B thermocouples and comparison calibrations of type R and S thermocouples using a type S reference standard. In addition, we have constructed new uncertainty budgets for comparison calibrations of type B thermocouples using a type B reference standard as well as using both a type S and type B reference standard (for calibration over a larger range). We have updated the uncertainty budgets for comparison calibrations of base-metal thermocouples using a type S reference standard and alternately using a standard platinum resistance thermometer reference standard. Finally, we have constructed new uncertainty budgets for comparison tests of noble-metal and base-metal thermoelements using a type S reference standard. A description of these updates is presented in this paper.
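    Expressing a budget in temperature units and combining its components can be sketched as follows; the Seebeck coefficient and component values below are illustrative assumptions, not NIST's figures:

```python
import math

def emf_to_temp(u_emf_uV, seebeck_uV_per_C):
    """Express an emf uncertainty (uV) in temperature units (deg C) by
    dividing by the local Seebeck coefficient of the thermocouple."""
    return u_emf_uV / seebeck_uV_per_C

def combined_uncertainty(components_C):
    """Root-sum-of-squares combination of independent standard
    uncertainty components, each already expressed in deg C."""
    return math.sqrt(sum(u * u for u in components_C))

# Hypothetical budget (values are illustrative, not NIST's):
seebeck = 10.0  # uV/deg C, roughly a type S couple near 1000 deg C
components = [
    emf_to_temp(1.0, seebeck),  # emf measurement, 1 uV -> 0.1 deg C
    0.05,                       # fixed-point realization, deg C
    0.02,                       # thermocouple inhomogeneity, deg C
]
u_c = combined_uncertainty(components)
U = 2.0 * u_c  # expanded uncertainty, coverage factor k = 2 (~95 %)
print(round(u_c, 3), round(U, 3))
```

Dividing by the Seebeck coefficient is the standard way to convert an emf-denominated element into °C, which is the presentational change the abstract describes.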

  11. A Mathematical Technique for Estimating True Temperature Profiles of Data Obtained from Long Time Constant Thermocouples

    National Research Council Canada - National Science Library

    Young, Graeme

    1998-01-01

    A mathematical modeling technique is described for estimating true temperature profiles of data obtained from long time constant thermocouples, which were used in fuel fire tests designed to determine...
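    This record does not detail the paper's technique, but a common first-order correction for a long time constant sensor is T_true ≈ T_meas + τ·dT_meas/dt. A sketch with synthetic data (the time constant, ramp rate, and sampling below are made up):

```python
import math

def reconstruct_true_temperature(times, readings, tau):
    """First-order inverse model: T_true ~= T_meas + tau * dT_meas/dt.
    Uses central differences at interior points and one-sided
    differences at the ends; tau is the sensor time constant (s)."""
    n = len(times)
    out = []
    for i in range(n):
        if i == 0:
            dTdt = (readings[1] - readings[0]) / (times[1] - times[0])
        elif i == n - 1:
            dTdt = (readings[-1] - readings[-2]) / (times[-1] - times[-2])
        else:
            dTdt = (readings[i+1] - readings[i-1]) / (times[i+1] - times[i-1])
        out.append(readings[i] + tau * dTdt)
    return out

# Synthetic check: a tau = 5 s sensor reading a 20 deg/s ramp lags the
# true temperature; the corrected profile recovers the ramp.
tau, rate = 5.0, 20.0
times = [0.1 * i for i in range(201)]
measured = [rate * (t - tau * (1.0 - math.exp(-t / tau))) for t in times]
recon = reconstruct_true_temperature(times, measured, tau)
print(round(recon[100], 2), round(rate * times[100], 2))
```

With noisy data this derivative term amplifies noise, so practical implementations filter the readings first; that trade-off is the core difficulty such estimation techniques address.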

  12. Study on thermocouple attachment in reflood experiments

    International Nuclear Information System (INIS)

    Sugimoto, Jun

    1977-03-01

    Methods of thermocouple attachment to heater rods have been studied for surface temperature measurement in reflood experiments. The method used thus far in JAERI's reflood experiments could fail to estimate quench times accurately. Various attachment methods have been tested, and some proved effective in this respect. (auth.)

  13. A thermocouple thermometry system for ultrasound hyperthermia

    International Nuclear Information System (INIS)

    Ozarka, M.; Gharakhani, A.; Magin, R.; Cain, C.

    1984-01-01

    A thermometry system designed for use in the treatment of cancer by ultrasound hyperthermia is described. The system monitors tumor temperatures using 16 type T (copper-constantan) thermocouples and is controlled by a 12 MHz Intel 8031 microcomputer. An analog circuit board contains the thermocouple amplifiers, an analog multiplexer, scaling circuitry, and an analog-to-digital converter. A digital board contains the Intel 8031, program memory, and data memory, as well as circuitry for control and data communications. Communication with the hyperthermia system control computer is serial via RS-232 with selectable baud rate. Since the thermocouple amplifiers may have slight differences in gain and offset, a calibrated offset is added to a lookup table value to obtain the proper display temperature to within ±0.1 °C. The calibration routine, implemented in software, loads a nonvolatile random access memory chip with the proper offset values based on the output of each thermocouple channel at known temperatures which bracket the range of interest.
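    The per-channel correction idea can be sketched as a two-point calibration: readings at two known bath temperatures determine a gain and offset that are then applied to every later reading. The channel readings and bath temperatures below are hypothetical:

```python
# Sketch of per-channel calibration: each amplifier channel has slight
# gain/offset differences, so a two-point calibration at known bath
# temperatures yields a correction for later readings. All numbers
# below are hypothetical.

def two_point_calibration(raw_lo, raw_hi, true_lo, true_hi):
    """Return (gain, offset) mapping raw readings to true temperature."""
    gain = (true_hi - true_lo) / (raw_hi - raw_lo)
    offset = true_lo - gain * raw_lo
    return gain, offset

def corrected(raw, gain, offset):
    return gain * raw + offset

# Channel read 36.8 and 44.9 in baths at 37.0 deg C and 45.0 deg C:
gain, offset = two_point_calibration(36.8, 44.9, 37.0, 45.0)
print(round(corrected(40.85, gain, offset), 1))  # → 41.0
```

Storing only `(gain, offset)` per channel in nonvolatile memory, as the abstract's offset table does, keeps the runtime correction to one multiply and one add.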

  14. High Temperature Irradiation-Resistant Thermocouple Performance Improvements

    International Nuclear Information System (INIS)

    Daw, Joshua; Rempe, Joy; Knudson, Darrell; Crepeau, John; Wilkins, S. Curtis

    2009-01-01

    Traditional methods for measuring temperature in-pile degrade at temperatures above 1100 C. To address this instrumentation need, the Idaho National Laboratory (INL) developed and evaluated the performance of a high temperature irradiation-resistant thermocouple (HTIR-TC) that contains alloys of molybdenum and niobium. Data from long-duration (up to 4000 hours) tests at high temperatures (up to 1500 C) and on-going irradiations at INL's Advanced Test Reactor demonstrate the superiority of these sensors to commercially-available thermocouples. However, several options have been identified that could further enhance their reliability, reduce their production costs, and allow their use in a wider range of operating conditions. This paper presents results from on-going INL/University of Idaho (UI) efforts to improve HTIR-TC ductility, reliability, and resolution by investigating specially-formulated alloys of molybdenum and niobium and alternate-diameter thermoelements (wires). In addition, on-going efforts to evaluate alternate fabrication approaches, such as drawn and loose-assembly techniques, will be discussed. Efforts to reduce HTIR-TC fabrication costs, such as the use of less expensive extension cable, will also be presented. Finally, customized HTIR-TC designs developed for specific customer needs will be summarized to emphasize the varied conditions under which these sensors may be used.

  15. Zircaloy sheathed thermocouples for PWR fuel rod temperature measurements

    International Nuclear Information System (INIS)

    Anderson, J.V.; Wesley, R.D.; Wilkins, S.C.

    1979-01-01

    Small diameter zircaloy sheathed thermocouples have been developed by EG and G Idaho, Inc., at the Idaho National Engineering Laboratory. Surface mounted thermocouples were developed to measure the temperature of zircaloy clad fuel rods used in the Thermal Fuels Behavior Program (TFBP), and embedded thermocouples were developed for use by the Loss-of-Fluid Test (LOFT) Program for support tests using zircaloy clad electrically heated nuclear fuel rod simulators. The first objective of this developmental effort was to produce zircaloy sheathed thermocouples to replace titanium sheathed thermocouples and thereby eliminate the long-term corrosion of the titanium-to-zircaloy attachment weld. The second objective was to reduce the sheath diameter to obtain faster thermal response and minimize cladding temperature disturbance due to thermocouple attachment

  16. Error analysis of thermocouple measurements in the Radiant Heat Facility

    International Nuclear Information System (INIS)

    Nakos, J.T.; Strait, B.G.

    1980-12-01

    The measurement most frequently made in the Radiant Heat Facility is temperature, and the transducer which is used almost exclusively is the thermocouple. Other methods, such as resistance thermometers and thermistors, are used but very rarely. Since a majority of the information gathered at Radiant Heat is from thermocouples, a reasonable measure of the quality of the measurements made at the facility is the accuracy of the thermocouple temperature data

  17. Thermocouple Errors when Mounted on Cylindrical Surfaces in Abnormal Thermal Environments.

    Energy Technology Data Exchange (ETDEWEB)

    Nakos, James T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Suo-Anttila, Jill M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zepper, Ethan T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Koenig, Jerry J [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Valdez, Vincent A. [ECI Inc., Albuquerque, NM (United States)

    2017-05-01

    Mineral-insulated, metal-sheathed, Type-K thermocouples are used to measure the temperature of various items in high-temperature environments, often exceeding 1000 °C (1273 K). The thermocouple wires (chromel and alumel) are protected from the harsh environments by an Inconel sheath and magnesium oxide (MgO) insulation. The sheath and insulation are required for reliable measurements. Due to the sheath and MgO insulation, the temperature registered by the thermocouple is not the temperature of the surface of interest. In some cases, the error incurred is large enough to be of concern because these data are used for model validation, and thus the uncertainties of the data need to be well documented. This report documents the error using 0.062" and 0.040" diameter Inconel-sheathed, Type-K thermocouples mounted on cylindrical surfaces (inside of a shroud, outside and inside of a mock test unit). After an initial transient, the thermocouple bias errors typically range only about ±1-2% of the reading in K. After all of the uncertainty sources have been included, the total uncertainty to 95% confidence, for shroud or test unit TCs in abnormal thermal environments, is about ±2% of the reading in K, lower than the ±3% typically used for flat shrouds. Recommendations are provided in Section 6 to facilitate interpretation and use of the results.

  18. Vacuum vessel and first wall thermocouple instrumentation on TFTR

    International Nuclear Information System (INIS)

    Ferrara, A.A.; Sredniawski, J.J.

    1983-01-01

    The temperature instrumentation of TFTR is designed to monitor critical vacuum vessel surface temperatures resulting from pulsed operation, discharge cleaning, bakeout, and the global thermal effects of first wall components, primarily during the period between pulses. The vacuum vessel instrumentation consists of 231 Type ''E'' thermocouples, while the first wall instrumentation consists of 193 Type ''E'' thermocouples. Temperature responses are processed and used to limit the operation of the machine to avoid over-stressing critical vacuum vessel structural areas and first wall components. This paper describes the complete thermocouple system, including thermocouple assemblies, voltage isolation, and temperature processing equipment, as presently installed and operating in the TFTR complex

  19. Fabrication and use of zircaloy/tantalum-sheathed cladding thermocouples and molybdenum/rhenium-sheathed fuel centerline thermocouples

    International Nuclear Information System (INIS)

    Wilkins, S.C.; Sepold, L.K.

    1985-01-01

    The thermocouples described in this report are zircaloy/tantalum-sheathed and molybdenum/rhenium alloy-sheathed instruments intended for fuel rod cladding and fuel centerline temperature measurements, respectively. Both types incorporate beryllium oxide insulation and tungsten/rhenium alloy thermoelements. These thermocouples, operated at temperatures of 2000 °C and above, were developed for use in the internationally sponsored Severe Fuel Damage test series in the Power Burst Facility. The fabrication steps for both thermocouple types are described in detail. A laser-welding attachment technique for the cladding-type thermocouple is presented, and experience with alternate materials for cladding and fuel thermocouples is discussed

  20. Low Drift Type N Thermocouples for Nuclear Applications

    International Nuclear Information System (INIS)

    Scervini, M.; Rae, C.

    2013-06-01

    Thermocouples are the most commonly used sensors for temperature measurement in nuclear reactors. They are crucial for the control of current nuclear reactors and for the development of GEN IV reactors. In nuclear applications thermocouples are strongly affected by intense neutron fluxes. As a result of the interaction with neutrons, the thermoelements of the thermocouples undergo transmutation, which produces a time-dependent change in composition and, as a consequence, a time-dependent drift of the thermocouple signal. Thermocouple drift can be very significant for in-pile temperature measurements and may render the temperature sensors unreliable after exposure to nuclear radiation for relatively short times compared to the life required of temperature sensors in nuclear applications. Previous experience with type K thermocouples in nuclear reactors has shown that they are affected by neutron irradiation only to a limited extent. Similarly, type N thermocouples are expected to be only slightly affected by neutron fluxes. Currently the use of nickel-based thermocouples is limited to temperatures lower than 1000 deg. C due to drift related to phenomena other than nuclear irradiation. In this work, undertaken as part of the European project METROFISSION, the drift of type N thermocouples has been investigated in the temperature range 600-1300 deg. C. The approach of this study is based on an attempt to separate the contribution of each thermoelement to drift. To identify the dominant thermoelement, the contributions of both the positive (NP) and negative (NN) thermoelements to the total drift of 3.2 mm diameter MIMS thermocouples were measured in each drift test using a pure Pt thermoelement as a reference. Conventional Inconel-600 sheathed type N thermocouples have been compared with type N thermocouples sheathed in a new alloy. At temperatures higher than 1000 deg. C, conventional Inconel-600 sheathed type N thermocouples can experience a

  1. Magnesia insulated thermocouples - Type K for nuclear applications

    International Nuclear Information System (INIS)

    1983-11-01

    This Specification for magnesia-insulated, steel or nickel alloy sheathed thermocouples, Type K, is intended only for nuclear applications where long-term reliability is of paramount importance and where replacement would be extremely difficult and costly. The stringent requirements of this Specification are intended to eliminate, as far as is practicable, defects which could cause failure of the thermocouple in service. (author)

  2. Thermocouple design for measuring temperatures of small insects

    Science.gov (United States)

    A.A. Hanson; R.C. Venette

    2013-01-01

    Contact thermocouples often are used to measure surface body temperature changes of insects during cold exposure. However, small temperature changes of minute insects can be difficult to detect, particularly during the measurement of supercooling points. We developed two thermocouple designs, which use 0.51 mm diameter or 0.127 mm diameter copper-constantan wires, to...

  3. Sputtered type s thermocouples on quartz glass substrates

    International Nuclear Information System (INIS)

    Sopko, B.; Vlk, J.; Chren, D.; Sopko, V.; Dammer, J.; Mengler, J.; Hynek, V.

    2011-01-01

    This work deals with the development of thin-film thermocouples and their practical use. The measurement principle of planar thin-film thermocouples is the same as for conventional thermocouples and is based on the thermoelectric effect, named after its discoverer, Seebeck. The Seebeck effect is the direct conversion of temperature differences into electric voltage. Various applications require temperature sensors with high spatial resolution (several measured points within a segment 1 mm long) and short response time; planar thermocouples are currently used for such applications, with important advantages in production cost and reproducible manufacture. The innovative potential of thin-film thermocouples lies mainly in: (1) the use of thin-layer technology, which, unlike the already mature technologies applied in the production of conventional thermocouple probes, is capable of further improvement through new substrate materials, modified methods for creating electrical contacts, new thermocouple configurations, and adhesive and protective layers; (2) savings in precious and rare metals; and (3) decreasing the thickness of the layers and reducing the overall size of the thermo probe. Measuring the temperature of molten steel leads to a general loss of strength and subsequent destruction of the probe; here the quartz plates used as thin-film thermocouple substrates exhibited the highest resistance. (authors)
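    The Seebeck effect described above can be illustrated with a constant-coefficient approximation; the coefficients below are made up for illustration (real thermocouples are characterized by temperature-dependent reference functions):

```python
def seebeck_emf(s_a, s_b, t_hot, t_cold):
    """Open-circuit emf (uV) of a couple of materials A and B with
    Seebeck coefficients s_a, s_b (uV/deg C), assumed constant over
    the temperature span -- a simplification; real thermocouple
    tables use temperature-dependent polynomials."""
    return (s_a - s_b) * (t_hot - t_cold)

# Hypothetical pair with a net sensitivity of 10 uV/deg C:
emf = seebeck_emf(16.0, 6.0, 1000.0, 25.0)
print(emf)  # → 9750.0 (uV across a 975 deg C difference)
```

Because the emf depends only on the coefficient difference and the junction temperature difference, the thin-film geometry changes response time and spatial resolution but not this underlying relation.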

  4. Simulation of a Shielded Thermocouple | Berntsson | Rwanda Journal

    African Journals Online (AJOL)

    A shielded thermocouple is a measurement device used for monitoring the temperature in chemically, or mechanically, hostile environments. The sensitive parts of the thermocouple are protected by a shielding layer. In this work we use numerical methods to study the accuracy and dynamic properties of a shielded ...

  5. Thermocouple Rakes for Measuring Boundary Layer Flows Extremely Close to Surface

    Science.gov (United States)

    Hwang, Danny P.; Fralick, Gustave C.; Martin, Lisa C.; Blaha, Charles A.

    2001-01-01

    Of vital interest to aerodynamic researchers is precise knowledge of the flow velocity profile next to the surface. This information is needed for turbulence model development and the calculation of viscous shear force. Though many instruments can determine the flow velocity profile near the surface, none of them can make measurements closer than approximately 0.01 in. from the surface. The thermocouple boundary-layer rake can measure much closer to the surface than conventional instruments such as a total pressure boundary layer rake, hot wire, or hot film. By embedding the sensors (thermocouples) in the region where the velocity is equivalent to the velocity ahead of a constant-thickness strut, the boundary-layer flow profile can be obtained. The present device, fabricated in the NASA Glenn Research Center microsystem clean room, has a heater made of platinum and thermocouples made of platinum and gold. Equal numbers of thermocouples are placed upstream and downstream of the heater, so that the voltage generated by each pair at the same distance from the surface is indicative of the difference in temperature between the upstream and downstream thermocouple locations. This voltage differential is a function of the flow velocity and, like the conventional total pressure rake, it can provide the velocity profile. In order to measure flow extremely close to the surface, the strut is made of fused quartz with extremely low heat conductivity. A large-size thermocouple boundary layer rake is shown in the following photo. The latest medium-size sensors already provide smooth velocity profiles well into the boundary layer, as close as 0.0025 in. from the surface. This is about 4 times closer to the surface than the previously used total pressure rakes. The device also has the advantage of providing the flow profile of separated flow, and it can measure turbulence levels within the boundary layer simultaneously.

  6. A Study of the Behavior Characteristics for K-type Thermocouple

    International Nuclear Information System (INIS)

    Ye, Songhae; Kim, Yongsik; Lee, Sooill; Kim, Sungjin; Lyou, Jooon

    2014-01-01

    The K-type thermocouple is widely used in nuclear power plants (NPPs) and provides reliable service. Generally, the thermocouple assembly is the finished product: usually only nondestructive tests are performed on the assembly, whereas destructive tests are confined to selected bulk cable specimens. The K-type thermocouple is used, representatively, in the In-Core Instrument (ICI) assembly in nuclear power plants. The ICI consists of five rhodium emitter detectors that provide information on the thermal power of the core and one K-type thermocouple, made with two cables (Chromel-Alumel), that provides the core exit temperature (CET). The number of ICIs differs according to the number of fuel assemblies in the NPP; in the case of SKN 3 and 4, 61 ICIs were designed to provide information on core cooling to the inadequate core cooling monitoring system (ICCMS). The measured temperature can also be used to check the entry condition of severe accidents. TFDR technology is a generic technique for locating cable faults. In-core instruments (ICIs) are used to detect the core exit temperature in a reactor; however, if a serious accident occurs and the upper portion of the core is damaged, this instrument is no longer available. This paper illustrates the possibility of estimating the status of molten core through high-temperature characteristics tests of the K-type thermocouple. It turns out that the K-type thermocouple can measure up to 1350 deg. C before melting during insertion into the melting furnace. Additionally, in order to measure high temperatures of 2000 deg. C or more, the possibility of replacing the K-type thermocouple was evaluated. However, the tungsten-rhenium thermocouple is impossible to use for temperature detection at the in-core because of the

  7. Evaluation models and evaluation use

    Science.gov (United States)

    Contandriopoulos, Damien; Brousselle, Astrid

    2012-01-01

    The use of evaluation results is at the core of evaluation theory and practice. Major debates in the field have emphasized the importance of both the evaluator’s role and the evaluation process itself in fostering evaluation use. A recent systematic review of interventions aimed at influencing policy-making or organizational behavior through knowledge exchange offers a new perspective on evaluation use. We propose here a framework for better understanding the embedded relations between evaluation context, choice of an evaluation model and use of results. The article argues that the evaluation context presents conditions that affect both the appropriateness of the evaluation model implemented and the use of results. PMID:23526460

  8. Study of thermocouples for control of high temperatures; Etude de thermocouples pour le reperage des hautes temperatures

    Energy Technology Data Exchange (ETDEWEB)

    Villamayor, M. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires; Faculte des Sciences de l' Universite de Lyon - 69 (France)

    1967-07-01

    Previous work showed that tungsten-rhenium alloy thermocouples are a good instrument for the control of high temperatures. Building on this, the author studied French-manufactured W/W 26 per cent Re and W 5 per cent Re/W 26 per cent Re thermocouples intended for temperature control in nuclear reactors up to 2300 deg. C. In the 'out-pile' study he determined the general characteristics of these thermocouples: average calibration curves, the influence of thermal shocks, response times, and the alloys providing cold-junction compensation. The evolution of these thermocouples under thermal neutron flux was determined in the 'in-pile' study. The observations led the author to propose a new type of thermocouple made of molybdenum-niobium alloys. (author)

  9. Metallic and Ceramic Thin Film Thermocouples for Gas Turbine Engines

    Directory of Open Access Journals (Sweden)

    Otto J. Gregory

    2013-11-01

    Temperatures of hot section components in today's gas turbine engines reach as high as 1,500 °C, making in situ monitoring of the severe temperature gradients within the engine rather difficult. There is therefore a need to develop instrumentation (i.e., thermocouples and strain gauges) for these turbine engines that can survive these harsh environments. Refractory metal and ceramic thin-film thermocouples are well suited for this task: they have excellent chemical and electrical stability at high temperatures in oxidizing atmospheres, they are compatible with the thermal barrier coatings commonly employed in today's engines, they have greater sensitivity than conventional wire thermocouples, and they are non-invasive to combustion aerodynamics in the engine. Thin-film thermocouples based on platinum:palladium and indium oxynitride:indium tin oxynitride, as well as their oxide counterparts, have been developed for this purpose and have proven more stable than conventional type-S and type-K thin-film thermocouples. The metallic and ceramic thin-film thermocouples described in this paper exhibited remarkable stability, with drift rates similar to bulk (wire) thermocouples.

  10. A new thermocouple with high stability, high reliability and high radioresistance

    International Nuclear Information System (INIS)

    Jiang Xiangying

    1989-01-01

    Newly developed NiCrSi/NiSiMg thermocouple alloys and the sheathed thermocouples made from them have high stability, reliability, and radiation resistance. Their properties are much better than those of the conventional NiCr/NiAl (type K) thermocouple, and some properties are also better than those of the Nicrosil/Nisil (type N) thermocouple. These new thermocouple alloys and sheathed thermocouples can be used in scientific research fields where high accuracy is needed and in nuclear or non-nuclear industries requiring high reliability

  11. Thin film thermocouples for high temperature turbine application

    Science.gov (United States)

    Martin, Lisa C.

    1991-01-01

    The objective is to develop thin film thermocouples (TFTC) for Space Shuttle Main Engine (SSME) components such as the high pressure fuel turbopump (HPFTP) blades and to test TFTC survivability and durability in the SSME environment. The purpose for developing TFTC's for SSME components is to obtain blade temperatures for computational models developed for fluid mechanics and structures. The TFTC must be able to withstand the presence of high temperature, high pressure hydrogen as well as a severe thermal transient due to a cryogenic to combustion temperature change. The TFTC's will eventually be installed and tested on SSME propulsion system components in the SSME test bed engine. The TFTC's were successfully fabricated on flat coupons of MAR-M 246 (Hf+), which is the superalloy material used for HPFTP turbine blades. The TFTC's fabricated on flat coupons survived thermal shock cycling as well as testing in a heat flux measurement facility which provided a rapid thermal transient. The same fabrication procedure was used to deposit TFTC's on HPFTP first stage rotor blades. Other results from the experiments are presented, and future testing plans are discussed.

  12. Microwave design and analysis of a micromachined self-heating power sensor based on matching thermocouples

    Science.gov (United States)

    Zhang, Zhiqiang; Liao, Xiaoping

    2017-08-01

    Microwave performance is a basic index of sensors used at microwave frequencies, and it also affects the sensing output. For low-loss microwave applications, careful microwave design is important for different microwave sensors. This paper presents the microwave design and analysis of a micromachined self-heating microwave power sensor in the GaAs MMIC process, in which the microwave power is dissipated and converted into output thermovoltages by two matching thermocouples. A dc-blocking capacitor is connected to the thermocouples in series to avoid an output short-circuit. To characterize the microwave performance, an S-parameter model of this self-heating power sensor is established. Using the model, the effects of the capacitor and the thermocouples on the reflection loss are investigated at different microwave frequencies. To demonstrate the validity of the microwave model, the microwave performance of the self-heating sensor is simulated using electromagnetic software. In the simulation, the relationship between the substrate membrane underneath the thermocouples and the reflection loss is analyzed. Measured reflection losses of the self-heating sensor are between -15.5 and -15.9 dB at 8-12 GHz. The measured results show good agreement with the microwave model and simulation, and the source of small deviations is discussed. The proposed microwave design and analysis contribute to achieving low reflection loss for the sensor, so that more power is converted into thermovoltage output.
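    What a figure like -15.5 dB means can be illustrated from the definition of the reflection coefficient; the load and reference impedances below are hypothetical, not the sensor's:

```python
import math

def reflection_coefficient(z_load, z0=50.0):
    """Magnitude of Gamma = (ZL - Z0)/(ZL + Z0) for real impedances."""
    return abs((z_load - z0) / (z_load + z0))

def return_loss_dB(z_load, z0=50.0):
    """Reflection (return) loss in dB; more negative means a better
    match and less reflected power."""
    return 20.0 * math.log10(reflection_coefficient(z_load, z0))

# A hypothetical 35-ohm load in a 50-ohm system reflects at about
# -15 dB, comparable in magnitude to the measured range quoted above.
print(round(return_loss_dB(35.0), 1))  # → -15.1
```

At -15 dB, roughly 3% of the incident power is reflected; the rest is available for dissipation in the thermocouple load, which is why a low reflection loss increases the thermovoltage output.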

  13. Mineral insulated thermocouples - installation in steam generating plant

    International Nuclear Information System (INIS)

    Bridges, W.J.; Brown, J.F.

    1980-01-01

    The main areas of interest considered are Central Station fossil-fuel-fired boilers of around 500 MW capacity, AGR boilers, and industrial and research development projects. While the requirement for temperature measurement in each of these areas may vary, the techniques adopted to overcome installation and protection problems created by thermal, chemical and mechanical hazards remain basically the same. The reasons for temperature measurement are described, together with methods of attachment development and procedures for protection of the thermocouple along its route length until its exit from the hazardous environment. The relative accuracies of the different attachments are discussed, along with factors influencing the life of the thermocouple. In many instances thermocouple installation is a once-only opportunity and/or an expensive exercise. It is therefore essential to develop and apply an effective quality control system during the installation phase. An effective system is described. Finally, a brief outline of possible future trends is given. (author)

  14. Automated data model evaluation

    International Nuclear Information System (INIS)

    Kazi, Zoltan; Kazi, Ljubica; Radulovic, Biljana

    2012-01-01

    Modeling is an essential phase within information systems development and implementation. This paper presents methods and techniques for the analysis and evaluation of data model correctness. Recent methodologies and development results regarding automation of the process of model correctness analysis, and its relations with ontology tools, are presented. Key words: Database modeling, Data model correctness, Evaluation

  15. An Innovative Flow-Measuring Device: Thermocouple Boundary Layer Rake

    Science.gov (United States)

    Hwang, Danny P.; Fralick, Gustave C.; Martin, Lisa C.; Wrbanek, John D.; Blaha, Charles A.

    2001-01-01

    An innovative flow-measuring device, a thermocouple boundary layer rake, was developed. The sensor detects the flow by using a thin-film thermocouple (TC) array to measure the temperature difference across a heater strip. The heater and TC arrays are microfabricated on a constant-thickness quartz strut with low heat conductivity. The device can measure the velocity profile well into the boundary layer, about 65 μm from the surface, which is almost four times closer to the surface than has been possible with the previously used total pressure tube.

  16. Magnetic tunnel junction thermocouple for thermoelectric power harvesting

    Science.gov (United States)

    Böhnert, T.; Paz, E.; Ferreira, R.; Freitas, P. P.

    2018-05-01

    The thermoelectric power generated in magnetic tunnel junctions (MTJs) is determined as a function of the tunnel barrier thickness for a matched electric circuit. This study suggests that a lower resistance-area product and higher tunnel magnetoresistance will maximize the thermoelectric power output of the MTJ structures. Further, the thermoelectric behavior of a series of two MTJs, an MTJ thermocouple, is investigated as a function of its magnetic configurations. In an alternating magnetic configuration, the thermovoltages cancel each other while the magnetic contribution remains. A large array of MTJ thermocouples could amplify the magnetic thermovoltage signal significantly.

  17. Realization of Copper Melting Point for Thermocouple Calibrations

    Directory of Open Access Journals (Sweden)

    Y. A. ABDELAZIZ

    2011-08-01

    Although the temperature stability and uncertainty of the freezing plateau are better than those of the melting plateau for most thermometry fixed points, melting plateaus are easier to realize than freezing plateaus for metal fixed points. It would therefore be convenient if melting points could be used instead of freezing points in the calibration of standard noble metal thermocouples, because of the easier realization and longer duration of melting plateaus. In this work a comparison between the melting and freezing points of copper (Cu) was carried out using standard noble metal thermocouples. Platinum - platinum 10% rhodium (type S), platinum 30% rhodium / platinum 6% rhodium (type B) and platinum - palladium (Pt/Pd) thermocouples were used in this study. An uncertainty budget analysis of the melting and freezing points is presented. The experimental results show that it is possible to replace the freezing point with the melting point of a copper cell in the calibration of standard noble metal thermocouples in secondary-level laboratories, provided the optimal methods of realizing melting points are used.

  18. The IIR evaluation model

    DEFF Research Database (Denmark)

    Borlund, Pia

    2003-01-01

    An alternative approach to evaluation of interactive information retrieval (IIR) systems, referred to as the IIR evaluation model, is proposed. The model provides a framework for the collection and analysis of IR interaction data. The aim of the model is two-fold: 1) to facilitate the evaluation of IIR systems as realistically as possible with reference to actual information searching and retrieval processes, though still in a relatively controlled evaluation environment; and 2) to calculate the IIR system performance taking into account the non-binary nature of the assigned relevance assessments. The IIR evaluation model is presented as an alternative to the system-driven Cranfield model (Cleverdon, Mills & Keen, 1966; Cleverdon & Keen, 1966), which is still the dominant approach to the evaluation of IR and IIR systems. Key elements of the IIR evaluation model are the use of realistic...

  19. The EMEFS model evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Barchet, W.R. (Pacific Northwest Lab., Richland, WA (United States)); Dennis, R.L. (Environmental Protection Agency, Research Triangle Park, NC (United States)); Seilkop, S.K. (Analytical Sciences, Inc., Durham, NC (United States)); Banic, C.M.; Davies, D.; Hoff, R.M.; Macdonald, A.M.; Mickle, R.E.; Padro, J.; Puckett, K. (Atmospheric Environment Service, Downsview, ON (Canada)); Byun, D.; McHenry, J.N.

    1991-12-01

    The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs.
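The difference statistics and correlations mentioned in the evaluation protocol can be sketched as follows. The function is a generic illustration of such statistics, not the actual EMEFS evaluation code, and the deposition values are invented:

```python
import numpy as np

def evaluation_stats(observed, predicted):
    """Difference statistics of the kind used to quantify model
    performance: mean bias, RMSE, and Pearson correlation."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    diff = predicted - observed
    bias = diff.mean()                      # mean (predicted - observed)
    rmse = np.sqrt((diff ** 2).mean())      # root-mean-square error
    corr = np.corrcoef(observed, predicted)[0, 1]
    return {"bias": bias, "rmse": rmse, "corr": corr}

# Hypothetical paired weekly deposition values (arbitrary units)
obs = [1.2, 0.8, 1.5, 2.1, 0.9, 1.7]
pred = [1.0, 0.9, 1.8, 1.9, 1.1, 1.6]
print(evaluation_stats(obs, pred))
```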

  20. The EMEFS model evaluation

    International Nuclear Information System (INIS)

    Barchet, W.R.; Dennis, R.L.; Seilkop, S.K.; Banic, C.M.; Davies, D.; Hoff, R.M.; Macdonald, A.M.; Mickle, R.E.; Padro, J.; Puckett, K.; Byun, D.; McHenry, J.N.; Karamchandani, P.; Venkatram, A.; Fung, C.; Misra, P.K.; Hansen, D.A.; Chang, J.S.

    1991-12-01

    The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs

  1. Zircaloy-sheathed element rods fitted with thermo-couples

    International Nuclear Information System (INIS)

    Bernardy de Sigoyer, B.; Jacques, F.; Thome, P.

    1963-01-01

    In order to carry out thermal conductivity measurements on UO2 under conditions similar to those in which fuel rods are used, it was necessary to measure the temperature inside a fuel element sheathed in zircaloy. The temperatures are taken with Thermocoax-type thermocouples, that is, thermocouples fitted with a very thin sheath of stainless steel or Inconel. It is also known that fusion welding of zircaloy onto stainless steel is impossible, and that high-temperature brazed joints are very difficult to make because of the aggressiveness of the braze. The technique used consists in brazing the thermocouples to relatively large stainless steel parts and then joining these plugs, by electron-bombardment welding, to diffusion-bonded stainless steel-zircaloy couplings. The properties of these diffusion-bonded couplings and of the brazed joints were studied; the various stages in the fabrication of the containers are also described. (authors) [fr

  2. Study of the Stability of Copper Fixed Point in Sealed Cells for Calibration of Noble Metal Thermocouples

    Directory of Open Access Journals (Sweden)

    Yasser A. Abdel-Aziz

    2009-12-01

    This paper reports the realization and stability of the freezing point of high-purity copper (99.9999% Cu, 1084.62 °C) in a sealed cell by noble metal thermocouples of Pt-10%Rh/Pt (type S) and Pt-30%Rh/Pt-6%Rh (type B), using a three-zone heating furnace. The graphite crucible in the sealed cell is made of ultra-high-purity carbon and contains 99.9999% purity Cu metal. The individual difference at the Cu freezing point, measured by each thermocouple at the National Institute for Standards (NIS) with respect to the value stated in the ITS-90, lies within the overall uncertainty of measurement. The expanded uncertainty of measurement was evaluated and expressed at a 95% confidence level. The Cu freezing point measured with the type S thermocouple was found to be 1084.62 °C ± 0.666 °C, and that with the type B thermocouple 1084.62 °C ± 0.532 °C.
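The expanded-uncertainty statement above follows the usual root-sum-square combination of standard uncertainty components with a coverage factor of k = 2 for roughly 95% confidence. A minimal sketch, with purely hypothetical component values rather than the NIS budget:

```python
import math

def expanded_uncertainty(components, k=2.0):
    """Root-sum-square combination of independent standard uncertainty
    components, expanded with coverage factor k (~95% for k = 2)."""
    u_c = math.sqrt(sum(u ** 2 for u in components.values()))
    return k * u_c

budget = {                       # standard uncertainties in °C (illustrative)
    "plateau repeatability": 0.15,
    "emf measurement":       0.10,
    "reference junction":    0.05,
    "inhomogeneity":         0.25,
}
U = expanded_uncertainty(budget)
print(f"U (k=2) = {U:.3f} °C")
```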

  3. Recent improvements on micro-thermocouple based SThM

    OpenAIRE

    Nguyen, T. P.; Thiery, L.; Teyssieux, D.; Briand, Danick; Vairac, P.

    2017-01-01

    The scanning thermal microscope (SThM) has become a versatile tool for local surface temperature mapping or measuring thermal properties of solid materials. In this article, we present recent improvements in a SThM system, based on a micro-wire thermocouple probe associated with a quartz tuning fork for contact strength detection. Some results obtained on an electrothermal micro-hotplate device, operated in active and passive modes, allow demonstrating its performance as a coupled force detection and thermal measurement system.

  4. Calibration Technique of the Irradiated Thermocouple using Artificial Neural Network

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Jin Tae; Joung, Chang Young; Ahn, Sung Ho; Yang, Tae Ho; Heo, Sung Ho; Jang, Seo Yoon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    To correct the signals, the degradation rate of the sensors needs to be analyzed, and re-calibration of the sensors should follow periodically. In particular, because thermocouples instrumented in a nuclear fuel rod degrade owing to the high neutron fluence generated by the nuclear fuel, a periodic re-calibration process is necessary. However, despite the re-calibration of the thermocouple, the measurement error will increase until the next re-calibration. In this study, based on periodically calibrated temperature-voltage data, an interpolation technique using an artificial neural network is introduced to minimize the calibration error of the C-type thermocouple under an irradiation test. The test results show that the calculated voltages derived from the interpolation function are in good agreement with the experimental sampling data, and that they accurately interpolate the voltages at arbitrary temperature and neutron fluence. That is, once reference data are obtained by experiment, it is possible to accurately calibrate the voltage signal at a given neutron fluence and temperature using an artificial neural network.
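As a rough sketch of the interpolation idea (not KAERI's network or data), a one-hidden-layer network can be fitted to a synthetic voltage surface over normalized temperature and fluence; the surface, network size, and training schedule are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration data: "voltage" as a smooth function of
# temperature T and neutron fluence F, both normalized to [0, 1]
T = rng.uniform(0, 1, (200, 1))
F = rng.uniform(0, 1, (200, 1))
X = np.hstack([T, F])
y = 1.0 * T + 0.3 * F * T - 0.1 * F        # hypothetical voltage surface

# One hidden layer of 8 tanh units, trained by full-batch gradient descent
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

_, out0 = forward(X)
loss0 = np.mean((out0 - y) ** 2)           # loss before training

lr = 0.1
for _ in range(3000):
    H, out = forward(X)
    err = out - y                          # gradient of MSE (up to 2/N)
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)       # backprop through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, out = forward(X)
loss = np.mean((out - y) ** 2)
print(f"MSE before {loss0:.4f} -> after {loss:.5f}")
```

Once trained on the reference calibration points, the network evaluates the voltage at any intermediate temperature and fluence, which is the interpolation role the abstract describes.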

  5. Introducing Program Evaluation Models

    Directory of Open Access Journals (Sweden)

    Raluca GÂRBOAN

    2008-02-01

    Program and project evaluation models can be extremely useful in project planning and management. The aim is to ask the right questions as early as possible, in order to detect and deal with unwanted program effects in time, as well as to encourage the positive elements of the project impact. In short, different evaluation models are used in order to minimize losses and maximize the benefits of interventions upon small or large social groups. This article introduces some of the most recently used evaluation models.

  6. Exploring Soot Particle Concentration and Emissivity by Transient Thermocouples Measurements in Laminar Partially Premixed Coflow Flames

    Directory of Open Access Journals (Sweden)

    Gianluigi De Falco

    2017-02-01

    Soot formation in combustion represents a complex phenomenon that strongly depends on several factors such as pressure, temperature, fuel chemical composition, and the extent of premixing. The effect of partial premixing on soot formation is of relevance also for real combustion devices and still needs to be fully understood. An improved version of the thermophoretic particle densitometry (TPD) method has been used in this work with the aim of obtaining both quantitative and qualitative information on soot particles generated in a set of laminar partially premixed coflow flames characterized by different equivalence ratios. To this end, the transient thermocouple temperature response has been analyzed to infer particle concentration and emissivity. A variety of thermal emissivity values has been measured for flame-formed carbonaceous particles, ranging from 0.4 to 0.5 for early nucleated soot particles up to 0.95, the value commonly attributed to mature soot particles, indicating that correct determination of the thermal emissivity is necessary to accurately evaluate the particle volume fraction. This is particularly true at the early stage of soot formation, when particle concentration measurement is especially challenging, as in the central region of the diffusion flames. With increasing premixing, an initial increase of particles is detected both in the maximum radial soot volume fraction region and in the central region of the flame, while further addition of primary air causes the particle volume fraction to drop. Finally, a modeling analysis based on a sectional approach has been performed to corroborate the experimental findings.
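A lumped junction energy balance of the kind that underlies thermocouple-based emissivity estimates can be sketched as follows. All coefficients are hypothetical, and this is only a toy version of the principle (steady-state convective gain balancing radiative loss), not the improved TPD procedure used in the paper:

```python
# Lumped junction balance: at steady state the convective heat gain
# h*(Tg - T) equals the radiative loss eps*sigma*(T^4 - Tw^4), so the
# coating emissivity can be solved from the plateau temperature.
SIGMA = 5.67e-8                                # Stefan-Boltzmann, W m^-2 K^-4
H_CONV, T_GAS, T_WALL = 400.0, 1900.0, 500.0   # hypothetical values (SI, K)

def steady_temperature(eps):
    """Find T with h*(Tg - T) = eps*sigma*(T^4 - Tw^4) by bisection."""
    lo, hi = T_WALL, T_GAS
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        f = H_CONV * (T_GAS - mid) - eps * SIGMA * (mid ** 4 - T_WALL ** 4)
        if f > 0:
            lo = mid          # convection still dominates: root is higher
        else:
            hi = mid
    return 0.5 * (lo + hi)

def emissivity_from_plateau(t_plateau):
    """Invert the steady-state balance for the emissivity."""
    return H_CONV * (T_GAS - t_plateau) / (SIGMA * (t_plateau ** 4 - T_WALL ** 4))

t_p = steady_temperature(0.95)                 # mature-soot-like coating
print(t_p, emissivity_from_plateau(t_p))
```

The sketch shows why the plateau temperature is sensitive to emissivity: a coating at 0.95 rather than 0.4-0.5 pulls the junction substantially below the gas temperature.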

  7. Recent improvements on micro-thermocouple based SThM

    Science.gov (United States)

    Nguyen, TP; Thiery, L.; Teyssieux, D.; Briand, D.; Vairac, P.

    2017-01-01

    The scanning thermal microscope (SThM) has become a versatile tool for local surface temperature mapping or measuring thermal properties of solid materials. In this article, we present recent improvements in a SThM system, based on a micro-wire thermocouple probe associated with a quartz tuning fork for contact strength detection. Some results obtained on an electrothermal micro-hotplate device, operated in active and passive modes, allow demonstrating its performance as a coupled force detection and thermal measurement system.

  8. Temperature Control System for Chromel-Alumel Thermocouple

    International Nuclear Information System (INIS)

    Piping Supriatna; Nurhanan; Riswan DJ; Heru K, B.; Edi Karyanta

    2003-01-01

    Nuclear power plant operation safety requires careful attention to temperature measurement and control. This report describes the fabrication of a temperature control system for a chromel-alumel thermocouple, in accordance with the materials, equipment, and human resources available in the laboratory. The basic component of the temperature control system is an LM-741 operational amplifier, which functions as a summing amplifier for voltage comparison. Functional tests of the control system demonstrated its ability to regulate the temperature about the reference value. The temperature control system will be implemented on a PCB processing machine. (author)
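The comparator-based regulation described above can be illustrated with a minimal on/off (bang-bang) control simulation; the first-order plant constants and the hysteresis band below are hypothetical, not the report's hardware values:

```python
# On/off temperature control of a first-order thermal plant, the kind
# of behaviour a voltage comparator with hysteresis implements in hardware.
def simulate(setpoint=80.0, hysteresis=2.0, t_amb=25.0,
             heater_w=50.0, loss_w_per_k=0.5, heat_cap=10.0,
             dt=0.1, steps=2000):
    temp, heater_on = t_amb, False
    for _ in range(steps):
        # Comparator with hysteresis (Schmitt-trigger behaviour)
        if temp < setpoint - hysteresis:
            heater_on = True
        elif temp > setpoint + hysteresis:
            heater_on = False
        power = heater_w if heater_on else 0.0
        # Explicit Euler step of C*dT/dt = P - k*(T - Tamb)
        temp += dt * (power - loss_w_per_k * (temp - t_amb)) / heat_cap
    return temp

print(f"temperature after run: {simulate():.1f} C")
```

After the initial rise, the simulated temperature oscillates within the hysteresis band around the setpoint, which is the "damping about the reference" behaviour the functional tests verified.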

  9. Stability Studies of a New Design Au/Pt Thermocouple Without a Strain Relieving Coil

    Science.gov (United States)

    Jahan, Ferdouse; Ballico, Mark

    2007-12-01

    The performance of a simple, new design Au/Pt thermocouple developed by NMIA is assessed. This thermocouple is proposed as a more accurate replacement, over the temperature range from 0 to 1,000°C, for the commonly used Type R and S industrial transfer standards, in a robust form familiar to industrial calibration laboratories. Due to the significantly different thermal expansions of the Au and Pt thermoelements, reported designs of the Au/Pt thermocouple incorporate a strain-relieving coil or bridge at the thermocouple junction. As the strain relieving coil is mechanically delicate, these thermocouples are usually mounted in a protective quartz tube assembly, like a standard platinum resistance thermometer (SPRT). Although providing uncertainties at the mK level, they are more delicate than the commonly used Type R and S thermocouples. A new and simple design of the Au/Pt thermocouple was developed in which the differential thermal expansion between Au and Pt is accommodated in the thermocouple leads, facilitated by a special head design. The resulting thermocouple has the appearance and robustness of the traditional Type R and S thermocouples, while retaining stability better than 10 mK up to 961°C. Three thermocouples of this design were calibrated at fixed points and by comparison to SPRTs in a stirred salt bath. In order to assess possible impurity migration, strain effects, and mechanical robustness, sequences of heat treatment up to a total of 500 h together with over 50 thermal cycles from 900°C to ambient were performed. The effect of these treatments on the calibration was assessed, demonstrating the sensors to be robust and stable to better than 10 mK. The effects on the measured inhomogeneity of the thermocouple were assessed using the NMIA thermocouple scanning bath.

  10. Training effectiveness evaluation model

    International Nuclear Information System (INIS)

    Penrose, J.B.

    1993-01-01

    NAESCO's Training Effectiveness Evaluation Model (TEEM) integrates existing evaluation procedures with new procedures. The new procedures are designed to measure training impact on organizational productivity. TEEM seeks to enhance organizational productivity through proactive training focused on operation results. These results can be identified and measured by establishing and tracking performance indicators. Relating training to organizational productivity is not easy. TEEM is a team process. It offers strategies to assess more effectively organizational costs and benefits of training. TEEM is one organization's attempt to refine, manage and extend its training evaluation program

  11. Intellectual Capital Evaluation Models

    OpenAIRE

    Agoston Simona; Puia Ramona Stefania; Orzea Ivona

    2010-01-01

    The evaluation and measurement of intellectual capital is an issue of increasing importance for companies because of the staleness of the traditional accounting systems which do not provide relevant information regarding the value of a company. Thus, specialists are working to identify a model for assessing intellectual capital that can be easily implemented and used. The large number of proposed models but also the major differences between them emphasizes the fact that the specialists are s...

  12. CMAQ Model Evaluation Framework

    Science.gov (United States)

    CMAQ is tested to establish the modeling system’s credibility in predicting pollutants such as ozone and particulate matter. Evaluation of CMAQ has been designed to assess the model’s performance for specific time periods and for specific uses.

  13. An experimental study of the effect of external thermocouples on rewetting during reflood

    International Nuclear Information System (INIS)

    Shires, G.L.; Butcher, A.A.; Carpenter, B.G.; McCune, D.S.; Pearson, K.G.

    1980-04-01

    The validation of computer codes used for PWR safety assessment often depends upon experiments carried out with either real fuel pins or electrically heated fuel pin simulators. In some cases, and this applies particularly to in-pile tests, temperatures are measured by means of sheathed thermocouples attached externally to the pins and this raises the question of the possible effect of such thermocouples on the two phase hydraulics and heat transfer which are being studied. This paper describes the experiments which subjected two realistic fuel pin simulators, one with and one without external thermocouples, to identical bottom flooding conditions. They demonstrate very clearly that external thermocouples act as preferential rewetting sites and thereby increase the rate of propagation of the quench front. In the view of the authors of this paper the facts described raise serious doubts about the validity of rewetting data obtained from experiments employing external thermocouples. (U.K.)

  14. Blind system identification of two-thermocouple sensor based on cross-relation method

    Science.gov (United States)

    Li, Yanfeng; Zhang, Zhijie; Hao, Xiaojian

    2018-03-01

    In dynamic temperature measurement, the dynamic characteristics of the sensor affect the accuracy of the measurement results. Thermocouples are widely used for temperature measurement in harsh conditions due to their low cost, robustness, and reliability, but because of their thermal inertia, a dynamic error arises in dynamic temperature measurement. In order to eliminate this dynamic error, a two-thermocouple sensor was used in this paper to measure dynamic gas temperature in constant-velocity flow environments. Blind system identification of the two-thermocouple sensor based on a cross-relation method was carried out. A particle swarm optimization algorithm was used to estimate the time constants of the two thermocouples and was compared with a grid-based search method. The method was validated on experimental equipment built around a high-temperature furnace, and the input dynamic temperature was reconstructed using the output data of the thermocouple with the smaller time constant.
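The cross-relation idea can be sketched numerically: for a first-order sensor, y + τ·dy/dt reconstructs the common input, so the residual between the two sensors' reconstructions vanishes at the true time constants. The test signal, the time constants (20 ms and 50 ms), and the grid-based search below are illustrative assumptions (the paper additionally uses particle swarm optimization to minimize the same cost):

```python
import numpy as np

dt = 1e-3
t = np.arange(0, 2, dt)
# Hypothetical gas temperature seen by both thermocouples
tg = 300 + 200 * np.sin(2 * np.pi * 1.5 * t) + 100 * np.sin(2 * np.pi * 3.7 * t + 0.5)

def sensor(tau):
    """First-order sensor response, explicit Euler integration."""
    y = np.empty_like(tg)
    y[0] = tg[0]
    for k in range(len(tg) - 1):
        y[k + 1] = y[k] + dt * (tg[k] - y[k]) / tau
    return y

y1, y2 = sensor(0.020), sensor(0.050)
d1, d2 = np.gradient(y1, dt), np.gradient(y2, dt)

# Cross-relation cost: reconstructed inputs must agree at the true taus
taus = np.arange(0.005, 0.101, 0.005)
cost = np.array([[np.mean((y1 + a * d1 - y2 - b * d2) ** 2) for b in taus]
                 for a in taus])
i, j = np.unravel_index(cost.argmin(), cost.shape)
print(f"estimated tau1 = {taus[i]:.3f} s, tau2 = {taus[j]:.3f} s")
```

Once the time constants are known, either reconstructed input y + τ·dy/dt serves as the estimate of the gas temperature, which is how the dynamic error is removed.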

  15. Lifetime improvement of sheathed thermocouples for use in high-temperature and thermal transient operations

    International Nuclear Information System (INIS)

    McCulloch, R.W.; Clift, J.H.

    1982-01-01

    Premature failure of small-diameter, magnesium-oxide-insulated sheathed thermocouples occurred when they were placed within nuclear fuel rod simulators (FRSs) to measure high temperatures and to follow severe thermal transients encountered during simulation of nuclear reactor accidents in Oak Ridge National Laboratory (ORNL) thermal-hydraulic test facilities. Investigation of thermally cycled thermocouples yielded three criteria for improvement of thermocouple lifetime: (1) reduction of oxygen impurities prior to and during their fabrication, (2) refinement of thermoelement grain size during their fabrication, and (3) elimination of prestrain prior to use above their recrystallization temperature. The first and third criteria were satisfied by improved techniques of thermocouple assembly and by a recovery anneal prior to thermocouple use

  16. Specific features of thermocouple calorimeter application for measurements of pulsed X-ray emission from plasma

    International Nuclear Information System (INIS)

    Gavrilov, V. V.; Fasakhov, I. K.

    2012-01-01

    It is shown that the accuracy of time-integrated measurements of pulsed X-ray emission from hot plasma with calibrated thermocouple calorimeters is mainly determined by two factors. The first and the most important factor is heating of the filter by the absorbed X-rays; as a result, the calorimeter measures the thermal radiation of the filter, which causes appreciable distortion of the temporal profile and amplitude of the recorded signal. The second factor is the dependence of the effective depth of X-ray absorption in the dielectric that covers the entrance window of the calorimeter on the energy of X-ray photons, i.e., on the recorded radiation spectrum. The results of model calculations of the calorimeter signal are compared with the experimental data.

  17. Development of Surface Eroding Thermocouples in DIII-D

    Science.gov (United States)

    Ren, Jun; Donovan, David; Watkins, Jon; Wang, Huiqian; Rudakov, Dmitry; Murphy, Christopher; Unterberg, Ezekial; Thomas, Dan; Boivin, Rejean

    2017-10-01

    The Surface Eroding Thermocouple (SETC) is a specialized diagnostic for characterizing the surface temperature evolution with a high temporal resolution (~1 ms), which is especially useful in areas unobservable by line-of-sight diagnostics (e.g., IR cameras). Recently, SETCs were tested in DiMES and successfully acquired temperature signals during strike point sweeps on the lower divertor shelf. We observed that the SETCs have a sub-10 ms time response, sufficient to resolve ELM heat pulses. Preliminary analysis shows heat fluxes measured by SETCs and the IR camera agree within 20%. Comparison of SETCs, calorimeters, and Langmuir probes also shows good agreement. We plan to implement an array of SETCs embedded in the tiles forming the new DIII-D small angle slot (SAS) divertor. Strategies to improve the SNR of these SETCs through testing in DiMES before the final installation will be discussed. This work was supported by the US Department of Energy under DE-SC0016318 (UTK), DE-AC05-00OR22725 (ORNL), DE-FG02-07ER54917 (UCSD), DE-FC02-04ER54698 (GA), DE-AC04-94AL85000 (SNL).

  18. Thermoelectric properties of currently available Au/Pt thermocouples related to the valid reference function

    Directory of Open Access Journals (Sweden)

    Edler F.

    2015-01-01

    Au/Pt thermocouples are considered to be an alternative to high-temperature standard platinum resistance thermometers (HTSPRTs) for realizing temperatures according to the International Temperature Scale of 1990 (ITS-90) in the range between the freezing points of aluminium (660.323 °C) and silver (961.78 °C). The original aim of this work was to develop and validate a new reference function for Au/Pt thermocouples which reflects the properties of presently commercially available Au and Pt wires. The thermoelectric properties of 16 Au/Pt thermocouples constructed at different National Metrology Institutes using wires from different suppliers, and of 4 commercially available Au/Pt thermocouples, were investigated. Most of them exhibit significant deviations from the current reference function of Au/Pt thermocouples, caused by the poor performance of the available Au wires. Thermoelectric homogeneity was investigated by measuring immersion profiles during freezes at the freezing point of silver and in liquid baths. The thermoelectric inhomogeneities were found to be one order of magnitude larger than those of Au/Pt thermocouples of the Standard Reference Material® (SRM®) 1749. Improvement of the annealing procedure of the gold wires is a key step toward achieving thermoelectric homogeneities of only about 2-3 mK, sufficient to replace the impracticable HTSPRTs as interpolation instruments of the ITS-90. Comparison measurements of some of the Au/Pt thermocouples against an HTSPRT and an absolutely calibrated radiation thermometer were performed and show agreement within the expanded measurement uncertainties. It was found that the current reference function of Au/Pt thermocouples adequately reflects the thermoelectric properties of currently available Au/Pt thermocouples.

  19. Thermocouples calibration and analysis of the influence of the length of the sensor coating

    International Nuclear Information System (INIS)

    Noriega, M; Ramírez, R; López, R; Vaca, M; Morales, J; Terres, H; Lizardi, A; Chávez, S

    2015-01-01

    This paper presents the design and construction of a laboratory prototype, at a much lower cost than commercially available equipment, enabling the manufacture of thermocouples, which are then calibrated to verify their functionality and acceptance. We also analyze the influence of the external insulation over the wires, to determine whether it affects the temperature measurement. The tested lengths ranged from 0.00 m up to 0.030 m. The fabricated thermocouple was compared against the behavior of a thermocouple of the same type purchased from a commercial supplier. The measurements differed by less than 1 °C at some points. This makes the built thermocouple reliable, since the standard allows a difference of up to 2.2 °C.
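The acceptance criterion described above (measured differences under 1 °C against an allowed 2.2 °C) can be expressed as a simple point-by-point check; the readings below are hypothetical, not the paper's data:

```python
def acceptance_check(reference_c, measured_c, tolerance_c=2.2):
    """Compare point-by-point readings of a fabricated thermocouple
    against a reference; accept if every deviation is within tolerance."""
    deviations = [m - r for r, m in zip(reference_c, measured_c)]
    worst = max(abs(d) for d in deviations)
    return worst, worst <= tolerance_c

# Hypothetical paired readings in °C (reference vs. fabricated sensor)
ref  = [0.0, 25.0, 50.0, 75.0, 100.0]
meas = [0.3, 24.6, 50.8, 75.5, 100.9]
worst, ok = acceptance_check(ref, meas)
print(f"worst deviation {worst:.1f} C -> {'accept' if ok else 'reject'}")
```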

  20. Tools for Model Evaluation

    DEFF Research Database (Denmark)

    Olesen, H. R.

    1998-01-01

    Proceedings of the Twenty-Second NATO/CCMS International Technical Meeting on Air Pollution Modeling and Its Application, held June 6-10, 1997, in Clermont-Ferrand, France.

  1. Model Program Evaluations. Fact Sheet

    Science.gov (United States)

    Arkansas Safe Schools Initiative Division, 2002

    2002-01-01

    There are probably thousands of programs and courses intended to prevent or reduce violence in this nation's schools. Evaluating these many programs has become a problem or goal in itself. There are now many evaluation programs, with many levels of designations, such as model, promising, best practice, exemplary and noteworthy. "Model program" is…

  2. Nuclear models relevant to evaluation

    International Nuclear Information System (INIS)

    Arthur, E.D.; Chadwick, M.B.; Hale, G.M.; Young, P.G.

    1992-01-01

    The widespread use of nuclear models continues in the creation of data evaluations. The reasons include extension of data evaluations to higher energies, creation of data libraries for isotopic components of natural materials, and production of evaluations for radioactive target species. In these cases, experimental data are often sparse or nonexistent. As this trend continues, the nuclear models employed in evaluation work move towards more microscopically-based theoretical methods, prompted in part by the availability of increasingly powerful computational resources. Advances in nuclear models applicable to evaluation will be reviewed. These include advances in optical model theory, microscopic and phenomenological state and level density theory, unified models that consistently describe both equilibrium and nonequilibrium reaction mechanisms, and improved methodologies for calculation of prompt radiation from fission. (orig.)

  3. Nuclear models relevant to evaluation

    International Nuclear Information System (INIS)

    Arthur, E.D.; Chadwick, M.B.; Hale, G.M.; Young, P.G.

    1991-01-01

    The widespread use of nuclear models continues in the creation of data evaluations. The reasons include extension of data evaluations to higher energies, creation of data libraries for isotopic components of natural materials, and production of evaluations for radioactive target species. In these cases, experimental data are often sparse or nonexistent. As this trend continues, the nuclear models employed in evaluation work move towards more microscopically-based theoretical methods, prompted in part by the availability of increasingly powerful computational resources. Advances in nuclear models applicable to evaluation will be reviewed. These include advances in optical model theory, microscopic and phenomenological state and level density theory, unified models that consistently describe both equilibrium and nonequilibrium reaction mechanisms, and improved methodologies for calculation of prompt radiation from fission. 84 refs., 8 figs

  4. Degradation by radiation of the response of a thermocouple of a fuel element

    International Nuclear Information System (INIS)

    Rodriguez V, A.

    1994-01-01

    In the TRIGA Mark III reactor of the National Institute of Nuclear Research, an instrumented fuel element is needed to measure the fuel temperature during power pulses. This fuel element is exposed to daily temperature gradients on the order of 390 Centigrade degrees under normal reactor operation at 1 MW. Experience with these instrumented fuel elements shows that the useful life of the thermocouples is shorter than that of the fuel, because they undergo significant changes in their chemical composition and electrical characteristics, to the point where they give no response at all. It is therefore necessary to identify the factors that shorten thermocouple life. How a change in composition affects the thermocouple calibration depends on where the changes take place relative to the temperature gradient. The change depends on the neutron flux, so the value of the neutron flux may be used as a measure of the composition change. If there is no neutron flux within the temperature gradient, there will be no composition change, and the thermocouple calibration will not change. If the neutron flux varies within the region in which a temperature gradient exists, the composition of the thermocouple will vary and the calibration will change. The maximum change in calibration occurs when the neutron flux is high and constant within the region of the temperature gradient; in that case a composition change takes place which is uniform throughout the gradient, and the emf output can be expected to change. In this reactor, the thermocouples fall into the latter case. The relative positions of the thermal gradient and the neutron flux are therefore the most important factor explaining the composition change after some 2,500 exposures of the thermocouples to temperature gradients on the order of 390 Centigrade degrees. (Author)

  5. Effect of the Thermocouple on Measuring the Temperature Discontinuity at a Liquid-Vapor Interface.

    Science.gov (United States)

    Kazemi, Mohammad Amin; Nobes, David S; Elliott, Janet A W

    2017-07-18

    The coupled heat and mass transfer that occurs in evaporation is of interest in a large number of fields such as evaporative cooling, distillation, drying, coating, printing, crystallization, welding, atmospheric processes, and pool fires. The temperature jump that occurs at an evaporating interface is of central importance to understanding this complex process. Over the past three decades, thermocouples have been widely used to measure the interfacial temperature jumps at a liquid-vapor interface during evaporation. However, the reliability of these measurements has not been investigated so far. In this study, a numerical simulation of a thermocouple when it measures the interfacial temperatures at a liquid-vapor interface is conducted to understand the possible effects of the thermocouple on the measured temperature and features in the temperature profile. The differential equations of heat transfer in the solid and fluids as well as the momentum transfer in the fluids are coupled together and solved numerically subject to appropriate boundary conditions between the solid and fluids. The results of the numerical simulation showed that while thermocouples can measure the interfacial temperatures in the liquid correctly, they fail to read the actual interfacial temperatures in the vapor. As the results of our numerical study suggest, the temperature jumps at a liquid-vapor interface measured experimentally by using a thermocouple are larger than what really exists at the interface. For a typical experimental study of evaporation of water at low pressure, it was found that the temperature jumps measured by a thermocouple are overestimated by almost 50%. However, the revised temperature jumps are still in agreement with the statistical rate theory of interfacial transport. 
As well as addressing the specific application of the liquid-vapor temperature jump, this paper provides significant insight into the role that heat transfer plays in the operation of thermocouples.
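    The conduction bias described above can be illustrated with a toy lumped steady-state energy balance (an illustration only: the study solves the full coupled heat- and momentum-transfer equations numerically, and every coefficient below is an invented assumption, not a value from the paper):

```python
# Toy model: a thermocouple junction suspended in the vapor exchanges heat
# with the vapor by convection and with the liquid side by conduction along
# its own wire.  At steady state,
#     hA * (T_vapor - T_tc) = G * (T_tc - T_liquid)
# so the reading is a conduction-weighted blend of the two temperatures.

def junction_temperature(t_vapor, t_liquid, hA, G):
    """hA: convective conductance to the vapor (W/K); G: conductance of the
    wire path toward the liquid side (W/K).  Both are illustrative lumps."""
    return (hA * t_vapor + G * t_liquid) / (hA + G)

# Hypothetical numbers: vapor at 2.0 C, liquid at 0.5 C.  With wire
# conduction comparable to convection, the junction does not read 2.0 C.
print(round(junction_temperature(2.0, 0.5, hA=0.2, G=0.2), 2))  # 1.25
```

    When the wire conductance G vanishes, the model recovers the vapor temperature exactly, which is the limit a thin, well-immersed probe approaches.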

  6. A novel approach for fault detection and classification of the thermocouple sensor in Nuclear Power Plant using Singular Value Decomposition and Symbolic Dynamic Filter

    International Nuclear Information System (INIS)

    Mandal, Shyamapada; Santhi, B.; Sridhar, S.; Vinolia, K.; Swaminathan, P.

    2017-01-01

    Highlights: • A novel approach to classify fault patterns using data-driven methods. • Application of a robust reconstruction method (SVD) to identify the faulty sensor. • Analysis of fault patterns for a large number of sensors using SDF with low time complexity. • An efficient data-driven model is designed to reduce false and missed alarms. - Abstract: A mathematical model with two layers is developed using data-driven methods for thermocouple sensor fault detection and classification in Nuclear Power Plants (NPP). The Singular Value Decomposition (SVD) based method is applied at the first layer to detect the faulty sensor from a data set of all sensors. In the second layer, the Symbolic Dynamic Filter (SDF) is employed to classify the fault pattern. If SVD detects any false fault, it is re-evaluated by the SDF, i.e., the model has two layers of checking to balance the false alarms. The proposed fault detection and classification method is compared with Principal Component Analysis. Two case studies are taken from the Fast Breeder Test Reactor (FBTR) to prove the efficiency of the proposed method.
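    The first-layer idea, picking out the faulty sensor from the full data set with an SVD, can be sketched generically as a low-rank reconstruction residual test on a time-by-sensor matrix. This is an illustration of the technique class, not the paper's exact algorithm; the synthetic data, the injected drift fault, and the rank-1 truncation are all assumptions:

```python
import numpy as np

# Synthetic data: six thermocouples tracking one shared process temperature,
# with a slow drift fault injected into sensor 3 (all numbers illustrative).
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200)
base = 300 + 5 * np.sin(t)
X = np.column_stack([base + 0.1 * rng.standard_normal(200) for _ in range(6)])
X[:, 3] += np.linspace(0, 8, 200)          # drifting (faulty) sensor

# A rank-1 SVD reconstruction captures the signal common to all sensors;
# the faulty sensor leaves the largest residual in its own column.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
X_hat = s[0] * np.outer(U[:, 0], Vt[0])
residual = np.linalg.norm(Xc - X_hat, axis=0)
faulty = int(np.argmax(residual))
print(faulty)  # 3
```

    In practice a threshold on the residual, rather than a bare argmax, would decide whether any sensor is faulty at all; that choice is what the paper's second SDF layer then cross-checks.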

  7. Global gridded crop model evaluation

    NARCIS (Netherlands)

    Müller, Christoph; Elliott, Joshua; Chryssanthacopoulos, James; Arneth, Almut; Balkovic, Juraj; Ciais, Philippe; Deryng, Delphine; Folberth, Christian; Glotter, Michael; Hoek, Steven; Iizumi, Toshichika; Izaurralde, Roberto C.; Jones, Curtis; Khabarov, Nikolay; Lawrence, Peter; Liu, Wenfeng; Olin, Stefan; Pugh, Thomas A.M.; Ray, Deepak K.; Reddy, Ashwan; Rosenzweig, Cynthia; Ruane, Alex C.; Sakurai, Gen; Schmid, Erwin; Skalsky, Rastislav; Song, Carol X.; Wang, Xuhui; Wit, De Allard; Yang, Hong

    2017-01-01

    Crop models are increasingly used to simulate crop yields at the global scale, but so far there is no general framework on how to assess model performance. Here we evaluate the simulation results of 14 global gridded crop modeling groups that have contributed historic crop yield simulations for

  8. Investigation of pool boiling dynamics on a rectangular heater using nano-thermocouples: is it chaotic or stochastic?

    Energy Technology Data Exchange (ETDEWEB)

    Sathyamurthi, Vijaykumar; Banerjee, Debjyoti [Texas A and M University, College Station, TX (United States). Dept. of Mechanical Engineering], e-mail: dbanerjee@tamu.edu

    2009-07-01

    The non-linear dynamical model of pool boiling on a horizontal rectangular heater is assessed from experimental results in this study. Pool boiling experiments are conducted over a horizontal rectangular silicon substrate measuring 63 mm x 35 mm with PF-5060 as the test fluid. Novel nano-thermocouples, micro-machined in situ on the silicon substrate, are used to measure the surface temperature fluctuations for steady-state pool boiling. The acquisition frequency for temperature data from the nano-thermocouples is 1 kHz. The surface temperature fluctuations are analyzed using the TISEAN package. A time-delay embedding is employed to generate higher-dimensional phase-space vectors from the temperature time series record. The optimal delay is determined from the first minimum of the mutual information function. Techniques such as recurrence plots and false nearest neighbors tests are employed to assess the presence of deterministic chaotic dynamics. Chaos quantifiers such as correlation dimensions are found for various pool boiling regimes using the raw data as well as noise-reduced data. Additionally, pseudo-phase spaces are used to reconstruct the 'attractors'. The results after non-linear noise reduction show the definitive presence of low-dimensional (d ≤ 7) chaos in fully developed nucleate boiling, at critical heat flux and in film boiling. (author)
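    The time-delay embedding step described above can be sketched as follows. This is a generic illustration: the study used the TISEAN package and chose the delay from the first minimum of the mutual information, whereas the surrogate signal and the fixed delay below are assumptions:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Build phase-space vectors [x[i], x[i+tau], ..., x[i+(dim-1)*tau]]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Surrogate signal standing in for the 1 kHz surface-temperature record
t = np.arange(0, 20, 0.01)
x = np.sin(t) + 0.5 * np.sin(2.3 * t)
emb = delay_embed(x, dim=3, tau=25)        # tau fixed here by assumption
print(emb.shape)  # (1950, 3)
```

    Quantifiers such as the correlation dimension and diagnostics such as recurrence plots are then computed on these embedded vectors rather than on the scalar series.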

  9. Investigation of pool boiling dynamics on a rectangular heater using nano-thermocouples: is it chaotic or stochastic?

    International Nuclear Information System (INIS)

    Sathyamurthi, Vijaykumar; Banerjee, Debjyoti

    2009-01-01

    The non-linear dynamical model of pool boiling on a horizontal rectangular heater is assessed from experimental results in this study. Pool boiling experiments are conducted over a horizontal rectangular silicon substrate measuring 63 mm x 35 mm with PF-5060 as the test fluid. Novel nano-thermocouples, micro-machined in situ on the silicon substrate, are used to measure the surface temperature fluctuations for steady-state pool boiling. The acquisition frequency for temperature data from the nano-thermocouples is 1 kHz. The surface temperature fluctuations are analyzed using the TISEAN package. A time-delay embedding is employed to generate higher-dimensional phase-space vectors from the temperature time series record. The optimal delay is determined from the first minimum of the mutual information function. Techniques such as recurrence plots and false nearest neighbors tests are employed to assess the presence of deterministic chaotic dynamics. Chaos quantifiers such as correlation dimensions are found for various pool boiling regimes using the raw data as well as noise-reduced data. Additionally, pseudo-phase spaces are used to reconstruct the 'attractors'. The results after non-linear noise reduction show the definitive presence of low-dimensional (d ≤ 7) chaos in fully developed nucleate boiling, at critical heat flux and in film boiling. (author)

  10. Rock mechanics models evaluation report

    International Nuclear Information System (INIS)

    1987-08-01

    This report documents the evaluation of the thermal and thermomechanical models and codes for repository subsurface design and for design constraint analysis. The evaluation was based on a survey of the thermal and thermomechanical codes and models that are applicable to subsurface design, followed by a Kepner-Tregoe (KT) structured decision analysis of the codes and models. The primary recommendations of the analysis are that the DOT code be used for two-dimensional thermal analysis and that the STEALTH and HEATING 5/6 codes be used for three-dimensional and complicated two-dimensional thermal analysis. STEALTH and SPECTROM 32 are recommended for thermomechanical analyses. The other evaluated codes should be considered for use in certain applications. A separate review of salt creep models indicates that the commonly used exponential time law model is appropriate for use in repository design studies. 38 refs., 1 fig., 7 tabs

  11. The EU model evaluation group

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1999-01-01

    The model evaluation group (MEG) was launched in 1992, growing out of the Major Technological Hazards Programme with EU/DG XII. The goal of MEG was to improve the culture in which models were developed, particularly by encouraging voluntary model evaluation procedures based on a formalised and consensus protocol. The evaluation was intended to assess the fitness-for-purpose of the models being used as a measure of their quality. The approach adopted focused on developing a generic model evaluation protocol and subsequently targeting it at specific areas of application. Five such developments have been initiated, on heavy gas dispersion, liquid pool fires, gas explosions, human factors and momentum fires. The quality of models is an important element when complying with the 'Seveso Directive', which requires that the safety reports submitted to the authorities comprise an assessment of the extent and severity of the consequences of identified major accidents. Further, the quality of models becomes important in the land-use planning process, where the proximity of industrial sites to vulnerable areas may be critical. (Copyright (c) 1999 Elsevier Science B.V., Amsterdam. All rights reserved.)

  12. Measuring skin temperature before, during and after exercise: a comparison of thermocouples and infrared thermography

    International Nuclear Information System (INIS)

    Fernandes, Alex de Andrade; Amorim, Paulo Roberto dos Santos; De Moura, Anselmo Gomes; Moreira, Danilo Gomes; Costa, Carlos Magno Amaral; Marins, João Carlos Bouzas; Brito, Ciro José; Sillero-Quintana, Manuel

    2014-01-01

    Measuring skin temperature (Tsk) provides important information about the complex thermal control system and could be interesting when carrying out studies about thermoregulation. The most common method to record Tsk involves thermocouples at specific locations; however, the use of infrared thermal imaging (IRT) has increased. The two methods use different physical processes to measure Tsk, and each has advantages and disadvantages. Therefore, the objective of this study was to compare the mean skin temperature (MTsk) measurements using thermocouples and IRT in three different situations: pre-exercise, exercise and post-exercise. Analysis of the residual scores in Bland–Altman plots showed poor agreement between the MTsk obtained using thermocouples and those using IRT. The averaged error was −0.75 °C during pre-exercise, 1.22 °C during exercise and −1.16 °C during post-exercise, and the reliability between the methods was low in the pre-exercise (ICC = 0.75 [0.12 to 0.93]), exercise (ICC = 0.49 [−0.80 to 0.85]) and post-exercise (ICC = 0.35 [−1.22 to 0.81]) conditions. Thus, there is poor correlation between the values of MTsk measured by thermocouples and IRT pre-exercise, exercise and post-exercise, and low reliability between the two forms of measurement. (paper)
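    The Bland–Altman agreement analysis used in the study can be sketched as follows; the paired readings below are hypothetical stand-ins, not the study's data:

```python
import numpy as np

def bland_altman(a, b):
    """Mean difference (bias) of a-b and the 95% limits of agreement."""
    d = np.asarray(a, float) - np.asarray(b, float)
    bias = d.mean()
    sd = d.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired mean-skin-temperature readings (degrees C)
mtsk_thermocouple = [31.2, 30.8, 31.5, 32.0, 31.1]
mtsk_infrared     = [31.9, 31.5, 32.4, 32.6, 31.8]
bias, (lo, hi) = bland_altman(mtsk_thermocouple, mtsk_infrared)
print(round(bias, 2))  # -0.72
```

    The bias corresponds to the "averaged error" the abstract reports per condition, and the width of the limits of agreement is what reveals the poor agreement between methods.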

  13. Temperature and SAR measurements in deep-body hyperthermia with thermocouple thermometry

    NARCIS (Netherlands)

    de Leeuw, A. A.; Crezee, J.; Lagendijk, J. J.

    1993-01-01

    Multisensor (7-14) thermocouple thermometry is used at our department for temperature measurement with our 'Coaxial TEM' regional hyperthermia system. A special design of the thermometry system with high resolution (0.005 degrees C) and fast data-acquisition (all channels within 320 ms) together

  14. Implications of using thermocouple thermometry in 27 MHz capacitively coupled interstitial hyperthermia

    NARCIS (Netherlands)

    Crezee, J.; van der Koijk, J. F.; Kaatee, R. S.; Lagendijk, J. J.

    1997-01-01

    The 27 MHz Multi Electrode Current Source (MECS) interstitial hyperthermia system uses segmented electrodes, 10-20 mm long, to steer the 3D power deposition. This power control at a scale of 1-2 cm requires detailed and accurate temperature feedback data. To this end seven-point thermocouples are

  15. Experiences with W3Re/W25Re thermocouples in fuel pins of NS Otto Hahn's two cores

    International Nuclear Information System (INIS)

    Kolb, M.

    1975-01-01

    Applications and performance of thermocouples in the Otto Hahn reactor are presented. Also discussed is the measurement of effective thermocouple time constants and of fuel rod heat transfer time constants using reactor noise and the resulting small temperature fluctuations, which has become practical with the advent of modern noise analysis systems

  16. Thermal Recovery from Cold-Working in Type K Bare-Wire Thermocouples

    Science.gov (United States)

    Greenen, A. D.; Webster, E. S.

    2017-12-01

    Cold-working of most thermocouples has a significant, direct impact on the Seebeck coefficient, which can lead to regions of thermoelectric inhomogeneity and accelerated drift. Cold-working can occur during the wire swaging process, when winding the wire onto a bobbin, or during handling by the end user, either accidentally or deliberately. Swaging-induced cold-work in thermocouples, if uniformly applied, may result in a high level of homogeneity. However, on exposure to elevated temperatures, the subsequent recovery process from the cold-working can then result in significant drift, and this can in turn lead to erroneous temperature measurements, often in excess of the specified manufacturer tolerances. Several studies have investigated the effects of cold-work in Type K thermocouples, usually by bending or swaging. However, the amount of cold-work applied to the thermocouple is often difficult to quantify, as the mechanisms for applying the strains are typically nonlinear when applied in this fashion. Here, a repeatable level of cold-working is applied to the different wires using a tensional loading apparatus to apply a known yield displacement to the thermoelements. The effects of thermal recovery from cold-working can then be accurately quantified as a function of temperature, using a linear gradient furnace and a high-resolution homogeneity scanner. Variation in these effects due to differing alloy compositions in Type K wire is also explored by sourcing wire from a selection of manufacturers. The information gathered in this way will inform users of Type K thermocouples about the potential consequences of varying levels of cold-working and its impact on the Seebeck coefficient at temperatures between approximately 70 °C and 600 °C. This study will also guide users on the temperatures required to rapidly alleviate the effects of cold-working using thermal annealing treatments.

  17. Long Hole Film Cooling Dataset for CFD Development. Part 1: Infrared Thermography and Thermocouple Surveys

    Science.gov (United States)

    Shyam, Vikram; Thurman, Douglas; Poinsatte, Phillip; Ameri, Ali; Eichele, Peter; Knight, James

    2013-01-01

    An experiment investigating flow and heat transfer of long (length to diameter ratio of 18) cylindrical film cooling holes has been completed. In this paper, the thermal field in the flow and on the surface of the film cooled flat plate is presented for nominal freestream turbulence intensities of 1.5 and 8 percent. The holes are inclined at 30deg above the downstream direction, injecting chilled air of density ratio 1.0 onto the surface of a flat plate. The diameter of the hole is 0.75 in. (0.01905 m) with center to center spacing (pitch) of 3 hole diameters. Coolant was injected into the mainstream flow at nominal blowing ratios of 0.5, 1.0, 1.5, and 2.0. The Reynolds number of the freestream was approximately 11,000 based on hole diameter. Thermocouple surveys were used to characterize the thermal field. Infrared thermography was used to determine the adiabatic film effectiveness on the plate. Hotwire anemometry was used to provide flowfield physics and turbulence measurements. The results are compared to existing data in the literature. The aim of this work is to produce a benchmark dataset for Computational Fluid Dynamics (CFD) development to eliminate the effects of hole length to diameter ratio and to improve resolution in the near-hole region. In this report, a Time-Filtered Navier Stokes (TFNS), also known as Partially Resolved Navier Stokes (PRNS), method that was implemented in the Glenn-HT code is used to model coolant-mainstream interaction. This method is a high fidelity unsteady method that aims to represent large scale flow features and mixing more accurately.

  18. A Taxonomy of Evaluation Models: Use of Evaluation Models in Program Evaluation.

    Science.gov (United States)

    Carter, Wayne E.

    In the nine years following the passage of the Elementary Secondary Education Act (ESEA), several models have been developed to attempt to remedy the deficiencies in existing educational evaluation and decision theory noted by Stufflebeam and co-workers. Compilations of evaluation models have been undertaken and listings exist of models available…

  19. Proposed algorithm for determining the delta intercept of a thermocouple psychrometer curve

    International Nuclear Information System (INIS)

    Kurzmack, M.A.

    1993-01-01

    The USGS Hydrologic Investigations Program is currently developing instrumentation to study the unsaturated zone at Yucca Mountain in Nevada. Surface-based boreholes up to 2,500 feet in depth will be drilled and then instrumented in order to define the water potential field within the unsaturated zone. Thermocouple psychrometers will be used to monitor the in-situ water potential. An algorithm is proposed for simply and efficiently reducing a six-wire thermocouple psychrometer voltage output curve to a single value, the delta intercept. The algorithm identifies a plateau region in the psychrometer curve and extrapolates a linear regression back to the initial start of relaxation. When properly conditioned for the measurements being made, the algorithm yields reasonable results even with incomplete or noisy psychrometer curves over a 1 to 60 bar range
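    The proposed reduction, locating a plateau in the psychrometer output curve and extrapolating a linear regression back to the start of relaxation, might be sketched as below. The synthetic curve, the slope tolerance, and the window length are illustrative assumptions, not USGS parameters:

```python
import numpy as np

def delta_intercept(t, v, t0, window=5, tol=0.05):
    """Fit a line over the plateau (where |dv/dt| stays below `tol` for
    `window` consecutive samples) and extrapolate it back to t0, the start
    of relaxation.  `window` and `tol` are illustrative tuning knobs."""
    dvdt = np.gradient(v, t)
    flat = np.abs(dvdt) < tol
    idx = next(i for i in range(len(flat) - window)
               if flat[i:i + window].all())
    m, b = np.polyfit(t[idx:], v[idx:], 1)
    return m * t0 + b

# Synthetic psychrometer-like output: exponential relaxation onto a slow ramp
t = np.linspace(0, 30, 301)
v = 10 * np.exp(-t / 2) + 5 - 0.01 * t
print(round(delta_intercept(t, v, t0=0.0), 2))
```

    On this synthetic curve the extrapolated intercept lands close to the ramp's value at the start of relaxation (5.0), slightly biased by the residual exponential inside the fitted window, which is why the conditioning step the abstract mentions matters.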

  20. Method for collecting thermocouple data via secured shell over a wireless local area network in real time.

    Science.gov (United States)

    Arnold, F; DeMallie, I; Florence, L; Kashinski, D O

    2015-03-01

    This manuscript addresses the design, hardware details, construction, and programming of an apparatus allowing an experimenter to monitor and record high-temperature thermocouple measurements of dynamic systems in real time. The apparatus uses wireless network technology to bridge the gap between a dynamic (moving) sample frame and the static laboratory frame. Our design is a custom solution applied to samples that rotate through large angular displacements where hard-wired and typical slip-ring solutions are not practical because of noise considerations. The apparatus consists of a Raspberry Pi mini-Linux computer, an Arduino micro-controller, an Ocean Controls thermocouple multiplexer shield, and K-type thermocouples.

  1. Method for collecting thermocouple data via secured shell over a wireless local area network in real time

    Science.gov (United States)

    Arnold, F.; DeMallie, I.; Florence, L.; Kashinski, D. O.

    2015-03-01

    This manuscript addresses the design, hardware details, construction, and programming of an apparatus allowing an experimenter to monitor and record high-temperature thermocouple measurements of dynamic systems in real time. The apparatus uses wireless network technology to bridge the gap between a dynamic (moving) sample frame and the static laboratory frame. Our design is a custom solution applied to samples that rotate through large angular displacements where hard-wired and typical slip-ring solutions are not practical because of noise considerations. The apparatus consists of a Raspberry Pi mini-Linux computer, an Arduino micro-controller, an Ocean Controls thermocouple multiplexer shield, and K-type thermocouples.
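    The moving-frame-to-laboratory-frame data path can be sketched with a plain TCP stream on the loopback interface. The real apparatus tunnels its data over SSH on a wireless LAN; the JSON line framing and the simulated readings below are illustrative assumptions, not the authors' protocol:

```python
import json
import socket
import threading

readings = [301.5, 302.1, 303.0]            # simulated thermocouple values (K)

def lab_frame_logger(server, results):
    """Laboratory-frame receiver: accept one connection, log each line."""
    conn, _ = server.accept()
    with conn, conn.makefile() as lines:
        for line in lines:
            results.append(json.loads(line))

server = socket.socket()
server.bind(("127.0.0.1", 0))               # port 0: let the OS pick a port
server.listen(1)
port = server.getsockname()[1]

results = []
logger = threading.Thread(target=lab_frame_logger, args=(server, results))
logger.start()

# Sensor-frame side: stream each reading as one JSON line, then disconnect
with socket.create_connection(("127.0.0.1", port)) as s:
    for r in readings:
        s.sendall((json.dumps({"temp_K": r}) + "\n").encode())

logger.join()
server.close()
print(results)
```

    Newline-delimited JSON keeps the framing trivial on both ends, which is helpful when the receiver must log in real time while the sample rotates.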

  2. Educational Program Evaluation Using CIPP Model

    OpenAIRE

    Warju, Warju

    2016-01-01

    There are many models of evaluation that can be used to evaluate a program. However, the most commonly used is the context, input, process, product (CIPP) evaluation model. The CIPP evaluation model was developed by Stufflebeam and Shinkfield in 1985. Context evaluation is used to give a rational reason for a selected program or curriculum to be implemented. On a wide scale, context can be evaluated on: the program's objectives, policies that support the vision and mission of the institution, the releva...

  3. Preparation and thermal volatility characteristics of In2O3/ITO thin film thermocouple by RF magnetron sputtering

    Directory of Open Access Journals (Sweden)

    Yantao Liu

    2017-11-01

    In2O3/ITO thin film thermocouples for high-temperature measurement (up to 1250 °C) were prepared by the radio frequency magnetron sputtering method with different annealing temperatures from 1100 °C to 1250 °C. Changes in the microstructure and thickness of the thin film thermocouples were investigated as a function of sintering temperature in the range 1100 °C to 1250 °C and annealing time from 2 h to 10 h at 1200 °C, using XRD and SEM techniques. The thermoelectric output was measured, and the results indicated that this thermocouple has a steady and constant voltage output from room temperature to 1247 °C. The thermoelectric voltage and Seebeck coefficient of the In2O3/ITO thermocouples measured at 1247 °C were 166.7 mV and 136.3 μV/°C, respectively.

  4. Improving high-temperature measurements in nuclear reactors with Mo/Nb thermocouples

    International Nuclear Information System (INIS)

    Villard, J. F.; Fourmentel, D.; Legrand, A.; Fourrez, S.

    2008-01-01

    Many irradiation experiments performed in research reactors are used to assess the effects of nuclear radiations on material or fuel sample properties, and are therefore a crucial stage in most qualification and innovation studies regarding nuclear technologies. However, monitoring these experiments requires accurate and reliable instrumentation. Among all measurement systems implemented in irradiation devices, temperature, and more particularly high temperature (above 1000 °C), is a major parameter for future experiments related, for example, to the Generation IV International Forum (GIF) Program or the International Thermonuclear Experimental Reactor (ITER) Project. In this context, the French Commissariat a l'Energie Atomique (CEA) develops and qualifies innovative in-pile instrumentation for its irradiation experiments in current and future research reactors. Logically, a significant part of these research and development programs concerns the improvement of in-pile high-temperature measurements. This article describes the development and qualification of innovative high-temperature thermocouples specifically designed for in-pile applications. This key study has been achieved with technical contributions from the Thermocoax Company. This new kind of thermocouple is based on molybdenum and niobium thermo-elements, which remain nearly unchanged by thermal neutron flux even under harsh nuclear environments, whereas typical high-temperature thermocouples such as Type C or Type S are altered by significant drifts caused by material transmutations under the same conditions. This improvement has a significant impact on the temperature measurement capabilities for future irradiation experiments. Details of the successive stages of this development are given, including the results of prototype qualification tests and the manufacturing process. (authors)

  5. Model Performance Evaluation and Scenario Analysis (MPESA)

    Science.gov (United States)

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).

  6. Fabrication of flexible Ir and Ir-Rh wires and application for thermocouple

    Science.gov (United States)

    Murakami, Rikito; Kamada, Kei; Shoji, Yasuhiro; Yokota, Yuui; Yoshino, Masao; Kurosawa, Shunsuke; Ohashi, Yuji; Yamaji, Akihiro; Yoshikawa, Akira

    2018-04-01

    The fabrication and thermal electromotive force characteristics of Ir/Ir-Rh thermocouples capable of repeated bending deformation are described. Ir and Ir-Rh wires with a diameter of 0.5 mm were fabricated using the alloy-micro-pulling-down method. Scanning electron microscopy and electron backscattering diffraction of the radial cross section of the grown wires were performed to investigate the microstructure and orientation of the crystal grains. At the start of growth, the microstructure was polycrystalline with diameters of several hundred micrometers, while at the 8-m growth point it was found to be monocrystalline. The observed single crystals of pure Ir and Ir-Rh alloy were oriented in the 〈1 1 3〉 and 〈1 1 2〉 directions, respectively, whereas the polycrystalline Ir-Rh samples showed preferential growth in the 〈1 0 0〉 direction. The thermal electromotive force of the fabricated Ir/Ir-Rh thermocouple was measured by the comparison technique and the fixed-point technique, and the thermoelectric power was estimated to be 5.9 μV/°C in the range from 600°C to 1100°C.
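    Estimating a thermoelectric power such as the 5.9 μV/°C reported above amounts to fitting the slope of thermal EMF against temperature. The calibration points below are synthetic numbers constructed to mirror that slope, not the paper's comparison- or fixed-point measurements:

```python
import numpy as np

# Hypothetical EMF-vs-temperature calibration points for a thermocouple pair;
# the thermoelectric power (Seebeck coefficient) is the slope of a linear fit.
T   = np.array([600, 700, 800, 900, 1000, 1100])      # degrees C
emf = np.array([0.0, 0.59, 1.18, 1.77, 2.36, 2.95])   # mV, synthetic data

slope_mv_per_C, _ = np.polyfit(T, emf, 1)
print(round(slope_mv_per_C * 1000, 1))  # 5.9 microvolts per degree C
```

    Over a range this wide the Seebeck coefficient is generally temperature dependent, so a single fitted slope is only an average thermoelectric power for the interval.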

  7. Lag compensation of optical fibers or thermocouples to achieve waveform fidelity in dynamic gas pyrometry

    Science.gov (United States)

    Warshawsky, I.

    1991-01-01

    Fidelity of waveform reproduction requires constant amplitude ratio and constant time lag of a temperature sensor's indication, at all frequencies of interest. However, heat-transfer type sensors usually cannot satisfy these requirements. Equations for the actual indication of a thermocouple and an optical-fiber pyrometer are given explicitly, in terms of sensor and flowing-gas properties. A practical, realistic design of each type of sensor behaves like a first-order system with amplitude-ratio attenuation inversely proportional to frequency when the frequency exceeds the corner frequency. Only at much higher frequencies does the amplitude-ratio attenuation for the optical fiber sensor become inversely proportional to the square root of the frequency. Design options for improving the frequency response are discussed. On-line electrical lag compensation, using a linear amplifier and a passive compensation network, can extend the corner frequency of the thermocouple 100-fold or more; a similar passive network can be used for the optical-fiber sensor. Design details for these networks are presented.
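    The first-order sensor behavior and its lag compensation can be sketched in discrete time. Note the article describes an analog passive network after a linear amplifier; this digital version, and every number in it, is an illustrative assumption:

```python
import numpy as np

# A heat-transfer sensor behaving as a first-order lag with time constant tau:
#     dy/dt = (u - y) / tau
# Compensation inverts the lag, u_est = y + tau * dy/dt, extending the usable
# bandwidth well past the sensor's corner frequency f_c = 1 / (2*pi*tau).
tau, dt = 0.5, 0.001
t = np.arange(0, 5, dt)
u = np.sin(2 * np.pi * 2.0 * t)            # 2 Hz gas-temperature fluctuation

y = np.zeros_like(u)                        # simulated (lagged) sensor output
for i in range(1, len(t)):
    y[i] = y[i - 1] + dt * (u[i - 1] - y[i - 1]) / tau

u_est = y + tau * np.gradient(y, dt)        # discrete-time lag compensation

# After the start-up transient, the compensated signal tracks the input
err_raw = np.abs(u[2000:] - y[2000:]).max()
err_comp = np.abs(u[2000:] - u_est[2000:]).max()
print(err_comp < 0.1 * err_raw)  # True
```

    The same inversion amplifies measurement noise in proportion to tau times the noise bandwidth, which is why practical compensation networks roll off at high frequency.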

  8. Improvement in the technology of thermocouples for the detection of high temperatures with a view to using them in irradiation safety tests in reactor

    International Nuclear Information System (INIS)

    Schley, R.; Liermann, J.; Aujollet, J.M.; Wilkins, S.C.

    1979-01-01

    The safety tests carried out under the CABRI and PHEBUS programmes have made it possible to improve the technology of W/Re thermocouples and their reliability under particularly harsh operating conditions. A partial answer to the problem of W/Re thermocouple drift under neutron flux is provided by the definition of the new Mo 5% Nb/Nb 10% Mo thermocouple: because of the low capture cross section of its thermoelectric elements, the drift of these thermocouples under neutron flux can be expected to be less significant than that found with W/Re thermocouples. Finally, determining the surface temperature of fuel element cladding with the Mo/Zircaloy thermocouple may prove worthwhile provided the temperatures do not exceed 1300 °C and the electrical insulator is aluminium oxide, which up to 1300 °C does not appear to react with the thermoelectric wires. [fr]

  9. EPA Corporate GHG Goal Evaluation Model

    Science.gov (United States)

    The EPA Corporate GHG Goal Evaluation Model provides companies with a transparent and publicly available benchmarking resource to help evaluate and establish new or existing GHG goals that go beyond business as usual for their individual sectors.

  10. Apparatus for spot welding sheathed thermocouples to the inside of small-diameter tubes at precise locations

    International Nuclear Information System (INIS)

    Baucum, W.E.; Dial, R.E.

    1976-01-01

    Equipment and procedures used to spot weld tantalum- or stainless-steel-sheathed thermocouples to the inside diameter of Zircaloy tubing to meet the requirements of the Multirod Burst Test (MRBT) Program at ORNL are described. Spot welding and oxide cleaning tools were fabricated to remove the oxide coating on the Zircaloy tubing at local areas and spot weld four thermocouples separated circumferentially by 90° at any axial distribution desired. It was found necessary to apply a nickel coating to stainless-steel-sheathed thermocouples to obtain acceptable welds. The material and shape of the inner electrode and the resistance between inner and outer electrodes were found to be critical parameters in obtaining acceptable welds

  11. The EMEFS model evaluation. An interim report

    Energy Technology Data Exchange (ETDEWEB)

    Barchet, W.R. [Pacific Northwest Lab., Richland, WA (United States); Dennis, R.L. [Environmental Protection Agency, Research Triangle Park, NC (United States); Seilkop, S.K. [Analytical Sciences, Inc., Durham, NC (United States); Banic, C.M.; Davies, D.; Hoff, R.M.; Macdonald, A.M.; Mickle, R.E.; Padro, J.; Puckett, K. [Atmospheric Environment Service, Downsview, ON (Canada); Byun, D.; McHenry, J.N. [Computer Sciences Corp., Research Triangle Park, NC (United States); Karamchandani, P.; Venkatram, A. [ENSR Consulting and Engineering, Camarillo, CA (United States); Fung, C.; Misra, P.K. [Ontario Ministry of the Environment, Toronto, ON (Canada); Hansen, D.A. [Electric Power Research Inst., Palo Alto, CA (United States); Chang, J.S. [State Univ. of New York, Albany, NY (United States). Atmospheric Sciences Research Center

    1991-12-01

    The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs.
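The difference statistics and correlations used to quantify model performance in protocols like this reduce to a few lines of arithmetic; the sketch below computes mean bias, RMSE, and Pearson correlation between observed and predicted values (the arrays are illustrative, not EMEFS data):

```python
import math

def difference_stats(observed, predicted):
    """Simple model-evaluation statistics: mean bias (predicted minus
    observed), root-mean-square error, and Pearson correlation."""
    n = len(observed)
    bias = sum(p - o for o, p in zip(observed, predicted)) / n
    rmse = math.sqrt(sum((p - o) ** 2 for o, p in zip(observed, predicted)) / n)
    mo = sum(observed) / n
    mp = sum(predicted) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(observed, predicted))
    so = math.sqrt(sum((o - mo) ** 2 for o in observed))
    sp = math.sqrt(sum((p - mp) ** 2 for p in predicted))
    return bias, rmse, cov / (so * sp)

# Illustrative observed/predicted pairs, e.g. deposition amounts:
obs = [1.0, 2.0, 3.0, 4.0]
pred = [1.1, 1.9, 3.2, 4.3]
bias, rmse, corr = difference_stats(obs, pred)
```

A positive bias indicates systematic over-prediction, while RMSE and correlation summarize scatter of the kind shown graphically by the scatter plots and time series mentioned above.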

  12. Preparation and Thermoelectric Characteristics of ITO/PtRh:PtRh Thin Film Thermocouple

    Science.gov (United States)

    Zhao, Xiaohui; Wang, Hongmin; Zhao, Zixiang; Zhang, Wanli; Jiang, Hongchuan

    2017-12-01

    Thin film thermocouples (TFTCs) can provide more precise in situ temperature measurement for aerospace propulsion systems without disturbing the gas flow or the surface temperature distribution of the hot components. An ITO/PtRh:PtRh TFTC with a multilayer structure was deposited on an alumina ceramic substrate by magnetron sputtering. After annealing, the TFTC was statically calibrated over multiple cycles at temperatures up to 1000 °C. The TFTC showed excellent stability and repeatability, with negligible variation of EMF between calibration cycles. It is believed that, owing to the oxygen diffusion barrier formed by oxidation of the top PtRh layer and the Schottky barriers formed at the grain boundaries of the ITO, the variation of the carrier concentration of the ITO film is minimized. Meanwhile, the lifetime of the TFTC is more than 30 h in a harsh environment. This makes the ITO/PtRh:PtRh TFTC a promising candidate for precise surface temperature measurement of the hot components of aeroengines.

  13. Simple method for measuring vibration amplitude of high power airborne ultrasonic transducer: using thermo-couple.

    Science.gov (United States)

    Saffar, Saber; Abdullah, Amir

    2014-03-01

    The vibration amplitude of a transducer's elements is an influential parameter in the performance of high-power airborne ultrasonic transducers, which must be controlled for optimum vibration without material yielding. The vibration amplitude of the elements of the high-power airborne transducer was determined by measuring the temperature of the transducer's elements. The results showed that simple thermocouples can be used both to measure the vibration amplitude of the transducer's elements and as an indicator of power transmission to the air. To verify our approach, the power transmission to the air was also investigated experimentally by another common method. The experimental results displayed good agreement with the presented approach. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Tile Surface Thermocouple Measurement Challenges from the Orbiter Boundary Layer Transition Flight Experiment

    Science.gov (United States)

    Campbell, Charles H.; Berger, Karen; Anderson, Brian

    2012-01-01

    Hypersonic entry flight testing motivated by efforts to characterize boundary layer transition on the Space Shuttle Orbiters has identified challenges in our ability to acquire high quality quantitative surface temperature measurements versus time. Five missions near the end of the Space Shuttle Program implemented a tile surface protuberance as a boundary layer trip, together with tile surface thermocouples to capture temperature measurements during entry. Similar engineering implementations of these measurements on Discovery and Endeavour demonstrated unexpected measurement voltage response during the high heating portion of the entry trajectory. An assessment has been performed to characterize possible causes of the issues experienced during STS-119, STS-128, STS-131, STS-133 and STS-134, as well as similar issues encountered during other orbiter entries.

  15. Operating Temperatures of a Sodium-Cooled Exhaust Valve as Measured by a Thermocouple

    Science.gov (United States)

    Sanders, J. C.; Wilsted, H. D.; Mulcahy, B. A.

    1943-01-01

    A thermocouple was installed in the crown of a sodium-cooled exhaust valve. The valve was then tested in an air-cooled engine cylinder and valve temperatures under various engine operating conditions were determined. A temperature of 1337 F was observed at a fuel-air ratio of 0.064, a brake mean effective pressure of 179 pounds per square inch, and an engine speed of 2000 rpm. Fuel-air ratio was found to have a large influence on valve temperature, but cooling-air pressure and variation in spark advance had little effect. An increase in engine power by change of speed or mean effective pressure increased the valve temperature. It was found that the temperature of the rear spark-plug bushing was not a satisfactory indication of the temperature of the exhaust valve.

  16. Low-noise audio amplifiers and preamplifier for use with intrinsic thermocouples

    International Nuclear Information System (INIS)

    Langner, G.C.; Sachs, R.D.; Stewart, F.L.

    1979-03-01

    Two simple, low-noise audio amplifiers and one low-noise preamplifier for use with intrinsic thermocouples were designed, built, and tested. The amplifiers and the preamplifier have different front-end designs. One amplifier uses ultralow-noise operational amplifiers; the other uses a hybrid component. The preamplifier uses ultralow-noise discrete components. The amplifiers' equivalent noise inputs, at maximum gain, are 4.09 nV and 50 nV; the preamplifier's is 4.05 μV. Their bandwidths are 15 600 Hz, 550 Hz, and 174 kHz, respectively. The amplifiers' equivalent noise inputs were measured from approximately 0 to 100 Hz, whereas the preamplifier's equivalent noise input was measured from approximately 0 to 174 kHz

  17. Investigating Microbial Habitats in Hydrothermal Chimneys using Ti-Thermocouple Arrays: Microbial Diversity

    Science.gov (United States)

    Pagé, A.; Tivey, M. K.; Stakes, D. S.; Bradley, A. M.; Seewald, J. S.; Wheat, C. G.; Reysenbach, A.

    2004-12-01

    In order to examine the changes that occur in microbial community composition as a deep-sea hydrothermal vent chimney develops, we deployed Ti-thermocouple arrays over high temperature vents at two active sites of the Guaymas Basin Southern Trough. Chimney material that precipitated around the arrays was recovered after 4 and 72 days. Chimney material that precipitated prior to deployment of the arrays was also recovered at one of the sites (Busted Shroom). Culture-independent analysis based on the small subunit rRNA sequence (cloning and DGGE) was used to determine the microbial diversity associated with subsamples of each chimney. The original Busted Shroom chimney (BSO) was dominated by members of the Crenarchaeota Marine Group I, a group of cosmopolitan marine Archaea, and by ε-Proteobacteria and γ-Proteobacteria, two divisions of Bacteria that are common at deep-sea vents. The 4-day-old Busted Shroom chimney (BSD1) was dominated by members of the Methanocaldococcaceae, hyperthermophilic methanogens, and the 72-day-old chimney (BSD2) by members of the Methanosarcinaceae, mesophilic and thermophilic methanogens. At the second site, Toadstool, the 72-day-old chimney material that had precipitated around the array (TS) revealed the dominance of sequences from uncultured marine Archaea, the DHVE groups I and II, and from the ε-Proteobacteria. Additionally, sequences belonging to the Methanocaldococcaceae and Desulfurococcaceae were recovered next to thermocouples that were at temperatures of 109 °C (at Busted Shroom) and 116 °C (at Toadstool), respectively. These temperatures are higher than the upper limit for growth of cultured representatives from each family.

  18. Co-C and Pd-C Eutectic Fixed Points for Radiation Thermometry and Thermocouple Thermometry

    Science.gov (United States)

    Wang, L.

    2017-12-01

    Two Co-C and Pd-C eutectic fixed point cells for both radiation thermometry and thermocouple thermometry were constructed at NMC. This paper describes details of the cell design, the materials used, and the fabrication of the cells. The melting curves of the Co-C and Pd-C cells were measured with a reference radiation thermometer in both a single-zone furnace and a three-zone furnace in order to investigate the furnace effect. The transition temperatures in terms of ITS-90 were determined to be 1324.18 °C and 1491.61 °C, with corresponding combined standard uncertainties of 0.44 °C and 0.31 °C for Co-C and Pd-C, respectively, taking into account the differences between the two types of furnace used. The determined ITS-90 temperatures are also compared with those of the INRIM cells obtained using the same reference radiation thermometer and the same furnaces with the same settings during a previous bilateral comparison exercise (Battuello et al. in Int J Thermophys 35:535-546, 2014). The agreements are within the k = 1 uncertainty for the Co-C cell and the k = 2 uncertainty for the Pd-C cell. The shapes of the plateaus of the NMC and INRIM cells are also compared, and furnace effects are analyzed. The melting curves of the Co-C and Pd-C cells realized in the single-zone furnace were also measured by a Pt/Pd thermocouple, and preliminary results are presented.
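The k = 1 / k = 2 agreement check used in comparisons like this reduces to testing whether the difference between two determinations is within k times their combined standard uncertainty (uncertainties added in quadrature). A minimal sketch; the second determination and its uncertainty are illustrative values, only the first Co-C value and its 0.44 °C uncertainty come from the abstract:

```python
import math

def agree_within_k(t1, u1, t2, u2, k):
    """True if two temperature determinations agree within coverage
    factor k, with standard uncertainties combined in quadrature."""
    return abs(t1 - t2) <= k * math.sqrt(u1 ** 2 + u2 ** 2)

# Hypothetical second Co-C determination: 1324.60 °C with u = 0.30 °C.
print(agree_within_k(1324.18, 0.44, 1324.60, 0.30, k=1))  # → True
```

Here the 0.42 °C difference is smaller than the 0.53 °C combined standard uncertainty, so the two values agree at k = 1.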

  19. Evaluating topic models with stability

    CSIR Research Space (South Africa)

    De Waal, A

    2008-11-01

    Full Text Available on unlabelled data, so that a ground truth does not exist and (b) "soft" (probabilistic) document clusters are created by state-of-the-art topic models, which complicates comparisons even when ground truth labels are available. Perplexity has often been used...

  20. Use of a thermocouple-datalogger system to evaluate overstory mortality

    Science.gov (United States)

    Lucy Brudnak; Thomas A. Waldrop; Ross J. Phillips

    2010-01-01

    In the past, it was difficult to accurately measure dynamic fire behavior during prescribed burns. Peak temperature, flaming duration, and total heat output may be directly related to first-order fire effects such as fuel consumption and vegetative mortality; however, little is known about which of these variables is most closely associated with, and therefore the best...

  1. Site descriptive modelling - strategy for integrated evaluation

    International Nuclear Information System (INIS)

    Andersson, Johan

    2003-02-01

    The current document establishes the strategy to be used for achieving sufficient integration between disciplines in producing Site Descriptive Models during the Site Investigation stage. The Site Descriptive Model should be a multidisciplinary interpretation of geology, rock mechanics, thermal properties, hydrogeology, hydrogeochemistry, transport properties and ecosystems, using site investigation data from deep boreholes and from the surface as input. The modelling comprises the following iterative steps: evaluation of primary data, descriptive and quantitative modelling (in 3D), and overall confidence evaluation. Data are first evaluated within each discipline and then the evaluations are checked between the disciplines. Three-dimensional modelling (i.e. estimating the distribution of parameter values in space and its uncertainty) is made in a sequence, where the geometrical framework is taken from the geological model and in turn used by the rock mechanics, thermal and hydrogeological modelling, etc. The three-dimensional description should present the parameters with their spatial variability over a relevant and specified scale, with the uncertainty included in the description. Different alternative descriptions may be required. After the individual discipline modelling and uncertainty assessment, a phase of overall confidence evaluation follows. Relevant members of the different modelling teams assess the suggested uncertainties and evaluate the feedback. These discussions should assess overall confidence by checking that all relevant data are used, checking that information in past model versions is considered, checking that the different kinds of uncertainty are addressed, checking whether suggested alternatives make sense and whether there is potential for additional alternatives, and by discussing, where appropriate, how additional measurements (i.e. more data) would affect confidence. The findings as well as the modelling results are to be documented in a Site Description

  2. Study for on-line system to identify inadvertent control rod drops in PWR reactors using ex-core detector and thermocouple measures

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Thiago J.; Medeiros, Jose A.C.C.; Goncalves, Alessandro C., E-mail: tsouza@nuclear.ufrj.br, E-mail: canedo@lmp.ufrj.br, E-mail: alessandro@nuclear.ufrj.br [Coordenacao dos Programas de Pos-Graduacao em Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear

    2015-07-01

    An accidental control rod drop event in a PWR reactor leads to an unsafe operating condition. It is important to quickly identify the dropped rod in order to minimize undesirable effects in such a scenario. In this event, there is a distortion in the power and temperature distributions in the reactor core. The goal of this study is to develop an on-line model to identify an inadvertently dropped control rod in a PWR reactor. The proposed model is based on physical correlations and pattern recognition applied to ex-core detector responses and thermocouple measures. The results of the study demonstrated the feasibility of an on-line system, contributing to safer operating conditions and preventing undesirable effects, such as a reactor shutdown. (author)

  3. Rock mechanics models evaluation report: Draft report

    International Nuclear Information System (INIS)

    1985-10-01

    This report documents the evaluation of the thermal and thermomechanical models and codes for repository subsurface design and for design constraint analysis. The evaluation was based on a survey of the thermal and thermomechanical codes and models that are applicable to subsurface design, followed by a Kepner-Tregoe (KT) structured decision analysis of the codes and models. The end result of the KT analysis is a balanced, documented recommendation of the codes and models which are best suited to conceptual subsurface design for the salt repository. The various laws for modeling the creep of rock salt are also reviewed in this report. 37 refs., 1 fig., 7 tabs

  4. Seebeck Changes Due to Residual Cold-Work and Reversible Effects in Type K Bare-Wire Thermocouples

    Science.gov (United States)

    Webster, E. S.

    2017-09-01

    Type K thermocouples are the most commonly used thermocouple for industrial measurements because of their low cost, wide temperature range, and durability. As with all base-metal thermocouples, Type K is made to match a mathematical temperature-to-emf relationship and not a prescribed alloy formulation. Because different manufacturers use varying alloy formulations and manufacturing techniques, different Type K thermocouples exhibit a range of drift and hysteresis characteristics, largely due to ordering effects in the positive (K+) thermoelement. In this study, these effects are assessed in detail for temperatures below 700°C in the Type K wires from nine manufacturers. A linear gradient furnace and a high-resolution homogeneity scanner combined with the judicious use of annealing processes allow measurements that separately identify the effects of cold-work, ordering, and oxidation to be made. The results show most K+ alloys develop significant errors, but the magnitudes of the contributions of each process vary substantially between the different K+ wires. In practical applications, the measurement uncertainties achievable with Type K therefore depend not only on the wire formulation but also on the temperature, period of exposure, and, most importantly, the thermal treatments prior to use.
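The "mathematical temperature-to-emf relationship" that defines Type K is a reference function evaluated as a polynomial in temperature. The sketch below shows the evaluation pattern only; the coefficients are hypothetical (a flat sensitivity of about 41 µV/°C, typical of Type K, plus a small quadratic term), not the actual ITS-90 reference-function coefficients:

```python
def emf_from_temperature(t_celsius, coeffs):
    """Evaluate a polynomial temperature-to-emf reference relationship
    E(t) = sum(c_i * t**i), in mV. Thermocouple letter types are defined
    by such reference functions rather than by a prescribed alloy; the
    coefficients passed in below are hypothetical, not ITS-90 values."""
    return sum(c * t_celsius ** i for i, c in enumerate(coeffs))

# Hypothetical near-linear characteristic (~41 uV/°C) with a small
# quadratic term for illustration; units are mV and °C:
coeffs = [0.0, 41.0e-3, 1.0e-7]
emf_100 = emf_from_temperature(100.0, coeffs)  # ~4.1 mV
```

Drift and hysteresis of the kind measured in this study appear as a departure of the wire's actual emf from such a reference curve at a given temperature.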

  5. Individual model evaluation and probabilistic weighting of models

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-01-01

    This note stresses the importance of trying to assess the accuracy of each model individually. Putting a Bayesian probability distribution on a population of models faces conceptual and practical complications, and apparently can come only after the work of evaluating the individual models. Moreover, the primary issue is "How good is this model?" Therefore, the individual evaluations are first in both chronology and importance. They are not easy, but some ideas are given here on how to perform them

  6. [Evaluation model for municipal health planning management].

    Science.gov (United States)

    Berretta, Isabel Quint; Lacerda, Josimari Telino de; Calvo, Maria Cristina Marino

    2011-11-01

    This article presents an evaluation model for municipal health planning management. The basis was a methodological study using the health planning theoretical framework to construct the evaluation matrix, in addition to an understanding of the organization and functioning designed by the Planning System of the Unified National Health System (PlanejaSUS) and definition of responsibilities for the municipal level under the Health Management Pact. The indicators and measures were validated using the consensus technique with specialists in planning and evaluation. The applicability was tested in 271 municipalities (counties) in the State of Santa Catarina, Brazil, based on population size. The proposed model features two evaluative dimensions which reflect the municipal health administrator's commitment to planning: the guarantee of resources and the internal and external relations needed for developing the activities. The data were analyzed using indicators, sub-dimensions, and dimensions. The study concludes that the model is feasible and appropriate for evaluating municipal performance in health planning management.

  7. Evaluation of green house gas emissions models.

    Science.gov (United States)

    2014-11-01

    The objective of the project is to evaluate the GHG emissions models used by transportation agencies and industry leaders. Factors in the vehicle operating environment that may affect modal emissions, such as external conditions, vehicle fleet c...

  8. COMPUTER MODEL FOR ORGANIC FERTILIZER EVALUATION

    OpenAIRE

    Lončarić, Zdenko; Vukobratović, Marija; Ragaly, Peter; Filep, Tibor; Popović, Brigita; Karalić, Krunoslav; Vukobratović, Želimir

    2009-01-01

    Evaluation of manures, composts and growing media quality should include enough properties to enable optimal use from both productivity and environmental points of view. The aim of this paper is to describe the basic structure of an organic fertilizer (and growing media) evaluation model, to present the model by an example comparing different manures, and to give an example of using a plant growth experiment to calculate the impact of the pH and EC of growing media on lettuce growth. The basic structure of ...

  9. The Air Quality Model Evaluation International Initiative ...

    Science.gov (United States)

    This presentation provides an overview of the Air Quality Model Evaluation International Initiative (AQMEII). It contains a synopsis of the three phases of AQMEII, including objectives, logistics, and timelines. It also provides a number of examples of analyses conducted through AQMEII with a particular focus on past and future analyses of deposition. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  10. A MICROCOMPUTER MODEL FOR IRRIGATION SYSTEM EVALUATION

    OpenAIRE

    Williams, Jeffery R.; Buller, Orlan H.; Dvorak, Gary J.; Manges, Harry L.

    1988-01-01

    ICEASE (Irrigation Cost Estimator and System Evaluator) is a microcomputer model designed and developed to meet the need for conducting economic evaluation of adjustments to irrigation systems and management techniques to improve the use of irrigated water. ICEASE can calculate the annual operating costs for irrigation systems and has five options that can be used to economically evaluate improvements in the pumping plant or the way the irrigation system is used for crop production.

  11. Evaluation of constitutive models for crushed salt

    Energy Technology Data Exchange (ETDEWEB)

    Callahan, G.D.; Loken, M.C. [RE/SPEC, Inc., Rapid City, SD (United States); Hurtado, L.D.; Hansen, F.D.

    1996-05-01

    Three constitutive models are recommended as candidates for describing the deformation of crushed salt. These models are generalized to three-dimensional states of stress to include the effects of mean and deviatoric stress and modified to include effects of temperature, grain size, and moisture content. A database including hydrostatic consolidation and shear consolidation tests conducted on Waste Isolation Pilot Plant (WIPP) and southeastern New Mexico salt is used to determine material parameters for the models. To evaluate the capability of the models, parameter values obtained from fitting the complete database are used to predict the individual tests. Finite element calculations of a WIPP shaft with emplaced crushed salt demonstrate the model predictions.

  12. Modeling for Green Supply Chain Evaluation

    Directory of Open Access Journals (Sweden)

    Elham Falatoonitoosi

    2013-01-01

    Green supply chain management (GSCM) has become a practical approach to developing environmental performance. Under strict regulations and stakeholder pressures, enterprises need to enhance and improve GSCM practices, which are influenced by both traditional and green factors. This study developed a causal evaluation model to guide the selection of qualified suppliers by prioritizing various criteria and mapping causal relationships to find the criteria most effective in improving the green supply chain. The aim of the case study was to model and examine the influential and important main GSCM practices, namely, green logistics, organizational performance, green organizational activities, environmental protection, and green supplier evaluation. In the case study, the decision-making trial and evaluation laboratory (DEMATEL) technique is applied to test the developed model. The result of the case study shows that only the "green supplier evaluation" and "green organizational activities" criteria of the model are in the cause group and the other criteria are in the effect group.
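The cause/effect split used in the decision-making trial and evaluation laboratory technique normalizes a direct-influence matrix N, forms the total-relation matrix T = N(I - N)^-1, and classifies each criterion by the sign of its row sum minus its column sum. A minimal sketch with an illustrative 3x3 matrix, not the study's data:

```python
import numpy as np

def dematel_groups(direct):
    """Minimal DEMATEL sketch: normalize the direct-influence matrix,
    form the total-relation matrix T = N (I - N)^-1, and classify each
    criterion as 'cause' (row sum > column sum) or 'effect' otherwise."""
    d = np.asarray(direct, dtype=float)
    s = max(d.sum(axis=1).max(), d.sum(axis=0).max())  # normalization factor
    n = d / s
    t = n @ np.linalg.inv(np.eye(d.shape[0]) - n)      # = N + N^2 + N^3 + ...
    r, c = t.sum(axis=1), t.sum(axis=0)                # influence given/received
    return ["cause" if ri > ci else "effect" for ri, ci in zip(r, c)]

# Illustrative direct-influence scores (0-4 scale) for three criteria:
groups = dematel_groups([[0, 3, 2],
                         [1, 0, 2],
                         [1, 1, 0]])
```

Criteria in the cause group exert more influence than they receive, which is why "green supplier evaluation" and "green organizational activities" are singled out above as the levers for improvement.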

  13. Evaluation of models in performance assessment

    International Nuclear Information System (INIS)

    Dormuth, K.W.

    1993-01-01

    The reliability of models used for performance assessment for high-level waste repositories is a key factor in making decisions regarding the management of high-level waste. Model reliability may be viewed as a measure of the confidence that regulators and others have in the use of these models to provide information for decision making. The degree of reliability required for the models will increase as implementation of disposal proceeds and decisions become increasingly important to safety. Evaluation of the models by using observations of real systems provides information that assists the assessment analysts and reviewers in establishing confidence in the conclusions reached in the assessment. A continuing process of model calibration, evaluation, and refinement should lead to increasing reliability of models as implementation proceeds. However, uncertainty in the model predictions cannot be eliminated, so decisions will always be made under some uncertainty. Examples from the Canadian program illustrate the process of model evaluation using observations of real systems and its relationship to performance assessment. 21 refs., 2 figs

  14. Comparison of two surface temperature measurement using thermocouples and infrared camera

    Directory of Open Access Journals (Sweden)

    Michalski Dariusz

    2017-01-01

    This paper compares two methods applied to measure surface temperatures at an experimental setup designed to analyse flow boiling heat transfer. The temperature measurements were performed in two parallel rectangular minichannels, both 1.7 mm deep, 16 mm wide and 180 mm long. The heating element for the fluid flowing in each minichannel was a thin foil made of Haynes-230. The two measurement methods employed to determine the surface temperature of the foil were the contact method, which involved mounting thermocouples at several points in one minichannel, and the contactless method used to study the other minichannel, where the results were provided by an infrared camera. Calculations were necessary to compare the temperature results. Two sets of measurement data, obtained for different values of the heat flux, were analysed using basic statistical methods, taking the method error and the method accuracy into account. The comparative analysis showed that although the values and distributions of the surface temperatures obtained with the two methods were similar, both methods had certain limitations.
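A point-by-point comparison of contact (thermocouple) and contactless (infrared) readings of the kind described can be sketched in a few lines; the temperature values below are illustrative, not data from the minichannel setup:

```python
def compare_methods(t_contact, t_ir):
    """Compare two surface-temperature series point by point: the mean
    difference (systematic offset between methods) and the maximum
    absolute deviation between contact and infrared readings."""
    diffs = [a - b for a, b in zip(t_contact, t_ir)]
    mean_diff = sum(diffs) / len(diffs)
    max_abs = max(abs(d) for d in diffs)
    return mean_diff, max_abs

# Hypothetical readings in °C at three foil locations:
mean_diff, max_abs = compare_methods([101.2, 103.5, 106.1],
                                     [100.8, 103.9, 105.6])
```

A non-zero mean difference points to a systematic offset between the methods (e.g. emissivity assumptions for the camera or contact resistance for the thermocouples), while the maximum deviation bounds the local disagreement.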

  15. Thermocouple and infrared sensor-based measurement of temperature distribution in metal cutting.

    Science.gov (United States)

    Kus, Abdil; Isik, Yahya; Cakir, M Cemal; Coşkun, Salih; Özdemir, Kadir

    2015-01-12

    In metal cutting, the magnitude of the temperature at the tool-chip interface is a function of the cutting parameters. This temperature directly affects production; therefore, increased research on the role of cutting temperatures can lead to improved machining operations. In this study, tool temperature was estimated by simultaneous temperature measurement employing both a K-type thermocouple and an infrared radiation (IR) pyrometer to measure the tool-chip interface temperature. Due to the complexity of the machining processes, the integration of different measuring techniques was necessary in order to obtain consistent temperature data. The thermal analysis results were compared via the ANSYS finite element method. Experiments were carried out in dry machining using workpiece material of AISI 4140 alloy steel that was heat treated by an induction process to a hardness of 50 HRC. A PVD TiAlN-TiN-coated WNVG 080404-IC907 carbide insert was used during the turning process. The results showed that with increasing cutting speed, feed rate and depth of cut, the tool temperature increased; the cutting speed was found to be the most effective parameter in assessing the temperature rise. The heat distribution of the cutting tool, tool-chip interface and workpiece provided effective and useful data for the optimization of selected cutting parameters during orthogonal machining.

  16. Thermocouple and Infrared Sensor-Based Measurement of Temperature Distribution in Metal Cutting

    Directory of Open Access Journals (Sweden)

    Abdil Kus

    2015-01-01

    In metal cutting, the magnitude of the temperature at the tool-chip interface is a function of the cutting parameters. This temperature directly affects production; therefore, increased research on the role of cutting temperatures can lead to improved machining operations. In this study, tool temperature was estimated by simultaneous temperature measurement employing both a K-type thermocouple and an infrared radiation (IR) pyrometer to measure the tool-chip interface temperature. Due to the complexity of the machining processes, the integration of different measuring techniques was necessary in order to obtain consistent temperature data. The thermal analysis results were compared via the ANSYS finite element method. Experiments were carried out in dry machining using workpiece material of AISI 4140 alloy steel that was heat treated by an induction process to a hardness of 50 HRC. A PVD TiAlN-TiN-coated WNVG 080404-IC907 carbide insert was used during the turning process. The results showed that with increasing cutting speed, feed rate and depth of cut, the tool temperature increased; the cutting speed was found to be the most effective parameter in assessing the temperature rise. The heat distribution of the cutting tool, tool-chip interface and workpiece provided effective and useful data for the optimization of selected cutting parameters during orthogonal machining.

  17. Thermocouple and Infrared Sensor-Based Measurement of Temperature Distribution in Metal Cutting

    Science.gov (United States)

    Kus, Abdil; Isik, Yahya; Cakir, M. Cemal; Coşkun, Salih; Özdemir, Kadir

    2015-01-01

    In metal cutting, the magnitude of the temperature at the tool-chip interface is a function of the cutting parameters. This temperature directly affects production; therefore, increased research on the role of cutting temperatures can lead to improved machining operations. In this study, tool temperature was estimated by simultaneous temperature measurement employing both a K-type thermocouple and an infrared radiation (IR) pyrometer to measure the tool-chip interface temperature. Due to the complexity of the machining processes, the integration of different measuring techniques was necessary in order to obtain consistent temperature data. The thermal analysis results were compared via the ANSYS finite element method. Experiments were carried out in dry machining using workpiece material of AISI 4140 alloy steel that was heat treated by an induction process to a hardness of 50 HRC. A PVD TiAlN-TiN-coated WNVG 080404-IC907 carbide insert was used during the turning process. The results showed that with increasing cutting speed, feed rate and depth of cut, the tool temperature increased; the cutting speed was found to be the most effective parameter in assessing the temperature rise. The heat distribution of the cutting tool, tool-chip interface and workpiece provided effective and useful data for the optimization of selected cutting parameters during orthogonal machining. PMID:25587976

  18. Some studies on the behavior of W-RE thermocouple materials at high temperatures

    Science.gov (United States)

    Burns, G. W.; Hurst, W. S.

    1972-01-01

    Bare 0.25 mm diameter W-Re alloy thermoelements (W, W-3% Re, W-5% Re and W-25% Re) and BeO-insulated W-3% Re and W-25% Re thermoelements were examined for metallurgical, chemical and thermal emf changes after testing for periods up to 1000 hours at temperatures principally in the range 2000 to 2400 K. Environments for the tests consisted of high purity argon, hydrogen, helium or nitrogen gases. Commercially obtained bare-wire thermoelements typically exhibited a shift in their emf-temperature relationship upon initial exposure. The shift was completed by thermally aging the W-3% Re thermoelement for 1 hour and the W-25% Re thermoelement for 2 minutes at 2400 K in argon or hydrogen. Aged thermoelements experienced no appreciable drift with subsequent exposure at 2400 K in the gaseous environments. The chemically doped W-3% Re thermoelement retained a small-grained structure for exposures in excess of 50 hours at 2400 K. BeO-insulated thermoelement assemblies showed varied behavior that depended upon the method of exposure. However, when the assemblies were heated in a furnace, no serious material incompatibility problems were found if the materials were given prior thermal treatments. Thermocouples, assembled from aged W-3% Re and W-25% Re thermoelements and degassed sintered BeO insulators, exhibited a drift of only 2 to 3 K during exposure in argon at 2070 K for 1029 hours.

  19. Multi-criteria evaluation of hydrological models

    Science.gov (United States)

    Rakovec, Oldrich; Clark, Martyn; Weerts, Albrecht; Hill, Mary; Teuling, Ryan; Uijlenhoet, Remko

    2013-04-01

    Over the last few years, there has been a tendency in the hydrological community to move from simple conceptual models towards more complex, physically/process-based hydrological models, because conceptual models often fail to simulate the dynamics of the observations. However, there is little agreement on how much complexity needs to be considered within the complex process-based models. One way to proceed is to improve understanding of what is important and unimportant in the models considered. The aim of this ongoing study is to evaluate structural model adequacy using alternative conceptual and process-based models of hydrological systems, with an emphasis on understanding how model complexity relates to observed hydrological processes. Some of the models require considerable execution time, and computationally frugal sensitivity analysis, model calibration and uncertainty quantification methods are well-suited to providing important insights for models with lengthy execution times. The current experiment evaluates two versions of the Framework for Understanding Structural Errors (FUSE), which both enable running model inter-comparison experiments. One supports computationally efficient conceptual models, and the second supports more process-based models that tend to have longer execution times. The conceptual FUSE combines components of 4 existing conceptual hydrological models. The process-based framework consists of different forms of the Richards equation, numerical solutions, groundwater parameterizations and hydraulic conductivity distributions. The hydrological analysis of the model processes has evolved from focusing only on simulated runoff (the final model output), to also including other criteria such as soil moisture and groundwater levels. Parameter importance and associated structural importance are evaluated using different types of sensitivity analysis techniques, making use of both robust global methods (e.g. Sobol') as well as several

  20. COST EVALUATION: STRUCTURING OF A MODEL

    Directory of Open Access Journals (Sweden)

    Altair Borgert

    2010-07-01

    Full Text Available This study’s purpose was to build a cost evaluation model with a view to providing managers and decision makers with information to support the decision process. From a strategic positioning standpoint, the weighing of variables involved in a cost system is key to corporate success. To this end, overall consideration was given to contemporary cost approaches – the Theory of Constraints, the Balanced Scorecard and Strategic Cost Management – and cost evaluation was analysed. It is understood that this is a relevant factor and that it ought to be taken into account when making corporate decisions. Furthermore, considering that the MCDA methodology is recommended for the construction of cost evaluation models, some of its aspects were emphasised. Finally, the construction of the model itself complements this study. At this stage, cost variables for the three approaches were compiled. Thus, a repository of several variables was created, whose use and combination is subject to the interests and needs of those responsible for its structuring within corporations. In so proceeding, the number of variables to weigh follows the complexity of the issue and of the required solution. After meetings were held with the study groups, the model was built, revised and reconstructed until consensus was reached. Thereafter, the conclusion was that a cost evaluation model, when built according to the characteristics and needs of each organization, might become the groundwork for ensuring accounting becomes increasingly useful at companies. Key-words: Cost evaluation. Cost measurement. Strategy.

  1. Saphire models and software for ASP evaluations

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The Idaho National Engineering Laboratory (INEL) over the past three years has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response, with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented

  2. Modeling Energy and Development : An Evaluation of Models and Concepts

    NARCIS (Netherlands)

    Ruijven, Bas van; Urban, Frauke; Benders, René M.J.; Moll, Henri C.; Sluijs, Jeroen P. van der; Vries, Bert de; Vuuren, Detlef P. van

    2008-01-01

    Most global energy models are developed by institutes from developed countries, focusing primarily on issues that are important in industrialized countries. Evaluation of the results for Asia of the IPCC/SRES models shows that broad concepts of energy and development, the energy ladder and the

  3. Nuclear Power Plant Thermocouple Sensor-Fault Detection and Classification Using Deep Learning and Generalized Likelihood Ratio Test

    Science.gov (United States)

    Mandal, Shyamapada; Santhi, B.; Sridhar, S.; Vinolia, K.; Swaminathan, P.

    2017-06-01

    In this paper, an online fault detection and classification method is proposed for thermocouples used in nuclear power plants. In the proposed method, fault data are detected by the classification method, which separates the fault data from the normal data. A deep belief network (DBN), a deep learning technique, is applied to classify the fault data. The DBN has a multilayer feature extraction scheme, which is highly sensitive to small variations in the data. Since the classification method alone cannot identify which sensor is faulty, a technique is proposed to isolate the faulty sensor from the fault data. Finally, a composite statistical hypothesis test, the generalized likelihood ratio test, is applied to compute the fault pattern of the faulty sensor signal based on the magnitude of the fault. The performance of the proposed method is validated with field data obtained from thermocouple sensors of the fast breeder test reactor.
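The final step described above, estimating the fault pattern from the magnitude of the fault, can be illustrated with a generic generalized likelihood ratio test (GLRT) for a mean shift in Gaussian sensor noise. This is a sketch of the technique only, not the paper's implementation: the window length, threshold, noise level, and the `detect_fault` interface are all illustrative assumptions.

```python
def glrt_mean_shift(window, sigma):
    """GLR statistic for a mean shift of unknown magnitude in zero-mean
    Gaussian noise; it reduces to n * mean^2 / (2 * sigma^2)."""
    n = len(window)
    mean = sum(window) / n
    stat = n * mean * mean / (2.0 * sigma * sigma)
    return stat, mean  # mean is the ML estimate of the fault magnitude

def detect_fault(signal, nominal, sigma, win=20, threshold=10.0):
    """Slide a window over the residuals (signal minus nominal value) and
    flag sample indices where the GLR statistic crosses the threshold."""
    residuals = [s - nominal for s in signal]
    alarms = []
    for k in range(win, len(residuals) + 1):
        stat, mag = glrt_mean_shift(residuals[k - win:k], sigma)
        if stat > threshold:
            alarms.append((k - 1, mag))
    return alarms
```

On a simulated reading that jumps by +5 units halfway through, alarms appear only after the fault onset, and the estimated magnitude converges to the true bias once the window lies entirely inside the faulty region.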

  4. [Evaluation of the Dresden Tympanoplasty Model (DTM)].

    Science.gov (United States)

    Beleites, T; Neudert, M; Lasurashvili, N; Kemper, M; Offergeld, C; Hofmann, G; Zahnert, T

    2011-11-01

    The training of microsurgical motor skills is essential for surgical education if the interests of the patient are to be safeguarded. In otosurgery, the complex anatomy of the temporal bone and its variations necessitate special training before performing surgery on a patient. We therefore developed and evaluated a simplified middle ear model for acquiring first microsurgical skills in tympanoplasty. The simplified tympanoplasty model consists of the outer ear canal and a tympanic cavity. A stapes model is placed in projection of the upper posterior tympanic membrane quadrant at the medial wall of the simulated tympanic cavity. To imitate the annular ligament flexibility, the stapes is fixed on a soft plastic pad. 41 subjects evaluated the model's anatomical analogy, the comparability to the real surgical situation and the general model properties using a special questionnaire. The tympanoplasty model was evaluated very positively by all participants. It is a reasonably priced model and a useful tool in microsurgical skills training. Thereby, it closes the gap between theoretical training and real operation conditions. © Georg Thieme Verlag KG Stuttgart · New York.

  5. CRITICAL ANALYSIS OF EVALUATION MODEL LOMCE

    Directory of Open Access Journals (Sweden)

    José Luis Bernal Agudo

    2015-06-01

    Full Text Available The evaluation model projected by the LOMCE is rooted in neoliberal beliefs, reflecting a specific way of understanding the world. What matters is not the process but the results, with evaluation at the center of the teaching-learning processes. It reflects poor planning, since the theory that justifies the model is not developed into coherent proposals: there is excessive concern for excellence, while diversity is left out. A comprehensive way of understanding education should be recovered.

  6. Description of an identification method of thermocouple time constant based on application of recursive numerical filtering to temperature fluctuation

    International Nuclear Information System (INIS)

    Bernardin, B.; Le Guillou, G.; Parcy, JP.

    1981-04-01

    The usual spectral methods for identifying thermocouple time constants from temperature fluctuation analysis require equipment too sophisticated for on-line application. It is shown that numerical filtering is optimal for this application: the equipment is simpler than for spectral methods, and fewer signal samples are needed for the same accuracy. The method is described, and a parametric study was performed using a temperature noise simulator [fr
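The abstract does not give the algorithm's details, but the underlying idea, identifying a first-order thermocouple time constant from temperature fluctuations with a recursive numerical filter, can be sketched as follows. The first-order lag model, the sampling period, and the least-squares formulation below are illustrative assumptions, not the method of the report.

```python
import math

def first_order_lag(x, a, y0=0.0):
    """Discrete first-order (exponential) filter: the thermocouple
    response y to a fluid-temperature fluctuation x."""
    y, out = y0, []
    for xi in x:
        y = a * y + (1.0 - a) * xi
        out.append(y)
    return out

def estimate_time_constant(x, y, dt):
    """Least-squares fit of a in y[k] = a*y[k-1] + (1-a)*x[k],
    then tau = -dt / ln(a)."""
    num = den = 0.0
    for k in range(1, len(y)):
        u = y[k - 1] - x[k]  # regressor
        v = y[k] - x[k]      # target: v = a * u under the lag model
        num += u * v
        den += u * u
    a = num / den
    return -dt / math.log(a)
```

Because v = a*u holds exactly under the lag model, the fit recovers the time constant from noise-free data to machine precision; with measurement noise a regularized or instrumental-variable fit would be preferable.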

  7. Development of KAERI LBLOCA realistic evaluation model

    International Nuclear Information System (INIS)

    Lee, W.J.; Lee, Y.J.; Chung, B.D.; Lee, S.Y.

    1994-01-01

    A realistic evaluation model (REM) for LBLOCA licensing calculations is developed and proposed for application to pressurized light water reactors. The developmental aim of the KAERI-REM is to provide a systematic methodology that is simple in structure, easy to use, and built upon sound logical reasoning, for improving the code capability to realistically describe the LBLOCA phenomena and for evaluating the associated uncertainties. The method strives to be faithful to the intention of being best-estimate, that is, it aims to evaluate the best-estimate values and the associated uncertainties while complying with the requirements in the ECCS regulations. (author)

  8. Study on team evaluation. Team process model for team evaluation

    International Nuclear Information System (INIS)

    Sasou Kunihide; Ebisu, Mitsuhiro; Hirose, Ayako

    2004-01-01

    Several studies have been done to evaluate or improve team performance in the nuclear and aviation industries; crew resource management is a typical example. In addition, team evaluation has recently gathered interest for other teams of lawyers, medical staff, accountants, psychiatrists, executives, etc. However, most evaluation methods focus on the results of team behavior that can be observed through training or actual business situations. What is expected of a team is not only resolving problems but also training younger members destined to lead the next generation. Therefore, the authors set the final goal of this study as establishing a series of methods to evaluate and improve teams comprehensively in such aspects as decision making, motivation, staffing, etc. As the first step, this study develops a team process model describing viewpoints for the evaluation. A team process is defined as a kind of force that activates or inactivates the competencies of individual members, which are the components of the team's competency. To identify the team processes, the authors discussed the merits of team behavior with experienced training instructors and shift supervisors of nuclear/thermal power plants. The discussion identified four team merits and many components that realize those merits. Classifying those components into eight groups of team processes, 'Orientation', 'Decision Making', 'Power and Responsibility', 'Workload Management', 'Professional Trust', 'Motivation', 'Training' and 'Staffing', the authors propose a Team Process Model with two to four sub-processes in each team process. In the future, the authors will develop methods to evaluate some of the team processes for nuclear/thermal power plant operation teams. (author)

  9. Econometric Evaluation of Asset Pricing Models

    OpenAIRE

    Lars Peter Hansen; John Heaton; Erzo Luttmer

    1993-01-01

    In this article we provide econometric tools for the evaluation of intertemporal asset pricing models using specification-error and volatility bounds. We formulate analog estimators of these bounds, give conditions for consistency, and derive the limiting distribution of these estimators. The analysis incorporates market frictions such as short-sale constraints and proportional transactions costs. Among several applications we show how to use the methods to assess specific asset pricing model...

  10. New fixed-point mini-cell to investigate thermocouple drift in a high-temperature environment under neutron irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Laurie, M.; Vlahovic, L.; Rondinella, V.V. [European Commission, Joint Research Centre, Institute for Transuranium Elements, P.O. Box 2340, D-76125 Karlsruhe, (Germany); Sadli, M.; Failleau, G. [Laboratoire Commun de Metrologie, LNE-Cnam, Saint-Denis, (France); Fuetterer, M.; Lapetite, J.M. [European Commission, Joint Research Centre, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten, (Netherlands); Fourrez, S. [Thermocoax, 8 rue du pre neuf, F-61100 St Georges des Groseillers, (France)

    2015-07-01

    Temperature measurements in the nuclear field require a high degree of reliability and accuracy. Despite their sheathed form, thermocouples subjected to nuclear radiation undergo changes due to radiation damage and transmutation that lead to significant EMF drift during long-term fuel irradiation experiments. For a High Temperature Reactor fuel irradiation taking place in the High Flux Reactor Petten, a dedicated fixed-point cell was jointly developed by LNE-Cnam and JRC-IET. The cell, to be housed in the irradiation rig, was tailor-made to quantify the thermocouple drift during the irradiation (about two years' duration) and to withstand high temperature (in the range 950 deg. C - 1100 deg. C) in the presence of contaminated helium in a graphite environment. Considering the different levels of temperature achieved in the irradiation facility and the large palette of thermocouple types aimed at surveying the HTR fuel pebble during the qualification test, both copper (1084.62 deg. C) and gold (1064.18 deg. C) fixed-point materials were considered. The aim of this paper is first to describe the fixed-point mini-cell designed to be embedded in the reactor rig, and then to discuss the preliminary results achieved during out-of-pile tests, as well as robustness tests representative of reactor scram scenarios. (authors)

  11. Evaluation of a Mysis bioenergetics model

    Science.gov (United States)

    Chipps, S.R.; Bennett, D.H.

    2002-01-01

    Direct approaches for estimating the feeding rate of the opossum shrimp Mysis relicta can be hampered by variable gut residence time (evacuation rate models) and non-linear functional responses (clearance rate models). Bioenergetics modeling provides an alternative method, but the reliability of this approach needs to be evaluated using independent measures of growth and food consumption. In this study, we measured growth and food consumption for M. relicta and compared experimental results with those predicted from a Mysis bioenergetics model. For Mysis reared at 10 °C, model predictions were not significantly different from observed values. Moreover, decomposition of mean square error indicated that 70% of the variation between model predictions and observed values was attributable to random error. On average, model predictions were within 12% of observed values. A sensitivity analysis revealed that Mysis respiration and prey energy density were the most sensitive parameters affecting model output. By accounting for uncertainty (95% CLs) in Mysis respiration, we observed a significant improvement in the accuracy of model output (within 5% of observed values), illustrating the importance of sensitive input parameters for model performance. These findings help corroborate the Mysis bioenergetics model and demonstrate the usefulness of this approach for estimating Mysis feeding rate.
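The "decomposition of mean square error" used above to attribute prediction error to random versus systematic sources can be illustrated with a standard Theil-style partition into bias, slope, and random fractions that sum to one. This is a generic sketch of the technique, not the authors' code, and the sample data are invented.

```python
import math

def mse_decomposition(pred, obs):
    """Partition mean square error into bias, slope (systematic), and
    random components; the three returned fractions sum to 1."""
    n = len(pred)
    mp = sum(pred) / n
    mo = sum(obs) / n
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred) / n)
    so = math.sqrt(sum((o - mo) ** 2 for o in obs) / n)
    r = sum((p - mp) * (o - mo) for p, o in zip(pred, obs)) / (n * sp * so)
    mse = sum((p - o) ** 2 for p, o in zip(pred, obs)) / n
    bias = (mp - mo) ** 2            # mean offset between model and data
    slope = (sp - r * so) ** 2       # mismatch in variability
    rand = (1.0 - r * r) * so * so   # unsystematic (random) error
    return {"mse": mse, "bias": bias / mse,
            "slope": slope / mse, "random": rand / mse}
```

A high "random" fraction, as reported in the abstract (70%), indicates that most of the disagreement is unsystematic rather than a correctable model bias.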

  12. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against an independent dataset to that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. (3) Model structural inadequacies, whereby model structure may inadequately represent

  13. Model evaluation methodology applicable to environmental assessment models

    International Nuclear Information System (INIS)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes
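Two of the techniques named above, the analysis of multiplicative chain models with independent lognormal inputs and Latin hypercube sampling, can be combined in a small sketch. The stratified-sampling scheme is the standard one; the example medians and geometric standard deviations are illustrative assumptions, not values from the report.

```python
import math
import random
from statistics import NormalDist

def latin_hypercube(n, dims, rng):
    """n stratified samples in [0,1]^dims: each axis is split into n
    equal strata, each stratum is sampled once, then columns are shuffled."""
    cols = []
    for _ in range(dims):
        col = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(col)
        cols.append(col)
    return list(zip(*cols))

def chain_output(u, medians, gsds):
    """Multiplicative chain of independent lognormal factors, each given
    by a median and a geometric standard deviation (GSD)."""
    nd = NormalDist()
    out = 1.0
    for ui, med, gsd in zip(u, medians, gsds):
        out *= med * math.exp(nd.inv_cdf(ui) * math.log(gsd))
    return out
```

Because a product of independent lognormals is itself lognormal, the sample median of the chain output should sit near the product of the factor medians, which gives a quick self-check on the sampler.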

  14. Multicriterial evaluation of spallation reaction models

    International Nuclear Information System (INIS)

    Andrianov, A.A.; Gritsyuk, S.V.; Korovin, Yu.A.; Kuptsov, I.S.

    2013-01-01

    Results of evaluating the predictive ability of spallation reaction models for high-energy proton interactions, based on methods of discrete decision analysis, are presented. It is shown that the results obtained using different methods are highly consistent. Recommendations are given on the use of discrete decision analysis methods for providing constants to be employed in calculations of future nuclear power facilities [ru

  15. Credit Risk Evaluation : Modeling - Analysis - Management

    OpenAIRE

    Wehrspohn, Uwe

    2002-01-01

    An analysis and further development of the building blocks of modern credit risk management: definitions of default, estimation of default probabilities, exposures, recovery rates, pricing, concepts of portfolio dependence, time horizons for risk calculations, quantification of portfolio risk, estimation of risk measures, portfolio analysis and portfolio improvement, evaluation and comparison of credit risk models, and analytic portfolio loss distributions. The thesis contributes to the evaluatio...

  16. Evaluating Performances of Traffic Noise Models | Oyedepo ...

    African Journals Online (AJOL)

    Traffic noise levels in dB(A) were measured at six locations using a 407780A integrating sound level meter, while spot speeds and traffic volumes were collected with a cine-camera. The predicted sound exposure level (SEL) was evaluated using the Burgess, British and FHWA models. The average noise levels obtained are 77.64 ...

  17. Performance Evaluation Model for Application Layer Firewalls.

    Directory of Open Access Journals (Sweden)

    Shichang Xuan

    Full Text Available Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.

  18. Performance Evaluation Model for Application Layer Firewalls.

    Science.gov (United States)

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
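The Erlangian queuing analysis described above, per-layer performance followed by allocation of limited service-desk resources, can be sketched with the classical Erlang-C formulas for an M/M/c queue. The arrival/service rates and the brute-force three-way allocation search are illustrative assumptions, not the paper's model.

```python
import math

def erlang_c(c, lam, mu):
    """M/M/c queue: probability an arriving packet must wait (Erlang-C)
    and the mean waiting time in the queue."""
    a = lam / mu                  # offered load in Erlangs
    rho = a / c
    if rho >= 1.0:
        raise ValueError("unstable: utilisation >= 1")
    s = sum(a ** k / math.factorial(k) for k in range(c))
    last = a ** c / (math.factorial(c) * (1.0 - rho))
    p_wait = last / (s + last)
    wq = p_wait / (c * mu - lam)  # mean wait in queue
    return p_wait, wq

def best_allocation(total, lams, mus):
    """Brute-force split of a fixed pool of service desks across three
    layers (e.g. network/transport/application), minimising total delay."""
    best = None
    for c1 in range(1, total - 1):
        for c2 in range(1, total - c1):
            cs = (c1, c2, total - c1 - c2)
            try:
                delay = sum(erlang_c(c, l, m)[1] + 1.0 / m
                            for c, l, m in zip(cs, lams, mus))
            except ValueError:
                continue  # skip unstable allocations
            if best is None or delay < best[1]:
                best = (cs, delay)
    return best
```

With c = 1 the formulas reduce to the familiar M/M/1 results, and with symmetric per-layer loads the search returns an even split, both of which serve as sanity checks.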

  19. Evaluating the TD model of classical conditioning.

    Science.gov (United States)

    Ludvig, Elliot A; Sutton, Richard S; Kehoe, E James

    2012-09-01

    The temporal-difference (TD) algorithm from reinforcement learning provides a simple method for incrementally learning predictions of upcoming events. Applied to classical conditioning, TD models suppose that animals learn a real-time prediction of the unconditioned stimulus (US) on the basis of all available conditioned stimuli (CSs). In the TD model, similar to other error-correction models, learning is driven by prediction errors--the difference between the change in US prediction and the actual US. With the TD model, however, learning occurs continuously from moment to moment and is not artificially constrained to occur in trials. Accordingly, a key feature of any TD model is the assumption about the representation of a CS on a moment-to-moment basis. Here, we evaluate the performance of the TD model with a heretofore unexplored range of classical conditioning tasks. To do so, we consider three stimulus representations that vary in their degree of temporal generalization and evaluate how the representation influences the performance of the TD model on these conditioning tasks.
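The moment-to-moment learning described above can be sketched with a TD(0) rule over a complete-serial-compound stimulus representation, one weight per time step between CS onset and the US, which is one of the representation families the abstract alludes to. The trial length, learning rate, and discount factor below are illustrative choices.

```python
def run_td(n_trials=300, T=20, cs_on=5, us_t=15, alpha=0.3, gamma=0.95):
    """TD(0) with a complete serial compound: each time step while the CS
    is present has its own weight, so the US prediction can ramp in time."""
    w = [0.0] * (T + 1)
    for _ in range(n_trials):
        for t in range(T):
            active_t = cs_on <= t <= us_t
            active_next = cs_on <= t + 1 <= us_t
            v_t = w[t] if active_t else 0.0
            v_next = w[t + 1] if active_next else 0.0
            r = 1.0 if t == us_t else 0.0        # US delivered at us_t
            delta = r + gamma * v_next - v_t     # TD prediction error
            if active_t:
                w[t] += alpha * delta
    return w
```

At convergence the weights approach gamma raised to the number of steps remaining until the US, so the prediction ramps up exponentially toward the US time, the signature real-time behavior of TD conditioning models.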

  20. Modelling and evaluation of surgical performance using hidden Markov models.

    Science.gov (United States)

    Megali, Giuseppe; Sinigaglia, Stefano; Tonet, Oliver; Dario, Paolo

    2006-10-01

    Minimally invasive surgery has become very widespread in the last ten years. Since surgeons experience difficulties in learning and mastering minimally invasive techniques, the development of training methods is of great importance. While the introduction of virtual reality-based simulators has introduced a new paradigm in surgical training, skill evaluation methods are far from being objective. This paper proposes a method for defining a model of surgical expertise and an objective metric to evaluate performance in laparoscopic surgery. Our approach is based on the processing of kinematic data describing movements of surgical instruments. We use hidden Markov model theory to define an expert model that describes expert surgical gesture. The model is trained on kinematic data related to exercises performed on a surgical simulator by experienced surgeons. Subsequently, we use this expert model as a reference model in the definition of an objective metric to evaluate performance of surgeons with different abilities. Preliminary results show that, using different topologies for the expert model, the method can be efficiently used both for the discrimination between experienced and novice surgeons, and for the quantitative assessment of surgical ability.
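The scoring step of the approach above, evaluating a gesture sequence against an expert HMM, can be illustrated with the forward algorithm for a discrete-observation HMM. The two-state model and the symbol sequences below are toy assumptions; the actual system works on continuous kinematic features.

```python
import math

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM
    (forward algorithm with per-step scaling to avoid underflow)."""
    n_states = len(pi)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n_states)]
    ll = 0.0
    for t in range(len(obs)):
        if t > 0:
            alpha = [B[s][obs[t]] *
                     sum(alpha[q] * A[q][s] for q in range(n_states))
                     for s in range(n_states)]
        z = sum(alpha)       # per-step likelihood contribution
        ll += math.log(z)
        alpha = [a / z for a in alpha]
    return ll
```

A sequence with smooth, sustained movements scores a higher log-likelihood under a "sticky" expert model than a jittery alternating one, which is the basis for discriminating expert from novice performance.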

  1. Evaluating software architecture using fuzzy formal models

    Directory of Open Access Journals (Sweden)

    Payman Behbahaninejad

    2012-04-01

    Full Text Available Unified Modeling Language (UML) has been recognized as one of the most popular techniques for describing the static and dynamic aspects of software systems. One of the primary issues in designing software packages is the uncertainty associated with such models. Fuzzy-UML describes software architecture from both static and dynamic perspectives simultaneously. Evaluating software architecture at the design phase always helps uncover additional requirements, which helps reduce the cost of design. In this paper, we use a fuzzy data model to describe the static aspects of software architecture and a fuzzy sequence diagram to illustrate the dynamic aspects. We also transform these diagrams into Petri nets and evaluate the reliability of the architecture. A web-based hotel reservation system is studied for further illustration.

  2. Atmospheric Model Evaluation Tool for meteorological and air quality simulations

    Science.gov (United States)

    The Atmospheric Model Evaluation Tool compares model predictions to observed data from various meteorological and air quality observation networks to help evaluate meteorological and air quality simulations.

  3. CMAQ Involvement in Air Quality Model Evaluation International Initiative

    Science.gov (United States)

    Description of Air Quality Model Evaluation International Initiative (AQMEII). Different chemical transport models are applied by different groups over North America and Europe and evaluated against observations.

  4. Model description and evaluation of model performance: DOSDIM model

    International Nuclear Information System (INIS)

    Lewyckyj, N.; Zeevaert, T.

    1996-01-01

    DOSDIM was developed to assess the impact on man from routine and accidental atmospheric releases. It is a compartmental, deterministic, radiological model. For an accidental release, dynamic transfer factors are used, in contrast to a routine release, for which equilibrium transfer factors are used. Parameter values were chosen to be conservative. Transfers between compartments are described by first-order differential equations. 2 figs

  5. Performance Evaluation and Modelling of Container Terminals

    Science.gov (United States)

    Venkatasubbaiah, K.; Rao, K. Narayana; Rao, M. Malleswara; Challa, Suresh

    2018-02-01

    The present paper evaluates and analyzes the performance of 28 container terminals in South East Asia through data envelopment analysis (DEA), principal component analysis (PCA), and a hybrid DEA-PCA method. The DEA technique is used to identify efficient decision making units (DMUs) and to rank DMUs in a peer appraisal mode. PCA is a multivariate statistical method for evaluating the performance of container terminals. In the hybrid method, DEA is integrated with PCA to arrive at the ranking of container terminals. Based on the composite ranking, performance modelling and optimization of container terminals is carried out through response surface methodology (RSM).
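
    The PCA side of such a composite ranking can be sketched as follows; the terminal indicators and values are invented for illustration, and the sign-alignment and variance-weighting choices are common conventions rather than the paper's documented procedure.

```python
import numpy as np

def pca_composite_scores(X):
    """Composite PCA score: standardize indicators, project onto principal
    components (signs aligned so 'more is better'), and weight each
    component by its share of explained variance."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    order = np.argsort(vals)[::-1]                   # descending eigenvalues
    vals, vecs = vals[order], vecs[:, order]
    vecs = vecs * np.sign(vecs.sum(axis=0) + 1e-12)  # fix sign ambiguity
    weights = vals / vals.sum()
    return Z @ vecs @ weights

# Hypothetical terminal indicators: [throughput (M TEU), berth length (m), cranes]
X = np.array([[3.2, 900.0, 10.0],
              [1.1, 400.0, 4.0],
              [2.5, 700.0, 8.0],
              [0.8, 350.0, 3.0]])
scores = pca_composite_scores(X)
ranking = np.argsort(-scores)   # best terminal first
```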

  6. Probabilistic evaluation of competing climate models

    Directory of Open Access Journals (Sweden)

    A. Braverman

    2017-10-01

    Full Text Available Climate models produce output over decades or longer at high spatial and temporal resolution. Starting values, boundary conditions, greenhouse gas emissions, and so forth make the climate model an uncertain representation of the climate system. A standard paradigm for assessing the quality of climate model simulations is to compare what these models produce for past and present time periods, to observations of the past and present. Many of these comparisons are based on simple summary statistics called metrics. In this article, we propose an alternative: evaluation of competing climate models through probabilities derived from tests of the hypothesis that climate-model-simulated and observed time sequences share common climate-scale signals. The probabilities are based on the behavior of summary statistics of climate model output and observational data over ensembles of pseudo-realizations. These are obtained by partitioning the original time sequences into signal and noise components, and using a parametric bootstrap to create pseudo-realizations of the noise sequences. The statistics we choose come from working in the space of decorrelated and dimension-reduced wavelet coefficients. Here, we compare monthly sequences of CMIP5 model output of average global near-surface temperature anomalies to similar sequences obtained from the well-known HadCRUT4 data set as an illustration.
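
    The pseudo-realization step can be sketched in simplified form: partition the series into signal and noise, fit a parametric noise model, and simulate new noise sequences. The moving-average signal extraction and AR(1) noise model below are illustrative stand-ins; the paper works with wavelet-decomposed components.

```python
import numpy as np

rng = np.random.default_rng(0)

def moving_average(x, w=12):
    return np.convolve(x, np.ones(w) / w, mode="same")

def parametric_bootstrap(x, n_boot=200, w=12):
    """Pseudo-realizations: smooth signal plus AR(1)-resampled noise."""
    signal = moving_average(x, w)
    noise = x - signal
    phi = np.corrcoef(noise[:-1], noise[1:])[0, 1]       # AR(1) coefficient
    sigma = np.std(noise) * np.sqrt(max(1.0 - phi**2, 1e-12))
    boots = np.empty((n_boot, x.size))
    for b in range(n_boot):
        e = rng.normal(0.0, sigma, x.size)
        sim = np.empty(x.size)
        sim[0] = noise[0]
        for t in range(1, x.size):
            sim[t] = phi * sim[t - 1] + e[t]             # AR(1) recursion
        boots[b] = signal + sim
    return boots

x = np.sin(np.linspace(0, 6 * np.pi, 240)) + rng.normal(0, 0.3, 240)
boots = parametric_bootstrap(x)
```

    Summary statistics computed over the rows of `boots` then give the null distribution against which the observed statistic is compared.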

  7. Probabilistic evaluation of competing climate models

    Science.gov (United States)

    Braverman, Amy; Chatterjee, Snigdhansu; Heyman, Megan; Cressie, Noel

    2017-10-01

    Climate models produce output over decades or longer at high spatial and temporal resolution. Starting values, boundary conditions, greenhouse gas emissions, and so forth make the climate model an uncertain representation of the climate system. A standard paradigm for assessing the quality of climate model simulations is to compare what these models produce for past and present time periods, to observations of the past and present. Many of these comparisons are based on simple summary statistics called metrics. In this article, we propose an alternative: evaluation of competing climate models through probabilities derived from tests of the hypothesis that climate-model-simulated and observed time sequences share common climate-scale signals. The probabilities are based on the behavior of summary statistics of climate model output and observational data over ensembles of pseudo-realizations. These are obtained by partitioning the original time sequences into signal and noise components, and using a parametric bootstrap to create pseudo-realizations of the noise sequences. The statistics we choose come from working in the space of decorrelated and dimension-reduced wavelet coefficients. Here, we compare monthly sequences of CMIP5 model output of average global near-surface temperature anomalies to similar sequences obtained from the well-known HadCRUT4 data set as an illustration.

  8. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
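
    As an illustration of how per-technology results might be integrated (this is the textbook independence fusion rule, not necessarily IVSEM's internal algorithm): if each subsystem detects an event independently, the integrated probability of detection is the complement of every subsystem missing.

```python
def integrated_detection_probability(p_subsystems):
    """P(at least one subsystem detects), assuming independent subsystems."""
    p_miss = 1.0
    for p in p_subsystems:
        p_miss *= (1.0 - p)
    return 1.0 - p_miss

# Hypothetical per-technology detection probabilities for one event:
p_tech = {"seismic": 0.80, "infrasound": 0.30,
          "radionuclide": 0.50, "hydroacoustic": 0.10}
p_detect = integrated_detection_probability(p_tech.values())  # 0.937
```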

  9. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  10. Transport properties site descriptive model. Guidelines for evaluation and modelling

    International Nuclear Information System (INIS)

    Berglund, Sten; Selroos, Jan-Olof

    2004-04-01

    This report describes a strategy for the development of Transport Properties Site Descriptive Models within the SKB Site Investigation programme. Similar reports have been produced for the other disciplines in the site descriptive modelling (Geology, Hydrogeology, Hydrogeochemistry, Rock mechanics, Thermal properties, and Surface ecosystems). These reports are intended to guide the site descriptive modelling, but also to provide the authorities with an overview of modelling work that will be performed. The site descriptive modelling of transport properties is presented in this report and in the associated 'Strategy for the use of laboratory methods in the site investigations programme for the transport properties of the rock', which describes laboratory measurements and data evaluations. Specifically, the objectives of the present report are to: Present a description that gives an overview of the strategy for developing Site Descriptive Models, and which sets the transport modelling into this general context. Provide a structure for developing Transport Properties Site Descriptive Models that facilitates efficient modelling and comparisons between different sites. Provide guidelines on specific modelling issues where methodological consistency is judged to be of special importance, or where there is no general consensus on the modelling approach. The objectives of the site descriptive modelling process and the resulting Transport Properties Site Descriptive Models are to: Provide transport parameters for Safety Assessment. Describe the geoscientific basis for the transport model, including the qualitative and quantitative data that are of importance for the assessment of uncertainties and confidence in the transport description, and for the understanding of the processes at the sites. Provide transport parameters for use within other discipline-specific programmes. Contribute to the integrated evaluation of the investigated sites. 

  11. Evaluating spatial patterns in hydrological modelling

    DEFF Research Database (Denmark)

    Koch, Julian

    is not fully exploited by current modelling frameworks due to the lack of suitable spatial performance metrics. Furthermore, the traditional model evaluation using discharge is found unsuitable to lay confidence on the predicted catchment-inherent spatial variability of hydrological processes in a fully...... the contiguous United States (10^6 km2). To this end, the thesis at hand applies a set of spatial performance metrics to various hydrological variables, namely land-surface temperature (LST), evapotranspiration (ET) and soil moisture. The inspiration for the applied metrics is found in related fields...

  12. Implicit moral evaluations: A multinomial modeling approach.

    Science.gov (United States)

    Cameron, C Daryl; Payne, B Keith; Sinnott-Armstrong, Walter; Scheffer, Julian A; Inzlicht, Michael

    2017-01-01

    Implicit moral evaluations (i.e., immediate, unintentional assessments of the wrongness of actions or persons) play a central role in supporting moral behavior in everyday life. Yet little research has employed methods that rigorously measure individual differences in implicit moral evaluations. In five experiments, we develop a new sequential priming measure, the Moral Categorization Task, and a multinomial model that decomposes judgment on this task into multiple component processes. These include implicit moral evaluations of moral transgression primes (Unintentional Judgment), accurate moral judgments about target actions (Intentional Judgment), and a directional tendency to judge actions as morally wrong (Response Bias). Speeded response deadlines reduced Intentional Judgment but not Unintentional Judgment (Experiment 1). Unintentional Judgment was stronger toward moral transgression primes than non-moral negative primes (Experiments 2-4). Intentional Judgment was associated with increased error-related negativity, a neurophysiological indicator of behavioral control (Experiment 4). Finally, people who voted for an anti-gay marriage amendment had stronger Unintentional Judgment toward gay marriage primes (Experiment 5). Across Experiments 1-4, implicit moral evaluations converged with moral personality: Unintentional Judgment about wrong primes, but not negative primes, was negatively associated with psychopathic tendencies and positively associated with moral identity and guilt proneness. Theoretical and practical applications of formal modeling for moral psychology are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
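
    A generic multinomial-processing-tree decomposition in this spirit can be sketched as follows. The tree below is a common sequential-priming structure, not necessarily the exact Moral Categorization Task model: on transgression-prime trials, a "wrong" response arises from intentional judgment of the target (I), or, failing that, from unintentional prime-driven judgment (U), or, failing both, from response bias (B). With B fixed from separate trials, I and U have closed-form estimates.

```python
def predicted(I, U, B):
    """P('wrong' response) on transgression-prime trials, for targets that
    are actually wrong (pw) vs. actually neutral (pn)."""
    pn = (1 - I) * U + (1 - I) * (1 - U) * B
    pw = I + pn
    return pw, pn

def estimate(pw, pn, B):
    """Closed-form parameter recovery, assuming response bias B is known."""
    I = pw - pn                          # intentional judgment
    U = (pn / (1 - I) - B) / (1 - B)     # unintentional judgment
    return I, U

pw, pn = predicted(I=0.6, U=0.3, B=0.5)  # simulate from known parameters
I_hat, U_hat = estimate(pw, pn, B=0.5)
```

    Real MPT fitting maximizes a multinomial likelihood over many conditions; the closed form above only shows why the two response proportions identify the two processes.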

  13. Evaluating predictive models of software quality

    International Nuclear Information System (INIS)

    Ciaschini, V; Canaparo, M; Ronchieri, E; Salomoni, D

    2014-01-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluate two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent of discovering the risk factor of each product and comparing it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and we conclude by suggesting directions for further studies.

  14. Simulations, evaluations and models. Vol. 1

    International Nuclear Information System (INIS)

    Brehmer, B.; Leplat, J.

    1992-01-01

    Papers presented at the Fourth MOHAWC (Models of Human Activities in Work Context) workshop. The general theme was simulations, evaluations and models. The emphasis was on time in relation to the modelling of human activities in modern, high-tech work. Such work often requires people to control dynamic systems, and the behaviour and misbehaviour of these systems in time is a principal focus of work in, for example, a modern process plant. The papers report on microworlds and on their innovative uses, both in the form of experiments and in the form of a new use: testing a program which performs diagnostic reasoning. They present new perspectives on the problem of time in process control, showing the importance of considering the time scales of dynamic tasks, both in individual and in distributed decision making, and they provide new formalisms, both for the representation of time and for reasoning involving time in diagnosis. (AB)

  15. A methodology for spectral wave model evaluation

    Science.gov (United States)

    Siqueira, S. A.; Edwards, K. L.; Rogers, W. E.

    2017-12-01

    Model evaluation is accomplished by comparing bulk parameters (e.g., significant wave height, energy period, and mean square slope (MSS)) calculated from the model energy spectra with those calculated from buoy energy spectra. Quality control of the observed data and choice of the frequency range from which the bulk parameters are calculated are critical steps in ensuring the validity of the model-data comparison. The compared frequency range of each observation and the analogous model output must be identical, and the optimal frequency range depends in part on the reliability of the observed spectra. National Data Buoy Center 3-m discus buoy spectra are unreliable above 0.3 Hz due to a non-optimal buoy response function correction. As such, the upper end of the spectrum should not be included when comparing a model to these data. Biofouling of Waverider buoys must be detected, as it can harm the hydrodynamic response of the buoy at high frequencies, thereby rendering the upper part of the spectrum unsuitable for comparison. An important consideration is that the intentional exclusion of high frequency energy from a validation due to data quality concerns (above) can have major implications for validation exercises, especially for parameters such as the third and fourth moments of the spectrum (related to Stokes drift and MSS, respectively); final conclusions can be strongly altered. We demonstrate this by comparing outcomes with and without the exclusion, in a case where a Waverider buoy is believed to be free of biofouling. Determination of the appropriate frequency range is not limited to the observed spectra. Model evaluation involves considering whether all relevant frequencies are included. Guidance to make this decision is based on analysis of observed spectra. Two model frequency lower limits were considered. Energy in the observed spectrum below the model lower limit was calculated for each. For locations where long swell is a component of the wave
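
    The bulk parameters named above are spectral moments over the chosen frequency range: m_n = ∫ f^n S(f) df, with significant wave height Hs = 4√m0 and energy period Te = m_{-1}/m0. A sketch with a synthetic spectrum (the spectral shape below is invented) shows how truncating the upper frequency limit changes the result:

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoidal integration (kept explicit for NumPy-version safety)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def spectral_moment(f, S, n, fmin, fmax):
    """m_n = integral of f^n * S(f) df over [fmin, fmax]."""
    m = (f >= fmin) & (f <= fmax)
    return trapezoid(f[m] ** n * S[m], f[m])

f = np.linspace(0.03, 0.5, 200)                      # frequency grid, Hz
S = 10.0 * np.exp(-((f - 0.1) / 0.03) ** 2) + 0.5    # synthetic spectrum, m^2/Hz

m0_full = spectral_moment(f, S, 0, 0.03, 0.5)
m0_cut = spectral_moment(f, S, 0, 0.03, 0.3)         # exclude unreliable tail
hs_full = 4.0 * np.sqrt(m0_full)                     # significant wave height
hs_cut = 4.0 * np.sqrt(m0_cut)
te_cut = spectral_moment(f, S, -1, 0.03, 0.3) / m0_cut   # energy period
```

    Because the integrand is positive, truncation always lowers Hs; the point above is that model and observation must be truncated identically for the comparison to be fair.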

  16. Diagnosis code assignment: models and evaluation metrics.

    Science.gov (United States)

    Perotte, Adler; Pivovarov, Rimma; Natarajan, Karthik; Weiskopf, Nicole; Wood, Frank; Elhadad, Noémie

    2014-01-01

    The volume of healthcare data is growing rapidly with the adoption of health information technology. We focus on automated ICD9 code assignment from discharge summary content and methods for evaluating such assignments. We study ICD9 diagnosis codes and discharge summaries from the publicly available Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC II) repository. We experiment with two coding approaches: one that treats each ICD9 code independently of the others (flat classifier), and one that leverages the hierarchical nature of ICD9 codes in its modeling (hierarchy-based classifier). We propose novel evaluation metrics, which reflect the distances among gold-standard and predicted codes and their locations in the ICD9 tree. The experimental setup, code for modeling, and evaluation scripts are made available to the research community. The hierarchy-based classifier outperforms the flat classifier with F-measures of 39.5% and 27.6%, respectively, when trained on 20,533 documents and tested on 2282 documents. While recall is improved at the expense of precision, our novel evaluation metrics show a more refined assessment: for instance, the hierarchy-based classifier identifies the correct sub-tree of gold-standard codes more often than the flat classifier. Error analysis reveals that gold-standard codes are not perfect, and as such the recall and precision are likely underestimated. Hierarchy-based classification yields better ICD9 coding than flat classification for MIMIC patients. Automated ICD9 coding is an example of a task for which data and tools can be shared and for which the research community can work together to build on shared models and advance the state of the art.
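
    Flat multilabel code assignment is commonly scored with micro-averaged precision, recall, and F-measure over (document, code) pairs; a minimal sketch with made-up codes (the paper's hierarchy-aware metrics are more refined than this):

```python
def micro_prf(gold, pred):
    """Micro-averaged precision/recall/F1 over per-document code sets."""
    tp = sum(len(g & p) for g, p in zip(gold, pred))   # correct assignments
    n_pred = sum(len(p) for p in pred)
    n_gold = sum(len(g) for g in gold)
    precision = tp / n_pred if n_pred else 0.0
    recall = tp / n_gold if n_gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Two hypothetical discharge summaries with gold and predicted ICD9 codes:
gold = [{"250.00", "401.9"}, {"428.0"}]
pred = [{"250.00"}, {"428.0", "401.9"}]
p, r, f1 = micro_prf(gold, pred)
```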

  17. Evaluation of NOx Emissions and Modeling

    Science.gov (United States)

    Henderson, B. H.; Simon, H. A.; Timin, B.; Dolwick, P. D.; Owen, R. C.; Eyth, A.; Foley, K.; Toro, C.; Baker, K. R.

    2017-12-01

    Studies focusing on ambient measurements of NOy have concluded that NOx emissions are overestimated, and some have attributed the error to the onroad mobile sector. We investigate this conclusion to identify the cause of observed bias. First, we compare DISCOVER-AQ Baltimore ambient measurements to fine-scale modeling with NOy tagged by sector. Sector-based relationships with bias are present, but these are sensitive to simulated vertical mixing. This is evident both in sensitivity to mixing parameterization and in the seasonal patterns of bias. We also evaluate observation-based indicators, like CO:NOy ratios, that are commonly used to diagnose emissions inventories. Second, we examine the sensitivity of predicted NOx and NOy to temporal allocation of emissions. We investigate alternative temporal allocations for EGUs without CEMS, on-road mobile, and several non-road categories. These results show some location-specific sensitivity and will lead to some improved temporal allocations. Third, near-road studies have inherently fewer confounding variables and have been examined for more direct evaluation of emissions and dispersion models. From 2008-2011, the EPA and FHWA conducted near-road studies in Las Vegas and Detroit. These measurements are used to more directly evaluate the emissions and dispersion using site-specific traffic data. In addition, the site-specific emissions are being compared to the emissions used in larger-scale photochemical modeling to identify key discrepancies. These efforts are part of a larger coordinated effort by EPA scientists to ensure the highest quality in emissions and model processes. We look forward to sharing the state of these analyses and expected updates.

  18. Intuitionistic fuzzy (IF) evaluations of multidimensional model

    International Nuclear Information System (INIS)

    Valova, I.

    2012-01-01

    There are different logical methods for structuring data, but none is perfect. The multidimensional (MD) model presents data in the form of a cube (also referred to as an info-cube or hypercube) or in the form of a 'star'-type scheme (referred to as a multidimensional scheme), using F-structures (Facts) and a set of D-structures (Dimensions), based on the notion of a hierarchy of D-structures. The data analyzed in a specific multidimensional model is located in a Cartesian space restricted by the D-structures. In practice, the data is either dispersed or concentrated, so the data cells are not distributed evenly within the space. The moment of occurrence of an event is difficult to predict, and data tends to be concentrated by time period, location of the business event, etc. Processing such dispersed or concentrated data requires various technical strategies, and appropriate basic methods for presenting it must be selected. The approaches to data processing and the respective calculations are connected with different options for data representation. The use of intuitionistic fuzzy evaluations (IFE) provides new possibilities for alternative presentation and processing of the data analyzed in any OLAP application. Using IFE in the evaluation of multidimensional models has the following advantages: analysts have more complete information for processing and analysis; managers' final decisions are more effective; and more functional multidimensional schemes can be designed. The purpose of this work is to apply intuitionistic fuzzy evaluations to a multidimensional data model. (authors)
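
    An intuitionistic fuzzy evaluation attaches to each item a pair (mu, nu) of membership and non-membership degrees with mu + nu <= 1. Weighted aggregation of several such evaluations can follow the standard intuitionistic fuzzy weighted averaging (IFWA) operator; the cell evaluations and weights below are invented for illustration:

```python
def ifwa(pairs, weights):
    """Intuitionistic fuzzy weighted average of (mu, nu) pairs:
    mu_agg = 1 - prod((1 - mu_i)^w_i),  nu_agg = prod(nu_i^w_i)."""
    mu_miss, nu_agg = 1.0, 1.0
    for (mu, nu), w in zip(pairs, weights):
        mu_miss *= (1.0 - mu) ** w
        nu_agg *= nu ** w
    return 1.0 - mu_miss, nu_agg

# Hypothetical IF evaluations of three OLAP cells, with weights summing to 1:
pairs = [(0.6, 0.3), (0.8, 0.1), (0.5, 0.4)]
weights = [0.5, 0.3, 0.2]
mu, nu = ifwa(pairs, weights)
```

    For valid inputs the operator keeps the result a valid IF pair, and the hesitancy margin 1 - mu - nu carries the remaining uncertainty forward to the analyst.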

  19. Scanning thermal microscopy based on a quartz tuning fork and a micro-thermocouple in active mode (2ω method)

    Energy Technology Data Exchange (ETDEWEB)

    Bontempi, Alexia; Nguyen, Tran Phong; Salut, Roland; Thiery, Laurent; Teyssieux, Damien; Vairac, Pascal [FEMTO-ST Institute UMR 6174, Université de Franche-Comté, CNRS, ENSMM, UTBM, 15B Avenue des Montboucons, F-25030 Besançon (France)

    2016-06-15

    A novel probe for a scanning thermal microscope, using a micro-thermocouple probe mounted on a Quartz Tuning Fork (QTF), is presented. Instead of using external deflection detection with a cantilever beam for contact sensing, an original combination of a piezoelectric resonator and a thermal probe is employed. Thanks to a non-contact photothermal excitation principle, the high quality factor of the QTF enables probe-to-surface contact detection. Topographic and thermal scanning images obtained on a specific sample demonstrate the interest of our system as an alternative to cantilevered resistive-probe systems, which are the most widespread.

  20. Scanning thermal microscopy based on a quartz tuning fork and a micro-thermocouple in active mode (2ω method)

    International Nuclear Information System (INIS)

    Bontempi, Alexia; Nguyen, Tran Phong; Salut, Roland; Thiery, Laurent; Teyssieux, Damien; Vairac, Pascal

    2016-01-01

    A novel probe for a scanning thermal microscope, using a micro-thermocouple probe mounted on a Quartz Tuning Fork (QTF), is presented. Instead of using external deflection detection with a cantilever beam for contact sensing, an original combination of a piezoelectric resonator and a thermal probe is employed. Thanks to a non-contact photothermal excitation principle, the high quality factor of the QTF enables probe-to-surface contact detection. Topographic and thermal scanning images obtained on a specific sample demonstrate the interest of our system as an alternative to cantilevered resistive-probe systems, which are the most widespread.

  1. Scanning thermal microscopy based on a quartz tuning fork and a micro-thermocouple in active mode (2ω method).

    Science.gov (United States)

    Bontempi, Alexia; Nguyen, Tran Phong; Salut, Roland; Thiery, Laurent; Teyssieux, Damien; Vairac, Pascal

    2016-06-01

    A novel probe for a scanning thermal microscope, using a micro-thermocouple probe mounted on a Quartz Tuning Fork (QTF), is presented. Instead of using external deflection detection with a cantilever beam for contact sensing, an original combination of a piezoelectric resonator and a thermal probe is employed. Thanks to a non-contact photothermal excitation principle, the high quality factor of the QTF enables probe-to-surface contact detection. Topographic and thermal scanning images obtained on a specific sample demonstrate the interest of our system as an alternative to cantilevered resistive-probe systems, which are the most widespread.

  2. Training Module on the Evaluation of Best Modeling Practices

    Science.gov (United States)

    Building upon the fundamental concepts outlined in previous modules, the objectives of this module are to explore the topic of model evaluation and identify the 'best modeling practices' and strategies for the Evaluation Stage of the model life-cycle.

  3. Evaluation of onset of nucleate boiling models

    Energy Technology Data Exchange (ETDEWEB)

    Huang, LiDong [Heat Transfer Research, Inc., College Station, TX (United States)], e-mail: lh@htri.net

    2009-07-01

    This article discusses available models and correlations for predicting the required heat flux or wall superheat for the Onset of Nucleate Boiling (ONB) on plain surfaces. It reviews ONB data in the open literature and discusses the continuing efforts of Heat Transfer Research, Inc. in this area. Our ONB database contains ten individual sources for ten test fluids and a wide range of operating conditions for different geometries, e.g., tube-side and shell-side flow boiling and falling film evaporation. The article also evaluates literature models and correlations against the data: no single model in the open literature predicts all data well. The prediction uncertainty is especially high under vacuum conditions. Surface roughness is another critical criterion in determining which model should be used. However, most models do not directly account for surface roughness, and most investigators do not provide surface roughness information in their published findings. Additional experimental research is needed to improve confidence in predicting the required wall superheats for nucleate boiling for engineering design purposes. (author)

  4. Evaluation of CASP8 model quality predictions

    KAUST Repository

    Cozzetto, Domenico

    2009-01-01

    The model quality assessment problem consists in the a priori estimation of the overall and per-residue accuracy of protein structure predictions. Over the past years, a number of methods have been developed to address this issue and CASP established a prediction category to evaluate their performance in 2006. In 2008 the experiment was repeated and its results are reported here. Participants were invited to infer the correctness of the protein models submitted by the registered automatic servers. Estimates could apply to both whole models and individual amino acids. Groups involved in the tertiary structure prediction categories were also asked to assign local error estimates to each predicted residue in their own models and their results are also discussed here. The correlation between the predicted and observed correctness measures was the basis of the assessment of the results. We observe that consensus-based methods still perform significantly better than those accepting single models, similarly to what was concluded in the previous edition of the experiment. © 2009 WILEY-LISS, INC.

  5. Moisture evaluation by dynamic thermography data modeling

    Energy Technology Data Exchange (ETDEWEB)

    Bison, P.G.; Grinzato, E.; Marinetti, S. [ITEF-CNR, Padova (Italy)

    1994-12-31

    This paper is a continuation of previous work on the design of a non-destructive method for in situ detection of moistened areas in buildings and the evaluation of the water content of porous materials by thermographic analysis. Using a heat transfer model to interpret the data improves the measurement accuracy by taking into account the actual boundary conditions. The relative increase in computation time is balanced by the additional ability to optimize the testing procedure for different objects by simulating the heat transfer. Two models are tested both analytically and experimentally: (1) the semi-infinite body, to evaluate the thermal inertia and water content; (2) the slab, to measure the sample's diffusivity and the dependence of conductivity on water content, and to correct the water content estimation. The fitting of the experimental data to the model is carried out by the least-squares method, which is linear in the first case and nonlinear in the second. The Levenberg-Marquardt procedure is followed in the nonlinear fitting to search the parameter space for the optimum point that minimizes the chi-square estimator. Experimental results on bricks used in building restoration activities are discussed. The water content measured under different hygrometric conditions is compared with known values. A correction to the absorptivity coefficient, dependent on water content, is introduced.
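
    The nonlinear fitting step can be sketched with a minimal Levenberg-Marquardt loop; the single-exponential surface-temperature decay and the data below are illustrative, not the paper's slab model:

```python
import numpy as np

def model(t, p):
    """Illustrative single-exponential decay with amplitude a and time tau."""
    a, tau = p
    return a * np.exp(-t / tau)

def levenberg_marquardt(t, y, p0, n_iter=50, damping=1e-2):
    """Minimal LM loop: damped Gauss-Newton steps on the chi-square
    sum of squared residuals for the model above."""
    p = np.array(p0, float)
    for _ in range(n_iter):
        r = y - model(t, p)
        a, tau = p
        # Analytic Jacobian of the residuals w.r.t. (a, tau)
        J = np.column_stack([-np.exp(-t / tau),
                             -a * t * np.exp(-t / tau) / tau**2])
        A = J.T @ J + damping * np.eye(2)
        step = np.linalg.solve(A, -J.T @ r)
        p_new = p + step
        if p_new[1] > 0 and np.sum((y - model(t, p_new)) ** 2) < np.sum(r**2):
            p, damping = p_new, damping * 0.5   # accept step, relax damping
        else:
            damping *= 2.0                      # reject step, damp harder
    return p

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 100)
y = model(t, (2.0, 3.0)) + rng.normal(0.0, 0.01, t.size)  # synthetic data
p_hat = levenberg_marquardt(t, y, p0=(1.0, 1.0))
```

    The damping parameter interpolates between Gauss-Newton (small damping) and gradient descent (large damping), which is the defining trait of the Levenberg-Marquardt scheme.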

  6. European Cohesion Policy: A Proposed Evaluation Model

    Directory of Open Access Journals (Sweden)

    Alina Bouroşu (Costăchescu)

    2012-06-01

    Full Text Available The current approach to European Cohesion Policy (ECP) is intended to be a bridge between different fields of study, emphasizing the intersection between "the public policy cycle, theories of new institutionalism and the new public management". ECP can be viewed as a focal point between putting into practice the principles of the new governance theory, theories of economic convergence and divergence, and the governance of common goods. After a short introduction defining the concepts used, the author discusses the image of ECP created by applying three different theories, focusing on the structural funds implementation system (SFIS) and directing the discussion to the evaluation part of this policy by proposing a model for performance evaluation of the system, in order to outline key principles for creating effective management mechanisms of ECP.

  7. Incorporating a 360 Degree Evaluation Model IOT Transform the USMC Performance Evaluation System

    Science.gov (United States)

    2005-02-08

    Incorporating a 360 Evaluation Model IOT Transform the USMC Performance Evaluation System EWS 2005 Subject Area Manpower...Incorporating a 360 Evaluation Model IOT Transform the USMC Performance Evaluation System” Contemporary...COVERED 00-00-2005 to 00-00-2005 4. TITLE AND SUBTITLE Incorporating a 360 Evaluation Model IOT Transform the USMC Performance

  8. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.
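    A hedged illustration of the kind of integration IVSEM performs: if the subsystems were treated as independent, the integrated probability of detection would follow from the probability that every subsystem misses. This is a sketch of the general idea only, not IVSEM's actual algorithm, and the probabilities below are invented.

```python
# Illustrative per-subsystem detection probabilities (made-up numbers).
p_subsystem = {"seismic": 0.90, "infrasound": 0.60,
               "radionuclide": 0.40, "hydroacoustic": 0.25}

# Under an independence assumption, the event is missed only if every
# subsystem misses it.
p_miss_all = 1.0
for p in p_subsystem.values():
    p_miss_all *= (1.0 - p)

p_detect = 1.0 - p_miss_all
print(round(p_detect, 4))
```

The combined figure exceeds any single subsystem's, which is the "synergy among the technologies" the abstract refers to.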

  9. Designing and evaluating representations to model pedagogy

    Directory of Open Access Journals (Sweden)

    Elizabeth Masterman

    2013-08-01

    Full Text Available This article presents the case for a theory-informed approach to designing and evaluating representations for implementation in digital tools to support Learning Design, using the framework of epistemic efficacy as an example. This framework, which is rooted in the literature of cognitive psychology, is operationalised through dimensions of fit that attend to: (1) the underlying ontology of the domain, (2) the purpose of the task that the representation is intended to facilitate, (3) how best to support the cognitive processes of the users of the representations, (4) users' differing needs and preferences, and (5) the tool and environment in which the representations are constructed and manipulated. Through showing how epistemic efficacy can be applied to the design and evaluation of representations, the article presents the Learning Designer, a constructionist microworld in which teachers can both assemble their learning designs and model their pedagogy in terms of students' potential learning experience. Although the activity of modelling may add to the cognitive task of design, the article suggests that the insights thereby gained can additionally help a lecturer who wishes to reuse a particular learning design to make informed decisions about its value to their practice.

  10. Evaluation of Student's Environment by DEA Models

    Directory of Open Access Journals (Sweden)

    F. Moradi

    2016-11-01

    Full Text Available The important question here is: is there real evaluation of educational progress? In other words, if a student has been successful or unsuccessful in mathematics, is it possible to find the reasons behind his progress or weakness? To answer this significant question, the factors of educational progress must be divided into five main groups: 1) family, 2) teacher, 3) student, 4) school and 5) school manager. It can then be said that a student's score does not depend on just one factor, as people have imagined. From this, it can be concluded that by using the DEA and SBM models, each student's efficiency can be assessed and the factors behind the student's strengths and weaknesses analyzed.
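    The DEA analysis mentioned above can be sketched as a small linear program. Below is a minimal input-oriented CCR DEA efficiency computation (the SBM model named in the abstract is a related slack-based variant); the student "inputs" and "outputs" are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data: 3 students, one input (e.g. weekly study hours) and
# one output (e.g. test score). Not from the article.
X = np.array([[2.0, 4.0, 3.0]])   # inputs,  shape (m, n)
Y = np.array([[2.0, 2.0, 3.0]])   # outputs, shape (s, n)

def ccr_efficiency(k):
    """Input-oriented CCR envelopment model for unit k:
    minimize theta  s.t.  X@lam <= theta * X[:, k],  Y@lam >= Y[:, k],  lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n)
    c[0] = 1.0                                      # variables: [theta, lam...]
    A_in = np.hstack([-X[:, [k]], X])               # X@lam - theta*xk <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])       # -Y@lam <= -yk
    A = np.vstack([A_in, A_out])
    b = np.concatenate([np.zeros(m), -Y[:, k]])
    res = linprog(c, A_ub=A, b_ub=b,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun                                  # theta* in (0, 1]

print([round(ccr_efficiency(k), 3) for k in range(3)])  # 1.0 = efficient
```

Units with efficiency 1.0 lie on the frontier; for the others, the optimal lambda weights point to the peer units against which their weaknesses can be analyzed.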

  11. RTMOD: Real-Time MODel evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Graziani, G.; Galmarini, S. [Joint Research Centre, Ispra (Italy)]; Mikkelsen, T. [Risoe National Lab., Wind Energy and Atmospheric Physics Dept. (Denmark)]

    2000-01-01

    The 1998 - 1999 RTMOD project is a system based on an automated statistical evaluation for the inter-comparison of real-time forecasts produced by long-range atmospheric dispersion models for national nuclear emergency predictions of cross-boundary consequences. The background of RTMOD was the 1994 ETEX project, which involved about 50 models run at several institutes around the world to simulate two real tracer releases covering a large part of the European territory. In the preliminary phase of ETEX, three dry runs (i.e. real-time simulations of fictitious releases) were carried out. At that time, the World Wide Web was not available to all the exercise participants, and plume predictions were therefore submitted to JRC-Ispra by fax and regular mail for subsequent processing. The rapid development of the World Wide Web in the second half of the nineties, together with the experience gained during the ETEX exercises, suggested the development of this project. RTMOD featured a web-based user-friendly interface for data submission and an interactive program module for display, intercomparison and analysis of the forecasts. RTMOD focussed on model intercomparison of concentration predictions at the nodes of a regular grid with 0.5 degrees of resolution in both latitude and longitude, the domain extending from 5W to 40E and 40N to 65N. Hypothetical releases were notified around the world to the 28 model forecasters via the web with one day's advance warning. They then accessed the RTMOD web page for detailed information on the actual release, uploaded their predictions to the RTMOD server as soon as possible, and could soon after start their inter-comparison analysis with other modelers. When additional forecast data arrived, already existing statistical results would be recalculated to include the influence of all available predictions. The new web-based RTMOD concept has proven useful as a practical decision-making tool for real-time

  12. ZATPAC: a model consortium evaluates teen programs.

    Science.gov (United States)

    Owen, Kathryn; Murphy, Dana; Parsons, Chris

    2009-09-01

    How do we advance the environmental literacy of young people, support the next generation of environmental stewards and increase the diversity of the leadership of zoos and aquariums? We believe it is through ongoing evaluation of zoo and aquarium teen programming and have founded a consortium to pursue those goals. The Zoo and Aquarium Teen Program Assessment Consortium (ZATPAC) is an initiative by six of the nation's leading zoos and aquariums to strengthen institutional evaluation capacity, model a collaborative approach toward assessing the impact of youth programs, and bring additional rigor to evaluation efforts within the field of informal science education. Since its beginning in 2004, ZATPAC has researched, developed, pilot-tested and implemented a pre-post program survey instrument designed to assess teens' knowledge of environmental issues, skills and abilities to take conservation actions, self-efficacy in environmental actions, and engagement in environmentally responsible behaviors. Findings from this survey indicate that teens who join zoo/aquarium programs are already actively engaged in many conservation behaviors. After participating in the programs, teens showed a statistically significant increase in their reported knowledge of conservation and environmental issues and their abilities to research, explain, and find resources to take action on conservation issues of personal concern. Teens also showed statistically significant increases pre-program to post-program for various conservation behaviors, including "I talk with my family and/or friends about things they can do to help the animals or the environment," "I save water...," "I save energy...," "When I am shopping I look for recycled products," and "I help with projects that restore wildlife habitat."

  13. World Integrated Nuclear Evaluation System: Model documentation

    International Nuclear Information System (INIS)

    1991-12-01

    The World Integrated Nuclear Evaluation System (WINES) is an aggregate demand-based partial equilibrium model used by the Energy Information Administration (EIA) to project long-term domestic and international nuclear energy requirements. WINES follows a top-down approach in which economic growth rates, delivered energy demand growth rates, and electricity demand are projected successively to ultimately forecast total nuclear generation and nuclear capacity. WINES could be potentially used to produce forecasts for any country or region in the world. Presently, WINES is being used to generate long-term forecasts for the United States, and for all countries with commercial nuclear programs in the world, excluding countries located in centrally planned economic areas. Projections for the United States are developed for the period from 2010 through 2030, and for other countries for the period starting in 2000 or 2005 (depending on the country) through 2010. EIA uses a pipeline approach to project nuclear capacity for the period between 1990 and the starting year for which the WINES model is used. This approach involves a detailed accounting of existing nuclear generating units and units under construction, their capacities, their actual or estimated time of completion, and the estimated date of retirements. Further detail on this approach can be found in Appendix B of Commercial Nuclear Power 1991: Prospects for the United States and the World

  14. COMPUTER MODEL FOR ORGANIC FERTILIZER EVALUATION

    Directory of Open Access Journals (Sweden)

    Zdenko Lončarić

    2009-12-01

    Full Text Available Evaluation of manures, composts and growing media quality should include enough properties to enable optimal use from productivity and environmental points of view. The aim of this paper is to describe the basic structure of an organic fertilizer (and growing media) evaluation model, to present the model by an example comparing different manures, and to give an example of using a plant growth experiment to calculate the impact of pH and EC of growing media on lettuce growth. The basic structure of the model includes selection of quality indicators, interpretation of indicator values, and integration of interpreted values into new indexes. The first step includes data input and selection of available data as basic or additional indicators, depending on possible use as fertilizer or growing media. The second part of the model uses the inputs to calculate derived quality indicators. The third step integrates values into three new indexes: fertilizer, growing media, and environmental index. All three indexes are calculated on the basis of three different groups of indicators: basic value indicators, additional value indicators and limiting factors. The possible range of index values is 0-10, where 0-3 means low, 3-7 medium and 7-10 high quality. Comparing fresh and composted manures, higher fertilizer and environmental indexes were determined for composted manures, and the highest fertilizer index was determined for composted pig manure (9.6) whereas the lowest was for fresh cattle manure (3.2). Composted manures had a high environmental index (6.0-10) for conventional agriculture, but some had no value (environmental index = 0) for organic agriculture because of too high zinc, copper or cadmium concentrations. Growing media indexes were determined according to their impact on lettuce growth. Growing media with different pH and EC had very significant impacts on height, dry matter mass and leaf area of lettuce seedlings. The highest lettuce

  15. Evaluation of clinical information modeling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives identified as developing tools for clinical information modeling, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.

  16. A Model for the Evaluation of Educational Products.

    Science.gov (United States)

    Bertram, Charles L.

    A model for the evaluation of educational products based on experience with development of three such products is described. The purpose of the evaluation model is to indicate the flow of evaluation activity as products undergo development. Evaluation is given Stufflebeam's definition as the process of delineating, obtaining, and providing useful…

  17. A Model for Evaluating Student Clinical Psychomotor Skills.

    Science.gov (United States)

    Fiel, Nicholas J.; And Others

    1979-01-01

    A long-range plan to evaluate medical students' physical examination skills was undertaken at the Ingham Family Medical Clinic at Michigan State University. The development of the psychomotor skills evaluation model to evaluate the skill of blood pressure measurement, tests of the model's reliability, and the use of the model are described. (JMD)

  18. Evaluating Quality in Model-Driven Engineering

    OpenAIRE

    Mohagheghi, Parastoo; Aagedal, Jan

    2007-01-01

    In Model-Driven Engineering (MDE), models are the prime artifacts, and developing high-quality systems depends on developing high-quality models and performing transformations that preserve quality or even improve it. This paper presents quality goals in MDE and states that the quality of models is affected by the quality of modeling languages, tools, modeling processes, the knowledge and experience of modelers, and the quality assurance techniques applied. The paper further presents related ...

  19. Safe and consistent method of spot-welding platinum thermocouple wires and foils for high temperature measurements

    Science.gov (United States)

    Orr, G.; Roth, M.

    2012-08-01

    A low-voltage (mV) electronically triggered spot welding system for fabricating fine thermocouples and the thin sheets used in high-temperature characterization of materials' properties is presented. The system is based on the capacitance discharge method with a timed trigger for obtaining reliable and consistent welds. In contrast to existing techniques based on high-voltage DC supplies for charging the capacitor, or on supplies with positive and negative rails, this method uses a simple, standard dual power supply available in most physics laboratories, or one that can be acquired at low cost. In addition, an efficient and simple method of fabricating non-sticking electrodes that do not contaminate the weld area is suggested and implemented.

  20. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    Science.gov (United States)

    The model performance evaluation consists of metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude and sequence errors.
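    One simple way to separate magnitude-only from combined magnitude-and-sequence error (an illustration of the idea, not necessarily MPESA's own formulas): comparing the sorted series ignores timing and isolates magnitude error, while the ordinary RMSE mixes both.

```python
import numpy as np

obs = np.array([1.0, 3.0, 2.0, 5.0, 4.0])
sim = np.array([2.0, 5.0, 1.0, 4.0, 3.0])   # same values as obs, wrong order

# Combined magnitude-and-sequence error: ordinary RMSE on the paired series.
rmse = np.sqrt(np.mean((sim - obs) ** 2))

# Magnitude-only error: RMSE after sorting both series, which discards timing.
rmse_mag = np.sqrt(np.mean((np.sort(sim) - np.sort(obs)) ** 2))

print(rmse, rmse_mag)   # here rmse_mag is 0: all error is in the sequencing
```

The gap between the two numbers indicates how much of the disagreement is purely a matter of sequencing.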

  1. Impact of model defect and experimental uncertainties on evaluated output

    International Nuclear Information System (INIS)

    Neudecker, D.; Capote, R.; Leeb, H.

    2013-01-01

    One of the current major problems in nuclear data evaluation is the unreasonably small evaluated uncertainties often obtained. These small uncertainties are partly attributed to missing correlations of experimental uncertainties as well as to deficiencies of the model employed for the prior information. In this article, both uncertainty sources are included in an evaluation of 55 Mn cross-sections for incident neutrons. Their impact on the evaluated output is studied using a prior obtained by the Full Bayesian Evaluation Technique and a prior obtained by the nuclear model program EMPIRE. It is shown analytically and by means of an evaluation that unreasonably small evaluated uncertainties can be obtained not only if correlated systematic uncertainties of the experiment are neglected but also if prior uncertainties are smaller or about the same magnitude as the experimental ones. Furthermore, it is shown that including model defect uncertainties in the evaluation of 55 Mn leads to larger evaluated uncertainties for channels where the model is deficient. It is concluded that including correlated experimental uncertainties is equally important as model defect uncertainties, if the model calculations deviate significantly from the measurements. -- Highlights: • We study possible causes of unreasonably small evaluated nuclear data uncertainties. • Two different formulations of model defect uncertainties are presented and compared. • Smaller prior than experimental uncertainties cause too small evaluated ones. • Neglected correlations of experimental uncertainties cause too small evaluated ones. • Including model defect uncertainties in the prior improves the evaluated output

  2. Nuclear safety culture evaluation model based on SSE-CMM

    International Nuclear Information System (INIS)

    Yang Xiaohua; Liu Zhenghai; Liu Zhiming; Wan Yaping; Peng Guojian

    2012-01-01

    Safety culture, which is of great significance for establishing safety objectives, characterizes the level of an enterprise's safety production and development. Traditional safety culture evaluation models emphasize the thinking and behavior of individuals and the organization, and pay attention to evaluation results while ignoring the process; moreover, the determination of evaluation indicators lacks objective evidence. A novel multidimensional safety culture evaluation model, which is scientific and complete, is addressed by building a preliminary mapping between safety culture and the process areas and generic practices of SSE-CMM (Systems Security Engineering Capability Maturity Model). The model focuses on evaluating the enterprise's system security engineering process and provides new ideas and scientific evidence for the study of safety culture. (authors)

  3. Presenting an Evaluation Model for the Cancer Registry Software.

    Science.gov (United States)

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer incidence grows, the cancer registry is of great importance as the core of cancer control programs, and many different software packages have been designed for this purpose. It is therefore essential to establish a comprehensive evaluation model to evaluate and compare a wide range of such software. In this study, the criteria for cancer registry software were determined by studying the documentation and two functional software packages in this field. The evaluation tool was a checklist, and to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the validation results, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, once the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises the tool and method of evaluation. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. Based on the findings, a criteria-based evaluation method was chosen. The model of this study encompasses the various dimensions of cancer registry software and a proper method for evaluating them. The strong point of this evaluation model is the separation of general criteria from specific ones while trying to keep the criteria comprehensive. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.

  4. An Evaluation Model of Digital Educational Resources

    Directory of Open Access Journals (Sweden)

    Abderrahim El Mhouti

    2013-05-01

    Full Text Available Today, the use of digital educational resources in teaching and learning is expanding considerably. Such expansion calls on educators and computer scientists to reflect more on the design of such products. This reflection exposes a number of criteria and recommendations that can guide and direct the design of any teaching tool, be it campus-based or online (e-learning). Our work is at the heart of this issue. In this article, we examine academic, pedagogical, didactic and technical criteria to evaluate the quality of digital educational resources. Our approach consists of addressing the specific and relevant factors of each evaluation criterion. We then explain the detailed structure of the evaluation instrument used: the "evaluation grid". Finally, we show the evaluation outcomes based on the grid and establish an analytical evaluation of the state of the art of digital educational resources.

  5. Modeling a support system for the evaluator

    International Nuclear Information System (INIS)

    Lozano Lima, B.; Ilizastegui Perez, F.; Barnet Izquierdo, B.

    1998-01-01

    This work gives evaluators a tool to make their review of operational limits and conditions more sound. The system establishes the most adequate method for carrying out the evaluation, as well as for evaluating the basis of technical operational specifications. It also includes the preparation of alternative questions to be supplied to the operating entity to support it in decision-making activities.

  6. Statistical models of shape optimisation and evaluation

    CERN Document Server

    Davies, Rhodri; Taylor, Chris

    2014-01-01

    Deformable shape models have wide application in computer vision and biomedical image analysis. This book addresses a key issue in shape modelling: establishment of a meaningful correspondence between a set of shapes. Full implementation details are provided.

  7. Evaluation of EOR Processes Using Network Models

    DEFF Research Database (Denmark)

    Winter, Anatol; Larsen, Jens Kjell; Krogsbøll, Anette

    1998-01-01

    The report consists of the following parts: 1) Studies of wetting properties of model fluids and fluid mixtures aimed at an optimal selection of candidates for micromodel experiments. 2) Experimental studies of multiphase transport properties using physical models of porous networks (micromodels), including estimation of their "petrophysical" properties (e.g. absolute permeability). 3) Mathematical modelling and computer studies of multiphase transport through pore space using mathematical network models. 4) Investigation of the link between pore-scale and macroscopic recovery mechanisms.

  8. The Relevance of the CIPP Evaluation Model for Educational Accountability.

    Science.gov (United States)

    Stufflebeam, Daniel L.

    The CIPP Evaluation Model was originally developed to provide timely information in a systematic way for decision making, which is a proactive application of evaluation. This article examines whether the CIPP model also serves the retroactive purpose of providing information for accountability. Specifically, can the CIPP Model adequately assist…

  9. The Use of AMET and Automated Scripts for Model Evaluation

    Science.gov (United States)

    The Atmospheric Model Evaluation Tool (AMET) is a suite of software designed to facilitate the analysis and evaluation of meteorological and air quality models. AMET matches the model output for particular locations to the corresponding observed values from one or more networks ...

  10. SIMPLEBOX: a generic multimedia fate evaluation model

    NARCIS (Netherlands)

    van de Meent D

    1993-01-01

    This document describes the technical details of the multimedia fate model SimpleBox, version 1.0 (930801). SimpleBox is a multimedia box model of what is commonly referred to as a "Mackay-type" model; it assumes spatially homogeneous environmental compartments (air, water, suspended

  11. Educational game models: conceptualization and evaluation ...

    African Journals Online (AJOL)

    The relationship between educational theories, game design and game development are used to develop models for the creation of complex learning environments. The Game Object Model (GOM), that marries educational theory and game design, forms the basis for the development of the Persona Outlining Model (POM) ...

  12. Issues in Value-at-Risk Modeling and Evaluation

    NARCIS (Netherlands)

    J. Daníelsson (Jón); C.G. de Vries (Casper); B.N. Jorgensen (Bjørn); P.F. Christoffersen (Peter); F.X. Diebold (Francis); T. Schuermann (Til); J.A. Lopez (Jose); B. Hirtle (Beverly)

    1998-01-01

    Discusses the issues in value-at-risk modeling and evaluation: the value of value at risk; horizon problems and extreme events in financial risk management; and methods of evaluating value-at-risk estimates.

  13. A simple model for straggling evaluation

    CERN Document Server

    Wilson, J W; Tai, H; Tripathi, R K

    2002-01-01

    Simple straggling models had largely been abandoned in favor of Monte Carlo simulations of straggling which are accurate but time consuming, limiting their application in practice. The difficulty of simple analytic models is the failure to give accurate values past 85% of the particle range. A simple model is derived herein based on a second order approximation upon which rapid analysis tools are developed for improved understanding of material charged particle transmission properties.

  14. A Descriptive Evaluation of Software Sizing Models

    Science.gov (United States)

    1987-09-01

    compensate for a lack of understanding of a software job to be done. 1.3 REPORT OUTLINE The guiding principle for model selection for this paper was ... MODEL SIZE ESTIMATES FOR THE CAiSS SENSITIVITY MODEL (MODEL: SLOC): ESD 37,600+; SPQR 35,910; BYL 22,402; PRICE SZ 21,410; ASSET-R 11,943; SSM 11,700.

  15. A Regional Climate Model Evaluation System

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop a packaged data management infrastructure for the comparison of generated climate model output to existing observational datasets that includes capabilities...

  16. A Regional Climate Model Evaluation System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop a packaged data management infrastructure for the comparison of generated climate model output to existing observational datasets that includes capabilities...

  17. QUALITY OF AN ACADEMIC STUDY PROGRAMME - EVALUATION MODEL

    Directory of Open Access Journals (Sweden)

    Mirna Macur

    2016-01-01

    Full Text Available The quality of an academic study programme is evaluated by many: by employees (internal evaluation) and by external evaluators: experts, agencies and organisations. Internal and external evaluation of an academic programme follow a written structure that resembles one of the quality models. We believe the quality models (mostly derived from the EFQM excellence model) don't fit very well to non-profit activities, policies and programmes, because these are much more complex than the environment from which the quality models derive (for example, the assembly line). The quality of an academic study programme is very complex and understood differently by various stakeholders, so we present dimensional evaluation in this article. Dimensional evaluation, as opposed to component and holistic evaluation, is a form of analytical evaluation in which the quality or value of the evaluand is determined by looking at its performance on multiple dimensions of merit or evaluation criteria. First, the stakeholders of a study programme and their views, expectations and interests are presented, followed by the evaluation criteria. Both are then joined into an evaluation model revealing which evaluation criteria can and should be evaluated by which stakeholder. Main research questions are posed and a research method for each dimension is listed.

  18. Evaluation of models for assessing Medicago sativa L. hay quality

    African Journals Online (AJOL)

    UFS Campus

    ... model of Weiss et al. (1992), using lignin to determine truly digestible NDF, ... quality evaluation model for commercial application. ... The almost perfect relationship (r = 0.98; Table 1) between TDNlig of lucerne hay and MY, predicted

  19. iFlorida model deployment final evaluation report

    Science.gov (United States)

    2009-01-01

    This document is the final report for the evaluation of the USDOT-sponsored Surface Transportation Security and Reliability Information System Model Deployment, or iFlorida Model Deployment. This report discusses findings in the following areas: ITS ...

  20. evaluation of models for assessing groundwater vulnerability

    African Journals Online (AJOL)

    DR. AMINU

    applied models for groundwater vulnerability assessment mapping. ... of other models have not been applied to ground water studies in Nigeria, unlike other parts of ... [Table residue: soil media ratings: Clay Loam, 3; Muck, 2; Nonshrinking and nonaggregated clay, 1 (Aller et al., 1987). Table 2: Assigned weights for DRASTIC parameters.]

  1. Modeling, simulation and performance evaluation of parabolic ...

    African Journals Online (AJOL)

    A model of a parabolic trough power plant, taking into consideration the different losses associated with collection of the solar irradiance as well as thermal losses, is presented. MATLAB software is employed to model the power plant at reference state points. The code is then used to find the different reference values which are ...

  2. Evaluating Energy Efficiency Policies with Energy-Economy Models

    Energy Technology Data Exchange (ETDEWEB)

    Mundaca, Luis; Neij, Lena; Worrell, Ernst; McNeil, Michael A.

    2010-08-01

    The growing complexities of energy systems, environmental problems and technology markets are driving and testing most energy-economy models to their limits. To further advance bottom-up models from a multidisciplinary energy efficiency policy evaluation perspective, we review and critically analyse bottom-up energy-economy models and corresponding evaluation studies on energy efficiency policies to induce technological change. We use the household sector as a case study. Our analysis focuses on decision frameworks for technology choice, type of evaluation being carried out, treatment of market and behavioural failures, evaluated policy instruments, and key determinants used to mimic policy instruments. Although the review confirms criticism related to energy-economy models (e.g. unrealistic representation of decision-making by consumers when choosing technologies), they provide valuable guidance for policy evaluation related to energy efficiency. Different areas to further advance models remain open, particularly related to modelling issues, techno-economic and environmental aspects, behavioural determinants, and policy considerations.

  3. Metrics for evaluating performance and uncertainty of Bayesian network models

    Science.gov (United States)

    Bruce G. Marcot

    2012-01-01

    This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...

  4. Marketing evaluation model of the territorial image

    Directory of Open Access Journals (Sweden)

    Bacherikova M. L.

    2017-08-01

    Full Text Available This article analyzes the existing models for assessing the image of a territory and concludes that it is necessary to develop a model that makes it possible to assess the image of a territory taking into account all the main target audiences. The study of the models of territorial image considered in the scientific literature was carried out by the method of traditional (non-formalized) analysis of documents on the basis of scientific publications by Russian and foreign authors. The author suggests using the «ideal point» model to assess the image of a territory. The assessment of the image of a territory should be carried out for all groups of consumers, using weight coefficients that reflect the importance of each group's opinion and the number of respondents in each group.

  5. Hydrologic Evaluation of Landfill Performance (HELP) Model

    Science.gov (United States)

    The program models rainfall, runoff, infiltration, and other water pathways to estimate how much water builds up above each landfill liner. It can incorporate data on vegetation, soil types, geosynthetic materials, initial moisture conditions, slopes, etc.

  6. Evaluation model development for sprinkler irrigation uniformity ...

    African Journals Online (AJOL)

    A new evaluation method with accompanying software was developed to precisely calculate uniformity from catch-can test data, assuming sprinkler distribution data to be a continuous variable. Two interpolation steps are required to compute unknown water application depths at grid distribution points from radial ...

  7. Model for Evaluating Teacher and Trainer Competences

    Science.gov (United States)

    Carioca, Vito; Rodrigues, Clara; Saude, Sandra; Kokosowski, Alain; Harich, Katja; Sau-Ek, Kristiina; Georgogianni, Nicole; Levy, Samuel; Speer, Sandra; Pugh, Terence

    2009-01-01

    A lack of common criteria for comparing education and training systems makes it difficult to recognise qualifications and competences acquired in different environments and levels of training. A valid basis for defining a framework for evaluating professional performance in European educational and training contexts must therefore be established.…

  8. Determining Evapotranspiration with the Eddy Covariance Method: Fast-Response Dry- and Wet-Bulb Thermocouples for Humidity Measurements Can Provide a Cheap Alternative to Infrared Hygrometers.

    Science.gov (United States)

    Holwerda, F.; Alvarado-Barrientos, M. S.

    2014-12-01

    Field data on evapotranspiration are of crucial importance for ecohydrological and hydrometeorological studies in the tropics. Probably the most direct way to measure evapotranspiration is with the eddy covariance method, in which the latent heat flux (λE) is calculated from turbulent fluctuations of vertical wind velocity and humidity. The humidity fluctuations are typically measured with some type of fast-response infrared hygrometer. However, these sensors are expensive, which can be problematic if research budgets are limited. Turbulent fluctuations of humidity can also be measured with fast-response dry- and wet-bulb thermocouples, which can be constructed easily and at a fraction of the price of infrared sensors. The idea of using dry- and wet-bulb thermocouples for measuring λE with the eddy covariance method is not new, but hasn't been tested recently, possibly because experiments in the late seventies showed that this approach is not without problems due to the slow response of the wet-bulb thermocouple. In the present study, values of λE derived from dry- and wet-bulb thermocouple measurements were compared with those obtained using a fast-response KH20 hygrometer. Measurements were made above a shaded coffee plantation and a sugarcane crop in central Veracruz, Mexico. The agreement between λE obtained with the thermocouples (y) and the hygrometer (x) was very good for both vegetation covers: y = 0.98x + 5.0 (W m-2), r2 = 0.93 (coffee plantation); y = 0.99x - 13.3 (W m-2), r2 = 0.88 (sugarcane). However, the correction factor (CF) for high frequency loss in the wet-bulb temperature signal was considerably higher for the low-statured sugarcane crop (CF = 1.33) as compared to the taller shaded coffee plantation (CF = 1.09). Nevertheless, as long as care is taken in the derivation of this correction factor, reliable λE data can be obtained using the dry- and wet-bulb thermocouples, offering a cheap alternative to infrared hygrometers.
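The core computation the abstract describes, whichever humidity sensor is used, is the covariance of vertical-wind and humidity fluctuations. A minimal sketch in Python/NumPy on synthetic 10 Hz data (the series, the constant Lv, and the signal magnitudes are illustrative assumptions; coordinate rotation, density corrections, and the wet-bulb frequency-response correction factor CF discussed above are all omitted):

```python
import numpy as np

LV = 2.45e6  # latent heat of vaporization of water, J/kg (near 20 degC)

def latent_heat_flux(w, rho_v):
    """Eddy-covariance latent heat flux (W m-2).

    w     : vertical wind velocity samples, m/s
    rho_v : water vapour density samples, kg/m3
    lambda-E = Lv * mean(w' * rho_v'), primes denoting fluctuations
    about the averaging-period mean.
    """
    w = np.asarray(w, dtype=float)
    rho_v = np.asarray(rho_v, dtype=float)
    w_prime = w - w.mean()
    q_prime = rho_v - rho_v.mean()
    return LV * np.mean(w_prime * q_prime)

# Synthetic 10 Hz series: humidity fluctuations partly correlated with w
rng = np.random.default_rng(0)
n = 18000  # 30 min at 10 Hz
w = rng.normal(0.0, 0.3, n)                        # m/s
rho_v = 0.010 + 2e-4 * w + rng.normal(0, 1e-4, n)  # kg/m3

le = latent_heat_flux(w, rho_v)
```

With the synthetic correlation chosen here the flux comes out on the order of tens of W m-2, the right magnitude for the daytime fluxes the abstract compares.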

  9. Solar Thermo-coupled Electrochemical Oxidation of Aniline in Wastewater for the Complete Mineralization Beyond an Anodic Passivation Film.

    Science.gov (United States)

    Yuan, Dandan; Tian, Lei; Li, Zhida; Jiang, Hong; Yan, Chao; Dong, Jing; Wu, Hongjun; Wang, Baohui

    2018-02-15

    Herein, we report solar thermal electrochemical process (STEP) oxidation of aniline in wastewater, resolving the two key obstacles in electrochemical treatment: the huge energy consumption and the passivation film. The process, driven entirely by solar energy without input of any other energy, serves as a sustainable and efficient thermo-electrochemical oxidation of aniline through coordinated control of thermochemistry and electrochemistry. The thermo-coupled electrochemical oxidation of aniline achieved a fast rate and high efficiency for the complete mineralization of aniline to CO₂, with a stable electrode and without formation of a polyaniline (PAN) passivation film. A clear mechanism of aniline oxidation indicated a switching of the reactive pathway by the STEP process. Owing to the coupling of solar thermochemistry and electrochemistry, the electrochemical current remained stable, significantly improving the oxidation efficiency and mineralization rate by lowering the electrolytic potential at elevated temperature. The oxidation rate of aniline and the chemical oxygen demand (COD) removal rate were increased by factors of up to 2.03 and 2.47, respectively, compared to conventional electrolysis. We demonstrate that solar-driven STEP processes are capable of completely mineralizing aniline with high utilization of solar energy. STEP aniline oxidation can serve as a green, sustainable water treatment.

  10. Simulation of electric power conservation strategies: model of economic evaluation

    International Nuclear Information System (INIS)

    Pinhel, A.C.C.

    1992-01-01

    A methodology for the economic evaluation model for energy conservation programs to be executed by the National Program of Electric Power Conservation is presented. From data such as forecasts of conserved energy, tariffs, energy costs and budget, the model calculates economic indexes for the programs, allowing the evaluation of economic impacts in the electric sector. (C.G.C.)

  11. Model evaluation and optimisation of nutrient removal potential for ...

    African Journals Online (AJOL)

    Performance of sequencing batch reactors for simultaneous nitrogen and phosphorus removal is evaluated by means of model simulation, using the activated sludge model, ASM2d, involving anoxic phosphorus uptake, recently proposed by the IAWQ Task group. The evaluation includes all major process configurations ...

  12. The Use of AMET & Automated Scripts for Model Evaluation

    Science.gov (United States)

    Brief overview of EPA’s new CMAQ website to be launched publically in June, 2017. Details on the upcoming release of the Atmospheric Model Evaluation Tool (AMET) and the creation of automated scripts for post-processing and evaluating air quality model data.

  13. Rhode Island Model Evaluation & Support System: Building Administrator. Edition III

    Science.gov (United States)

    Rhode Island Department of Education, 2015

    2015-01-01

    Rhode Island educators believe that implementing a fair, accurate, and meaningful educator evaluation and support system will help improve teaching, learning, and school leadership. The primary purpose of the Rhode Island Model Building Administrator Evaluation and Support System (Rhode Island Model) is to help all building administrators improve.…

  14. p-values for model evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Beaujean, Frederik; Caldwell, Allen [Max-Planck-Institut fuer Physik, Muenchen (Germany); Kollar, Daniel [CERN, Genf (Switzerland); Kroeninger, Kevin [Georg-August-Universitaet, Goettingen (Germany)

    2011-07-01

    In the analysis of experimental results it is often necessary to pass a judgment on the validity of a model as a representation of the data. A quantitative procedure to decide whether a model provides a good description of data is often based on a specific test statistic and a p-value summarizing both the data and the statistic's sampling distribution. Although there is considerable confusion concerning the meaning of p-values, leading to their misuse, they are nevertheless of practical importance in common data analysis tasks. We motivate the application of p-values using a Bayesian argumentation. We then describe commonly and less commonly known test statistics and how they are used to define p-values. The distributions of these are then extracted for examples modeled on typical new physics searches in high energy physics. We comment on their usefulness for determining goodness-of-fit and highlight some common pitfalls.

  15. Systematic evaluation of atmospheric chemistry-transport model CHIMERE

    Science.gov (United States)

    Khvorostyanov, Dmitry; Menut, Laurent; Mailler, Sylvain; Siour, Guillaume; Couvidat, Florian; Bessagnet, Bertrand; Turquety, Solene

    2017-04-01

    Regional-scale atmospheric chemistry-transport models (CTM) are used to develop air quality regulatory measures, to support environmentally sensitive decisions in industry, and to address a variety of scientific questions involving atmospheric composition. Model performance evaluation with measurement data is critical to understand their limits and the degree of confidence in model results. CHIMERE CTM (http://www.lmd.polytechnique.fr/chimere/) is a French national tool for operational forecast and decision support and is widely used in the international research community in various areas of atmospheric chemistry and physics, climate, and environment (http://www.lmd.polytechnique.fr/chimere/CW-articles.php). This work presents the model evaluation framework applied systematically to the new CHIMERE CTM versions in the course of the continuous model development. The framework uses three of the four CTM evaluation types identified by the Environmental Protection Agency (EPA) and the American Meteorological Society (AMS): operational, diagnostic, and dynamic. It makes it possible to compare the overall model performance of subsequent model versions (operational evaluation), to identify specific processes and/or model inputs that could be improved (diagnostic evaluation), and to test the model's sensitivity to changes in air quality drivers, such as emission reductions and meteorological events (dynamic evaluation). The observation datasets currently used for the evaluation are: EMEP (surface concentrations), AERONET (optical depths), and WOUDC (ozone sounding profiles). The framework is implemented as an automated processing chain and allows interactive exploration of the results via a web interface.

  16. p-values for model evaluation

    International Nuclear Information System (INIS)

    Beaujean, F.; Caldwell, A.; Kollar, D.; Kroeninger, K.

    2011-01-01

    Deciding whether a model provides a good description of data is often based on a goodness-of-fit criterion summarized by a p-value. Although there is considerable confusion concerning the meaning of p-values, leading to their misuse, they are nevertheless of practical importance in common data analysis tasks. We motivate their application using a Bayesian argumentation. We then describe commonly and less commonly known discrepancy variables and how they are used to define p-values. The distributions of these are then extracted for examples modeled on typical data analysis tasks, and comments on their usefulness for determining goodness-of-fit are given.
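Both p-value records above rest on extracting the sampling distribution of a test statistic under the model; the simplest general way to do that is Monte Carlo simulation. A sketch in plain Python (the Poisson toy model, the bin counts, and the Pearson-type statistic are illustrative assumptions, not taken from the papers):

```python
import math
import random

random.seed(42)

def poisson_sample(mu):
    """Draw one Poisson(mu) variate (Knuth's method, fine for small mu)."""
    limit = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def pearson_stat(counts, mu):
    """Pearson-type discrepancy of bin counts against a Poisson mean mu."""
    return sum((c - mu) ** 2 / mu for c in counts)

def mc_pvalue(observed, mu, n_sim=5000):
    """p-value = fraction of statistics simulated under the model
    that are at least as extreme as the observed one."""
    t_obs = pearson_stat(observed, mu)
    n_bins = len(observed)
    hits = sum(
        pearson_stat([poisson_sample(mu) for _ in range(n_bins)], mu) >= t_obs
        for _ in range(n_sim)
    )
    return hits / n_sim

# Counts roughly consistent with mu = 4: the model is not rejected
p_ok = mc_pvalue([3, 5, 4, 4, 6, 2, 4, 5], 4.0)
# Counts far above mu = 4: the model is clearly rejected
p_bad = mc_pvalue([15, 18, 12, 16, 14, 17, 13, 19], 4.0)
```

The same loop works for any discrepancy variable: only `pearson_stat` needs to be swapped out.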

  17. Center for Integrated Nanotechnologies (CINT) Chemical Release Modeling Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Stirrup, Timothy Scott [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-12-20

    This evaluation documents the methodology and results of chemical release modeling for operations at Building 518, Center for Integrated Nanotechnologies (CINT) Core Facility. This evaluation is intended to supplement an update to the CINT [Standalone] Hazards Analysis (SHA). This evaluation also updates the original [Design] Hazards Analysis (DHA) completed in 2003 during the design and construction of the facility; since the original DHA, additional toxic materials have been evaluated and modeled to confirm the continued low hazard classification of the CINT facility and operations. This evaluation addresses the potential catastrophic release of the current inventory of toxic chemicals at Building 518 based on a standard query in the Chemical Information System (CIS).

  18. Evaluating Models of Human Performance: Safety-Critical Systems Applications

    Science.gov (United States)

    Feary, Michael S.

    2012-01-01

    This presentation is part of a panel discussion on Evaluating Models of Human Performance. The purpose of this panel is to discuss the increasing use of models in the world today and specifically focus on how to describe and evaluate models of human performance. My presentation will focus on discussions of generating distributions of performance, and the evaluation of different strategies for humans performing tasks with mixed-initiative (Human-Automation) systems. I will also discuss issues with how to provide Human Performance modeling data to support decisions on acceptability and tradeoffs in the design of safety-critical systems. I will conclude with challenges for the future.

  19. A Model for Telestroke Network Evaluation

    DEFF Research Database (Denmark)

    Storm, Anna; Günzel, Franziska; Theiss, Stephan

    2011-01-01

    … was developed from the third-party payer perspective. In principle, it enables telestroke networks to conduct cost-effectiveness studies, because the majority of the required data can be extracted from health insurance companies’ databases and the telestroke network itself. The model presents a basis...

  20. Evaluating a Model of Youth Physical Activity

    Science.gov (United States)

    Heitzler, Carrie D.; Lytle, Leslie A.; Erickson, Darin J.; Barr-Anderson, Daheia; Sirard, John R.; Story, Mary

    2010-01-01

    Objective: To explore the relationship between social influences, self-efficacy, enjoyment, and barriers and physical activity. Methods: Structural equation modeling examined relationships between parent and peer support, parent physical activity, individual perceptions, and objectively measured physical activity using accelerometers among a…

  1. An evaluation of uncertainties in radioecological models

    International Nuclear Information System (INIS)

    Hoffmann, F.O.; Little, C.A.; Miller, C.W.; Dunning, D.E. Jr.; Rupp, E.M.; Shor, R.W.; Schaeffer, D.L.; Baes, C.F. III

    1978-01-01

    The paper presents results of analyses for seven selected parameters commonly used in environmental radiological assessment models, assuming that the available data are representative of the true distribution of parameter values and that their respective distributions are lognormal. Estimates of the most probable, median, mean, and 99th percentile for each parameter are given and compared to U.S. NRC default values. The regulatory default values are generally greater than the median values for the selected parameters, but some are associated with percentiles significantly less than the 50th. The largest uncertainties appear to be associated with aquatic bioaccumulation factors for freshwater fish: approximately one order of magnitude separates median values and values of the 99th percentile. The uncertainty is also estimated for the annual dose rate predicted by a multiplicative chain model for the transport of molecular iodine-131 via the air-pasture-cow-milk-child's thyroid pathway. The value for the 99th percentile is ten times larger than the median value of the predicted dose normalized for a given air concentration of ¹³¹I₂. About 72% of the uncertainty in this model is contributed by the dose conversion factor and the milk transfer coefficient. Considering the difficulties in obtaining a reliable quantification of the true uncertainties in model predictions, methods for taking these uncertainties into account when determining compliance with regulatory statutes are discussed. (orig./HP)
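The median/mean/99th-percentile comparison in this abstract follows directly from the lognormal assumption. A short Python sketch (the geometric mean and geometric standard deviation values are illustrative assumptions; a GSD near 2.7 reproduces the one-order-of-magnitude spread between median and 99th percentile reported for the bioaccumulation factors):

```python
import math

Z99 = 2.3263  # 99th-percentile quantile of the standard normal

def lognormal_summary(gm, gsd):
    """Median, mean and 99th percentile of a lognormal variable,
    given its geometric mean (gm) and geometric std deviation (gsd)."""
    mu = math.log(gm)        # mean of ln X
    sigma = math.log(gsd)    # std deviation of ln X
    median = gm                                # = exp(mu)
    mean = math.exp(mu + 0.5 * sigma ** 2)     # arithmetic mean
    p99 = math.exp(mu + Z99 * sigma)           # 99th percentile
    return median, mean, p99

# A bioaccumulation-factor-like parameter: GSD ~ 2.7 places the 99th
# percentile roughly an order of magnitude above the median.
median, mean, p99 = lognormal_summary(100.0, 2.7)
```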

  2. Evaluation Model of Tea Industry Information Service Quality

    OpenAIRE

    Shi, Xiaohui; Chen, Tian’en

    2015-01-01

    International audience; According to the characteristics of tea industry information services, this paper builds a service quality evaluation index system for tea industry information service quality; R-cluster analysis and multiple regression are used together to construct an evaluation model with high practicality and credibility. Experiments show that the evaluation model of information service quality has good precision, which provides guidance to a certain extent to e...

  3. A Model Management Approach for Co-Simulation Model Evaluation

    NARCIS (Netherlands)

    Zhang, X.C.; Broenink, Johannes F.; Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2011-01-01

    Simulating formal models is a common means of validating the correctness of a system design and reducing time-to-market. In most embedded control system design, multiple engineering disciplines and various domain-specific models are often involved, such as mechanical, control, software

  4. RTMOD: Real-Time MODel evaluation

    DEFF Research Database (Denmark)

    Graziani, G.; Galmarini, S.; Mikkelsen, Torben

    2000-01-01

    the RTMOD web page for detailed information on the actual release, and as soon as possible they then uploaded their predictions to the RTMOD server and could soon after start their inter-comparison analysis with other modellers. When additional forecast data arrived, already existing statistical results....... At that time, the World Wide Web was not available to all the exercise participants, and plume predictions were therefore submitted to JRC-Ispra by fax and regular mail for subsequent processing. The rapid development of the World Wide Web in the second half of the nineties, together with the experience gained...... during the ETEX exercises suggested the development of this project. RTMOD featured a web-based user-friendly interface for data submission and an interactive program module for displaying, intercomparison and analysis of the forecasts. RTMOD has focussed on model intercomparison of concentration...

  5. Standard guide for use of thermocouples in creep and stress-rupture testing to 1800°F (1000°C) in air

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2000-01-01

    1.1 This guide covers the use of ANSI thermocouple Types K, N, R, and S for creep and stress-rupture testing at temperatures up to 1800°F (1000°C) in air at one atmosphere of pressure. It does not cover the use of sheathed thermocouples. 1.2 The values stated in inch-pound units are to be regarded as the standard. The values given in parentheses are for information only. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  6. Automated expert modeling for automated student evaluation.

    Energy Technology Data Exchange (ETDEWEB)

    Abbott, Robert G.

    2006-01-01

    The 8th International Conference on Intelligent Tutoring Systems provides a leading international forum for the dissemination of original results in the design, implementation, and evaluation of intelligent tutoring systems and related areas. The conference draws researchers from a broad spectrum of disciplines ranging from artificial intelligence and cognitive science to pedagogy and educational psychology. The conference explores intelligent tutoring systems' increasing real-world impact on an increasingly global scale. Improved authoring tools and learning object standards enable fielding systems and curricula in real-world settings on an unprecedented scale. Researchers deploy ITS's in ever larger studies and increasingly use data from real students, tasks, and settings to guide new research. With high volumes of student interaction data, data mining, and machine learning, tutoring systems can learn from experience and improve their teaching performance. The increasing number of realistic evaluation studies also broadens researchers' knowledge about the educational contexts for which ITS's are best suited. At the same time, researchers explore how to expand and improve ITS/student communications, for example, how to achieve more flexible and responsive discourse with students, help students integrate Web resources into learning, use mobile technologies and games to enhance student motivation and learning, and address multicultural perspectives.

  7. Local fit evaluation of structural equation models using graphical criteria.

    Science.gov (United States)

    Thoemmes, Felix; Rosseel, Yves; Textor, Johannes

    2018-03-01

    Evaluation of model fit is critically important for every structural equation model (SEM), and sophisticated methods have been developed for this task. Among them are the χ² goodness-of-fit test, decomposition of the χ², derived measures like the popular root mean square error of approximation (RMSEA) or comparative fit index (CFI), or inspection of residuals or modification indices. Many of these methods provide a global approach to model fit evaluation: A single index is computed that quantifies the fit of the entire SEM to the data. In contrast, graphical criteria like d-separation or trek-separation allow derivation of implications that can be used for local fit evaluation, an approach that is hardly ever applied. We provide an overview of local fit evaluation from the viewpoint of SEM practitioners. In the presence of model misfit, local fit evaluation can potentially help in pinpointing where the problem with the model lies. For models that do fit the data, local tests can identify the parts of the model that are corroborated by the data. Local tests can also be conducted before a model is fitted at all, and they can be used even for models that are globally underidentified. We discuss appropriate statistical local tests, and provide applied examples. We also present novel software in R that automates this type of local fit evaluation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  8. SET-MM – A Software Evaluation Technology Maturity Model

    OpenAIRE

    García-Castro, Raúl

    2011-01-01

    The application of software evaluation technologies in different research fields to verify and validate research is a key factor in the progressive evolution of those fields. Nowadays, however, to have a clear picture of the maturity of the technologies used in evaluations or to know which steps to follow in order to improve the maturity of such technologies is not easy. This paper describes a Software Evaluation Technology Maturity Model that can be used to assess software evaluation tech...

  9. Resampling methods for evaluating classification accuracy of wildlife habitat models

    Science.gov (United States)

    Verbyla, David L.; Litvaitis, John A.

    1989-11-01

    Predictive models of wildlife-habitat relationships often have been developed without being tested. The apparent classification accuracy of such models can be optimistically biased and misleading. Data resampling methods exist that yield a more realistic estimate of model classification accuracy. These methods are simple and require no new sample data. We illustrate these methods (cross-validation, jackknife resampling, and bootstrap resampling) with computer simulation to demonstrate the increase in precision of the estimate. The bootstrap method is then applied to field data as a technique for model comparison. We recommend that biologists use some resampling procedure to evaluate wildlife habitat models prior to field evaluation.
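The bootstrap idea described above can be sketched as follows (Python with NumPy; a nearest-centroid classifier on synthetic two-class data stands in for a wildlife-habitat model, an assumption made purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def nearest_centroid_fit(X, y):
    """Fit a nearest-centroid classifier: one mean vector per class."""
    classes = np.unique(y)
    return classes, np.array([X[y == c].mean(axis=0) for c in classes])

def nearest_centroid_predict(model, X):
    classes, centroids = model
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return classes[d.argmin(axis=1)]

def bootstrap_accuracy(X, y, n_boot=200):
    """Mean out-of-bag accuracy over bootstrap resamples: train on a
    resample drawn with replacement, test on the left-out records."""
    n = len(y)
    accs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        oob = np.setdiff1d(np.arange(n), idx)
        if oob.size == 0:
            continue
        model = nearest_centroid_fit(X[idx], y[idx])
        accs.append(np.mean(nearest_centroid_predict(model, X[oob]) == y[oob]))
    return float(np.mean(accs))

# Two overlapping 'habitat' classes described by two habitat variables
X = np.vstack([rng.normal(0.0, 1.0, (40, 2)), rng.normal(1.5, 1.0, (40, 2))])
y = np.repeat([0, 1], 40)

model = nearest_centroid_fit(X, y)
apparent = float(np.mean(nearest_centroid_predict(model, X) == y))  # optimistic
oob_acc = bootstrap_accuracy(X, y)                                   # more realistic
```

Comparing `apparent` with `oob_acc` shows the optimistic bias the abstract warns about; jackknife and cross-validation differ only in how the train/test splits are generated.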

  10. [Evaluation on a fast weight reduction model in vitro].

    Science.gov (United States)

    Li, Songtao; Li, Ying; Wen, Ying; Sun, Changhao

    2010-03-01

    To establish a fast and effective in vitro model for screening weight-reducing drugs and to make a preliminary evaluation of the model. Mature adipocytes of SD rats induced by oleic acid were used to establish an obesity model in vitro. Isoprel, genistein and caffeine were selected as positive agents and curcumine as a negative agent to evaluate the model. Lipolysis of adipocytes was stimulated significantly by isoprel, genistein and caffeine, but not by curcumine. This model can be used efficiently for screening weight-reducing drugs.

  11. Teachers' Development Model to Authentic Assessment by Empowerment Evaluation Approach

    Science.gov (United States)

    Charoenchai, Charin; Phuseeorn, Songsak; Phengsawat, Waro

    2015-01-01

    The purposes of this study were: 1) to study teachers' authentic assessment, teachers' comprehension of authentic assessment, and teachers' needs for authentic assessment development; 2) to create a teachers' development model; 3) to experiment with the teachers' development model; 4) to evaluate the effectiveness of the teachers' development model. The research is divided into 4…

  12. Evaluation of habitat suitability index models for assessing biotic resources

    Science.gov (United States)

    John C. Rennie; Joseph D. Clark; James M. Sweeney

    2000-01-01

    Existing habitat suitability index (HSI) models are evaluated for assessing the biotic resources on Champion International Corporation (CIC) lands with data from a standard and an expanded timber inventory. Forty HSI models for 34 species that occur in the Southern Appalachians have been identified from the literature. All of the variables for 14 models are provided (...

  13. EcoMark: Evaluating Models of Vehicular Environmental Impact

    DEFF Research Database (Denmark)

    Guo, Chenjuan; Ma, Mike; Yang, Bin

    2012-01-01

    the vehicle travels in. We develop an evaluation framework, called EcoMark, for such environmental impact models. In addition, we survey all eleven state-of-the-art impact models known to us. To gain insight into the capabilities of the models and to understand the effectiveness of the EcoMark, we apply...

  14. Evaluating the double Poisson generalized linear model.

    Science.gov (United States)

    Zou, Yaotian; Geedipally, Srinivas Reddy; Lord, Dominique

    2013-10-01

    The objectives of this study are to: (1) examine the applicability of the double Poisson (DP) generalized linear model (GLM) for analyzing motor vehicle crash data characterized by over- and under-dispersion and (2) compare the performance of the DP GLM with the Conway-Maxwell-Poisson (COM-Poisson) GLM in terms of goodness-of-fit and theoretical soundness. The DP distribution has seldom been investigated and applied since its first introduction two decades ago. The hurdle for applying the DP is related to its normalizing constant (or multiplicative constant) which is not available in closed form. This study proposed a new method to approximate the normalizing constant of the DP with high accuracy and reliability. The DP GLM and COM-Poisson GLM were developed using two observed over-dispersed datasets and one observed under-dispersed dataset. The modeling results indicate that the DP GLM with its normalizing constant approximated by the new method can handle crash data characterized by over- and under-dispersion. Its performance is comparable to the COM-Poisson GLM in terms of goodness-of-fit (GOF), although COM-Poisson GLM provides a slightly better fit. For the over-dispersed data, the DP GLM performs similar to the NB GLM. Considering the fact that the DP GLM can be easily estimated with inexpensive computation and that it is simpler to interpret coefficients, it offers a flexible and efficient alternative for researchers to model count data. Copyright © 2013 Elsevier Ltd. All rights reserved.
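The normalizing constant that the abstract identifies as the main hurdle can also be approximated by brute-force truncated summation of Efron's (1986) double-Poisson kernel. Note this generic summation is a sketch and is not the (undetailed) approximation method the study itself proposes:

```python
import math

def dp_log_kernel(y, mu, theta):
    """log of Efron's (1986) unnormalized double-Poisson pmf term."""
    if y == 0:
        return 0.5 * math.log(theta) - theta * mu
    return (0.5 * math.log(theta) - theta * mu
            - y + y * math.log(y) - math.lgamma(y + 1)
            + theta * y * (1.0 + math.log(mu) - math.log(y)))

def dp_normalizing_constant(mu, theta, y_max=1000):
    """Approximate c(mu, theta) so that the pmf sums to one, by
    truncated summation (the constant has no closed form)."""
    total = sum(math.exp(dp_log_kernel(y, mu, theta)) for y in range(y_max + 1))
    return 1.0 / total

# theta = 1 recovers the ordinary Poisson, whose pmf already sums to 1,
# so the constant should be ~1; theta < 1 (over-dispersion) shifts it
# slightly below 1.
c_poisson = dp_normalizing_constant(5.0, 1.0)
c_over = dp_normalizing_constant(5.0, 0.5)
```

Working in log space via `math.lgamma` keeps the terms stable even for large counts; the truncation point `y_max` only needs to be well past the bulk of the distribution.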

  15. Design Concept Evaluation Using System Throughput Model

    International Nuclear Information System (INIS)

    Sequeira, G.; Nutt, W. M.

    2004-01-01

    The U.S. Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) is currently developing the technical bases to support the submittal of a license application for construction of a geologic repository at Yucca Mountain, Nevada to the U.S. Nuclear Regulatory Commission. The Office of Repository Development (ORD) is responsible for developing the design of the proposed repository surface facilities for the handling of spent nuclear fuel and high level nuclear waste. Preliminary design activities are underway to sufficiently develop the repository surface facilities design for inclusion in the license application. The design continues to evolve to meet mission needs and to satisfy both regulatory and program requirements. A system engineering approach is being used in the design process since the proposed repository facilities are dynamically linked by a series of sub-systems and complex operations. In addition, the proposed repository facility is a major system element of the overall waste management process being developed by the OCRWM. Such an approach includes iterative probabilistic dynamic simulation as an integral part of the design evolution process. A dynamic simulation tool helps to determine if: (1) the mission and design requirements are complete, robust, and well integrated; (2) the design solutions under development meet the design requirements and mission goals; (3) opportunities exist where the system can be improved and/or optimized; and (4) proposed changes to the mission, and design requirements have a positive or negative impact on overall system performance and if design changes may be necessary to satisfy these changes. This paper will discuss the type of simulation employed to model the waste handling operations. It will then discuss the process being used to develop the Yucca Mountain surface facilities model. The latest simulation model and the results of the simulation and how the data were used in the design

  16. ECOPATH: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Bergstroem, U.; Nordlinder, S.

    1996-01-01

    The model is based upon compartment theory and is run in combination with a statistical error propagation method (PRISM, Gardner et al. 1983). It is intended to be generic, for application to other sites by simply changing parameter values. It was constructed especially for this scenario. However, it is based upon an earlier model for calculating relations between released amounts of radioactivity and doses to critical groups (used for Swedish regulations concerning annual reports of radioactivity released during routine operation of Swedish nuclear power plants (Bergstroem and Nordlinder, 1991)). The model handles exposure from deposition on terrestrial areas as well as deposition on lakes, starting from deposition values. 14 refs, 16 figs, 7 tabs

  17. A random walk model to evaluate autism

    Science.gov (United States)

    Moura, T. R. S.; Fulco, U. L.; Albuquerque, E. L.

    2018-02-01

    A common test administered during neurological examination in children is the analysis of their social communication and interaction across multiple contexts, including repetitive patterns of behavior. Poor performance may be associated with neurological conditions characterized by impairments in executive function, such as the so-called pervasive developmental disorders (PDDs), a particular condition of the autism spectrum disorders (ASDs). Inspired by these diagnostic tools, mainly those related to repetitive movements and behaviors, we studied how the diffusion regimes of two discrete-time random walkers, mimicking the lack of social interaction and the restricted interests developed by children with PDDs, are affected. Our model, which is based on the so-called elephant random walk (ERW) approach, considers that one of the random walkers can learn and imitate the microscopic behavior of the other with probability f (1 - f otherwise). The diffusion regimes, measured by the Hurst exponent (H), are then obtained, whose changes may indicate a different degree of autism.
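    The imitation mechanism lends itself to a short simulation. The sketch below is one possible reading of the coupled walk, with a memory parameter p for repeating a recalled step and the coupling probability f; the exact update rule of the paper may differ.

```python
import random

def coupled_erw(n_steps, f, p=0.75, seed=1):
    """Two memory-endowed walkers in the spirit of the elephant random walk
    (ERW): at each step a walker recalls a uniformly random past step and,
    with probability p, repeats it (reverses it otherwise). With probability
    f, walker B recalls from walker A's history (imitation) instead of its
    own. The coupling rule and parameters are illustrative assumptions."""
    rng = random.Random(seed)
    hist_a = [1]                          # first step of walker A
    hist_b = [rng.choice((-1, 1))]        # first step of walker B
    for _ in range(n_steps - 1):
        s = rng.choice(hist_a)            # walker A: plain ERW on own history
        hist_a.append(s if rng.random() < p else -s)
        src = hist_a if rng.random() < f else hist_b
        s = rng.choice(src)               # walker B: imitates A w.p. f
        hist_b.append(s if rng.random() < p else -s)
    return sum(hist_a), sum(hist_b)       # final positions

print(coupled_erw(1000, f=0.5))
```

    Estimating the Hurst exponent would then require averaging the mean-square displacement over many realizations and fitting its growth with time.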

  18. Evaluations of an Experiential Gaming Model

    Directory of Open Access Journals (Sweden)

    Kristian Kiili

    2006-01-01

    Full Text Available This paper examines the experiences of players of a problem-solving game. The main purpose of the paper is to validate the flow antecedents included in an experiential gaming model and to study their influence on the flow experience. Additionally, the study aims to operationalize the flow construct in a game context and to start a scale development process for assessing the experience of flow in game settings. Results indicated that the flow antecedents studied—challenges matched to a player’s skill level, clear goals, unambiguous feedback, a sense of control, and playability—should be considered in game design because they contribute to the flow experience. Furthermore, the indicators of the actual flow experience were distinguished.

  19. Large scale Bayesian nuclear data evaluation with consistent model defects

    International Nuclear Information System (INIS)

    Schnabel, G

    2015-01-01

    The aim of nuclear data evaluation is the reliable determination of cross sections and related quantities of the atomic nuclei. To this end, evaluation methods are applied which combine the information of experiments with the results of model calculations. The evaluated observables with their associated uncertainties and correlations are assembled into data sets, which are required for the development of novel nuclear facilities, such as fusion reactors for energy supply and accelerator-driven systems for nuclear waste incineration. The efficiency and safety of such future facilities depend on the quality of these data sets and thus also on the reliability of the applied evaluation methods. This work investigated the performance of the majority of available evaluation methods in two scenarios. The study indicated the importance of an essential component of these methods: the frequently ignored deficiency of nuclear models. Nuclear models are usually based on approximations, so their predictions may deviate from reliable experimental data. As demonstrated in this thesis, neglecting this possibility in evaluation methods can lead to estimates of observables which are inconsistent with experimental data. Based on this finding, an extension of Bayesian evaluation methods is proposed to take the deficiency of the nuclear models into account. The deficiency is modeled as a random function in terms of a Gaussian process and combined with the model prediction. This novel formulation conserves sum rules and allows the magnitude of the model deficiency to be estimated explicitly; both features have so far been missing from available evaluation methods. Furthermore, two improvements of existing methods have been developed in the course of this thesis. The first improvement concerns methods relying on Monte Carlo sampling. A Metropolis-Hastings scheme with a specific proposal distribution is suggested, which proved to be more efficient in the studied scenarios than the

  20. Evaluation of atmospheric dispersion/consequence models supporting safety analysis

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Lazaro, M.A.; Woodard, K.

    1996-01-01

    Two DOE Working Groups have completed evaluation of accident phenomenology and consequence methodologies used to support DOE facility safety documentation. The independent evaluations each concluded that no one computer model adequately addresses all accident and atmospheric release conditions. MACCS2, MATHEW/ADPIC, TRAC RA/HA, and COSYMA are adequate for most radiological dispersion and consequence needs. ALOHA, DEGADIS, HGSYSTEM, TSCREEN, and SLAB are recommended for chemical dispersion and consequence applications. Additional work is suggested, principally in evaluation of new models, targeting certain models for continued development, training, and establishing a Web page for guidance to safety analysts

  1. FARMLAND: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Attwood, C.; Fayers, C.; Mayall, A.; Brown, J.; Simmonds, J.R.

    1996-01-01

    The FARMLAND model was originally developed for use in connection with continuous, routine releases of radionuclides, but because it has many time-dependent features it has been developed further for a single accidental release. The most recent version of FARMLAND is flexible and can be used to predict activity concentrations in food as a function of time after both accidental and routine releases of radionuclides. The effect of deposition at different times of the year can be taken into account. FARMLAND contains a suite of models which simulate radionuclide transfer through different parts of the foodchain. The models can be used in different combinations and offer the flexibility to assess a variety of radiological situations. The main foods considered are green vegetables, grain products, root vegetables, milk, meat and offal from cattle, and meat and offal from sheep. A large variety of elements can be considered although the degree of complexity with which some are modelled is greater than others; isotopes of caesium, strontium and iodine are treated in greatest detail. 22 refs, 12 figs, 10 tabs

  2. Evaluation of Cost Models and Needs & Gaps Analysis

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad

    2014-01-01

    This report ’D3.1—Evaluation of Cost Models and Needs & Gaps Analysis’ provides an analysis of existing research related to the economics of digital curation and cost & benefit modelling. It reports upon the investigation of how well current models and tools meet stakeholders’ needs for calculating ... and comparing financial information. Based on this evaluation, it aims to point out gaps that need to be bridged in order to increase the uptake of cost & benefit modelling and good practices that will enable costing and comparison of the costs of alternative scenarios—which in turn provides a starting point ... for a more efficient use of resources for digital curation. To facilitate and clarify the model evaluation, the report first outlines a basic terminology and a general description of the characteristics of cost and benefit models. The report then describes how the ten current and emerging cost and benefit...

  3. Thermo-coupled Surface Cauchy-Born Theory: An Engineering Finite Element Approach to Modeling of Nanowire Thermomechanical Response

    DEFF Research Database (Denmark)

    Esfahania, M. Nasr; Sonne, Mads Rostgaard; Hattel, J. Henri

    2016-01-01

    on Surface Cauchy-Born theory is developed, where surface energy is accounted for in the prediction of the thermomechanical response. This is achieved by using a temperature-dependent interatomic potential in the standard Cauchy-Born theory with a surface energy contribution. Simultaneous calculation...

  4. Regime-based evaluation of cloudiness in CMIP5 models

    Science.gov (United States)

    Jin, Daeho; Oreopoulos, Lazaros; Lee, Dongmin

    2017-01-01

    The concept of cloud regimes (CRs) is used to develop a framework for evaluating the cloudiness of 12 models from phase 5 of the Coupled Model Intercomparison Project (CMIP5). Reference CRs come from existing global International Satellite Cloud Climatology Project (ISCCP) weather states. The evaluation is made possible by the implementation in several CMIP5 models of the ISCCP simulator, which generates in each grid cell daily joint histograms of cloud optical thickness and cloud top pressure. Model performance is assessed with several metrics, such as CR global cloud fraction (CF), CR relative frequency of occurrence (RFO), their product [the long-term average total cloud amount (TCA)], cross-correlations of CR RFO maps, and a metric of resemblance between model and ISCCP CRs. In terms of CR global RFO, arguably the most fundamental metric, the models perform unsatisfactorily overall, except for CRs representing thick storm clouds. Because model CR CF is internally constrained by our method, RFO discrepancies also yield substantial TCA errors. Our results support previous findings that CMIP5 models underestimate cloudiness. The multi-model mean performs well in matching observed RFO maps for many CRs, but is still not the best for this or other metrics. When overall performance across all CRs is assessed, some models, despite shortcomings, apparently outperform Moderate Resolution Imaging Spectroradiometer (MODIS) cloud observations when the latter are evaluated against ISCCP as if they were another model output. Lastly, contrasting cloud simulation performance with each model's equilibrium climate sensitivity, in order to gain insight into whether good cloud simulation pairs with particular values of this parameter, yields no clear conclusions.
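    The TCA metric described above is simply the RFO-weighted sum of per-regime cloud fraction, as this small sketch with invented regime values shows:

```python
# Sketch: long-term total cloud amount (TCA) as the RFO-weighted sum of
# per-regime cloud fraction. The five regime values below are illustrative,
# not taken from ISCCP or any CMIP5 model.
rfo = [0.30, 0.25, 0.20, 0.15, 0.10]   # relative frequency of occurrence
cf  = [0.90, 0.65, 0.50, 0.35, 0.20]   # cloud fraction within each regime
tca = sum(r * c for r, c in zip(rfo, cf))
print(round(tca, 3))
```

    Because the RFO values sum to one, an error in any regime's RFO redistributes weight across regimes with different CFs, which is why RFO discrepancies translate directly into TCA errors.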

  5. Evaluating energy saving system of data centers based on AHP and fuzzy comprehensive evaluation model

    Science.gov (United States)

    Jiang, Yingni

    2018-03-01

    Because communication equipment consumes large amounts of energy, energy saving in data centers must be enforced; however, the lack of evaluation mechanisms has restrained progress on the energy-saving construction of data centers. In this paper, an energy-saving evaluation index system for data centers was constructed after clarifying the influencing factors. Based on this index system, the analytic hierarchy process was used to determine the weights of the evaluation indexes. Subsequently, a three-grade fuzzy comprehensive evaluation model was constructed to evaluate the energy-saving systems of data centers.
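    The two-stage scheme described (AHP weights feeding a fuzzy comprehensive evaluation) can be sketched as follows. The three indexes, the pairwise judgment matrix P, and the membership matrix R are hypothetical, and a full application would also check AHP consistency (CR < 0.1):

```python
import numpy as np

# AHP: the principal eigenvector of a pairwise comparison matrix gives the
# index weights (judgments below are invented for three energy-saving indexes).
P = np.array([[1,   3,   5],
              [1/3, 1,   2],
              [1/5, 1/2, 1]])
vals, vecs = np.linalg.eig(P)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()                        # normalized weight vector

# Fuzzy comprehensive evaluation: membership of each index in three grades
# (good, fair, poor); B = w @ R gives the weighted grade membership.
R = np.array([[0.6, 0.3, 0.1],
              [0.4, 0.4, 0.2],
              [0.2, 0.5, 0.3]])
B = w @ R
print(B, B.argmax())                   # highest membership picks the grade
```

    With these illustrative numbers, the first grade ("good") receives the highest membership, so that would be the overall rating.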

  6. Biology learning evaluation model in Senior High Schools

    Directory of Open Access Journals (Sweden)

    Sri Utari

    2017-06-01

    Full Text Available The study was to develop a Biology learning evaluation model in senior high schools that referred to the research and development model by Borg & Gall and the logic model. The evaluation model included the components of input, activities, output and outcomes. The development procedures involved a preliminary study in the form of observation and theoretical review regarding Biology learning evaluation in senior high schools. The product development was carried out by designing an evaluation model, designing an instrument, trying out the instrument and performing the implementation. The instrument try-out involved teachers and Grade XII students from senior high schools located in the City of Yogyakarta. For the data-gathering techniques and instruments, the researchers implemented an observation sheet, a questionnaire and a test. The questionnaire was applied in order to attain information regarding teacher performance, learning performance, classroom atmosphere and scientific attitude; on the other hand, the test was applied in order to attain information regarding Biology concept mastery. Then, for the analysis of the instrument construct, the researchers performed confirmatory factor analysis by means of the Lisrel 0.80 software, and the results of this analysis showed that the evaluation instrument was valid and reliable. The construct validity was between 0.43-0.79 while the reliability of the measurement model was between 0.88-0.94. Last but not least, the model feasibility test showed that the theoretical model was supported by the empirical data.

  7. Applying the social relations model to self and peer evaluations

    NARCIS (Netherlands)

    Greguras, G.J.; Robie, C.; Born, M.Ph.

    2001-01-01

    Peer evaluations of performance increasingly are being used to make organizational decisions and to provide individuals with performance related feedback. Using Kenny's social relations model (SRM), data from 14 teams of undergraduate students who completed performance ratings of themselves and

  8. Industrial Waste Management Evaluation Model Version 3.1

    Science.gov (United States)

    IWEM is a screening level ground water model designed to simulate contaminant fate and transport. IWEM v3.1 is the latest version of the IWEM software, which includes additional tools to evaluate the beneficial use of industrial materials

  9. Using Models of Cognition in HRI Evaluation and Design

    National Research Council Canada - National Science Library

    Goodrich, Michael A

    2004-01-01

    ...) guide the construction of experiments. In this paper, we present an information processing model of cognition that we have used extensively in designing and evaluating interfaces and autonomy modes...

  10. Evaluation of global luminous efficacy models for Florianopolis, Brazil

    Energy Technology Data Exchange (ETDEWEB)

    De Souza, Roberta G.; Pereira, Fernando O.R. [Universidade Federal de Santa Catarina, Florianopolis (Brazil). Laboratorio de Conforto Ambiental, Dpto. de Arquitetura; Robledo, Luis [Universidad Politecnica de Madrid, Madrid (Spain). E.P.E.S. Ciencias Ambientales; Soler, Alfonso [Universidad Politecnica de Madrid, Madrid (Spain). E.P.E.S. Ciencias Ambientales and Dpto. de Fisica e Instalaciones Aplicadas, E.T.S. de Arquitectura

    2006-10-15

    Several global luminous efficacy models have been tested with daylight data measured for Florianopolis, Southern Brazil. The models have been used with their original coefficients, as given by their authors, and also with local coefficients obtained when the models were optimized with the data measured in Florianopolis. The evaluation of the different models has been carried out considering three sky categories, according to a higher or lower presence of clouds. For clear sky, the models tested have been compared with a proposed polynomial model in solar altitude, obtained by the best fit of experimental points for Florianopolis. It has been shown that the model coefficients have a local character. If those models are used with local coefficients, no model works better than the others for all sky types; rather, for each sky category a different model could be recommended. (author)

  11. Model of service-oriented catering supply chain performance evaluation

    OpenAIRE

    Gou, Juanqiong; Shen, Guguan; Chai, Rui

    2013-01-01

    Purpose: The aim of this paper is constructing a performance evaluation model for service-oriented catering supply chain. Design/methodology/approach: With the research on the current situation of catering industry, this paper summarized the characters of the catering supply chain, and then presents the service-oriented catering supply chain model based on the platform of logistics and information. At last, the fuzzy AHP method is used to evaluate the performance of service-oriented catering ...

  12. A New Software Quality Model for Evaluating COTS Components

    OpenAIRE

    Adnan Rawashdeh; Bassem Matalkah

    2006-01-01

    Studies show that commercial off-the-shelf (COTS)-based systems built in recent years exceed 40% of all developed software systems. Therefore, a model that ensures the quality characteristics of such systems becomes a necessity. Among the most critical processes in COTS-based systems are the evaluation and selection of the COTS components. There are several existing quality models used to evaluate software systems in general; however, none of them is dedicated to COTS-based s...

  13. DETRA: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Suolanen, V.

    1996-01-01

    The computer code DETRA is a generic tool for environmental transfer analyses of radioactive or stable substances. The code has been applied for various purposes, mainly problems related to the biospheric transfer of radionuclides both in safety analyses of disposal of nuclear wastes and in consideration of foodchain exposure pathways in the analyses of off-site consequences of reactor accidents. For each specific application an individually tailored conceptual model can be developed. The biospheric transfer analyses performed by the code are typically carried out for terrestrial, aquatic and food chain applications. 21 refs, 35 figs, 15 tabs

  14. Ground-water transport model selection and evaluation guidelines

    International Nuclear Information System (INIS)

    Simmons, C.S.; Cole, C.R.

    1983-01-01

    Guidelines are being developed to assist potential users with selecting appropriate computer codes for ground-water contaminant transport modeling. The guidelines are meant to assist managers with selecting appropriate predictive models for evaluating either arid or humid low-level radioactive waste burial sites. Evaluation test cases in the form of analytical solutions to fundamental equations and experimental data sets have been identified and recommended to ensure adequate code selection, based on accurate simulation of relevant physical processes. The recommended evaluation procedures will consider certain technical issues related to the present limitations in transport modeling capabilities. A code-selection plan will depend on identifying problem objectives, determining the extent of collectible site-specific data, and developing a site-specific conceptual model for the involved hydrology. Code selection will be predicated on steps for developing an appropriate systems model. This paper will review the progress in developing those guidelines. 12 references

  15. A Universal Model for the Normative Evaluation of Internet Information.

    NARCIS (Netherlands)

    Spence, E.H.

    2009-01-01

    Beginning with the initial premise that as the Internet has a global character, the paper will argue that the normative evaluation of digital information on the Internet necessitates an evaluative model that is itself universal and global in character (I agree, therefore, with Gorniak-Kocikowska’s

  16. Synthesis, evaluation and molecular modelling studies of some ...

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Chemical Sciences; Volume 122; Issue 2. Synthesis, evaluation and molecular modelling studies of some novel 3-(3 ... The compounds have been characterized on the basis of elemental analysis and spectral data. All the compounds were evaluated for their HIV-1 RT inhibitory activity. Among ...

  17. LINDOZ model for Finland environment: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Galeriu, D.; Apostoaie, A.I.; Mocanu, N.; Paunescu, N.

    1996-01-01

    LINDOZ model was developed as a realistic assessment tool for radioactive contamination of the environment. It was designed to produce estimates for the concentration of the pollutant in different compartments of the terrestrial ecosystem (soil, vegetation, animal tissue, and animal products), and to evaluate human exposure to the contaminant (concentration in whole human body, and dose to humans) from inhalation, ingestion and external irradiation. The user can apply LINDOZ for both routine and accidental type of releases. 2 figs, 2 tabs

  18. Evaluating a novel resident role-modelling programme.

    Science.gov (United States)

    Sternszus, Robert; Steinert, Yvonne; Bhanji, Farhan; Andonian, Sero; Snell, Linda S

    2017-05-09

    Role modelling is a fundamental method by which students learn from residents. To our knowledge, however, resident-as-teacher curricula have not explicitly addressed resident role modelling. The purpose of this project was to design, implement and evaluate an innovative programme to teach residents about role modelling. The authors designed a resident role-modelling programme and incorporated it into the 2015 and 2016 McGill University resident-as-teacher curriculum. Influenced by experiential and social learning theories, the programme incorporated flipped-classroom and simulation approaches to teach residents to be aware and deliberate role models. Outcomes were assessed through a pre- and immediate post-programme questionnaire evaluating reaction and learning, a delayed post-programme questionnaire evaluating learning, and a retrospective pre-post questionnaire (1 month following the programme) evaluating self-reported behaviour changes. Thirty-three of 38 (87%) residents who participated in the programme completed the evaluation, with 25 residents (66%) completing all questionnaires. Participants rated the programme highly on a five-point Likert scale (where 1 = not helpful and 5 = very helpful; mean score, M = 4.57; standard deviation, SD = 0.50), and showed significant improvement in their perceptions of their importance as role models and their knowledge of deliberate role modelling. Residents also reported an increased use of deliberate role-modelling strategies 1 month after completing the programme. The incorporation of resident role modelling into our resident-as-teacher curriculum positively influenced the participants' perceptions of their role-modelling abilities. This programme responds to a gap in resident training and has the potential to guide further programme development in this important and often overlooked area. © 2017 John Wiley & Sons

  19. Faculty Performance Evaluation: The CIPP-SAPS Model.

    Science.gov (United States)

    Mitcham, Maralynne

    1981-01-01

    The issues of faculty performance evaluation for allied health professionals are addressed. Daniel Stufflebeam's CIPP (context-input-process-product) model is introduced and its development into a CIPP-SAPS (self-administrative-peer-student) model is pursued. (Author/CT)

  20. Evaluation of forest snow processes models (SnowMIP2)

    Science.gov (United States)

    Nick Rutter; Richard Essery; John Pomeroy; Nuria Altimir; Kostas Andreadis; Ian Baker; Alan Barr; Paul Bartlett; Aaron Boone; Huiping Deng; Herve Douville; Emanuel Dutra; Kelly Elder; others

    2009-01-01

    Thirty-three snowpack models of varying complexity and purpose were evaluated across a wide range of hydrometeorological and forest canopy conditions at five Northern Hemisphere locations, for up to two winter snow seasons. Modeled estimates of snow water equivalent (SWE) or depth were compared to observations at forest and open sites at each location. Precipitation...

  1. Using an ecosystem model to evaluate fisheries management ...

    African Journals Online (AJOL)

    A coral reef ecosystem simulation model, CAFFEE, was developed to evaluate the effects of fisheries management measures on coral reef ecosystem services and functioning, independently or combined with climate change impacts. As an example of the types of simulations available, we present model outputs for ...

  2. Evaluating Emulation-based Models of Distributed Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Stephen T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Gabert, Kasimir G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Tarman, Thomas D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Emulytics Initiatives

    2017-08-01

    Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The various uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  3. The fence experiment - a first evaluation of shelter models

    DEFF Research Database (Denmark)

    Peña, Alfredo; Bechmann, Andreas; Conti, Davide

    2016-01-01

    We present a preliminary evaluation of shelter models of different degrees of complexity using full-scale lidar measurements of the shelter on a vertical plane behind and orthogonal to a fence. Model results accounting for the distribution of the relative wind direction within the observed direct...

  4. Evaluation of performance of Predictive Models for Deoxynivalenol in Wheat

    NARCIS (Netherlands)

    Fels, van der H.J.

    2014-01-01

    The aim of this study was to evaluate the performance of two predictive models for deoxynivalenol contamination of wheat at harvest in the Netherlands, including the use of weather forecast data and external model validation. Data were collected in a different year and from different wheat fields

  5. Boussinesq Modeling of Wave Propagation and Runup over Fringing Coral Reefs, Model Evaluation Report

    National Research Council Canada - National Science Library

    Demirbilek, Zeki; Nwogu, Okey G

    2007-01-01

    ..., for waves propagating over fringing reefs. The model evaluation had two goals: (a) investigate differences between laboratory and field characteristics of wave transformation processes over reefs, and (b...

  6. An applied model for the evaluation of multiple physiological stressors.

    Science.gov (United States)

    Constable, S H; Sherry, C J; Walters, T J

    1991-01-01

    In everyday life, a human is likely to be exposed to the combined effects of a number of different stressors simultaneously. Consequently, if an applied model is to ultimately provide the best 'fit' between the modeling and modeled phenomena, it must be able to accommodate the evaluation of multiple stressors. Therefore, a multidimensional, primate model is described that can fully accommodate a large number of conceivably stressful, real life scenarios that may be encountered by civilian or military workers. A number of physiological measurements were made in female rhesus monkeys in order to validate the model against previous reports. These evaluations were further expanded to include the experimental perturbation of physical work (exercise). Physiological profiles during activity were extended with the incorporation of radio telemetry. In conclusion, this model allows maximal extrapolation of the potential deleterious or ergogenic effects on systemic physiological function under conditions of realistic operational demands and environments.

  7. Ergonomic evaluation model of operational room based on team performance

    Directory of Open Access Journals (Sweden)

    YANG Zhiyi

    2017-05-01

    Full Text Available A theoretical calculation model based on the ergonomic evaluation of team performance was proposed in order to carry out the ergonomic evaluation of layout design schemes for the action station in a multitasking operational room. The model was constructed to calculate and compare the theoretical value of team performance across multiple layout schemes by considering such substantial influencing factors as frequency of communication, distance, angle, importance, human cognitive characteristics and so on. An experiment was finally conducted to verify the proposed model under the criteria of completion time and accuracy rating. As illustrated by the experimental results, the proposed approach is conducive to the prediction and ergonomic evaluation of layout design schemes for the action station during early design stages, and provides a new theoretical method for the ergonomic evaluation, selection and optimization of layout design schemes.

  8. A smart growth evaluation model based on data envelopment analysis

    Science.gov (United States)

    Zhang, Xiaokun; Guan, Yongyi

    2018-04-01

    With the rapid spread of urbanization, smart growth (SG) has attracted plenty of attention all over the world. In this paper, following the establishment of an index system for smart growth, a data envelopment analysis (DEA) model was suggested to evaluate the SG level of the current growth situation in cities. In order to further improve the information on both radial and non-radial detection, we introduced the non-Archimedean infinitesimal to form the C2GS2 control model. Finally, we evaluated the SG level in Canberra and identified a series of problems, which verifies the applicability of the model and provides further information for improvement.
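    As a hedged illustration of the DEA idea: in the special case of one input and one output, CCR efficiency reduces to each unit's output/input ratio normalized by the best observed ratio (the general multi-factor model, and extensions like C2GS2, require solving a linear program per unit). The city figures below are invented:

```python
# Single-input, single-output DEA sketch: efficiency of each decision-making
# unit (here, a city) is its output/input ratio relative to the best ratio.
# All numbers are hypothetical.
inputs  = [120.0, 80.0, 150.0, 60.0]   # e.g. land or energy consumed
outputs = [300.0, 260.0, 330.0, 150.0] # e.g. a smart-growth output index
ratios = [y / x for x, y in zip(inputs, outputs)]
best = max(ratios)
eff = [r / best for r in ratios]
print([round(e, 3) for e in eff])      # 1.0 marks the efficient frontier
```

    Units scoring below 1.0 are dominated: the model tells them how far their input use would have to shrink (at current output) to reach the frontier.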

  9. Literature Review on Modeling Cyber Networks and Evaluating Cyber Risks.

    Energy Technology Data Exchange (ETDEWEB)

    Kelic, Andjelka; Campbell, Philip L

    2018-04-01

    The National Infrastructure Simulations and Analysis Center (NISAC) conducted a literature review on modeling cyber networks and evaluating cyber risks. The literature review explores where modeling is used in the cyber regime and ways that consequence and risk are evaluated. The relevant literature clusters in three different spaces: network security, cyber-physical, and mission assurance. In all approaches, some form of modeling is utilized at varying levels of detail, while the ability to understand consequence varies, as do interpretations of risk. This document summarizes the different literature viewpoints and explores their applicability to securing enterprise networks.

  10. Application of Learning Curves for Didactic Model Evaluation: Case Studies

    Directory of Open Access Journals (Sweden)

    Felix Mödritscher

    2013-01-01

    Full Text Available The success of (online) courses depends, among other factors, on the underlying didactical models, which have always been evaluated with qualitative and quantitative research methods. Several new evaluation techniques have been developed and established in recent years. One of them is ‘learning curves’, which aim at measuring the error rates of users as they interact with adaptive educational systems, thereby enabling the underlying models to be evaluated and improved. In this paper, we report how we have applied this new method to two case studies to show that learning curves are useful for evaluating didactical models and their implementation in educational platforms. Results show that the error rates follow a power-law distribution with each additional attempt if the didactical model of an instructional unit is valid. Furthermore, the initial error rate, the slope of the curve and the goodness of fit of the curve are valid indicators for the difficulty level of a course and the quality of its didactical model. In conclusion, the idea of applying learning curves to evaluate didactical models on the basis of usage data is considered valuable for supporting teachers and learning-content providers in improving their online courses.
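    The claimed power-law decay of error rates can be checked with a log-log least-squares fit, recovering the initial error rate and the slope mentioned as quality indicators. The attempt/error data below are illustrative, not taken from the case studies:

```python
import math

# Fit the power law E(n) = a * n**(-b) to error rates per attempt by
# ordinary least squares in log-log space (data are invented).
attempts = [1, 2, 3, 4, 5, 6]
error_rate = [0.42, 0.27, 0.21, 0.17, 0.15, 0.13]
xs = [math.log(n) for n in attempts]
ys = [math.log(e) for e in error_rate]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
den = sum((x - mx) ** 2 for x in xs)
slope = num / den                      # slope of the log-log regression line
a, b = math.exp(my - slope * mx), -slope
print(round(a, 3), round(b, 3))        # initial error rate and learning rate
```

    Here `a` estimates the first-attempt error rate and `b` the learning rate; a poor log-log fit (low R²) would signal that the didactical model of the unit is not working as intended.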

  11. Evaluating performances of simplified physically based landslide susceptibility models.

    Science.gov (United States)

    Capparelli, Giovanna; Formetta, Giuseppe; Versace, Pasquale

    2015-04-01

    Rainfall-induced shallow landslides cause significant damage, including loss of life and property. Predicting locations susceptible to shallow landslides is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Two main approaches are usually used to accomplish this task: statistical or physically based models. This paper presents a package of GIS-based models for landslide susceptibility analysis. It was integrated into the NewAge-JGrass hydrological model using the Object Modeling System (OMS) modeling framework. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit (GOF) indices by comparing model results and measurement data pixel by pixel. Moreover, the integration of the package in NewAge-JGrass allows the use of other components, such as geographic information system tools to manage input-output processes, and automatic calibration algorithms to estimate model parameters. The system offers the possibility to investigate and fairly compare the quality and robustness of models and model parameters, according to a procedure that includes: i) estimating model parameters by optimizing each GOF index separately, ii) evaluating the models in the ROC plane using each optimal parameter set, and iii) assessing GOF robustness through their sensitivity to input parameter variation. This procedure was repeated for all three models. The system was applied to a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, Average Index (AI) optimization coupled with model M3 is the best modeling solution for this test case. This research was funded by PON Project No. 01_01503 "Integrated Systems for Hydrogeological Risk
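    The pixel-by-pixel comparison behind GOF indices of this kind can be sketched as follows (an illustrative subset of common binary-map indices, not the paper's specific eight; the maps below are hypothetical):

```python
def gof_indices(predicted, observed):
    """Pixel-by-pixel comparison of binary maps (1 = unstable, 0 = stable)."""
    pairs = list(zip(predicted, observed))
    tp = sum(1 for p, o in pairs if p == 1 and o == 1)  # hits
    tn = sum(1 for p, o in pairs if p == 0 and o == 0)  # correct negatives
    fp = sum(1 for p, o in pairs if p == 1 and o == 0)  # false alarms
    fn = sum(1 for p, o in pairs if p == 0 and o == 1)  # misses
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "true_positive_rate": tp / (tp + fn),
        "false_positive_rate": fp / (fp + tn),
        "critical_success_index": tp / (tp + fp + fn),
    }

# Hypothetical flattened susceptibility maps, one value per pixel
predicted = [1, 1, 0, 0, 1, 0, 1, 0]
observed  = [1, 0, 0, 0, 1, 1, 1, 0]
idx = gof_indices(predicted, observed)
```

    Plotting (false_positive_rate, true_positive_rate) for each optimized parameter set gives the ROC-plane evaluation mentioned in the abstract.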

  12. Animal models for evaluation of oral delivery of biopharmaceuticals

    DEFF Research Database (Denmark)

    Harloff-Helleberg, Stine; Nielsen, Line Hagner; Nielsen, Hanne Mørck

    2017-01-01

    of systems for oral delivery of biopharmaceuticals may result in new treatment modalities to increase patient compliance and reduce product cost. In the preclinical development phase, the use of experimental animal models is essential for the evaluation of new formulation designs. In general, a limited oral...... bioavailability of biopharmaceuticals, of just a few percent, is expected, and therefore the animal models and the experimental settings must be chosen with utmost care. More knowledge of and focus on this topic is highly needed, despite experience from the numerous studies evaluating animal models for oral drug...... delivery of small-molecule drugs. This review highlights and discusses the pros and cons of the most commonly used animal models and settings. Additionally, it also looks into the influence of anesthetics and sampling methods on the evaluation of drug delivery systems for oral delivery of biopharmaceuticals......

  13. Evaluation Model of Organizational Performance for Small and Medium Enterprises

    Directory of Open Access Journals (Sweden)

    Carlos Augusto Passos

    2014-12-01

    Full Text Available In the 1980s, many tools for evaluating organizational performance were created. However, most of them are useful only to large companies and do not foster results in small and medium-sized enterprises (SMEs). In light of this fact, this article aims at proposing an Organizational Performance Assessment (OPA) model which is flexible and adaptable to the reality of SMEs, based on the theoretical framework of various models and on comparisons against three major authors’ criteria for evaluating OPA models. The research has a descriptive and exploratory character, with a qualitative nature. The MADE-O model, according to the criteria described in the bibliography, is the one that best fits the needs of SMEs, and it was used as a baseline for the model proposed in this study, with adaptations drawn from the BSC model. The proposed model, called the Overall Performance Indicator – Environment (IDG-E), has as its main differential, in addition to the basis in the models mentioned above, the assessment of the external and internal environment weighted in modules of OPA. As SMEs are characterized by having few processes and people, the small number of performance indicators is another positive aspect. Evaluated against the criteria prescribed by the authors, the model proved to be quite feasible for use in SMEs.

  14. Airline service quality evaluation: A review on concepts and models

    Directory of Open Access Journals (Sweden)

    Navid Haghighat

    2017-12-01

    Full Text Available This paper reviews the major service quality concepts and models that led to important developments in evaluating service quality, focusing on how the models were improved through discussion of the criticisms of each model. Criticisms of these models are discussed to clarify the development steps of the newer models that improved airline service quality models. Precise and accurate evaluation of service quality requires a reliable concept with comprehensive criteria and effective measurement techniques as the foundation of a valuable framework. In this paper, the improvement of service quality models is described on the basis of three major service quality concepts, developed successively: the disconfirmation, performance and hierarchical concepts. Reviewing the various criteria and different measurement techniques, such as statistical analysis and multi-criteria decision making, helps researchers gain a clear understanding of the development of the evaluation framework in the airline industry. This study aims at promoting reliable frameworks for evaluating airline service quality in different countries and societies with respect to the economic, cultural and social aspects of each society.

  15. Evaluation of Artificial Intelligence Based Models for Chemical Biodegradability Prediction

    Directory of Open Access Journals (Sweden)

    Aleksandar Sabljic

    2004-12-01

    Full Text Available This study presents a review of biodegradability modeling efforts, including a detailed assessment of two models developed using an artificial-intelligence-based methodology. Validation results for these models on an independent, quality-reviewed database demonstrate that they perform well when compared, on the same data, to another commonly used biodegradability model. The ability of models induced by an artificial intelligence methodology to accommodate complex interactions in detailed systems, and the reliability of the approach demonstrated by this study, indicate that the methodology may have application in broadening the scope of biodegradability models. Given adequate data on the biodegradability of chemicals under environmental conditions, this may allow for the development of future models that include, for example, the impact of surface interfaces on biodegradability.

  16. Evaluation of two ozone air quality modelling systems

    Directory of Open Access Journals (Sweden)

    S. Ortega

    2004-01-01

    Full Text Available The aim of this paper is to compare two different modelling systems and to evaluate their ability to simulate the high ozone concentrations of typical summer episodes in the north of Spain near the metropolitan area of Barcelona. As the focus of the paper is the comparison of the two systems, we do not attempt to improve the agreement by adjusting the emission inventory or model parameters. The first model, or forecasting system, is made up of three modules. The first module is a mesoscale model (MASS), which provides the initial conditions for the second module, a nonlocal boundary layer model based on the transilient turbulence scheme. The third module is a photochemical box model (OZIPR), which is applied in Eulerian and Lagrangian modes and receives suitable information from the two previous modules. The model forecast is evaluated against ground-based stations during summer 2001. The second model is MM5/UAM-V, a grid model designed to predict hourly three-dimensional ozone concentration fields. The model is applied to an ozone episode that occurred between 21 and 23 June 2001. Our results reflect the good performance of the two modelling systems when they are applied to a specific episode.

  17. Evaluation of potential crushed-salt constitutive models

    Energy Technology Data Exchange (ETDEWEB)

    Callahan, G.D.; Loken, M.C.; Sambeek, L.L. Van; Chen, R.; Pfeifle, T.W.; Nieland, J.D. [RE/SPEC Inc., Rapid City, SD (United States); Hansen, F.D. [Sandia National Labs., Albuquerque, NM (United States). Repository Isolation Systems Dept.

    1995-12-01

    Constitutive models describing the deformation of crushed salt are presented in this report. Ten constitutive models with potential to describe the phenomenological and micromechanical processes for crushed salt were selected from a literature search. Three of these ten constitutive models, termed the Sjaardema-Krieg, Zeuch, and Spiers models, were adopted as candidate constitutive models. The candidate constitutive models were generalized in a consistent manner to three-dimensional states of stress and modified to include the effects of temperature, grain size, and moisture content. A database including hydrostatic consolidation and shear consolidation tests conducted on Waste Isolation Pilot Plant and southeastern New Mexico salt was used to determine material parameters for the candidate constitutive models. Nonlinear least-squares model fitting to data from the hydrostatic consolidation tests, the shear consolidation tests, and a combination of the shear and hydrostatic tests produced three sets of material parameter values for the candidate models. The change in material parameter values from test group to test group indicates the empirical nature of the models. To evaluate the predictive capability of the candidate models, each parameter value set was used to predict each of the tests in the database. Based on the fitting statistics and the ability of the models to predict the test data, the Spiers model appeared to perform slightly better than the other two candidate models. The work reported here is a first-of-its-kind evaluation of constitutive models for reconsolidation of crushed salt. Questions remain to be answered. Deficiencies in the models and databases are identified and recommendations for future work are made. 85 refs.

  18. Global Gridded Crop Model Evaluation: Benchmarking, Skills, Deficiencies and Implications.

    Science.gov (United States)

    Muller, Christoph; Elliott, Joshua; Chryssanthacopoulos, James; Arneth, Almut; Balkovic, Juraj; Ciais, Philippe; Deryng, Delphine; Folberth, Christian; Glotter, Michael; Hoek, Steven

    2017-01-01

    Crop models are increasingly used to simulate crop yields at the global scale, but so far there is no general framework on how to assess model performance. Here we evaluate the simulation results of 14 global gridded crop modeling groups that have contributed historic crop yield simulations for maize, wheat, rice and soybean to the Global Gridded Crop Model Intercomparison (GGCMI) of the Agricultural Model Intercomparison and Improvement Project (AgMIP). Simulation results are compared to reference data at global, national and grid cell scales and we evaluate model performance with respect to time series correlation, spatial correlation and mean bias. We find that global gridded crop models (GGCMs) show mixed skill in reproducing time series correlations or spatial patterns at the different spatial scales. Generally, maize, wheat and soybean simulations of many GGCMs are capable of reproducing larger parts of observed temporal variability (time series correlation coefficients (r) of up to 0.888 for maize, 0.673 for wheat and 0.643 for soybean at the global scale) but rice yield variability cannot be well reproduced by most models. Yield variability can be well reproduced for most major producing countries by many GGCMs and for all countries by at least some. A comparison with gridded yield data and a statistical analysis of the effects of weather variability on yield variability shows that the ensemble of GGCMs can explain more of the yield variability than an ensemble of regression models for maize and soybean, but not for wheat and rice. We identify future research needs in global gridded crop modeling and for all individual crop modeling groups. In the absence of a purely observation-based benchmark for model evaluation, we propose that the best-performing crop model per crop and region establishes the benchmark for all others, and modelers are encouraged to investigate how crop model performance can be increased. We make our evaluation system accessible to all
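    Two of the skill measures named in this record, time series correlation and mean bias, can be sketched as follows (an illustrative implementation with hypothetical yield series, not the GGCMI evaluation code):

```python
import math

def skill_scores(simulated, observed):
    """Pearson time series correlation and mean bias of simulated vs. reference yields."""
    n = len(simulated)
    ms = sum(simulated) / n
    mo = sum(observed) / n
    cov = sum((s - ms) * (o - mo) for s, o in zip(simulated, observed))
    var_s = sum((s - ms) ** 2 for s in simulated)
    var_o = sum((o - mo) ** 2 for o in observed)
    r = cov / math.sqrt(var_s * var_o)          # time series correlation
    bias = sum(s - o for s, o in zip(simulated, observed)) / n  # mean bias
    return r, bias

# Hypothetical national yield series (t/ha), one value per year
simulated = [5.1, 4.8, 5.6, 4.2, 5.9]
observed = [5.0, 4.5, 5.5, 4.4, 6.0]
r, bias = skill_scores(simulated, observed)
```

    Applied per crop, country and grid cell, scores like these support the kind of cross-model benchmarking the abstract proposes.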

  19. An Efficient Dynamic Trust Evaluation Model for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Zhengwang Ye

    2017-01-01

    Full Text Available Trust evaluation is an effective method to detect malicious nodes and ensure security in wireless sensor networks (WSNs). In this paper, an efficient dynamic trust evaluation model (DTEM) for WSNs is proposed, which implements accurate, efficient, and dynamic trust evaluation by dynamically adjusting the weights of direct and indirect trust and the parameters of the update mechanism. To achieve accurate trust evaluation, the direct trust is calculated from multiple trust factors, including communication trust, data trust, and energy trust, with a punishment factor and a regulating function. The indirect trust is evaluated conditionally from the trusted recommendations of a third party. Moreover, the integrated trust is measured by assigning dynamic weights to direct and indirect trust and combining them. Finally, we propose an update mechanism using a sliding window based on the induced ordered weighted averaging operator to enhance flexibility. The parameters and the number of interaction history windows can be adapted dynamically according to the actual needs of the network to realize dynamic updating of the direct trust value. Simulation results indicate that the proposed model is an efficient, dynamic, and attack-resistant trust evaluation model. Compared with existing approaches, it performs better in defending against multiple malicious attacks.
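    The combination of multi-factor direct trust, weighted integration with indirect trust, and a sliding-window update can be sketched as below. This is a simplified illustration with hypothetical weights; it omits DTEM's punishment factor, regulating function, and IOWA weighting:

```python
def direct_trust(comm, data, energy, weights=(0.5, 0.3, 0.2)):
    """Weighted combination of communication, data and energy trust (all in [0, 1])."""
    return sum(w * t for w, t in zip(weights, (comm, data, energy)))

def integrated_trust(direct, indirect, confidence):
    """Blend direct and indirect trust; `confidence` in [0, 1] grows with the
    number of first-hand interactions, shifting weight away from recommendations."""
    return confidence * direct + (1 - confidence) * indirect

def sliding_window_update(history, new_value, window=5):
    """Keep only the most recent `window` trust values and return their average."""
    history = (history + [new_value])[-window:]
    return history, sum(history) / len(history)

d = direct_trust(0.9, 0.8, 0.7)       # first-hand multi-factor trust
t = integrated_trust(d, 0.6, 0.7)     # blend with a third-party recommendation
history, avg = sliding_window_update([0.5, 0.6, 0.7, 0.8, 0.9], 1.0)
```

    Dynamically tuning `confidence` and `window` per node is what makes such a scheme "dynamic" in the sense the abstract describes.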

  20. A model to evaluate quality and effectiveness of disease management.

    Science.gov (United States)

    Lemmens, K M M; Nieboer, A P; van Schayck, C P; Asin, J D; Huijsman, R

    2008-12-01

    Disease management has emerged as a new strategy to enhance quality of care for patients suffering from chronic conditions, and to control healthcare costs. So far, however, the effects of this strategy remain unclear. Although current models define the concept of disease management, they do not provide a systematic development or an explanatory theory of how disease management affects the outcomes of care. The objective of this paper is to present a framework for valid evaluation of disease-management initiatives. The evaluation model is built on two pillars of disease management: patient-related and professional-directed interventions. The effectiveness of these interventions is thought to be affected by the organisational design of the healthcare system. Disease management requires a multifaceted approach; hence disease-management programme evaluations should focus on the effects of multiple interventions, namely patient-related, professional-directed and organisational interventions. The framework has been built upon the conceptualisation of these disease-management interventions. Analysis of the underlying mechanisms of these interventions revealed that learning and behavioural theories support the core assumptions of disease management. The evaluation model can be used to identify the components of disease-management programmes and the mechanisms behind them, making valid comparison feasible. In addition, this model links the programme interventions to indicators that can be used to evaluate the disease-management programme. Consistent use of this framework will enable comparisons among disease-management programmes and outcomes in evaluation research.

  1. Evaluating Vocational Educators' Training Programs: A Kirkpatrick-Inspired Evaluation Model

    Science.gov (United States)

    Ravicchio, Fabrizio; Trentin, Guglielmo

    2015-01-01

    The aim of the article is to describe the assessment model adopted by the SCINTILLA Project, a project in Italy aimed at the online vocational training of young, seriously-disabled subjects and their subsequent work inclusion in smart-work mode. It will thus describe the model worked out for evaluation of the training program conceived for the…

  2. Improved Pig Model to Evaluate Heart Valve Thrombosis.

    Science.gov (United States)

    Payanam Ramachandra, Umashankar; Shenoy, Sachin J; Arumugham, Sabareeswaran

    2016-09-01

    Although the sheep is the most acceptable animal model for heart valve evaluation, it has severe limitations for detecting heart valve thrombosis during preclinical studies. While the pig offers an alternative model and is better for detecting prosthetic valve thrombogenicity, it is not often used because of inadvertent valve thrombosis or bleeding complications. The study aim was to develop an improved pig model which can be used reliably to evaluate mechanical heart valve thrombogenicity. Mechanical heart valves were implanted in the mitral position of indigenous pigs administered aspirin-clopidogrel, and compared with similar valves implanted in control pigs to which no antiplatelet therapy had been administered. The pigs were observed for six months to study their overall survivability, inadvertent bleeding/valve thrombosis and pannus formation. The efficacy of aspirin-clopidogrel on platelet aggregation and blood coagulation was also recorded and compared between test and control animals. In comparison to controls, pigs receiving antiplatelet therapy showed an overall better survivability, an absence of inadvertent valve thrombosis/bleeding, and less obstructive pannus formation. Previously unreported inhibitory effects of aspirin-clopidogrel on the intrinsic pathway of blood coagulation were also observed in the pig model. Notably, with aspirin-clopidogrel therapy inadvertent thrombus formation or bleeding can be prevented. The newly developed pig model can be successfully used to evaluate heart valve thrombosis following chronic orthotopic valve implantation. The model may also be utilized to evaluate other blood-contacting implantable devices.

  3. Evaluating user interactions with clinical information systems: a model based on human-computer interaction models.

    Science.gov (United States)

    Despont-Gros, Christelle; Mueller, Henning; Lovis, Christian

    2005-06-01

    This article proposes a model for dimensions involved in user evaluation of clinical information systems (CIS). The model links the dimensions in traditional CIS evaluation and the dimensions from the human-computer interaction (HCI) perspective. In this article, variables are defined as the properties measured in an evaluation, and dimensions are defined as the factors contributing to the values of the measured variables. The proposed model is based on a two-step methodology with: (1) a general review of information systems (IS) evaluations to highlight studied variables, existing models and frameworks, and (2) a review of HCI literature to provide the theoretical basis to key dimensions of user evaluation. The review of literature led to the identification of eight key variables, among which satisfaction, acceptance, and success were found to be the most referenced. Among those variables, IS acceptance is a relevant candidate to reflect user evaluation of CIS. While their goals are similar, the fields of traditional CIS evaluation, and HCI are not closely connected. Combining those two fields allows for the development of an integrated model which provides a model for summative and comprehensive user evaluation of CIS. All dimensions identified in existing studies can be linked to this model and such an integrated model could provide a new perspective to compare investigations of different CIS systems.

  4. Development and evaluation of thermal model reduction algorithms for spacecraft

    Science.gov (United States)

    Deiml, Michael; Suderland, Martin; Reiss, Philipp; Czupalla, Markus

    2015-05-01

    This paper is concerned with the reduction of thermal models of spacecraft. The work presented here was conducted in cooperation with the company OHB AG, formerly Kayser-Threde GmbH, and the Institute of Astronautics at Technische Universität München, with the goal of shortening and automating the time-consuming, manual process of thermal model reduction. The reduction of thermal models can be divided into the simplification of the geometry model, used to calculate external heat flows and radiative couplings, and the reduction of the underlying mathematical model. For the simplification, a method has been developed which approximates the reduced geometry model with the help of an optimization algorithm. Different linear and nonlinear model reduction techniques were evaluated for their applicability to the reduction of the mathematical model. Compatibility with the thermal analysis tool ESATAN-TMS is of major concern here, which restricts the useful application of these methods. Additional model reduction methods have been developed which accommodate these constraints. The Matrix Reduction method allows the approximation of the differential equation to reference values exactly, except for numerical errors. The summation method enables a useful, applicable reduction of thermal models suitable for industrial use. In this work a framework for the reduction of thermal models has been created, which can be used together with a newly developed graphical user interface for the reduction of thermal models in industry.

  5. Steel Containment Vessel Model Test: Results and Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Costello, J.F.; Hashimote, T.; Hessheimer, M.F.; Luk, V.K.

    1999-03-01

    A high pressure test of the steel containment vessel (SCV) model was conducted on December 11-12, 1996 at Sandia National Laboratories, Albuquerque, NM, USA. The test model is a mixed-scaled model (1:10 in geometry and 1:4 in shell thickness) of an improved Mark II boiling water reactor (BWR) containment. A concentric steel contact structure (CS), installed over the SCV model and separated at a nominally uniform distance from it, provided a simplified representation of a reactor shield building in the actual plant. The SCV model and contact structure were instrumented with strain gages and displacement transducers to record the deformation behavior of the SCV model during the high pressure test. This paper summarizes the conduct and the results of the high pressure test and discusses the posttest metallurgical evaluation results on specimens removed from the SCV model.

  6. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. This model takes from the SERVQUAL method the five dimensions of requirements and three of characteristics, and from the QFD method the application methodology. The originality of the SQ model consists in computing a global index that reflects how well the quality characteristics fulfil the customers’ requirements. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.

  7. A Fuzzy Comprehensive Evaluation Model for Sustainability Risk Evaluation of PPP Projects

    Directory of Open Access Journals (Sweden)

    Libiao Bai

    2017-10-01

    Full Text Available Evaluating the sustainability risk level of public–private partnership (PPP) projects can reduce project risk incidents and achieve the sustainable development of the organization. However, existing studies on PPP project risk management mainly focus on exploring the impact of financial and revenue risks but ignore sustainability risks, causing the concept of “sustainability” to be missing when evaluating the risk level of PPP projects. To evaluate the sustainability risk level and achieve the most important objective of providing a reference for the public and private sectors when making decisions on PPP project management, this paper constructs a factor system of PPP project sustainability risk based on an extensive literature review and develops a mathematical model based on the fuzzy comprehensive evaluation model (FCEM) and failure mode, effects and criticality analysis (FMECA) for evaluating the sustainability risk level of PPP projects. In addition, this paper conducts a computational experiment based on a questionnaire survey to verify the effectiveness and feasibility of the proposed model. The results suggest that this model is reasonable for evaluating the sustainability risk level of PPP projects. To our knowledge, this paper is the first study to evaluate the sustainability risk of PPP projects, which would not only enrich the theories of project risk management, but also serve as a reference for the public and private sectors for sustainable planning and development. Keywords: sustainability risk evaluation
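    The core fuzzy comprehensive evaluation step behind models like this can be sketched as B = W * R with the weighted-average operator, where W is a factor-weight vector and R a membership matrix. The weights and membership degrees below are hypothetical, not the paper's survey data:

```python
def fuzzy_comprehensive_evaluation(weights, membership):
    """Compute B = W * R (weighted-average operator): b_j = sum_i w_i * r_ij,
    then pick the risk grade with the maximum membership degree."""
    grades = len(membership[0])
    b = [sum(w * row[j] for w, row in zip(weights, membership))
         for j in range(grades)]
    return b, b.index(max(b))

# Hypothetical example: 3 risk factors, 4 risk grades (low .. critical)
weights = [0.5, 0.3, 0.2]        # factor importance (sums to 1)
membership = [                   # expert membership degrees per grade
    [0.1, 0.3, 0.4, 0.2],
    [0.2, 0.5, 0.2, 0.1],
    [0.3, 0.3, 0.2, 0.2],
]
b, grade = fuzzy_comprehensive_evaluation(weights, membership)
```

    Here `grade` identifies the dominant risk level by the maximum-membership principle; FMECA-style criticality scores could feed the weight vector.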

  8. Evaluation-Function-based Model-free Adaptive Fuzzy Control

    Directory of Open Access Journals (Sweden)

    Agus Naba

    2016-12-01

    Full Text Available Designs of adaptive fuzzy controllers (AFC) are commonly based on the Lyapunov approach, which requires a known model of the controlled plant. They need to consider a Lyapunov function candidate as an evaluation function to be minimized. In this study, these drawbacks were handled by designing a model-free adaptive fuzzy controller (MFAFC) using an approximate evaluation function defined in terms of the current state, the next state, and the control action. The MFAFC treats the approximate evaluation function as an evaluative measure of control performance, similar to the state-action value function in reinforcement learning. Simulation results of applying the MFAFC to the inverted pendulum benchmark verified the proposed scheme’s efficacy.

  9. Using modeling to develop and evaluate a corrective action system

    International Nuclear Information System (INIS)

    Rodgers, L.

    1995-01-01

    At a former trucking facility in EPA Region 4, a corrective action system was installed to remediate groundwater and soil contaminated with gasoline and fuel oil products released from several underground storage tanks (USTs). Groundwater modeling was used to develop the corrective action plan and later used with soil vapor modeling to evaluate the system's effectiveness. Groundwater modeling was used to determine the effects of a groundwater recovery system on the water table at the site. Information gathered during the assessment phase was used to develop a three-dimensional depiction of the subsurface at the site. Different groundwater recovery schemes were then modeled to determine the most effective method for recovering contaminated groundwater. Based on the modeling and calculations, a corrective action system combining soil vapor extraction (SVE) and groundwater recovery was designed. The system included seven recovery wells, to extract both soil vapor and groundwater, and a groundwater treatment system. Operation and maintenance of the system included monthly system sampling and inspections and quarterly groundwater sampling. After one year of operation the effectiveness of the system was evaluated. A subsurface soil gas model was used to evaluate the effects of the SVE system on the site contamination as well as its effects on the water table and groundwater recovery operations. Groundwater modeling was used in evaluating the effectiveness of the groundwater recovery system. Plume migration and capture were modeled to ensure that the groundwater recovery system at the site was effectively capturing the contaminant plume. The two models were then combined to determine the effects of the two systems, acting together, on the remediation process.

  10. A High Effective Fuzzy Synthetic Evaluation Multi-model Estimation

    Directory of Open Access Journals (Sweden)

    Yang LIU

    2014-01-01

    Full Text Available The algorithm flow of the variable-structure multiple-model method (VSMM) is complex and its tracking performance inefficient, which makes VSMM difficult to apply in fielded equipment. This paper presents a high-performance variable-structure multiple-model method based on multi-factor fuzzy synthetic evaluation (HEFS_VSMM). Guided by the variable-structure approach, HEFS_VSMM first uses multi-factor fuzzy synthetic evaluation in its model-set adaptation strategy to select an appropriate model set in real time and reduce the computational complexity of model evaluation. Second, it selects the model-set center according to the evaluation results of each model and sets the property value of the current model set. Third, it chooses different processing paths based on the current model-set property value to simplify the logical complexity of the algorithm. Finally, the algorithm obtains the overall estimate by applying optimal information fusion to the above processing results. Simulation results show that, compared with FSMM and EMA, the mean estimation error for position, velocity and acceleration in HEFS_VSMM is improved from -0.029 m, -0.350 m/s and -10.051 m/s² to -0.023 m, 0.052 m/s and -5.531 m/s², respectively. The algorithm cycle is reduced from 0.0051 s to 0.0025 s.

  11. Pipe fracture evaluations for leak-rate detection: Probabilistic models

    International Nuclear Information System (INIS)

    Rahman, S.; Wilkowski, G.; Ghadiali, N.

    1993-01-01

    This is the second in a series of three papers generated from studies on nuclear pipe fracture evaluations for leak-rate detection. This paper focuses on the development of novel probabilistic models for stochastic performance evaluation of degraded nuclear piping systems. This was accomplished in three distinct stages. First, a statistical analysis was conducted to characterize various input variables for thermo-hydraulic analysis and elastic-plastic fracture mechanics, such as material properties of pipe, crack morphology variables, and locations of cracks found in nuclear piping. Second, a new stochastic model was developed to evaluate the performance of degraded piping systems. It is based on the accurate deterministic models for thermo-hydraulic and fracture mechanics analyses described in the first paper, statistical characterization of various input variables, and state-of-the-art methods of modern structural reliability theory. From this model, the conditional probability of failure as a function of the leak-rate detection capability of the piping systems can be predicted. Third, a numerical example was presented to illustrate the proposed model for piping reliability analyses. Results clearly showed that the model provides satisfactory estimates of conditional failure probability with much less computational effort than Monte Carlo simulation. The probabilistic model developed in this paper will be applied to various piping in boiling water reactor and pressurized water reactor plants for leak-rate detection applications.

  12. The model of evaluation of innovative potential of enterprise

    Directory of Open Access Journals (Sweden)

    Ганна Ігорівна Заднєпровська

    2015-06-01

    Full Text Available The basic components of the process for evaluating an enterprise's innovative potential are investigated. A conceptual model of innovative-potential evaluation is offered that includes subjects, objects, purpose, information provision, principles, methods, criteria and indicators. It is noted that innovative capacity characterizes the transition from the current to the strategic level of innovation potential and thus characterizes the composition of objects from the user's position.

  13. Thin film thermocouples for in situ membrane electrode assembly temperature measurements in a polybenzimidazole-based high temperature proton exchange membrane unit cell

    DEFF Research Database (Denmark)

    Ali, Syed Talat; Lebæk, Jesper; Nielsen, Lars Pleth

    2010-01-01

    This paper presents Type-T thin film thermocouples (TFTCs) fabricated on a Kapton (polyimide) substrate for measuring the internal temperature of a PBI (polybenzimidazole)-based high temperature proton exchange membrane fuel cell (HT-PEMFC). A magnetron sputtering technique was employed to deposit a 2 mu...... degradation. This Kapton foil with deposited TFTCs was used as a seal inside a PBI (polybenzimidazole)-based single-cell test rig, which enabled in situ measurement of temperature variations of the working fuel cell MEA. The performance of the TFTCs was promising, with minimal interference to the operation...

  14. Capital Cost Optimization for Prefabrication: A Factor Analysis Evaluation Model

    Directory of Open Access Journals (Sweden)

    Hong Xue

    2018-01-01

    Full Text Available High capital cost is a significant hindrance to the promotion of prefabrication. In order to optimize cost management and reduce capital cost, this study aims to explore the latent factors and a factor analysis evaluation model. Semi-structured interviews were conducted to explore potential variables, and a questionnaire survey was then employed to collect professionals’ views on their effects. After data collection, exploratory factor analysis was adopted to explore the latent factors. Seven latent factors were identified, including “Management Index”, “Construction Dissipation Index”, “Productivity Index”, “Design Efficiency Index”, “Transport Dissipation Index”, “Material Increment Index” and “Depreciation Amortization Index”. With these latent factors, a factor analysis evaluation model (FAEM), divided into a factor analysis model (FAM) and a comprehensive evaluation model (CEM), was established. The FAM was used to explore the effect of observed variables on the high capital cost of prefabrication, while the CEM was used to evaluate the comprehensive cost management level of prefabrication projects. Case studies were conducted to verify the models. The results revealed that collaborative management had a positive effect on the capital cost of prefabrication. Material increment costs and labor costs had significant impacts on production cost. This study demonstrated the potential of on-site management and standardized design to reduce capital cost. Hence, collaborative management is necessary for cost management of prefabrication. Innovation and detailed design are needed to improve cost performance. New forms of precast component factories can be explored to reduce transportation cost. Meanwhile, targeted strategies can be adopted for different prefabrication projects. The findings optimized the capital cost and improved the cost performance through providing an evaluation and optimization model, which helps managers to

  15. Popularity Evaluation Model for Microbloggers Online Social Network

    Directory of Open Access Journals (Sweden)

    Xia Zhang

    2014-01-01

    Full Text Available Recently, microblogging has been widely studied by researchers in the domain of the online social network (OSN). How to evaluate the popularity of microblogging users is an important research field, which can be applied to commercial advertising, user behavior analysis, information dissemination, and so forth. Previous evaluation methods cannot effectively or accurately evaluate the popularity of microbloggers. In this paper, we propose an electromagnetic field theory based model to analyze the popularity of microbloggers. The concept of the source in the microblogging field is first put forward, based on the concept of the source in the electromagnetic field; then, one’s microblogging flux is calculated according to his/her behaviors (sending or receiving feedback) on the microblogging platform; finally, we use three methods to calculate one’s microblogging flux density, which represents one’s popularity on the microblogging platform. In the experimental work, we evaluated our model using real microblogging data and selected the best of the three popularity measures. We also compared our model with the classic PageRank algorithm, and the results show that our model is more effective and accurate in evaluating the popularity of microbloggers.

  16. Road network safety evaluation using Bayesian hierarchical joint model.

    Science.gov (United States)

    Wang, Jie; Huang, Helai

    2016-05-01

    Safety and efficiency are commonly regarded as two significant performance indicators of transportation systems. In practice, road network planning has focused on road capacity and transport efficiency whereas the safety level of a road network has received little attention in the planning stage. This study develops a Bayesian hierarchical joint model for road network safety evaluation to help planners take traffic safety into account when planning a road network. The proposed model establishes relationships between road network risk and micro-level variables related to road entities and traffic volume, as well as socioeconomic, trip generation and network density variables at macro level which are generally used for long term transportation plans. In addition, network spatial correlation between intersections and their connected road segments is also considered in the model. A road network is elaborately selected in order to compare the proposed hierarchical joint model with a previous joint model and a negative binomial model. According to the results of the model comparison, the hierarchical joint model outperforms the joint model and negative binomial model in terms of the goodness-of-fit and predictive performance, which indicates the reasonableness of considering the hierarchical data structure in crash prediction and analysis. Moreover, both random effects at the TAZ level and the spatial correlation between intersections and their adjacent segments are found to be significant, supporting the employment of the hierarchical joint model as an alternative in road-network-level safety modeling as well. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. A Novel Model for Security Evaluation for Compliance

    DEFF Research Database (Denmark)

    Hald, Sara Ligaard; Pedersen, Jens Myrup; Prasad, Neeli R.

    2011-01-01

    With the increasing focus on security in information systems, it is becoming necessary to be able to describe and compare security attributes for different technologies. Existing approaches are well-described and comprehensive, but expensive and resource-demanding to apply. The Security Evaluation...... for Compliance (SEC) model offers a lightweight alternative for use by decision makers to get a quick overview of the security attributes of different technologies for easy comparison and requirement compliance evaluation. The scientific contribution is this new approach to security modelling as well...

  18. Mathematical models and lymphatic filariasis control: monitoring and evaluating interventions.

    Science.gov (United States)

    Michael, Edwin; Malecela-Lazaro, Mwele N; Maegga, Bertha T A; Fischer, Peter; Kazura, James W

    2006-11-01

    Monitoring and evaluation are crucially important to the scientific management of any mass parasite control programme. Monitoring enables the effectiveness of implemented actions to be assessed and necessary adaptations to be identified; it also determines when management objectives are achieved. Parasite transmission models can provide a scientific template for informing the optimal design of such monitoring programmes. Here, we illustrate the usefulness of using a model-based approach for monitoring and evaluating anti-parasite interventions and discuss issues that need addressing. We focus on the use of such an approach for the control and/or elimination of the vector-borne parasitic disease, lymphatic filariasis.

  19. Metric Evaluation Pipeline for 3d Modeling of Urban Scenes

    Science.gov (United States)

    Bosch, M.; Leichtman, A.; Chilcott, D.; Goldberg, H.; Brown, M.

    2017-05-01

    Publicly available benchmark data and metric evaluation approaches have been instrumental in enabling research to advance state of the art methods for remote sensing applications in urban 3D modeling. Most publicly available benchmark datasets have consisted of high resolution airborne imagery and lidar suitable for 3D modeling on a relatively modest scale. To enable research in larger scale 3D mapping, we have recently released a public benchmark dataset with multi-view commercial satellite imagery and metrics to compare 3D point clouds with lidar ground truth. We now define a more complete metric evaluation pipeline developed as publicly available open source software to assess semantically labeled 3D models of complex urban scenes derived from multi-view commercial satellite imagery. Evaluation metrics in our pipeline include horizontal and vertical accuracy and completeness, volumetric completeness and correctness, perceptual quality, and model simplicity. Sources of ground truth include airborne lidar and overhead imagery, and we demonstrate a semi-automated process for producing accurate ground truth shape files to characterize building footprints. We validate our current metric evaluation pipeline using 3D models produced using open source multi-view stereo methods. Data and software is made publicly available to enable further research and planned benchmarking activities.
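    The completeness and correctness metrics named in the pipeline above can be sketched for point sets as follows. The distance threshold and brute-force nearest-neighbor matching are illustrative assumptions, not the released evaluation software.

```python
# Hedged sketch of two metrics named in the record: completeness (fraction of
# ground-truth points matched by the model within a tolerance) and correctness
# (fraction of model points matched by ground truth). Brute-force matching and
# the 0.5 m tolerance are illustrative choices, not the released tool.

def _min_dist(p, points):
    return min(((p[0]-q[0])**2 + (p[1]-q[1])**2 + (p[2]-q[2])**2) ** 0.5
               for q in points)

def completeness(truth, model, tol=0.5):
    return sum(_min_dist(p, model) <= tol for p in truth) / len(truth)

def correctness(truth, model, tol=0.5):
    return sum(_min_dist(p, truth) <= tol for p in model) / len(model)

# Toy example: two of three model points sit near ground truth; one
# ground-truth point is missed and one model point is spurious.
truth = [(0, 0, 0), (1, 0, 0), (2, 0, 0)]
model = [(0, 0, 0.1), (1, 0, 0.2), (5, 5, 5)]
```

    The two scores are deliberately asymmetric: a model can be complete but incorrect (hallucinated geometry) or correct but incomplete (missing buildings), which is why the pipeline reports both.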

  20. An effective quality model for evaluating mobile websites

    International Nuclear Information System (INIS)

    Hassan, W.U.; Nawaz, M.T.; Syed, T.H.; Naseem, A.

    2015-01-01

    The evolution of Web development in recent years has caused the emergence of the new area of mobile computing. The mobile phone has been transformed into a high-speed processing device capable of running processes that were previously supposed to run only on computers. Modern mobile phones can now process data faster than many desktop systems, and with the inclusion of 3G and 4G networks the mobile became the prime choice for users to send and receive data from any device. As a result, there has been a major increase in mobile website demand and development; but because mobile website usage differs from desktop website usage, there is a need to focus on the quality aspect of mobile websites. To increase and preserve the quality of mobile websites, a quality model is required that is designed specifically to evaluate mobile website quality. To design a mobile website quality model, a survey-based methodology was used to gather information from different users regarding unique website usage on mobiles. On the basis of this information, a mobile website quality model is presented which aims to evaluate the quality of mobile websites. In the proposed model, some sub-characteristics are designed to evaluate mobile websites in particular. The result is a proposed model that evaluates the features of a website which are important in the context of its deployment and usability on a mobile platform. (author)

  1. Model of service-oriented catering supply chain performance evaluation

    Directory of Open Access Journals (Sweden)

    Juanqiong Gou

    2013-03-01

    Full Text Available Purpose: The aim of this paper is to construct a performance evaluation model for a service-oriented catering supply chain. Design/methodology/approach: Based on research into the current situation of the catering industry, this paper summarizes the characteristics of the catering supply chain and then presents a service-oriented catering supply chain model built on a platform of logistics and information. Finally, the fuzzy AHP method is used to evaluate the performance of the service-oriented catering supply chain. Findings: From the analysis of the characteristics of the catering supply chain, we construct the performance evaluation model in order to guarantee food safety, logistics efficiency, price stability and so on. Practical implications: In order to evolve an efficient and effective service supply chain, the model can be used not only for an enterprise's own improvement but also for selecting different customers and choosing a different model of development. Originality/value: This paper gives a new definition of the service-oriented catering supply chain and offers a model to evaluate its performance.
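    The weighting step underlying fuzzy AHP can be sketched in its crisp form. The geometric-mean approximation below is a standard AHP technique (the paper's fuzzy variant extends it with fuzzy judgments), and the criteria and pairwise judgments are hypothetical.

```python
# Hedged sketch of the crisp AHP step behind fuzzy AHP: derive criterion
# weights from a pairwise comparison matrix using the geometric-mean
# approximation. Criteria and judgments below are hypothetical.

def ahp_weights(matrix):
    """Priority weights from an n x n pairwise comparison matrix."""
    n = len(matrix)
    geo = []
    for row in matrix:
        g = 1.0
        for a in row:
            g *= a
        geo.append(g ** (1.0 / n))   # geometric mean of each row
    s = sum(geo)
    return [g / s for g in geo]      # normalise to sum to 1

# Hypothetical criteria: food safety vs. logistics efficiency vs. price
# stability, with food safety judged 3x and 5x more important.
m = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = ahp_weights(m)
```

    In a full AHP study one would also check the consistency ratio of the judgment matrix before trusting the weights.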

  2. METRIC EVALUATION PIPELINE FOR 3D MODELING OF URBAN SCENES

    Directory of Open Access Journals (Sweden)

    M. Bosch

    2017-05-01

    Full Text Available Publicly available benchmark data and metric evaluation approaches have been instrumental in enabling research to advance state of the art methods for remote sensing applications in urban 3D modeling. Most publicly available benchmark datasets have consisted of high resolution airborne imagery and lidar suitable for 3D modeling on a relatively modest scale. To enable research in larger scale 3D mapping, we have recently released a public benchmark dataset with multi-view commercial satellite imagery and metrics to compare 3D point clouds with lidar ground truth. We now define a more complete metric evaluation pipeline developed as publicly available open source software to assess semantically labeled 3D models of complex urban scenes derived from multi-view commercial satellite imagery. Evaluation metrics in our pipeline include horizontal and vertical accuracy and completeness, volumetric completeness and correctness, perceptual quality, and model simplicity. Sources of ground truth include airborne lidar and overhead imagery, and we demonstrate a semi-automated process for producing accurate ground truth shape files to characterize building footprints. We validate our current metric evaluation pipeline using 3D models produced using open source multi-view stereo methods. Data and software is made publicly available to enable further research and planned benchmarking activities.

  3. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-01-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including costs estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a ''living document'' that will be modified over the course of the execution of this work

  4. Experimental Models for Evaluation of Nanoparticles in Cancer Therapy.

    Science.gov (United States)

    Kesharwani, Prashant; Ghanghoria, Raksha; Jain, Narendra K

    2017-01-01

    Nanoparticles (NPs), the submicron-sized colloidal particles, have recently generated enormous interest among biomedical scientists, particularly in cancer therapy. A number of models are being used for exploring NPs safety and efficacy. Recently, cancer cell lines have explored as prominent experimental models for evaluating pharmacokinetic parameters, cell viability, cytotoxicity and drug efficacy in tumor cells. This review aims at thorough compilation of various cancer cell lines and in vivo models for evaluation of efficacy of NPs on one platform. This will provide a basis to explore and improvise pre-clinical models as a prelude to successful cancer research. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  5. Model to evaluate the technical efficiency of university units

    Directory of Open Access Journals (Sweden)

    Marlon Soliman

    2014-06-01

    Full Text Available In higher education institutions, technical efficiency has been measured by several indicators that, when used separately, do not lead to an effective conclusion about the administrative reality of these institutions. Therefore, this paper proposes a model to evaluate the technical efficiency of the university units of a higher education institution (HEI) from the perspectives of Teaching, Research and Extension. The model was conceived according to the assumptions of Data Envelopment Analysis (DEA), using the product-oriented CCR model, from the identification of relevant variables for the context addressed. The model was applied to evaluate the efficiency of nine academic units of the Federal University of Santa Maria (UFSM), obtaining as a result the efficiency of each unit as well as recommendations for the units considered inefficient. At the end of this study, it was verified that it is possible to measure the efficiency of the various units and consequently establish improvement goals based on the methodology used.
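    The CCR ratio model behind DEA can be illustrated with a sampled-weights approximation; the paper solves the exact linear program, whereas the sketch below merely samples weight vectors to convey the idea. The units and data are hypothetical.

```python
import random

# Illustrative approximation of the DEA CCR ratio model (not the LP actually
# solved in DEA software): a unit's efficiency is the best weighted-output /
# weighted-input ratio it can achieve, normalised so that no unit exceeds 1
# under the same weights. Weights are randomly sampled here for brevity, and
# the academic units and data are hypothetical.

def ccr_efficiency(inputs, outputs, unit, samples=2000, seed=1):
    rng = random.Random(seed)
    n, m, s = len(inputs), len(inputs[0]), len(outputs[0])
    best = 0.0
    for _ in range(samples):
        v = [rng.random() + 1e-9 for _ in range(m)]   # input weights
        u = [rng.random() + 1e-9 for _ in range(s)]   # output weights
        def ratio(j):
            num = sum(u[r] * outputs[j][r] for r in range(s))
            den = sum(v[i] * inputs[j][i] for i in range(m))
            return num / den
        cap = max(ratio(j) for j in range(n))         # CCR normalisation
        best = max(best, ratio(unit) / cap)
    return best

# Hypothetical academic units: (staff, budget) -> (papers, graduates).
# Unit 1 produces nearly the same outputs as unit 0 with 20% fewer inputs.
X = [(10, 5), (8, 4), (12, 7)]
Y = [(30, 100), (30, 96), (28, 90)]
eff = [ccr_efficiency(X, Y, k) for k in range(len(X))]
```

    A unit scoring 1.0 lies on the efficient frontier; scores below 1.0 quantify how far a unit is from the best practice observed among its peers.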

  6. Evaluation of radiological processes in the Ternopil region by the box-model method

    Directory of Open Access Journals (Sweden)

    І.В. Матвєєва

    2006-02-01

    Full Text Available Flows of the radionuclide Sr-90 in the ecosystem of Kotsubinchiky village, Ternopil oblast, were analyzed. A block scheme of the ecosystem and its mathematical model were constructed using the box-model method. This allowed us to evaluate how dose loadings from internal irradiation form for various population groups (working people, retirees and children) and to predict the dynamics of these loadings over the years following the Chernobyl accident.
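    The box-model method reduces an ecosystem to compartments coupled by first-order transfer rates. A minimal sketch follows, with hypothetical rate constants rather than the paper's calibrated values; only the Sr-90 half-life is a physical constant.

```python
# Minimal sketch of the box-model idea: compartments coupled by first-order
# transfer rates, with radioactive decay. The transfer constants below are
# hypothetical illustrations, not the paper's calibrated values.

SR90_DECAY = 0.693 / 28.8  # 1/yr, from the ~28.8-year half-life of Sr-90

def simulate(years, dt=0.01, k_soil_veg=0.05, k_veg_human=0.1, k_loss=0.5):
    """Euler integration of a soil -> vegetation -> human-intake chain."""
    soil, veg, human = 1.0, 0.0, 0.0   # relative activity inventories
    for _ in range(int(years / dt)):
        d_soil = -(k_soil_veg + SR90_DECAY) * soil
        d_veg = k_soil_veg * soil - (k_veg_human + k_loss + SR90_DECAY) * veg
        d_human = k_veg_human * veg - SR90_DECAY * human
        soil += d_soil * dt
        veg += d_veg * dt
        human += d_human * dt
    return soil, veg, human

s, v, h = simulate(10.0)  # inventories a decade after deposition
```

    Different population groups would be represented by different intake-rate constants, which is how the box structure yields group-specific dose loadings.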

  7. Motion Reliability Modeling and Evaluation for Manipulator Path Planning Task

    OpenAIRE

    Li, Tong; Jia, Qingxuan; Chen, Gang; Sun, Hanxu

    2015-01-01

    Motion reliability, as a criterion, can reflect the accuracy of a manipulator in completing operations. Since the path planning task plays a significant role in manipulator operations, the motion reliability evaluation of the path planning task is discussed in this paper. First, a modeling method for motion reliability is proposed that takes factors related to the position accuracy of the manipulator into account. In the model, a multidimensional integral over the PDF is carried out to calculate motion reliability. Co...

  8. Evaluating the AS-level Internet models: beyond topological characteristics

    International Nuclear Information System (INIS)

    Fan Zheng-Ping

    2012-01-01

    A surge of models has been proposed to model the Internet in the past decades. However, which models better represent the Internet remains an open issue. By analysing the evolving dynamics of the Internet, we suggest that at the autonomous system (AS) level, a suitable Internet model should at least be heterogeneous and have a linearly growing mechanism. More importantly, we show that the role of topological characteristics in evaluating and differentiating Internet models is apparently over-estimated from an engineering perspective. Also, we find that an assortative network is not necessarily more robust than a disassortative network and that a smaller average shortest path length does not necessarily mean higher robustness, which differs from previous observations. Our analytic results are helpful not only for the Internet, but also for other general complex networks. (interdisciplinary physics and related areas of science and technology)

  9. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    Full Text Available The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction to the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which helps to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users specify their requirements for a workflow management system.

  10. Hydrology model evaluation at the Hanford Nuclear Waste Facility

    Energy Technology Data Exchange (ETDEWEB)

    1977-04-01

    One- and two-dimensional flow and contaminant transport computer models have been developed at Hanford to assess the rate and direction of contaminant movement from waste disposal sites. The primary objective of this work was to evaluate the potential improvement in accuracy that a three-dimensional model might offer over the simpler one- and two-dimensional models. INTERA's hydrology contaminant transport model was used for this evaluation. Although this study was conceptual in nature, an attempt was made to relate it as closely as possible to Hanford conditions. Two-dimensional model runs were performed over the period 1968 to 1973 using estimates of waste discharge flows, tritium concentrations, vertically averaged values of aquifer properties, and boundary conditions. The well test interpretation runs confirmed the applicability of the areal hydraulic conductivity distribution. Velocity fields and surface concentration profiles calculated by the two-dimensional and three-dimensional models show significant differences. Vertical concentration profiles calculated by a three-dimensional model show better qualitative agreement with the limited observed concentration profile data supplied by ARHCO.
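    Transport models of this kind solve an advection-dispersion equation for solute concentration. The explicit finite-difference sketch below is a generic one-dimensional illustration with hypothetical grid and parameters, not INTERA's model or Hanford values.

```python
# Illustrative 1-D advection-dispersion step (explicit finite differences,
# upwind advection) of the kind such transport codes solve in 2-D/3-D.
# Grid spacing, velocity and dispersion coefficient are hypothetical and
# chosen to satisfy the explicit stability limits.

def transport_step(c, v, d, dx, dt):
    """One step of dc/dt = D * d2c/dx2 - v * dc/dx (v > 0, upwind)."""
    new = c[:]
    for i in range(1, len(c) - 1):
        adv = -v * (c[i] - c[i - 1]) / dx
        disp = d * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2
        new[i] = c[i] + dt * (adv + disp)
    return new

# A unit tritium pulse released at cell 5 advects downstream and spreads.
c = [0.0] * 50
c[5] = 1.0
for _ in range(200):
    c = transport_step(c, v=0.5, d=0.1, dx=1.0, dt=0.1)
```

    After 200 steps (t = 20) the plume centre has moved roughly v*t = 10 cells downstream while dispersing, the qualitative behaviour the Hanford comparisons examine in higher dimensions.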

  11. EVALUATION OF RAINFALL-RUNOFF MODELS FOR MEDITERRANEAN SUBCATCHMENTS

    Directory of Open Access Journals (Sweden)

    A. Cilek

    2016-06-01

    Full Text Available The development and application of rainfall-runoff models have been a cornerstone of hydrological research for many decades. The amount of rainfall and its intensity and variability control the generation of runoff and the erosional processes operating at different scales. These interactions can be highly variable in Mediterranean catchments with marked hydrological fluctuations. The aim of the study was to evaluate the performance of a rainfall-runoff model for rainfall-runoff simulation in a Mediterranean subcatchment. The Pan-European Soil Erosion Risk Assessment (PESERA), a simplified hydrological process-based approach, was used in this study to combine hydrological surface runoff factors. In total, 128 input layers derived from the data set, which includes climate, topography, land use, crop type, planting date and soil characteristics, are required to run the model. Initial ground cover was estimated from Landsat ETM data provided by ESA. This hydrological model was evaluated in terms of its performance in the Goksu River Watershed, Turkey, located in the Central Eastern Mediterranean Basin of Turkey. The area is approximately 2000 km2. The landscape is dominated by bare ground, agricultural land and forests. The average annual rainfall is 636.4 mm. This study is significant for evaluating different model performances in a complex Mediterranean basin. The results provided comprehensive insight, including the advantages and limitations of modelling approaches in the Mediterranean environment.

  12. Design and evaluation of a parametric model for cardiac sounds.

    Science.gov (United States)

    Ibarra-Hernández, Roilhi F; Alonso-Arévalo, Miguel A; Cruz-Gutiérrez, Alejandro; Licona-Chávez, Ana L; Villarreal-Reyes, Salvador

    2017-10-01

    Heart sound analysis plays an important role in the auscultative diagnosis process to detect the presence of cardiovascular diseases. In this paper we propose a novel parametric heart sound model that accurately represents normal and pathological cardiac audio signals, also known as phonocardiograms (PCG). The proposed model considers that the PCG signal is formed by the sum of two parts: one of them is deterministic and the other one is stochastic. The first part contains most of the acoustic energy. This part is modeled by the Matching Pursuit (MP) algorithm, which performs an analysis-synthesis procedure to represent the PCG signal as a linear combination of elementary waveforms. The second part, also called residual, is obtained after subtracting the deterministic signal from the original heart sound recording and can be accurately represented as an autoregressive process using the Linear Predictive Coding (LPC) technique. We evaluate the proposed heart sound model by performing subjective and objective tests using signals corresponding to different pathological cardiac sounds. The results of the objective evaluation show an average Percentage of Root-Mean-Square Difference of approximately 5% between the original heart sound and the reconstructed signal. For the subjective test we conducted a formal methodology for perceptual evaluation of audio quality with the assistance of medical experts. Statistical results of the subjective evaluation show that our model provides a highly accurate approximation of real heart sound signals. We are not aware of any previous heart sound model rigorously evaluated as our proposal. Copyright © 2017 Elsevier Ltd. All rights reserved.
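    The LPC step used above for the stochastic residual can be sketched with the autocorrelation method and the Levinson-Durbin recursion. This is the standard textbook formulation, not the authors' exact implementation, and the test signal is synthetic rather than a real PCG residual.

```python
# Hedged sketch of the LPC step for the stochastic residual: estimate AR
# coefficients from the autocorrelation sequence via the Levinson-Durbin
# recursion. The decaying test signal below is synthetic, not a PCG residual.

def autocorr(x, max_lag):
    n = len(x)
    return [sum(x[i] * x[i + k] for i in range(n - k))
            for k in range(max_lag + 1)]

def levinson_durbin(r, order):
    """AR coefficients a[1..order] for x[n] ~ sum_k a[k] * x[n-k]."""
    a = [0.0] * (order + 1)
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] - sum(a[j] * r[i - j] for j in range(1, i))
        k = acc / err                     # reflection coefficient
        new_a = a[:]
        new_a[i] = k
        for j in range(1, i):
            new_a[j] = a[j] - k * a[i - j]
        a = new_a
        err *= (1.0 - k * k)              # prediction-error update
    return a[1:], err

# A pure exponential decay is an AR(1) process with coefficient 0.9.
x = [0.9 ** n for n in range(200)]
coeffs, e = levinson_durbin(autocorr(x, 2), 1)
```

    In the paper's pipeline, the residual obtained after Matching Pursuit synthesis would take the place of `x`, and the fitted AR filter driven by white noise regenerates its stochastic texture.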

  13. A research and evaluation capacity building model in Western Australia.

    Science.gov (United States)

    Lobo, Roanna; Crawford, Gemma; Hallett, Jonathan; Laing, Sue; Mak, Donna B; Jancey, Jonine; Rowell, Sally; McCausland, Kahlia; Bastian, Lisa; Sorenson, Anne; Tilley, P J Matt; Yam, Simon; Comfort, Jude; Brennan, Sean; Doherty, Maryanne

    2016-12-27

    Evaluation of public health programs, services and policies is increasingly required to demonstrate effectiveness. Funding constraints necessitate that existing programs, services and policies be evaluated and their findings disseminated. Evidence-informed practice and policy is also desirable to maximise investments in public health. Partnerships between public health researchers, service providers and policymakers can help address evaluation knowledge and skills gaps. The Western Australian Sexual Health and Blood-borne Virus Applied Research and Evaluation Network (SiREN) aims to build research and evaluation capacity in the sexual health and blood-borne virus sector in Western Australia (WA). Partners' perspectives of the SiREN model after 2 years were explored. Qualitative written responses from service providers, policymakers and researchers about the SiREN model were analysed thematically. Service providers reported that participation in SiREN prompted them to consider evaluation earlier in the planning process and increased their appreciation of the value of evaluation. Policymakers noted benefits of the model in generating local evidence and highlighting local issues of importance for consideration at a national level. Researchers identified challenges communicating the services available through SiREN and the time investment needed to develop effective collaborative partnerships. Stronger engagement between public health researchers, service providers and policymakers through collaborative partnerships has the potential to improve evidence generation and evidence translation. These outcomes require long-term funding and commitment from all partners to develop and maintain partnerships. Ongoing monitoring and evaluation can ensure the partnership remains responsive to the needs of key stakeholders. The findings are applicable to many sectors. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email

  14. Evaluation of black carbon estimations in global aerosol models

    Directory of Open Access Journals (Sweden)

    Y. Zhao

    2009-11-01

    Full Text Available We evaluate black carbon (BC) model predictions from the AeroCom model intercomparison project by considering the diversity among year 2000 model simulations and comparing model predictions with available measurements. These model-measurement intercomparisons include BC surface and aircraft concentrations, aerosol absorption optical depth (AAOD) retrievals from AERONET and the Ozone Monitoring Instrument (OMI), and BC column estimations based on AERONET. In regions other than Asia, most models are biased high compared to surface concentration measurements. However, compared with (column) AAOD or BC burden retrievals, the models are generally biased low. The average ratio of model to retrieved AAOD is less than 0.7 in South American and 0.6 in African biomass burning regions; both of these regions lack surface concentration measurements. In Asia the average model to observed ratio is 0.7 for AAOD and 0.5 for BC surface concentrations. Compared with aircraft measurements over the Americas at latitudes between 0° and 50° N, the average model is a factor of 8 larger than observed, and most models exceed the measured BC standard deviation in the mid to upper troposphere. At higher latitudes the average model to aircraft BC ratio is 0.4 and models underestimate the observed BC loading in the lower and middle troposphere associated with springtime Arctic haze. Low model bias for AAOD but overestimation of surface and upper atmospheric BC concentrations at lower latitudes suggests that most models are underestimating BC absorption and should improve estimates for refractive index, particle size, and optical effects of BC coating. Retrieval uncertainties and/or differences with model diagnostic treatment may also contribute to the model-measurement disparity. Largest AeroCom model diversity occurred in northern Eurasia and the remote Arctic, regions influenced by anthropogenic sources. Changing emissions, aging, removal, or optical properties within a single model

  15. Obs4MIPS: Satellite Observations for Model Evaluation

    Science.gov (United States)

    Ferraro, R.; Waliser, D. E.; Gleckler, P. J.

    2017-12-01

    This poster will review the current status of the obs4MIPs project, whose purpose is to provide a limited collection of well-established and documented datasets for comparison with Earth system models (https://www.earthsystemcog.org/projects/obs4mips/). These datasets have been reformatted to correspond with the CMIP5 model output requirements, and include technical documentation specifically targeted for their use in model output evaluation. The project holdings now exceed 120 datasets with observations that directly correspond to CMIP5 model output variables, with new additions in response to the CMIP6 experiments. With the growth in climate model output data volume, it is increasingly difficult to bring the model output and the observations together to do evaluations. The positioning of the obs4MIPs datasets within the Earth System Grid Federation (ESGF) allows for the use of currently available and planned online tools within the ESGF to perform analysis using model output and observational datasets without necessarily downloading everything to a local workstation. This past year, obs4MIPs has updated its submission guidelines to closely align with changes in the CMIP6 experiments, and is implementing additional indicators and ancillary data to allow users to more easily determine the efficacy of an obs4MIPs dataset for specific evaluation purposes. This poster will present the new guidelines and indicators, and update the list of current obs4MIPs holdings and their connection to the ESGF evaluation and analysis tools currently available and being developed for the CMIP6 experiments.

  16. Information technology model for evaluating emergency medicine teaching

    Science.gov (United States)

    Vorbach, James; Ryan, James

    1996-02-01

    This paper describes work in progress to develop an Information Technology (IT) model and supporting information system for the evaluation of clinical teaching in the Emergency Medicine (EM) Department of North Shore University Hospital. In the academic hospital setting student physicians, i.e. residents, and faculty function daily in their dual roles as teachers and students respectively, and as health care providers. Databases exist that are used to evaluate both groups in either academic or clinical performance, but rarely has this information been integrated to analyze the relationship between academic performance and the ability to care for patients. The goal of the IT model is to improve the quality of teaching of EM physicians by enabling the development of integrable metrics for faculty and resident evaluation. The IT model will include (1) methods for tracking residents in order to develop experimental databases; (2) methods to integrate lecture evaluation, clinical performance, resident evaluation, and quality assurance databases; and (3) a patient flow system to monitor patient rooms and the waiting area in the Emergency Medicine Department, to record and display status of medical orders, and to collect data for analyses.

  17. Recursive Model Identification for the Evaluation of Baroreflex Sensitivity.

    Science.gov (United States)

    Le Rolle, Virginie; Beuchée, Alain; Praud, Jean-Paul; Samson, Nathalie; Pladys, Patrick; Hernández, Alfredo I

    2016-12-01

    A method for the recursive identification of physiological models of the cardiovascular baroreflex is proposed and applied to the time-varying analysis of vagal and sympathetic activities. The proposed method was evaluated with data from five newborn lambs, acquired during the injection of vasodilators and vasoconstrictors, and the results show a close match between experimental and simulated signals. The model-based estimates of vagal and sympathetic contributions were consistent with physiological knowledge, and the obtained estimators of vagal and sympathetic activities were compared to traditional markers associated with baroreflex sensitivity. High correlations were observed between traditional markers and model-based indices.
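
The record above describes recursive identification of time-varying model parameters from physiological signals. As an illustrative sketch only (not the authors' baroreflex model), a generic recursive least-squares (RLS) update with a forgetting factor — a standard tool for this kind of identification — can be written as follows; the variable names and the toy system are assumptions:

```python
import math

def rls_update(theta, P, x, y, lam=0.99):
    """One recursive least-squares step for y ~ theta . x, forgetting factor lam."""
    n = len(x)
    Px = [sum(P[i][j] * x[j] for j in range(n)) for i in range(n)]
    denom = lam + sum(x[i] * Px[i] for i in range(n))
    k = [v / denom for v in Px]                        # gain vector
    err = y - sum(theta[i] * x[i] for i in range(n))   # prediction error
    theta = [theta[i] + k[i] * err for i in range(n)]
    P = [[(P[i][j] - k[i] * Px[j]) / lam for j in range(n)] for i in range(n)]
    return theta, P

# Toy identification: recover y = 2*x1 - 1*x2 from noise-free samples.
theta, P = [0.0, 0.0], [[1000.0, 0.0], [0.0, 1000.0]]
for t in range(200):
    x = [math.sin(0.1 * t), math.cos(0.1 * t)]
    y = 2.0 * x[0] - 1.0 * x[1]
    theta, P = rls_update(theta, P, x, y)
```

With a forgetting factor below 1, older samples are down-weighted, which is what lets the estimator track parameters that drift over time (here the true parameters are constant, so the estimate simply converges).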

  18. The Applicability of Selected Evaluation Models to Evolving Investigative Designs.

    Science.gov (United States)

    Smith, Nick L.; Hauer, Diane M.

    1990-01-01

    Ten evaluation models are examined in terms of their applicability to investigative, emergent design programs: Stake's portrayal, Wolf's adversary, Patton's utilization, Guba's investigative journalism, Scriven's goal-free, Scriven's modus operandi, Eisner's connoisseurial, Stufflebeam's CIPP, Tyler's objective based, and Levin's cost…

  19. Evaluating the Predictive Value of Growth Prediction Models

    Science.gov (United States)

    Murphy, Daniel L.; Gaertner, Matthew N.

    2014-01-01

    This study evaluates four growth prediction models--projection, student growth percentile, trajectory, and transition table--commonly used to forecast (and give schools credit for) middle school students' future proficiency. Analyses focused on vertically scaled summative mathematics assessments, and two performance standards conditions (high…

  20. Evaluation of models for assessing groundwater vulnerability to ...

    African Journals Online (AJOL)

    This paper examines, based on a review and synthesis of available material, the presently most applied models for groundwater vulnerability assessment mapping. The approaches and the pros and cons of each method are evaluated in terms of both the conditions of their implementation and the result obtained. The paper ...

  1. A model for the evaluation of different production strategies for ...

    African Journals Online (AJOL)

    An interactive user-friendly computer package is being developed to assist planners and managers with the evaluation of different livestock production strategies in semi-arid regions. It comprises a hierarchy of simulation models that predict over time the effects of past and present rainfall, stocking rates, milking and sales ...

  2. A model of evaluating the pseudogap temperature for high ...

    Indian Academy of Sciences (India)

    We have presented a model of evaluating the pseudogap temperature for high temperature superconductors using paraconductivity approach. The theoretical analysis is based on the crossing point technique of the conductivity expressions. The pseudogap temperature T ∗ is found to depend on dimension and is ...

  3. Evaluation of a stratiform cloud parameterization for general circulation models

    Energy Technology Data Exchange (ETDEWEB)

    Ghan, S.J.; Leung, L.R. [Pacific Northwest National Lab., Richland, WA (United States); McCaa, J. [Univ. of Washington, Seattle, WA (United States)

    1996-04-01

    To evaluate the relative importance of horizontal advection of cloud versus cloud formation within the grid cell of a single column model (SCM), we have performed a series of simulations with our SCM driven by a fixed vertical velocity and various rates of horizontal advection.

  4. Further Evaluation of a Brief, Intensive Teacher-Training Model

    Science.gov (United States)

    Lerman, Dorothea C.; Tetreault, Allison; Hovanetz, Alyson; Strobel, Margaret; Garro, Joanie

    2008-01-01

    The purpose of this study was to further evaluate the outcomes of a model program that was designed to train current teachers of children with autism. Nine certified special education teachers participating in an intensive 5-day summer training program were taught a relatively large number of specific skills in two areas (preference assessment and…

  5. A model of evaluating the pseudogap temperature for high ...

    Indian Academy of Sciences (India)

    DOI: 10.1007/s12043-015-1088-3; ePublication: 30 September 2015. Abstract. We have presented a model of evaluating the pseudogap temperature for high- temperature superconductors using paraconductivity approach. The theoretical analysis is based on the crossing point technique of the conductivity expressions.

  6. Applying the social relations model to self and peer evaluations

    NARCIS (Netherlands)

    G.J. Greguras; C. Robie; M.Ph. Born (Marise)

    2001-01-01

    Peer evaluations of performance are increasingly being used to make organizational decisions and to provide individuals with performance related feedback. Using Kenny’s social relations model (SRM), data from 14 teams of undergraduate students who completed performance ratings of

  7. Frontier models for evaluating environmental efficiency: an overview

    NARCIS (Netherlands)

    Oude Lansink, A.G.J.M.; Wall, A.

    2014-01-01

    Our aim in this paper is to provide a succinct overview of frontier-based models used to evaluate environmental efficiency, with a special emphasis on agricultural activity. We begin by providing a brief, up-to-date review of the main approaches used to measure environmental efficiency, with

  8. Quantitative Comparison Between Crowd Models for Evacuation Planning and Evaluation

    NARCIS (Netherlands)

    Viswanathan, V.; Lee, C.E.; Lees, M.H.; Cheong, S.A.; Sloot, P.M.A.

    2014-01-01

    Crowd simulation is rapidly becoming a standard tool for evacuation planning and evaluation. However, the many crowd models in the literature are structurally different, and few have been rigorously calibrated against real-world egress data, especially in emergency situations. In this paper we

  9. An IPA-Embedded Model for Evaluating Creativity Curricula

    Science.gov (United States)

    Chang, Chi-Cheng

    2014-01-01

    How to diagnose the effectiveness of creativity-related curricula is a crucial concern in the pursuit of educational excellence. This paper introduces an importance-performance analysis (IPA)-embedded model for curriculum evaluation, using the example of an IT project implementation course to assess the creativity performance deduced from student…

  10. Evaluation of models generated via hybrid evolutionary algorithms ...

    African Journals Online (AJOL)

    2016-04-02

    Apr 2, 2016 ... Evaluation of models generated via hybrid evolutionary algorithms for the prediction of Microcystis ... evolutionary algorithms (HEA) proved to be highly applicable to the hypertrophic reservoirs of South Africa. ... discovered and optimised using a large-scale parallel computational device and relevant soft-

  11. Evaluation of Digital Model Accuracy and Time-dependent ...

    African Journals Online (AJOL)

    2017-10-26

    Oct 26, 2017 ... Objectives: The aim of this study was to evaluate the accuracy of digital models produced with the three-dimensional dental scanner, and to test the dimensional stability of alginate impressions for durations of immediately (T0), 1 day (T1), and 2 days (T2). Materials and Methods: A total of sixty impressions ...

  12. Port performance evaluation tool based on microsimulation model

    Directory of Open Access Journals (Sweden)

    Tsavalista Burhani Jzolanda

    2017-01-01

    Full Text Available As port performance becomes increasingly correlated with national competitiveness, the issue of port performance evaluation has gained significant attention. Port performance can simply be indicated by port service levels to the ship (e.g., throughput, waiting time for berthing, etc.), as well as the utilization level of equipment and facilities within a certain period. The performance evaluation then can be used as a tool to develop related policies for improving the port’s performance to be more effective and efficient. However, the evaluation is frequently conducted based on a deterministic approach, which hardly captures the natural variations of port parameters. Therefore, this paper presents a stochastic microsimulation model for investigating the impacts of port parameter variations on the port performance. The variations are derived from actual data in order to provide more realistic results. The model is further developed using MATLAB and Simulink based on queuing theory.
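
The abstract above grounds its microsimulation in queuing theory. A minimal sketch of that idea — a stochastic berth queue with exponential interarrival and service times, not the authors' MATLAB/Simulink model, and with invented parameters:

```python
import random

def simulate_port(n_ships, mean_interarrival, mean_service, n_berths, seed=0):
    """Minimal multi-berth port queue: returns mean waiting time per ship."""
    rng = random.Random(seed)
    berth_free_at = [0.0] * n_berths          # time each berth next becomes free
    t = 0.0
    total_wait = 0.0
    for _ in range(n_ships):
        t += rng.expovariate(1.0 / mean_interarrival)   # next ship arrives
        i = min(range(n_berths), key=lambda k: berth_free_at[k])
        start = max(t, berth_free_at[i])                # wait if all berths busy
        total_wait += start - t
        berth_free_at[i] = start + rng.expovariate(1.0 / mean_service)
    return total_wait / n_ships

# Fewer berths at the same traffic should raise the average waiting time.
wait_2_berths = simulate_port(5000, mean_interarrival=1.0, mean_service=1.5, n_berths=2)
wait_4_berths = simulate_port(5000, mean_interarrival=1.0, mean_service=1.5, n_berths=4)
```

Because the interarrival and service times are drawn from distributions rather than fixed, repeated runs with different seeds give a spread of performance indicators — the "natural variations" a deterministic calculation misses.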

  13. Integrated model for supplier selection and performance evaluation

    Directory of Open Access Journals (Sweden)

    Borges de Araújo, Maria Creuza

    2015-08-01

    Full Text Available This paper puts forward a model for selecting suppliers and evaluating the performance of those already working with a company. A simulation was conducted in a food industry. This sector has high significance in the economy of Brazil. The model enables the phases of selecting and evaluating suppliers to be integrated. This is important so that a company can have partnerships with suppliers who are able to meet their needs. Additionally, a group method is used to enable managers who will be affected by this decision to take part in the selection stage. Finally, the classes resulting from the performance evaluation are shown to support the contractor in choosing the most appropriate relationship with its suppliers.

  14. Uranium resources evaluation model as an exploration tool

    International Nuclear Information System (INIS)

    Ruzicka, V.

    1976-01-01

    Evaluation of uranium resources, as conducted by the Uranium Resources Evaluation Section of the Geological Survey of Canada, comprises operations analogous with those performed during the preparatory stages of uranium exploration. The uranium resources evaluation model, simulating the estimation process, can be divided into four steps. The first step includes definition of major areas and ''unit subdivisions'' for which geological data are gathered, coded, computerized and retrieved. Selection of these areas and ''unit subdivisions'' is based on a preliminary appraisal of their favourability for uranium mineralization. The second step includes analyses of the data, definition of factors controlling uranium mineralization, classification of uranium occurrences into genetic types, and final delineation of favourable areas; this step corresponds to the selection of targets for uranium exploration. The third step includes geological field work; it is equivalent to geological reconnaissance in exploration. The fourth step comprises computation of resources; the preliminary evaluation techniques in the exploration are, as a rule, analogous with the simplest methods employed in the resource evaluation. The uranium resources evaluation model can be conceptually applied for decision-making during exploration or for formulation of exploration strategy using the quantified data as weighting factors. (author)

  15. Models and Rules of Evaluation in International Accounting

    Directory of Open Access Journals (Sweden)

    Niculae Feleaga

    2006-04-01

    Full Text Available The accounting procedures cannot be analyzed without a previous evaluation. Value is in general a very subjective issue, usually the result of a monetary evaluation made of a specific asset, group of assets or entities, or of some rendered services. Within the economic sciences, value comes from its very own deep history. In accounting, the concept of value had a late and fragile start. The term value must not be misinterpreted as being the same thing as cost, even though value is frequently measured through costs. At the origin of the international accounting standards lies the framework for preparing, presenting and disclosing the financial statements. The framework stays as a reference matrix, as a standard of standards, as a constitution of financial accounting. According to the international framework, the financial statements use different evaluation bases: the historical cost, the current cost, the realisable (settlement) value, and the present value (the present value of cash flows). Choosing the evaluation basis and the capital maintenance concept will eventually determine the accounting evaluation model used in preparing the financial statements of a company. The multitude of accounting evaluation models differentiate themselves from one another through various degrees of relevance and reliability of accounting information and therefore, accountants (the preparers of financial statements) must try to balance these two main qualitative characteristics of financial information.

  16. Models and Rules of Evaluation in International Accounting

    Directory of Open Access Journals (Sweden)

    Liliana Feleaga

    2006-06-01

    Full Text Available The accounting procedures cannot be analyzed without a previous evaluation. Value is in general a very subjective issue, usually the result of a monetary evaluation made of a specific asset, group of assets or entities, or of some rendered services. Within the economic sciences, value comes from its very own deep history. In accounting, the concept of value had a late and fragile start. The term value must not be misinterpreted as being the same thing as cost, even though value is frequently measured through costs. At the origin of the international accounting standards lies the framework for preparing, presenting and disclosing the financial statements. The framework stays as a reference matrix, as a standard of standards, as a constitution of financial accounting. According to the international framework, the financial statements use different evaluation bases: the historical cost, the current cost, the realisable (settlement) value, and the present value (the present value of cash flows). Choosing the evaluation basis and the capital maintenance concept will eventually determine the accounting evaluation model used in preparing the financial statements of a company. The multitude of accounting evaluation models differentiate themselves from one another through various degrees of relevance and reliability of accounting information and therefore, accountants (the preparers of financial statements) must try to balance these two main qualitative characteristics of financial information.

  17. Ex-post evaluation of European energy models

    International Nuclear Information System (INIS)

    Pilavachi, P.A.; Dalamaga, Th.; Rossetti di Valdalbero, D.; Guilmot, J.-F.

    2008-01-01

    Various energy-modelling activities are pursued by public authorities, private companies and research institutes with the aim to provide energy forecasts and to assess the impact of energy and environmental policies. Nevertheless, no ex-post evaluations of the results of these modelling activities have been carried out at the European Community level. This paper investigates and compares the assumptions and the results from a European study carried out in the middle of the eighties with the combination of the so-called Modele de prospective de la demande energetique a long terme (MEDEE) and Energy flow optimization (EFOM) models with the targeted year of 2000 as presented in the ''ENERGY 2000'' study. Concretely, assumptions and forecasts are compared with real statistical data. In this way, an evaluation of quantitative tools and model results can be established. The aim of this paper is not to evaluate the quantitative tools themselves but their results and their policy relevance within a time frame of 15 years. (author)

  18. Motion Reliability Modeling and Evaluation for Manipulator Path Planning Task

    Directory of Open Access Journals (Sweden)

    Tong Li

    2015-01-01

    Full Text Available Motion reliability as a criterion can reflect the accuracy of a manipulator in completing operations. Since the path planning task plays a significant role in the operations of a manipulator, the motion reliability evaluation of the path planning task is discussed in this paper. First, a modeling method for motion reliability is proposed by taking factors related to the position accuracy of the manipulator into account. In the model, a multidimensional integral of the PDF is carried out to calculate motion reliability. Considering the complexity of the multidimensional integral, the approach of the equivalent extreme value is introduced, with which the multidimensional integral is converted into a one-dimensional integral for convenient calculation. Then a method based on the maximum entropy principle is proposed for the model calculation. With this method, the PDF can be obtained efficiently at the state of maximum entropy. As a result, the evaluation of motion reliability can be achieved by a one-dimensional integral of the PDF. Simulations on a particular path planning task are carried out, with which the feasibility and effectiveness of the proposed methods are verified. In addition, the modeling method, which takes the factors related to position accuracy into account, can represent the contributions of these factors to motion reliability. And the model calculation method can achieve motion reliability evaluation with high precision and efficiency.
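
The reduction described in this record — replacing a multidimensional reliability integral with a one-dimensional integral over the PDF of an extreme value — can be sketched roughly as follows. Here Monte Carlo sampling produces the extreme (maximum) path error, and an empirical normal fit stands in for the paper's maximum-entropy PDF; all error magnitudes and tolerances are invented:

```python
import math
import random
import statistics

def extreme_error_samples(n_samples, n_waypoints, sigma_joint, seed=1):
    """Monte Carlo samples of the maximum position error along a path
    (the 'equivalent extreme value' of the per-waypoint errors)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_samples):
        errs = [abs(rng.gauss(0.0, sigma_joint)) for _ in range(n_waypoints)]
        out.append(max(errs))              # one extreme value per trial
    return out

def reliability(samples, tolerance, n_grid=2000):
    """One-dimensional integral of a fitted normal PDF from 0 to the tolerance
    (midpoint rule); the fit is a crude stand-in for a maximum-entropy PDF."""
    mu, sd = statistics.mean(samples), statistics.stdev(samples)
    dz = tolerance / n_grid
    pdf = lambda z: math.exp(-0.5 * ((z - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))
    return sum(pdf((i + 0.5) * dz) for i in range(n_grid)) * dz

z = extreme_error_samples(20000, n_waypoints=10, sigma_joint=0.5)
r_loose = reliability(z, tolerance=3.0)   # generous tolerance -> high reliability
r_tight = reliability(z, tolerance=1.0)   # tight tolerance -> lower reliability
```

The key point mirrored from the abstract: once the extreme value has a one-dimensional PDF, reliability for any tolerance is a single integral rather than an integral over all joint-error dimensions.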

  19. n⁺ GaAs/AuGeNi-Au Thermocouple-Type RF MEMS Power Sensors Based on Dual Thermal Flow Paths in GaAs MMIC.

    Science.gov (United States)

    Zhang, Zhiqiang; Liao, Xiaoping

    2017-06-17

    To achieve radio frequency (RF) power detection, gain control, and circuit protection, this paper presents n⁺ GaAs/AuGeNi-Au thermocouple-type RF microelectromechanical system (MEMS) power sensors based on dual thermal flow paths. The sensors utilize a conversion principle of RF power-heat-voltage, where a thermovoltage is obtained as the RF power changes. To improve the heat transfer efficiency and the sensitivity, structures of two heat conduction paths are designed: one in which a thermal slug of Au is placed between two load resistors and hot junctions of the thermocouples, and one in which a back cavity is fabricated by the MEMS technology to form a substrate membrane underneath the resistors and the hot junctions. The improved sensors were fabricated by a GaAs monolithic microwave integrated circuit (MMIC) process. Experiments show that these sensors have reflection losses of less than -17 dB up to 12 GHz. At 1, 5, and 10 GHz, measured sensitivities are about 63.45, 53.97, and 44.14 µV/mW for the sensor with the thermal slug, and about 111.03, 94.79, and 79.04 µV/mW for the sensor with the thermal slug and the back cavity, respectively.

  20. n+ GaAs/AuGeNi-Au Thermocouple-Type RF MEMS Power Sensors Based on Dual Thermal Flow Paths in GaAs MMIC

    Directory of Open Access Journals (Sweden)

    Zhiqiang Zhang

    2017-06-01

    Full Text Available To achieve radio frequency (RF) power detection, gain control, and circuit protection, this paper presents n+ GaAs/AuGeNi-Au thermocouple-type RF microelectromechanical system (MEMS) power sensors based on dual thermal flow paths. The sensors utilize a conversion principle of RF power-heat-voltage, where a thermovoltage is obtained as the RF power changes. To improve the heat transfer efficiency and the sensitivity, structures of two heat conduction paths are designed: one in which a thermal slug of Au is placed between two load resistors and hot junctions of the thermocouples, and one in which a back cavity is fabricated by the MEMS technology to form a substrate membrane underneath the resistors and the hot junctions. The improved sensors were fabricated by a GaAs monolithic microwave integrated circuit (MMIC) process. Experiments show that these sensors have reflection losses of less than −17 dB up to 12 GHz. At 1, 5, and 10 GHz, measured sensitivities are about 63.45, 53.97, and 44.14 µV/mW for the sensor with the thermal slug, and about 111.03, 94.79, and 79.04 µV/mW for the sensor with the thermal slug and the back cavity, respectively.

  1. Distributed multi-criteria model evaluation and spatial association analysis

    Science.gov (United States)

    Scherer, Laura; Pfister, Stephan

    2015-04-01

    Model performance, if evaluated, is often communicated by a single indicator and at an aggregated level; however, this does not embrace the trade-offs between different indicators or the inherent spatial heterogeneity of model efficiency. In this study, we simulated the water balance of the Mississippi watershed using the Soil and Water Assessment Tool (SWAT). The model was calibrated against monthly river discharge at 131 measurement stations. Its time series were bisected to allow for subsequent validation at the same gauges. Furthermore, the model was validated against evapotranspiration, which was available as a continuous raster based on remote sensing. The model performance was evaluated for each of the 451 sub-watersheds using four different criteria: 1) Nash-Sutcliffe efficiency (NSE), 2) percent bias (PBIAS), 3) root mean square error (RMSE) normalized to standard deviation (RSR), as well as 4) a combined indicator of the squared correlation coefficient and the linear regression slope (bR2). Conditions that might lead to poor model performance include aridity, very flat or steep relief, snowfall and dams, as indicated by previous research. In an attempt to explain spatial differences in model efficiency, the goodness of the model was spatially compared to these four phenomena by means of a bivariate spatial association measure which combines Pearson's correlation coefficient and Moran's index for spatial autocorrelation. In order to assess the model performance of the Mississippi watershed as a whole, three different averages of the sub-watershed results were computed by 1) applying equal weights, 2) weighting by the mean observed river discharge, 3) weighting by the upstream catchment area and the square root of the time series length. Ratings of model performance differed significantly in space and according to the efficiency criterion. 
The model performed much better in the humid Eastern region than in the arid Western region which was confirmed by the
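
The four efficiency criteria named in this record have standard closed forms in hydrology; a small sketch with toy discharge values (not the study's data):

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, below 0 is worse than the obs mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def pbias(obs, sim):
    """Percent bias: 0 is perfect; by the usual convention, positive values
    indicate model underestimation."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)

def rsr(obs, sim):
    """RMSE normalized by the standard deviation of the observations."""
    mean_obs = sum(obs) / len(obs)
    rmse = math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))
    sd = math.sqrt(sum((o - mean_obs) ** 2 for o in obs) / len(obs))
    return rmse / sd

obs = [3.0, 5.0, 4.0, 6.0, 8.0, 7.0]   # toy monthly discharge, observed
sim = [2.8, 5.2, 4.1, 5.7, 8.4, 6.9]   # toy monthly discharge, simulated
```

Note that two of the criteria are algebraically linked: NSE = 1 − RSR², so they rank sub-watersheds identically and the real trade-offs are between these squared-error measures, the bias measure, and correlation-based indicators such as bR2.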

  2. Evaluation of Generation Alternation Models in Evolutionary Robotics

    Science.gov (United States)

    Oiso, Masashi; Matsumura, Yoshiyuki; Yasuda, Toshiyuki; Ohkura, Kazuhiro

    For efficient implementation of Evolutionary Algorithms (EA) in a desktop grid computing environment, we propose a new generation alternation model called Grid-Oriented-Deletion (GOD), based on a comparison with conventional techniques. In previous research, generation alternation models have generally been evaluated by using test functions. However, their exploration performance on real problems such as Evolutionary Robotics (ER) has not yet been made clear. Therefore, we investigate the relationship between the exploration performance of an EA on an ER problem and its generation alternation model. We applied four generation alternation models to Evolutionary Multi-Robotics (EMR), a package-pushing problem, to investigate their exploration performance. The results show that GOD is more effective than the other conventional models.

  3. Evaluation model and experimental validation of tritium in agricultural plant

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Hee Suk; Keum, Dong Kwon; Lee, Han Soo; Jun, In; Choi, Yong Ho; Lee, Chang Woo [KAERI, Daejon (Korea, Republic of)

    2005-12-15

    This paper describes a compartment dynamic model for evaluating the contamination level of tritium in agricultural plants exposed to accidentally released tritium. The present model uses a time-dependent growth equation of the plant so that it can predict the effect of the growth stage of the plant during the exposure time. The model, including atmosphere, soil and plant compartments, is described by a set of nonlinear ordinary differential equations, and is able to predict time-dependent concentrations of tritium in the compartments. To validate the model, a series of exposure experiments of HTO vapor on Chinese cabbage and radish was carried out at different growth stages of each plant. At the end of exposure, the tissue free water tritium (TFWT) and the organically bound tritium (OBT) were measured. The measured concentrations agreed well with model predictions.
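
The compartment equations in this record are coupled nonlinear ODEs; their flavor can be illustrated with a single linear air-to-plant compartment integrated by forward Euler. The transfer coefficients and concentrations below are invented for illustration, not taken from the paper:

```python
def simulate_tfwt(c_air, k_in, k_out, hours, dt=0.01):
    """Forward-Euler integration of dC/dt = k_in * c_air - k_out * C:
    uptake from a constant air concentration versus first-order loss."""
    c = 0.0
    for _ in range(int(hours / dt)):
        c += dt * (k_in * c_air - k_out * c)
    return c

# During a constant-concentration exposure the compartment approaches the
# steady state (k_in / k_out) * c_air; here that is 100.0.
c_end = simulate_tfwt(c_air=100.0, k_in=0.5, k_out=0.5, hours=24.0)
```

A full model like the one described would add soil and organically bound compartments and make the coefficients depend on the plant growth equation, which is what makes the system nonlinear and the exposure timing matter.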

  4. Evaluating Climate Models: Should We Use Weather or Climate Observations?

    Science.gov (United States)

    Oglesby, R. J.; Rowe, C. M.; Maasch, K. A.; Erickson, D. J.; Hays, C.

    2009-12-01

    Calling the numerical models that we use for simulations of climate change 'climate models' is a bit of a misnomer. These 'general circulation models' (GCMs, AKA global climate models) and their cousins the 'regional climate models' (RCMs) are actually physically-based weather simulators. That is, these models simulate, either globally or locally, daily weather patterns in response to some change in forcing or boundary condition. These simulated weather patterns are then aggregated into climate statistics, very much as we aggregate observations into 'real climate statistics'. Traditionally, the output of GCMs has been evaluated using climate statistics, as opposed to their ability to simulate realistic daily weather observations. At the coarse global scale this may be a reasonable approach; however, as RCMs downscale to increasingly higher resolutions, the conjunction between weather and climate becomes more problematic. We present results from a series of present-day climate simulations using the WRF ARW for domains that cover North America, much of Latin America, and South Asia. The basic domains are at a 12 km resolution, but several inner domains at 4 km have also been simulated. These include regions of complex topography in Mexico, Colombia, Peru, and Sri Lanka, as well as a region of low topography and fairly homogeneous land surface type (the U.S. Great Plains). Model evaluations are performed using standard climate analyses (e.g., reanalyses; NCDC data) but also using time series of daily station observations. Preliminary results suggest little difference in the assessment of long-term mean quantities, but the variability on seasonal and interannual timescales is better described. Furthermore, the value added by using daily weather observations as an evaluation tool increases with the model resolution.

  5. AN INTEGRATED FUZZY AHP AND TOPSIS MODEL FOR SUPPLIER EVALUATION

    Directory of Open Access Journals (Sweden)

    Željko Stević

    2016-05-01

    Full Text Available In today’s modern supply chains, the adequate choice of suppliers has strategic meaning for a company’s entire business. The aim of this paper is to evaluate different suppliers using an integrated model that combines fuzzy AHP (Analytical Hierarchy Process) and the TOPSIS method. An expert team was formed to compare six criteria, and their significance is determined with the fuzzy AHP method. The expert team also compares the suppliers according to each criterion, on the basis of triangular fuzzy numbers. Based on their inputs, the TOPSIS method is used to evaluate the potential solutions. The suggested model achieves certain advantages in comparison with the traditional models previously used to make decisions about the evaluation and choice of suppliers.
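
The TOPSIS step of the integrated model can be sketched on its own; the fuzzy-AHP weighting is replaced here by fixed weights, and the supplier scores are invented:

```python
import math

def topsis(matrix, weights, benefit):
    """Classic TOPSIS: rank alternatives by relative closeness to the ideal
    solution. benefit[j] is True for 'larger is better' criteria."""
    m, n = len(matrix), len(matrix[0])
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(v[i][j] for i in range(m)) if benefit[j]
             else min(v[i][j] for i in range(m)) for j in range(n)]
    anti = [min(v[i][j] for i in range(m)) if benefit[j]
            else max(v[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_pos = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_neg = math.sqrt(sum((v[i][j] - anti[j]) ** 2 for j in range(n)))
        scores.append(d_neg / (d_pos + d_neg))   # closeness coefficient in [0, 1]
    return scores

# Three hypothetical suppliers scored on quality (benefit) and cost (non-benefit).
scores = topsis([[7.0, 300.0], [9.0, 400.0], [5.0, 250.0]],
                weights=[0.6, 0.4], benefit=[True, False])
best = max(range(3), key=lambda i: scores[i])
```

In the paper's full model the crisp weights above would instead come from the fuzzy AHP pairwise comparisons, and the performance ratings from triangular fuzzy numbers.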

  6. Underwater Acoustic Communication Quality Evaluation Model Based on USV

    Directory of Open Access Journals (Sweden)

    Zhichao Lv

    2018-01-01

    Full Text Available The unmanned surface vehicle (USV) integrated with acoustic modems has advantages such as easy integration, rapid deployment, and low cost, which makes it a novel selectable node in the underwater acoustic (UWA) communication network as well as an underwater or overwater communication relay. However, it is difficult to ensure the communication quality among the nodes of the network due to the random underwater acoustic channel, the severe marine environment, and the complex mobile node system. To model the communication characteristics of the USV, the multipath effect and the Doppler effect are the main concerns for UWA communication in this paper; the ray beam method is utilized, the channel transmission function and the channel gain are obtained, and the mobile communication quality evaluation model is built. The simulation and lake experiments verify that the mobile UWA communication quality evaluation model built on the USV can provide a reference and technical support for USV applications.

  7. Lifetime-Aware Cloud Data Centers: Models and Performance Evaluation

    Directory of Open Access Journals (Sweden)

    Luca Chiaraviglio

    2016-06-01

    Full Text Available We present a model to evaluate the server lifetime in cloud data centers (DCs). In particular, when the server power level is decreased, the failure rate tends to be reduced as a consequence of the limited number of components powered on. However, the variation between the different power states triggers a failure rate increase. We therefore consider these two effects in a server lifetime model, subject to an energy-aware management policy. We then evaluate our model in a realistic case study. Our results show that the impact on the server lifetime is far from negligible. As a consequence, we argue that a lifetime-aware approach should be pursued to decide how and when to apply a power state change to a server.
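
The trade-off described above — a lower failure rate at reduced power versus extra wear from each power-state change — can be put into a toy expected-lifetime formula. Every rate below is an assumption for illustration, not the paper's model:

```python
def expected_lifetime_years(hours_low, transitions_per_day,
                            rate_full=0.20, rate_low=0.12, per_transition=1e-4):
    """Illustrative failure-rate mix: hours spent in the low-power state reduce
    the base annual failure rate, while every power-state change adds a small
    wear-out penalty. Returns expected lifetime as the inverse total rate."""
    frac_low = hours_low / 24.0
    rate = rate_full * (1.0 - frac_low) + rate_low * frac_low   # failures/year
    rate += per_transition * transitions_per_day * 365.0        # switching wear
    return 1.0 / rate

# Same low-power budget, but frequent switching erodes the lifetime gain.
steady = expected_lifetime_years(hours_low=8.0, transitions_per_day=2)
churny = expected_lifetime_years(hours_low=8.0, transitions_per_day=200)
```

This is the qualitative point of the record: an energy-aware policy that toggles power states too aggressively can shorten server lifetime even while reducing the time spent at full power.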

  8. Evaluation of a Postdischarge Call System Using the Logic Model.

    Science.gov (United States)

    Frye, Timothy C; Poe, Terri L; Wilson, Marisa L; Milligan, Gary

    2018-02-01

    This mixed-method study was conducted to evaluate a postdischarge call program for congestive heart failure patients at a major teaching hospital in the southeastern United States. The program was implemented based on the premise that it would improve patient outcomes and overall quality of life, but it had never been evaluated for effectiveness. The Logic Model was used to evaluate the input of key staff members to determine whether the outputs and results of the program matched the expectations of the organization. Interviews, online surveys, reviews of existing patient outcome data, and reviews of publicly available program marketing materials were used to ascertain current program output. After analyzing both qualitative and quantitative data from the evaluation, recommendations were made to the organization to improve the effectiveness of the program.

  9. Evaluation of Stratospheric Transport in New 3D Models Using the Global Modeling Initiative Grading Criteria

    Science.gov (United States)

    Strahan, Susan E.; Douglass, Anne R.; Einaudi, Franco (Technical Monitor)

    2001-01-01

The Global Modeling Initiative (GMI) Team developed objective criteria for model evaluation in order to identify the best representation of the stratosphere. This work created a method to quantitatively and objectively discriminate between different models. In the original GMI study, 3 different meteorological data sets were used to run an offline chemistry and transport model (CTM). Observationally-based grading criteria were derived and applied to these simulations, various aspects of stratospheric transport were evaluated, and grades were assigned. Here we report on the application of the GMI evaluation criteria to CTM simulations integrated with a new assimilated wind data set and a new general circulation model (GCM) wind data set. The Finite Volume Community Climate Model (FV-CCM) is a new GCM developed at Goddard which uses the NCAR CCM physics and the Lin and Rood advection scheme. The FV-Data Assimilation System (FV-DAS) is a new data assimilation system which uses the FV-CCM as its core model. One-year CTM simulations at 2.5 degrees longitude by 2 degrees latitude resolution were run for each wind data set. We present the evaluation of temperature and annual transport cycles in the lower and middle stratosphere in the two new CTM simulations, including an evaluation of high-latitude transport which was not part of the original GMI criteria. Grades for the new simulations will be compared with those assigned during the original GMI evaluations, and areas of improvement will be identified.

  10. A Spectral Evaluation of Models Performances in Mediterranean Oak Woodlands

    Science.gov (United States)

    Vargas, R.; Baldocchi, D. D.; Abramowitz, G.; Carrara, A.; Correia, A.; Kobayashi, H.; Papale, D.; Pearson, D.; Pereira, J.; Piao, S.; Rambal, S.; Sonnentag, O.

    2009-12-01

Ecosystem processes are influenced by climatic trends at multiple temporal scales, including diel patterns and other mid-term climatic modes such as interannual and seasonal variability. Because interactions between the biophysical components of ecosystem processes are complex, it is important to test how models perform in the frequency domain (e.g. hours, days, weeks, months, years) and the time domain (i.e. day of the year) in addition to traditional tests of annual or monthly sums. Here we present a spectral evaluation, using wavelet time series analysis, of model performance in seven Mediterranean oak woodlands that encompass three deciduous and four evergreen sites. We tested the performance of five models (CABLE, ORCHIDEE, BEPS, Biome-BGC, and JULES) on measured variables of gross primary production (GPP) and evapotranspiration (ET). In general, model performance fails at intermediate periods (e.g. weeks to months), likely because these models do not represent the water pulse dynamics that influence GPP and ET in these Mediterranean systems. To improve the performance of a model, it is critical to first identify where and when the model fails: only by identifying where a model fails can we improve its performance, use models as prognostic tools, and generate further hypotheses that can be tested by new experiments and measurements.

  11. Evaluation of model quality predictions in CASP9

    KAUST Repository

    Kryshtafovych, Andriy

    2011-01-01

CASP has been assessing the state of the art in the a priori estimation of accuracy of protein structure prediction since 2006. The inclusion of the model quality assessment category in CASP contributed to a rapid development of methods in this area. In the last experiment, 46 quality assessment groups tested their approaches to estimate the accuracy of protein models as a whole and/or on a per-residue basis. We assessed the performance of these methods predominantly on the basis of the correlation between the predicted and observed quality of the models on both global and local scales. The ability of the methods to identify the models closest to the best one, to differentiate between good and bad models, and to identify well modeled regions was also analyzed. Our evaluations demonstrate that even though global quality assessment methods seem to be approaching perfection (weighted average per-target Pearson's correlation coefficients are as high as 0.97 for the best groups), there is still room for improvement. First, all top-performing methods use consensus approaches to generate quality estimates, and this strategy has its own limitations. Second, the methods that are based on the analysis of individual models lag far behind clustering techniques and need a boost in performance. The methods for estimating per-residue accuracy of models are less accurate than global quality assessment methods, with an average weighted per-model correlation coefficient in the range of 0.63-0.72 for the best 10 groups.
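The headline metric above, a weighted average per-target Pearson correlation between predicted and observed model quality, can be sketched as follows; weighting each target by its number of models is an assumption made for illustration.

```python
import math

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def weighted_per_target_pearson(targets):
    """targets: list of (predicted, observed) quality-score lists, one entry
    per target; each target's correlation is weighted by its model count."""
    total = sum(len(pred) for pred, _obs in targets)
    return sum(len(pred) * pearson(pred, obs) for pred, obs in targets) / total
```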

  12. Knowledge management: Postgraduate Alternative Evaluation Model (MAPA in Brazil

    Directory of Open Access Journals (Sweden)

    Deisy Cristina Corrêa Igarashi

    2013-07-01

Full Text Available The Brazilian stricto sensu postgraduate programs that include master's and/or doctoral courses are evaluated by the Coordination for the Improvement of Higher Education Personnel (CAPES - Coordenação de Aperfeiçoamento de Pessoal de Nível Superior). The evaluation method used by CAPES is recognized in national and international contexts. However, several elements of the evaluation method can be improved, for example: considering the diversity, heterogeneity and specificities of programs; reducing subjectivity; and explaining how indicators are grouped into different dimensions to generate a final result, which is the scoring level reached by a program. This study aims to analyze the evaluation process used by CAPES, presenting the questions, difficulties and objections raised by researchers. From this analysis, the study proposes an alternative evaluation model for postgraduate programs (MAPA - Modelo de Avaliação para Pós-graduação Alternativo), which incorporates fuzzy logic into the analysis of results to minimize the limitations identified. The MAPA was applied to three postgraduate programs, allowing: (1) better understanding of the procedures used for the evaluation, (2) identification of elements that need regulation, (3) characterization of the indicators that generate local evaluation, and (4) support for medium- and long-term planning.

  13. Evaluation of Deep Learning Models for Predicting CO2 Flux

    Science.gov (United States)

    Halem, M.; Nguyen, P.; Frankel, D.

    2017-12-01

Artificial neural networks have been employed to calculate surface flux measurements from station data because they can fit highly nonlinear relations between input and output variables without knowing the detailed relationships between them. However, the accuracy of neural-net estimates of CO2 flux from observations of CO2 and other atmospheric variables is influenced by the architecture of the neural model, data availability, and the complexity of interactions between physical variables such as wind and temperature and indirect variables such as latent heat and sensible heat. We evaluate two deep learning models, a feed-forward and a recurrent neural network, to learn how each responds to the physical measurements and to the time dependency of the measured CO2 concentration, humidity, pressure, temperature, wind speed, etc., when predicting CO2 flux. In this paper, we focus on a) building neural network models for estimating CO2 flux based on DOE data from tower Atmospheric Radiation Measurement data; b) evaluating the impact of the choice of surface variables and model hyper-parameters on the accuracy of surface flux predictions; c) assessing the applicability of the neural network models to estimating CO2 flux using OCO-2 satellite data; d) studying the efficiency of GPU acceleration of neural network training using IBM PowerAI deep learning software and packages on the IBM Minsky system.

  14. Evaluation of the C* Model for Addressing Short Fatigue Crack Growth

    National Research Council Canada - National Science Library

    Walker, K. F; Hu, W

    2008-01-01

    .... This report evaluates the C* model using experimental data from the open literature. For comparison, two other models, the El Haddad model and the FASTRAN model, were also evaluated for their capability in dealing with the same problem...

  15. A Category Based Threat Evaluation Model Using Platform Kinematics Data

    Directory of Open Access Journals (Sweden)

    Mustafa Çöçelli

    2017-08-01

Full Text Available Command and control (C2) systems direct operators toward accurate decisions as early as possible in the stressful atmosphere of the battlefield. Powerful tools fuse various pieces of instantaneous information and present a summary of them to operators. Threat evaluation is one of the important fusion methods providing this assistance to military personnel. However, C2 systems can be deprived of valuable data sources due to the absence of capable equipment, which has an unfavorable influence on the quality of the tactical picture in front of C2 operators. In this paper, we study a threat evaluation model that takes these deficiencies into account. Our method extracts the threat level of various targets mostly from their kinematics in two-dimensional space, while classification of the entities around the battlefield is unavailable: only the category of targets is determined as a result of sensor processing, that is, whether entities belong to the air or surface environment. The threat evaluation model thus consists of three fundamental steps that run separately on entities belonging to different environments: extraction of threat assessment cues, threat selection based on Bayesian inference, and calculation of the threat assessment rating. We have evaluated the performance of the proposed model by simulating a set of synthetic scenarios.
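The Bayesian-inference step of the record above might look like the following sketch; the threat levels, prior, and cue likelihoods are invented for illustration and are not the paper's values.

```python
def bayes_threat_posterior(prior, likelihoods):
    """Posterior over threat levels: P(level | cues) is proportional to
    P(cues | level) * P(level), normalized over all levels."""
    unnorm = {lvl: prior[lvl] * likelihoods[lvl] for lvl in prior}
    z = sum(unnorm.values())
    return {lvl: u / z for lvl, u in unnorm.items()}

# Invented kinematic cues for an inbound air-category target:
# high closing speed and a small closest point of approach.
prior = {"low": 0.6, "medium": 0.3, "high": 0.1}
likelihoods = {"low": 0.05, "medium": 0.25, "high": 0.90}  # assumed P(cues|level)
posterior = bayes_threat_posterior(prior, likelihoods)
rating = max(posterior, key=posterior.get)
```

Even with a prior favoring "low", cues that are far more likely under "high" pull the posterior, and hence the threat rating, upward.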

  16. A model for curriculum development and student evaluation

    Directory of Open Access Journals (Sweden)

    Šefer Jasmina P.

    2002-01-01

Full Text Available The paper outlines theoretical foundations for investigations to be conducted in our education system, based on previous USA (DISCOVERY) and Yugoslav (CREATIVITY) projects that dealt with developing, investigating and evaluating (a) abilities of creative problem solving within seven types of intelligence after the Gardner model and (b) a curriculum that provides for and encourages the development of those abilities. Divergent thinking and creativity in all spheres of intellectual behavior in teaching are encouraged by introducing open-type questions, play, exploratory activities and a multimedia integrative-interdisciplinary thematic approach to problem solving. Multiple intelligences and a dimensional model of problem solving present the theoretical foundations for curriculum development and a new qualitative approach to the process evaluation of students' various abilities. Investigations should make provision for comparing the results obtained in various cultures and for integrating the best solutions into a common whole. Comparing the results across cultures and testing theoretical models and instruments for the evaluation of students are outcomes essential to the science of pedagogy. Curriculum development oriented to problem solving and divergent thinking in different areas of intellectual functioning, together with enrichment of the choice of instruments for multiple process evaluation of students, can also contribute significantly to the current reform of the Yugoslav school, the development of student abilities, and teacher education and in-service training.

  17. Evaluation of Model Recognition for Grammar-Based Automatic 3d Building Model Reconstruction

    Science.gov (United States)

    Yu, Qian; Helmholz, Petra; Belton, David

    2016-06-01

In recent years, 3D city models have been in high demand by many public and private organisations, and steady growth in both their quality and quantity is increasing that demand further. The quality evaluation of these 3D models is a relevant issue from both the scientific and practical points of view. In this paper, we present a method for the quality evaluation of 3D building models which are reconstructed automatically from terrestrial laser scanning (TLS) data based on an attributed building grammar. The entire evaluation process is performed in all three dimensions in terms of completeness and correctness of the reconstruction. Six quality measures are introduced and applied to four datasets of reconstructed building models in order to describe the quality of the automatic reconstruction, and the measures themselves are assessed for validity from the evaluation point of view.

  18. Boussinesq Modeling of Wave Propagation and Runup over Fringing Coral Reefs, Model Evaluation Report

    National Research Council Canada - National Science Library

    Demirbilek, Zeki; Nwogu, Okey G

    2007-01-01

    This report describes evaluation of a two-dimensional Boussinesq-type wave model, BOUSS-2D, with data obtained from two laboratory experiments and two field studies at the islands of Guam and Hawaii...

  19. Modeling Tools for Drilling, Reservoir Navigation, and Formation Evaluation

    Directory of Open Access Journals (Sweden)

    Sushant Dutta

    2012-06-01

Full Text Available The oil and gas industry routinely uses borehole tools for measuring or logging rock and fluid properties of geologic formations to locate hydrocarbons and maximize their production. Pore fluids in formations of interest are usually hydrocarbons or water. Resistivity logging is based on the fact that oil and gas have a substantially higher resistivity than water. The first resistivity log was acquired in 1927, and resistivity logging is still the foremost measurement used for drilling and evaluation. However, the acquisition and interpretation of resistivity logging data have grown in complexity over the years. Resistivity logging tools operate in a wide range of frequencies (from DC to GHz) and encounter extremely high (several orders of magnitude) conductivity contrast between the metal mandrel of the tool and the geologic formation. Typical challenges include arbitrary angles of tool inclination, full tensor electric and magnetic field measurements, and interpretation of complicated anisotropic formation properties. These challenges combine to form some of the most intractable computational electromagnetic problems in the world. Reliable, fast, and convenient numerical modeling of logging tool responses is critical for tool design, sensor optimization, virtual prototyping, and log data inversion. This spectrum of applications necessitates both depth and breadth of modeling software—from blazing fast one-dimensional (1-D) modeling codes to advanced three-dimensional (3-D) modeling software, and from in-house developed codes to commercial modeling packages. In this paper, with the help of several examples, we demonstrate our approach of using different modeling software to address different drilling and evaluation applications. In one example, fast 1-D modeling provides proactive geosteering information from a deep-reading azimuthal propagation resistivity measurement. In the second example, a 3-D model with multiple vertical resistive fractures

  20. Postural effects on intracranial pressure: modeling and clinical evaluation.

    Science.gov (United States)

    Qvarlander, Sara; Sundström, Nina; Malm, Jan; Eklund, Anders

    2013-11-01

    The physiological effect of posture on intracranial pressure (ICP) is not well described. This study defined and evaluated three mathematical models describing the postural effects on ICP, designed to predict ICP at different head-up tilt angles from the supine ICP value. Model I was based on a hydrostatic indifference point for the cerebrospinal fluid (CSF) system, i.e., the existence of a point in the system where pressure is independent of body position. Models II and III were based on Davson's equation for CSF absorption, which relates ICP to venous pressure, and postulated that gravitational effects within the venous system are transferred to the CSF system. Model II assumed a fully communicating venous system, and model III assumed that collapse of the jugular veins at higher tilt angles creates two separate hydrostatic compartments. Evaluation of the models was based on ICP measurements at seven tilt angles (0-71°) in 27 normal pressure hydrocephalus patients. ICP decreased with tilt angle (ANOVA: P < 0.01). The reduction was well predicted by model III (ANOVA lack-of-fit: P = 0.65), which showed excellent fit against measured ICP. Neither model I nor II adequately described the reduction in ICP (ANOVA lack-of-fit: P < 0.01). Postural changes in ICP could not be predicted based on the currently accepted theory of a hydrostatic indifference point for the CSF system, but a new model combining Davson's equation for CSF absorption and hydrostatic gradients in a collapsible venous system performed well and can be useful in future research on gravity and CSF physiology.
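Model III's piecewise idea (a fully communicating venous column below the jugular collapse angle, a shorter transmitted column above it) can be illustrated as below; the distances, collapse angle, and hydrostatic conversion factor are assumed values for illustration, not the paper's fitted parameters.

```python
import math

MMHG_PER_CM = 0.078  # approx. hydrostatic conversion for body fluids (assumed)

def icp_tilt(icp_supine, angle_deg, d_full_cm=35.0, d_upper_cm=15.0,
             collapse_deg=20.0):
    """Illustrative two-compartment prediction in the spirit of model III:
    below the collapse angle the full venous column acts on the CSF system;
    above it the jugular veins collapse, freezing the lower column's
    contribution at the collapse angle while only the upper column tilts."""
    a = math.radians(angle_deg)
    c = math.radians(collapse_deg)
    if angle_deg <= collapse_deg:
        return icp_supine - MMHG_PER_CM * d_full_cm * math.sin(a)
    return (icp_supine
            - MMHG_PER_CM * d_upper_cm * math.sin(a)
            - MMHG_PER_CM * (d_full_cm - d_upper_cm) * math.sin(c))
```

The prediction is continuous at the collapse angle and decreases more slowly above it, qualitatively matching the idea that venous collapse splits the system into two hydrostatic compartments.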

  1. Evaluation of Medical Education virtual Program: P3 model.

    Science.gov (United States)

    Rezaee, Rita; Shokrpour, Nasrin; Boroumand, Maryam

    2016-10-01

In e-learning, people get involved in a process and create content (a product) and make it available for virtual learners. The present study was carried out in order to evaluate the first virtual master's program in medical education at Shiraz University of Medical Sciences according to the P3 Model. This is an evaluation research study with a post-test single-group design used to determine how effective the program was. All 60 students who had participated in this virtual program for more than one year, and 21 experts including teachers and directors, took part in the evaluation project. Based on the P3 e-learning model, an evaluation tool with a 5-point Likert rating scale was designed and applied to collect the descriptive data. Students reported storyboard and course design as the most desirable element of the learning environment (2.30±0.76), but declared technical support the least desirable part (1.17±1.23). The existence of such a framework, used within the format of appropriate tools for the evaluation of e-learning in universities and higher education institutes that present e-learning curricula in the country, may contribute to the efficient implementation of present and future e-learning curricula.

  2. Modeling and evaluating user behavior in exploratory visual analysis

    Energy Technology Data Exchange (ETDEWEB)

    Reda, Khairi; Johnson, Andrew E.; Papka, Michael E.; Leigh, Jason

    2016-07-25

    Empirical evaluation methods for visualizations have traditionally focused on assessing the outcome of the visual analytic process as opposed to characterizing how that process unfolds. There are only a handful of methods that can be used to systematically study how people use visualizations, making it difficult for researchers to capture and characterize the subtlety of cognitive and interaction behaviors users exhibit during visual analysis. To validate and improve visualization design, however, it is important for researchers to be able to assess and understand how users interact with visualization systems under realistic scenarios. This paper presents a methodology for modeling and evaluating the behavior of users in exploratory visual analysis. We model visual exploration using a Markov chain process comprising transitions between mental, interaction, and computational states. These states and the transitions between them can be deduced from a variety of sources, including verbal transcripts, videos and audio recordings, and log files. This model enables the evaluator to characterize the cognitive and computational processes that are essential to insight acquisition in exploratory visual analysis, and reconstruct the dynamics of interaction between the user and the visualization system. We illustrate this model with two exemplar user studies, and demonstrate the qualitative and quantitative analytical tools it affords.
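The Markov chain of analyst states described above can be estimated from a coded event sequence; the state names and the toy session below are hypothetical.

```python
from collections import Counter, defaultdict

def transition_matrix(states):
    """Estimate first-order Markov transition probabilities from a coded
    sequence of analyst states (deduced from logs, video, or transcripts)."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(states, states[1:]):
        counts[cur][nxt] += 1
    probs = {}
    for state, row in counts.items():
        total = sum(row.values())
        probs[state] = {nxt: c / total for nxt, c in row.items()}
    return probs

# Hypothetical coded session of mental/interaction/computational states
session = ["explore", "explore", "insight", "explore", "compute",
           "explore", "insight"]
P = transition_matrix(session)
```

Each row of `P` is a probability distribution over next states, so the evaluator can compare, for example, how often exploration leads directly to insight across conditions or systems.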

  3. Evaluating fugacity models for trace components in landfill gas

    International Nuclear Information System (INIS)

    Shafi, Sophie; Sweetman, Andrew; Hough, Rupert L.; Smith, Richard; Rosevear, Alan; Pollard, Simon J.T.

    2006-01-01

A fugacity approach was evaluated to reconcile loadings of vinyl chloride (chloroethene), benzene, 1,3-butadiene and trichloroethylene in waste with concentrations observed in landfill gas monitoring studies. An evaluative environment derived from fictitious but realistic properties such as volume, composition, and temperature, constructed with data from the Brogborough landfill (UK) test cells, was used to test a fugacity approach to generating the source term for use in landfill gas risk assessment models (e.g. GasSim). SOILVE, a dynamic Level II model adapted here for landfills, showed greatest utility for benzene and 1,3-butadiene, modelled under anaerobic conditions over a 10 year simulation. Modelled concentrations of these components (95,300 μg m-3; 43 μg m-3) fell within measured ranges observed in gas from landfills (24,300-180,000 μg m-3; 20-70 μg m-3). This study highlights the need (i) for representative and time-referenced biotransformation data; (ii) to evaluate the partitioning characteristics of organic matter within waste systems and (iii) for a better understanding of the role that gas extraction rate (flux) plays in producing trace component concentrations in landfill gas. - Fugacity for trace components in landfill gas

  4. Evaluating fugacity models for trace components in landfill gas.

    Science.gov (United States)

    Shafi, Sophie; Sweetman, Andrew; Hough, Rupert L; Smith, Richard; Rosevear, Alan; Pollard, Simon J T

    2006-12-01

    A fugacity approach was evaluated to reconcile loadings of vinyl chloride (chloroethene), benzene, 1,3-butadiene and trichloroethylene in waste with concentrations observed in landfill gas monitoring studies. An evaluative environment derived from fictitious but realistic properties such as volume, composition, and temperature, constructed with data from the Brogborough landfill (UK) test cells was used to test a fugacity approach to generating the source term for use in landfill gas risk assessment models (e.g. GasSim). SOILVE, a dynamic Level II model adapted here for landfills, showed greatest utility for benzene and 1,3-butadiene, modelled under anaerobic conditions over a 10 year simulation. Modelled concentrations of these components (95,300 microg m(-3); 43 microg m(-3)) fell within measured ranges observed in gas from landfills (24,300-180,000 microg m(-3); 20-70 microg m(-3)). This study highlights the need (i) for representative and time-referenced biotransformation data; (ii) to evaluate the partitioning characteristics of organic matter within waste systems and (iii) for a better understanding of the role that gas extraction rate (flux) plays in producing trace component concentrations in landfill gas.
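A minimal Mackay-style Level II sketch (steady emission, one fugacity shared by all compartments at steady state, first-order reaction as the only loss process, no advection) illustrates the kind of source-term calculation the record describes; the compartment values are invented.

```python
def level_ii_fugacity(emission_mol_per_h, compartments):
    """Mackay Level II at steady state: one fugacity f common to all phases.
    compartments: list of (volume_m3, Z_mol_per_m3_Pa, k_reaction_per_h).
    f = E / sum(V * Z * k); the concentration in each phase is C = Z * f."""
    d_total = sum(v * z * k for v, z, k in compartments)  # sum of D values
    f = emission_mol_per_h / d_total
    return f, [z * f for _v, z, _k in compartments]

# Single illustrative gas compartment: 1000 m^3, Z = 4e-4 mol/(m^3 Pa),
# first-order loss k = 0.01 per hour, emission 1 mol/h.
f, concs = level_ii_fugacity(1.0, [(1000.0, 4.0e-4, 0.01)])
```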

  5. Degradation modeling with application to aging and maintenance effectiveness evaluations

    International Nuclear Information System (INIS)

    Samanta, P.K.; Vesely, W.E.; Hsu, F.; Subudhi, M.

    1991-01-01

    This paper describes a modeling approach to analyze light water reactor component degradation and failure data to understand the aging process of components. As used here, degradation modeling is the analysis of information on component degradation in order to develop models of the process and its implications. This particular modeling focuses on the analysis of the times of component degradations, to model how the rate of degradation changes with the age of the component. The methodology presented also discusses the effectiveness of maintenance as applicable to aging evaluations. The specific applications which are performed show quantitative models of component degradation rates and component failure rates from plant-specific data. The statistical techniques which are developed and applied allow aging trends to be effectively identified in the degradation data, and in the failure data. Initial estimates of the effectiveness of maintenance in limiting degradations from becoming failures also are developed. These results are important first steps in degradation modeling, and show that degradation can be modeled to identify aging trends
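One standard way to test for aging trends of the kind mentioned above, given only the recorded times of component degradations, is the Laplace trend test; this is an illustrative stand-in, since the abstract does not name the paper's specific statistical techniques.

```python
import math

def laplace_trend_statistic(event_times, t_end):
    """Laplace test for trend in a point process observed on [0, t_end]:
    under a constant rate the statistic is approximately N(0, 1); large
    positive values mean events cluster late, i.e. an increasing (aging)
    degradation rate."""
    n = len(event_times)
    mean_t = sum(event_times) / n
    return (mean_t - t_end / 2.0) * math.sqrt(12.0 * n) / t_end
```

For example, three degradations late in a ten-year observation window give a statistic above the one-sided 5% critical value of 1.645, flagging an aging trend.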

  6. Degradation modeling with application to aging and maintenance effectiveness evaluations

    International Nuclear Information System (INIS)

Samanta, P.K.; Hsu, F.; Subudhi, M.; Vesely, W.E.

    1990-01-01

    This paper describes a modeling approach to analyze component degradation and failure data to understand the aging process of components. As used here, degradation modeling is the analysis of information on component degradation in order to develop models of the process and its implications. This particular modeling focuses on the analysis of the times of component degradations, to model how the rate of degradation changes with the age of the component. The methodology presented also discusses the effectiveness of maintenance as applicable to aging evaluations. The specific applications which are performed show quantitative models of component degradation rates and component failure rates from plant-specific data. The statistical techniques which are developed and applied allow aging trends to be effectively identified in the degradation data, and in the failure data. Initial estimates of the effectiveness of maintenance in limiting degradations from becoming failures also are developed. These results are important first steps in degradation modeling, and show that degradation can be modeled to identify aging trends. 2 refs., 8 figs

  7. LOCA analysis evaluation model with TRAC-PF1/NEM

    International Nuclear Information System (INIS)

    Orive Moreno, Raul; Gallego Cabezon, Ines; Garcia Sedano, Pablo

    2004-01-01

Nowadays, regulatory rules and code model development are progressing toward the use of best-estimate approximations in licensing applications. Within this framework, IBERDROLA is developing a PWR LOCA analysis methodology along two lines: on one side, the development of an Evaluation Model (upper-bounding model) that conservatively covers the different aspects of PWR LOCA phenomenology, and on the other, a proposal for a CSAU (Code Scaling, Applicability and Uncertainty) type evaluation methodology that strictly meets the 95/95 criterion on Peak Cladding Temperature. A structured method is established that basically involves the following steps: 1. Selection of the large-break LOCA as the accident to analyze, and of the TRAC-PF1/MOD2 V99.1 NEM (PSU version) computer code as the analysis tool. 2. Code assessment, identifying the most significant phenomena (PIRT, Phenomena Identification and Ranking Tabulation) and estimating possible code bias and the uncertainties associated with the specific models that control these phenomena (critical mass flow, heat transfer, countercurrent flow, etc.). 3. Evaluation of an overall PCT uncertainty, taking into account code uncertainty, reactor initial conditions, and accident boundary conditions. Quantifying uncertainties requires a careful selection of experiments that allows a complete evaluation matrix to be defined, comparison of the simulation results with the measured experimental data, and attention to the scaling of these phenomena. To simulate these experiments it was necessary to modify the original code, because it was not able to reproduce, even qualitatively, the expected phenomenology. It can be concluded that there is good agreement between the TRAC-PF1/NEM results and the experimental data. Once the average error (ε) and standard deviation (σ) for the correlations under study are obtained, these factors could be used to correct the code in a conservative way
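The 95/95 criterion cited above is commonly met in best-estimate-plus-uncertainty methodologies through nonparametric (Wilks) tolerance limits; below is a first-order, one-sided sketch, not necessarily the exact procedure of this methodology.

```python
import math

def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest n such that the maximum of n random code runs bounds the
    `coverage` quantile of the output (e.g. PCT) with the given confidence:
    solve 1 - coverage**n >= confidence for n."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_sample_size())           # 59: the classic 95/95 sample size
print(wilks_sample_size(0.95, 0.99)) # 90 runs for 95% coverage / 99% confidence
```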

  8. Systematic review of model-based cervical screening evaluations.

    Science.gov (United States)

    Mendes, Diana; Bains, Iren; Vanni, Tazio; Jit, Mark

    2015-05-01

Optimising population-based cervical screening policies is becoming more complex due to the expanding range of screening technologies available and the interplay with vaccine-induced changes in epidemiology. Mathematical models are increasingly being applied to assess the impact of cervical cancer screening strategies. We systematically reviewed MEDLINE®, Embase, Web of Science®, EconLit, Health Economic Evaluation Database, and The Cochrane Library databases in order to identify the mathematical models of human papillomavirus (HPV) infection and cervical cancer progression used to assess the effectiveness and/or cost-effectiveness of cervical cancer screening strategies. Key model features and conclusions relevant to decision-making were extracted. We found 153 articles meeting our eligibility criteria published up to May 2013. Most studies (72/153) evaluated the introduction of a new screening technology, with particular focus on the comparison of HPV DNA testing and cytology (n = 58). Twenty-eight of forty of these analyses supported HPV DNA primary screening implementation. A few studies analysed more recent technologies - rapid HPV DNA testing (n = 3), HPV DNA self-sampling (n = 4), and genotyping (n = 1) - and were also supportive of their introduction. However, no study was found on emerging molecular markers and their potential utility in future screening programmes. Most evaluations (113/153) were based on models simulating aggregate groups of women at risk of cervical cancer over time without accounting for HPV infection transmission. Calibration to country-specific outcome data is becoming more common, but has not yet become standard practice. Models of cervical screening are increasingly used, and allow extrapolation of trial data to project the population-level health and economic impact of different screening policies. However, post-vaccination analyses have rarely incorporated transmission dynamics. Model calibration to country

  9. Physically based evaluation of climate models over the Iberian Peninsula

    Science.gov (United States)

    Sánchez de Cos, Carmen; Sánchez-Laulhé, Jose M.; Jiménez-Alonso, Carlos; Sancho-Avila, Juan M.; Rodriguez-Camino, Ernesto

    2013-04-01

A novel approach is proposed for evaluating regional climate models based on the comparison of empirical relationships among model outcome variables. The approach is a quantitative adaptation of the method for evaluating global climate models proposed by Betts (Bull Am Meteorol Soc 85:1673-1688, 2004). Three selected relationships among different magnitudes involved in the water and energy land surface budgets are first established using daily re-analysis data. The selected relationships are obtained for an area encompassing two river basins in the southern Iberian Peninsula, for two months representative of the dry and wet seasons. The corresponding relations are also computed for each of the thirteen regional simulations of the ENSEMBLES project over the same area. The use of a metric based on the Hellinger coefficient allows a quantitative estimation of how well the models simulate the relations among surface magnitudes. Finally, a series of six rankings of the thirteen regional climate models participating in the ENSEMBLES project is obtained based on their ability to simulate these surface processes.
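The Hellinger coefficient underlying the metric above measures the affinity between two discrete distributions (1 for identical distributions, 0 for disjoint support); a minimal sketch:

```python
import math

def hellinger_coefficient(p, q):
    """Bhattacharyya/Hellinger affinity between two discrete distributions
    given as aligned probability lists summing to 1."""
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

def hellinger_distance(p, q):
    """Hellinger distance derived from the coefficient; 0 means identical."""
    return math.sqrt(max(0.0, 1.0 - hellinger_coefficient(p, q)))
```

In an evaluation like the one above, `p` and `q` could be binned joint distributions of two surface variables from re-analysis and from a model, so a higher coefficient means the model better reproduces the empirical relationship.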

  10. Homogeneous ice nucleation evaluated for several water models

    Science.gov (United States)

    Espinosa, J. R.; Sanz, E.; Valeriani, C.; Vega, C.

    2014-11-01

    In this work, we evaluate by means of computer simulations the rate of ice homogeneous nucleation for several water models such as TIP4P, TIP4P/2005, TIP4P/ICE, and mW (following the same procedure as in Sanz et al. [J. Am. Chem. Soc. 135, 15008 (2013)]) in a broad temperature range. We estimate the ice-liquid interfacial free-energy (γ), and conclude that for all water models γ decreases as the temperature decreases. Extrapolating our results to the melting temperature, we obtain a value of the interfacial free-energy between 25 and 32 mN/m, in reasonable agreement with the reported experimental values. Moreover, we observe that the values of γ depend on the chosen water model and this is a key factor when numerically evaluating nucleation rates, given that the kinetic prefactor is quite similar for all water models with the exception of the mW (due to the absence of hydrogens). Somewhat surprisingly, the estimates of the nucleation rates found in this work for TIP4P/2005 are slightly higher than those of the mW model, even though the former has explicit hydrogens. Our results suggest that it may be possible to observe in computer simulations spontaneous crystallization of TIP4P/2005 at about 60 K below the melting point.

  11. Evaluating the reliability of predictions made using environmental transfer models

    International Nuclear Information System (INIS)

    1989-01-01

    The development and application of mathematical models for predicting the consequences of releases of radionuclides into the environment from normal operations in the nuclear fuel cycle and in hypothetical accident conditions has increased dramatically in the last two decades. This Safety Practice publication has been prepared to provide guidance on the available methods for evaluating the reliability of environmental transfer model predictions. It provides a practical introduction to the subject, with particular emphasis given to worked examples in the text. It is intended to supplement existing IAEA publications on environmental assessment methodology. 60 refs, 17 figs, 12 tabs

  12. Evaluation of Differentiation Strategy in Shipping Enterprises with Simulation Model

    Science.gov (United States)

    Vaxevanou, Anthi Z.; Ferfeli, Maria V.; Damianos, Sakas P.

    2009-08-01

    The present exploratory study investigates the circumstances that prevail in European shipping enterprises, with special reference to Greek ones. The investigation explores the potential implementation of strategies to create a unique competitive advantage [1]. The shipping sector is composed of enterprises that are mainly active in three areas: passenger, commercial and naval transport. The main target is to create a dynamic simulation model which, with reference to the STAIR strategic model, will evaluate the differentiation strategy choice that some of the shipping enterprises have made.

  13. ANL/HIWAY: an air pollution evaluation model for roadways

    Energy Technology Data Exchange (ETDEWEB)

    Concaildi, G. A.; Cohen, A. S.; King, R. F.

    1976-12-01

    This report describes a computer program, called ANL/HIWAY, for estimating air quality levels of nonreactive pollutants produced by vehicular sources. It is valid for receptors at distances of tens to hundreds of meters, at an angle, downwind of the roadway, in relatively uncomplicated terrain. It may be used by planners to analyze the effects of a proposed roadway on adjacent air quality. The ANL/HIWAY model expands the evaluation capabilities of the EPA/HIWAY dispersion model. This report also serves as a user's manual for running the ANL/HIWAY PROGRAM. All command structures are described in detail, with sample problems exemplifying their use.

  14. Model Test Bed for Evaluating Wave Models and Best Practices for Resource Assessment and Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Neary, Vincent Sinclair [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies; Yang, Zhaoqing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Coastal Sciences Division; Wang, Taiping [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Coastal Sciences Division; Gunawan, Budi [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies; Dallman, Ann Renee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies

    2016-03-01

    A wave model test bed is established to benchmark, test and evaluate spectral wave models and modeling methodologies (i.e., best practices) for predicting the wave energy resource parameters recommended by the International Electrotechnical Commission, IEC TS 62600-101Ed. 1.0 ©2015. Among other benefits, the model test bed can be used to investigate the suitability of different models, specifically what source terms should be included in spectral wave models under different wave climate conditions and for different classes of resource assessment. The overarching goal is to use these investigations to provide industry guidance for model selection and modeling best practices depending on the wave site conditions and desired class of resource assessment. Modeling best practices are reviewed, and limitations and knowledge gaps in predicting wave energy resource parameters are identified.

  15. Evaluating Flight Crew Performance by a Bayesian Network Model

    Directory of Open Access Journals (Sweden)

    Wei Chen

    2018-03-01

    Full Text Available Flight crew performance is of great significance in keeping flights safe and sound. When evaluating crew performance, quantitative detailed behavior information may not be available. The present paper introduces the Bayesian Network (BN) to perform flight crew performance evaluation, which permits the utilization of multidisciplinary sources of objective and subjective information, despite sparse behavioral data. In this paper, the causal factors are selected based on the analysis of 484 aviation accidents caused by human factors. Then, a network termed the Flight Crew Performance Model is constructed. The Delphi technique helps to gather subjective data as a supplement to objective data from accident reports. The conditional probabilities are elicited by the leaky noisy-MAX model. Two ways of BN inference, probability prediction and probabilistic diagnosis, are used and some interesting conclusions are drawn, which could provide data support for making interventions in human error management in aviation safety.
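
The leaky noisy-MAX model mentioned in the record compresses a large conditional probability table into one parameter per parent plus a leak term. For binary variables it reduces to the leaky noisy-OR, sketched here as an illustration (the function name and numbers are ours, not from the cited paper):

```python
def leaky_noisy_or(link_probs, parent_states, leak):
    """P(effect present | parent states) under a leaky noisy-OR model,
    the binary special case of the leaky noisy-MAX.
    link_probs[i]: probability that parent i alone produces the effect.
    parent_states[i]: True if parent i is active.
    leak: probability the effect appears with no active parent."""
    # Probability that the effect is absent: the leak fails AND every
    # active parent independently fails to produce the effect.
    q = 1.0 - leak
    for p, active in zip(link_probs, parent_states):
        if active:
            q *= 1.0 - p
    return 1.0 - q
```

With only a leak of 0.1 and no active parents the effect probability is just the leak, 0.1; with two independent causes of strength 0.5 each and no leak it is 1 − 0.5·0.5 = 0.75.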

  16. How Useful Are Our Models? Pre-Service and Practicing Teacher Evaluations of Technology Integration Models

    Science.gov (United States)

    Kimmons, Royce; Hall, Cassidy

    2018-01-01

    We report on a survey of K-12 teachers and teacher candidates wherein participants evaluated known models (e.g., TPACK, SAMR, RAT, TIP) and provided insight on what makes a model valuable for them in the classroom. Results indicated that: (1) technology integration should be coupled with good theory to be effective, (2) classroom experience did…

  17. Modelling of nutrient partitioning in growing pigs to predict their anatomical body composition. 2. Model evaluation

    NARCIS (Netherlands)

    Halas, V.; Dijkstra, J.; Babinszky, L.; Verstegen, M.W.A.; Gerrits, W.J.J.

    2004-01-01

    The objective of the present paper was to evaluate a dynamic mechanistic model for growing and fattening pigs presented in a companion paper. The model predicted the rate of protein and fat deposition (chemical composition), rate of tissue deposition (anatomical composition) and performance of pigs

  18. Doctoral Dissertation Supervision: Identification and Evaluation of Models

    Directory of Open Access Journals (Sweden)

    Ngozi Agu

    2014-01-01

    Full Text Available Doctoral research supervision is one of the major avenues for sustaining students’ satisfaction with the programme, preparing students to be independent researchers and effectively initiating students into the academic community. This work reports doctoral students’ evaluation of their various supervision models, their satisfaction with these supervision models, and their development of research-related skills. The study used a descriptive research design and was guided by three research questions and two hypotheses. A sample of 310 Ph.D. candidates drawn from a federal university in the Eastern part of Nigeria was used for this study. The data generated through the questionnaire were analyzed using descriptive statistics and t-tests. Results show that the face-to-face interactive model was not only the most frequently used, but also the most widely adopted in doctoral thesis supervision, while ICT-based models were rarely used. Students supervised under the face-to-face interactive model reported being more satisfied with dissertation supervision than those operating under the face-to-face noninteractive model. However, students supervised under these two models did not differ significantly in their perceived development of research-related skills.

  19. Evaluating transport in the WRF model along the California coast

    OpenAIRE

    C. E. Yver; H. D. Graven; D. D. Lucas; P. J. Cameron-Smith; R. F. Keeling; R. F. Weiss

    2013-01-01

    This paper presents a step in the development of a top-down method to complement the bottom-up inventories of halocarbon emissions in California using high frequency observations, forward simulations and inverse methods. The Scripps Institution of Oceanography high-frequency atmospheric halocarbons measurement sites are located along the California coast and therefore the evaluation of transport in the chosen Weather Research Forecast (WRF) model at these sites is crucial fo...

  20. Evaluating transport in the WRF model along the California coast

    OpenAIRE

    C. Yver; H. Graven; D. D. Lucas; P. Cameron-Smith; R. Keeling; R. Weiss

    2012-01-01

    This paper presents a step in the development of a top-down method to complement the bottom-up inventories of halocarbon emissions in California using high frequency observations, forward simulations and inverse methods. The Scripps Institution of Oceanography high-frequency atmospheric halocarbon measurement sites are located along the California coast and therefore the evaluation of transport in the chosen Weather Research Forecast (WRF) model at these sites is crucial for inverse mo...

  1. Evaluation of the US Army fallout prediction model

    International Nuclear Information System (INIS)

    Pernick, A.; Levanon, I.

    1987-01-01

    The US Army fallout prediction method was evaluated against an advanced fallout prediction model--SIMFIC (Simplified Fallout Interpretive Code). The danger zone areas of the US Army method were found to be significantly greater (up to a factor of 8) than the areas of corresponding radiation hazard as predicted by SIMFIC. Nonetheless, because the US Army's method predicts danger zone lengths that are commonly shorter than the corresponding hot line distances of SIMFIC, the US Army's method is not reliably conservative

  2. Evaluation of a differentiation model of preschoolers’ executive functions

    OpenAIRE

    Howard, Steven J.; Okely, Anthony D.; Ellis, Yvonne G.

    2015-01-01

    Despite the prominent role of executive functions in children’s emerging competencies, there remains debate regarding the structure and development of executive functions. In an attempt to reconcile these discrepancies, a differentiation model of executive function development was evaluated in the early years using 6-month age groupings. Specifically, 281 preschoolers completed measures of working memory, inhibition, and shifting. Results contradicted suggestions that executive functions foll...

  3. Model to Evaluate Pro-Environmental Consumer Practices

    OpenAIRE

    Wendolyn Aguilar-Salinas; Sara Ojeda-Benitez; Samantha E. Cruz-Sotelo; Juan Ramón Castro-Rodríguez

    2017-01-01

    The consumer plays a key role in resource conservation; therefore, it is important to know consumer behavior to identify consumer profiles and to promote pro-environmental practices in society that encourage resource conservation and reductions in waste generation. The purpose of this paper is to implement a fuzzy model to evaluate consumer behavior in relation to three pro-environmental practices that can be implemented at the household level, including reductions in resource consumption (re...

  4. Evaluation of fish models of soluble epoxide hydrolase inhibition.

    OpenAIRE

    Newman, J W; Denton, D L; Morisseau, C; Koger, C S; Wheelock, C E; Hinton, D E; Hammock, B D

    2001-01-01

    Substituted ureas and carbamates are mechanistic inhibitors of the soluble epoxide hydrolase (sEH). We screened a set of chemicals containing these functionalities in larval fathead minnow (Pimephales promelas) and embryo/larval golden medaka (Oryzias latipes) models to evaluate the utility of these systems for investigating sEH inhibition in vivo. Both fathead minnow and medaka sEHs were functionally similar to the tested mammalian orthologs (murine and human) with respect to substrate hydrol...

  5. Hybrid Model for e-Learning Quality Evaluation

    Directory of Open Access Journals (Sweden)

    Suzana M. Savic

    2012-02-01

    Full Text Available E-learning is becoming increasingly important for the competitive advantage of economic organizations and higher education institutions. Therefore, it is becoming a significant aspect of quality which has to be integrated into the management system of every organization or institution. The paper examines e-learning quality characteristics, standards, criteria and indicators and presents a multi-criteria hybrid model for e-learning quality evaluation based on the method of Analytic Hierarchy Process, trend analysis, and data comparison.
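
The record above names the Analytic Hierarchy Process (AHP) as the core of the hybrid model but gives no computation. As an illustration only, a common way to derive AHP priority weights from a pairwise-comparison matrix is the row geometric-mean approximation (the function name and example matrix are ours, not from the cited paper):

```python
import math

def ahp_priorities(M):
    """Approximate AHP priority vector via the row geometric-mean method.
    M is a square pairwise-comparison matrix where M[i][j] gives the
    judged importance of criterion i relative to criterion j (and
    M[j][i] should be its reciprocal). Returns weights summing to 1."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in M]
    total = sum(gm)
    return [g / total for g in gm]
```

For two criteria where the first is judged three times as important as the second, the matrix [[1, 3], [1/3, 1]] yields weights of 0.75 and 0.25.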

  6. PERFORMANCE EVALUATION OF EMPIRICAL MODELS FOR VENTED LEAN HYDROGEN EXPLOSIONS

    OpenAIRE

    Anubhav Sinha; Vendra C. Madhav Rao; Jennifer X. Wen

    2017-01-01

    Explosion venting is a method commonly used to prevent or minimize damage to an enclosure caused by an accidental explosion. An estimate of the maximum overpressure generated through explosion is an important parameter in the design of the vents. Various engineering models (Bauwens et al., 2012; Molkov and Bragin, 2015) and European (EN 14994) and USA (NFPA 68) standards are available to predict such overpressure. In this study, their performance is evaluated using a number of published exper...

  7. Evaluating alternate discrete outcome frameworks for modeling crash injury severity.

    Science.gov (United States)

    Yasmin, Shamsunnahar; Eluru, Naveen

    2013-10-01

    This paper focuses on the relevance of alternate discrete outcome frameworks for modeling driver injury severity. The study empirically compares the ordered response and unordered response models in the context of driver injury severity in traffic crashes. The alternative modeling approaches considered for the comparison exercise include: for the ordered response framework, ordered logit (OL), generalized ordered logit (GOL) and mixed generalized ordered logit (MGOL); for the unordered response framework, multinomial logit (MNL), nested logit (NL), ordered generalized extreme value logit (OGEV) and mixed multinomial logit (MMNL) models. A host of comparison metrics are computed to evaluate the performance of these alternative models. The study provides a comprehensive comparison exercise of the performance of ordered and unordered response models for examining the impact of exogenous factors on driver injury severity. The research also explores the effect of potential underreporting on alternative frameworks by artificially creating an underreported data sample from the driver injury severity sample. The empirical analysis is based on the 2010 General Estimates System (GES) database, a nationally representative sample of road crashes collected and compiled from about 60 jurisdictions across the United States. The performance of the alternative frameworks is examined in the context of model estimation and validation (at the aggregate and disaggregate level). Further, the performance of the model frameworks in the presence of underreporting is explored, with and without corrections to the estimates. The results from these extensive analyses point toward the emergence of the GOL framework (MGOL) as a strong competitor to the MMNL model in modeling driver injury severity. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Evaluation of the Current State of Integrated Water Quality Modelling

    Science.gov (United States)

    Arhonditsis, G. B.; Wellen, C. C.; Ecological Modelling Laboratory

    2010-12-01

    Environmental policy and management implementation require robust methods for assessing the contribution of various point and non-point pollution sources to water quality problems as well as methods for estimating the expected and achieved compliance with the water quality goals. Water quality models have been widely used for creating the scientific basis for management decisions by providing a predictive link between restoration actions and ecosystem response. Modelling water quality and nutrient transport is challenging due to a number of constraints associated with the input data and existing knowledge gaps related to the mathematical description of landscape and in-stream biogeochemical processes. While enormous effort has been invested to make watershed models process-based and spatially-distributed, there has not been a comprehensive meta-analysis of model credibility in the watershed modelling literature. In this study, we evaluate the current state of integrated water quality modeling across the range of temporal and spatial scales typically utilized. We address several common modeling questions by providing a quantitative assessment of model performance and by assessing how model performance depends on model development. The data compiled represent a heterogeneous group of modeling studies, especially with respect to complexity, spatial and temporal scales and model development objectives. Beginning from 1992, the year when Beven and Binley published their seminal paper on uncertainty analysis in hydrological modelling, and ending in 2009, we selected over 150 papers fitting a number of criteria. These criteria involved publications that: (i) employed distributed or semi-distributed modelling approaches; (ii) provided predictions on flow and nutrient concentration state variables; and (iii) reported fit to measured data. Model performance was quantified with the Nash-Sutcliffe Efficiency, the relative error, and the coefficient of determination. Further, our
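
The three goodness-of-fit measures named in the record (Nash-Sutcliffe Efficiency, relative error, and coefficient of determination) have standard definitions; a minimal sketch, with function names and example values of our own choosing, might look like this:

```python
def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 - SSE / (variance about the observed
    mean). 1 is a perfect fit; 0 means the model is no better than
    predicting the observed mean; negative values are worse than that."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_mean = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / ss_mean

def relative_error(obs, sim):
    """Relative (bias) error: total simulated minus total observed,
    as a fraction of the observed total."""
    return (sum(sim) - sum(obs)) / sum(obs)

def r_squared(obs, sim):
    """Coefficient of determination as the squared Pearson correlation
    between observed and simulated series."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    vo = sum((o - mo) ** 2 for o in obs)
    vs = sum((s - ms) ** 2 for s in sim)
    return cov * cov / (vo * vs)
```

A simulation that is a constant offset from the observations scores R² = 1 (perfect correlation) but NSE below 1 and a nonzero relative error, which is why studies such as this one report all three.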

  9. Application of a theoretical model to evaluate COPD disease management

    Science.gov (United States)

    2010-01-01

    Background Disease management programmes are heterogeneous in nature and often lack a theoretical basis. An evaluation model has been developed in which theoretically driven inquiries link disease management interventions to outcomes. The aim of this study is to methodically evaluate the impact of a disease management programme for patients with chronic obstructive pulmonary disease (COPD) on process, intermediate and final outcomes of care in a general practice setting. Methods A quasi-experimental research was performed with 12-months follow-up of 189 COPD patients in primary care in the Netherlands. The programme included patient education, protocolised assessment and treatment of COPD, structural follow-up and coordination by practice nurses at 3, 6 and 12 months. Data on intermediate outcomes (knowledge, psychosocial mediators, self-efficacy and behaviour) and final outcomes (dyspnoea, quality of life, measured by the CRQ and CCQ, and patient experiences) were obtained from questionnaires and electronic registries. Results Implementation of the programme was associated with significant improvements in dyspnoea (p theory-driven model enhances the design and evaluation of disease management programmes aimed at improving health outcomes. This study supports the notion that a theoretical approach strengthens the evaluation designs of complex interventions. Moreover, it provides prudent evidence that the implementation of COPD disease management programmes can positively influence outcomes of care. PMID:20346135

  10. Application of a theoretical model to evaluate COPD disease management

    Directory of Open Access Journals (Sweden)

    Asin Javier D

    2010-03-01

    Full Text Available Abstract Background Disease management programmes are heterogeneous in nature and often lack a theoretical basis. An evaluation model has been developed in which theoretically driven inquiries link disease management interventions to outcomes. The aim of this study is to methodically evaluate the impact of a disease management programme for patients with chronic obstructive pulmonary disease (COPD) on process, intermediate and final outcomes of care in a general practice setting. Methods A quasi-experimental research was performed with 12-months follow-up of 189 COPD patients in primary care in the Netherlands. The programme included patient education, protocolised assessment and treatment of COPD, structural follow-up and coordination by practice nurses at 3, 6 and 12 months. Data on intermediate outcomes (knowledge, psychosocial mediators, self-efficacy and behaviour) and final outcomes (dyspnoea, quality of life, measured by the CRQ and CCQ, and patient experiences) were obtained from questionnaires and electronic registries. Results Implementation of the programme was associated with significant improvements in dyspnoea (p Conclusions The application of a theory-driven model enhances the design and evaluation of disease management programmes aimed at improving health outcomes. This study supports the notion that a theoretical approach strengthens the evaluation designs of complex interventions. Moreover, it provides prudent evidence that the implementation of COPD disease management programmes can positively influence outcomes of care.

  11. Application of a theoretical model to evaluate COPD disease management.

    Science.gov (United States)

    Lemmens, Karin M M; Nieboer, Anna P; Rutten-Van Mölken, Maureen P M H; van Schayck, Constant P; Asin, Javier D; Dirven, Jos A M; Huijsman, Robbert

    2010-03-26

    Disease management programmes are heterogeneous in nature and often lack a theoretical basis. An evaluation model has been developed in which theoretically driven inquiries link disease management interventions to outcomes. The aim of this study is to methodically evaluate the impact of a disease management programme for patients with chronic obstructive pulmonary disease (COPD) on process, intermediate and final outcomes of care in a general practice setting. A quasi-experimental research was performed with 12-months follow-up of 189 COPD patients in primary care in the Netherlands. The programme included patient education, protocolised assessment and treatment of COPD, structural follow-up and coordination by practice nurses at 3, 6 and 12 months. Data on intermediate outcomes (knowledge, psychosocial mediators, self-efficacy and behaviour) and final outcomes (dyspnoea, quality of life, measured by the CRQ and CCQ, and patient experiences) were obtained from questionnaires and electronic registries. Implementation of the programme was associated with significant improvements in dyspnoea (p theory-driven model enhances the design and evaluation of disease management programmes aimed at improving health outcomes. This study supports the notion that a theoretical approach strengthens the evaluation designs of complex interventions. Moreover, it provides prudent evidence that the implementation of COPD disease management programmes can positively influence outcomes of care.

  12. Model for Quantitative Evaluation of Enzyme Replacement Treatment

    Directory of Open Access Journals (Sweden)

    Radeva B.

    2009-12-01

    Full Text Available Gaucher disease is the most frequent lysosomal disorder. Its enzyme replacement treatment, a recent advance of modern biotechnology, has been used successfully in the last years. Evaluation of the optimal dose for each patient is important for health and economic reasons: enzyme replacement is the most expensive treatment, and it must be given continuously and without interruption. Since 2001, enzyme replacement therapy with Cerezyme (Genzyme) has been formally introduced in Bulgaria, but after some time it was interrupted for 1-2 months, and the dose the patients received was not optimal. The aim of our work is to find a mathematical model for quantitative evaluation of ERT of Gaucher disease. The model applies the software "Statistika 6" to the individual data of 5-year-old children with Gaucher disease treated with Cerezyme. The output of the model gave possibilities for quantitative evaluation of the individual trends in the development of the disease of each child and their correlation. On the basis of these results, we might recommend suitable changes in ERT.

  13. ICT evaluation models and performance of medium and small enterprises

    Directory of Open Access Journals (Sweden)

    Bayaga Anass

    2014-01-01

    Full Text Available Building on prior research related to (1) the impact of information communication technology (ICT) and (2) operational risk management (ORM) in the context of medium and small enterprises (MSEs), the focus of this study was to investigate the relationship between (1) ICT operational risk management (ORM) and (2) the performance of MSEs. To achieve this focus, the research investigated evaluation models for understanding the value of ICT ORM in MSEs. Multiple regression, Repeated-Measures Analysis of Variance (RM-ANOVA) and Repeated-Measures Multivariate Analysis of Variance (RM-MANOVA) were performed. The findings revealed that only one variable made a significant percentage contribution to the level of ICT operation in MSEs, the Payback method (β = 0.410, p < .000). It may thus be inferred that the Payback method is the prominent variable explaining the variation in the level of evaluation models affecting ICT adoption within MSEs. Conclusively, in answering the two questions on (1) the degree of variability explained and (2) the predictors, the results revealed that the variable contributed approximately 88.4% of the variation in evaluation models affecting ICT adoption within MSEs. The analysis of variance also revealed that the regression coefficients were real and did not occur by chance.
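
The Payback method singled out in the record is the simplest capital-budgeting appraisal: the time needed for cumulative cash inflows to recover the initial outlay. A minimal sketch (our own function name and figures, not from the cited study):

```python
def payback_period(initial_outlay, cash_flows):
    """Years until cumulative cash inflows recover the initial outlay,
    interpolating linearly within the recovery year. Returns None if the
    investment is never recovered over the given horizon."""
    cumulative = 0.0
    for year, cf in enumerate(cash_flows, start=1):
        prev = cumulative
        cumulative += cf
        if cumulative >= initial_outlay:
            # Fraction of the final year needed to cover the remainder.
            return year - 1 + (initial_outlay - prev) / cf
    return None

# An ICT investment of 1000 returning 400 per year pays back in 2.5 years.
```

Shorter payback periods are conventionally read as lower-risk investments, which may explain the method's prominence among the MSEs surveyed; note that, unlike NPV, it ignores cash flows after recovery and the time value of money.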

  14. Evaluating Learner Autonomy: A Dynamic Model with Descriptors

    Directory of Open Access Journals (Sweden)

    Maria Giovanna Tassinari

    2012-03-01

    Full Text Available Every autonomous learning process should entail an evaluation of the learner’s competencies for autonomy. The dynamic model of learner autonomy described in this paper is a tool designed to support the self-assessment and evaluation of learning competencies and to help both learners and advisors focus on relevant aspects of the learning process. The dynamic model accounts for cognitive, metacognitive, action-oriented and affective components of learner autonomy and provides descriptors of learners’ attitudes, competencies and behaviors. It is dynamic in order to allow learners to focus on their own needs and goals. The model (http://www.sprachenzentrum.fuberlin.de/v/autonomiemodell/index.html) has been validated in several workshops with experts at the Université Nancy 2, France, and at the Freie Universität Berlin, Germany, and tested by students, advisors and teachers. It is currently used at the Centre for Independent Language Learning at the Freie Universität Berlin for language advising. Learners can freely choose the components they would like to assess themselves in. Their assessment is then discussed in an advising session, where the learner and the advisor can compare their perspectives, focus on single aspects of the learning process and set goals for further learning. The students’ feedback gathered in my PhD investigation shows that they are able to benefit from this evaluation; their awareness, self-reflection and decision-making in the autonomous learning process improved.

  15. Animal Models for Evaluation of Bone Implants and Devices: Comparative Bone Structure and Common Model Uses.

    Science.gov (United States)

    Wancket, L M

    2015-09-01

    Bone implants and devices are a rapidly growing field within biomedical research, and implants have the potential to significantly improve human and animal health. Animal models play a key role in initial product development and are important components of nonclinical data included in applications for regulatory approval. Pathologists are increasingly being asked to evaluate these models at the initial developmental and nonclinical biocompatibility testing stages, and it is important to understand the relative merits and deficiencies of various species when evaluating a new material or device. This article summarizes characteristics of the most commonly used species in studies of bone implant materials, including detailed information about the relevance of a particular model to human bone physiology and pathology. Species reviewed include mice, rats, rabbits, guinea pigs, dogs, sheep, goats, and nonhuman primates. Ultimately, a comprehensive understanding of the benefits and limitations of different model species will aid in rigorously evaluating a novel bone implant material or device. © The Author(s) 2015.

  16. Evaluation and hydrological modelization in the natural hazard prevention

    International Nuclear Information System (INIS)

    Pla Sentis, Ildefonso

    2011-01-01

    Soil degradation negatively affects the soil's functions as a base for food production, for regulating the hydrological cycle and for environmental quality. All over the world soil degradation is increasing, partly due to lacks or deficiencies in the evaluation of the processes and causes of this degradation in each specific situation. The processes of soil physical degradation are manifested through several problems such as compaction, runoff, water and wind erosion, and landslides, with collateral effects in situ and at a distance, often with disastrous consequences such as floods, landslides, sedimentation and droughts. These processes are frequently associated with unfavourable changes in the hydrological processes responsible for the water balance and soil water regimes, mainly derived from changes in land use and management practices and from climatic change. The evaluation of these processes using simple simulation models, under several scenarios of climatic change, soil properties, and land use and management, would allow prediction of the occurrence of these disastrous processes and consequently selection and application of the appropriate soil conservation practices to eliminate or reduce their effects. These simulation models require, as a base, detailed climatic information and data on the hydrological properties of soils. Despite the existence of methodologies and commercial equipment (increasingly sophisticated and precise) to measure the different physical and hydrological soil properties related to degradation processes, most of them are only applicable under very specific or laboratory conditions. Often indirect methodologies are used, based on relations or empirical indexes without adequate validation, which often lead to expensive mistakes in the evaluation of soil degradation processes and their effects on natural disasters.
    Preference may be given to simple field methodologies, direct and adaptable to different soil types and climates and to the sample size and the spatial variability of the

  17. Evaluation of cat brain infarction model using microPET

    International Nuclear Information System (INIS)

    Lee, J. J.; Lee, D. S.; Kim, J. H.; Hwang, D. W.; Jung, J. G.; Lee, M. C; Lim, S. M

    2004-01-01

PET has some disadvantages in the imaging of small animals due to poor resolution. With the advance of microPET scanners, it is possible to image small animals. However, the image quality was not as satisfactory as that of human images. As cats have a relatively large brain, cat brain imaging was superior to that of mice or rats. In this study, we established a cat brain infarction model and evaluated it and its temporal change using a microPET scanner. Two adult male cats were used. Anesthesia was done with xylazine and ketamine HCl. A burr hole was made 1 cm right lateral to the bregma. Collagenase type IV 10 ul was injected using a 30 G needle for 5 minutes to establish the infarction model. F-18 FDG microPET (Concorde Microsystems Inc., Knoxville, TN) scans were performed 1, 11 and 32 days after the infarction. In addition, 18F-FDG PET scans were performed using a Gemini PET scanner (Philips Medical Systems, CA, USA) 13 and 47 days after the infarction. Two cat brain infarction models were established. The glucose metabolism of the infarction lesion improved with time. The infarction lesion was also distinguishable in the Gemini PET scan. We successfully established the cat brain infarction model and evaluated the infarcted lesion and its temporal change using the F-18 FDG microPET scanner.

  18. Development and Evaluation of Amino Acid Molecular Models

    Directory of Open Access Journals (Sweden)

    Aparecido R. Silva

    2007-05-01

Full Text Available The comprehension of the structure and function of proteins has a tight relationship with the development of structural biology. However, biochemistry students usually find it difficult to visualize the structures when they use only schematic drawings from didactic books. The representation of three-dimensional structures of some biomolecules with ludic models, built with representative units, has given students and teachers a successful experience to better visualize and correlate the structures to the real molecules. The present work shows the developed models and the process to produce the representative units of the main amino acids on an industrial scale. The design and applicability of the representative units were discussed with many teachers and some suggestions were implemented in the models. The preliminary evaluation and the perspective of utilization by researchers show that the work is in the right direction. At the current stage, the models are defined, prototypes were made and will be presented at this meeting. The moulds for the units are at the final stage of construction and trial in specialized tool facilities. The last term will consist of an effective evaluation of the didactic tool for the teaching/learning process in Structural Molecular Biology. The evaluation protocol is being elaborated, containing simple and objective questions similar to those used in research on science teaching.

  19. Genetic evaluation of European quails by random regression models

    Directory of Open Access Journals (Sweden)

    Flaviana Miranda Gonçalves

    2012-09-01

Full Text Available The objective of this study was to compare different random regression models, defined from different classes of heterogeneity of variance combined with different Legendre polynomial orders, for the estimation of (co)variances in quails. The data came from 28,076 observations of 4,507 female meat quails of the LF1 lineage. Quail body weights were determined at birth and at 1, 14, 21, 28, 35 and 42 days of age. Six different classes of residual variance were fitted to Legendre polynomial functions (orders ranging from 2 to 6) to determine which model had the best fit to describe the (co)variance structures as a function of time. According to the evaluated criteria (AIC, BIC and LRT), the model with six classes of residual variances and a sixth-order Legendre polynomial was the best fit. The estimated additive genetic variance increased from birth to 28 days of age, and dropped slightly from 35 to 42 days. The heritability estimates decreased along the growth curve, from 0.51 (1 day) to 0.16 (42 days). Animal genetic and permanent environmental correlation estimates between weights at different age classes were always high and positive, except for birth weight. The sixth-order Legendre polynomial, along with the residual variance divided into six classes, was the best fit for the growth curve of meat quails; therefore, they should be considered in breeding evaluation processes based on random regression models.
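As a sketch of the covariate structure such random regression models use, the snippet below builds a sixth-order Legendre polynomial basis over the standardized age interval and fits a mean growth curve by least squares. The measurement ages follow the study; the body weights are hypothetical illustration values, not the study's data.

```python
import numpy as np

def legendre_basis(age, age_min=1.0, age_max=42.0, order=6):
    """Covariates of a random regression model: Legendre polynomials
    P_0..P_{order-1} evaluated at age standardized to [-1, 1]."""
    x = 2.0 * (age - age_min) / (age_max - age_min) - 1.0
    # legval with a unit coefficient vector picks out a single P_k
    return np.array([np.polynomial.legendre.legval(x, [0.0] * k + [1.0])
                     for k in range(order)])

ages = np.array([1.0, 14.0, 21.0, 28.0, 35.0, 42.0])          # days
weights = np.array([8.0, 80.0, 120.0, 160.0, 190.0, 210.0])   # hypothetical grams

X = np.vstack([legendre_basis(a) for a in ages])    # 6 x 6 design matrix
coef, *_ = np.linalg.lstsq(X, weights, rcond=None)  # growth-curve coefficients
fitted = X @ coef                                   # exact fit: 6 basis terms, 6 ages
```

In a full genetic evaluation the same basis multiplies animal-specific random regression coefficients; here only the fixed mean curve is shown.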

  20. Evaluation of cat brain infarction model using microPET

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Jin; Lee, Dong Soo; Kim, Yun Hui; Hwang, Do Won; Kim, Jin Su; Chung, June Key; Lee, Myung Chul [College of Medicine, Seoul National Univ., Seoul (Korea, Republic of); Lim, Sang Moo [Korea Institite of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2004-12-01

PET has some disadvantages in the imaging of small animals due to poor resolution. With the advent of microPET scanners, it is possible to image small animals. However, the image quality was not as good as that of human images. Because of their larger brain, cat brain imaging was superior to that of mice or rats. In this study, we established a cat brain infarction model and evaluated it and its temporal change using a microPET scanner. Two adult male cats were used. Anesthesia was done with xylazine and ketamine HCl. A burr hole was made at 1 cm right lateral to the bregma. Collagenase type IV 10 {mu}l was injected using a 30 G needle for 5 minutes to establish the infarction model. {sup 18}F-FDG microPET (Concorde Microsystems Inc., Knoxville, TN) scans were performed 1, 11 and 32 days after the infarction. In addition, {sup 18}F-FDG PET scans were performed using a human PET scanner (Gemini, Philips Medical Systems, CA, USA) 13 and 47 days after the infarction. Two cat brain infarction models were established. The glucose metabolism of the infarction lesion improved with time. The infarction lesion was also distinguishable in the human PET scan. We successfully established the cat brain infarction model and evaluated the infarcted lesion and its temporal change using the {sup 18}F-FDG microPET scanner.

  1. Evaluation of cat brain infarction model using microPET

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. J.; Lee, D. S.; Kim, J. H.; Hwang, D. W.; Jung, J. G.; Lee, M. C [College of Medicine, Seoul National University, Seoul (Korea, Republic of); Lim, S. M [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2004-07-01

PET has some disadvantages in the imaging of small animals due to poor resolution. With the advance of microPET scanners, it is possible to image small animals. However, the image quality was not as satisfactory as that of human images. As cats have a relatively large brain, cat brain imaging was superior to that of mice or rats. In this study, we established a cat brain infarction model and evaluated it and its temporal change using a microPET scanner. Two adult male cats were used. Anesthesia was done with xylazine and ketamine HCl. A burr hole was made 1 cm right lateral to the bregma. Collagenase type IV 10 ul was injected using a 30 G needle for 5 minutes to establish the infarction model. F-18 FDG microPET (Concorde Microsystems Inc., Knoxville, TN) scans were performed 1, 11 and 32 days after the infarction. In addition, 18F-FDG PET scans were performed using a Gemini PET scanner (Philips Medical Systems, CA, USA) 13 and 47 days after the infarction. Two cat brain infarction models were established. The glucose metabolism of the infarction lesion improved with time. The infarction lesion was also distinguishable in the Gemini PET scan. We successfully established the cat brain infarction model and evaluated the infarcted lesion and its temporal change using the F-18 FDG microPET scanner.

  2. The improved sequential puff model for atmospheric dispersion evaluation (SPADE)

    International Nuclear Information System (INIS)

    Desiato, F.

    1990-05-01

The present report describes the improved version of the Sequential Puff for Atmospheric Dispersion Evaluation model (SPADE), developed at ENEA-DISP as a component of ARIES (Atmospheric Release Impact Evaluation System). SPADE was originally designed for real-time assessment of the consequences of a nuclear release into the atmosphere, but it is also suited for sensitivity studies, investigations, or routine applications. It can estimate ground-level air concentrations, deposition and cloud γ dose rate in flat or gently rolling terrain in the vicinity of a point source. In recent years several aspects of the modelling of dispersion processes have been improved, and new modules have been implemented in SPADE. In the first part of the report, a general description of the model is given, and the assumptions and parameterizations used to simulate the main physical processes are described. The second part concerns the structure of the computer code and of the input and output files, and can be regarded as a user's guide to the model. (author)

  3. Evaluation of wave runup predictions from numerical and parametric models

    Science.gov (United States)

    Stockdon, Hilary F.; Thompson, David M.; Plant, Nathaniel G.; Long, Joseph W.

    2014-01-01

    Wave runup during storms is a primary driver of coastal evolution, including shoreline and dune erosion and barrier island overwash. Runup and its components, setup and swash, can be predicted from a parameterized model that was developed by comparing runup observations to offshore wave height, wave period, and local beach slope. Because observations during extreme storms are often unavailable, a numerical model is used to simulate the storm-driven runup to compare to the parameterized model and then develop an approach to improve the accuracy of the parameterization. Numerically simulated and parameterized runup were compared to observations to evaluate model accuracies. The analysis demonstrated that setup was accurately predicted by both the parameterized model and numerical simulations. Infragravity swash heights were most accurately predicted by the parameterized model. The numerical model suffered from bias and gain errors that depended on whether a one-dimensional or two-dimensional spatial domain was used. Nonetheless, all of the predictions were significantly correlated to the observations, implying that the systematic errors can be corrected. The numerical simulations did not resolve the incident-band swash motions, as expected, and the parameterized model performed best at predicting incident-band swash heights. An assimilated prediction using a weighted average of the parameterized model and the numerical simulations resulted in a reduction in prediction error variance. Finally, the numerical simulations were extended to include storm conditions that have not been previously observed. These results indicated that the parameterized predictions of setup may need modification for extreme conditions; numerical simulations can be used to extend the validity of the parameterized predictions of infragravity swash; and numerical simulations systematically underpredict incident swash, which is relatively unimportant under extreme conditions.
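The parameterized model referred to here predicts runup from offshore wave height, wave period, and beach slope. A minimal sketch of one widely used empirical form of this kind (the Stockdon et al. 2006 2%-exceedance runup formula; the input values below are hypothetical examples, not from this study):

```python
import math

def runup_2pct(H0, T, beta_f, g=9.81):
    """2% exceedance runup (m) from deep-water wave height H0 (m),
    peak period T (s) and foreshore beach slope beta_f (dimensionless)."""
    L0 = g * T ** 2 / (2.0 * math.pi)                 # deep-water wavelength
    setup = 0.35 * beta_f * math.sqrt(H0 * L0)        # wave-induced setup term
    swash = math.sqrt(H0 * L0 * (0.563 * beta_f ** 2 + 0.004)) / 2.0  # swash term
    return 1.1 * (setup + swash)

# Example: 2 m offshore waves, 10 s peak period, foreshore slope 0.1
r2 = runup_2pct(2.0, 10.0, 0.1)
```

The separate setup and swash terms mirror the decomposition of runup discussed in the abstract, which is what allows the numerical model and the parameterization to be compared component by component.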

  4. Performance Tuning and Evaluation of a Parallel Community Climate Model

    Energy Technology Data Exchange (ETDEWEB)

    Drake, J.B.; Worley, P.H.; Hammond, S.

    1999-11-13

The Parallel Community Climate Model (PCCM) is a message-passing parallelization of version 2.1 of the Community Climate Model (CCM) developed by researchers at Argonne and Oak Ridge National Laboratories and at the National Center for Atmospheric Research in the early to mid 1990s. In preparation for use in the Department of Energy's Parallel Climate Model (PCM), PCCM has recently been updated with new physics routines from version 3.2 of the CCM, improvements to the parallel implementation, and ports to the SGI/Cray Research T3E and Origin 2000. We describe our experience in porting and tuning PCCM on these new platforms, evaluating the performance of different parallel algorithm options and comparing performance between the T3E and Origin 2000.

  5. Evaluation of the integrated community based home care model

    Directory of Open Access Journals (Sweden)

    LR Uys

    2001-09-01

Full Text Available In 1999-2000 the Integrated Community-Based Home Care model for the care of people with AIDS in communities was implemented at seven sites across the country. The post-implementation evaluation showed that most respondents felt that the model could be replicated if a functioning and informed network including all partners, and a strong management team, were in place. The effects of the project were mainly positive for all stakeholders (hospice, clinic, hospital, PWAs and their carers, professionals and other community members). Hospitals and community-based services became more aware of and involved in the needs of PWAs and felt that the model enabled them to address these needs. PWAs and their carers felt supported and respected.

  6. Anticonvulsive evaluation of THIP in the murine pentylenetetrazole kindling model

    DEFF Research Database (Denmark)

    Simonsen, Charlotte; Boddum, Kim; von Schoubye, Nadia L

    2017-01-01

    . Evaluation of THIP as a potential anticonvulsant has given contradictory results in different animal models and for this reason, we reevaluated the anticonvulsive properties of THIP in the murine pentylenetetrazole (PTZ) kindling model. As loss of δ-GABAA R in the dentate gyrus has been associated...... with several animal models of epilepsy, we first investigated the presence of functional δ-GABAA receptors. Both immunohistochemistry and Western blot data demonstrated that δ-GABAA R expression is not only present in the dentate gyrus, but also the expression level was enhanced in the early phase after PTZ...... kindling. Whole-cell patch-clamp studies in acute hippocampal brain slices revealed that THIP was indeed able to induce a tonic inhibition in dentate gyrus granule cells. However, THIP induced a tonic current of similar magnitude in the PTZ-kindled mice compared to saline-treated animals despite...

  7. Performance Evaluation of 3d Modeling Software for Uav Photogrammetry

    Science.gov (United States)

    Yanagi, H.; Chikatsu, H.

    2016-06-01

UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAV and freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed. Consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modeling software are black-box algorithms. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motive, Smart3DCapture and Pix4Dmapper were downloaded from the Internet and the commercial software PhotoScan was also employed; investigations were performed in this paper using check points and images obtained from a UAV.

  8. PERFORMANCE EVALUATION OF 3D MODELING SOFTWARE FOR UAV PHOTOGRAMMETRY

    Directory of Open Access Journals (Sweden)

    H. Yanagi

    2016-06-01

Full Text Available UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAV and freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed. Consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modeling software are black-box algorithms. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motive, Smart3DCapture and Pix4Dmapper were downloaded from the Internet and the commercial software PhotoScan was also employed; investigations were performed in this paper using check points and images obtained from a UAV.

  9. Evaluating the performance and utility of regional climate models

    DEFF Research Database (Denmark)

    Christensen, Jens H.; Carter, Timothy R.; Rummukainen, Markku

    2007-01-01

    This special issue of Climatic Change contains a series of research articles documenting co-ordinated work carried out within a 3-year European Union project 'Prediction of Regional scenarios and Uncertainties for Defining European Climate change risks and Effects' (PRUDENCE). The main objective...... weather events and (7) implications of the results for policy. A paper summarising the related MICE (Modelling the Impact of Climate Extremes) project is also included. The second part of the issue contains 12 articles that focus in more detail on some of the themes summarised in the overarching papers....... The PRUDENCE results represent the first comprehensive, continental-scale intercomparison and evaluation of high resolution climate models and their applications, bringing together climate modelling, impact research and social sciences expertise on climate change....

  10. Evaluation of the St. Lucia geothermal resource: macroeconomic models

    Energy Technology Data Exchange (ETDEWEB)

    Burris, A.E.; Trocki, L.K.; Yeamans, M.K.; Kolstad, C.D.

    1984-08-01

    A macroeconometric model describing the St. Lucian economy was developed using 1970 to 1982 economic data. Results of macroeconometric forecasts for the period 1983 through 1985 show an increase in gross domestic product (GDP) for 1983 and 1984 with a decline in 1985. The rate of population growth is expected to exceed GDP growth so that a small decline in per capita GDP will occur. We forecast that garment exports will increase, providing needed employment and foreign exchange. To obtain a longer-term but more general outlook on St. Lucia's economy, and to evaluate the benefit of geothermal energy development, we applied a nonlinear programming model. The model maximizes discounted cumulative consumption.
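The nonlinear programming model above maximizes discounted cumulative consumption. A minimal sketch of evaluating that objective for a candidate consumption path (the consumption figures and discount rate below are hypothetical, not from the St. Lucia study):

```python
def discounted_consumption(path, rate):
    """Discounted cumulative consumption: sum of c_t / (1 + rate)^t,
    with t starting at 0 for the first period of the planning horizon."""
    return sum(c / (1.0 + rate) ** t for t, c in enumerate(path))

# A three-period path with 5% per-period growth, discounted at 5%
objective = discounted_consumption([100.0, 105.0, 110.0], rate=0.05)
```

An optimizer would search over feasible paths (subject to production, trade and investment constraints) for the path maximizing this sum; only the objective evaluation is sketched here.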

  11. Modeling and evaluation of information systems using coloured petri network

    Directory of Open Access Journals (Sweden)

    Ehsan Zamirpour

    2014-07-01

Full Text Available Nowadays, with the growth of organizations and their affiliates, the importance of information systems has increased. Both the functional and non-functional requirements of an organization's information systems must be supported. There are several techniques in software methodologies to support functional requirements, but support for the second set of requirements has received little attention. Software Performance Engineering (SPE) tries to address this issue by presenting software methodologies that support both types of requirements. In this paper, we present a formal model for the evaluation of system performance based on a pragmatic model. Because they support concurrency, Petri nets are preferable to queuing systems. To map UML diagrams to a coloured Petri net, we use an intermediate graph. The preliminary results indicate that the proposed model may save a significant amount of computation.
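A coloured Petri net extends the plain place/transition net with typed ("coloured") tokens; the minimal sketch below (an illustrative toy, not the paper's actual model) shows the basic firing rule that makes Petri nets a natural fit for modelling concurrency:

```python
# Minimal place/transition Petri net: places hold token counts; a transition
# is enabled when every input place holds at least one token, and firing it
# moves tokens from inputs to outputs.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)      # place -> token count
        self.transitions = {}             # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name!r} not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Two requests queued at a single server: 'serve' consumes a token from
# 'queue' and 'idle', and returns the server token to 'idle' when done.
net = PetriNet({"queue": 2, "idle": 1, "done": 0})
net.add_transition("serve", inputs=["queue", "idle"], outputs=["done", "idle"])
net.fire("serve")
net.fire("serve")
```

Performance evaluation then annotates such transitions with timing distributions and analyzes the reachable markings, which is where Petri nets outdo simple queuing systems for concurrent workloads.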

  12. Models of evaluation of public joint-stock property management

    Science.gov (United States)

    Yakupova, N. M.; Levachkova, S.; Absalyamova, S. G.; Kvon, G.

    2017-12-01

The paper deals with models for evaluating the performance of both the management company and individual subsidiaries on the basis of a combination of elements of multi-parameter and target approaches. The article shows that, owing to the multi-dimensional and multi-directional nature of indicators of financial and economic activity, it is necessary to assess the degree of achievement of objectives with a multivariate ordinal model: a set of indicators ordered by growth, such that maintaining this order over a long interval of time ensures the effective functioning of the enterprise in the long term. It is shown that these models can be regarded as monitoring tools for the implementation of strategies and a guide for justifying the effectiveness of management decisions.
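One common way to operationalize such a multivariate ordinal model is to rank indicators by their observed growth rates and score how closely that ranking matches the normative order with a rank correlation. The sketch below (indicator names and figures are hypothetical, and the scoring choice is an assumption, not taken from the paper) uses Kendall's tau:

```python
# Normative order: profit should grow faster than revenue, which should
# grow faster than costs. The score is 1.0 when the observed growth-rate
# ranking matches this order exactly, and -1.0 when it is fully reversed.

def growth(series):
    return series[-1] / series[0] - 1.0

target_order = ["profit", "revenue", "costs"]   # fastest growth first
observed = {                                    # hypothetical two-period data
    "profit":  [100.0, 130.0],
    "revenue": [500.0, 575.0],
    "costs":   [400.0, 420.0],
}

rates = {k: growth(v) for k, v in observed.items()}
observed_order = sorted(rates, key=rates.get, reverse=True)

def kendall_tau(order_a, order_b):
    """Kendall rank correlation between two orderings of the same items."""
    rank_b = {item: i for i, item in enumerate(order_b)}
    n = len(order_a)
    concordant = sum(
        1 for i in range(n) for j in range(i + 1, n)
        if rank_b[order_a[i]] < rank_b[order_a[j]]
    )
    total = n * (n - 1) // 2
    return (2.0 * concordant - total) / total

score = kendall_tau(target_order, observed_order)
```

A score tracked over successive periods gives exactly the kind of long-interval monitoring tool the abstract describes.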

  13. Risk Evaluation of Railway Coal Transportation Network Based on Multi Level Grey Evaluation Model

    Science.gov (United States)

    Niu, Wei; Wang, Xifu

    2018-01-01

Railway transport is currently the most important mode of coal transportation, and China's railway coal transportation network has become increasingly complete, but issues remain, such as insufficient capacity and some lines close to saturation. In this paper, risk assessment theory and methods, the analytic hierarchy process and a multi-level grey evaluation model are applied to the risk evaluation of the coal railway transportation network in China, with an example analysis of the Shanxi railway coal transportation network, in order to improve its internal structure and market competitiveness.
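A hedged sketch of a single grey-evaluation step of the kind such models chain across levels (the factor names, AHP-style weights and expert scores below are hypothetical, not from the Shanxi case study): expert scores on [0, 10] are mapped by triangular whitening-weight functions to memberships in "low"/"medium"/"high" risk grey classes, then aggregated with the factor weights.

```python
def whitening(score):
    """Triangular memberships (low, medium, high) for a score on [0, 10]."""
    low = max(0.0, (5.0 - score) / 5.0)
    high = max(0.0, (score - 5.0) / 5.0)
    medium = max(0.0, 1.0 - abs(score - 5.0) / 5.0)
    return (low, medium, high)

factors = {                       # factor: (AHP weight, mean expert score)
    "line saturation": (0.5, 8.0),
    "rolling stock":   (0.3, 5.0),
    "weather":         (0.2, 3.0),
}

classes = [0.0, 0.0, 0.0]         # aggregated low / medium / high memberships
for weight, score in factors.values():
    for i, membership in enumerate(whitening(score)):
        classes[i] += weight * membership

labels = ["low", "medium", "high"]
verdict = labels[classes.index(max(classes))]   # overall risk class
```

In a multi-level model the aggregated class vector of each factor group becomes an input to the next level up, with its own weight vector.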

  14. [Evaluation of a face model for surgical education].

    Science.gov (United States)

    Schneider, G; Voigt, S; Rettinger, G

    2011-09-01

The complex anatomy of the human face requires a high degree of experience and skill in the surgical repair of facial soft-tissue defects. Training has traditionally consisted of literature study and supervision during surgery, depending on the surgical spectrum of the teaching hospital. A structured education including training of different surgical methods on a model, with a gradual increase in complexity, could considerably improve the subsequent patient-related training. In a cooperative project, 3di GmbH and the Department of Otolaryngology at the Friedrich-Schiller-University Jena developed a face model for surgical education that allows the training of surgical interventions on the face. The model was used during the 6th and 8th Jena Workshop for Functional and Aesthetic Surgery as well as a surgical suturing workshop, and was tested and evaluated by the attendees. The attendees mostly rated the workability of the models and the possibility to practice on a realistic face model with artificial skin as very good and beneficial. This model allows repeatable and structured training of surgical standards, and is very helpful in preparation for operating on facial defects of a patient. Georg Thieme Verlag KG Stuttgart · New York.

  15. Diagnostic Air Quality Model Evaluation of Source-Specific ...

    Science.gov (United States)

    Ambient measurements of 78 source-specific tracers of primary and secondary carbonaceous fine particulate matter collected at four midwestern United States locations over a full year (March 2004–February 2005) provided an unprecedented opportunity to diagnostically evaluate the results of a numerical air quality model. Previous analyses of these measurements demonstrated excellent mass closure for the variety of contributing sources. In this study, a carbon-apportionment version of the Community Multiscale Air Quality (CMAQ) model was used to track primary organic and elemental carbon emissions from 15 independent sources such as mobile sources and biomass burning in addition to four precursor-specific classes of secondary organic aerosol (SOA) originating from isoprene, terpenes, aromatics, and sesquiterpenes. Conversion of the source-resolved model output into organic tracer concentrations yielded a total of 2416 data pairs for comparison with observations. While emission source contributions to the total model bias varied by season and measurement location, the largest absolute bias of −0.55 μgC/m3 was attributed to insufficient isoprene SOA in the summertime CMAQ simulation. Biomass combustion was responsible for the second largest summertime model bias (−0.46 μgC/m3 on average). Several instances of compensating errors were also evident; model underpredictions in some sectors were masked by overpredictions in others. The National Exposure Research L

  16. Metal Mixture Modeling Evaluation project: 2. Comparison of four modeling approaches

    Science.gov (United States)

    Farley, Kevin J.; Meyer, Joe; Balistrieri, Laurie S.; DeSchamphelaere, Karl; Iwasaki, Yuichi; Janssen, Colin; Kamo, Masashi; Lofts, Steve; Mebane, Christopher A.; Naito, Wataru; Ryan, Adam C.; Santore, Robert C.; Tipping, Edward

    2015-01-01

As part of the Metal Mixture Modeling Evaluation (MMME) project, models were developed by the National Institute of Advanced Industrial Science and Technology (Japan), the U.S. Geological Survey (USA), HDR|HydroQual, Inc. (USA), and the Centre for Ecology and Hydrology (UK) to address the effects of metal mixtures on biological responses of aquatic organisms. A comparison of the 4 models, as they were presented at the MMME Workshop in Brussels, Belgium (May 2012), is provided herein. Overall, the models were found to be similar in structure (free ion activities computed by WHAM; specific or non-specific binding of metals/cations in or on the organism; specification of metal potency factors and/or toxicity response functions to relate metal accumulation to biological response). Major differences in modeling approaches are attributed to various modeling assumptions (e.g., single versus multiple types of binding site on the organism) and specific calibration strategies that affected the selection of model parameters. The models provided a reasonable description of additive (or nearly additive) toxicity for a number of individual toxicity test results. Less-than-additive toxicity was more difficult to describe with the available models. Because of limitations in the available datasets and the strong inter-relationships among the model parameters (log KM values, potency factors, toxicity response parameters), further evaluation of specific model assumptions and calibration strategies is needed.

  17. Evaluation of a laboratory model of human head impact biomechanics.

    Science.gov (United States)

    Hernandez, Fidel; Shull, Peter B; Camarillo, David B

    2015-09-18

    This work describes methodology for evaluating laboratory models of head impact biomechanics. Using this methodology, we investigated: how closely does twin-wire drop testing model head rotation in American football impacts? Head rotation is believed to cause mild traumatic brain injury (mTBI) but helmet safety standards only model head translations believed to cause severe TBI. It is unknown whether laboratory head impact models in safety standards, like twin-wire drop testing, reproduce six degree-of-freedom (6DOF) head impact biomechanics that may cause mTBI. We compared 6DOF measurements of 421 American football head impacts to twin-wire drop tests at impact sites and velocities weighted to represent typical field exposure. The highest rotational velocities produced by drop testing were the 74th percentile of non-injury field impacts. For a given translational acceleration level, drop testing underestimated field rotational acceleration by 46% and rotational velocity by 72%. Primary rotational acceleration frequencies were much larger in drop tests (~100 Hz) than field impacts (~10 Hz). Drop testing was physically unable to produce acceleration directions common in field impacts. Initial conditions of a single field impact were highly resolved in stereo high-speed video and reconstructed in a drop test. Reconstruction results reflected aggregate trends of lower amplitude rotational velocity and higher frequency rotational acceleration in drop testing, apparently due to twin-wire constraints and the absence of a neck. These results suggest twin-wire drop testing is limited in modeling head rotation during impact, and motivate continued evaluation of head impact models to ensure helmets are tested under conditions that may cause mTBI. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Information Society and Library Evaluation Transitions in Portugal: A Meta-evaluation Model and Frameworks (1970–2013)

    OpenAIRE

    Leonor Gaspar Pinto; Ochôa Paula

    2014-01-01

The need for greater understanding of assessment practices and models highlights the lack of an up-to-date meta-evaluation model articulated with new phases in Information Society (IS) development. This paper aims to discuss the meta-evaluation model and frameworks that were created to explain the relations between IS transitions and the development of library performance evaluation models in Portugal (1970–2013). The research is based on a qualitative methodology supported by a comb...

  19. Review of models used for determining consequences of UF6 release: Model evaluation report. Volume 2

    International Nuclear Information System (INIS)

    Nair, S.K.; Chambers, D.B.; Park, S.H.; Radonjic, Z.R.; Coutts, P.T.; Lewis, C.J.; Hammonds, J.S.; Hoffman, F.O.

    1997-11-01

Three uranium hexafluoride (UF6)-specific models--HGSYSTEM/UF6, Science Applications International Corporation (SAIC), and RTM-96--three dense-gas models--DEGADIS, SLAB, and the Chlorine Institute methodology--and one toxic chemical model--AFTOX--are evaluated on their capabilities to simulate the chemical reactions, thermodynamics, and atmospheric dispersion of UF6 released in accidents at nuclear fuel-cycle facilities, in support of Integrated Safety Analysis, Emergency Response Planning, and Post-Accident Analysis. These models are also evaluated for user-friendliness and for quality assurance and quality control features, to ensure the validity and credibility of the results. Model performance evaluations are conducted for the three UF6-specific models, using field data on releases of UF6 and other heavy gases. Predictions from the HGSYSTEM/UF6 and SAIC models are within an order of magnitude of the field data, but the SAIC model overpredicts by more than an order of magnitude for a few UF6-specific data points. The RTM-96 model overpredicts within a factor of 3 for all data points beyond 400 m from the source. For one data set, however, the RTM-96 model severely underpredicts the observations within 200 m of the source. Outputs of the models are most sensitive to the meteorological parameters at large distances from the source and to certain source-specific and meteorological parameters at distances close to the source. Specific recommendations are made to improve the applicability and usefulness of the three models and to choose a specific model to support the intended analyses. Guidance is also provided on the choice of input parameters for initial dilution, building wake effects, and the distance to completion of the UF6 reaction with water.

  20. Evaluation of Data Used for Modelling the Stratosphere of Saturn

    Science.gov (United States)

    Armstrong, Eleanor Sophie; Irwin, Patrick G. J.; Moses, Julianne I.

    2015-11-01

Planetary atmospheres are modeled through the use of a photochemical and kinetic reaction scheme constructed from experimentally and theoretically determined rate coefficients, photoabsorption cross sections and branching ratios for the molecules described within them. The KINETICS architecture has previously been developed to model planetary atmospheres and is applied here to Saturn's stratosphere. We consider the pathways that comprise the reaction scheme of a current model, and update the reaction scheme according to the findings of a literature investigation. We evaluate contemporary photochemical literature, studying recent data sets of cross-sections and branching ratios for a number of hydrocarbons used in the photochemical scheme of Model C of KINETICS. In particular, new photodissociation branching ratios for CH4, C2H2, C2H4, C3H3, C3H5 and C4H2, and new cross-sectional data for C2H2, C2H4, C2H6, C3H3, C4H2, C6H2 and C8H2 are considered. By evaluating the techniques used and data sets obtained, a new reaction scheme selection was drawn up. These data are then used within the preferred reaction scheme of the thesis and applied to the KINETICS atmospheric model to produce a model of the stratosphere of Saturn in a steady state. A total output of the preferred reaction scheme is presented, and the data is compared both with the previous reaction scheme and with data from the Cassini spacecraft in orbit around Saturn. One of the key findings of this work is that there is significant change in the model's output as a result of temperature-dependent data determination. Although only shown within the changes to the photochemical portion of the preferred reaction scheme, it is suggested that an equally important temperature dependence will be exhibited in the kinetic section of the reaction scheme. The photochemical model output is shown to be highly dependent on the reaction scheme preferred by this thesis. The importance of correct

  1. Evaluation of radiobiological effects in 3 distinct biological models

    International Nuclear Information System (INIS)

    Lemos, J.; Costa, P.; Cunha, L.; Metello, L.F.; Carvalho, A.P.; Vasconcelos, V.; Genesio, P.; Ponte, F.; Costa, P.S.; Crespo, P.

    2015-01-01

    The present work aims to share the process of development of advanced biological models for studying radiobiological effects. Recognizing several known limitations and difficulties of the current monolayer cellular models, as well as the increasing difficulties of using advanced biological models, our group has been developing alternative advanced biological models, namely three-dimensional cell cultures and a less explored animal model (the zebrafish, Danio rerio, which gives access to inter-generational data while showing great genetic homology with humans). These three models (monolayer cell culture, three-dimensional cell culture and zebrafish) were externally irradiated with 100 mGy, 500 mGy or 1 Gy. The consequences of that irradiation were studied using cellular and molecular tests. Our previous experimental studies with 100 mGy external gamma irradiation of HepG2 monolayer cells showed a slight increase in the proliferation rate 24 h, 48 h and 72 h post irradiation. These results also pointed to the presence of bystander effects 72 h post irradiation, which motivated the more detailed analysis carried out in this work. At this stage, we remain focused on the acute biological effects. The results obtained, namely MTT and clonogenic assays for evaluating cellular metabolic activity and proliferation in the in vitro models, as well as proteomics for the evaluation of in vivo effects, will be presented, discussed and explained. Several hypotheses will be presented and defended based on the facts previously demonstrated.
This work aims to share the current state and the results already available from this medium-term project, building proof of the added value of applying these advanced models, while demonstrating the strongest and weakest points of each of them (thus allowing comparison between them, and informing the subsequent choice for research groups starting

  2. Photochemical Air Quality Model Evaluation and Rog Reactivity Analysis

    Science.gov (United States)

    McNair, Laurie A.

    1995-11-01

    This thesis is concerned with model performance evaluation and the analysis of control strategies. First, the model was applied to Los Angeles, and a suite of statistical and graphical analyses found that the model could predict ozone within a GNE of 31%. It was also found that the variability of the observations was similar to that of the predictions (GNE of 27%) and that these variations may need to be taken into account when evaluating models. The model was also applied to Mexico City, and model results indicated that the vertical diffusivity may not be represented well. With an alternate algorithm, the model predictions for ozone were within a GNE of 36%, and the spatial variability of the observations was similar to that found for Los Angeles. Next, the model was applied to an episode in Los Angeles when liquid water was present in order to determine the effect of modeling aqueous phase chemistry. It was found to have no effect on gas phase species, but to be important for acid deposition. The thesis then focused on using differences in ozone forming potential (or reactivity) to determine control strategies. First, the reactivities of 10 ROG classes commonly present in automobile exhaust were found. Next, these reactivities were used to calculate reactivity adjustment factors to determine the allowable emissions for an alternatively fueled vehicle. A test was then performed to see if the reactivity-adjusted emissions for three AFVs led to similar ozone formation. It was found that M85 led to 10% more ozone formation than a conventionally fueled vehicle during very stagnant meteorologies. Next, the exemption of acetone from ROG regulations was explored to estimate how the exemption of a less reactive ROG could affect the use of more reactive compounds which performed the same function. It was found that acetone was about as reactive as ethane, although this varied with ROG/NO_x conditions. Thus, California would have to decide if they wished to

  3. Developing a conceptual model for selecting and evaluating online markets

    Directory of Open Access Journals (Sweden)

    Sadegh Feizollahi

    2013-04-01

    There is considerable evidence emphasizing the benefits of using new information and communication technologies in international business, and many believe that e-commerce can help satisfy customers' explicit and implicit requirements. Internet shopping is a concept that developed after the introduction of electronic commerce. Information technology (IT) and its applications, specifically in the realm of the internet and e-mail, promoted the development of e-commerce in terms of advertising, motivation and information. With the development of new technologies, credit and financial exchange facilities were built into websites to facilitate e-commerce. The study sent a total of 200 questionnaires to the target group (teachers, students, professionals and managers of commercial websites) and collected 130 questionnaires for final evaluation. Cronbach's alpha was used to measure reliability and to evaluate the validity of the measurement instruments (questionnaires), and confirmatory factor analysis was employed to assure construct validity. In addition, the research questions were analyzed using the path analysis method to determine market selection models. In the present study, after examining different aspects of e-commerce, we provide a conceptual model for selecting and evaluating online marketing in Iran. These findings provide a consistent, targeted and holistic framework for the development of the internet market in the country.

  4. Evaluating Alzheimer's Disease Progression by Modeling Crosstalk Network Disruption

    Science.gov (United States)

    Liu, Haochen; Wei, Chunxiang; He, Hua; Liu, Xiaoquan

    2016-01-01

    Aβ, tau, and P-tau have been widely accepted as reliable markers for Alzheimer's disease (AD). The crosstalk between these markers forms a complex network, and AD may induce integral variation and disruption of this network. The aim of this study was to develop a novel mathematical model based on a simplified crosstalk network to evaluate the disease progression of AD. The integral variation of the network is measured by three integral disruption parameters, and the robustness of the network is evaluated by a network disruption probability. The results presented show that network disruption probability has a good linear relationship with the Mini Mental State Examination (MMSE). The proposed model combined with a support vector machine (SVM) achieves relatively high 10-fold cross-validated performance in classification of AD vs. normal and mild cognitive impairment (MCI) vs. normal (95% accuracy, 95% sensitivity, 95% specificity for AD vs. normal; 90% accuracy, 94% sensitivity, 83% specificity for MCI vs. normal). This research evaluates the progression of AD and facilitates early diagnosis of AD. PMID:26834548
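The 10-fold cross-validated SVM classification step described in this abstract can be sketched as follows. This is a minimal illustration under stated assumptions: the features are synthetic stand-ins for the study's network-disruption parameters, and the printed accuracy applies only to the toy data.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical stand-in for the study's network-disruption features:
# 40 "normal" and 40 "AD" subjects with shifted disruption scores.
X = np.vstack([rng.normal(0.0, 1.0, (40, 3)),
               rng.normal(1.5, 1.0, (40, 3))])
y = np.array([0] * 40 + [1] * 40)

# 10-fold cross-validated SVM, mirroring the evaluation in the abstract.
scores = cross_val_score(SVC(kernel="rbf"), X, y, cv=10)
print(round(scores.mean(), 3))
```

On real marker data the result would of course depend on the disruption parameters themselves, not on this synthetic separation.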

  5. The Third Phase of AQMEII: Evaluation Strategy and Multi-Model Performance Analysis

    Science.gov (United States)

    AQMEII (Air Quality Model Evaluation International Initiative) is an extraordinary effort promoting policy-relevant research on regional air quality model evaluation across the European and North American atmospheric modelling communities, providing the ideal platform for advanci...

  6. BALANCED SCORECARDS EVALUATION MODEL THAT INCLUDES ELEMENTS OF ENVIRONMENTAL MANAGEMENT SYSTEM USING AHP MODEL

    Directory of Open Access Journals (Sweden)

    Jelena Jovanović

    2010-03-01

    The research is oriented toward improvement of an environmental management system (EMS) using the BSC (Balanced Scorecard) model, which presents a strategic model for the measurement and improvement of organisational performance. The research presents an approach for involving objectives and environmental management metrics (proposed by the literature review) in a conventional BSC in the AD "Barska plovidba" organisation. Further, we test the creation of an ECO-BSC model based on the business activities of non-profit organisations in order to improve the environmental management system in parallel with other management systems. Using this approach we may obtain four models of BSC that include elements of the environmental management system for AD "Barska plovidba". Taking into account that implementation and evaluation need a long period of time in AD "Barska plovidba", the final choice will be based on ISO 14598 (Information technology - Software product evaluation) and ISO 9126 (Software engineering - Product quality) using the AHP method. Those standards are usually used for the evaluation of the quality of software products and computer programs that serve in an organisation as support and factors for development. The AHP model will thus be based on evaluation criteria drawn from the ISO 9126 standard and on types of evaluation from two evaluation teams. Members of team 1 will be experts in BSC and environmental management systems who are not employed in the AD "Barska plovidba" organisation. The members of team 2 will be managers of the AD "Barska plovidba" organisation (including managers from the environmental department). Merging the results of the two previously created AHP models, one can obtain the most appropriate BSC that includes elements of the environmental management system. The chosen model will at the same time present a suggested approach for including ecological metrics in a conventional BSC model for a firm that has at least one ECO strategic orientation.

  7. Scenario Evaluator for Electrical Resistivity survey pre-modeling tool

    Science.gov (United States)

    Terry, Neil; Day-Lewis, Frederick D.; Robinson, Judith L.; Slater, Lee D.; Halford, Keith J.; Binley, Andrew; Lane, John W.; Werkema, Dale D.

    2017-01-01

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, such as the appropriate depth intervals or resolution of the methods. The relationship between ERI data and resistivity is nonlinear; therefore, these limitations depend on site conditions and survey design and are best assessed through forward and inverse modeling exercises prior to field investigations. In this approach, proposed field surveys are first numerically simulated given the expected electrical properties of the site, and the resulting hypothetical data are then analyzed using inverse models. Performing ERI forward/inverse modeling, however, requires substantial expertise and can take many hours to implement. We present a new spreadsheet-based tool, the Scenario Evaluator for Electrical Resistivity (SEER), which features a graphical user interface that allows users to manipulate a resistivity model and instantly view how that model would likely be interpreted by an ERI survey. The SEER tool is intended for use by those who wish to determine the value of including ERI to achieve project goals, and is designed to have broad utility in industry, teaching, and research.

  8. Communicating Sustainability: An Operational Model for Evaluating Corporate Websites

    Directory of Open Access Journals (Sweden)

    Alfonso Siano

    2016-09-01

    The interest in corporate sustainability has increased rapidly in recent years and has encouraged organizations to adopt appropriate digital communication strategies, in which the corporate website plays a key role. Despite this growing attention in both the academic and business communities, models for the analysis and evaluation of online sustainability communication have not been developed to date. This paper aims to develop an operational model to identify and assess the requirements of sustainability communication in corporate websites. It has been developed from a literature review on corporate sustainability and digital communication and the analysis of the websites of the organizations included in the “Global CSR RepTrak 2015” by the Reputation Institute. The model identifies the core dimensions of online sustainability communication (orientation, structure, ergonomics, content—OSEC), sub-dimensions, such as stakeholder engagement and governance tools, communication principles, and measurable items (e.g., presence of the materiality matrix, interactive graphs). A pilot study on the websites of the energy and utilities companies included in the Dow Jones Sustainability World Index 2015 confirms the applicability of the OSEC framework. Thus, the model can provide managers and digital communication consultants with an operational tool that is useful for developing an industry ranking and assessing the best practices. The model can also help practitioners to identify corrective actions in the critical areas of digital sustainability communication and avoid greenwashing.

  9. Evaluation of the performance of DIAS ionospheric forecasting models

    Directory of Open Access Journals (Sweden)

    Tsagouri Ioanna

    2011-08-01

    Nowcasting and forecasting ionospheric products and services for the European region have been provided regularly since August 2006 through the European Digital upper Atmosphere Server (DIAS, http://dias.space.noa.gr). Currently, DIAS ionospheric forecasts are based on the online implementation of two models: (i) the solar wind driven autoregression model for ionospheric short-term forecast (SWIF), which combines historical and real-time ionospheric observations with solar-wind parameters obtained in real time at the L1 point from the NASA ACE spacecraft, and (ii) the geomagnetically correlated autoregression model (GCAM), which is a time series forecasting method driven by a synthetic geomagnetic index. In this paper we investigate the operational ability and the accuracy of both DIAS models, carrying out a metrics-based evaluation of their performance under all possible conditions. The analysis was based on systematic comparison of the models’ predictions with actual observations obtained over almost one solar cycle (1998–2007) at four European ionospheric locations (Athens, Chilton, Juliusruh and Rome), and on comparison of the models’ performance against two simple prediction strategies, median- and persistence-based predictions, during storm conditions. The results verify operational validity for both models and quantify their prediction accuracy under all possible conditions, in support of operational applications but also of comparative studies assessing or expanding current ionospheric forecasting capabilities.

  10. Introducing A Hybrid Data Mining Model to Evaluate Customer Loyalty

    Directory of Open Access Journals (Sweden)

    H. Alizadeh

    2016-12-01

    The main aim of this study was to introduce a comprehensive model of bank customers' loyalty evaluation based on the assessment and comparison of the performance of different clustering methods. The study also pursues the following specific objectives: (a) using different clustering methods and comparing them for customer classification, (b) finding the variables that are effective in determining customer loyalty, and (c) using different ensemble classification methods to increase modeling accuracy and comparing the results with the basic methods. Since loyal customers generate more profit, this study introduces a two-step model for the classification of customers and their loyalty. For this purpose, various clustering methods such as K-medoids, X-means and K-means were used, the last of which outperformed the other two when compared using the Davies-Bouldin index. Customers were clustered using K-means, and the members of the four resulting clusters were analyzed and labeled. Then, a predictive model was run on the demographic variables of customers using various classification methods such as DT (decision tree), ANN (artificial neural networks), NB (naive Bayes), KNN (K-nearest neighbors) and SVM (support vector machine), as well as their bagging and boosting variants, to predict the class of loyal customers. The results showed that bagging-ANN was the most accurate method in predicting loyal customers. This two-stage model can be used in banks and financial institutions with similar data to identify the type of future customers.
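The first step of the two-step pattern described above (cluster customers with K-means, then judge the clustering with the Davies-Bouldin index) can be sketched as follows; the customer features are hypothetical stand-ins for the study's banking data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import davies_bouldin_score

rng = np.random.default_rng(1)

# Hypothetical customer features (e.g. recency, frequency, monetary value),
# drawn as four well-separated synthetic groups of 50 customers each.
X = np.vstack([rng.normal(c, 0.5, (50, 3)) for c in (0.0, 3.0, 6.0, 9.0)])

# Four clusters, as in the study; the Davies-Bouldin index (lower is
# better) is the criterion the authors used to compare clustering methods.
labels = KMeans(n_clusters=4, n_init=10, random_state=1).fit_predict(X)
print(round(davies_bouldin_score(X, labels), 3))
```

The same scoring call can be repeated for K-medoids or X-means results to reproduce the method comparison on real data.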

  11. An evaluation of attention models for use in SLAM

    Science.gov (United States)

    Dodge, Samuel; Karam, Lina

    2013-12-01

    In this paper we study the application of visual saliency models to the simultaneous localization and mapping (SLAM) problem. We consider visual SLAM, where the location of the camera and a map of the environment can be generated using images from a single moving camera. In visual SLAM, the interest point detector is of key importance. This detector must be invariant to certain image transformations so that features can be matched across different frames. Recent work has used a model of human visual attention to detect interest points; however, it is unclear what the best attention model for this purpose is. To this end, we compare the performance of interest points from four saliency models (Itti, GBVS, RARE, and AWS) with the performance of four traditional interest point detectors (Harris, Shi-Tomasi, SIFT, and FAST). We evaluate these detectors under several different types of image transformation and find that the Itti saliency model, in general, achieves the best performance in terms of keypoint repeatability.

  12. Comprehensive evaluation of Shahid Motahari Educational Festival during 2008-2013 based on CIPP Evaluation Model

    Directory of Open Access Journals (Sweden)

    SN Hosseini

    2014-09-01

    Introduction: Improving the quality of education is one of the main goals of higher education. In this regard, various solutions have been provided, such as holding the annual Shahid Motahari educational festival, in order to recognize the educational process, develop and innovate educational processes and procedures, and prepare standards and processes for educational accreditation. The aim of this study was a comprehensive evaluation of the Shahid Motahari educational festival over six periods (2008-2013) based on the CIPP evaluation model. Method: This cross-sectional study was conducted among 473 faculty members, including educational deputies and administrators, administrators and faculty members of medical education development centers, members of the scientific committee, and faculty members participating in the Shahid Motahari festival from 42 universities of medical sciences of Iran. Data were collected using self-report written questionnaires and analyzed with SPSS version 20 at the α=0.05 significance level. Results: The subjects reported 75.13%, 65.33%, 64.5%, and 59.21% of the receivable scores for the process, context, input and product domains, respectively. In addition, there was a direct and significant correlation between all domains. Conclusion: According to the findings, the evaluation of and correlations between the model domains show that the holding of the festival was appropriate, and that the main reason for the poor evaluation in the product domain is related to problems in the input and product domains.

  13. Applying a realistic evaluation model to occupational safety interventions

    DEFF Research Database (Denmark)

    Pedersen, Louise Møller

    2018-01-01

    Background: Recent literature characterizes occupational safety interventions as complex social activities, applied in complex and dynamic social systems. Hence, the actual outcomes of an intervention will vary, depending on the intervention, the implementation process, context, personal characteristics of key actors (defined mechanisms), and the interplay between them, and can be categorized as expected or unexpected. However, little is known about ’how’ to include context and mechanisms in evaluations of intervention effectiveness. A revised realistic evaluation model has been introduced ... and qualitative methods. This revised model has, however, not been applied in a real life context. Method: The model is applied in a controlled, four-component, integrated behaviour-based and safety culture-based safety intervention study (2008-2010) in a medium-sized wood manufacturing company. The interventions

  14. A Multiscale Model Evaluates Screening for Neoplasia in Barrett's Esophagus.

    Directory of Open Access Journals (Sweden)

    Kit Curtius

    2015-05-01

    Barrett's esophagus (BE) patients are routinely screened for high grade dysplasia (HGD) and esophageal adenocarcinoma (EAC) through endoscopic screening, during which multiple esophageal tissue samples are removed for histological analysis. We propose a computational method called the multistage clonal expansion for EAC (MSCE-EAC) screening model that is used for screening BE patients in silico to evaluate the effects of biopsy sampling, diagnostic sensitivity, and treatment on disease burden. Our framework seamlessly integrates relevant cell-level processes during EAC development with a spatial screening process to provide a clinically relevant model for detecting dysplastic and malignant clones within the crypt-structured BE tissue. With this computational approach, we retain spatio-temporal information about small, unobserved tissue lesions in BE that may remain undetected during biopsy-based screening but could be detected with high-resolution imaging. This allows evaluation of the efficacy and sensitivity of current screening protocols to detect neoplasia (dysplasia and early preclinical EAC) in the esophageal lining. We demonstrate the clinical utility of this model by predicting three important clinical outcomes: (1) the probability that small cancers are missed during biopsy-based screening, (2) the potential gains in neoplasia detection probabilities if screening occurred via high-resolution tomographic imaging, and (3) the efficacy of ablative treatments that result in the curative depletion of metaplastic and neoplastic cell populations in BE in terms of the long-term impact on reducing EAC incidence.

  15. Biofuel market and carbon modeling to evaluate French biofuel policy

    International Nuclear Information System (INIS)

    Bernard, F.; Prieur, A.

    2006-10-01

    In order to comply with European objectives, France has set up an ambitious biofuel plan. This plan is evaluated against two criteria: the tax exemption needed and GHG emission savings. An economic marginal analysis and a life cycle assessment (LCA) are provided using a coupling procedure between a partial agro-industrial equilibrium model and a refining optimization model. Thus, we are able to determine the minimum tax exemption needed to place a targeted quantity of biofuel on the market, by deducting the biofuel refining long-run marginal revenue from the agro-industrial marginal cost of biofuel production. In parallel, a biofuel LCA is carried out using model outputs. Such a method avoids the common allocation problems between joint products. The French biofuel plan is evaluated for 2008, 2010 and 2012 using prospective scenarios. Results suggest that biofuel competitiveness depends on crude oil prices and petroleum product demands. Consequently, biofuel tax exemption does not always appear to be necessary. LCA results show that biofuel production and use, from 'seed to wheel', would help the French Government comply with its 'Plan Climat' objectives by reducing GHG emissions in the French road transport sector by up to 5% by 2010. (authors)

  16. Addressing Conceptual Model Uncertainty in the Evaluation of Model Prediction Errors

    Science.gov (United States)

    Carrera, J.; Pool, M.

    2014-12-01

    Model predictions are uncertain because of errors in model parameters, future forcing terms, and model concepts. The latter remain the largest and most difficult to assess source of uncertainty in long term model predictions. We first review existing methods to evaluate conceptual model uncertainty. We argue that they are highly sensitive to the ingenuity of the modeler, in the sense that they rely on the modeler's ability to propose alternative model concepts. Worse, we find that the standard practice of stochastic methods leads to poor, potentially biased and often too optimistic, estimation of actual model errors. This is bad news because stochastic methods are purported to properly represent uncertainty. We contend that the problem does not lie in the stochastic approach itself, but in the way it is applied. Specifically, stochastic inversion methodologies, which demand quantitative information, tend to ignore geological understanding, which is conceptually rich. We illustrate some of these problems with the application to the Mar del Plata aquifer, where extensive data are available for nearly a century. Geologically based models, where spatial variability is handled through zonation, yield calibration fits similar to geostatistically based models, but much better predictions. In fact, the appearance of the stochastic T fields is similar to the geologically based models only in areas with a high density of data. We take this finding to illustrate the ability of stochastic models to accommodate many data, but also, ironically, their inability to address conceptual model uncertainty. In fact, stochastic model realizations tend to be too close to the "most likely" one (i.e., they do not really realize the full conceptual uncertainty). The second part of the presentation is devoted to arguing that acknowledging model uncertainty may lead to qualitatively different decisions than just working with "most likely" model predictions. Therefore, efforts should concentrate on

  17. Development of Conceptual Benchmark Models to Evaluate Complex Hydrologic Model Calibration in Managed Basins Using Python

    Science.gov (United States)

    Hughes, J. D.; White, J.

    2013-12-01

    For many numerical hydrologic models it is a challenge to quantitatively demonstrate that complex models are preferable to simpler models. Typically, a decision is made to develop and calibrate a complex model at the beginning of a study. The value of selecting a complex model over simpler models is commonly inferred from use of a model with fewer simplifications of the governing equations because it can be time consuming to develop another numerical code with data processing and parameter estimation functionality. High-level programming languages like Python can greatly reduce the effort required to develop and calibrate simple models that can be used to quantitatively demonstrate the increased value of a complex model. We have developed and calibrated a spatially-distributed surface-water/groundwater flow model for managed basins in southeast Florida, USA, to (1) evaluate the effect of municipal groundwater pumpage on surface-water/groundwater exchange, (2) investigate how the study area will respond to sea-level rise, and (3) explore combinations of these forcing functions. To demonstrate the increased value of this complex model, we developed a two-parameter conceptual-benchmark-discharge model for each basin in the study area. The conceptual-benchmark-discharge model includes seasonal scaling and lag parameters and is driven by basin rainfall. The conceptual-benchmark-discharge models were developed in the Python programming language and used weekly rainfall data. Calibration was implemented with the Broyden-Fletcher-Goldfarb-Shanno method available in the Scientific Python (SciPy) library. Normalized benchmark efficiencies calculated using output from the complex model and the corresponding conceptual-benchmark-discharge model indicate that the complex model has more explanatory power than the simple model driven only by rainfall.
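A minimal sketch of the calibration pattern this abstract describes: a two-parameter rainfall-driven benchmark model fitted with the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method from SciPy. The exponential-store formulation and all data below are illustrative assumptions, not the authors' actual model or basin data.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Synthetic weekly rainfall with a seasonal cycle (104 weeks).
rain = 1.0 + 0.5 * np.sin(2 * np.pi * np.arange(104) / 52) \
       + np.abs(rng.normal(0.0, 0.1, 104))

def simulate(params, rain):
    """Two-parameter benchmark discharge: scale plus a lag-like store."""
    scale, alpha = params
    store, q = rain[0], np.empty_like(rain)
    for i, r in enumerate(rain):
        store = alpha * r + (1.0 - alpha) * store  # lagged rainfall response
        q[i] = scale * store                       # scaled discharge
    return q

obs = simulate((0.6, 0.3), rain)                   # "observed" discharge

def sse(params):                                   # calibration objective
    return np.sum((simulate(params, rain) - obs) ** 2)

# BFGS calibration via SciPy, as named in the abstract.
fit = minimize(sse, x0=(1.0, 0.5), method="BFGS")
print(np.round(fit.x, 3))
```

Because the synthetic observations come from known parameters, the fit should recover roughly (0.6, 0.3); a real application would substitute gauged discharge for `obs`.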

  18. Evaluation and Comparison of Ecological Models Simulating Nitrogen Processes in Treatment Wetlands,Implemented in Modelica

    OpenAIRE

    Edelfeldt, Stina

    2005-01-01

    Two ecological models of nitrogen processes in treatment wetlands have been evaluated and compared. These models have been implemented, simulated, and visualized in the Modelica language. The differences and similarities between the Modelica modeling environment used in this thesis and other environments or tools for ecological modeling have been evaluated. The modeling tools evaluated are PowerSim, Simile, Stella, the MathModelica Model Editor, and WEST. The evaluation and the analysis have...

  19. Evaluation of internal noise methods for Hotelling observer models

    International Nuclear Information System (INIS)

    Zhang Yani; Pham, Binh T.; Eckstein, Miguel P.

    2007-01-01

    The inclusion of internal noise in model observers is a common method to allow for quantitative comparisons between human and model observer performance in visual detection tasks. In this article, we studied two different strategies for inserting internal noise into Hotelling model observers. In the first strategy, internal noise was added to the output of individual channels: (a) Independent nonuniform channel noise, (b) independent uniform channel noise. In the second strategy, internal noise was added to the decision variable arising from the combination of channel responses. The standard deviation of the zero mean internal noise was either constant or proportional to: (a) the decision variable's standard deviation due to the external noise, (b) the decision variable's variance caused by the external noise, (c) the decision variable magnitude on a trial to trial basis. We tested three model observers: square window Hotelling observer (HO), channelized Hotelling observer (CHO), and Laguerre-Gauss Hotelling observer (LGHO) using a four alternative forced choice (4AFC) signal known exactly but variable task with a simulated signal embedded in real x-ray coronary angiogram backgrounds. The results showed that the internal noise method that led to the best prediction of human performance differed across the studied model observers. The CHO model best predicted human observer performance with the channel internal noise. The HO and LGHO best predicted human observer performance with the decision variable internal noise. The present results might guide researchers with the choice of methods to include internal noise into Hotelling model observers when evaluating and optimizing medical image quality
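One of the internal noise strategies described above (zero-mean noise on the decision variable, with standard deviation proportional to the decision variable's standard deviation under external noise) can be illustrated with a toy Hotelling observer. All task parameters here are assumptions for illustration, not the article's angiogram task.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy detection task: a known flat signal in white external noise, so the
# Hotelling template reduces to the normalized signal.
n_pix, n_trials = 64, 5000
signal = np.full(n_pix, 0.3)
template = signal / np.linalg.norm(signal)

def decision_variable(with_signal, internal_gain):
    imgs = rng.normal(0.0, 1.0, (n_trials, n_pix))
    if with_signal:
        imgs += signal
    dv = imgs @ template  # response to external noise (+ signal)
    # Internal noise: zero mean, std proportional to the decision
    # variable's std under external noise (one strategy in the abstract).
    return dv + rng.normal(0.0, internal_gain * dv.std(), n_trials)

d_prime = {}
for gain in (0.0, 1.0):
    dv_s = decision_variable(True, gain)
    dv_n = decision_variable(False, gain)
    d_prime[gain] = (dv_s.mean() - dv_n.mean()) / dv_n.std()
print({g: round(d, 2) for g, d in d_prime.items()})
```

As expected, adding internal noise lowers the observer's detectability index d', which is the lever used to match model performance to human performance.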

  20. A neural network model for credit risk evaluation.

    Science.gov (United States)

    Khashman, Adnan

    2009-08-01

    Credit scoring is one of the key analytical techniques in credit risk evaluation, which has been an active research area in financial risk management. This paper presents a credit risk evaluation system that uses a neural network model based on the back propagation learning algorithm. We train and implement the neural network to decide whether to approve or reject a credit application, using seven learning schemes and real world credit applications from the Australian credit approval dataset. A comparison of the system performance under the different learning schemes is provided; furthermore, we compare the performance of two neural networks, with one and two hidden layers, following the ideal learning scheme. Experimental results suggest that neural networks can be effectively used in automatic processing of credit applications.
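The approve/reject decision pipeline the abstract describes can be sketched with a single-hidden-layer back-propagation network. The applicant features below are synthetic stand-ins for the Australian credit-approval data, and the feature names and decision rule are illustrative assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(4)

# Hypothetical applicant features (e.g. income, debt ratio, ...) with a
# noisy linear approve/reject rule standing in for real credit labels.
n = 400
X = rng.normal(size=(n, 5))
y = (X[:, 0] - X[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=4)

# One-hidden-layer back-propagation network deciding approve (1) / reject (0).
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=4)
clf.fit(X_tr, y_tr)
print(round(clf.score(X_te, y_te), 2))
```

Repeating the fit with `hidden_layer_sizes=(8, 8)` reproduces the paper's one- versus two-hidden-layer comparison on this toy data.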

  1. Mesoscale to microscale wind farm flow modeling and evaluation: Mesoscale to Microscale Wind Farm Models

    Energy Technology Data Exchange (ETDEWEB)

    Sanz Rodrigo, Javier [National Renewable Energy Centre (CENER), Sarriguren Spain; Chávez Arroyo, Roberto Aurelio [National Renewable Energy Centre (CENER), Sarriguren Spain; Moriarty, Patrick [National Renewable Energy Laboratory (NREL), Golden CO USA; Churchfield, Matthew [National Renewable Energy Laboratory (NREL), Golden CO USA; Kosović, Branko [National Center for Atmospheric Research (NCAR), Boulder CO USA; Réthoré, Pierre-Elouan [Technical University of Denmark (DTU), Roskilde Denmark; Hansen, Kurt Schaldemose [Technical University of Denmark (DTU), Lyngby Denmark; Hahmann, Andrea [Technical University of Denmark (DTU), Roskilde Denmark; Mirocha, Jeffrey D. [Lawrence Livermore National Laboratory, Livermore CA USA; Rife, Daran [DNV GL, San Diego CA USA

    2016-08-31

    The increasing size of wind turbines, with rotors already spanning more than 150 m diameter and hub heights above 100 m, requires proper modeling of the atmospheric boundary layer (ABL) from the surface to the free atmosphere. Furthermore, large wind farm arrays create their own boundary layer structure with unique physics. This poses significant challenges to traditional wind engineering models that rely on surface-layer theories and engineering wind farm models to simulate the flow in and around wind farms. However, adopting an ABL approach offers the opportunity to better integrate wind farm design tools and meteorological models. The challenge is how to build the bridge between the atmospheric and wind engineering model communities and how to establish a comprehensive evaluation process that identifies relevant physical phenomena for wind energy applications together with modeling and experimental requirements. A framework for model verification, validation, and uncertainty quantification is established to guide this process by a systematic evaluation of the modeling system at increasing levels of complexity. In terms of atmospheric physics, 'building the bridge' means developing models for the so-called 'terra incognita,' a term used to designate the turbulent scales that transition from mesoscale to microscale. This range of scales within atmospheric research deals with the transition from parameterized to resolved turbulence and the improvement of surface boundary-layer parameterizations. The coupling of meteorological and wind engineering flow models and the definition of a formal model evaluation methodology are strong areas of research for the next generation of wind conditions assessment and wind farm and wind turbine design tools. Some fundamental challenges are identified in order to guide future research in this area.

  2. visCOS: An R-package to evaluate model performance of hydrological models

    Science.gov (United States)

    Klotz, Daniel; Herrnegger, Mathew; Wesemann, Johannes; Schulz, Karsten

    2016-04-01

    The evaluation of model performance is a central part of (hydrological) modelling. Much attention has been given to the development of evaluation criteria and diagnostic frameworks (Klemeš, 1986; Gupta et al., 2008; among many others). Nevertheless, many applications exist for which objective functions do not yet provide satisfying summaries. Thus, the necessity to visualize results arises in order to explore a wider range of model capacities, be they strengths or deficiencies. Visualizations are usually devised for specific projects and these efforts are often not distributed to a broader community (e.g. via open source software packages). Hence, the opportunity to explicitly discuss a state-of-the-art presentation technique is often missed. We therefore present a comprehensive R-package for evaluating model performance by visualizing and exploring different aspects of hydrological time-series. The presented package comprises a set of useful plots and visualization methods, which complement existing packages, such as hydroGOF (Zambrano-Bigiarini et al., 2012). It is derived from practical applications of the hydrological models COSERO and COSEROreg (Kling et al., 2014). visCOS, providing an interface in R, represents an easy-to-use software package for visualizing and assessing model performance and can be used in the process of model calibration or model development. The package provides functions to load hydrological data into R, clean the data, process, visualize, explore and finally save the results in a consistent way. Together with an interactive zoom function for the time series, an online calculation of the objective functions for variable time-windows is included. Common hydrological objective functions, such as the Nash-Sutcliffe Efficiency and the Kling-Gupta Efficiency, can also be evaluated and visualized in different ways for defined sub-periods like hydrological years or seasonal sections. Many hydrologists use long-term water-balances as a
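
    The two objective functions named above have compact closed forms. A minimal NumPy sketch, using the original Gupta et al. (2009) formulation of the Kling-Gupta Efficiency (visCOS's exact implementation may differ):

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe Efficiency: 1 is a perfect fit; 0 means no better than
    predicting the observed mean."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(sim, obs):
    """Kling-Gupta Efficiency: one minus the Euclidean distance from the
    ideal point (r, alpha, beta) = (1, 1, 1) of correlation, variability
    ratio and bias ratio."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]
    alpha = sim.std() / obs.std()
    beta = sim.mean() / obs.mean()
    return 1.0 - np.sqrt((r - 1.0) ** 2 + (alpha - 1.0) ** 2 + (beta - 1.0) ** 2)

# Illustrative observed and simulated discharge values.
obs = np.array([2.0, 3.5, 5.0, 4.0, 2.5])
sim = np.array([2.2, 3.0, 4.6, 4.4, 2.3])
scores = (nse(sim, obs), kge(sim, obs))   # both equal 1.0 only for a perfect fit
```

    Evaluating such functions over sub-periods (hydrological years, seasons) amounts to applying them to the corresponding slices of the time series.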

  3. Looking for a balance between internal and external evaluation of school quality: evaluation of the SVI model.

    NARCIS (Netherlands)

    Blok, H.; Sleegers, P.; Karsten, S.

    2008-01-01

    This article describes the results of a study into the utility of the SVI model, a model in which internal and external evaluation are balanced. The model consists of three phases: school self-evaluation, visitation and inspection. Under the guidance of school consultants, 27 Dutch primary schools

  4. Modelling public risk evaluation of natural hazards: a conceptual approach

    Science.gov (United States)

    Plattner, Th.

    2005-04-01

    In recent years, dealing with natural hazards in Switzerland has shifted away from a hazard-oriented towards a risk-based approach. Decreasing societal acceptance of risk, accompanied by increasing marginal costs of protective measures and decreasing financial resources, causes an optimization problem. Therefore, the new focus lies on the mitigation of the hazard's risk in accordance with economic, ecological and social considerations. This modern approach requires that not only technological, engineering or scientific aspects of the definition of the hazard or the computation of the risk be considered, but also the public's concerns about the acceptance of these risks. These aspects of a modern risk approach enable a comprehensive assessment of the (risk) situation and, thus, sound risk management decisions. In Switzerland, however, the competent authorities suffer from a lack of decision criteria, as they do not know what risk level the public is willing to accept. Consequently, the authorities need to know what society thinks about risks. A formalized model that allows at least a crude simulation of public risk evaluation could therefore be a useful tool to support effective and efficient risk mitigation measures. This paper presents a conceptual approach to such an evaluation model using perception-affecting factors (PAF), evaluation criteria (EC) and several factors without any immediate relation to the risk itself, but to the evaluating person. Finally, the decision about the acceptance Acc of a certain risk i is made by comparing the perceived risk R_i,perc with the acceptable risk R_i,acc.

  5. Error apportionment for atmospheric chemistry-transport models – a new approach to model evaluation

    Directory of Open Access Journals (Sweden)

    E. Solazzo

    2016-05-01

    Full Text Available In this study, methods are proposed to diagnose the causes of errors in air quality (AQ) modelling systems. We investigate the deviation between modelled and observed time series of surface ozone through a revised formulation for breaking down the mean square error (MSE) into bias, variance and the minimum achievable MSE (mMSE). The bias measures the accuracy and implies the existence of systematic errors and poor representation of data complexity, the variance measures the precision and provides an estimate of the variability of the modelling results in relation to the observed data, and the mMSE reflects unsystematic errors and provides a measure of the associativity between the modelled and the observed fields through the correlation coefficient. Each of the error components is analysed independently and apportioned to resolved processes based on the corresponding timescale (long scale, synoptic, diurnal, and intra-day) and as a function of model complexity. The apportionment of the error is applied to the AQMEII (Air Quality Model Evaluation International Initiative) group of models, which embrace the majority of regional AQ modelling systems currently used in Europe and North America. The proposed technique has proven to be a compact estimator of the operational metrics commonly used for model evaluation (bias, variance, and correlation coefficient), and has the further benefit of apportioning the error to the originating timescale, thus allowing for a clearer diagnosis of the processes that caused the error.
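
    One common way to realize such a breakdown (the paper's revised formulation may differ in detail) splits the MSE into squared bias, a variance-mismatch term, and a minimum achievable part bounded by the correlation:

```python
import numpy as np

def mse_decomposition(model, obs):
    """Split MSE into bias**2 + (sm - r*so)**2 + (1 - r**2)*so**2.

    The last term plays the role of the minimum achievable MSE (mMSE): it
    cannot be removed by linear recalibration of the model, only by
    improving the correlation r with the observations.
    """
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    bias = model.mean() - obs.mean()
    sm, so = model.std(), obs.std()          # population std (ddof=0)
    r = np.corrcoef(model, obs)[0, 1]
    return bias ** 2, (sm - r * so) ** 2, (1.0 - r ** 2) * so ** 2

# Illustrative modelled and observed ozone values.
m = np.array([1.0, 2.0, 4.0, 3.0, 5.0])
o = np.array([1.5, 2.5, 3.0, 3.5, 4.0])
parts = mse_decomposition(m, o)
# The three components sum exactly to the plain mean square error:
assert abs(sum(parts) - np.mean((m - o) ** 2)) < 1e-12
```

    Applying the same decomposition to band-pass filtered series (long-term, synoptic, diurnal, intra-day) yields the timescale apportionment the abstract describes.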

  6. Evaluation of a differentiation model of preschoolers’ executive functions

    Science.gov (United States)

    Howard, Steven J.; Okely, Anthony D.; Ellis, Yvonne G.

    2015-01-01

    Despite the prominent role of executive functions in children’s emerging competencies, there remains debate regarding the structure and development of executive functions. In an attempt to reconcile these discrepancies, a differentiation model of executive function development was evaluated in the early years using 6-month age groupings. Specifically, 281 preschoolers completed measures of working memory, inhibition, and shifting. Results contradicted suggestions that executive functions follow a single trajectory of progressive separation in childhood, instead suggesting that these functions may undergo a period of integration in the preschool years. These results highlight potential problems with current practices and theorizing in executive function research. PMID:25852603

  7. Evaluation of a differentiation model of preschoolers' executive functions.

    Science.gov (United States)

    Howard, Steven J; Okely, Anthony D; Ellis, Yvonne G

    2015-01-01

    Despite the prominent role of executive functions in children's emerging competencies, there remains debate regarding the structure and development of executive functions. In an attempt to reconcile these discrepancies, a differentiation model of executive function development was evaluated in the early years using 6-month age groupings. Specifically, 281 preschoolers completed measures of working memory, inhibition, and shifting. Results contradicted suggestions that executive functions follow a single trajectory of progressive separation in childhood, instead suggesting that these functions may undergo a period of integration in the preschool years. These results highlight potential problems with current practices and theorizing in executive function research.

  8. MATHEMATICAL MODEL FOR SOFTWARE USABILITY AUTOMATED EVALUATION AND ASSURANCE

    Directory of Open Access Journals (Sweden)

    І. Гученко

    2011-04-01

    Full Text Available The subject of the research is software usability, and the aim is the construction of a mathematical model for evaluating and assuring a specified level of usability. The research uses the methodology of structural analysis, methods of multicriteria optimization and decision-making theory, the convolution method, and the scientific methods of analysis and analogy. The result of the work is a model for automated evaluation and assurance of software usability that allows not only estimating the current level of usability during every iteration of agile development but also managing the usability of the software products being created. The results can be used to construct automated support systems for managing software usability.

  9. Model-based efficiency evaluation of combine harvester traction drives

    Directory of Open Access Journals (Sweden)

    Steffen Häberle

    2015-08-01

    Full Text Available As part of the research, the drive train of combine harvesters is investigated in detail. Load and power distribution, energy consumption and usage distribution are explicitly explored on two test machines. Based on the lessons learned during field operations, model-based studies of the energy saving potential in the traction train of combine harvesters can now be quantified. Beyond that, the virtual machine trial provides an opportunity to compare innovative drivetrain architectures and control solutions under reproducible conditions. As a result, an evaluation method is presented and generically used to draw comparisons under locally representative operating conditions.

  10. Simplified Human-Robot Interaction: Modeling and Evaluation

    Directory of Open Access Journals (Sweden)

    Balazs Daniel

    2013-10-01

    Full Text Available In this paper a novel concept of human-robot interaction (HRI) modeling is proposed. Including factors like trust in automation, situational awareness, expertise and expectations, a new user experience framework is formed for industrial robots. Service Oriented Robot Operation, proposed in a previous paper, creates an abstraction level in HRI and is also included in the framework. This concept is evaluated with exhaustive tests. Results prove that significant improvement in task execution may be achieved and that the new system is more usable for operators with less robotics experience: personnel typical of small and medium enterprises (SMEs).

  11. A RETROSPECTIVE OF EVALUATION MODELS ON INTELLECTUAL CAPITAL

    Directory of Open Access Journals (Sweden)

    Ienciu Nicoleta Maria

    2011-12-01

    Full Text Available In the classical theory of economics, capital is one of the three factors of production, in addition to land and labor, and refers in particular to buildings, equipment, machinery etc. used for the production of other goods (the term physical capital is also used in the specialized literature) (Bratianu and Jianu, 2006). The present study intends to bring to the forefront the main evaluation methods for intellectual capital, as proposed, supported and criticized at the same time by researchers and practitioners. The study offers responses to the following research questions: What are the advantages and disadvantages of the intellectual capital evaluation methods? And what are the main studies approaching the subject of intellectual capital evaluation at the international level? The collection and analysis of intellectual capital evaluation models and non-participative observation are the main instruments used to bring to the forefront the main existing international evaluation frameworks. The information sources for this research consist especially of articles published in specialized journals, in both the accounting and economics fields, specialized works relevant to the reference field, legislative documents, official documents, press releases and other documents issued by various national and international bodies. The most representative studies addressing the evaluation of intellectual capital are those by Mouritsen et al. (2001), Manea and Gorgan (2003), Tayles (2002) and Tayles et al. (2007). The presented approaches offer a general idea of the range of methods, disciplines and operational specializations available for the evaluation of intellectual capital. Only one of them, the Balanced Scorecard, is widely used, while the rest of the methods remain too theoretical or too poorly developed to be universally accepted. We believe that

  12. Evaluating procedural modelling for 3D models of informal settlements in urban design activities

    Directory of Open Access Journals (Sweden)

    Victoria Rautenbach

    2015-11-01

    Full Text Available Three-dimensional (3D modelling and visualisation is one of the fastest growing application fields in geographic information science. 3D city models are being researched extensively for a variety of purposes and in various domains, including urban design, disaster management, education and computer gaming. These models typically depict urban business districts (downtown or suburban residential areas. Despite informal settlements being a prevailing feature of many cities in developing countries, 3D models of informal settlements are virtually non-existent. 3D models of informal settlements could be useful in various ways, e.g. to gather information about the current environment in the informal settlements, to design upgrades, to communicate these and to educate inhabitants about environmental challenges. In this article, we described the development of a 3D model of the Slovo Park informal settlement in the City of Johannesburg Metropolitan Municipality, South Africa. Instead of using time-consuming traditional manual methods, we followed the procedural modelling technique. Visualisation characteristics of 3D models of informal settlements were described and the importance of each characteristic in urban design activities for informal settlement upgrades was assessed. Next, the visualisation characteristics of the Slovo Park model were evaluated. The results of the evaluation showed that the 3D model produced by the procedural modelling technique is suitable for urban design activities in informal settlements. The visualisation characteristics and their assessment are also useful as guidelines for developing 3D models of informal settlements. In future, we plan to empirically test the use of such 3D models in urban design projects in informal settlements.

  13. Fast Prediction and Evaluation of Gravitational Waveforms Using Surrogate Models

    Directory of Open Access Journals (Sweden)

    Scott E. Field

    2014-07-01

    Full Text Available We propose a solution to the problem of quickly and accurately predicting gravitational waveforms within any given physical model. The method is relevant for both real-time applications and more traditional scenarios where the generation of waveforms using standard methods can be prohibitively expensive. Our approach is based on three offline steps resulting in an accurate reduced order model in both parameter and physical dimensions that can be used as a surrogate for the true or fiducial waveform family. First, a set of m parameter values is determined using a greedy algorithm from which a reduced basis representation is constructed. Second, these m parameters induce the selection of m time values for interpolating a waveform time series using an empirical interpolant that is built for the fiducial waveform family. Third, a fit in the parameter dimension is performed for the waveform’s value at each of these m times. The cost of predicting L waveform time samples for a generic parameter choice is of order O(mL + mc_{fit}) online operations, where c_{fit} denotes the fitting function operation count and, typically, m≪L. The result is a compact, computationally efficient, and accurate surrogate model that retains the original physics of the fiducial waveform family while also being fast to evaluate. We generate accurate surrogate models for effective-one-body waveforms of nonspinning binary black hole coalescences with durations as long as 10^{5}M, mass ratios from 1 to 10, and for multiple spherical harmonic modes. We find that these surrogates are more than 3 orders of magnitude faster to evaluate as compared to the cost of generating effective-one-body waveforms in standard ways. Surrogate model building for other waveform families and models follows the same steps and has the same low computational online scaling cost. For expensive numerical simulations of binary black hole coalescences, we thus anticipate extremely large speedups in
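
    The three offline steps and the cheap online evaluation can be mimicked on a toy waveform family. Everything below is an illustrative stand-in: an SVD replaces the paper's greedy basis construction, a Chebyshev fit replaces its fitting functions, and the "waveform" is a simple damped sinusoid rather than an effective-one-body model:

```python
import numpy as np

# Toy waveform family standing in for the expensive fiducial model.
def waveform(q, t):
    return np.exp(-q * t / 10.0) * np.sin(2.0 * t)

t = np.linspace(0.0, 10.0, 500)
train_q = np.linspace(1.0, 10.0, 40)
train = np.array([waveform(q, t) for q in train_q])

# Offline step 1: reduced basis (an SVD here, keeping modes above a
# relative tolerance, instead of the paper's greedy algorithm).
U, S, Vt = np.linalg.svd(train, full_matrices=False)
m = int(np.sum(S / S[0] > 1e-10))
B = Vt[:m]                                   # m orthonormal basis waveforms

# Offline step 2: m empirical-interpolation time nodes, chosen greedily as
# the points of largest interpolation residual.
nodes = [int(np.argmax(np.abs(B[0])))]
for i in range(1, m):
    coef = np.linalg.solve(B[:i, nodes].T, B[i, nodes])
    residual = B[i] - coef @ B[:i]
    nodes.append(int(np.argmax(np.abs(residual))))

# Offline step 3: fit the parameter dependence of the waveform at each node
# (a Chebyshev fit on the rescaled parameter as a simple stand-in).
qs = (train_q - 5.5) / 4.5                   # map [1, 10] onto [-1, 1]
fits = [np.polynomial.chebyshev.chebfit(qs, train[:, n], 12) for n in nodes]

# Online evaluation: m cheap fits plus one small m-by-m solve; the cost is
# independent of how expensive the original model is.
def surrogate(q):
    x = (q - 5.5) / 4.5
    vals = np.array([np.polynomial.chebyshev.chebval(x, f) for f in fits])
    coef = np.linalg.solve(B[:, nodes].T, vals)
    return coef @ B

err = float(np.max(np.abs(surrogate(5.5) - waveform(5.5, t))))
```

    For this smooth family the surrogate reproduces out-of-sample waveforms to near the basis truncation tolerance while only ever evaluating m fits online.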

  14. Comparison of static model and dynamic model for the evaluation of station blackout sequences

    International Nuclear Information System (INIS)

    Lee, Kwang-Nam; Kang, Sun-Koo; Hong, Sung-Yull.

    1992-01-01

    Station blackout is one of the major contributors to the core damage frequency (CDF) in many PSA studies. Since the station blackout sequence exhibits dynamic features, accurate calculation of CDF for the station blackout sequence is not possible with the event tree/fault tree (ET/FT) method. Although the integral method can determine an accurate CDF, it is time consuming and makes it difficult to evaluate alternative AC source configurations and sensitivities. In this study, a comparison is made between the static model and the dynamic model, and a new methodology which combines the static model and the dynamic model is provided for the accurate quantification of CDF and the evaluation of improvement alternatives. Results of several case studies show that accurate calculation of CDF is possible by introducing an equivalent mission time. (author)

  15. Design, modeling, simulation and evaluation of a distributed energy system

    Science.gov (United States)

    Cultura, Ambrosio B., II

    This dissertation presents the design, modeling, simulation and evaluation of distributed energy resources (DER) consisting of photovoltaics (PV), wind turbines, batteries, a PEM fuel cell and supercapacitors. The distributed energy resources installed at UMass Lowell consist of the following: 2.5kW PV, 44kWhr lead acid batteries and 1500W, 500W & 300W wind turbines, which were installed before the year 2000. Recently added to that are the following: a 10.56 kW PV array, a 2.4 kW wind turbine, 29 kWhr lead acid batteries, a 1.2 kW PEM fuel cell and four 140 F supercapacitors. Each newly added energy resource has been designed, modeled, simulated and evaluated before its integration into the existing PV/Wind grid-connected system. The mathematical and Simulink models of each system were derived and validated by comparing the simulated and experimental results. The simulated results of energy generated from the 10.56 kW PV system are in good agreement with the experimental results. A detailed electrical model of a 2.4kW wind turbine system equipped with a permanent magnet generator, diode rectifier, boost converter and inverter is presented. The analysis of the results demonstrates the effectiveness of the constructed Simulink model, which can be used to predict the performance of the wind turbine. It was observed that a PEM fuel cell has a very fast response to load changes. Moreover, the model has validated the actual operation of the PEM fuel cell, showing that the simulated results in Matlab Simulink are consistent with the experimental results. The equivalent mathematical equation, derived from an electrical model of the supercapacitor, is used to simulate its voltage response. The model is completely capable of simulating its voltage behavior, and can predict the charge time and discharge time of voltages on the supercapacitor. The bi-directional dc-dc converter was designed in order to connect the 48V battery bank storage to the 24V battery bank storage. This connection was

  16. An updated summary of MATHEW/ADPIC model evaluation studies

    Energy Technology Data Exchange (ETDEWEB)

    Foster, K.T.; Dickerson, M.H.

    1990-05-01

    This paper summarizes the major model evaluation studies conducted for the MATHEW/ADPIC atmospheric transport and diffusion models used by the US Department of Energy's Atmospheric Release Advisory Capability. These studies have taken place over the last 15 years and involve field tracer releases influenced by a variety of meteorological and topographical conditions. Neutrally buoyant tracers released both as surface and elevated point sources, as well as material dispersed by explosive, thermally buoyant release mechanisms have been studied. Results from these studies show that the MATHEW/ADPIC models estimate the tracer air concentrations to within a factor of two of the measured values 20% to 50% of the time, and within a factor of five of the measurements 35% to 85% of the time depending on the complexity of the meteorology and terrain, and the release height of the tracer. Comparisons of model estimates to peak downwind deposition and air concentration measurements from explosive releases are shown to be generally within a factor of two to three. 24 refs., 14 figs., 3 tabs.

  17. An updated summary of MATHEW/ADPIC model evaluation studies

    International Nuclear Information System (INIS)

    Foster, K.T.; Dickerson, M.H.

    1990-05-01

    This paper summarizes the major model evaluation studies conducted for the MATHEW/ADPIC atmospheric transport and diffusion models used by the US Department of Energy's Atmospheric Release Advisory Capability. These studies have taken place over the last 15 years and involve field tracer releases influenced by a variety of meteorological and topographical conditions. Neutrally buoyant tracers released both as surface and elevated point sources, as well as material dispersed by explosive, thermally buoyant release mechanisms have been studied. Results from these studies show that the MATHEW/ADPIC models estimate the tracer air concentrations to within a factor of two of the measured values 20% to 50% of the time, and within a factor of five of the measurements 35% to 85% of the time depending on the complexity of the meteorology and terrain, and the release height of the tracer. Comparisons of model estimates to peak downwind deposition and air concentration measurements from explosive releases are shown to be generally within a factor of two to three. 24 refs., 14 figs., 3 tabs.

  18. Evaluation of modelling body burden of Cs-137

    Energy Technology Data Exchange (ETDEWEB)

    Bergstroem, U.; Nordlinder, S.

    1996-05-01

    Within the IAEA/CEC VAMP-program one working group studied the precision in dose assessment models when calculating body burden of 137Cs as a result of exposure through multiple exposure pathways. One scenario used data from southern Finland regarding contamination of various media due to the fallout from the Chernobyl accident. In this study, a time dependent multiple exposure pathway model was constructed based on compartment theory. Uncertainties in model responses due to uncertainties in input parameter values were studied. The initial predictions for body burden were good, within a factor of 2 of the observed, while the time dynamics of levels in milk and meat did not agree satisfactorily. Some results, nevertheless, showed good agreement with observations due to compensatory effects. After disclosure of additional observational data, major reasons for mispredictions were identified as lack of consideration of the time dependence of fixation of 137Cs in soils, and the selection of parameter values. When correction of this was made, a close agreement between predictions and observations was obtained. This study shows that the dose contribution due to 137Cs in food products from the seminatural environment is important for long-term exposure to man. The evaluation provided a basis for improvements of crucial parts in the model. 14 refs, 18 figs, 8 tabs.

  19. Evaluation of modelling body burden of Cs-137

    International Nuclear Information System (INIS)

    Bergstroem, U.; Nordlinder, S.

    1996-05-01

    Within the IAEA/CEC VAMP-program one working group studied the precision in dose assessment models when calculating body burden of 137Cs as a result of exposure through multiple exposure pathways. One scenario used data from southern Finland regarding contamination of various media due to the fallout from the Chernobyl accident. In this study, a time dependent multiple exposure pathway model was constructed based on compartment theory. Uncertainties in model responses due to uncertainties in input parameter values were studied. The initial predictions for body burden were good, within a factor of 2 of the observed, while the time dynamics of levels in milk and meat did not agree satisfactorily. Some results, nevertheless, showed good agreement with observations due to compensatory effects. After disclosure of additional observational data, major reasons for mispredictions were identified as lack of consideration of the time dependence of fixation of 137Cs in soils, and the selection of parameter values. When correction of this was made, a close agreement between predictions and observations was obtained. This study shows that the dose contribution due to 137Cs in food products from the seminatural environment is important for long-term exposure to man. The evaluation provided a basis for improvements of crucial parts in the model. 14 refs, 18 figs, 8 tabs.

  20. Semantic Modeling for Exposomics with Exploratory Evaluation in Clinical Context

    Directory of Open Access Journals (Sweden)

    Jung-wei Fan

    2017-01-01

    Full Text Available Exposome is a critical dimension in the precision medicine paradigm. Effective representation of exposomics knowledge is instrumental to melding nongenetic factors into data analytics for clinical research. There is still limited work in (1) modeling exposome entities and relations with proper integration to mainstream ontologies and (2) systematically studying their presence in clinical context. Through selected ontological relations, we developed a template-driven approach to identifying exposome concepts from the Unified Medical Language System (UMLS). The derived concepts were evaluated in terms of literature coverage and the ability to assist in annotating clinical text. The generated semantic model represents rich domain knowledge about exposure events (454 pairs of relations between exposure and outcome). Additionally, a list of 5667 disorder concepts with microbial etiology was created for inferred pathogen exposures. The model consistently covered about 90% of PubMed literature on exposure-induced iatrogenic diseases over 10 years (2001–2010). The model contributed to the efficiency of exposome annotation in clinical text by filtering out 78% of irrelevant machine annotations. Analysis of 50 annotated discharge summaries helped advance our understanding of the exposome information in clinical text. This pilot study demonstrated the feasibility of semiautomatically developing a useful semantic resource for exposomics.

  1. Evaluating transport in the WRF model along the California coast

    Directory of Open Access Journals (Sweden)

    C. E. Yver

    2013-02-01

    Full Text Available This paper presents a step in the development of a top-down method to complement the bottom-up inventories of halocarbon emissions in California using high frequency observations, forward simulations and inverse methods. The Scripps Institution of Oceanography high-frequency atmospheric halocarbon measurement sites are located along the California coast, and therefore the evaluation of transport in the chosen Weather Research and Forecasting (WRF) model at these sites is crucial for inverse modeling. The performance of the transport model has been investigated by comparing the wind direction and speed and temperature at four locations using aircraft weather reports, as well as at all METAR weather stations in our domain for hourly variations. Different planetary boundary layer (PBL) schemes, horizontal resolutions (achieved through nesting) and two meteorological datasets have been tested. Finally, simulated concentrations of an inert tracer have been briefly investigated. All the PBL schemes present similar results that generally agree with observations, except in summer when the model sea breeze is too strong. At the coarse 12 km resolution, using ERA-Interim (ECMWF Re-Analysis) as initial and boundary conditions leads to improvements compared to using the North American Model (NAM) dataset. Adding higher resolution nests also improves the match with the observations. However, no further improvement is observed from increasing the nest resolution from 4 km to 0.8 km. Once optimized, the model is able to reproduce tracer measurements during typical winter California large-scale events (Santa Ana). Furthermore, with the WRF/CHEM chemistry module and the European Database for Global Atmospheric Research (EDGAR) version 4.1 emissions for HFC-134a, we find that using a simple emission scaling factor is not sufficient to infer emissions, which highlights the need for more complex inversions.

  2. Examples of emergency response model evaluation studies using the MATHEW/ADPIC models

    International Nuclear Information System (INIS)

    Dickerson, M.H.; Lange, R.

    1986-04-01

    This report summarizes model evaluation studies conducted for the MATHEW/ADPIC transport and diffusion models during the past ten years. These models support the US Department of Energy Atmospheric Release Advisory Capability, an emergency response service for atmospheric releases of nuclear material. The field studies involving tracer releases used in this work cover a broad range of meteorology, terrain and tracer release heights, the three most important aspects of estimating air concentration values resulting from airborne releases of toxic material. Results of these studies show that these models can estimate air concentration values within a factor of 2 between 20% and 50% of the time, and within a factor of 5 between 40% and 80% of the time. As the meteorology and terrain become more complex and the release height of the tracer increases, the accuracy of the model calculations degrades. This band of uncertainty appears to correctly represent the capability of these models at this time. A method for estimating angular uncertainty in the model calculations is described and used to suggest alternative methods for evaluating emergency response models.
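    The factor-of-2 and factor-of-5 statistics quoted above follow a standard definition. A minimal sketch, with made-up concentration pairs rather than the study's data:

```python
def fraction_within_factor(predicted, observed, k):
    """Fraction of prediction/observation pairs agreeing within a factor of k,
    i.e. 1/k <= predicted/observed <= k; both values must be positive."""
    pairs = [(p, o) for p, o in zip(predicted, observed) if p > 0 and o > 0]
    hits = sum(1 for p, o in pairs if 1.0 / k <= p / o <= k)
    return hits / len(pairs)

# Hypothetical air-concentration pairs (same arbitrary units)
pred = [1.0, 3.0, 10.0, 0.5, 8.0]
obs = [2.0, 1.0, 2.0, 1.0, 9.0]
within2 = fraction_within_factor(pred, obs, 2)  # -> 0.6 (3 of 5 pairs)
within5 = fraction_within_factor(pred, obs, 5)  # -> 1.0 (all pairs)
```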

  3. Evaluation of gas radiation models in CFD modeling of oxy-combustion

    International Nuclear Information System (INIS)

    Rajhi, M.A.; Ben-Mansour, R.; Habib, M.A.; Nemitallah, M.A.; Andersson, K.

    2014-01-01

    Highlights: • CFD modeling of a typical industrial water tube boiler is conducted. • Different combustion processes were considered, including air and oxy-fuel combustion. • SGG, EWBM, Leckner, Perry and WSGG radiation models were considered in the study. • EWBM is the most accurate model and is considered to be the benchmark model. • Characteristics of oxy-fuel combustion are compared to those of air–fuel combustion. - Abstract: Proper determination of the radiation energy is very important for accurate prediction of the combustion characteristics inside combustion devices using CFD modeling. For this purpose, different gas radiation models were developed and applied in the present work. These radiation models vary in their accuracy and complexity according to the application. In this work, a CFD model for a typical industrial water tube boiler was developed, considering three different combustion environments. The combustion environments are air–fuel combustion (21% O2 and 79% N2), oxy-fuel combustion (21% O2 and 79% CO2) and oxy-fuel combustion (27% O2 and 73% CO2). The simple grey gas (SGG), exponential wide band (EWBM), Leckner, Perry and weighted sum of grey gases (WSGG) radiation models were examined and their influences on the combustion characteristics were evaluated. Among these radiation models, the EWBM was found to provide results close to the experimental data for the present boiler combustion application. The oxy-fuel combustion characteristics were analyzed and compared with those of air–fuel combustion.

  4. Evaluation of long-range transport models in NOVANA; Evaluering af langtransportmodeller i NOVANA

    Energy Technology Data Exchange (ETDEWEB)

    Frohn, L.M.; Brandt, J.; Christensen, J.H.; Geels, C.; Hertel, O.; Skjoeth, C.A.; Ellemann, T.

    2007-06-15

    The Lagrangian model ACDEP, which was applied in BOP/NOVA/NOVANA during the period 1995-2004, has been replaced by the more modern Eulerian model DEHM. The new model has a number of advantages, such as a better description of the three-dimensional atmospheric transport, a larger domain, the possibility of high spatial resolution in the calculations and a more detailed description of photochemical processes and dry deposition. Prior to the replacement, the results of the two models were compared and evaluated using European and Danish measurements. Calculations have been performed with both models applying the same meteorological and emission input, for Europe for the year 2000 as well as for Denmark for the period 2000-2003. The European measurements applied in the present evaluation are obtained through EMEP. Using these measurements, DEHM and ACDEP have been compared with respect to daily and yearly mean concentrations of ammonia (NH{sub 3}), ammonium (NH{sub 4}{sup +}), the sum of NH{sub 3} and NH{sub 4}{sup +} (SNH), nitric acid (HNO{sub 3}), nitrate (NO{sub 3}{sup -}), the sum of HNO{sub 3} and NO{sub 3}{sup -} (SNO{sub 3}), nitrogen dioxide (NO{sub 2}), ozone (O{sub 3}), sulphur dioxide (SO{sub 2}) and sulphate (SO{sub 4}{sup 2-}), as well as the hourly mean and daily maximum concentrations of O{sub 3}. Furthermore, the daily and yearly total values of precipitation and wet deposition of NH{sub 4}{sup +}, NO{sub 3}{sup -} and SO{sub 4}{sup 2-} have been compared for the two models. The statistical parameters applied in the comparison are correlation, bias and fractional bias. The result of the comparison with the EMEP data is that DEHM achieves better correlation coefficients for all chemical parameters (16 parameters in total) when the daily values are analysed, and for 15 out of 16 parameters when yearly values are taken into account. With respect to the fractional bias, the results obtained with DEHM are better than the corresponding results
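    The fractional bias used as a statistical parameter above has a standard definition. The sketch below uses the common sign convention (conventions differ between studies) and hypothetical concentrations:

```python
def fractional_bias(model, obs):
    """Fractional bias FB = 2*(mean(model) - mean(obs)) / (mean(model) + mean(obs)).

    FB = 0 indicates no bias; values are bounded in (-2, 2), so over- and
    under-prediction are penalized symmetrically.
    """
    mean_m = sum(model) / len(model)
    mean_o = sum(obs) / len(obs)
    return 2.0 * (mean_m - mean_o) / (mean_m + mean_o)

# Hypothetical daily mean concentrations (same arbitrary units)
fb = fractional_bias([3.0, 5.0], [2.0, 2.0])  # -> 2*(4-2)/(4+2) = 0.666...
```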

  5. Evaluation of field development plans using 3-D reservoir modelling

    Energy Technology Data Exchange (ETDEWEB)

    Seifert, D.; Lewis, J.J.M. [Heriot-Watt Univ., Edinburgh (United Kingdom)]; Newbery, J.D.H. [Conoco, UK Ltd., Aberdeen (United Kingdom)]; and others

    1997-08-01

    Three-dimensional reservoir modelling has become an accepted tool in reservoir description and is used for various purposes, such as reservoir performance prediction or integration and visualisation of data. In this case study, a small Northern North Sea turbiditic reservoir was to be developed with a line drive strategy utilising a series of horizontal producer and injector pairs, oriented north-south. This development plan was to be evaluated and the expected outcome of the wells was to be assessed and risked. Detailed analyses of core, well log and analogue data have led to the development of two geological "end member" scenarios. Both scenarios have been stochastically modelled using the Sequential Indicator Simulation method. The resulting equiprobable realisations have been subjected to detailed statistical well placement optimisation techniques. Based upon bivariate statistical evaluation of more than 1000 numerical well trajectories for each of the two scenarios, it was found that the wells' inclinations and lengths had a great impact on their success, whereas the azimuth was found to have only a minor impact. After integration of the above results, the actual well paths were redesigned to meet external drilling constraints, resulting in substantial reductions in drilling time and costs.

  6. In-vitro model for evaluation of pulse oximetry

    Science.gov (United States)

    Vegfors, Magnus; Lindberg, Lars-Goeran; Lennmarken, Claes; Oberg, P. Ake

    1991-06-01

    An in vitro model with blood circulating in a silicone tubing system and including an artificial arterial bed is an important tool for evaluation of the pulse oximetry technique. The oxygen saturation was measured on an artificial finger using a pulse oximeter (SpO2) and on blood samples using a hemoximeter (SaO2). Measurements were performed at different blood flows and at different blood hematocrits. An increase in steady as well as in pulsatile blood flow was followed by an increase in pulse oximeter readings and a better agreement between SpO2 and SaO2 readings. After diluting the blood with normal saline (decreased hematocrit), the agreement was further improved. These results indicate that the pulse oximeter signal is related to blood hematocrit and the velocity of blood. The flow-related dependence of SpO2 was also evaluated in a human model. These results provided evidence that the pulse oximeter signal is dependent on vascular changes.

  7. Sustainable Deforestation Evaluation Model and System Dynamics Analysis

    Directory of Open Access Journals (Sweden)

    Huirong Feng

    2014-01-01

    Full Text Available The current study used the improved fuzzy analytic hierarchy process to construct a sustainable deforestation development evaluation system and evaluation model, which has refined a diversified system to evaluate the theory of sustainable deforestation development. Leveraging the visual image of the system dynamics causal and power flow diagram, we illustrated here that sustainable forestry development is a complex system that encompasses the interaction and dynamic development of ecology, economy, and society and has reflected the time dynamic effect of sustainable forestry development from the three combined effects. We compared experimental programs to prove the direct and indirect impacts of the ecological, economic, and social effects of the corresponding deforest techniques and fully reflected the importance of developing scientific and rational ecological harvesting and transportation technologies. Experimental and theoretical results illustrated that light cableway skidding is an ecoskidding method that is beneficial for the sustainable development of resources, the environment, the economy, and society and forecasted the broad potential applications of light cableway skidding in timber production technology. Furthermore, we discussed the sustainable development countermeasures of forest ecosystems from the aspects of causality, interaction, and harmony.
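    The abstract does not spell out the improved fuzzy analytic hierarchy process, but the core weighting step of a classical (crisp) AHP can be sketched with the row geometric-mean approximation. The pairwise comparison matrix below is hypothetical, not taken from the study:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a pairwise comparison matrix
    using the row geometric-mean method (weights normalized to sum to 1)."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical comparisons among ecological, economic and social effects
A = [[1.0,     3.0,     5.0],
     [1.0/3.0, 1.0,     3.0],
     [1.0/5.0, 1.0/3.0, 1.0]]
w = ahp_weights(A)  # weights sum to 1; the first (ecological) criterion dominates
```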

  8. Sustainable Deforestation Evaluation Model and System Dynamics Analysis

    Science.gov (United States)

    Feng, Huirong; Lim, C. W.; Chen, Liqun; Zhou, Xinnian; Zhou, Chengjun; Lin, Yi

    2014-01-01

    The current study used the improved fuzzy analytic hierarchy process to construct a sustainable deforestation development evaluation system and evaluation model, which has refined a diversified system to evaluate the theory of sustainable deforestation development. Leveraging the visual image of the system dynamics causal and power flow diagram, we illustrated here that sustainable forestry development is a complex system that encompasses the interaction and dynamic development of ecology, economy, and society and has reflected the time dynamic effect of sustainable forestry development from the three combined effects. We compared experimental programs to prove the direct and indirect impacts of the ecological, economic, and social effects of the corresponding deforest techniques and fully reflected the importance of developing scientific and rational ecological harvesting and transportation technologies. Experimental and theoretical results illustrated that light cableway skidding is an ecoskidding method that is beneficial for the sustainable development of resources, the environment, the economy, and society and forecasted the broad potential applications of light cableway skidding in timber production technology. Furthermore, we discussed the sustainable development countermeasures of forest ecosystems from the aspects of causality, interaction, and harmony. PMID:25254225

  9. Sustainable deforestation evaluation model and system dynamics analysis.

    Science.gov (United States)

    Feng, Huirong; Lim, C W; Chen, Liqun; Zhou, Xinnian; Zhou, Chengjun; Lin, Yi

    2014-01-01

    The current study used the improved fuzzy analytic hierarchy process to construct a sustainable deforestation development evaluation system and evaluation model, which has refined a diversified system to evaluate the theory of sustainable deforestation development. Leveraging the visual image of the system dynamics causal and power flow diagram, we illustrated here that sustainable forestry development is a complex system that encompasses the interaction and dynamic development of ecology, economy, and society and has reflected the time dynamic effect of sustainable forestry development from the three combined effects. We compared experimental programs to prove the direct and indirect impacts of the ecological, economic, and social effects of the corresponding deforest techniques and fully reflected the importance of developing scientific and rational ecological harvesting and transportation technologies. Experimental and theoretical results illustrated that light cableway skidding is an ecoskidding method that is beneficial for the sustainable development of resources, the environment, the economy, and society and forecasted the broad potential applications of light cableway skidding in timber production technology. Furthermore, we discussed the sustainable development countermeasures of forest ecosystems from the aspects of causality, interaction, and harmony.

  10. Large scale solar district heating. Evaluation, modelling and designing

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.

    2000-07-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the tool for design studies and on a local energy planning case. The evaluation of the central solar heating technology is based on measurements on the case plant in Marstal, Denmark, and on published and unpublished data for other, mainly Danish, CSDHP plants. Evaluations of the thermal, economical and environmental performances are reported, based on the experiences from the last decade. The measurements from the Marstal case are analysed, experiences extracted and minor improvements to the plant design proposed. For the detailed designing and energy planning of CSDHPs, a computer simulation model is developed and validated against the measurements from the Marstal case. The final model is then generalised to a 'generic' model for CSDHPs in general. The meteorological reference data, Danish Reference Year, is applied to find the mean performance for the plant designs. To find the expected variability of the thermal performance of such plants, a method is proposed in which data from a year with poor solar irradiation and a year with strong solar irradiation are applied. Equipped with a simulation tool, design studies are carried out, ranging from parameter analysis, through energy planning for a new settlement, to a proposal for combining plane solar collectors with high-performance solar collectors, exemplified by a trough solar collector. The methodology of utilising computer simulation proved to be a cheap and relevant tool in the design of future solar heating plants. The thesis also exposed the need to develop computer models for the more advanced solar collector designs and especially for the control operation of CSDHPs. In the final chapter the CSDHP technology is put into perspective with respect to other possible technologies to find the relevance of the application

  11. Model selection on solid ground: Rigorous comparison of nine ways to evaluate Bayesian model evidence.

    Science.gov (United States)

    Schöniger, Anneli; Wöhling, Thomas; Samaniego, Luis; Nowak, Wolfgang

    2014-12-01

    Bayesian model selection or averaging objectively ranks a number of plausible, competing conceptual models based on Bayes' theorem. It implicitly performs an optimal trade-off between performance in fitting available data and minimum model complexity. The procedure requires determining Bayesian model evidence (BME), which is the likelihood of the observed data integrated over each model's parameter space. The computation of this integral is highly challenging because it is as high-dimensional as the number of model parameters. Three classes of techniques to compute BME are available, each with its own challenges and limitations: (1) Exact and fast analytical solutions are limited by strong assumptions. (2) Numerical evaluation quickly becomes unfeasible for expensive models. (3) Approximations known as information criteria (ICs) such as the AIC, BIC, or KIC (Akaike, Bayesian, or Kashyap information criterion, respectively) yield contradicting results with regard to model ranking. Our study features a theory-based intercomparison of these techniques. We further assess their accuracy in a simplistic synthetic example where for some scenarios an exact analytical solution exists. In more challenging scenarios, we use a brute-force Monte Carlo integration method as reference. We continue this analysis with a real-world application of hydrological model selection. This is a first-time benchmarking of the various methods for BME evaluation against true solutions. Results show that BME values from ICs are often heavily biased and that the choice of approximation method substantially influences the accuracy of model ranking. For reliable model selection, bias-free numerical methods should be preferred over ICs whenever computationally feasible.
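    The brute-force Monte Carlo reference method described above reduces, in its simplest form, to averaging the likelihood over draws from the prior. A toy sketch with one parameter, a uniform prior and a Gaussian likelihood (not the hydrological models of the study):

```python
import math
import random

def bme_monte_carlo(likelihood, prior_sampler, n=100_000):
    """Brute-force Monte Carlo estimate of Bayesian model evidence:
    BME = E_prior[p(data | theta)], approximated by the mean likelihood
    over n independent draws from the prior."""
    return sum(likelihood(prior_sampler()) for _ in range(n)) / n

def likelihood(theta, y=0.5, sigma=0.1):
    """Gaussian likelihood of one observation y given parameter theta."""
    return math.exp(-0.5 * ((y - theta) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# Prior: theta ~ Uniform(0, 1); the true evidence here is very close to 1.0
random.seed(0)
bme = bme_monte_carlo(likelihood, random.random)
```

For expensive forward models this direct averaging quickly becomes infeasible, which is exactly why the approximations compared in the study exist.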

  12. Sustainable BECCS pathways evaluated by an integrated assessment model

    Science.gov (United States)

    Kato, E.

    2017-12-01

    Negative emissions technologies, particularly Bioenergy with Carbon Capture and Storage (BECCS), are key components of mitigation strategies in the ambitious future socioeconomic scenarios analysed by integrated assessment models. Generally, scenarios aiming to keep the mean global temperature rise below 2°C above pre-industrial levels would require net negative carbon emissions at the end of the 21st century. Also, in the context of the Paris Agreement, which acknowledges "a balance between anthropogenic emissions by sources and removals by sinks of greenhouse gases in the second half of this century", RD&D for negative emissions technologies in this decade has a crucial role in enabling early deployment of the technology. Because of the potentially extensive use of land and water for producing the bioenergy feedstock needed to reach the anticipated level of gross negative emissions, research on how to develop sustainable BECCS scenarios is needed. Here, we present BECCS deployment scenarios that consider an economically viable flow of a bioenergy system, including power generation and conversion processes to liquid and gaseous fuels for transportation and heat, with consideration of sustainable global biomass use. In the modelling process, detailed bioenergy representations, i.e. various feedstocks and conversion technologies with and without CCS, are implemented in the integrated assessment (IA) model GRAPE (Global Relationship Assessment to Protect the Environment). Also, to overcome a general discrepancy in assumed future agricultural yield between 'top-down' IA models and 'bottom-up' estimates, which would crucially affect the land-use pattern, we applied yield changes of food and energy crops consistent with process-based biophysical crop models in consideration of changing climate conditions. 
Using the framework, economically viable strategies for implementing a sustainable bioenergy and BECCS flow are evaluated in scenarios targeting to keep global average

  13. The Analysis of Several Models of Investment Value of Logistics Project Evaluation

    Directory of Open Access Journals (Sweden)

    Ke Qiu Cheng Zhou

    2013-01-01

    Full Text Available This study of the features of logistics project evaluation models reviews the traditional value evaluation model. On this basis, using fuzzy theory, we establish several logistics project evaluation models for a fuzzy environment. The analysis of the respective characteristics and the comparison of the calculated results of the three models show that these models are important methods for evaluating the investment value of logistics projects.

  14. Evaluation of a pig femoral head osteonecrosis model

    Science.gov (United States)

    2010-01-01

    Background A major cause of osteonecrosis of the femoral head is interruption of a blood supply to the proximal femur. In order to evaluate blood circulation and pathogenetic alterations, a pig femoral head osteonecrosis model was examined to address whether ligature of the femoral neck (vasculature deprivation) induces a reduction of blood circulation in the femoral head, and whether transphyseal vessels exist for communications between the epiphysis and the metaphysis. We also tested the hypothesis that the vessels surrounding the femoral neck and the ligamentum teres represent the primary source of blood flow to the femoral head. Methods Avascular osteonecrosis of the femoral head was induced in Yorkshire pigs by transecting the ligamentum teres and placing two ligatures around the femoral neck. After heparinized saline infusion and microfil perfusion via the abdominal aorta, blood circulation in the femoral head was evaluated by optical and CT imaging. Results An angiogram of the microfil casted sample allowed identification of the major blood vessels to the proximal femur including the iliac, common femoral, superficial femoral, deep femoral and circumflex arteries. Optical imaging in the femoral neck showed that a microfil stained vessel network was visible in control sections but less noticeable in necrotic sections. CT images showed a lack of microfil staining in the epiphysis. Furthermore, no transphyseal vessels were observed to link the epiphysis to the metaphysis. Conclusion Optical and CT imaging analyses revealed that in this present pig model the ligatures around the femoral neck were the primary cause of induction of avascular osteonecrosis. Since the vessels surrounding the femoral neck are comprised of the branches of the medial and the lateral femoral circumflex vessels, together with the extracapsular arterial ring and the lateral epiphyseal arteries, augmentation of blood circulation in those arteries will improve pathogenetic alterations in

  15. Evaluation of a pig femoral head osteonecrosis model

    Directory of Open Access Journals (Sweden)

    Kim Harry

    2010-03-01

    Full Text Available Abstract Background A major cause of osteonecrosis of the femoral head is interruption of a blood supply to the proximal femur. In order to evaluate blood circulation and pathogenetic alterations, a pig femoral head osteonecrosis model was examined to address whether ligature of the femoral neck (vasculature deprivation induces a reduction of blood circulation in the femoral head, and whether transphyseal vessels exist for communications between the epiphysis and the metaphysis. We also tested the hypothesis that the vessels surrounding the femoral neck and the ligamentum teres represent the primary source of blood flow to the femoral head. Methods Avascular osteonecrosis of the femoral head was induced in Yorkshire pigs by transecting the ligamentum teres and placing two ligatures around the femoral neck. After heparinized saline infusion and microfil perfusion via the abdominal aorta, blood circulation in the femoral head was evaluated by optical and CT imaging. Results An angiogram of the microfil casted sample allowed identification of the major blood vessels to the proximal femur including the iliac, common femoral, superficial femoral, deep femoral and circumflex arteries. Optical imaging in the femoral neck showed that a microfil stained vessel network was visible in control sections but less noticeable in necrotic sections. CT images showed a lack of microfil staining in the epiphysis. Furthermore, no transphyseal vessels were observed to link the epiphysis to the metaphysis. Conclusion Optical and CT imaging analyses revealed that in this present pig model the ligatures around the femoral neck were the primary cause of induction of avascular osteonecrosis. Since the vessels surrounding the femoral neck are comprised of the branches of the medial and the lateral femoral circumflex vessels, together with the extracapsular arterial ring and the lateral epiphyseal arteries, augmentation of blood circulation in those arteries will improve

  16. The issue of statistical power for overall model fit in evaluating structural equation models

    Directory of Open Access Journals (Sweden)

    Richard HERMIDA

    2015-06-01

    Full Text Available Statistical power is an important concept for psychological research. However, examining the power of a structural equation model (SEM) is rare in practice. This article provides an accessible review of the concept of statistical power for the Root Mean Square Error of Approximation (RMSEA) index of overall model fit in structural equation modeling. By way of example, we examine the current state of power in the literature by reviewing studies in top Industrial-Organizational (I/O) Psychology journals using SEMs. Results indicate that in many studies power is very low, which implies acceptance of invalid models. Additionally, we examined methodological situations which may have an influence on the statistical power of SEMs. Results showed that power varies significantly as a function of model type and whether or not the model is the main model for the study. Finally, results indicated that power is significantly related to the model fit statistics used in evaluating SEMs. The results from this quantitative review imply that researchers should be more vigilant with respect to power in structural equation modeling. We therefore conclude by offering methodological best practices to increase confidence in the interpretation of structural equation modeling results with respect to statistical power issues.
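    For reference, the RMSEA point estimate underlying such power analyses is computed directly from the model chi-square; a full power calculation additionally requires noncentral chi-square distributions (as in MacCallum-style tests of close fit), which this sketch omits. The numbers below are illustrative, not from the article:

```python
import math

def rmsea(chi2, df, n):
    """RMSEA point estimate: sqrt(max(chi2 - df, 0) / (df * (n - 1))),
    where chi2 is the model chi-square statistic, df its degrees of
    freedom, and n the sample size."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

close = rmsea(120.0, 50, 250)  # ~0.075 for this hypothetical model
exact = rmsea(45.0, 50, 250)   # chi-square below df is truncated to 0.0
```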

  17. The Glasgow-Maastricht foot model, evaluation of a 26 segment kinematic model of the foot.

    Science.gov (United States)

    Oosterwaal, Michiel; Carbes, Sylvain; Telfer, Scott; Woodburn, James; Tørholm, Søren; Al-Munajjed, Amir A; van Rhijn, Lodewijk; Meijer, Kenneth

    2016-01-01

    Accurately measuring intrinsic foot kinematics using skin mounted markers is difficult, limited in part by the physical dimensions of the foot. Existing kinematic foot models solve this problem by combining multiple bones into idealized rigid segments. This study presents a novel foot model that allows the motion of the 26 bones to be individually estimated via a combination of partial joint constraints and coupling of the motion of separate joints using kinematic rhythms. Segmented CT data from one healthy subject was used to create a template Glasgow-Maastricht foot model (GM-model). Following this, the template was scaled to produce subject-specific models for five additional healthy participants using a surface scan of the foot and ankle. Forty-three skin mounted markers, mainly positioned around the foot and ankle, were used to capture the stance phase of the right foot of the six healthy participants during walking. The GM-model was then applied to calculate the intrinsic foot kinematics. Distinct motion patterns were found for all joints. The variability in outcome depended on the location of the joint, with reasonable results for sagittal plane motions and poor results for transverse plane motions. The results of the GM-model were comparable with existing literature, including bone pin studies, with respect to the range of motion, motion pattern and timing of the motion in the studied joints. This novel model is the most complete kinematic model to date. Further evaluation of the model is warranted.

  18. Improving Baseline Model Assumptions: Evaluating the Impacts of Typical Methodological Approaches in Watershed Models

    Science.gov (United States)

    Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.

    2017-12-01

    Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are being used and documented in hundreds of peer-reviewed publications each year, and are likely applied even more widely in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined the impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions versus using improved data and enhanced assumptions on model outcomes and, ultimately, study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study region. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at both the field and watershed scales.

  19. Systematic Review of Health Economic Impact Evaluations of Risk Prediction Models: Stop Developing, Start Evaluating.

    Science.gov (United States)

    van Giessen, Anoukh; Peters, Jaime; Wilcher, Britni; Hyde, Chris; Moons, Carl; de Wit, Ardine; Koffijberg, Erik

    2017-04-01

    Although health economic evaluations (HEEs) are increasingly common for therapeutic interventions, they appear to be rare for the use of risk prediction models (PMs). To evaluate the current state of HEEs of PMs by performing a comprehensive systematic review. Four databases were searched for HEEs of PM-based strategies. Two reviewers independently selected eligible articles. A checklist was compiled to score items focusing on general characteristics of HEEs of PMs, model characteristics and quality of HEEs, evidence on PMs typically used in the HEEs, and the specific challenges in performing HEEs of PMs. After screening 791 abstracts, 171 full texts, and reference checking, 40 eligible HEEs evaluating 60 PMs were identified. In these HEEs, PM strategies were compared with current practice (n = 32; 80%), to other stratification methods for patient management (n = 19; 48%), to an extended PM (n = 9; 23%), or to alternative PMs (n = 5; 13%). The PMs guided decisions on treatment (n = 42; 70%), further testing (n = 18; 30%), or treatment prioritization (n = 4; 7%). For 36 (60%) PMs, only a single decision threshold was evaluated. Costs of risk prediction were ignored for 28 (46%) PMs. Uncertainty in outcomes was assessed using probabilistic sensitivity analyses in 22 (55%) HEEs. Despite the huge number of PMs in the medical literature, HEE of PMs remains rare. In addition, we observed great variety in their quality and methodology, which may complicate interpretation of HEE results and implementation of PMs in practice. Guidance on HEE of PMs could encourage and standardize their application and enhance methodological quality, thereby improving adequate use of PM strategies. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
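    One of the gaps noted above, evaluating a prediction model at only a single decision threshold, is straightforward to avoid. The sketch below (with hypothetical predicted risks and outcomes, not data from the review) computes sensitivity and specificity across several thresholds; a decision-curve net-benefit calculation would follow the same loop:

```python
def threshold_metrics(risks, outcomes, thresholds):
    """Sensitivity and specificity of a risk prediction model at several
    decision thresholds; a patient is 'treated' if predicted risk >= threshold."""
    results = {}
    for t in thresholds:
        tp = sum(1 for r, y in zip(risks, outcomes) if r >= t and y == 1)
        fn = sum(1 for r, y in zip(risks, outcomes) if r < t and y == 1)
        fp = sum(1 for r, y in zip(risks, outcomes) if r >= t and y == 0)
        tn = sum(1 for r, y in zip(risks, outcomes) if r < t and y == 0)
        results[t] = (tp / (tp + fn), tn / (tn + fp))  # (sensitivity, specificity)
    return results

risks = [0.1, 0.4, 0.35, 0.8, 0.7]  # hypothetical predicted risks
outcomes = [0, 0, 1, 1, 1]          # observed events (1 = event occurred)
metrics = threshold_metrics(risks, outcomes, [0.3, 0.5])
# threshold 0.3 -> (1.0, 0.5); threshold 0.5 -> (0.666..., 1.0)
```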

  20. Evaluation of Gaussian approximations for data assimilation in reservoir models

    KAUST Repository

    Iglesias, Marco A.

    2013-07-14

    The Bayesian framework is the standard approach for data assimilation in reservoir modeling. This framework involves characterizing the posterior distribution of geological parameters in terms of a given prior distribution and data from the reservoir dynamics, together with a forward model connecting the space of geological parameters to the data space. Since the posterior distribution quantifies the uncertainty in the geologic parameters of the reservoir, the characterization of the posterior is fundamental for the optimal management of reservoirs. Unfortunately, due to the large-scale highly nonlinear properties of standard reservoir models, characterizing the posterior is computationally prohibitive. Instead, more affordable ad hoc techniques, based on Gaussian approximations, are often used for characterizing the posterior distribution. Evaluating the performance of those Gaussian approximations is typically conducted by assessing their ability at reproducing the truth within the confidence interval provided by the ad hoc technique under consideration. This has the disadvantage of mixing up the approximation properties of the history matching algorithm employed with the information content of the particular observations used, making it hard to evaluate the effect of the ad hoc approximations alone. In this paper, we avoid this disadvantage by comparing the ad hoc techniques with a fully resolved state-of-the-art probing of the Bayesian posterior distribution. The ad hoc techniques whose performance we assess are based on (1) linearization around the maximum a posteriori estimate, (2) randomized maximum likelihood, and (3) ensemble Kalman filter-type methods. In order to fully resolve the posterior distribution, we implement a state-of-the art Markov chain Monte Carlo (MCMC) method that scales well with respect to the dimension of the parameter space, enabling us to study realistic forward models, in two space dimensions, at a high level of grid refinement. 
Our