WorldWideScience

Sample records for model perform significantly

  1. Field significance of performance measures in the context of regional climate model evaluation. Part 2: precipitation

    Science.gov (United States)

    Ivanov, Martin; Warrach-Sagi, Kirsten; Wulfmeyer, Volker

    2017-02-01

A new approach for rigorous spatial analysis of the downscaling performance of regional climate model (RCM) simulations is introduced. It is based on a multiple comparison of the local tests at the grid cells and is also known as 'field' or 'global' significance. The block length for the local resampling tests is precisely determined to adequately account for the time series structure. New performance measures for estimating the added value of downscaled data relative to the large-scale forcing fields are developed. The methodology is applied, as an example, to a standard EURO-CORDEX hindcast simulation with the Weather Research and Forecasting (WRF) model coupled with the land surface model NOAH at 0.11° grid resolution. Daily precipitation climatology for the 1990-2009 period is analysed for Germany for winter and summer in comparison with high-resolution gridded observations from the German Weather Service. The field significance test controls the proportion of falsely rejected local tests in a meaningful way and is robust to spatial dependence. Hence, the spatial patterns of the statistically significant local tests are also meaningful. We interpret them from a process-oriented perspective. While the downscaled precipitation distributions are statistically indistinguishable from the observed ones in most regions in summer, the biases of some distribution characteristics are significant over large areas in winter. WRF-NOAH generates appropriate stationary fine-scale climate features in the daily precipitation field over regions of complex topography in both seasons and appropriate transient fine-scale features almost everywhere in summer. As the added value of global climate model (GCM)-driven simulations cannot be smaller than this perfect-boundary estimate, this work demonstrates in a rigorous manner the clear additional value of dynamical downscaling over global climate simulations. The evaluation methodology has a broad spectrum of applicability as it is …
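
The abstract does not spell out the field-significance procedure; a common way to control the proportion of falsely rejected local tests across a grid (in the spirit of field-significance testing, and leaving aside the block-resampling step used for the local tests themselves) is false-discovery-rate control over the local p-values. A minimal sketch, with made-up p-values:

```python
import numpy as np

def field_significant(p_values, alpha=0.05):
    """Benjamini-Hochberg FDR control over a field of local-test p-values.

    Returns a boolean mask of locally significant grid cells such that the
    expected proportion of falsely rejected local tests is at most alpha."""
    p = np.asarray(p_values, dtype=float).ravel()
    n = p.size
    order = np.argsort(p)
    ranked = p[order]
    # find the largest k with p_(k) <= alpha * k / n
    below = ranked <= alpha * np.arange(1, n + 1) / n
    mask = np.zeros(n, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()
        mask[order[:k + 1]] = True
    return mask.reshape(np.shape(p_values))

# toy field: three genuinely small p-values among noise
pvals = [0.001, 0.004, 0.010, 0.200, 0.350, 0.600, 0.800, 0.900]
sig = field_significant(pvals)
```

Cells flagged by the mask form the spatial pattern that can then be interpreted from a process-oriented perspective.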

  2. Does Statistical Significance Help to Evaluate Predictive Performance of Competing Models?

    Directory of Open Access Journals (Sweden)

    Levent Bulut

    2016-04-01

Full Text Available In a Monte Carlo experiment with simulated data, we show that, as a point-forecast criterion, Clark and West's (2006) unconditional test of mean squared prediction errors does not reflect the relative performance of a superior model over a relatively weaker one. The simulation results show that even when the mean squared prediction errors of a constructed superior model are far below those of a weaker alternative, the Clark-West test does not reflect this in its test statistic. Therefore, studies that use this statistic to test the predictive accuracy of alternative exchange rate models, stock return predictability, inflation forecasting, and unemployment forecasting should not put too much weight on the magnitude of statistically significant Clark-West test statistics.
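
As a hedged sketch (the paper's Monte Carlo design is not reproduced here), the Clark-West adjusted mean-squared-prediction-error statistic for nested models is commonly computed as follows; the data below are simulated only for illustration:

```python
import numpy as np

def clark_west(y, f_small, f_big):
    """Clark-West adjusted MSPE-difference t-statistic for nested models.

    y: realizations; f_small: forecasts of the parsimonious (nested) model;
    f_big: forecasts of the larger model.  The (f_small - f_big)^2 term
    removes the noise introduced by estimating the larger model's extra
    parameters; large positive values favor the larger model."""
    y, f_small, f_big = map(np.asarray, (y, f_small, f_big))
    f = (y - f_small) ** 2 - ((y - f_big) ** 2 - (f_small - f_big) ** 2)
    return f.mean() / np.sqrt(f.var(ddof=1) / f.size)

# when the larger model is the true data-generating process the statistic
# is clearly positive, yet its magnitude says little about *how much*
# better the model is, which is the abstract's point
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = x + 0.1 * rng.normal(size=500)
t_stat = clark_west(y, f_small=np.zeros(500), f_big=x)
```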

  3. Significance of uncertainties derived from settling tank model structure and parameters on predicting WWTP performance - A global sensitivity analysis study

    DEFF Research Database (Denmark)

    Ramin, Elham; Sin, Gürkan; Mikkelsen, Peter Steen

    2011-01-01

    Uncertainty derived from one of the process models – such as one-dimensional secondary settling tank (SST) models – can impact the output of the other process models, e.g., biokinetic (ASM1), as well as the integrated wastewater treatment plant (WWTP) models. The model structure and parameter...... uncertainty of settler models can therefore propagate, and add to the uncertainties in prediction of any plant performance criteria. Here we present an assessment of the relative significance of secondary settling model performance in WWTP simulations. We perform a global sensitivity analysis (GSA) based....... The outcome of this study contributes to a better understanding of uncertainty in WWTPs, and explicitly demonstrates the significance of secondary settling processes that are crucial elements of model prediction under dry and wet-weather loading conditions....
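
The abstract truncates before naming the GSA method; one common choice in wastewater-treatment uncertainty studies is Monte Carlo sampling followed by standardized regression coefficients (SRC). A minimal sketch on a toy model, not the actual settler/ASM1 system:

```python
import numpy as np

def src_sensitivity(model, bounds, n=2000, seed=0):
    """Rank parameter significance via standardized regression coefficients
    (SRC): sample the parameter space uniformly, fit a linear regression of
    the model output on the inputs, and standardize the slopes.  |SRC| near
    1 marks a dominant parameter (valid when the linear fit is adequate)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(n, len(bounds)))
    y = np.array([model(x) for x in X])
    A = np.column_stack([np.ones(n), X])  # intercept + inputs
    beta = np.linalg.lstsq(A, y, rcond=None)[0][1:]
    return beta * X.std(axis=0) / y.std()

# toy "plant" output dominated by its first parameter
src = src_sensitivity(lambda p: 5.0 * p[0] + 0.5 * p[1], [(0, 1), (0, 1)])
```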

  4. On the significance of the noise model for the performance of a linear MPC in closed-loop operation

    DEFF Research Database (Denmark)

    Hagdrup, Morten; Boiroux, Dimitri; Mahmoudi, Zeinab

    2016-01-01

models typically mean fewer parameters to identify. Systematic tuning of such controllers is discussed. Simulation studies are conducted for linear time-invariant systems, showing that choosing a noise model of low order is beneficial for closed-loop performance. (C) 2016, IFAC (International Federation...... of Automatic Control) Hosting by Elsevier Ltd. All rights reserved....

  5. The Real World Significance of Performance Prediction

    Science.gov (United States)

    Pardos, Zachary A.; Wang, Qing Yang; Trivedi, Shubhendu

    2012-01-01

    In recent years, the educational data mining and user modeling communities have been aggressively introducing models for predicting student performance on external measures such as standardized tests as well as within-tutor performance. While these models have brought statistically reliable improvement to performance prediction, the real world…

  6. Assessing the performance characteristics and clinical forces in simulated shape memory bone staple surgical procedure: The significance of SMA material model.

    Science.gov (United States)

    Saleeb, A F; Dhakal, B; Owusu-Danquah, J S

    2015-07-01

    This work is focused on the detailed computer simulation of the key stages involved in a shape memory alloy (SMA) osteosynthesis bone stapling procedure. To this end, a recently developed three-dimensional constitutive SMA material model was characterized from test data of three simple uniaxial-isothermal-tension experiments for powder metallurgically processed nickel-rich NiTi (PM/NiTi-P) material. The calibrated model was subsequently used under the complex, thermomechanical loading conditions involved in the surgical procedure using the body-temperature-activated PM/NiTi-P bone staple. Our aim here is to assess the immediate and post-surgical performance characteristics of the stapling operation using the material model. From this study: (1) it was found that adequate compressive forces were developed by the PM/NiTi-P bone staple, with the tendency of this force to even increase under sustained thermal loading due to the intrinsic "inverse relaxation phenomena" in the SMA material, (2) the simulation results correlated well with those from experimental measurements, (3) the body-temperature-activated PM/NiTi-P staple was proved to be clinically viable, providing a stable clamping force needed for speedy coaptation of the fractured bones, and (4) these realistic assessments crucially depend on the use of suitable and comprehensive SMA material models. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. The significance of task significance: Job performance effects, relational mechanisms, and boundary conditions.

    Science.gov (United States)

    Grant, Adam M

    2008-01-01

    Does task significance increase job performance? Correlational designs and confounded manipulations have prevented researchers from assessing the causal impact of task significance on job performance. To address this gap, 3 field experiments examined the performance effects, relational mechanisms, and boundary conditions of task significance. In Experiment 1, fundraising callers who received a task significance intervention increased their levels of job performance relative to callers in 2 other conditions and to their own prior performance. In Experiment 2, task significance increased the job dedication and helping behavior of lifeguards, and these effects were mediated by increases in perceptions of social impact and social worth. In Experiment 3, conscientiousness and prosocial values moderated the effects of task significance on the performance of new fundraising callers. The results provide fresh insights into the effects, relational mechanisms, and boundary conditions of task significance, offering noteworthy implications for theory, research, and practice on job design, social information processing, and work motivation and performance.

  8. Training Significantly Improves Fetoscopy Performance: A Pilot Randomized Controlled Trial.

    Science.gov (United States)

    Mietzsch, Stefan; Boettcher, Johannes; Yang, Sisi; Chantereau, Pierre; Romero, Philip; Bergholz, Robert; Reinshagen, Konrad; Boettcher, Michael

    2016-10-01

Background Implementation of complex fetoscopic procedures that include intracorporeal suturing has been limited by technical difficulties that might be surmounted with adequate training. Evaluating the impact of laparoscopic or fetoscopic training on fetoscopic performance was the aim of this study. Methods To evaluate fetoscopic performance after either laparoscopic or fetoscopic training, subjects were asked to perform four surgeon's square knots fetoscopically before and after 2 hours of hands-on training. All subjects were medical students and novices in laparoscopic and fetoscopic interventions. Total time, knot stability (evaluated via tensiometer), suture accuracy, knot quality, and fetoscopic performance were assessed. Results Forty-six subjects were included in the study; after simple randomization, 24 were trained fetoscopically and 22 laparoscopically. Both groups had comparable baseline characteristics and improved significantly after training in all aspects assessed in this study. Subjects trained fetoscopically were superior in terms of suturing and knot-tying performance. Conclusion Training significantly improves fetoscopic performance and may indeed be the keystone for future complex fetoscopic interventions. It seems advisable to train fetoscopically rather than laparoscopically, as this results in higher suture and knot-tying quality. Georg Thieme Verlag KG Stuttgart · New York.

  9. Modeling typical performance measures

    NARCIS (Netherlands)

    Weekers, Anke Martine

    2009-01-01

In the educational, employment, and clinical context, attitude and personality inventories are used to measure typical performance traits. Statistical models are applied to obtain latent trait estimates. Often the same statistical models as those used in maximum performance measurement are applied …

  10. Photovoltaic array performance model.

    Energy Technology Data Exchange (ETDEWEB)

    Kratochvil, Jay A.; Boyson, William Earl; King, David L.

    2004-08-01

This document summarizes the equations and applications associated with the photovoltaic array performance model developed at Sandia National Laboratories over the last twelve years. Electrical, thermal, and optical characteristics for photovoltaic modules are included in the model, and the model is designed to use hourly solar resource and meteorological data. The versatility and accuracy of the model have been validated for flat-plate modules (all technologies) and for concentrator modules, as well as for large arrays of modules. Applications include system design and sizing, 'translation' of field performance measurements to standard reporting conditions, system performance optimization, and real-time comparison of measured versus expected system performance.
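
The Sandia model's electrical, thermal, and optical submodels are not reproduced here; as a hedged sketch of the 'translation to standard reporting conditions' idea only, a textbook first-order irradiance scaling and temperature correction (coefficient value assumed, not Sandia's) looks like:

```python
def power_at_stc(p_meas_w, irradiance_wm2, cell_temp_c,
                 gamma=-0.0045, e_stc=1000.0, t_stc=25.0):
    """Translate a field power measurement to standard test conditions by
    scaling linearly with irradiance and correcting with an assumed power
    temperature coefficient gamma (1/degC).  The full Sandia model uses far
    more detailed module-specific characteristics than this sketch."""
    return p_meas_w * (e_stc / irradiance_wm2) / (1 + gamma * (cell_temp_c - t_stc))

# e.g. 180 W measured at 800 W/m^2 and a 45 degC cell temperature
p_stc = power_at_stc(180.0, irradiance_wm2=800.0, cell_temp_c=45.0)
```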

  11. Modelling vocal anatomy's significant effect on speech

    NARCIS (Netherlands)

    de Boer, B.

    2010-01-01

    This paper investigates the effect of larynx position on the articulatory abilities of a humanlike vocal tract. Previous work has investigated models that were built to resemble the anatomy of existing species or fossil ancestors. This has led to conflicting conclusions about the relation between

  12. Hadoop Performance Models

    OpenAIRE

    Herodotou, Herodotos

    2011-01-01

Hadoop MapReduce is now a popular choice for performing large-scale data analytics. This technical report describes a detailed set of mathematical performance models for describing the execution of a MapReduce job on Hadoop. The models describe dataflow and cost information at the fine granularity of phases within the map and reduce tasks of a job execution. The models can be used to estimate the performance of MapReduce jobs as well as to find the optimal configuration settings to use when running the jobs.

  13. Hadoop Performance Models

    CERN Document Server

    Herodotou, Herodotos

    2011-01-01

    Hadoop MapReduce is now a popular choice for performing large-scale data analytics. This technical report describes a detailed set of mathematical performance models for describing the execution of a MapReduce job on Hadoop. The models describe dataflow and cost information at the fine granularity of phases within the map and reduce tasks of a job execution. The models can be used to estimate the performance of MapReduce jobs as well as to find the optimal configuration settings to use when running the jobs.
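
Herodotou's report develops detailed per-phase dataflow and cost formulas; as a deliberately simplified illustration of phase-granularity cost modeling (the slot counts and phase times below are invented), a job's running time can be composed from task "waves":

```python
import math

def mapreduce_runtime(n_maps, n_reduces, map_slots, reduce_slots,
                      t_map, t_shuffle, t_reduce):
    """Coarse phase-level cost model of a MapReduce job: tasks run in
    'waves' limited by available slots, and job time is the sum of the
    map, shuffle, and reduce phase times."""
    map_waves = math.ceil(n_maps / map_slots)
    reduce_waves = math.ceil(n_reduces / reduce_slots)
    return map_waves * t_map + reduce_waves * (t_shuffle + t_reduce)

# 100 map tasks on 25 slots run in 4 waves; 10 reduces fit in 1 wave
est = mapreduce_runtime(100, 10, 25, 10, t_map=30.0, t_shuffle=20.0, t_reduce=40.0)
```

Searching over configuration settings (slot counts, task granularity) against such a cost function is the flavor of optimization the report's models enable at much finer detail.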

  14. NIF capsule performance modeling

    OpenAIRE

    Weber S.; Callahan D.; Cerjan C.; Edwards M.; Haan S.; Hicks D.; Jones O.; Kyrala G.; Meezan N.; Olson R; Robey H.; Spears B.; Springer P.; Town R.

    2013-01-01

Post-shot modeling of NIF capsule implosions was performed in order to validate our physical and numerical models. Cryogenic layered target implosions and experiments with surrogate targets produce an abundance of capsule performance data including implosion velocity, remaining ablator mass, times of peak x-ray and neutron emission, core image size, core symmetry, neutron yield, and x-ray spectra. We have attempted to match the integrated data set with capsule-only simulations by adjusting the drive and other physics parameters within expected uncertainties.

  15. Significant improvements in long trace profiler measurement performance

    Energy Technology Data Exchange (ETDEWEB)

Takacs, P.Z. [Brookhaven National Lab., Upton, NY (United States)]; Bresloff, C.J. [Argonne National Lab., IL (United States)]

    1996-07-01

Modifications made to the Long Trace Profiler (LTP II) system at the Advanced Photon Source at Argonne National Laboratory have significantly improved the accuracy and repeatability of the instrument. The use of a Dove prism in the reference beam path corrects for phasing problems between mechanical errors and thermally-induced system errors. A single reference correction now completely removes both error signals from the measured surface profile. The addition of a precision air conditioner keeps the temperature in the metrology enclosure constant to within ±0.1°C over a 24-hour period and has significantly improved the stability and repeatability of the system. We illustrate the performance improvements with several sets of measurements. The improved environmental control has reduced thermal drift error to about 0.75 microradian RMS over a 7.5-hour time period. Measurements made in the forward scan direction and the reverse scan direction differ by only about 0.5 microradian RMS over a 500 mm trace length. We are now able to put a 1-sigma error bar of 0.3 microradian on an average of 10 slope profile measurements over a 500 mm trace length, and a 0.2 microradian error bar on an average of 10 measurements over a 200 mm trace length. The corresponding 1-sigma height error bar for this measurement is 1.1 nm.

  16. NIF capsule performance modeling

    Directory of Open Access Journals (Sweden)

    Weber S.

    2013-11-01

    Full Text Available Post-shot modeling of NIF capsule implosions was performed in order to validate our physical and numerical models. Cryogenic layered target implosions and experiments with surrogate targets produce an abundance of capsule performance data including implosion velocity, remaining ablator mass, times of peak x-ray and neutron emission, core image size, core symmetry, neutron yield, and x-ray spectra. We have attempted to match the integrated data set with capsule-only simulations by adjusting the drive and other physics parameters within expected uncertainties. The simulations include interface roughness, time-dependent symmetry, and a model of mix. We were able to match many of the measured performance parameters for a selection of shots.

  17. Significance of Attaining Users’ Feedback in Building Performance Assessment

    Directory of Open Access Journals (Sweden)

    Khalil Natasha

    2014-01-01

Full Text Available Generally, a building is a structure that provides basic shelter for humans to conduct everyday activities. In plain terms, the purpose of buildings is to provide a comfortable working and living space, protected from the extremes of climate. However, how a building is used depends on its lifespan, and changes over time affect its efficiency of use. Hence, more attention needs to be paid to the performance of buildings, as they do not remain static over time. This paper highlights the concept and requirements in evaluating building performance. It also explores the purposes of building performance evaluation and the link between performance and end-users, incorporating their feedback. It concludes that obtaining users' feedback is vital in building performance assessment, and that the assessment requirements must outline the performance criteria and mandates of the building concerned.

  18. Performance modeling of Beamlet

    Energy Technology Data Exchange (ETDEWEB)

    Auerbach, J.M.; Lawson, J.K.; Rotter, M.D.; Sacks, R.A.; Van Wonterghem, B.W.; Williams, W.H.

    1995-06-27

    Detailed modeling of beam propagation in Beamlet has been made to predict system performance. New software allows extensive use of optical component characteristics. This inclusion of real optical component characteristics has resulted in close agreement between calculated and measured beam distributions.

  19. ATR performance modeling concepts

    Science.gov (United States)

    Ross, Timothy D.; Baker, Hyatt B.; Nolan, Adam R.; McGinnis, Ryan E.; Paulson, Christopher R.

    2016-05-01

    Performance models are needed for automatic target recognition (ATR) development and use. ATRs consume sensor data and produce decisions about the scene observed. ATR performance models (APMs) on the other hand consume operating conditions (OCs) and produce probabilities about what the ATR will produce. APMs are needed for many modeling roles of many kinds of ATRs (each with different sensing modality and exploitation functionality combinations); moreover, there are different approaches to constructing the APMs. Therefore, although many APMs have been developed, there is rarely one that fits a particular need. Clarified APM concepts may allow us to recognize new uses of existing APMs and identify new APM technologies and components that better support coverage of the needed APMs. The concepts begin with thinking of ATRs as mapping OCs of the real scene (including the sensor data) to reports. An APM is then a mapping from explicit quantized OCs (represented with less resolution than the real OCs) and latent OC distributions to report distributions. The roles of APMs can be distinguished by the explicit OCs they consume. APMs used in simulations consume the true state that the ATR is attempting to report. APMs used online with the exploitation consume the sensor signal and derivatives, such as match scores. APMs used in sensor management consume neither of those, but estimate performance from other OCs. This paper will summarize the major building blocks for APMs, including knowledge sources, OC models, look-up tables, analytical and learned mappings, and tools for signal synthesis and exploitation.
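
As a toy illustration of the look-up-table building block listed above, an APM can map explicit quantized OCs to report distributions; the OC categories and probabilities here are invented, not from any fielded ATR:

```python
# Hypothetical APM: explicit, quantized operating conditions
# (weather, target aspect) -> distribution over ATR reports.
APM_TABLE = {
    ("clear", "broadside"): {"target": 0.92, "clutter": 0.05, "no_report": 0.03},
    ("clear", "end-on"):    {"target": 0.70, "clutter": 0.12, "no_report": 0.18},
    ("rain",  "broadside"): {"target": 0.55, "clutter": 0.25, "no_report": 0.20},
}

def predict_report_distribution(weather, aspect, table=APM_TABLE):
    """Return the report distribution for the given explicit OCs; a full
    APM would additionally marginalize over latent OC distributions."""
    return table[(weather, aspect)]

dist = predict_report_distribution("rain", "broadside")
```

A simulation-role APM would be keyed on true scene state, while a sensor-management APM would be keyed on whatever OCs are available before sensing; the table structure is the same.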

  20. THE SIGNIFICANCE OF LEADERSHIP FOR PERFORMANCE OF CONTRACT SERVICEMEN

    Directory of Open Access Journals (Sweden)

    Fomicheva O. V.

    2015-03-01

    Full Text Available Reform of the Armed Forces of the Russian Federation and, in particular, large-scale organizational and personnel changes, the transition to acquisition of the army and navy mostly contract servicemen, demanded new approaches to training and education of this category of servicemen. One of the innovative approaches of pedagogy, in this way, has been the development of techniques to build leadership skills and training of soldiers under contract to perform the functions of the military leader of the team that can win credibility with their subordinates, bring people together and send to the effective performance of duties. The article deals with the problem and the urgency of the formation of leadership qualities and the ability of the warrior-contractor for the effective performance of the functions under the leadership of the leader of the primary military collectives. Based on the analysis of domestic and foreign scientific literature on the study of the problems of leadership and formation of leadership in military, the article defines the essence of the concepts of "leadership" and "leader", and gives specific definitions of the concepts as applied to military-professional work of contract servicemen. The article concretized the value of leadership in the practical professional military activity warriors contract and the main features of its manifestations in the military collective

  1. Data management system performance modeling

    Science.gov (United States)

    Kiser, Larry M.

    1993-01-01

    This paper discusses analytical techniques that have been used to gain a better understanding of the Space Station Freedom's (SSF's) Data Management System (DMS). The DMS is a complex, distributed, real-time computer system that has been redesigned numerous times. The implications of these redesigns have not been fully analyzed. This paper discusses the advantages and disadvantages for static analytical techniques such as Rate Monotonic Analysis (RMA) and also provides a rationale for dynamic modeling. Factors such as system architecture, processor utilization, bus architecture, queuing, etc. are well suited for analysis with a dynamic model. The significance of performance measures for a real-time system are discussed.
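
Rate Monotonic Analysis, mentioned above as a static technique, has a classic closed-form sufficient schedulability test (the Liu-Layland utilization bound); a minimal sketch:

```python
def rma_schedulable(tasks):
    """Liu-Layland sufficient test for rate-monotonic scheduling.

    tasks: list of (compute_time, period) pairs.  Returns True when total
    utilization is at or below the n*(2**(1/n) - 1) bound; tasks failing
    this sufficient test may still be schedulable (exact response-time
    analysis would be needed to decide)."""
    n = len(tasks)
    u = sum(c / t for c, t in tasks)
    return u <= n * (2 ** (1 / n) - 1)

# two tasks at 65% total utilization pass the ~82.8% two-task bound
ok = rma_schedulable([(1, 4), (2, 5)])
```

Dynamic effects such as bus contention and queuing, which the paper argues for modeling separately, fall outside what this static test can capture.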

  2. A Model Performance

    Science.gov (United States)

    Thornton, Bradley D.; Smalley, Robert A.

    2008-01-01

    Building information modeling (BIM) uses three-dimensional modeling concepts, information technology and interoperable software to design, construct and operate a facility. However, BIM can be more than a tool for virtual modeling--it can provide schools with a 3-D walkthrough of a project while it still is on the electronic drawing board. BIM can…

  3. Principles of Sonar Performance Modeling

    NARCIS (Netherlands)

    Ainslie, M.A.

    2010-01-01

Sonar performance modelling (SPM) is concerned with the prediction of quantitative measures of sonar performance, such as probability of detection. It is a multidisciplinary subject, requiring knowledge and expertise in the disparate fields of underwater acoustics, acoustical oceanography, sonar signal processing …

  4. Digital Troposcatter Performance Model

    Science.gov (United States)

    1983-12-01

These performance measures require a complete statistical description of the components of the detection variable, which we … BER threshold P_thr. Let us denote by Γ the region of the 5-dimensional space (y, ẏ) in which the BER exceeds P_thr: Γ = {(y, ẏ): P_e(y, ẏ) > P_thr} (A.46) … P_out(y) by solving the nonlinear equation P_e(y, ẏ) = P_thr. A closed-form expression for P_out(y) cannot be obtained. Instead we developed an …

  5. Statistical modeling of program performance

    Directory of Open Access Journals (Sweden)

    A. P. Karpenko

    2014-01-01

Full Text Available The task of evaluating program performance often arises in the design of computer systems or during iterative compilation. The traditional way to solve this problem is to emulate program execution on the target system. A modern alternative approach evaluates program performance by statistical modeling on the computer under investigation. This work introduces Velocitas, such a statistical method of modeling program performance, and presents the method and its implementation in the Adaptor framework. An investigation of the method's effectiveness showed high adequacy of program performance prediction.

  6. Parameter and Process Significance in Mechanistic Modeling of Cellulose Hydrolysis

    Science.gov (United States)

    Rotter, B.; Barry, A.; Gerhard, J.; Small, J.; Tahar, B.

    2005-12-01

    The rate of cellulose hydrolysis, and of associated microbial processes, is important in determining the stability of landfills and their potential impact on the environment, as well as associated time scales. To permit further exploration in this field, a process-based model of cellulose hydrolysis was developed. The model, which is relevant to both landfill and anaerobic digesters, includes a novel approach to biomass transfer between a cellulose-bound biofilm and biomass in the surrounding liquid. Model results highlight the significance of the bacterial colonization of cellulose particles by attachment through contact in solution. Simulations revealed that enhanced colonization, and therefore cellulose degradation, was associated with reduced cellulose particle size, higher biomass populations in solution, and increased cellulose-binding ability of the biomass. A sensitivity analysis of the system parameters revealed different sensitivities to model parameters for a typical landfill scenario versus that for an anaerobic digester. The results indicate that relative surface area of cellulose and proximity of hydrolyzing bacteria are key factors determining the cellulose degradation rate.

  7. MODELING SUPPLY CHAIN PERFORMANCE VARIABLES

    Directory of Open Access Journals (Sweden)

    Ashish Agarwal

    2005-01-01

Full Text Available In order to understand the dynamic behavior of the variables that can play a major role in performance improvement in a supply chain, a System Dynamics-based model is proposed. The model provides an effective framework for analyzing the different variables affecting supply chain performance, and causal relationships among these variables have been identified. Variables emanating from performance measures such as gaps in customer satisfaction, cost minimization, lead-time reduction, service-level improvement and quality improvement have been identified as goal-seeking loops. The proposed System Dynamics-based model analyzes the effect of the dynamic behavior of these variables over a period of 10 years on the performance of a case supply chain in the auto business.
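
The goal-seeking loops identified in the abstract have a standard System Dynamics form: the gap between a goal and the current state drives a corrective flow. A minimal sketch with illustrative parameters, not values from the paper:

```python
def simulate_goal_seeking(goal, s0, tau, years, dt=0.1):
    """Euler integration of the classic goal-seeking loop: the gap between
    a target (e.g. desired service level) and the current state drives a
    corrective flow at rate 1/tau (adjustment time in years)."""
    s = s0
    path = [s0]
    for _ in range(int(years / dt)):
        s += dt * (goal - s) / tau
        path.append(s)
    return path

# a service level adjusting from 60% toward a 95% goal over 10 years
path = simulate_goal_seeking(goal=0.95, s0=0.60, tau=2.0, years=10)
```

Coupling several such loops (cost, lead time, quality) through shared stocks is what produces the dynamic behavior the model analyzes.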

  8. Is flow velocity a significant parameter in flood damage modelling?

    Directory of Open Access Journals (Sweden)

    H. Kreibich

    2009-10-01

    Full Text Available Flow velocity is generally presumed to influence flood damage. However, this influence is hardly quantified and virtually no damage models take it into account. Therefore, the influences of flow velocity, water depth and combinations of these two impact parameters on various types of flood damage were investigated in five communities affected by the Elbe catchment flood in Germany in 2002. 2-D hydraulic models with high to medium spatial resolutions were used to calculate the impact parameters at the sites in which damage occurred. A significant influence of flow velocity on structural damage, particularly on roads, could be shown in contrast to a minor influence on monetary losses and business interruption. Forecasts of structural damage to road infrastructure should be based on flow velocity alone. The energy head is suggested as a suitable flood impact parameter for reliable forecasting of structural damage to residential buildings above a critical impact level of 2 m of energy head or water depth. However, general consideration of flow velocity in flood damage modelling, particularly for estimating monetary loss, cannot be recommended.
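
Energy head, the suggested impact parameter, combines water depth with the velocity head v^2/(2g); a small sketch using the abstract's ~2 m critical level:

```python
G = 9.81  # gravitational acceleration, m/s^2

def energy_head(water_depth_m, flow_velocity_ms):
    """Energy head = water depth + velocity head v^2 / (2g)."""
    return water_depth_m + flow_velocity_ms ** 2 / (2.0 * G)

def exceeds_critical(depth_m, velocity_ms, critical_m=2.0):
    """Flag structural-damage risk above the ~2 m critical impact level."""
    return energy_head(depth_m, velocity_ms) >= critical_m
```

At 3 m/s the velocity head is only about 0.46 m, which illustrates why depth dominates except at high velocities.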

  9. Air Conditioner Compressor Performance Model

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Ning; Xie, YuLong; Huang, Zhenyu

    2008-09-05

During the past three years, the Western Electricity Coordinating Council (WECC) Load Modeling Task Force (LMTF) has led the effort to develop the new modeling approach. As part of this effort, the Bonneville Power Administration (BPA), Southern California Edison (SCE), and Electric Power Research Institute (EPRI) Solutions tested 27 residential air-conditioning units to assess their response to delayed voltage recovery transients. After completing these tests, different modeling approaches were proposed, among them a performance modeling approach that proved to be one of the three favored for its simplicity and ability to recreate different SVR events satisfactorily. Funded by the California Energy Commission (CEC) under its load modeling project, researchers at Pacific Northwest National Laboratory (PNNL) led the follow-on task to analyze the motor testing data to derive the parameters needed to develop a performance model for the single-phase air-conditioning (SPAC) unit. To derive the performance model, PNNL researchers first used the motor voltage and frequency ramping test data to obtain the real (P) and reactive (Q) power versus voltage (V) and frequency (f) curves. Then, curve fitting was used to develop the P-V, Q-V, P-f, and Q-f relationships for motor running and stalling states. The resulting performance model ignores the dynamic response of the air-conditioning motor. Because the inertia of the air-conditioning motor is very small (H<0.05), the motor moves from one steady state to another in a few cycles. So the performance model is a fair representation of the motor's behavior in both running and stalling states.
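
The curve-fitting step described above can be sketched with synthetic data; the voltage and power samples below are invented, not the PNNL test data:

```python
import numpy as np

# hypothetical measured real power (per unit) at stepped voltages (per
# unit) for a single-phase A/C motor in its running state
volts = np.array([0.80, 0.85, 0.90, 0.95, 1.00, 1.05, 1.10])
power = np.array([0.91, 0.93, 0.96, 0.98, 1.00, 1.03, 1.07])

# quadratic P(V) fit; the Q-V, P-f, and Q-f curves are fitted the same
# way, with separate coefficient sets for the running and stalled states
coeffs = np.polyfit(volts, power, deg=2)
p_model = np.poly1d(coeffs)
```

Since the model is quasi-static, evaluating `p_model` at the instantaneous voltage is all the "dynamics" this representation carries.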

  10. Modeling road-cycling performance.

    Science.gov (United States)

    Olds, T S; Norton, K I; Lowe, E L; Olive, S; Reay, F; Ly, S

    1995-04-01

This paper presents a complete set of equations for a "first principles" mathematical model of road-cycling performance, including corrections for the effects of wind, tire pressure and wheel radius, altitude, relative humidity, rotational kinetic energy, drafting, and changed drag. The relevant physiological, biophysical, and environmental variables were measured in 41 experienced cyclists completing a 26-km road time trial. The correlation between actual and predicted times was 0.89 (P …). The most important determinants of road-cycling performance are maximal O2 consumption, fractional utilization of maximal O2 consumption, mechanical efficiency, and projected frontal area. The model is then applied to some practical problems in road cycling: the effect of drafting, the advantage of using smaller front wheels, the effects of added mass, the importance of rotational kinetic energy, the effect of changes in drag due to changes in bicycle configuration, the normalization of performances under different conditions, and the limits of human performance.
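
The "first principles" model balances physiological power supply against mechanical power demand; below is a minimal sketch of the demand side only, with assumed constants (air density, rolling-resistance coefficient, drivetrain efficiency) and without the paper's wind, altitude, humidity, rotational-energy, and drafting corrections:

```python
RHO = 1.22   # air density at sea level, kg/m^3 (assumed)
G = 9.81     # gravitational acceleration, m/s^2
CRR = 0.004  # rolling-resistance coefficient (assumed)

def required_power(v, mass_kg, cda_m2, grade=0.0, efficiency=0.97):
    """Steady-state power demand: aerodynamic drag plus rolling resistance
    plus the gravity component on a grade, divided by drivetrain
    efficiency.  v in m/s, cda_m2 is the drag-area product Cd*A."""
    drag = 0.5 * RHO * cda_m2 * v ** 3
    rolling = CRR * mass_kg * G * v
    climbing = mass_kg * G * grade * v
    return (drag + rolling + climbing) / efficiency

# roughly 40 km/h on the flat for a 75 kg rider-plus-bike system
p = required_power(v=11.1, mass_kg=75, cda_m2=0.36)
```

Equating this demand to the supply side (maximal O2 consumption times fractional utilization times mechanical efficiency) is what lets the model predict time-trial performance.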

  11. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  12. Accuracy of pitch matching significantly improved by live voice model.

    Science.gov (United States)

    Granot, Roni Y; Israel-Kolatt, Rona; Gilboa, Avi; Kolatt, Tsafrir

    2013-05-01

    Singing is, undoubtedly, the most fundamental expression of our musical capacity, yet an estimated 10-15% of the Western population sings "out of tune" (OOT). Previous research in children and adults suggests, albeit inconsistently, that imitating a human voice can improve pitch matching. In the present study, we focus on the potentially beneficial effects of the human voice, and especially the live human voice. Eighteen participants of varying singing ability were required to imitate, in singing, a set of nine ascending and descending intervals presented to them in five different randomized blocked conditions: live piano, recorded piano, live voice using optimal voice production, recorded voice using optimal voice production, and recorded voice using artificial forced voice production. Pitch and interval matching in singing were much more accurate when participants repeated sung intervals than when the intervals were played to them on the piano. The advantage of the vocal over the piano stimuli was robust and emerged clearly regardless of whether piano tones were played live and in full view or presented via recording. Live vocal stimuli elicited higher accuracy than recorded vocal stimuli, especially when the recorded vocal stimuli were produced with a forced vocal production. Remarkably, even those who would be considered OOT singers on the basis of their performance when repeating piano tones were able to pitch match live vocal sounds, with deviations well within the range of what is considered accurate singing (M = 46.0, standard deviation = 39.2 cents). In fact, the participants who were most OOT gained the most from the live voice model. Results are discussed in light of a dual auditory-motor encoding of pitch analogous to that found in speech. Copyright © 2013 The Voice Foundation. Published by Mosby, Inc. All rights reserved.

  13. Injection temperature significantly affects in vitro and in vivo performance of collagen-platelet scaffolds.

    Science.gov (United States)

    Palmer, M P; Abreu, E L; Mastrangelo, A; Murray, M M

    2009-07-01

    Collagen-platelet composites have recently been used successfully as scaffolds to stimulate anterior cruciate ligament (ACL) wound healing in large animal models. These materials are typically kept on ice until use to prevent premature gelation; with surgical use, however, placement of a cold solution requires up to an hour of waiting while the solution comes to body temperature (at which point gelation occurs). Bringing the solution to a higher temperature before injection would likely shorten this intra-operative wait; however, the effects of doing so on composite performance are not known. The hypothesis tested here was that increasing the temperature of the gel at the time of injection would significantly decrease the time to gelation but would not significantly alter the mechanical properties of the composite or its ability to support functional tissue repair. Primary outcome measures were the maximum elastic modulus (stiffness) of the composite in vitro and the in vivo yield load of an ACL transection treated with an injected collagen-platelet composite. In vitro, injection temperatures over 30 degrees C resulted in a faster viscoelastic transition; however, the warmed composites showed a 50% decrease in maximum elastic modulus. In vivo, warming the gels before injection also decreased the yield load of the healing ACL at 14 weeks. These studies suggest that increasing the injection temperature of collagen-platelet composites decreases the performance of the composite in vitro and the strength of the healing ligament in vivo; the technique should therefore be used only with great caution.

  14. Performance modeling of optical refrigerators

    Energy Technology Data Exchange (ETDEWEB)

    Mills, G.; Mord, A. [Ball Aerospace and Technologies Corp., Boulder, CO (United States). Cryogenic and Thermal Engineering

    2006-02-15

    Optical refrigeration using anti-Stokes fluorescence in solids has several advantages over more conventional techniques including low mass, low volume, low cost and no vibration. It also has the potential of allowing miniature cryocoolers on the scale of a few cubic centimeters. It has been the topic of analysis and experimental work by several organizations. In 2003, we demonstrated the first optical refrigerator. We have developed a comprehensive system-level performance model of optical refrigerators. Our current version models the refrigeration cycle based on the fluorescent material emission and absorption data at ambient and reduced temperature for the Ytterbium-ZBLAN glass (Yb:ZBLAN) cooling material. It also includes the heat transfer into the refrigerator cooling assembly due to radiation and conduction. In this paper, we report on modeling results which reveal the interplay between size, power input, and cooling load. This interplay results in practical size limitations using Yb:ZBLAN. (author)
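
    The ideal cooling efficiency that such a system-level model builds on follows from photon energy bookkeeping: each pump photon is re-emitted at a shorter mean fluorescence wavelength, carrying away phonon energy. The wavelengths below are typical round numbers for Yb:ZBLAN and are assumed here, not taken from the Mills and Mord model.

```python
# Ideal anti-Stokes cooling efficiency: pump photons at lam_pump are
# re-emitted at the shorter mean fluorescence wavelength lam_fluo,
# removing the energy difference as heat from the solid.
lam_pump = 1015e-9   # pump wavelength (m), assumed typical Yb:ZBLAN value
lam_fluo = 995e-9    # mean fluorescence wavelength (m), assumed value

eta_cool = lam_pump / lam_fluo - 1.0  # fraction of absorbed power lifted as heat

# 100 W of absorbed pump then lifts at most eta_cool * 100 W of heat,
# before parasitic absorption and radiative/conductive heat leaks.
lift_per_100w = eta_cool * 100.0
```

    The roughly 2% ideal efficiency is why the interplay between size, input power, and parasitic heat load dominates practical design, as the paper reports.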

  15. Multiresolution wavelet-ANN model for significant wave height forecasting.

    Digital Repository Service at National Institute of Oceanography (India)

    Deka, P.C.; Mandal, S.; Prahlada, R.

    A hybrid wavelet artificial neural network (WLNN) has been applied in the present study to forecast significant wave heights (Hs). Here the Discrete Wavelet Transform is used to preprocess the time series data (Hs) prior to Artificial Neural Network modelling.
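
    The wavelet preprocessing step can be illustrated with a single-level Haar transform; the record does not specify the paper's wavelet family or decomposition depth, so the Haar basis and toy series below are assumptions.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail) coefficient arrays; a minimal
    stand-in for the discrete wavelet preprocessing step."""
    x = np.asarray(x, dtype=float)
    if len(x) % 2:                      # pad to even length
        x = np.append(x, x[-1])
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # low-frequency content
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # high-frequency content
    return approx, detail

# Decompose a toy significant-wave-height series (m). In a hybrid
# WLNN, the approximation and detail sub-series would each feed a
# separate ANN whose outputs are recombined into the Hs forecast.
hs = np.array([1.2, 1.4, 1.3, 1.6, 1.8, 1.7, 1.5, 1.4])
a1, d1 = haar_dwt(hs)
```

    The orthogonal Haar transform preserves the series' energy across the two sub-bands, which is what lets the sub-series forecasts be recombined without loss.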

  16. Significant alterations in reported clinical practice associated with increased oversight of organ transplant center performance.

    Science.gov (United States)

    Schold, Jesse D; Arrington, Charlotte J; Levine, Greg

    2010-09-01

    In the past several years, emphasis on quality metrics in the field of organ transplantation has increased significantly, largely because of the new conditions of participation issued by the Centers for Medicare and Medicaid Services. These regulations directly tie patients' outcomes and the measured performance of centers to the distribution of public funding to institutions. Moreover, insurers and marketing ventures have used publicly available outcomes data from transplant centers for business decision making and advertising purposes. We gave a 10-question survey to attendees of the Transplant Management Forum at the 2009 meeting of the United Network for Organ Sharing to ascertain how centers have responded to the increased oversight of performance. Of 63 responses, 55% indicated a low or near-low performance rating at their center in the past 3 years. Respondents from low-performing centers were significantly more likely to report tightened selection criteria for candidates (81% vs 38%, P = .001) and donors (77% vs 31%, P < .001), as well as alterations in clinical protocols (84% vs 52%, P = .007). Among respondents indicating lost insurance contracts (31%), these differences were also highly significant. Based on respondents' perceptions, the outcomes of performance evaluations are associated with significant changes in clinical practice at transplant centers. The transplant community and policy makers should remain vigilant that performance evaluations and regulatory oversight do not inadvertently lead to diminished access to care among viable candidates or to decreased transplant volume.

  17. Scalar Dark Matter Models with Significant Internal Bremsstrahlung

    CERN Document Server

    Giacchino, Federica; Tytgat, Michel H G

    2013-01-01

    There has been recent interest in particle physics models that may give rise to sharp gamma-ray spectral features from dark matter annihilation. Because dark matter is supposed to be electrically neutral, it is challenging to build weakly interacting massive particle models that accommodate both a large cross section into gamma rays at, say, the Galactic center, and the right dark matter abundance. In this work, we consider the gamma-ray signatures of a class of scalar dark matter models that interact with the Standard Model predominantly through heavy vector-like fermions (the vector-like portal). We focus on a real scalar singlet S annihilating into lepton-antilepton pairs. Because this two-body final-state annihilation channel is d-wave suppressed in the chiral limit, we show that virtual internal bremsstrahlung emission of a gamma ray gives a large correction, both today and at the time of freeze-out. For the sake of comparison, we confront this scenario with the familiar case of a Majorana singlet annihilating into lepton-antilepton pairs.

  18. The significance of genetics in pathophysiologic models of premature birth.

    Science.gov (United States)

    Uberos, Jose

    2017-05-31

    Prematurity is a major health problem in all countries; its higher incidence in certain ethnic groups and its tendency to recur imply the influence of genetic factors. Published genetic polymorphisms are identified in relation to the four pathophysiological models of prematurity described: chorioamniotic-decidual inflammation, the premature contraction pathway, decidual haemorrhage, and susceptibility to environmental toxins. Of 240 articles identified, 52 were excluded because they were not original, were not written in English, or were duplicates; 125 articles were included in the qualitative analysis. This review aims to update recent knowledge about genes associated with premature birth.

  19. On the significance of the Nash-Sutcliffe efficiency measure for event-based flood models

    Science.gov (United States)

    Moussa, Roger

    2010-05-01

    When modelling flood events, the modeller must first choose a rainfall-runoff model, then calibrate a set of parameters that can accurately simulate a number of flood events and the related hydrograph shapes, and finally evaluate the model performance separately on each event using multi-criteria functions. This study analyses the significance of the Nash-Sutcliffe efficiency (NSE) and proposes a new method to assess the performance of flood event models (see Moussa, 2010, "When monstrosity can be beautiful while normality can be ugly: assessing the performance of event-based flood models", Hydrological Sciences Journal, in press). We focus on the specific case of events that are difficult to model and characterized by low NSE values, which we call "monsters". The properties of the NSE were analysed as a function of the calculated hydrograph shape and of the benchmark reference model. As an application case, a multi-criteria method to assess model performance on each event is proposed and applied to the Gardon d'Anduze catchment. The paper first discusses the significance of the well-known NSE criterion when calculated separately on flood events. The NSE is a convenient and normalized measure of model performance, but it does not provide a reliable basis for comparing the results of different case studies. We show that simulated hydrographs with low or negative NSE values, the "monsters", can be due solely to a simple lag translation or a homothetic ratio of the observed hydrograph that reproduces the dynamics of the hydrograph, with acceptable errors on other criteria. Conversely, simulations with an NSE close to 1 can become "monsters" and give very low (even negative) values of the criterion G if the average observed discharge used as the benchmark reference model in the NSE is modified.
    This paper argues that the definition of an appropriate benchmark
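
    The NSE and its sensitivity to the benchmark can be sketched directly; the toy hydrograph below is illustrative, not data from the Gardon d'Anduze study.

```python
import numpy as np

def nse(obs, sim, benchmark=None):
    """Nash-Sutcliffe efficiency. The default benchmark is the mean of
    the observed hydrograph; passing a different benchmark series shows
    how the score changes with the reference model."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    if benchmark is None:
        bench = np.full_like(obs, obs.mean())
    else:
        bench = np.asarray(benchmark, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - bench) ** 2)

# A simulation that is only a one-step lag of the observed hydrograph
# reproduces its dynamics, yet scores a low NSE -- a "monster".
obs = np.array([1.0, 2.0, 8.0, 15.0, 9.0, 4.0, 2.0, 1.0])
lagged = np.roll(obs, 1)
nse_lagged = nse(obs, lagged)
```

    A perfect simulation scores 1, while the lagged hydrograph scores close to 0 despite capturing the flood's shape, which is the pathology the paper examines.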

  20. Parameter definition using vibration prediction software leads to significant drilling performance improvements

    Energy Technology Data Exchange (ETDEWEB)

    Amorim, Dalmo; Hanley, Chris Hanley; Fonseca, Isaac; Santos, Juliana [National Oilwell Varco, Houston TX (United States); Leite, Daltro J.; Borella, Augusto; Gozzi, Danilo [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    The understanding and mitigation of downhole vibration has been a heavily researched subject in the oil industry, as vibration makes drilling operations more expensive: it significantly diminishes the amount of effective drilling energy available to the bit and generates forces that can push the bit or the Bottom Hole Assembly (BHA) off its concentric axis of rotation, producing high-magnitude impacts with the borehole wall. To drill ahead, the rig must supply enough energy to overcome the resistance of the drilling system, including the reactive torque of the system, drag forces, fluid pressure losses, and the energy dissipated by downhole vibrations, while still providing the bit with the energy required to fail the rock. If the drill string enters resonant modes of vibration, it not only decreases the amount of energy available to drill but also increases the potential for catastrophic failures of downhole equipment and drill bits. Mitigating downhole vibrations therefore results in faster, smoother, and cheaper drilling operations. A software tool using Finite Element Analysis (FEA) has been developed to provide better understanding of downhole vibration phenomena in drilling environments. The tool calculates the response of the drilling system at various input conditions, based on the design of the wellbore along with the geometry of the Bottom Hole Assembly (BHA) and the drill string. It identifies where undesired levels of resonant vibration will be driven by certain combinations of specific drilling parameters, and which combinations of drilling parameters will result in lower levels of vibration, so that the fewest shocks, the highest penetration rate, and the lowest cost per foot can be achieved.
    With the growing performance of personal computers, complex software systems modeling drilling vibrations using FEA have become accessible to a wider audience of field users, further complementing real-time

  1. Oil cracking to gases: Kinetic modeling and geological significance

    Institute of Scientific and Technical Information of China (English)

    TIAN Hui; WANG Zhaoming; XIAO Zhongyao; LI Xianqing; XIAO Xianming

    2006-01-01

    A Triassic oil sample from well LN14 of the Tarim Basin was pyrolyzed in sealed gold tubes at 200-620 °C under a constant pressure of 50 MPa. The gaseous and residual soluble hydrocarbons were analyzed. The results show that the cracking of oil to gas can be divided into two distinct stages: the primary generation of total C1-5 gases from liquid oil, characterized by the dominance of C2-5 hydrocarbons, and the secondary or further cracking of C2-5 gases to methane and carbon-rich residues, leading to the progressive dryness of the gases. Based on the experimental data, kinetic parameters were determined for the primary generation and secondary cracking of oil-cracking gases and extrapolated to geological conditions to predict the thermal stability and cracking extent of crude oil. Finally, an evolution model for the thermal destruction of crude oil was proposed, and its implications for the migration and accumulation of oil-cracking gases were discussed.
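
    The extrapolation from laboratory to geological conditions typically rests on first-order Arrhenius kinetics. The activation energy and frequency factor below are illustrative round numbers in the range reported for oil cracking generally, not the fitted Tarim Basin values.

```python
import math

# First-order cracking kinetics extrapolated to geological time scales.
E_A = 250e3   # activation energy, J/mol (assumed illustrative value)
A = 1e16      # Arrhenius frequency factor, 1/s (assumed illustrative value)
R = 8.314     # gas constant, J/(mol K)

def conversion(temp_k, time_s):
    """Fraction of oil cracked after time_s at a constant temperature."""
    k = A * math.exp(-E_A / (R * temp_k))  # first-order rate constant
    return 1.0 - math.exp(-k * time_s)

SEC_PER_MA = 3.15e13  # seconds per million years
# ~180 C held for 10 Ma: essentially complete cracking under these parameters.
x_basin = conversion(temp_k=453.0, time_s=10 * SEC_PER_MA)
```

    Such calculations are what allow pyrolysis data collected over hours at 200-620 °C to predict oil stability over millions of years at reservoir temperatures.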

  2. A Procurement Performance Model for Construction Frameworks

    Directory of Open Access Journals (Sweden)

    Terence Y M Lam

    2015-07-01

    Full Text Available Collaborative construction frameworks have been developed in the United Kingdom (UK) to create longer-term relationships between clients and suppliers in order to improve project outcomes. Research into highways maintenance within a major county council has confirmed that such collaborative procurement methods can improve the time, cost, and quality of construction projects. Building upon this and examining the same single case, this research aims to develop a performance model through identification of performance drivers across the whole project delivery process, including the pre- and post-contract phases. An a priori performance model based on operational and sociological constructs was proposed and then checked by a pilot study. Factor analysis and central-tendency statistics from the questionnaires, as well as content analysis of the interview transcripts, were conducted. It was confirmed that long-term relationships, financial and non-financial incentives, and stronger communication are the sociological behaviour factors driving performance. The interviews also established that key performance indicators (KPIs) can be used as an operational measure to improve performance. With the a posteriori performance model, client project managers can collaboratively manage contractor performance through procurement measures, including the use of longer contract terms and KPIs, so that the expected project outcomes can be achieved. The findings also make a significant contribution to construction framework procurement theory by identifying the interrelated sociological and operational performance drivers. This study is set predominantly in the field of highways civil engineering. It is suggested that building-based projects, or other projects that share these characteristics, be grouped together and used for further research of the phenomena discovered.

  3. Significance of exchanging SSURGO and STATSGO data when modeling hydrology in diverse physiographic terranes

    Science.gov (United States)

    Williamson, Tanja N.; Taylor, Charles J.; Newson, Jeremy K.

    2013-01-01

    The Water Availability Tool for Environmental Resources (WATER) is a TOPMODEL-based hydrologic model that depends on spatially accurate soils data to function in diverse terranes. In Kentucky, this includes mountainous regions, karstic plateau, and alluvial plains. Soils data are critical because they quantify the space to store water, as well as how water moves through the soil to the stream during storm events. We compared how the model performs using two different sources of soils data--Soil Survey Geographic Database (SSURGO) and State Soil Geographic Database laboratory data (STATSGO)--for 21 basins ranging in size from 17 to 1564 km2. Model results were consistently better when SSURGO data were used, likely due to the higher field capacity, porosity, and available-water holding capacity, which cause the model to store more soil-water in the landscape and improve streamflow estimates for both low- and high-flow conditions. In addition, there were significant differences in the conductivity multiplier and scaling parameter values that describe how water moves vertically and laterally, respectively, as quantified by TOPMODEL. We also evaluated whether partitioning areas that drain to streams via sinkholes in karstic basins as separate hydrologic modeling units (HMUs) improved model performance. There were significant differences between HMUs in properties that control soil-water storage in the model, although the effect of partitioning these HMUs on streamflow simulation was inconclusive.

  4. Diabetes is associated with impaired myocardial performance in patients without significant coronary artery disease

    DEFF Research Database (Denmark)

    Andersson, Charlotte; Gislason, Gunnar H; Weeke, Peter;

    2010-01-01

    Patients with diabetes mellitus (DM) have a high risk of heart failure. Whether some of the risk is directly linked to metabolic derangements in the myocardium, or whether the risk is primarily caused by coronary artery disease (CAD) and hypertension, is incompletely understood. Echocardiographic tissue Doppler imaging was therefore performed in DM patients without significant CAD to examine whether DM per se influences cardiac function.

  5. Non-conventional modeling of extreme significant wave height through random sets

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yi; LAM Jasmine Siu Lee

    2014-01-01

    The analysis and design of offshore structures necessitates the consideration of wave loads. Realistic modeling of wave loads is particularly important to ensure the reliable performance of these structures. Among the available methods for modeling the extreme significant wave height on a statistical basis, the peak-over-threshold method has attracted the most attention. This method employs a Poisson process to characterize time-varying properties in the parameters of an extreme value distribution. In this paper, the peak-over-threshold method is reviewed and extended to account for subjectivity in the modeling. The freedom in selecting the threshold and the time span used to separate extremes from the original time series data is incorporated as imprecision in the model. This leads to an extension from random variables to random sets in the probabilistic model for the extreme significant wave height. The extended model is also applied to different periods of the sampled data to evaluate the significance of climatic conditions for the uncertainties of the parameters.
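
    The classical peak-over-threshold step that the paper generalizes can be sketched as follows, using synthetic data and simple method-of-moments estimates for the generalized Pareto distribution (GPD) fitted to the threshold excesses. The random-set extension would repeat this over a whole set of candidate thresholds rather than a single one.

```python
import numpy as np

def pot_gpd_mom(series, threshold):
    """Peak-over-threshold analysis with method-of-moments GPD estimates.
    Returns (shape xi, scale sigma, number of exceedances).
    A minimal sketch: no declustering of dependent exceedances."""
    excess = series[series > threshold] - threshold
    m = excess.mean()
    s2 = excess.var(ddof=1)
    xi = 0.5 * (1.0 - m * m / s2)           # MoM shape estimate
    sigma = 0.5 * m * (m * m / s2 + 1.0)    # MoM scale estimate
    return xi, sigma, len(excess)

# Toy significant-wave-height record (m): exponential tail, so the
# true GPD shape for the excesses is 0 and the true scale is 0.5.
rng = np.random.default_rng(0)
hs = rng.exponential(scale=0.5, size=5000) + 1.0
xi, sigma, n = pot_gpd_mom(hs, threshold=2.0)
```

    The sensitivity of (xi, sigma) to the chosen threshold is exactly the subjectivity the paper encodes as imprecision via random sets.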

  6. Significantly Enhanced Actuation Performance of IPMC by Surfactant-Assisted Processable MWCNT/Nafion Composite

    Institute of Scientific and Technical Information of China (English)

    Qingsong He; Min Yu; Dingshan Yu; Yan Ding; Zhendong Dai

    2013-01-01

    The performance of an Ionic Polymer Metal Composite (IPMC) actuator was significantly enhanced by incorporating surfactant-assisted processable Multi-Walled Carbon Nanotubes (MWCNTs) into a Nafion solution. The cationic surfactant Cetyl Trimethyl Ammonium Bromide (CTAB) was employed to disperse the MWCNTs in the Nafion matrix, forming a homogeneous and stable dispersion of nanotubes. The processing did not involve any strong acid treatment and thus effectively preserved the excellent electronic properties of the MWCNTs. The as-obtained MWCNT/Nafion-IPMC actuator was characterized in terms of conductivity, bulk and surface morphology, blocking force, and electric current. The blocking force and the current of the new IPMC were 2.4 times and 1.67 times higher, respectively, than those of a pure Nafion-based IPMC. Moreover, the MWCNT/IPMC performance is much better than that of previously reported Nafion-IPMCs doped with acid-treated MWCNTs. This significantly improved performance is attributed to the improvement in electrical properties associated with the addition of MWCNTs without acid treatment.

  7. Performance-oriented Organisation Modelling

    NARCIS (Netherlands)

    Popova, V.; Sharpanskykh, A.

    2006-01-01

    Each organisation exists or is created for the achievement of one or more goals. To ensure continued success, the organisation should monitor its performance with respect to the formulated goals. In practice the performance of an organisation is often evaluated by estimating its performance indicators.

  8. Significantly Increasing the Ductility of High Performance Polymer Semiconductors through Polymer Blending.

    Science.gov (United States)

    Scott, Joshua I; Xue, Xiao; Wang, Ming; Kline, R Joseph; Hoffman, Benjamin C; Dougherty, Daniel; Zhou, Chuanzhen; Bazan, Guillermo; O'Connor, Brendan T

    2016-06-08

    Polymer semiconductors based on donor-acceptor monomers have recently resulted in significant gains in field effect mobility in organic thin film transistors (OTFTs). These polymers incorporate fused aromatic rings and have been designed to have stiff planar backbones, resulting in strong intermolecular interactions, which subsequently result in stiff and brittle films. The complex synthesis typically required for these materials may also result in increased production costs. Thus, the development of methods to improve mechanical plasticity while lowering material consumption during fabrication will significantly improve opportunities for adoption in flexible and stretchable electronics. To achieve these goals, we consider blending a brittle donor-acceptor polymer, poly[4-(4,4-dihexadecyl-4H-cyclopenta[1,2-b:5,4-b']dithiophen-2-yl)-alt-[1,2,5]thiadiazolo[3,4-c]pyridine] (PCDTPT), with ductile poly(3-hexylthiophene). We found that the ductility of the blend films is significantly improved compared to that of neat PCDTPT films, and when the blend film is employed in an OTFT, the performance is largely maintained. The ability to maintain charge transport character is due to vertical segregation within the blend, while the improved ductility is due to intermixing of the polymers throughout the film thickness. Importantly, the application of large strains to the ductile films is shown to orient both polymers, which further increases charge carrier mobility. These results highlight a processing approach to achieve high performance polymer OTFTs that are electrically and mechanically optimized.

  10. Diabetes is associated with impaired myocardial performance in patients without significant coronary artery disease

    Directory of Open Access Journals (Sweden)

    Hansen Peter R

    2010-01-01

    Full Text Available Abstract Background Patients with diabetes mellitus (DM) have a high risk of heart failure. Whether some of the risk is directly linked to metabolic derangements in the myocardium or whether the risk is primarily caused by coronary artery disease (CAD) and hypertension is incompletely understood. Echocardiographic tissue Doppler imaging was therefore performed in DM patients without significant CAD to examine whether DM per se influenced cardiac function. Methods Patients with a left ventricular (LV) ejection fraction (EF) > 35% and without significant CAD, prior myocardial infarction, cardiac pacemaker, atrial fibrillation, or significant valve disease were identified from a tertiary invasive center register. DM patients were matched with controls on age, gender, and presence of hypertension. Results In total, 31 patients with diabetes and 31 controls were included. Mean age was 58 ± 12 years, mean LVEF was 51 ± 7%, and 48% were women. No significant differences were found in LVEF, left atrial end-systolic volume, or left ventricular dimensions. The global longitudinal strain was significantly reduced in patients with DM (15.9 ± 2.9 vs. 17.7 ± 2.9, p = 0.03), as were the peak longitudinal systolic (S') and early diastolic (E') velocities (5.7 ± 1.1 vs. 6.4 ± 1.1 cm/s, p = 0.02, and 6.1 ± 1.7 vs. 7.7 ± 2.0 cm/s, p = 0.002, respectively). In multivariable regression analyses, DM remained significantly associated with impairments of S' and E', respectively. Conclusion In patients without significant CAD, DM is associated with impaired systolic longitudinal LV function and global diastolic dysfunction. These abnormalities are likely to be markers of adverse prognosis.

  11. Significant performance enhancement in continuous wave terahertz photomixers based on fractal structures

    Science.gov (United States)

    Jafari, H.; Heidarzadeh, H.; Rostami, A.; Rostami, G.; Dolatyari, M.

    2017-01-01

    A photoconductive fractal antenna significantly improves the performance of photomixing-based continuous-wave (CW) terahertz (THz) systems. An analysis has been carried out of the generation of CW THz radiation by the photomixer photoconductive antenna technique. To increase the active area for generation, and hence the THz radiation power, we used interdigitated electrodes coupled with a fractal tree antenna. In this paper, both the semiconductor and the electromagnetic problems are considered. Photomixer devices with Thue-Morse fractal tree antennas in two configurations (narrow and wide) are discussed. This new approach gives better performance, especially in increasing the THz output power of photomixer devices, when compared with conventional structures. In addition, applying the interdigitated electrodes considerably improved the THz photocurrent, producing THz radiation power several times higher than that of photomixers with a simple gap.

  12. A conceptual model for manufacturing performance improvement

    Directory of Open Access Journals (Sweden)

    M.A. Karim

    2009-07-01

    Full Text Available Purpose: The important performance objectives that manufacturers seek can be achieved through adopting the appropriate manufacturing practices. This paper presents a conceptual model proposing relationships between advanced quality practices, perceived manufacturing difficulties, and manufacturing performance. Design/methodology/approach: A survey-based approach was adopted to test the hypotheses proposed in this study. The selection of research instruments for inclusion in the survey was based on a literature review, the pilot case studies, and the relevant industrial experience of the author. A sample of 1000 manufacturers across Australia was randomly selected. Quality managers were requested to complete the questionnaire, as dealing with quality and reliability issues is a quality manager's major responsibility. Findings: Evidence indicates that product quality and reliability is the main competitive factor for manufacturers. Design and manufacturing capability and on-time delivery came second. Price is considered the least important factor for the Australian manufacturers. Results show that, collectively, the advanced quality practices proposed in this study neutralize the difficulties manufacturers face and contribute to most of the manufacturers' performance objectives. The companies that place more emphasis on the advanced quality practices have fewer problems in manufacturing and better performance on most manufacturing performance indices. The results validate the proposed conceptual model and lend credence to the hypothesized relationships between quality practices, manufacturing difficulties, and manufacturing performance. Practical implications: The model shown in this paper provides a simple yet highly effective approach to achieving significant improvements in product quality and manufacturing performance. This study introduces a relationship-based 'proactive' quality management approach and provides great

  13. Focused R&D For Electrochromic Smart Windows: Significant Performance and Yield Enhancements

    Energy Technology Data Exchange (ETDEWEB)

    Mark Burdis; Neil Sbar

    2003-01-31

    There is a need to improve the energy efficiency of building envelopes as they are the primary factor governing the heating, cooling, lighting and ventilation requirements of buildings--influencing 53% of building energy use. In particular, windows contribute significantly to the overall energy performance of building envelopes, thus there is a need to develop advanced energy efficient window and glazing systems. Electrochromic (EC) windows represent the next generation of advanced glazing technology that will (1) reduce the energy consumed in buildings, (2) improve the overall comfort of the building occupants, and (3) improve the thermal performance of the building envelope. "Switchable" EC windows provide, on demand, dynamic control of visible light, solar heat gain, and glare without blocking the view. As exterior light levels change, the window's performance can be electronically adjusted to suit conditions. A schematic illustrating how SageGlass® electrochromic windows work is shown in Figure I.1. SageGlass® EC glazings offer the potential to save cooling and lighting costs, with the added benefit of improving thermal and visual comfort. Control over solar heat gain will also result in the use of smaller HVAC equipment. If a step change in the energy efficiency and performance of buildings is to be achieved, there is a clear need to bring EC technology to the marketplace. This project addresses accelerating the widespread introduction of EC windows in buildings and thus maximizing total energy savings in the U.S. and worldwide. We report on R&D activities to improve the optical performance needed to broadly penetrate the full range of architectural markets. Also, processing enhancements have been implemented to reduce manufacturing costs. Finally, tests are being conducted to demonstrate the durability of the EC device and the dual pane insulating glass unit (IGU) to be at least equal to that of conventional

  14. Assembly line performance and modeling

    National Research Council Canada - National Science Library

    Rane, Arun B; Sunnapwar, Vivek K

    2017-01-01

    The automobile sector forms the backbone of the manufacturing sector. The vehicle assembly line is an important section in an automobile plant where repetitive tasks are performed one after another at different workstations...

  15. Behavioral Change and Building Performance: Strategies for Significant, Persistent, and Measurable Institutional Change

    Energy Technology Data Exchange (ETDEWEB)

    Wolfe, Amy K.; Malone, Elizabeth L.; Heerwagen, Judith H.; Dion, Jerome P.

    2014-04-01

    The people who use Federal buildings — Federal employees, operations and maintenance staff, and the general public — can significantly impact a building’s environmental performance and the consumption of energy, water, and materials. Many factors influence building occupants’ use of resources (use behaviors), including work process requirements; the ability to fulfill agency missions; new and possibly unfamiliar high-efficiency/high-performance building technologies; a lack of understanding, education, and training; inaccessible information or ineffective feedback mechanisms; and cultural norms and institutional rules and requirements, among others. While many strategies have been used to introduce new occupant use behaviors that promote sustainability and reduced resource consumption, few have been verified in the scientific literature or have properly documented case study results. This paper documents validated strategies that have been shown to encourage new use behaviors that can result in significant, persistent, and measurable reductions in resource consumption. From the peer-reviewed literature, the paper identifies relevant strategies for Federal facilities and commercial buildings that focus on the individual, groups of individuals (e.g., work groups), and institutions — their policies, requirements, and culture. The paper documents methods with evidence of success in changing use behaviors and enabling occupants to effectively interact with new technologies/designs. It also provides a case study of the strategies used at a Federal facility — Fort Carson, Colorado. The paper documents gaps in the current literature and approaches, and provides topics for future research.

  16. A peptide-retrieval strategy enables significant improvement of quantitative performance without compromising confidence of identification.

    Science.gov (United States)

    Tu, Chengjian; Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun

    2017-01-30

    Reliable quantification of low-abundance proteins in complex proteomes is challenging, largely owing to the limited number of spectra/peptides identified. In this study we developed a straightforward method to improve the quantitative accuracy and precision of proteins by strategically retrieving the less confident peptides that were previously filtered out using the standard target-decoy search strategy. The filtered-out MS/MS spectra matched to confidently-identified proteins were recovered, and the peptide-spectrum-match FDR was re-calculated and controlled at a confident level of FDR≤1%, while protein FDR was maintained at ~1%. We evaluated the performance of this strategy in both spectral count- and ion current-based methods. Increases of >60% in total quantified spectra/peptides were achieved for a spike-in sample set and a public dataset from CPTAC, respectively. Incorporating the peptide retrieval strategy significantly improved the quantitative accuracy and precision, especially for low-abundance proteins (e.g. one-hit proteins). Moreover, the capacity for confidently discovering significantly-altered proteins was also enhanced substantially, as demonstrated with two spike-in datasets. In summary, improved quantitative performance was achieved by this peptide recovery strategy without compromising confidence of protein identification, and it can be readily implemented in a broad range of quantitative proteomics techniques, including label-free or labeling approaches. We hypothesize that more quantifiable spectra and peptides in a protein, even including less confident peptides, could help reduce variations and improve protein quantification. Hence the peptide retrieval strategy was developed and evaluated in two spike-in sample sets with different LC-MS/MS variations using both MS1- and MS2-based quantitative approaches. The list of confidently identified proteins using the standard target-decoy search strategy was fixed and more spectra/peptides with less
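    The target-decoy FDR control this record relies on can be sketched in a few lines. This is a minimal illustration, assuming each peptide-spectrum match (PSM) carries a search-engine score and a decoy flag; the function name and threshold logic are illustrative, not the authors' implementation.

```python
def score_cutoff_at_fdr(psms, max_fdr=0.01):
    """Return the lowest score cutoff keeping the estimated FDR within max_fdr.

    psms: iterable of (score, is_decoy) pairs from one search.
    The FDR at a cutoff is estimated as (#decoy hits) / (#target hits)
    among all PSMs scoring at or above that cutoff.
    """
    cutoff = None
    targets = decoys = 0
    for score, is_decoy in sorted(psms, key=lambda p: p[0], reverse=True):
        if is_decoy:
            decoys += 1
        else:
            targets += 1
        if targets and decoys / targets <= max_fdr:
            cutoff = score  # accepting everything down to here stays confident
    return cutoff
```

    Retrieved (previously filtered-out) PSMs matched to confident proteins would simply be pooled into `psms` before the cutoff is recomputed, as the abstract describes.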

  17. 48 CFR 1553.216-70 - EPA Form 1900-41A, CPAF Contract Summary of Significant Performance Observation.

    Science.gov (United States)

    2010-10-01

    ... Contract Summary of Significant Performance Observation. 1553.216-70 Section 1553.216-70 Federal... 1553.216-70 EPA Form 1900-41A, CPAF Contract Summary of Significant Performance Observation. As prescribed in 1516.404-278, EPA Form 1900-41A shall be used to document significant performance observations...

  18. Examining significant factors in micro and small enterprises performance: case study in Amhara region, Ethiopia

    Science.gov (United States)

    Cherkos, Tomas; Zegeye, Muluken; Tilahun, Shimelis; Avvari, Muralidhar

    2017-07-01

    Furniture manufacturing micro and small enterprises (MSEs) are confronted with several factors that affect their performance. Some enterprises fail to sustain themselves, some others remain for long periods of time without transforming, and most produce similar, non-standard products. The main aim of this manuscript is to improve the performance and contribution of MSEs by analyzing the impact of significant internal and external factors. Data were collected via a questionnaire, group discussions with experts, and interviews. Eight randomly selected representative main cities of Amhara region, with 120 furniture manufacturing enterprises, are considered. Data analysis and presentation were made using SPSS tools (correlation, proximity, and T test) and an impact-effort analysis matrix tool. The correlation analysis shows that politico-legal with infrastructure, leadership with entrepreneurship skills, and finance and credit with marketing are the factor pairs with the highest correlations, with Pearson correlation values of r = 0.988, 0.983, and 0.939, respectively. The study finds that the most critical factors faced by MSEs are work premises, access to finance, infrastructure, entrepreneurship, and business managerial problems. The impact of these factors is found to be high and is confirmed by the 50% drop-out rate in 2014/2015. Furthermore, MSEs were challenged by work-time losses of more than 25% daily due to power interruptions, and around 65% faced work premises problems. Further, an impact-effort matrix was developed to help the MSEs prioritize the affecting factors.

  19. Cognitive performance modeling based on general systems performance theory.

    Science.gov (United States)

    Kondraske, George V

    2010-01-01

    General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).

  20. Graphene Oxide Quantum Dots Covalently Functionalized PVDF Membrane with Significantly-Enhanced Bactericidal and Antibiofouling Performances

    Science.gov (United States)

    Zeng, Zhiping; Yu, Dingshan; He, Ziming; Liu, Jing; Xiao, Fang-Xing; Zhang, Yan; Wang, Rong; Bhattacharyya, Dibakar; Tan, Timothy Thatt Yang

    2016-02-01

    Covalent bonding of graphene oxide quantum dots (GOQDs) onto amino-modified polyvinylidene fluoride (PVDF) membrane has generated a new type of nano-carbon functionalized membrane with significantly enhanced antibacterial and antibiofouling properties. A continuous filtration test using E. coli-containing feedwater shows that the relative flux drop over GOQD-modified PVDF is 23%, which is significantly lower than those over pristine PVDF (86%) and GO-sheet modified PVDF (62%) after 10 h of filtration. The presence of the GOQD coating layer effectively inactivates E. coli and S. aureus cells and prevents biofilm formation on the membrane surface, producing excellent antimicrobial activity and potential antibiofouling capability, superior to those of previously reported two-dimensional GO sheet and one-dimensional CNT modified membranes. The distinctive antimicrobial and antibiofouling performances could be attributed to the unique structure and uniform dispersion of GOQDs, enabling the exposure of a larger fraction of active edges and facilitating the formation of oxidative stress. Furthermore, the GOQD-modified membrane possesses satisfactory long-term stability and durability due to the strong covalent interaction between PVDF and GOQDs. This study opens up a new synthetic avenue in the fabrication of efficient surface-functionalized polymer membranes for potential wastewater treatment and biomolecule separation.

  1. METAPHOR (version 1): Users guide [performability modeling]

    Science.gov (United States)

    Furchtgott, D. G.

    1979-01-01

    General information concerning METAPHOR, an interactive software package to facilitate performability modeling and evaluation, is presented. Example systems are studied and their performabilities are calculated. Each available METAPHOR command and array generator is described. Complete METAPHOR sessions are included.

  2. Assembly line performance and modeling

    Science.gov (United States)

    Rane, Arun B.; Sunnapwar, Vivek K.

    2017-03-01

    The automobile sector forms the backbone of the manufacturing sector. The vehicle assembly line is an important section in an automobile plant where repetitive tasks are performed one after another at different workstations. In this thesis, a methodology is proposed to reduce cycle time and time losses due to important factors like equipment failure, shortage of inventory, absenteeism, set-up, material handling, rejection and fatigue, to improve output within given cost constraints. Various relationships between these factors, the corresponding costs, and output are established using a scientific approach. This methodology is validated in three different vehicle assembly plants. The proposed methodology may help practitioners optimize an assembly line using lean techniques.

  3. Generalization performance of regularized neural network models

    DEFF Research Database (Denmark)

    Larsen, Jan; Hansen, Lars Kai

    1994-01-01

    Architecture optimization is a fundamental problem of neural network modeling. The optimal architecture is defined as the one which minimizes the generalization error. This paper addresses estimation of the generalization performance of regularized, complete neural network models. Regularization...

  4. No significant effect of prefrontal tDCS on working memory performance in older adults

    Directory of Open Access Journals (Sweden)

    Jonna Nilsson

    2015-12-01

    Full Text Available Transcranial direct current stimulation (tDCS) has been put forward as a non-pharmacological alternative for alleviating cognitive decline in old age. Although results have shown some promise, little is known about the optimal stimulation parameters for modulation in the cognitive domain. In this study, the effects of tDCS over the dorsolateral prefrontal cortex (dlPFC) on working memory performance were investigated in thirty older adults. An N-back task assessed working memory before, during and after anodal tDCS at a current strength of 1 mA and 2 mA, in addition to sham stimulation. The study used a single-blind, cross-over design. The results revealed no significant effect of tDCS on accuracy or response times during or after stimulation, for any of the current strengths. These results suggest that a single session of tDCS over the dlPFC is unlikely to improve working memory, as assessed by an N-back task, in old age.

  5. A background error covariance model of significant wave height employing Monte Carlo simulation

    Institute of Scientific and Technical Information of China (English)

    GUO Yanyou; HOU Yijun; ZHANG Chunmei; YANG Jie

    2012-01-01

    The quality of background error statistics is one of the key components for successful assimilation of observations in a numerical model. The background error covariance (BEC) of ocean waves is generally estimated under the assumption that it is stationary over a period of time and uniform over a domain. However, error statistics are in fact functions of the physical processes governing the meteorological situation and vary with the wave condition. In this paper, we simulated the BEC of the significant wave height (SWH) employing Monte Carlo methods. An interesting result is that the BEC varies consistently with the mean wave direction (MWD). In the model domain, the BEC of the SWH decreases significantly when the MWD changes abruptly. A new BEC model of the SWH based on the correlation between the BEC and MWD was then developed. A case study of regional data assimilation was performed, where the SWH observations of buoy 22001 were used to assess the SWH hindcast. The results show that the new BEC model benefits wave prediction and allows reasonable approximations of anisotropy and inhomogeneous errors.
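    A Monte Carlo estimate of the BEC amounts to a sample covariance over an ensemble of equally likely background fields. A minimal sketch in plain Python, with hypothetical grid values standing in for SWH fields:

```python
def ensemble_covariance(ensemble):
    """Monte Carlo estimate of the background error covariance.

    ensemble: list of m background fields, each a list of n grid values.
    Returns the n-by-n sample covariance matrix between grid points.
    """
    m, n = len(ensemble), len(ensemble[0])
    mean = [sum(field[j] for field in ensemble) / m for j in range(n)]
    cov = [[0.0] * n for _ in range(n)]
    for field in ensemble:
        dev = [field[j] - mean[j] for j in range(n)]
        for i in range(n):
            for j in range(n):
                cov[i][j] += dev[i] * dev[j] / (m - 1)  # unbiased estimator
    return cov
```

    Stratifying the ensemble by mean wave direction before computing the covariance would mirror the MWD-dependence the paper reports.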

  6. Summary of photovoltaic system performance models

    Energy Technology Data Exchange (ETDEWEB)

    Smith, J. H.; Reiter, L. J.

    1984-01-15

    The purpose of this study is to provide a detailed overview of photovoltaics (PV) performance modeling capabilities that have been developed during recent years for analyzing PV system and component design and policy issues. A set of 10 performance models have been selected which span a representative range of capabilities from generalized first-order calculations to highly specialized electrical network simulations. A set of performance modeling topics and characteristics is defined and used to examine some of the major issues associated with photovoltaic performance modeling. Next, each of the models is described in the context of these topics and characteristics to assess its purpose, approach, and level of detail. Then each of the issues is discussed in terms of the range of model capabilities available and summarized in tabular form for quick reference. Finally, the models are grouped into categories to illustrate their purposes and perspectives.

  7. ROLE AND SIGNIFICANCE OF STATEMENT OF OTHER COMPREHENSIVE INCOME– IN RESPECT OF REPORTING COMPANIES’ PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Ildiko Orban

    2014-07-01

    Full Text Available A commonly accepted rule system named the International Financial Reporting Standards (IFRS) created the framework for representing financial performance and other facts related to a company’s health. In the system of IFRS, profit is not equal to income less expenses; this deviation led to the other comprehensive income (OCI) term. IFRS created the term other comprehensive income, but knowledge and use of it are not widespread. In this paper I intend to present the meaning and essence of this income category, and to reveal how it works in corporate practice. As the basis of the research, definitions and formats related to the statement of comprehensive income are presented first. In order to get a clear picture of the differences between the income statements, I compare the IFRS and the Hungarian Accounting Act in the field of performance representation. As a result of my comparison I have found that the EU accepted the international financial reporting standards for presenting the financial performance of publicly traded companies, and as an EU member state this is obligatory for Hungarian companies as well. This is the reason why Hungary’s present task is adopting the IFRS mentality. After the comparative analysis I examined the statement of other comprehensive income in the practice of 11 companies listed on the Budapest Stock Exchange. The Premium category includes those companies’ series of liquid shares which have a broader investor base. The aim of this examination was to reveal whether the most significant listed companies calculate other comprehensive income and what kind of items they present in the statement of OCI. As a result of the research we can state that the statement of other comprehensive income is in general part of the statement of total comprehensive income, and not an individual statement. Main items of the other comprehensive income of the examined companies are the

  8. A medal share model for Olympic performance

    OpenAIRE

    Ang Sun; Rui Wang; Zhaoguo Zhan

    2015-01-01

    A sizable empirical literature relates a nation's Olympic performance to socioeconomic factors by adopting linear regression or a Tobit approach suggested by Bernard and Busse (2004). We propose an alternative model in which a nation's medal share depends on its competitiveness relative to other nations, and the model is logically consistent. Empirical evidence shows that our model fits the data better than the existing linear regression and Tobit models. Besides the Olympic Games, the proposed model and ...

  9. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to be able to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.

  10. Intern Performance in Three Supervisory Models

    Science.gov (United States)

    Womack, Sid T.; Hanna, Shellie L.; Callaway, Rebecca; Woodall, Peggy

    2011-01-01

    Differences in intern performance, as measured by a Praxis III-similar instrument were found between interns supervised in three supervisory models: Traditional triad model, cohort model, and distance supervision. Candidates in this study's particular form of distance supervision were not as effective as teachers as candidates in traditional-triad…

  11. Performance modeling of automated manufacturing systems

    Science.gov (United States)

    Viswanadham, N.; Narahari, Y.

    A unified and systematic treatment is presented of modeling methodologies and analysis techniques for performance evaluation of automated manufacturing systems. The book is the first treatment of the mathematical modeling of manufacturing systems. Automated manufacturing systems are surveyed and three principal analytical modeling paradigms are discussed: Markov chains, queues and queueing networks, and Petri nets.

  12. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    Full Text Available In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model, using an example taken from a management study.
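    Two quantitative performance measures commonly used in such validations, discrimination (AUC) and calibration (Brier score), can be computed without any libraries. A sketch under the assumption that the model outputs predicted probabilities on held-out data; the example predictions below are invented:

```python
def auc(probs, labels):
    """Probability that a random positive case outranks a random negative
    one (ties count half); equivalent to the Mann-Whitney U statistic."""
    pos = [p for p, y in zip(probs, labels) if y == 1]
    neg = [p for p, y in zip(probs, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def brier(probs, labels):
    """Mean squared gap between predicted probability and the 0/1 outcome."""
    return sum((p - y) ** 2 for p, y in zip(probs, labels)) / len(labels)

probs = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]  # hypothetical held-out predictions
truth = [1, 1, 0, 1, 0, 0]
```

    An AUC near 1 indicates good ranking of cases; a Brier score near 0 indicates well-calibrated probabilities.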

  13. AHP SUPPORT TO ESTIMATION OF THE INFORMATION SYSTEM (IS) SIGNIFICANCE TO THE BUSINESS PERFORMANCE, PARTICULARLY THE HOSPITALITY PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Daniela Garbin Praničević

    2013-02-01

    Full Text Available Numerous resources and working practices indicate that an information system consists of elements generally categorized as IT infrastructure, human IT resources and IT-enabled intangibles (Bharadwaj, 2000), which, to some extent, affect business performance. However, it has not been sufficiently explored which elements of the information system tend to impact it more and which less. Therefore, the main aim of the paper is to assess certain IS elements that influence business performance in general. It is supposed that this assessment, despite the complexity of IS elements and related decisions, can be supported by using the AHP method. Additionally, the obtained results are compared with the results of one of the author’s previous studies, based on hospitality managers' experience, regarding the relationship between IS elements and hospitality performance.
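    The AHP step the abstract relies on, turning pairwise comparisons of IS elements into importance weights, reduces to finding the principal eigenvector of the comparison matrix. A sketch via power iteration; the 3x3 matrix below is a hypothetical Saaty-scale comparison of the three IS element groups, not data from the study:

```python
def ahp_weights(pairwise, iters=200):
    """Approximate the principal eigenvector of a pairwise-comparison
    matrix by power iteration; normalized, it gives the AHP weights."""
    n = len(pairwise)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]  # renormalize each iteration
    return w

# Hypothetical comparison: IT infrastructure vs. human IT resources
# vs. IT-enabled intangibles, on Saaty's 1-9 scale.
matrix = [[1.0, 3.0, 5.0],
          [1 / 3, 1.0, 3.0],
          [1 / 5, 1 / 3, 1.0]]
weights = ahp_weights(matrix)
```

    A full AHP application would also check the consistency ratio of the matrix before trusting the weights.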

  14. Photovoltaic performance models - A report card

    Science.gov (United States)

    Smith, J. H.; Reiter, L. R.

    1985-01-01

    Models for the analysis of photovoltaic (PV) systems' designs, implementation policies, and economic performance, have proliferated while keeping pace with rapid changes in basic PV technology and extensive empirical data compiled for such systems' performance. Attention is presently given to the results of a comparative assessment of ten well documented and widely used models, which range in complexity from first-order approximations of PV system performance to in-depth, circuit-level characterizations. The comparisons were made on the basis of the performance of their subsystem, as well as system, elements. The models fall into three categories in light of their degree of aggregation into subsystems: (1) simplified models for first-order calculation of system performance, with easily met input requirements but limited capability to address more than a small variety of design considerations; (2) models simulating PV systems in greater detail, encompassing types primarily intended for either concentrator-incorporating or flat plate collector PV systems; and (3) models not specifically designed for PV system performance modeling, but applicable to aspects of electrical system design. Models ignoring subsystem failure or degradation are noted to exclude operating and maintenance characteristics as well.

  15. Performance of Information Criteria for Spatial Models.

    Science.gov (United States)

    Lee, Hyeyoung; Ghosh, Sujit K

    2009-01-01

    Model choice is one of the most crucial aspects of any statistical data analysis. It is well known that most models are just an approximation to the true data-generating process, but among such model approximations it is our goal to select the "best" one. Researchers typically consider a finite number of plausible models in statistical applications, and the related statistical inference depends on the chosen model. Hence model comparison is required to identify the "best" model among several such candidate models. This article considers the problem of model selection for spatial data. The issue of model selection for spatial models has been addressed in the literature by the use of traditional information criteria based methods, even though such criteria have been developed under the assumption of independent observations. We evaluate the performance of some of the popular model selection criteria via Monte Carlo simulation experiments using small to moderate samples. In particular, we compare the performance of some of the most popular information criteria, such as the Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC), and corrected AIC (AICc), in selecting the true model. The ability of these criteria to select the correct model is evaluated under several scenarios. This comparison is made using various spatial covariance models, ranging from stationary isotropic to nonstationary models.
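    The three criteria compared above differ only in how they penalize the k fitted parameters given n observations. A sketch using their standard definitions; the candidate model names and log-likelihoods below are invented for illustration:

```python
import math

def aic(loglik, k):
    return -2.0 * loglik + 2.0 * k                 # Akaike

def bic(loglik, k, n):
    return -2.0 * loglik + k * math.log(n)         # Schwarz/Bayesian

def aicc(loglik, k, n):
    # Small-sample corrected AIC; the correction matters when n/k is small.
    return aic(loglik, k) + 2.0 * k * (k + 1) / (n - k - 1)

# Toy selection between two spatial covariance models fit to n = 30 sites.
n = 30
candidates = {"exponential": (-42.1, 3), "matern": (-41.8, 4)}
scores = {name: aicc(ll, k, n) for name, (ll, k) in candidates.items()}
best = min(scores, key=scores.get)  # lowest criterion value wins
```

    Here the extra Matérn parameter does not buy enough likelihood to offset its AICc penalty, so the simpler model is selected.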

  16. Evidence That Bimanual Motor Timing Performance Is Not a Significant Factor in Developmental Stuttering

    Science.gov (United States)

    Hilger, Allison I.; Zelaznik, Howard; Smith, Anne

    2016-01-01

    Purpose: Stuttering involves a breakdown in the speech motor system. We address whether stuttering in its early stage is specific to the speech motor system or whether its impact is observable across motor systems. Method: As an extension of Olander, Smith, and Zelaznik (2010), we measured bimanual motor timing performance in 115 children: 70…

  18. MOF Thin Film-Coated Metal Oxide Nanowire Array: Significantly Improved Chemiresistor Sensor Performance.

    Science.gov (United States)

    Yao, Ming-Shui; Tang, Wen-Xiang; Wang, Guan-E; Nath, Bhaskar; Xu, Gang

    2016-07-01

    For the first time, a strategy combining metal oxides and metal-organic frameworks is proposed to design new materials for sensing volatile organic compounds. The prepared ZnO@ZIF-CoZn core-sheath nanowire arrays show greatly enhanced performance not only in selectivity but also in response, recovery behavior, and working temperature.

  19. Student-Led Project Teams: Significance of Regulation Strategies in High- and Low-Performing Teams

    Science.gov (United States)

    Ainsworth, Judith

    2016-01-01

    We studied group and individual co-regulatory and self-regulatory strategies of self-managed student project teams using data from intragroup peer evaluations and a postproject survey. We found that high team performers shared their research and knowledge with others, collaborated to advise and give constructive criticism, and demonstrated moral…

  1. Labour Mobility and Plant Performance in Denmark: The Significance of Related Inflows

    DEFF Research Database (Denmark)

    Timmermans, Bram; Boschma, Ron

    This paper investigates the impact of different types of labour mobility on plant performance, making use of the IDA-database that provides detailed information on all individuals and plants for the whole of Denmark. Our study shows that the effect of labour mobility can only be assessed when one a...

  2. Modeling Performance of Plant Growth Regulators

    Directory of Open Access Journals (Sweden)

    W. C. Kreuser

    2017-03-01

    Full Text Available Growing degree day (GDD) models can predict the performance of plant growth regulators (PGRs) applied to creeping bentgrass ( L.. The goal of this letter is to describe experimental design strategies and modeling approaches for creating PGR models for different PGRs, application rates, and turf species. Results from testing the models indicate that clipping yield should be measured until the growth response has diminished, in contrast to reapplication of a PGR at preselected intervals. During modeling, inclusion of an amplitude-dampening coefficient in the sinewave model allows the PGR effect to dissipate with time.
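    The dampened sinewave idea can be illustrated concretely. In the sketch below, relative clipping yield (treated/untreated) follows one sine cycle over thermal time, suppression then rebound, with an exponential dampening term so the effect fades; every parameter name and the exact functional form are illustrative assumptions, not Kreuser's fitted model:

```python
import math

def relative_yield(gdd, duration, amplitude, damping):
    """Relative clipping yield (treated / untreated) after a PGR application.

    gdd:       growing degree days accumulated since application
    duration:  thermal time spanning one suppression/rebound cycle
    amplitude: initial strength of the growth suppression
    damping:   amplitude-dampening coefficient (effect decays with GDD)
    """
    if gdd >= duration:
        return 1.0  # effect has fully dissipated
    decay = math.exp(-damping * gdd)
    return 1.0 - amplitude * decay * math.sin(2.0 * math.pi * gdd / duration)
```

    Early in the cycle the function dips below 1.0 (suppression); past the half-cycle it rises above 1.0, the post-suppression rebound that motivates GDD-based reapplication timing.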

  3. The Proposal of Key Performance Indicators in Facility Management and Determination the Weights of Significance

    Science.gov (United States)

    Rimbalová, Jarmila; Vilčeková, Silvia

    2013-11-01

    The practice of facilities management is rapidly evolving with the increasing interest in the discourse of sustainable development. The industry and its market are forecast to develop to include non-core functions: activities traditionally not associated with this profession but increasingly being addressed by facilities managers. The scale of growth in the built environment and the consequent growth of the facility management sector are anticipated to be enormous. Key Performance Indicators (KPIs) are measures that provide essential information about the performance of facility services delivery. In selecting KPIs, it is critical to limit them to those factors that are essential to the organization reaching its goals. It is also important to keep the number of KPIs small, to keep everyone's attention focused on achieving the same KPIs. This paper deals with the determination of weights of KPIs of FM in terms of the design and use of sustainable buildings.

  4. Cost and Performance Model for Photovoltaic Systems

    Science.gov (United States)

    Borden, C. S.; Smith, J. H.; Davisson, M. C.; Reiter, L. J.

    1986-01-01

    Lifetime cost and performance (LCP) model assists in assessment of design options for photovoltaic systems. LCP is simulation of performance, cost, and revenue streams associated with photovoltaic power systems connected to electric-utility grid. LCP provides user with substantial flexibility in specifying technical and economic environment of application.

  5. FOCUSED R&D FOR ELECTROCHROMIC SMART WINDOWS: SIGNIFICANT PERFORMANCE AND YIELD ENHANCEMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Marcus Milling

    2004-09-23

    Developments made under this program will play a key role in underpinning the technology for producing EC devices. It is anticipated that the work begun during this period will continue to improve materials properties, drive yields up and costs down, increase durability, and make manufacture simpler and more cost-effective. It is hoped that this will contribute to a successful and profitable industry, which will help reduce energy consumption and improve comfort for building occupants worldwide. The first major task involved improvements to the materials used in the process. The improvements made as a result of the work done during this project have contributed to enhanced performance, including dynamic range, uniformity, and electrical characteristics. Another major objective of the project was to develop technology to improve yield, reduce cost, and facilitate manufacturing of EC products. Improvements in overall EC device performance directly attributable to the work carried out as part of this project have been accompanied by an improvement in the repeatability and consistency of the production process. Innovative test facilities for characterizing devices in a timely and well-defined manner have been developed. The equipment has been designed so that scaling up to the higher throughput necessary for manufacturing is relatively straightforward. Finally, the third major goal was to assure the durability of the EC product, both through developments aimed at improving product performance and through novel procedures to test the durability of this new product. Both aspects have been demonstrated: a number of different durability tests were carried out, in-house and by independent third-party testers, and several novel durability tests were developed.

  6. Performance Engineering in the Community Atmosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    Worley, P; Mirin, A; Drake, J; Sawyer, W

    2006-05-30

    The Community Atmosphere Model (CAM) is the atmospheric component of the Community Climate System Model (CCSM) and is the primary consumer of computer resources in typical CCSM simulations. Performance engineering has been an important aspect of CAM development throughout its existence. This paper briefly summarizes these efforts and their impacts over the past five years.

  7. Performance of hedging strategies in interval models

    NARCIS (Netherlands)

    Roorda, Berend; Engwerda, Jacob; Schumacher, J.M.

    2005-01-01

    For a proper assessment of risks associated with the trading of derivatives, the performance of hedging strategies should be evaluated not only in the context of the idealized model that has served as the basis of strategy development, but also in the context of other models. In this paper we consid

  8. Significant performance improvement obtained in a wireless mesh network using a beamswitching antenna

    CSIR Research Space (South Africa)

    Lysko, AA

    2012-09-01

    Full Text Available Index terms: beamswitching, parasitic array. Wireless communications are experiencing significant growth, which can be expected to continue for at least one to two decades owing to steadily growing demand and the introduction of new bandwidth-hungry technologies... to improve the throughput in a wireless network by an order of magnitude.

  9. In surgeons performing cardiothoracic surgery is sleep deprivation significant in its impact on morbidity or mortality?

    Science.gov (United States)

    Asfour, Leila; Asfour, Victoria; McCormack, David; Attia, Rizwan

    2014-09-01

    A best evidence topic in cardiac surgery was written according to a structured protocol. The question addressed was: is there a difference in cardiothoracic surgery outcomes in terms of morbidity or mortality of patients operated on by a sleep-deprived surgeon compared with those operated on by a non-sleep-deprived surgeon? The reported search criteria yielded 77 papers, of which 15 were deemed to represent the best evidence on the topic. Three studies directly related to cardiothoracic surgery and 12 studies related to non-cardiothoracic surgery. Recommendations are based on 18 121 cardiothoracic patients and 214 666 non-cardiothoracic surgical patients. Different definitions of sleep deprivation were used in the studies, reviewing either the surgeon's sleeping hours or out-of-hours operating. Surgical outcomes reviewed included: mortality rate; neurological, renal, pulmonary, and infectious complications; length of stay; length of intensive care stay; cardiopulmonary bypass times; and aortic cross-clamp times. There were no significant differences in mortality or intraoperative complications between the groups of patients operated on by sleep-deprived versus non-sleep-deprived surgeons in the cardiothoracic studies. One study showed a significant increase in the rate of septicaemia in patients operated on by severely sleep-deprived surgeons (3.6%) compared with the moderately sleep-deprived (0.9%) and non-sleep-deprived groups (0.8%) (P = 0.03). In the non-cardiothoracic studies, 7 of the 12 studies demonstrated a statistically significantly higher reoperation rate in trauma cases (P sleep deprivation in cardiothoracic surgeons on morbidity or mortality. However, overall the non-cardiothoracic studies have demonstrated that operative time and sleep deprivation can have a significant impact on overall morbidity and mortality. It is likely that other confounding factors concomitantly affect outcomes in out-of-hours surgery.

  10. Analytical performance modeling for computer systems

    CERN Document Server

    Tay, Y C

    2013-01-01

    This book is an introduction to analytical performance modeling for computer systems, i.e., writing equations to describe their performance behavior. It is accessible to readers who have taken college-level courses in calculus and probability, networking and operating systems. This is not a training manual for becoming an expert performance analyst. Rather, the objective is to help the reader construct simple models for analyzing and understanding the systems that they are interested in.Describing a complicated system abstractly with mathematical equations requires a careful choice of assumpti

  11. The relative significance of lexical richness and syntactic complexity as predictors of academic reading performance

    Directory of Open Access Journals (Sweden)

    Mehdi Karami

    2013-11-01

    Full Text Available Reading academic texts that include mainly university textbooks has been a challenge for EAP learners. There are various reasons for text difficulty; however, linguistic elements were investigated in this study. The aim of this study was to determine whether lexical richness of the readers would be a more potent predictor of their academic reading performance or their ability for producing and processing complex syntactic structures. The study involved 50 ELT teacher trainees, 25 juniors and 25 seniors, at Shahid Madani University of Azerbaijan, Iran. In a standard multiple regression design, the participants were given an opinion essay-writing task and an IELTS academic reading test. Their scores on IELTS academic reading test were regressed against LFP (Lexical Frequency Profile and MLTU (Mean Length of T-Unit indexes of their essays. LFP index is a measure of lexical richness adapted to the web for free online access under the name Web-VocabProfile, and MLTU index is a measure of syntactic complexity. Results indicated that the ability in producing and processing complex syntactic structures rather than mere grammatical knowledge can be considered as effective a predictor of academic reading comprehension as lexical richness. Therefore, lexical richness may no longer be supposed as the single most important predictor of academic reading performance.
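    The study's design, regressing reading scores against an LFP and an MLTU index, can be sketched with ordinary least squares; all numbers below are simulated for illustration and are not the study's data:

```python
import numpy as np

# Simulated stand-in for the study's multiple regression design:
# IELTS reading scores regressed on a lexical richness index (LFP)
# and a syntactic complexity index (MLTU). Values are fabricated.
rng = np.random.default_rng(0)
n = 50                              # the study also had 50 participants
lfp = rng.normal(60.0, 10.0, n)     # hypothetical lexical-profile index
mltu = rng.normal(14.0, 3.0, n)     # hypothetical mean length of T-unit
score = 2.0 + 0.05 * lfp + 0.04 * mltu + rng.normal(0.0, 0.5, n)

# OLS fit with an intercept column:
X = np.column_stack([np.ones(n), lfp, mltu])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
intercept, b_lfp, b_mltu = beta
```

Comparing the standardized coefficients (or semi-partial correlations) of the two predictors is what lets such a design rank lexical richness against syntactic complexity as predictors.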

  12. Analysing the temporal dynamics of model performance for hydrological models

    NARCIS (Netherlands)

    Reusser, D.E.; Blume, T.; Schaefli, B.; Zehe, E.

    2009-01-01

    The temporal dynamics of hydrological model performance gives insights into errors that cannot be obtained from global performance measures assigning a single number to the fit of a simulated time series to an observed reference series. These errors can include errors in data, model parameters, or m

  13. Performance modeling, loss networks, and statistical multiplexing

    CERN Document Server

    Mazumdar, Ravi

    2009-01-01

    This monograph presents a concise mathematical approach for modeling and analyzing the performance of communication networks with the aim of understanding the phenomenon of statistical multiplexing. The novelty of the monograph is the fresh approach and insights provided by a sample-path methodology for queueing models that highlights the important ideas of Palm distributions associated with traffic models and their role in performance measures. Also presented are recent ideas of large buffer, and many sources asymptotics that play an important role in understanding statistical multiplexing. I

  14. Enhanced Thermoelectric Performance of Nanostructured Bi2Te3 through Significant Phonon Scattering.

    Science.gov (United States)

    Yang, Lei; Chen, Zhi-Gang; Hong, Min; Han, Guang; Zou, Jin

    2015-10-28

    N-type Bi2Te3 nanostructures were synthesized using a solvothermal method and in turn sintered using spark plasma sintering. The sintered n-type Bi2Te3 pellets preserved nanosized grains and showed an ultralow lattice thermal conductivity (∼0.2 W m(-1) K(-1)), which benefits from high-density small-angle grain boundaries accommodated by dislocations. Such strong phonon scattering leads to an enhanced ZT of 0.88 at 400 K. This study provides an efficient method to enhance the thermoelectric performance of thermoelectric nanomaterials through nanostructure engineering, making the as-prepared n-type nanostructured Bi2Te3 a promising candidate for room-temperature thermoelectric power generation and Peltier cooling.
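    The reported value follows from the standard thermoelectric figure of merit, ZT = S²σT/κ. A quick check with plausible but assumed transport values lands near the reported 0.88; only the temperature (400 K) and the approximate lattice thermal conductivity come from the abstract:

```python
def figure_of_merit(seebeck_V_per_K, elec_cond_S_per_m, thermal_cond_W_per_mK, temp_K):
    """Thermoelectric figure of merit: ZT = S^2 * sigma * T / kappa."""
    return seebeck_V_per_K ** 2 * elec_cond_S_per_m * temp_K / thermal_cond_W_per_mK

# Assumed, physically plausible values for n-type Bi2Te3; not measured data.
S = -180e-6    # Seebeck coefficient, V/K (assumed)
sigma = 6.0e4  # electrical conductivity, S/m (assumed)
kappa = 0.88   # total thermal conductivity, W m^-1 K^-1 (lattice + electronic, assumed)
zt = figure_of_merit(S, sigma, kappa, 400.0)  # close to the reported ZT of 0.88
```

The formula makes the abstract's logic explicit: suppressing the lattice part of κ through boundary phonon scattering raises ZT without touching the electronic numerator.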

  15. Postexercise Glycogen Recovery and Exercise Performance is Not Significantly Different Between Fast Food and Sport Supplements.

    Science.gov (United States)

    Cramer, Michael J; Dumke, Charles L; Hailes, Walter S; Cuddy, John S; Ruby, Brent C

    2015-10-01

    A variety of dietary choices are marketed to enhance glycogen recovery after physical activity. Past research informs recommendations regarding the timing, dose, and nutrient compositions to facilitate glycogen recovery. This study examined the effects of isoenergetic sport supplements (SS) vs. fast food (FF) on glycogen recovery and exercise performance. Eleven males completed two experimental trials in a randomized, counterbalanced order. Each trial included a 90-min glycogen depletion ride followed by a 4-hr recovery period. Absolute amounts of macronutrients (1.54 ± 0.27 g·kg(-1) carbohydrate, 0.24 ± 0.04 g·kg(-1) fat, and 0.18 ± 0.03 g·kg(-1) protein) as either SS or FF were provided at 0 and 2 hr. Muscle biopsies were collected from the vastus lateralis at 0 and 4 hr post exercise. Blood samples were analyzed at 0, 30, 60, 120, 150, 180, and 240 min post exercise for insulin and glucose, with blood lipids analyzed at 0 and 240 min. A 20k time-trial (TT) was completed following the final muscle biopsy. There were no differences in the blood glucose and insulin responses. Similarly, rates of glycogen recovery were not different across the diets (6.9 ± 1.7 and 7.9 ± 2.4 mmol·kg wet weight(-1)·hr(-1) for SS and FF, respectively). There was also no difference across the diets for TT performance (34.1 ± 1.8 and 34.3 ± 1.7 min for SS and FF, respectively). These data indicate that short-term food options to initiate glycogen resynthesis can include dietary options not typically marketed as sports nutrition products, such as fast food menu items.
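    The per-kilogram feeding targets translate into absolute doses in the obvious way; the 75-kg body mass below is an assumed example, not a study participant:

```python
def absolute_dose(grams_per_kg, body_mass_kg):
    """Convert a per-kilogram macronutrient target into an absolute dose."""
    return grams_per_kg * body_mass_kg

# Mean per-feeding targets from the study applied to a hypothetical 75-kg cyclist:
cho = absolute_dose(1.54, 75)  # carbohydrate, g per feeding
fat = absolute_dose(0.24, 75)  # fat, g per feeding
pro = absolute_dose(0.18, 75)  # protein, g per feeding
```

For that assumed rider the carbohydrate dose works out to roughly 115 g per feeding, delivered at 0 and 2 hr, which is what either menu (SS or FF) had to match to be isoenergetic.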

  16. Prognostic significance of performing universal HER2 testing in cases of advanced gastric cancer.

    Science.gov (United States)

    Jiménez-Fonseca, Paula; Carmona-Bayonas, Alberto; Sánchez Lorenzo, Maria Luisa; Plazas, Javier Gallego; Custodio, Ana; Hernández, Raquel; Garrido, Marcelo; García, Teresa; Echavarría, Isabel; Cano, Juana María; Rodríguez Palomo, Alberto; Mangas, Monserrat; Macías Declara, Ismael; Ramchandani, Avinash; Visa, Laura; Viudez, Antonio; Buxó, Elvira; Díaz-Serrano, Asunción; López, Carlos; Azkarate, Aitor; Longo, Federico; Castañón, Eduardo; Sánchez Bayona, Rodrigo; Pimentel, Paola; Limón, Maria Luisa; Cerdá, Paula; Álvarez Llosa, Renata; Serrano, Raquel; Lobera, Maria Pilar Felices; Alsina, María; Hurtado Nuño, Alicia; Gómez-Martin, Carlos

    2017-05-01

    Trastuzumab significantly improves overall survival (OS) when added to cisplatin and fluoropyrimidine as a treatment for HER2-positive advanced gastric cancer (AGC). The aim of this study was to evaluate the impact of the gradual implementation of HER2 testing on patient prognosis in a national registry of AGC. This Spanish National Cancer Registry includes patients consecutively recruited at 28 centers from January 2008 to January 2016. The effect of missing HER2 status was assessed using stratified Cox proportional hazards (PH) regression. The rate of HER2 testing increased steadily over time, from 58.3 % in 2008 to 92.9 % in 2016. HER2 was positive in 194 tumors (21.3 %). In the stratified Cox PH regression, each 1 % increase in patients who were not tested for HER2 at the institutions was associated with an approximately 0.3 % increase in the risk of death: hazard ratio, 1.0035 (CI 95 %, 1.001-1.005), P = 0.0019. Median OS was significantly lower at institutions with the highest proportions of patients who were not tested for HER2. Patients treated at centers that took longer to implement HER2 testing exhibited worse clinical outcomes. The speed of implementation behaves as a quality-of-care indicator. Reviewed guidelines on HER2 testing should be used to achieve this goal in a timely manner.
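    Under a proportional hazards model the per-1% hazard ratio compounds multiplicatively, so the penalty for a center that leaves many patients untested can be illustrated directly (the 20% and 50% untested rates below are hypothetical, not registry figures):

```python
def compounded_hazard_ratio(hr_per_percent, percent_untested):
    """Hazard ratio implied by a per-1%-untested HR under a Cox PH model.

    With HR 1.0035 per 1% untested (the study's estimate), a center
    with x% of patients untested carries an implied HR of 1.0035**x.
    """
    return hr_per_percent ** percent_untested

hr20 = compounded_hazard_ratio(1.0035, 20)  # hypothetical center: 20% untested
hr50 = compounded_hazard_ratio(1.0035, 50)  # hypothetical center: 50% untested
```

A seemingly tiny per-percent effect therefore implies roughly a 7% higher hazard of death at 20% untested and nearly 20% higher at 50% untested, which is why the authors treat testing uptake as a quality-of-care indicator.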

  17. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    Science.gov (United States)

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...

  18. Towards Systematic Benchmarking of Climate Model Performance

    Science.gov (United States)

    Gleckler, P. J.

    2014-12-01

    The process by which climate models are evaluated has evolved substantially over the past decade, with the Coupled Model Intercomparison Project (CMIP) serving as a centralizing activity for coordinating model experimentation and enabling research. Scientists with a broad spectrum of expertise have contributed to the CMIP model evaluation process, resulting in many hundreds of publications that have served as a key resource for the IPCC process. For several reasons, efforts are now underway to further systematize some aspects of the model evaluation process. First, some model evaluation can now be considered routine and should not require "re-inventing the wheel" or a journal publication simply to update results with newer models. Second, the benefit of CMIP research to model development has not been optimal because the publication of results generally takes several years and is usually not reproducible for benchmarking newer model versions. And third, there are now hundreds of model versions and many thousands of simulations, but there is no community-based mechanism for routinely monitoring model performance changes. An important change in the design of CMIP6 can help address these limitations. CMIP6 will include a small set of standardized experiments as an ongoing exercise (CMIP "DECK": ongoing Diagnostic, Evaluation and Characterization of Klima), so that modeling groups can submit them at any time and not be overly constrained by deadlines. In this presentation, efforts to establish routine benchmarking of existing and future CMIP simulations will be described. To date, some benchmarking tools have been made available to all CMIP modeling groups to enable them to readily compare with CMIP5 simulations during the model development process. A natural extension of this effort is to make results from all CMIP simulations widely available, including the results from newer models as soon as the simulations become available for research.
Making the results from routine

  19. The significance of orbital anatomy and periocular wrinkling when performing laser skin resurfacing.

    Science.gov (United States)

    Trelles, M A; Pardo, L; Benedetto, A V; García-Solana, L; Torrens, J

    2000-03-01

    Knowledge of orbital anatomy and the interaction of muscle contractions, gravitational forces, and photoaging is fundamental in understanding the limitations of carbon dioxide (CO2) laser skin resurfacing when rejuvenating the skin of the periocular area. Laser resurfacing does not change the mimetic behavior of the facial muscles, nor does it influence gravitational forces. When resurfacing periocular tissue, the creation of scleral show and ectropion is a potential consequence of an overzealous attempt at improving the sagging malar fat pad and eyelid laxity by performing an excess number of laser passes at the lateral portion of the lower eyelid. This results in an inadvertent widening of the palpebral fissure due to the lateral pull of the Orbicularis oculi. Retrospectively, 85 patients were studied who had undergone periorbital resurfacing with a CO2 laser using a new treatment approach. The Sharplan 40C CO2 FeatherTouch laser was programmed with a circular scanning pattern and used just for the shoulders of the wrinkles. A final laser pass was performed with the same program over the entire lower eyelid skin surface, excluding the outer lateral portion (e.g. a truncated triangle-like area) corresponding to the lateral canthus. Only a single laser pass was delivered to the lateral canthal triangle to avoid widening the lateral opening of the eyelid, which might lead to the potential complications of scleral show and ectropion. When the area of the crow's feet is to be treated, three passes on the skin of this entire lateral orbital surface are completed by moving laterally and upward toward the hairline. Patients examined on days 1, 7, 15, 30, 60, and one year after laser resurfacing showed good results. At two months after treatment, the clinical improvement was rated by the patient and physician as being "very good" in 81 of the 85 patients reviewed. These patients underwent laser resurfacing without complications. The proposed technique of

  20. Significance tests to determine the direction of effects in linear regression models.

    Science.gov (United States)

    Wiedermann, Wolfgang; Hagmann, Michael; von Eye, Alexander

    2015-02-01

    Previous studies have discussed asymmetric interpretations of the Pearson correlation coefficient and have shown that higher moments can be used to decide on the direction of dependence in the bivariate linear regression setting. The current study extends this approach by illustrating that the third moment of regression residuals may also be used to derive conclusions concerning the direction of effects. Assuming non-normally distributed variables, it is shown that the distribution of residuals of the correctly specified regression model (e.g., Y is regressed on X) is more symmetric than the distribution of residuals of the competing model (i.e., X is regressed on Y). Based on this result, 4 one-sample tests are discussed which can be used to decide which variable is more likely to be the response and which one is more likely to be the explanatory variable. A fifth significance test is proposed based on the differences of skewness estimates, which leads to a more direct test of a hypothesis that is compatible with direction of dependence. A Monte Carlo simulation study was performed to examine the behaviour of the procedures under various degrees of associations, sample sizes, and distributional properties of the underlying population. An empirical example is given which illustrates the application of the tests in practice.
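    The core idea, that residuals of the correctly specified regression are more symmetric than those of the reversed regression when the variables are non-normal, can be reproduced in a few lines of simulation; the data-generating process below is an assumed illustration, not the paper's Monte Carlo design:

```python
import numpy as np

def residual_skewness(y, x):
    """Sample skewness of OLS residuals from regressing y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    r = r - r.mean()
    return (r ** 3).mean() / (r ** 2).mean() ** 1.5

# Assumed setting: X is skewed (non-normal), Y = X + symmetric noise,
# so the true causal direction is X -> Y.
rng = np.random.default_rng(1)
x = rng.exponential(1.0, 5000)       # skewed explanatory variable
y = x + rng.normal(0.0, 1.0, 5000)   # response with symmetric error

# Correct model (y on x) leaves near-symmetric residuals; the reversed
# model (x on y) inherits skewness from the mis-specification.
skew_correct = abs(residual_skewness(y, x))
skew_wrong = abs(residual_skewness(x, y))
```

Comparing the two skewness estimates (or testing their difference, as the fifth proposed test does) then indicates which variable is more plausibly the response.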

  1. Performance results of HESP physical model

    Science.gov (United States)

    Chanumolu, Anantha; Thirupathi, Sivarani; Jones, Damien; Giridhar, Sunetra; Grobler, Deon; Jakobsson, Robert

    2017-02-01

    As a continuation of the published work on a model-based calibration technique with HESP (Hanle Echelle Spectrograph) as a case study, in this paper we present the performance results of the technique. We also describe how the open parameters were chosen in the model for optimization, the accuracy of the glass data, and the handling of discrepancies. It is observed through simulations that discrepancies in the glass data can be identified but not quantified. Having accurate glass data, which can be obtained from the glass manufacturers, is therefore important. The model's performance in various aspects is presented using the ThAr calibration frames from HESP during its pre-shipment tests. The accuracy of the model predictions, the comparison of its wavelength calibration with conventional empirical fitting, the behaviour of the open parameters during optimization, the model's ability to track instrumental drifts in the spectrum, and the performance of the double fibres are discussed. It is observed that the optimized model is able to predict to a high accuracy the drifts in the spectrum caused by environmental fluctuations. It is also observed that the pattern in the spectral drifts across the 2D spectrum, which varies from image to image, is predictable with the optimized model. We also discuss the possible science cases where the model can contribute.

  2. Critical review of glass performance modeling

    Energy Technology Data Exchange (ETDEWEB)

    Bourcier, W.L. [Lawrence Livermore National Lab., CA (United States)

    1994-07-01

    Borosilicate glass is to be used for permanent disposal of high-level nuclear waste in a geologic repository. Mechanistic chemical models are used to predict the rate at which radionuclides will be released from the glass under repository conditions. The most successful and useful of these models link reaction path geochemical modeling programs with a glass dissolution rate law that is consistent with transition state theory. These models have been used to simulate several types of short-term laboratory tests of glass dissolution and to predict the long-term performance of the glass in a repository. Although mechanistically based, the current models are limited by a lack of unambiguous experimental support for some of their assumptions. The most severe problem of this type is the lack of an existing validated mechanism that controls long-term glass dissolution rates. Current models can be improved by performing carefully designed experiments and using the experimental results to validate the rate-controlling mechanisms implicit in the models. These models should be supported with long-term experiments to be used for model validation. The mechanistic basis of the models should be explored by using modern molecular simulations such as molecular orbital and molecular dynamics to investigate both the glass structure and its dissolution process.
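    The transition-state-theory rate law the review refers to has the affinity form r = k(1 - Q/K), in which the net dissolution rate vanishes as the solution approaches saturation. A minimal sketch with illustrative, not measured, parameters:

```python
def dissolution_rate(k_forward, ion_activity_product, equilibrium_constant):
    """Transition-state-theory affinity rate law: r = k * (1 - Q/K).

    Q is the ion activity product of the rate-controlling reaction and
    K its equilibrium constant; the net rate falls toward zero as the
    solution saturates (Q -> K). Parameter values below are illustrative
    placeholders, not fitted glass-dissolution constants.
    """
    return k_forward * (1.0 - ion_activity_product / equilibrium_constant)

far_from_saturation = dissolution_rate(1.0e-3, 1.0e-8, 1.0e-6)  # Q << K: near-maximal rate
near_saturation = dissolution_rate(1.0e-3, 9.9e-7, 1.0e-6)      # Q ~ K: rate nearly zero
```

This form is what lets reaction-path codes couple solution chemistry to the glass rate law: as dissolved species accumulate, Q rises and the predicted rate drops, which is also why the long-term (near-saturation) rate-controlling mechanism matters so much for repository predictions.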

  3. More Use of Peritoneal Dialysis Gives Significant Savings: A Systematic Review and Health Economic Decision Model

    Science.gov (United States)

    Pike, Eva; Hamidi, Vida; Ringerike, Tove; Wisloff, Torbjorn; Klemp, Marianne

    2017-01-01

    Background Patients with end-stage renal disease (ESRD) need renal replacement therapy in the form of dialysis and/or transplantation. The prevalence of ESRD and, thus, the need for dialysis are constantly growing. The dialysis modalities are either peritoneal dialysis (PD) performed at home or hemodialysis (HD) performed in-center (hospital or satellite) or at home. We examined the effectiveness and cost-effectiveness of HD performed at different locations (hospital, satellite, and home) and of PD at home in the Norwegian setting. Methods We conducted a systematic review in several databases for patients above 18 years with end-stage renal failure requiring dialysis and performed several meta-analyses of the existing literature. Mortality and major complications were our main clinical outcomes. The quality of the evidence for each outcome was evaluated using GRADE. Cost-effectiveness was assessed by developing a probabilistic Markov model. The analysis was carried out from a societal perspective, and effects were expressed in quality-adjusted life-years. Uncertainties in the base-case parameter values were explored with a probabilistic sensitivity analysis. Scenario analyses were conducted by increasing the proportion of patients receiving PD, with a corresponding reduction in HD patients in-center, both for Norway and the European Union. We assumed an annual growth rate of 4% in the number of dialysis patients and a relative distribution between PD and HD in-center of 30% and 70%, respectively. Results From a societal perspective and over a 5-year time horizon, PD was the most cost-effective dialysis alternative. We found no significant difference in mortality between peritoneal and HD modalities. Our scenario analyses showed that a shift toward more patients on PD (as a first choice) with a corresponding reduction in HD in-center gave a saving over a 5-year period of 32 and 10,623 million EURO for Norway and the European Union, respectively.
Conclusions PD was
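    The structure of such a decision model can be shown with a deliberately tiny Markov cohort sketch; the costs, utilities, mortality rate, and 5-year horizon mechanics below are invented placeholders, not the study's Norwegian inputs:

```python
def five_year_cost_and_qaly(annual_cost, utility, annual_mortality, cycles=5):
    """Toy two-state (on dialysis / dead) Markov cohort over annual cycles.

    Accumulates expected cost and quality-adjusted life-years for the
    surviving fraction of the cohort. All inputs are hypothetical.
    """
    alive = 1.0
    cost = qaly = 0.0
    for _ in range(cycles):
        cost += alive * annual_cost
        qaly += alive * utility
        alive *= 1.0 - annual_mortality
    return cost, qaly

# Same (assumed) mortality for both modalities, mirroring the finding of
# no significant mortality difference; costs and utilities are invented.
pd_cost, pd_qaly = five_year_cost_and_qaly(45_000, 0.72, 0.12)  # PD at home
hd_cost, hd_qaly = five_year_cost_and_qaly(70_000, 0.70, 0.12)  # HD in-center
```

With equal survival, the cheaper modality dominates on cost and ties or wins on QALYs, which is the qualitative mechanism behind the savings the scenario analyses report; the real model adds complication states, discounting, and probabilistic sensitivity analysis on top of this skeleton.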

  4. Significance of radiation models in investigating the flow phenomena around a Jovian entry body

    Science.gov (United States)

    Tiwari, S. N.; Subramanian, S. V.

    1978-01-01

    A formulation is presented to demonstrate the significance of a simplified radiation model in investigating the flow phenomena in the viscous radiating shock layer of a Jovian entry body. For this, a nongray absorption model for the hydrogen-helium gas is developed which consists of 30 steps over the spectral range of 0-20 eV. By employing this model, results were obtained for temperature, pressure, density, and radiative flux in the shock layer and along the body surface. These are compared with results of two sophisticated radiative transport models available in the literature. Use of the present radiation model results in a significant reduction in computational time, and its results are found to be in general agreement with those of the other models. It is concluded that use of the present model is justified in investigating the flow phenomena around a Jovian entry body because it is relatively simple, computationally fast, and yields fairly accurate results.

  5. A novel approach to achieving significant reverberation control in performance halls

    Science.gov (United States)

    Conant, David A.; Chu, William

    2005-09-01

    Conventional methods for achieving broadband, variable sound absorption in large halls normally include heavy application of sound-absorptive drapery and/or thick fibrous panels, applied near available surfaces below, at, and in volumes above the catwalk plane. Occasionally, direct adjustments to room air volume are also provided to effect double-sloped decays. The novel method described here combines carefully located, broad scattering and absorption in singular architectural elements and was applied to a new, 1200-seat concert hall. A change of 0.70 s in midfrequency RT60 is achieved in a visually dramatic manner while neither materially changing room volume nor introducing often-maligned drapery. The aggregate of reverberation control methodologies employed reduces the unoccupied RT60 at midfrequencies from about 3.2 to 1.7 s in this space, programmed principally for music, including pipe organ. Results of MLS measurements, including binaural measurements and binaural recordings of anechoic material, and of CATT-Acoustic modeling and auralizations are discussed.
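    The scale of absorption such variable elements must contribute can be gauged with the Sabine relation RT60 = 0.161 V/A; the hall volume below is assumed, since the abstract gives only the RT60 range:

```python
def sabine_rt60(volume_m3, absorption_sabins_m2):
    """Sabine reverberation time (SI units): RT60 = 0.161 * V / A."""
    return 0.161 * volume_m3 / absorption_sabins_m2

def absorption_for_rt60(volume_m3, rt60_s):
    """Total absorption (m^2 sabins) needed to reach a target RT60."""
    return 0.161 * volume_m3 / rt60_s

V = 12_000.0                            # m^3: assumed hall volume, not from the abstract
a_live = absorption_for_rt60(V, 3.2)    # absorption in the "live" setting
a_damped = absorption_for_rt60(V, 1.7)  # absorption in the most damped setting
added = a_damped - a_live               # what the variable elements must contribute
```

For that assumed volume the variable elements must add several hundred square meters of equivalent absorption, which gives a sense of why conventional solutions resort to large drapery areas and why achieving the change within singular architectural elements is notable.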

  6. Performance modeling, stochastic networks, and statistical multiplexing

    CERN Document Server

    Mazumdar, Ravi R

    2013-01-01

    This monograph presents a concise mathematical approach for modeling and analyzing the performance of communication networks with the aim of introducing an appropriate mathematical framework for modeling and analysis as well as understanding the phenomenon of statistical multiplexing. The models, techniques, and results presented form the core of traffic engineering methods used to design, control and allocate resources in communication networks.The novelty of the monograph is the fresh approach and insights provided by a sample-path methodology for queueing models that highlights the importan

  7. PV performance modeling workshop summary report.

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua S.; Tasca, Coryne Adelle (SRA International, Inc., Fairfax, VA); Cameron, Christopher P.

    2011-05-01

    During the development of a solar photovoltaic (PV) energy project, predicting expected energy production from a system is a key part of understanding system value. System energy production is a function of the system design and location, the mounting configuration, the power conversion system, and the module technology, as well as the solar resource. Even if all other variables are held constant, annual energy yield (kWh/kWp) will vary among module technologies because of differences in response to low-light levels and temperature. A number of PV system performance models have been developed and are in use, but little has been published on validation of these models or the accuracy and uncertainty of their output. With support from the U.S. Department of Energy's Solar Energy Technologies Program, Sandia National Laboratories organized a PV Performance Modeling Workshop in Albuquerque, New Mexico, September 22-23, 2010. The workshop was intended to address the current state of PV system models, develop a path forward for establishing best practices on PV system performance modeling, and set the stage for standardization of testing and validation procedures for models and input parameters. This report summarizes discussions and presentations from the workshop, as well as examines opportunities for collaborative efforts to develop objective comparisons between models and across sites and applications.
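    At its simplest, annual energy yield can be framed with a performance-ratio model, in which module technologies differ through their aggregate loss factor; the numbers below are illustrative assumptions, not results from the workshop report:

```python
def annual_energy_yield(kwp, plane_of_array_kwh_m2, performance_ratio):
    """Simple performance-ratio PV yield model (illustrative).

    kWh/year = nameplate kWp * annual plane-of-array insolation
    (kWh/m^2, referenced to the 1 kW/m^2 STC irradiance) * performance
    ratio, a lumped factor for temperature, low-light, and wiring losses.
    """
    return kwp * plane_of_array_kwh_m2 * performance_ratio

# Two hypothetical module technologies at the same site and system size,
# differing only in their assumed loss factor:
yield_a = annual_energy_yield(5.0, 1800.0, 0.80)
yield_b = annual_energy_yield(5.0, 1800.0, 0.75)  # worse temperature/low-light response
specific_yield_a = yield_a / 5.0                   # kWh/kWp, the metric the abstract cites
```

Dividing out the nameplate rating gives the kWh/kWp specific yield the abstract mentions, which is exactly the quantity that varies between technologies even when everything else is held constant; full performance models replace the lumped ratio with explicit temperature and irradiance response curves.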

  8. Generating Performance Models for Irregular Applications

    Energy Technology Data Exchange (ETDEWEB)

    Friese, Ryan D.; Tallent, Nathan R.; Vishnu, Abhinav; Kerbyson, Darren J.; Hoisie, Adolfy

    2017-05-30

    Many applications have irregular behavior --- non-uniform input data, input-dependent solvers, irregular memory accesses, unbiased branches --- that cannot be captured using today's automated performance modeling techniques. We describe new hierarchical critical path analyses for the Palm model generation tool. To create a model's structure, we capture tasks along representative MPI critical paths. We create a histogram of critical tasks with parameterized task arguments and instance counts. To model each task, we identify hot instruction-level sub-paths and model each sub-path based on data flow, instruction scheduling, and data locality. We describe application models that generate accurate predictions for strong scaling when varying CPU speed, cache speed, memory speed, and architecture. We present results for the Sweep3D neutron transport benchmark; Page Rank on multiple graphs; Support Vector Machine with pruning; and PFLOTRAN's reactive flow/transport solver with domain-induced load imbalance.
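    A critical-path computation over a task graph, the backbone of the analyses described, can be sketched minimally; the task names, costs, and dependencies below are invented placeholders, not output from the Palm tool:

```python
# Toy critical-path computation over a task dependency DAG. Each task has
# a cost (e.g., seconds) and a list of tasks it must wait for; all values
# here are hypothetical.
costs = {"init": 2.0, "exchange": 5.0, "sweep": 9.0, "reduce": 1.0}
deps = {"init": [], "exchange": ["init"], "sweep": ["init"], "reduce": ["exchange", "sweep"]}

def finish_time(task):
    """Earliest finish time of a task: its cost plus its slowest dependency."""
    return costs[task] + max((finish_time(d) for d in deps[task]), default=0.0)

# The critical path is init -> sweep -> reduce (2 + 9 + 1):
critical = finish_time("reduce")
```

In the described approach the tasks along such a path are the ones histogrammed and modeled in detail, since only they bound the parallel runtime; memoization would be needed for realistically large DAGs.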

  9. Quasi-degenerate Neutrino mass models and their significance: A model independent investigation

    CERN Document Server

    Roy, S

    2016-01-01

    The prediction of possible ordering of neutrino masses relies mostly on the model selected. Alienating the $\mu-\tau$ interchange symmetry from discrete flavour symmetry based models turns the neutrino mass matrix less predictive. But this inspires one to seek the answer from other phenomenological frameworks. We need a proper parametrization of the neutrino mass matrices concerning individual hierarchies. In the present work, we attempt to study the six different cases of Quasi-degenerate (QDN) neutrino models. The related mass matrices, $m_{LL}^{\

  10. Significant uncertainty in global scale hydrological modeling from precipitation data errors

    NARCIS (Netherlands)

    Weiland, Frederiek C. Sperna; Vrugt, Jasper A.; van Beek, Rens (L. ) P. H.; Weerts, Albrecht H.; Bierkens, Marc F. P.

    2015-01-01

    In the past decades significant progress has been made in the fitting of hydrologic models to data. Most of this work has focused on simple, CPU-efficient, lumped hydrologic models using discharge, water table depth, soil moisture, or tracer data from relatively small river basins. In this paper, we

  11. Significant uncertainty in global scale hydrological modeling from precipitation data errors

    NARCIS (Netherlands)

    Sperna Weiland, F.; Vrugt, J.A.; Beek, van P.H.; Weerts, A.H.; Bierkens, M.F.P.

    2015-01-01

    In the past decades significant progress has been made in the fitting of hydrologic models to data. Most of this work has focused on simple, CPU-efficient, lumped hydrologic models using discharge, water table depth, soil moisture, or tracer data from relatively small river basins. In this paper, we

  12. Significance of model credibility in estimating climate projection distributions for regional hydroclimatological risk assessments

    Science.gov (United States)

    Brekke, L.D.; Dettinger, M.D.; Maurer, E.P.; Anderson, M.

    2008-01-01

    Ensembles of historical climate simulations and climate projections from the World Climate Research Programme's (WCRP's) Coupled Model Intercomparison Project phase 3 (CMIP3) multi-model dataset were investigated to determine how model credibility affects apparent relative scenario likelihoods in regional risk assessments. Methods were developed and applied in a Northern California case study. An ensemble of 59 twentieth century climate simulations from 17 WCRP CMIP3 models was analyzed to evaluate relative model credibility associated with a 75-member projection ensemble from the same 17 models. Credibility was assessed based on how realistically models reproduced selected statistics of historical climate relevant to California climatology. Metrics of this credibility were used to derive relative model weights leading to weight-threshold culling of models contributing to the projection ensemble. Density functions were then estimated for two projected quantities (temperature and precipitation), with and without considering credibility-based ensemble reductions. An analysis for Northern California showed that, while some models seem more capable of recreating limited aspects of twentieth century climate, the overall tendency is for comparable model performance when several credibility measures are combined. Use of these metrics to decide which models to include in density function development led to local adjustments to function shapes, but had limited effect on breadth and central tendency, which were found to be more influenced by 'completeness' of the original ensemble in terms of models and emissions pathways. © 2007 Springer Science+Business Media B.V.
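    The credibility-weighting and threshold-culling steps described in this abstract can be sketched as follows; the inverse-error weighting scheme, the threshold value, and all numbers are illustrative assumptions, not the study's actual metrics or data.

```python
def credibility_weights(model_errors):
    # Convert per-model historical-climate errors into relative weights:
    # smaller error -> larger weight; weights normalized to sum to 1.
    inv = [1.0 / e for e in model_errors]
    total = sum(inv)
    return [w / total for w in inv]

def cull_and_average(projections, weights, threshold):
    # Weight-threshold culling: drop models whose credibility weight is
    # below the threshold, then form a credibility-weighted mean.
    kept = [(p, w) for p, w in zip(projections, weights) if w > threshold]
    kept_weight = sum(w for _, w in kept)
    return [p for p, _ in kept], sum(p * w for p, w in kept) / kept_weight

proj = [1.8, 2.4, 2.1, 3.0, 2.2]   # hypothetical projected warming (deg C)
errs = [0.5, 0.2, 0.3, 1.0, 0.25]  # hypothetical historical-climate errors

w = credibility_weights(errs)
kept, weighted_mean = cull_and_average(proj, w, threshold=0.15)
```

    A density function for the projected quantity would then be estimated from the kept members only, which is how credibility-based ensemble reduction can reshape the distribution.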

  13. The Significance of the Bystander Effect: Modeling, Experiments, and More Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Brenner, David J.

    2009-07-22

    Non-targeted (bystander) effects of ionizing radiation are caused by intercellular signaling; they include production of DNA damage and alterations in cell fate (i.e. apoptosis, differentiation, senescence or proliferation). Biophysical models capable of quantifying these effects may improve cancer risk estimation at radiation doses below the epidemiological detection threshold. Understanding the spatial patterns of bystander responses is important, because it provides estimates of how many bystander cells are affected per irradiated cell. In a first approach to modeling of bystander spatial effects in a three-dimensional artificial tissue, we assumed the following: (1) The bystander phenomenon results from signaling molecules (S) that rapidly propagate from irradiated cells and decrease in concentration (exponentially in the case of planar symmetry) as distance increases. (2) These signals can convert cells to a long-lived epigenetically activated state, e.g. a state of oxidative stress; cells in this state are more prone to DNA damage and behavior alterations than normal and therefore exhibit an increased response (R) for many end points (e.g. apoptosis, differentiation, micronucleation). These assumptions were implemented by a mathematical formalism and computational algorithms. The model adequately described data on bystander responses in the 3D system using a small number of adjustable parameters. Mathematical models of radiation carcinogenesis are important for understanding mechanisms and for interpreting or extrapolating risk. There are two classes of such models: (1) long-term formalisms that track pre-malignant cell numbers throughout an entire lifetime but treat initial radiation dose-response simplistically and (2) short-term formalisms that provide a detailed initial dose-response even for complicated radiation protocols, but address its modulation during the subsequent cancer latency period only indirectly. We argue that integrating short- and long
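    The first modeling assumption above (signal concentration decaying exponentially with distance under planar symmetry) lends itself to a short sketch; the saturating activation response and all parameter values are hypothetical illustrations, not the paper's fitted formalism.

```python
import math

def signal_concentration(s0, distance, decay_length):
    # Planar-symmetry assumption: signal falls off exponentially with distance.
    return s0 * math.exp(-distance / decay_length)

def activation_probability(conc, k):
    # Assumed saturating (Michaelis-Menten-like) response of bystander cells
    # to the local signal concentration.
    return conc / (conc + k)

# Fraction of bystander cells activated at increasing distances (arbitrary units)
profile = [activation_probability(signal_concentration(1.0, d, decay_length=2.0), k=0.5)
           for d in (0, 1, 2, 4)]
```

    Such a profile gives a direct estimate of how many bystander cells are affected per irradiated cell as a function of distance.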

  14. Introducing Model Predictive Control for Improving Power Plant Portfolio Performance

    DEFF Research Database (Denmark)

    Edlund, Kristian Skjoldborg; Bendtsen, Jan Dimon; Børresen, Simon

    2008-01-01

    This paper introduces a model predictive control (MPC) approach for construction of a controller for balancing the power generation against consumption in a power system. The objective of the controller is to coordinate a portfolio consisting of multiple power plant units in the effort to perform...... reference tracking and disturbance rejection in an economically optimal way. The performance function is chosen as a mixture of the ℓ1-norm and a linear weighting to model the economics of the system. Simulations show a significant improvement of the performance of the MPC compared to the current...
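    A full MPC formulation is beyond the scope of an abstract, but the economic trade-off in such a cost function can be illustrated with a simple merit-order dispatch stand-in; the function names, unit data, and greedy strategy below are illustrative assumptions, not the paper's controller.

```python
def dispatch(reference, capacities, marginal_costs):
    # Greedy merit-order dispatch: cheapest units first until the reference
    # production is met (a stand-in for an MPC optimizer trading off an
    # l1-norm tracking term against a linear economic cost).
    order = sorted(range(len(capacities)), key=lambda i: marginal_costs[i])
    u = [0.0] * len(capacities)
    remaining = reference
    for i in order:
        u[i] = min(capacities[i], remaining)
        remaining -= u[i]
        if remaining <= 0:
            break
    tracking_error = abs(reference - sum(u))          # l1-norm term of the cost
    economic_cost = sum(c * p for c, p in zip(marginal_costs, u))
    return u, tracking_error, economic_cost

u, err, cost = dispatch(700.0, capacities=[400, 300, 200],
                        marginal_costs=[20.0, 35.0, 50.0])
```

    An actual MPC would solve this trade-off over a prediction horizon subject to ramp-rate constraints rather than greedily at a single instant.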

  15. Port performance evaluation tool based on microsimulation model

    Directory of Open Access Journals (Sweden)

    Tsavalista Burhani Jzolanda

    2017-01-01

    Full Text Available As port performance is becoming correlative to national competitiveness, the issue of port performance evaluation has gained significance. Port performance can simply be indicated by port service levels to the ship (e.g., throughput, waiting time for berthing, etc.), as well as the utilization level of equipment and facilities within a certain period. The performance evaluation then can be used as a tool to develop related policies for improving the port's performance to be more effective and efficient. However, the evaluation is frequently conducted based on a deterministic approach, which hardly captures the natural variations of port parameters. Therefore, this paper presents a stochastic microsimulation model for investigating the impacts of port parameter variations on the port performance. The variations are derived from actual data in order to provide more realistic results. The model is further developed using MATLAB and Simulink based on the queuing theory.
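    A minimal queuing-theory microsimulation of the kind described (the paper's model is built in MATLAB and Simulink) can be sketched in Python for a single berth; the exponential arrival and service distributions and all parameter values are assumed for illustration.

```python
import random

def simulate_port(n_ships, mean_interarrival, mean_service, seed=1):
    # Single-berth queue with stochastic (exponential) interarrival and
    # service times; returns average ship waiting time and berth utilization.
    rng = random.Random(seed)
    t_arrive = 0.0      # arrival time of the current ship
    berth_free = 0.0    # time at which the berth next becomes free
    total_wait = 0.0
    busy = 0.0
    for _ in range(n_ships):
        t_arrive += rng.expovariate(1.0 / mean_interarrival)
        start = max(t_arrive, berth_free)      # wait if the berth is occupied
        total_wait += start - t_arrive
        service = rng.expovariate(1.0 / mean_service)
        berth_free = start + service
        busy += service
    return total_wait / n_ships, busy / berth_free

avg_wait, utilization = simulate_port(5000, mean_interarrival=10.0, mean_service=6.0)
```

    Running many replications with parameter distributions drawn from observed data would yield the stochastic performance indicators the paper targets, rather than a single deterministic estimate.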

  16. Novel submicronized rebamipide liquid with moderate viscosity: significant effects on oral mucositis in animal models.

    Science.gov (United States)

    Nakashima, Takako; Sako, Nobutomo; Matsuda, Takakuni; Uematsu, Naoya; Sakurai, Kazushi; Ishida, Tatsuhiro

    2014-01-01

    This study aimed at developing a novel rebamipide liquid for an effective treatment of oral mucositis. The healing effects of a variety of liquids comprising submicronized rebamipide crystals were investigated using a rat cauterization-induced oral ulcer model. Whereas a 2% rebamipide liquid comprising micro-crystals did not exhibit a significant curative effect, 2% rebamipide liquids comprising submicronized crystals with moderate viscosities exhibited healing effects following intra-oral administration. The 2% and 4% optimized rebamipide liquids showed significant healing effects in the rat oral ulcer model, and the optimized rebamipide liquid significantly reduced the percent area of ulcerated injury. The rebamipide liquid with moderate viscosity, following intra-oral administration, showed both a better healing effect in the rat oral ulcer model and a preventive effect in the rat irradiation-induced glossitis model.

  17. The quest for significance model of radicalization: implications for the management of terrorist detainees.

    Science.gov (United States)

    Dugas, Michelle; Kruglanski, Arie W

    2014-01-01

    Radicalization and its culmination in terrorism represent a grave threat to the security and stability of the world. A related challenge is effective management of extremists who are detained in prison facilities. The major aim of this article is to review the significance quest model of radicalization and its implications for management of terrorist detainees. First, we review the significance quest model, which elaborates on the roles of motivation, ideology, and social processes in radicalization. Secondly, we explore the implications of the model in relation to the risks of prison radicalization. Finally, we analyze the model's implications for deradicalization strategies and review preliminary evidence for the effectiveness of a rehabilitation program targeting components of the significance quest. Based on this evidence, we argue that the psychology of radicalization provides compelling reason for the inclusion of deradicalization efforts as an essential component of the management of terrorist detainees. Copyright © 2014 John Wiley & Sons, Ltd.

  18. Analysing the temporal dynamics of model performance for hydrological models

    Directory of Open Access Journals (Sweden)

    D. E. Reusser

    2008-11-01

    Full Text Available The temporal dynamics of hydrological model performance gives insights into errors that cannot be obtained from global performance measures assigning a single number to the fit of a simulated time series to an observed reference series. These errors can include errors in data, model parameters, or model structure. Dealing with a set of performance measures evaluated at a high temporal resolution implies analyzing and interpreting a high dimensional data set. This paper presents a method for such a hydrological model performance assessment with a high temporal resolution and illustrates its application for two very different rainfall-runoff modeling case studies. The first is the Wilde Weisseritz case study, a headwater catchment in the eastern Ore Mountains, simulated with the conceptual model WaSiM-ETH. The second is the Malalcahuello case study, a headwater catchment in the Chilean Andes, simulated with the physics-based model Catflow. The proposed time-resolved performance assessment starts with the computation of a large set of classically used performance measures for a moving window. The key of the developed approach is a data-reduction method based on self-organizing maps (SOMs and cluster analysis to classify the high-dimensional performance matrix. Synthetic peak errors are used to interpret the resulting error classes. The final outcome of the proposed method is a time series of the occurrence of dominant error types. For the two case studies analyzed here, 6 such error types have been identified. They show clear temporal patterns which can lead to the identification of model structural errors.

  19. Analysing the temporal dynamics of model performance for hydrological models

    Directory of Open Access Journals (Sweden)

    E. Zehe

    2009-07-01

    Full Text Available The temporal dynamics of hydrological model performance gives insights into errors that cannot be obtained from global performance measures assigning a single number to the fit of a simulated time series to an observed reference series. These errors can include errors in data, model parameters, or model structure. Dealing with a set of performance measures evaluated at a high temporal resolution implies analyzing and interpreting a high dimensional data set. This paper presents a method for such a hydrological model performance assessment with a high temporal resolution and illustrates its application for two very different rainfall-runoff modeling case studies. The first is the Wilde Weisseritz case study, a headwater catchment in the eastern Ore Mountains, simulated with the conceptual model WaSiM-ETH. The second is the Malalcahuello case study, a headwater catchment in the Chilean Andes, simulated with the physics-based model Catflow. The proposed time-resolved performance assessment starts with the computation of a large set of classically used performance measures for a moving window. The key of the developed approach is a data-reduction method based on self-organizing maps (SOMs and cluster analysis to classify the high-dimensional performance matrix. Synthetic peak errors are used to interpret the resulting error classes. The final outcome of the proposed method is a time series of the occurrence of dominant error types. For the two case studies analyzed here, 6 such error types have been identified. They show clear temporal patterns, which can lead to the identification of model structural errors.
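    The first step of the approach described in these two records, computing performance measures over a moving window to obtain a time series of model performance, can be sketched as follows; the use of Nash-Sutcliffe efficiency as the single measure and the synthetic data are illustrative assumptions (the papers compute a large set of measures before SOM-based clustering).

```python
import math

def moving_window_nse(obs, sim, window):
    # Nash-Sutcliffe efficiency computed in a moving window, yielding a
    # time series of model performance (one value per window position).
    out = []
    for i in range(len(obs) - window + 1):
        o = obs[i:i + window]
        s = sim[i:i + window]
        mean_o = sum(o) / window
        denom = sum((x - mean_o) ** 2 for x in o)
        out.append(1.0 - sum((x - y) ** 2 for x, y in zip(o, s)) / denom)
    return out

obs = [math.sin(t / 5.0) + 2.0 for t in range(100)]                 # synthetic reference
sim = [o + (0.5 if t > 50 else 0.0) for t, o in enumerate(obs)]     # model degrades after t = 50
nse = moving_window_nse(obs, sim, window=20)
```

    The resulting time series localizes when performance degrades; clustering many such series for different measures is what yields the dominant error types.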

  20. High temperature furnace modeling and performance verifications

    Science.gov (United States)

    Smith, James E., Jr.

    1992-01-01

    Analytical, numerical, and experimental studies were performed on two classes of high temperature materials processing sources for their potential use as directional solidification furnaces. The research concentrated on a commercially available high temperature furnace using a zirconia ceramic tube as the heating element and an Arc Furnace based on a tube welder. The first objective was to assemble the zirconia furnace and construct the parts needed to successfully perform experiments. The second objective was to evaluate the zirconia furnace performance as a directional solidification furnace element. The third objective was to establish a database on materials used in the furnace construction, with particular emphasis on emissivities, transmissivities, and absorptivities as functions of wavelength and temperature. One-dimensional and two-dimensional spectral radiation heat transfer models were developed for comparison with standard modeling techniques and were used to predict wall and crucible temperatures. The fourth objective addressed the development of a SINDA model for the Arc Furnace, which was used to design sample holders and to estimate cooling media temperatures for the steady state operation of the furnace. Finally, the fifth objective addressed the initial performance evaluation of the Arc Furnace and associated equipment for directional solidification. Results of these objectives are presented.

  1. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, J.; Whitmore, J.; Kaffine, L.; Blair, N.; Dobos, A. P.

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.
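    The annualized prediction error reported here can be illustrated with a minimal sketch; the signed-percentage definition and the monthly numbers below are assumptions for illustration, not SAM's validation data.

```python
def annualized_error(predicted_kwh, measured_kwh):
    # Annualized prediction error as a signed percentage of measured production.
    return 100.0 * (sum(predicted_kwh) - sum(measured_kwh)) / sum(measured_kwh)

# Hypothetical monthly energy totals (kWh) for one fixed-tilt system
pred = [1200, 1350, 1600, 1800, 1950, 2000, 2050, 1900, 1700, 1500, 1250, 1150]
meas = [1180, 1400, 1580, 1850, 1900, 2050, 2000, 1950, 1650, 1480, 1300, 1100]
err = annualized_error(pred, meas)
```

    A fixed-tilt system meeting the report's threshold would show |err| below 3%, even though individual monthly deviations (e.g., from snow cover or outages) can be much larger.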

  2. Human visual performance model for crewstation design

    Science.gov (United States)

    Larimer, James; Prevost, Michael; Arditi, Aries; Azueta, Steven; Bergen, James; Lubin, Jeffrey

    1991-01-01

    An account is given of a Visibility Modeling Tool (VMT) which furnishes a crew-station designer with the means to assess configurational tradeoffs, with a view to the impact of various options on the unambiguous access of information to the pilot. The interactive interface of the VMT allows the manipulation of cockpit geometry, ambient lighting, pilot ergonomics, and the displayed symbology. Performance data can be displayed in the form of 3D contours into the crewstation graphic model, thereby yielding an indication of the operator's visual capabilities.

  3. Construction Of A Performance Assessment Model For Zakat Management Institutions

    Directory of Open Access Journals (Sweden)

    Sri Fadilah

    2016-12-01

    Full Text Available The objective of the research is to examine performance evaluation using the Balanced Scorecard model. The research is conducted due to a big gap existing between zakat (alms and religious tax in Islam), with its potential earnings of as much as 217 trillion rupiahs, and the realization of the collected zakat fund, which reaches only three trillion. This indicates that the performance of zakat management organizations in collecting zakat is still very low. On the other hand, the quantity and the quality of zakat management organizations have to be improved. This means that a performance evaluation model is needed as a tool to evaluate performance. The constructed model is a performance evaluation model that can be implemented in zakat management organizations. The organizational performance with the Balanced Scorecard evaluation model will be effective if it is supported by three aspects, namely: PI, BO and TQM. This research uses an explanatory method and the data analysis tool of SEM/PLS. Data collecting techniques are questionnaires, interviews and documentation. The result of this research shows that PI, BO and TQM simultaneously and partially give a significant effect on organizational performance.

  4. Teaching physical activities to students with significant disabilities using video modeling.

    Science.gov (United States)

    Cannella-Malone, Helen I; Mizrachi, Sharona V; Sabielny, Linsey M; Jimenez, Eliseo D

    2013-06-01

    The objective of this study was to examine the effectiveness of video modeling on teaching physical activities to three adolescents with significant disabilities. The study implemented a multiple baseline across six physical activities (three per student): jumping rope, scooter board with cones, ladder drill (i.e., feet going in and out), ladder design (i.e., multiple steps), shuttle run, and disc ride. Additional prompt procedures (i.e., verbal, gestural, visual cues, and modeling) were implemented within the study. After the students mastered the physical activities, we tested to see if they would link the skills together (i.e., complete an obstacle course). All three students made progress learning the physical activities, but only one learned them with video modeling alone (i.e., without error correction). Video modeling can be an effective tool for teaching students with significant disabilities various physical activities, though additional prompting procedures may be needed.

  5. Hybrid Modeling Improves Health and Performance Monitoring

    Science.gov (United States)

    2007-01-01

    Scientific Monitoring Inc. was awarded a Phase I Small Business Innovation Research (SBIR) project by NASA's Dryden Flight Research Center to create a new, simplified health-monitoring approach for flight vehicles and flight equipment. The project developed a hybrid physical model concept that provided a structured approach to simplifying complex design models for use in health monitoring, allowing the output or performance of the equipment to be compared to what the design models predicted, so that deterioration or impending failure could be detected before there would be an impact on the equipment's operational capability. Based on the original modeling technology, Scientific Monitoring released I-Trend, a commercial health- and performance-monitoring software product named for its intelligent trending, diagnostics, and prognostics capabilities, as part of the company's complete ICEMS (Intelligent Condition-based Equipment Management System) suite of monitoring and advanced alerting software. I-Trend uses the hybrid physical model to better characterize the nature of health or performance alarms that result in "no fault found" false alarms. Additionally, the use of physical principles helps I-Trend identify problems sooner. I-Trend technology is currently in use in several commercial aviation programs, and the U.S. Air Force recently tapped Scientific Monitoring to develop next-generation engine health-management software for monitoring its fleet of jet engines. Scientific Monitoring has continued the original NASA work, this time under a Phase III SBIR contract with a joint NASA-Pratt & Whitney aviation security program on propulsion-controlled aircraft under missile-damaged aircraft conditions.

  6. Multilevel Modeling of the Performance Variance

    Directory of Open Access Journals (Sweden)

    Alexandre Teixeira Dias

    2012-12-01

    Full Text Available Focusing on the identification of the role played by industry in the relations between corporate strategic factors and performance, the hierarchical multilevel modeling method was adopted for measuring and analyzing the relations between the variables that comprise each level of analysis. The adequacy of the multilevel perspective to the study of the proposed relations was identified, and the relative importance analysis points to the lower relevance of industry as a moderator of the effects of corporate strategic factors on performance when the latter was measured by means of return on assets, and shows that industry does not moderate the relations between corporate strategic factors and Tobin's Q. The main conclusions of the research are that the organization's choices in terms of corporate strategy exert a considerable influence and play a key role in the determination of the performance level, but that industry should be considered when analyzing the performance variation, despite its role as a moderator or not of the relations between corporate strategic factors and performance.

  7. A new model to estimate significant wave heights with ERS-1/2 scatterometer data

    Institute of Scientific and Technical Information of China (English)

    GUO Jie; HE Yijun; William Perrie; SHEN Hui; CHU Xiaoqing

    2009-01-01

    A new model is proposed to estimate significant wave heights with ERS-1/2 scatterometer data. The results show that the relationship between wave parameters and the radar backscattering cross section is similar to that between wind and the radar backscattering cross section. Therefore, the relationship between significant wave height and the radar backscattering cross section is established with a neural network algorithm. Specifically, the root mean square error of significant wave height retrieved from ERS-1/2 data is 0.51 m if the average wave period is ≤ 7 s, or 0.72 m if it is > 7 s.
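    The reported accuracy split by average wave period can be illustrated by computing root-mean-square error separately for the two period classes; all data pairs below are hypothetical.

```python
import math

def rmse(pairs):
    # Root-mean-square error over (retrieved, reference) value pairs.
    return math.sqrt(sum((a - b) ** 2 for a, b in pairs) / len(pairs))

# Hypothetical (retrieved, buoy-measured) significant wave heights (m),
# split by average wave period class
short_period = [(1.2, 1.0), (2.5, 2.1), (3.0, 3.4), (1.8, 1.5)]   # period <= 7 s
long_period = [(4.0, 3.2), (5.1, 4.5), (3.5, 4.2)]                # period > 7 s
rmse_short, rmse_long = rmse(short_period), rmse(long_period)
```

    In this toy data, as in the abstract's reported figures, retrieval error is larger for the longer-period class.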

  8. Assessment of performance of survival prediction models for cancer prognosis

    Directory of Open Access Journals (Sweden)

    Chen Hung-Chia

    2012-07-01

    Full Text Available Abstract Background Cancer survival studies are commonly analyzed using survival-time prediction models for cancer prognosis. A number of different performance metrics are used to ascertain the concordance between the predicted risk score of each patient and the actual survival time, but these metrics can sometimes conflict. Alternatively, patients are sometimes divided into two classes according to a survival-time threshold, and binary classifiers are applied to predict each patient’s class. Although this approach has several drawbacks, it does provide natural performance metrics such as positive and negative predictive values to enable unambiguous assessments. Methods We compare the survival-time prediction and survival-time threshold approaches to analyzing cancer survival studies. We review and compare common performance metrics for the two approaches. We present new randomization tests and cross-validation methods to enable unambiguous statistical inferences for several performance metrics used with the survival-time prediction approach. We consider five survival prediction models consisting of one clinical model, two gene expression models, and two models from combinations of clinical and gene expression models. Results A public breast cancer dataset was used to compare several performance metrics using five prediction models. (1) For some prediction models, the hazard ratio from fitting a Cox proportional hazards model was significant, but the two-group comparison was insignificant, and vice versa. (2) The randomization test and cross-validation were generally consistent with the p-values obtained from the standard performance metrics. (3) Binary classifiers highly depended on how the risk groups were defined; a slight change of the survival threshold for assignment of classes led to very different prediction results. Conclusions (1) Different performance metrics for evaluation of a survival prediction model may give different conclusions in
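    One common concordance-type metric for the survival-time prediction approach can be sketched for the uncensored, tie-free case as follows; the risk scores and survival times below are hypothetical.

```python
from itertools import combinations

def concordance_index(risk, time):
    # Fraction of usable patient pairs in which the higher predicted risk
    # corresponds to the shorter survival time (ties skipped; no censoring).
    concordant = usable = 0
    for i, j in combinations(range(len(risk)), 2):
        if time[i] == time[j] or risk[i] == risk[j]:
            continue
        usable += 1
        if (risk[i] > risk[j]) == (time[i] < time[j]):
            concordant += 1
    return concordant / usable

risk = [0.9, 0.7, 0.4, 0.2]   # hypothetical predicted risk scores
time = [12, 20, 35, 31]       # survival times in months
ci = concordance_index(risk, time)
```

    Handling censored observations (as real survival data requires) restricts the usable pairs further, which is one reason different concordance-style metrics can disagree.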

  9. Strategies for Testing Statistical and Practical Significance in Detecting DIF with Logistic Regression Models

    Science.gov (United States)

    Fidalgo, Angel M.; Alavi, Seyed Mohammad; Amirian, Seyed Mohammad Reza

    2014-01-01

    This study examines three controversial aspects in differential item functioning (DIF) detection by logistic regression (LR) models: first, the relative effectiveness of different analytical strategies for detecting DIF; second, the suitability of the Wald statistic for determining the statistical significance of the parameters of interest; and…

  10. Innovations in individual feature history management - The significance of feature-based temporal model

    Science.gov (United States)

    Choi, J.; Seong, J.C.; Kim, B.; Usery, E.L.

    2008-01-01

    A feature relies on three dimensions (space, theme, and time) for its representation. Even though spatiotemporal models have been proposed, they have principally focused on the spatial changes of a feature. In this paper, a feature-based temporal model is proposed to represent the changes of both space and theme independently. The proposed model modifies the ISO's temporal schema and adds a new explicit temporal relationship structure that stores the temporal topological relationship along with the ISO's temporal primitives of a feature in order to keep track of feature history. The explicit temporal relationship can enhance query performance on feature history by removing topological comparison during the query process. Further, a prototype system has been developed to test the proposed feature-based temporal model by querying land parcel history in Athens, Georgia. The result of the temporal query on individual feature history shows the efficiency of the explicit temporal relationship structure. © Springer Science+Business Media, LLC 2007.

  11. Modelling the predictive performance of credit scoring

    Directory of Open Access Journals (Sweden)

    Shi-Wei Shen

    2013-02-01

    Full Text Available Orientation: The article discussed the importance of rigour in credit risk assessment.Research purpose: The purpose of this empirical paper was to examine the predictive performance of credit scoring systems in Taiwan.Motivation for the study: Corporate lending remains a major business line for financial institutions. However, in light of the recent global financial crises, it has become extremely important for financial institutions to implement rigorous means of assessing clients seeking access to credit facilities.Research design, approach and method: Using a data sample of 10 349 observations drawn between 1992 and 2010, logistic regression models were utilised to examine the predictive performance of credit scoring systems.Main findings: A goodness-of-fit test demonstrated that credit scoring models that incorporated the Taiwan Corporate Credit Risk Index (TCRI, micro- and also macroeconomic variables possessed greater predictive power. This suggests that macroeconomic variables do have explanatory power for default credit risk.Practical/managerial implications: The originality in the study was that three models were developed to predict corporate firms’ defaults based on different microeconomic and macroeconomic factors such as the TCRI, asset growth rates, stock index and gross domestic product.Contribution/value-add: The study utilises different goodness-of-fit measures and receiver operating characteristics during the examination of the robustness of the predictive power of these factors.

  12. Human performance modeling for system of systems analytics: combat performance-shaping factors.

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, Craig R.; Miller, Dwight Peter

    2006-01-01

    The US military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives. To support this goal, Sandia National Laboratories (SNL) has undertaken a program of HPM as an integral augmentation to its system-of-system (SoS) analytics capabilities. The previous effort, reported in SAND2005-6569, evaluated the effects of soldier cognitive fatigue on SoS performance. The current effort began with a very broad survey of any performance-shaping factors (PSFs) that also might affect soldiers' performance in combat situations. The work included consideration of three different approaches to cognition modeling and how appropriate they would be for application to SoS analytics. The bulk of this report categorizes 47 PSFs into three groups (internal, external, and task-related) and provides brief descriptions of how each affects combat performance, according to the literature. The PSFs were then assembled into a matrix with 22 representative military tasks and assigned one of four levels of estimated negative impact on task performance, based on the literature. Blank versions of the matrix were then sent to two ex-military subject-matter experts to be filled out based on their personal experiences. Data analysis was performed to identify the consensus most influential PSFs. Results indicate that combat-related injury, cognitive fatigue, inadequate training, physical fatigue, thirst, stress, poor perceptual processing, and presence of chemical agents are among the PSFs with the most negative impact on combat performance.

  13. Performance Model and Sensitivity Analysis for a Solar Thermoelectric Generator

    Science.gov (United States)

    Rehman, Naveed Ur; Siddiqui, Mubashir Ali

    2017-01-01

    In this paper, a regression model for evaluating the performance of solar concentrated thermoelectric generators (SCTEGs) is established and the significance of contributing parameters is discussed in detail. The model is based on several natural, design and operational parameters of the system, including the thermoelectric generator (TEG) module and its intrinsic material properties, the connected electrical load, concentrator attributes, heat transfer coefficients, solar flux, and ambient temperature. The model is developed by fitting a response curve, using the least-squares method, to the results. The sample points for the model were obtained by simulating a thermodynamic model, also developed in this paper, over a range of values of input variables. These samples were generated employing the Latin hypercube sampling (LHS) technique using a realistic distribution of parameters. The coefficient of determination was found to be 99.2%. The proposed model is validated by comparing the predicted results with those in the published literature. In addition, based on the elasticity for parameters in the model, sensitivity analysis was performed and the effects of parameters on the performance of SCTEGs are discussed in detail. This research will contribute to the design and performance evaluation of any SCTEG system for a variety of applications.

  14. Performance Model and Sensitivity Analysis for a Solar Thermoelectric Generator

    Science.gov (United States)

    Rehman, Naveed Ur; Siddiqui, Mubashir Ali

    2017-03-01

    In this paper, a regression model for evaluating the performance of solar concentrated thermoelectric generators (SCTEGs) is established and the significance of contributing parameters is discussed in detail. The model is based on several natural, design and operational parameters of the system, including the thermoelectric generator (TEG) module and its intrinsic material properties, the connected electrical load, concentrator attributes, heat transfer coefficients, solar flux, and ambient temperature. The model is developed by fitting a response curve, using the least-squares method, to the results. The sample points for the model were obtained by simulating a thermodynamic model, also developed in this paper, over a range of values of input variables. These samples were generated employing the Latin hypercube sampling (LHS) technique using a realistic distribution of parameters. The coefficient of determination was found to be 99.2%. The proposed model is validated by comparing the predicted results with those in the published literature. In addition, based on the elasticity for parameters in the model, sensitivity analysis was performed and the effects of parameters on the performance of SCTEGs are discussed in detail. This research will contribute to the design and performance evaluation of any SCTEG system for a variety of applications.

  15. CASTOR detector Model, objectives and simulated performance

    CERN Document Server

    Angelis, Aris L S; Bartke, Jerzy; Bogolyubsky, M Yu; Chileev, K; Erine, S; Gladysz-Dziadus, E; Kharlov, Yu V; Kurepin, A B; Lobanov, M O; Maevskaya, A I; Mavromanolakis, G; Nicolis, N G; Panagiotou, A D; Sadovsky, S A; Wlodarczyk, Z

    2001-01-01

    We present a phenomenological model describing the formation and evolution of a Centauro fireball in the baryon-rich region in nucleus-nucleus interactions in the upper atmosphere and at the LHC. The small particle multiplicity and imbalance of electromagnetic and hadronic content characterizing a Centauro event and also the strongly penetrating particles (assumed to be strangelets) frequently accompanying them can be naturally explained. We describe the CASTOR calorimeter, a subdetector of the ALICE experiment dedicated to the search for Centauro in the very forward, baryon-rich region of central Pb+Pb collisions at the LHC. The basic characteristics and simulated performance of the calorimeter are presented. (22 refs).

  16. CASTOR detector. Model, objectives and simulated performance

    Energy Technology Data Exchange (ETDEWEB)

    Angelis, A. L. S.; Mavromanolakis, G.; Panagiotou, A. D. [University of Athens, Nuclear and Particle Physics Division, Athens (Greece); Aslanoglou, X.; Nicolis, N. [Ioannina Univ., Ioannina (Greece). Dept. of Physics; Bartke, J.; Gladysz-Dziadus, E. [Institute of Nuclear Physics, Cracow (Poland); Lobanov, M.; Erine, S.; Kharlov, Y.V.; Bogolyubsky, M.Y. [Institute for High Energy Physics, Protvino (Russian Federation); Kurepin, A.B.; Chileev, K. [Institute for Nuclear Research, Moscow (Russian Federation); Wlodarczyk, Z. [Pedagogical University, Institute of Physics, Kielce (Poland)

    2001-10-01

    A phenomenological model is presented, describing the formation and evolution of a Centauro fireball in the baryon-rich region in nucleus-nucleus interactions in the upper atmosphere and at the LHC. The small particle multiplicity and imbalance of electromagnetic and hadronic content characterizing a Centauro event, and also the strongly penetrating particles (assumed to be strangelets) frequently accompanying them, can be naturally explained. The CASTOR calorimeter is described, a subdetector of the ALICE experiment dedicated to the search for Centauro in the very forward, baryon-rich region of central Pb+Pb collisions at the LHC. The basic characteristics and simulated performance of the calorimeter are presented.

  17. A quantitative approach for assessing significant improvements in elite sprint performance: has IGF-1 entered the arena?

    Science.gov (United States)

    Ernst, Simon; Simon, Perikles

    2013-06-01

    The introduction of doping substances and methods in sports triggers noticeable effects on physical performance in metric sports. Here, we use time series analysis to investigate the recent development in male and female elite sprinting performance. Time series displaying the average of the world's top 20 athletes were analyzed employing polynomial spline functions and moving averages. Outstanding changes in performance over time were statistically analyzed by Welch's t-test and by Cohen's measures of effect size. For validation, we show as examples that our analysis is capable of indicating the effect of the introduction of in- and out-of-competition doping testing on women's shot put, as well as the effects of the market introduction of erythropoietin (EPO) and the introduction of EPO and continuous erythropoiesis receptor activator (CERA) testing on 5000 m top 20 male performances. Time series analysis for the 100 m men reveals a highly significant (p < 0.001) drop by more than 0.1 s from 2006 to 2011, with a large effect size of 0.952. This is roughly half of the effect size found for the development of the 5000 m performance during the introduction of EPO between 1991 and 1996. While the men's 200 m sprinting performance shows a similar development, the women's 100 m and 200 m sprinting performances show only some minor abnormalities. We discuss here why the striking sex-specific improvement in sprinting performance is indicative of a novel, very effective doping procedure, with insulin-like growth factor-1 (IGF-1) being the primary candidate explaining the observed effects.
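    The statistical machinery named in the abstract (Welch's t-test and Cohen's effect size) can be sketched as follows, with entirely synthetic "top-20 average" times standing in for the study's data:

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic annual top-20 average 100 m times before and after a
    # suspected transition year (values are hypothetical, not the study's).
    rng = np.random.default_rng(1)
    before = rng.normal(10.05, 0.04, 20)
    after = rng.normal(9.95, 0.04, 20)

    # Welch's t-test: two-sample t-test without assuming equal variances
    t, p = stats.ttest_ind(before, after, equal_var=False)

    # Cohen's d with a pooled standard deviation
    pooled_sd = np.sqrt((before.var(ddof=1) + after.var(ddof=1)) / 2)
    d = (before.mean() - after.mean()) / pooled_sd
    print(f"t = {t:.2f}, p = {p:.4f}, Cohen's d = {d:.2f}")
    ```

    A 0.1 s shift against a ~0.04 s spread yields a very small p-value and a large effect size, the same pattern of evidence the abstract interprets.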

  18. Computer modeling of thermoelectric generator performance

    Science.gov (United States)

    Chmielewski, A. B.; Shields, V.

    1982-01-01

    Features of the DEGRA 2 computer code for simulating the operations of a spacecraft thermoelectric generator are described. The code models the physical processes occurring during operation. Input variables include the thermoelectric couple geometry and composition, the thermoelectric materials' properties, interfaces and insulation in the thermopile, the heat source characteristics, mission trajectory, and generator electrical requirements. Time steps can be specified and sublimation of the leg and hot shoe is accounted for, as are shorts between legs. Calculations are performed for conduction, Peltier, Thomson, and Joule heating; the cold junction can be adjusted for solar radiation; and the legs of the thermoelectric couple are segmented to enhance the approximation accuracy. A trial run covering 18 couple modules yielded data with 0.3% accuracy with regard to test data. The model has been successful with selenide materials, SiGe, and SiN4, with output of all critical operational variables.

  19. Mantis: Predicting System Performance through Program Analysis and Modeling

    CERN Document Server

    Chun, Byung-Gon; Lee, Sangmin; Maniatis, Petros; Naik, Mayur

    2010-01-01

    We present Mantis, a new framework that automatically predicts program performance with high accuracy. Mantis integrates techniques from programming languages and machine learning for performance modeling, and is a radical departure from traditional approaches. Mantis extracts program features, which are information about program execution runs, through program instrumentation. It uses machine learning techniques to select the features relevant to performance and creates prediction models as a function of the selected features. Through program analysis, it then generates compact code slices that compute these feature values for prediction. Our evaluation shows that Mantis can achieve more than 93% accuracy with less than 10% of the data used for training, a significant improvement over models that are oblivious to program features. The generated code slices compute feature values cheaply.
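    The feature-selection-plus-model-fitting idea can be sketched as below. This is not Mantis's actual algorithm: the feature set, the correlation-threshold selection rule, and the linear predictor are illustrative stand-ins for its instrumentation and machine-learning components.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n, n_feat = 300, 10
    # Synthetic instrumented features (e.g. loop counts, branch counts)
    features = rng.uniform(0, 1, size=(n, n_feat))
    # Hypothetical runtime driven by features 0 and 3 plus noise
    runtime = 5.0 * features[:, 0] + 2.0 * features[:, 3] + rng.normal(0, 0.1, n)

    # Select features whose absolute correlation with runtime is substantial
    corr = np.array([abs(np.corrcoef(features[:, k], runtime)[0, 1])
                     for k in range(n_feat)])
    selected = np.flatnonzero(corr > 0.3)

    # Fit a linear prediction model on the selected features only
    A = np.column_stack([np.ones(n), features[:, selected]])
    coef, *_ = np.linalg.lstsq(A, runtime, rcond=None)
    pred = A @ coef
    acc = 1 - np.mean(np.abs(pred - runtime)) / runtime.mean()
    print("selected features:", list(selected), f"accuracy ~ {acc:.1%}")
    ```

    The selection step recovers exactly the two informative features, and the resulting model predicts runtime with high relative accuracy, illustrating why feature-aware models outperform feature-oblivious ones.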

  20. Using animal models to determine the significance of complement activation in Alzheimer's disease

    Directory of Open Access Journals (Sweden)

    Loeffler David A

    2004-10-01

    Complement activation is a major inflammatory mechanism whose function is to promote the removal of microorganisms and the processing of immune complexes. Numerous studies have provided evidence for an increase in this process in areas of pathology in the Alzheimer's disease (AD) brain. Because complement activation proteins have been demonstrated in vitro to exert both neuroprotective and neurotoxic effects, the significance of this process in the development and progression of AD is unclear. Studies in animal models of AD, in which brain complement activation can be experimentally altered, should be of value for clarifying this issue. However, surprisingly little is known about complement activation in the transgenic animal models that are popular for studying this disorder. An optimal animal model for studying the significance of complement activation on Alzheimer's-related neuropathology should have complete complement activation associated with senile plaques, neurofibrillary tangles (if present), and dystrophic neurites. Other desirable features include both classical and alternative pathway activation, increased neuronal synthesis of native complement proteins, and evidence for an increase in complement activation prior to the development of extensive pathology. In order to determine the suitability of different animal models for studying the role of complement activation in AD, the extent of complement activation and its association with neuropathology in these models must be understood.

  1. Computational model of sustained acceleration effects on human cognitive performance.

    Science.gov (United States)

    McKinley, Richard A; Gallimore, Jennie J

    2013-08-01

    Extreme acceleration maneuvers encountered in modern agile fighter aircraft can wreak havoc on human physiology, thereby significantly influencing cognitive task performance. As oxygen content declines under acceleration stress, the activity of high-order cortical tissue reduces to ensure sufficient metabolic resources are available for critical life-sustaining autonomic functions. Consequently, cognitive abilities reliant on these affected areas suffer significant performance degradations. The goal was to develop and validate a model capable of predicting human cognitive performance under acceleration stress. Development began with creation of a proportional control cardiovascular model that produced predictions of several hemodynamic parameters, including eye-level blood pressure and regional cerebral oxygen saturation (rSo2). An algorithm was derived to relate changes in rSo2 within specific brain structures to performance on cognitive tasks that require engagement of different brain areas. Data from the "precision timing" experiment were then used to validate the model predicting cognitive performance as a function of the G(z) profile. Results, reported as value ranges, showed high agreement between the measured and predicted values for the rSo2 model (correlation coefficient: 0.7483-0.8687; linear best-fit slope: 0.5760-0.9484; mean percent error: 0.75-3.33) and the cognitive performance models (motion inference task: correlation coefficient 0.7103-0.9451, linear best-fit slope 0.7416-0.9144, mean percent error 6.35-38.21; precision timing task: correlation coefficient 0.6856-0.9726, linear best-fit slope 0.5795-1.027, mean percent error 6.30-17.28). The evidence suggests that the model is capable of accurately predicting cognitive performance on simple tasks under high acceleration stress.
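    The three agreement metrics reported above (correlation coefficient, linear best-fit slope, mean percent error) can be computed as in the sketch below; the measured and predicted rSo2-style series are hypothetical, chosen only to illustrate the calculation.

    ```python
    import numpy as np

    # Hypothetical measured vs. predicted series (e.g. rSo2 in %)
    measured = np.array([60.0, 55.0, 48.0, 42.0, 40.0, 45.0, 52.0])
    predicted = np.array([58.5, 54.0, 49.0, 43.5, 41.0, 44.0, 50.5])

    # Pearson correlation coefficient between the two series
    r = np.corrcoef(measured, predicted)[0, 1]
    # Slope of the linear best fit of predicted against measured
    slope = np.polyfit(measured, predicted, 1)[0]
    # Mean percent error of the predictions
    mpe = np.mean(np.abs(predicted - measured) / measured) * 100
    print(f"r = {r:.4f}, slope = {slope:.4f}, MPE = {mpe:.2f}%")
    ```

    Values near r = 1, slope = 1 and small MPE indicate a well-calibrated model; the ranges quoted in the abstract are exactly these quantities evaluated across subjects and conditions.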

  2. Serum NX-DCP as a New Noninvasive Model to Predict Significant Liver Fibrosis in Chronic Hepatitis C.

    Science.gov (United States)

    Saito, Masaya; Yano, Yoshihiko; Hirano, Hirotaka; Momose, Kenji; Yoshida, Masaru; Azuma, Takeshi

    2015-02-01

    Finding a noninvasive method to predict liver fibrosis using inexpensive and easy-to-use markers is important. We aimed to clarify whether NX-des-γ-carboxyprothrombin (NX-DCP) could become a new noninvasive model to predict liver fibrosis in hepatitis C virus (HCV) related liver disease. We performed a prospective cohort study on a consecutive group of 101 patients who underwent liver biopsy for HCV-related liver disease at Kobe University Hospital. Laboratory measurements were performed on the same day as the biopsy. Factors associated with significant fibrosis (F3-4) were assessed by multivariate analyses. A comparison of predictive ability between multivariate factors and abovementioned noninvasive models was also performed. Increase in serum NX-DCP was significantly related to increase in fibrosis stage (P = 0.006). Moreover, NX-DCP was a multivariate factor associated with the presence of significant fibrosis F 3-4 (median 21 of F0-2 group vs. median 22 of F3-4 group with P = 0.002). The AUC of NX-DCP showed no significant differences compared with those of the AST-to-platelet ratio index (APRI), modified-APRI, the Göteborg University Cirrhosis Index (GUCI), the Lok index, the Hui score, cirrhosis discriminating score (CDS) and the Pohl score (P > 0.05). NX-DCP correlated positively with fibrosis stage and could discriminate well between HCV-related patients with or without significant fibrosis. Moreover, NX-DCP had a similar predictive ability to the abovementioned models, and thereby could be a new noninvasive prediction tool for fibrosis.

  3. A more robust model of the biodiesel reaction, allowing identification of process conditions for significantly enhanced rate and water tolerance.

    Science.gov (United States)

    Eze, Valentine C; Phan, Anh N; Harvey, Adam P

    2014-03-01

    A more robust kinetic model of base-catalysed transesterification than the conventional reaction scheme has been developed. All the relevant reactions in the base-catalysed transesterification of rapeseed oil (RSO) to fatty acid methyl ester (FAME) were investigated experimentally, and validated numerically in a model implemented using MATLAB. It was found that including the saponification of RSO and FAME side reactions and hydroxide-methoxide equilibrium data explained various effects that are not captured by simpler conventional models. Both the experiments and the modelling showed that the "biodiesel reaction" can reach the desired level of conversion (>95%) in less than 2 min. Given the right set of conditions, the transesterification can reach over 95% conversion before the saponification losses become significant. This means that the reaction must be performed in a reactor exhibiting good mixing and good control of residence time, and the reaction mixture must be quenched rapidly as it leaves the reactor.
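    A drastically simplified version of such a kinetic model can be integrated numerically as below. This collapses the full scheme into one reversible transesterification step plus a slow saponification side reaction; all rate constants and initial concentrations are hypothetical placeholders, not the paper's fitted values.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Hypothetical rate constants, L/(mol*min)
    k_f, k_r, k_s = 2.0, 0.05, 0.01

    def rates(t, y):
        tg, meoh, fame, gl, oh = y
        r1 = k_f * tg * meoh - k_r * fame * gl   # net transesterification
        r2 = k_s * fame * oh                     # saponification of FAME
        return [-r1, -3 * r1, 3 * r1 - r2, r1, -r2]

    # mol/L: triglyceride, methanol (6:1 ratio), FAME, glycerol, hydroxide
    y0 = [1.0, 6.0, 0.0, 0.0, 0.2]
    sol = solve_ivp(rates, (0, 2), y0, rtol=1e-8)

    conversion = 1 - sol.y[0, -1] / y0[0]
    print(f"TG conversion after 2 min: {conversion:.1%}")
    ```

    Even this toy scheme reproduces the qualitative behaviour in the abstract: conversion exceeds 95% within about 2 min while the slow saponification term has consumed only a negligible amount of FAME, motivating rapid quenching at the reactor exit.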

  4. Optical Performance Modeling of FUSE Telescope Mirror

    Science.gov (United States)

    Saha, Timo T.; Ohl, Raymond G.; Friedman, Scott D.; Moos, H. Warren

    2000-01-01

    We describe the Metrology Data Processor (METDAT), the Optical Surface Analysis Code (OSAC), and their application to the image evaluation of the Far Ultraviolet Spectroscopic Explorer (FUSE) mirrors. The FUSE instrument, designed and developed by the Johns Hopkins University and launched in June 1999, is an astrophysics satellite which provides high resolution spectra (lambda/Delta(lambda) = 20,000 - 25,000) in the wavelength region from 90.5 to 118.7 nm. The FUSE instrument comprises four co-aligned, normal incidence, off-axis parabolic mirrors, four Rowland circle spectrograph channels with holographic gratings, and delay line microchannel plate detectors. The OSAC code provides a comprehensive analysis of optical system performance, including the effects of optical surface misalignments, low spatial frequency deformations described by discrete polynomial terms, mid- and high-spatial frequency deformations (surface roughness), and diffraction due to the finite size of the aperture. Both normal incidence (traditionally infrared, visible, and near ultraviolet mirror systems) and grazing incidence (x-ray mirror systems) systems can be analyzed. The code also properly accounts for reflectance losses on the mirror surfaces. Low frequency surface errors are described in OSAC by using Zernike polynomials for normal incidence mirrors and Legendre-Fourier polynomials for grazing incidence mirrors. The scatter analysis of the mirror is based on scalar scatter theory. The program accepts simple autocovariance (ACV) function models or power spectral density (PSD) models derived from mirror surface metrology data as input to the scatter calculation. The end product of the program is a user-defined pixel array containing the system Point Spread Function (PSF). The METDAT routine is used in conjunction with the OSAC program. This code reads in laboratory metrology data in a normalized format. The code then fits the data using Zernike polynomials for normal incidence

  5. Food restriction alters salivary cortisol and α-amylase responses to a simulated weightlifting competition without significant performance modification.

    Science.gov (United States)

    Durguerian, Alexandre; Filaire, Edith; Drogou, Catherine; Bougard, Clément; Chennaoui, Mounir

    2017-05-05

    The aim of this investigation was to evaluate the effect of a 6-day food restriction period on the physiological responses and performance of 11 high-level weightlifters. After a period of weight maintenance (T2), they were assigned into two groups depending on whether they lost (Diet group, n = 6) or maintained their body weight (Control group, n = 5) during the course of those 6 days. An evaluation of performance and the measurement of salivary cortisol concentrations and salivary α-amylase (sAA) activity were performed during a simulated weightlifting competition which took place at T2, after a 6-day period of food restriction (T3). Dietary data were collected using a 6-day diet record. We noted a 41.8% decrease in mean energy intake during the dietary restriction period, leading to a 4.34% weight loss for the Diet group. Dietary restriction did not modify absolute performance levels, whilst a significant improvement was noted for the Control group. Furthermore, we noted a response of decreased salivary cortisol and increased sAA activity to the simulated competition stress at T3 for the Diet group. These results may indicate that dietary reduction led to a dissociation of the hypothalamo-pituitary-adrenal axis and the sympatho-adreno-medullary system, which could impair training adaptations and absolute performance development.

  6. The COD Model: Simulating Workgroup Performance

    Science.gov (United States)

    Biggiero, Lucio; Sevi, Enrico

    Though the question of the determinants of workgroup performance is one of the most central in organization science, precise theoretical frameworks and formal demonstrations are still missing. In order to fill this gap, the COD agent-based simulation model is presented here and used to study the effects of task interdependence and bounded rationality on workgroup performance. The first relevant finding is an algorithmic demonstration of the ordering of interdependencies in terms of complexity, showing that the parallel mode is the simplest, followed by the sequential and then by the reciprocal. This result is far from new in organization science, but what is remarkable is that it now has the strength of an algorithmic demonstration instead of resting on the authoritativeness of some scholar or on some episodic empirical finding. The second important result is that the progressive introduction of realistic limits to agents' rationality dramatically reduces workgroup performance and leads to a rather interesting result: when agents' rationality is severely bounded, simple norms work better than complex norms. The third main finding is that when the complexity of interdependence is high, the appropriate coordination mechanism is agents' direct and active collaboration, which means teamwork.

  7. Human performance modeling for system of systems analytics :soldier fatigue.

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, Craig R.; Campbell, James E.; Miller, Dwight Peter

    2005-10-01

    The military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives, as can be seen in the Department of Defense's (DoD) Defense Modeling and Simulation Office's (DMSO) Master Plan (DoD 5000.59-P 1995). Toward this goal, the military is currently spending millions of dollars on programs devoted to HPM in various military contexts. Examples include the Human Performance Modeling Integration (HPMI) program within the Air Force Research Laboratory, which focuses on integrating HPMs with constructive models of systems (e.g. cockpit simulations), and the Navy's Human Performance Center (HPC), established in September 2003. Nearly all of these initiatives focus on the interface between humans and a single system. This is insufficient in the era of highly complex, network-centric SoS. This report presents research and development in the area of HPM in a system-of-systems (SoS) context. Specifically, this report addresses modeling soldier fatigue and the potential impacts soldier fatigue can have on SoS performance.

  8. Performance Management: A model and research agenda

    NARCIS (Netherlands)

    D.N. den Hartog (Deanne); J.P.P.E.F. Boselie (Paul); J. Paauwe (Jaap)

    2004-01-01

    Performance Management deals with the challenge organizations face in defining, measuring and stimulating employee performance with the ultimate goal to improve organizational performance. Thus, Performance Management involves multiple levels of analysis and is clearly linked to the topic

  9. Performance on a computerized shopping task significantly predicts real world functioning in persons diagnosed with bipolar disorder.

    Science.gov (United States)

    Laloyaux, Julien; Pellegrini, Nadia; Mourad, Haitham; Bertrand, Hervé; Domken, Marc-André; Van der Linden, Martial; Larøi, Frank

    2013-12-15

    Persons diagnosed with bipolar disorder often suffer from cognitive impairments. However, little is known concerning how these cognitive deficits impact their real world functioning. We developed a computerized real-life activity task, where participants are required to shop for a list of grocery store items. Twenty one individuals diagnosed with bipolar disorder and 21 matched healthy controls were administered the computerized shopping task. Moreover, the patient group was assessed with a battery of cognitive tests and clinical scales. Performance on the shopping task significantly differentiated patients and healthy controls for two variables: Total time to complete the shopping task and Mean time spent to consult the shopping list. Moreover, in the patient group, performance on these variables from the shopping task correlated significantly with cognitive functioning (i.e. processing speed, verbal episodic memory, planning, cognitive flexibility, and inhibition) and with clinical variables including duration of illness and real world functioning. Finally, variables from the shopping task were found to significantly explain 41% of real world functioning of patients diagnosed with bipolar disorder. These findings suggest that the shopping task provides a good indication of real world functioning and cognitive functioning of persons diagnosed with bipolar disorder. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  10. DKIST Polarization Modeling and Performance Predictions

    Science.gov (United States)

    Harrington, David

    2016-05-01

    Calibrating the Mueller matrices of large aperture telescopes and associated coude instrumentation requires astronomical sources and several modeling assumptions to predict the behavior of the system polarization with field of view, altitude, azimuth and wavelength. The Daniel K Inouye Solar Telescope (DKIST) polarimetric instrumentation requires very high accuracy calibration of a complex coude path with an off-axis f/2 primary mirror, time dependent optical configurations and substantial field of view. Polarization predictions across a diversity of optical configurations, tracking scenarios, slit geometries and vendor coating formulations are critical to both construction and continued operations efforts. Recent daytime sky based polarization calibrations of the 4m AEOS telescope and HiVIS spectropolarimeter on Haleakala have provided system Mueller matrices over full telescope articulation for a 15-reflection coude system. AEOS and HiVIS are a DKIST analog with a many-fold coude optical feed and similar mirror coatings creating 100% polarization cross-talk with altitude, azimuth and wavelength. Polarization modeling predictions using Zemax have successfully matched the altitude-azimuth-wavelength dependence on HiVIS within the few percent amplitude limitations of several instrument artifacts. Polarization predictions for coude beam paths depend greatly on modeling the angle-of-incidence dependences in powered optics and the mirror coating formulations. A 6 month HiVIS daytime sky calibration plan has been analyzed for accuracy under a wide range of sky conditions and data analysis algorithms. Predictions of polarimetric performance for the DKIST first-light instrumentation suite have been created under a range of configurations. These new modeling tools and polarization predictions have substantial impact for the design, fabrication and calibration process in the presence of manufacturing issues, science use-case requirements and ultimate system calibration
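    The origin of polarization cross-talk in a many-fold mirror train can be illustrated by chaining per-mirror Mueller matrices with rotations between reflections, as sketched below. The fold angles, diattenuation (0.02) and retardance (170 deg) are hypothetical placeholders, not DKIST or AEOS coating data, and the mirror matrix uses a simplified small-diattenuation form.

    ```python
    import numpy as np

    def rotator(a):
        # Mueller rotation matrix for a frame rotation by angle a
        c, s = np.cos(2 * a), np.sin(2 * a)
        return np.array([[1, 0, 0, 0],
                         [0, c, s, 0],
                         [0, -s, c, 0],
                         [0, 0, 0, 1]])

    def mirror(d, delta):
        # Simplified Mueller matrix of a weakly diattenuating (d),
        # retarding (delta) metallic reflection
        return np.array([[1, d, 0, 0],
                         [d, 1, 0, 0],
                         [0, 0, -np.cos(delta), -np.sin(delta)],
                         [0, 0, np.sin(delta), -np.cos(delta)]])

    # Chain three folds with hypothetical relative rotation angles
    M = np.eye(4)
    for angle in np.deg2rad([0.0, 45.0, 30.0]):
        M = mirror(0.02, np.deg2rad(170.0)) @ rotator(angle) @ M

    stokes_in = np.array([1.0, 1.0, 0.0, 0.0])   # 100% Q-polarized input
    stokes_out = M @ stokes_in
    print("output Stokes vector:", np.round(stokes_out, 3))
    ```

    Even with near-ideal mirrors, the rotated folds convert a pure Q input into substantial U and V components, which is the altitude-azimuth-dependent cross-talk the calibration campaign must characterize.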

  11. Numerical modeling capabilities to predict repository performance

    Energy Technology Data Exchange (ETDEWEB)

    1979-09-01

    This report presents a summary of current numerical modeling capabilities that are applicable to the design and performance evaluation of underground repositories for the storage of nuclear waste. The report includes codes that are available in-house within Golder Associates and Lawrence Livermore Laboratories, as well as those that are generally available within the industry and universities. The first listing of programs covers in-house codes in the subject areas of hydrology, solute transport, thermal and mechanical stress analysis, and structural geology. The second listing of programs is divided by subject into the following categories: site selection, structural geology, mine structural design, mine ventilation, hydrology, and mine design/construction/operation. These programs are not specifically designed for use in the design and evaluation of an underground repository for nuclear waste, but several or most of them may be so used.

  12. Performance model to predict overall defect density

    Directory of Open Access Journals (Sweden)

    J Venkatesh

    2012-08-01

    Management by metrics is the expectation from IT service providers to stay differentiated. Given a project and its associated parameters and dynamics, the behaviour and outcome need to be predicted. There is a lot of focus on the end state and on minimizing defect leakage as much as possible. In most cases, the actions taken are reactive: it is too late in the life cycle, and root cause analysis and corrective actions can be implemented only to the benefit of the next project. The focus has to shift left, towards the execution phase, rather than waiting for lessons to be learnt after implementation. How do we proactively predict defect metrics and have a preventive action plan in place? This paper illustrates a process performance model to predict overall defect density based on data from projects in an organization.

  13. Significance of settling model structures and parameter subsets in modelling WWTPs under wet-weather flow and filamentous bulking conditions.

    Science.gov (United States)

    Ramin, Elham; Sin, Gürkan; Mikkelsen, Peter Steen; Plósz, Benedek Gy

    2014-10-15

    Current research focuses on predicting and mitigating the impacts of high hydraulic loadings on centralized wastewater treatment plants (WWTPs) under wet-weather conditions. The maximum permissible inflow to WWTPs depends not only on the settleability of activated sludge in secondary settling tanks (SSTs) but also on the hydraulic behaviour of SSTs. The present study investigates the impacts of ideal and non-ideal flow (dry and wet weather) and settling (good settling and bulking) boundary conditions on the sensitivity of WWTP model outputs to uncertainties intrinsic to the one-dimensional (1-D) SST model structures and parameters. We identify the critical sources of uncertainty in WWTP models through global sensitivity analysis (GSA) using the Benchmark simulation model No. 1 in combination with first- and second-order 1-D SST models. The results obtained illustrate that the contribution of settling parameters to the total variance of the key WWTP process outputs significantly depends on the influent flow and settling conditions. The magnitude of the impact is found to vary, depending on which type of 1-D SST model is used. Therefore, we identify and recommend potential parameter subsets for WWTP model calibration, and propose optimal choice of 1-D SST models under different flow and settling boundary conditions. Additionally, the hydraulic parameters in the second-order SST model are found significant under dynamic wet-weather flow conditions. These results highlight the importance of developing a more mechanistic based flow-dependent hydraulic sub-model in second-order 1-D SST models in the future.
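    One common screening step in global sensitivity analysis (GSA) of wastewater models can be sketched with standardized regression coefficients (SRCs), whose squares approximate first-order variance contributions for near-linear models. The three-parameter "model" below is a hypothetical stand-in, not a 1-D SST model, and the sensitivity structure is invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 2000
    # Monte Carlo sample of three model parameters (e.g. settling parameters)
    theta = rng.uniform(0.5, 1.5, size=(n, 3))

    def model_output(p):
        # Hypothetical response: strongly sensitive to p0, weakly to p2
        return (3.0 * p[:, 0] + 0.5 * p[:, 1] + 0.1 * p[:, 2]
                + rng.normal(0, 0.05, len(p)))

    y = model_output(theta)

    # SRC_i = b_i * std(theta_i) / std(y), from a multivariate linear fit
    A = np.column_stack([np.ones(n), theta])
    b = np.linalg.lstsq(A, y, rcond=None)[0][1:]
    src = b * theta.std(axis=0) / y.std()
    print("SRC^2 (approx. variance contributions):", np.round(src ** 2, 3))
    ```

    Ranking parameters by SRC^2 identifies the subset that dominates output variance, which is the kind of information used above to propose parameter subsets for WWTP model calibration.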

  14. Significant Term List Based Metadata Conceptual Mining Model for Effective Text Clustering

    Directory of Open Access Journals (Sweden)

    J. Janet

    2012-01-01

    As the engineering world is growing fast, the use of data in the day-to-day activity of the engineering industry is also growing rapidly. Data mining is now very helpful for handling huge data stores and finding the hidden knowledge within them. Text mining, network mining, multimedia mining, and trend analysis are a few applications of data mining. In text mining, a variety of methods have been proposed by many researchers; even so, high precision and better recall are still critical issues. In this study, text mining is the focus and a conceptual mining model is applied for improved clustering. The proposed work, termed the Metadata Conceptual Mining Model (MCMM), is validated with data sets from world-leading technical digital libraries such as IEEE, ACM and Scopus. Performance, expressed as precision and recall, is described in terms of Entropy and F-Measure, which are calculated and compared with an existing term-based model and a concept-based mining model.

  15. Performance potential for simulating spin models on GPU

    CERN Document Server

    Weigel, Martin

    2011-01-01

    Graphics processing units (GPUs) are increasingly being used for general computational purposes. This development is motivated by their theoretical peak performance, which significantly exceeds that of broadly available CPUs. For practical purposes, however, it is far from clear how much of this theoretical performance can be realized in actual scientific applications. As is discussed here for the case of studying classical spin models of statistical mechanics by Monte Carlo simulations, only an explicit tailoring of the involved algorithms to the specific architecture under consideration allows one to harvest the computational power of GPU systems. A number of examples, ranging from Metropolis simulations of ferromagnetic Ising models, over continuous Heisenberg and disordered spin-glass systems, to parallel-tempering simulations are discussed. Significant speed-ups by factors of up to 1000 compared to serial CPU code as well as previous GPU implementations are observed.
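    For reference, a serial CPU version of the Metropolis algorithm for the 2-D ferromagnetic Ising model is sketched below; on a GPU, the same sweep is typically reorganized into checkerboard sublattice updates so that non-interacting spins can be updated in parallel. Lattice size, temperature and sweep count here are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    L = 32
    beta = 0.6                        # inverse temperature, ordered phase
    spins = np.ones((L, L), dtype=int)   # start from the ordered state

    def metropolis_sweep(s, beta):
        # One full lattice sweep of single-spin Metropolis updates
        for i in range(L):
            for j in range(L):
                nb = (s[(i + 1) % L, j] + s[(i - 1) % L, j]
                      + s[i, (j + 1) % L] + s[i, (j - 1) % L])
                dE = 2 * s[i, j] * nb
                if dE <= 0 or rng.random() < np.exp(-beta * dE):
                    s[i, j] *= -1

    for _ in range(200):
        metropolis_sweep(spins, beta)

    m = abs(spins.mean())             # magnetization per spin
    print(f"|m| = {m:.3f}")
    ```

    The nested Python loops make the serial cost obvious; it is exactly this inner update that GPU implementations restructure (checkerboard decomposition, per-thread random number generators) to obtain the large speed-ups reported above.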

  16. Performance potential for simulating spin models on GPU

    Science.gov (United States)

    Weigel, Martin

    2012-04-01

    Graphics processing units (GPUs) are increasingly being used for general computational purposes. This development is motivated by their theoretical peak performance, which significantly exceeds that of broadly available CPUs. For practical purposes, however, it is far from clear how much of this theoretical performance can be realized in actual scientific applications. As is discussed here for the case of studying classical spin models of statistical mechanics by Monte Carlo simulations, only an explicit tailoring of the involved algorithms to the specific architecture under consideration allows one to harvest the computational power of GPU systems. A number of examples are discussed, ranging from Metropolis simulations of ferromagnetic Ising models through continuous Heisenberg and disordered spin-glass systems to parallel-tempering simulations. Significant speed-ups, by factors of up to 1000 compared to serial CPU code as well as previous GPU implementations, are observed.

  17. Reactive puff model SCICHEM: Model enhancements and performance studies

    Science.gov (United States)

    Chowdhury, B.; Karamchandani, P. K.; Sykes, R. I.; Henn, D. S.; Knipping, E.

    2015-09-01

    The SCICHEM model incorporates complete gas phase, aqueous and aerosol phase chemistry within a state-of-the-science Gaussian puff model, SCIPUFF (Second-order Closure Integrated Puff). The model is a valuable tool that can be used to calculate the impacts of a single source or a small number of sources on downwind ozone and PM2.5. The model has flexible data requirements: it can be run with routine surface and upper air observations or with prognostic meteorological model outputs, and source emissions are specified in a simple text format. This paper describes significant advances to the dispersion and chemistry components of the model in the latest release, SCICHEM 3.0. Some of the major advancements include modeling of skewed turbulence for the convective boundary layer and updated chemistry schemes (the CB05 gas phase chemical mechanism and the AERO5 aerosol and aqueous modules). The results from SCICHEM 3.0 are compared with observations from a tracer study as well as aircraft measurements of reactive species in power plant plumes from two field studies. The results for the tracer experiment (the Copenhagen study) show that the incorporation of skewed turbulence improves the calculation of tracer dispersion and transport. The comparisons with the Cumberland and Dolet Hills power plant plume measurements show good correlation between the observed and predicted concentrations of reactive gaseous species at most downwind distances from the source.
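
The Gaussian puff kernel underlying models of this family can be illustrated with the textbook formula for a single puff of mass q centred at (xc, 0, 0) with spreads (sx, sy, sz). This sketch is generic; it does not reproduce SCIPUFF's second-order closure or SCICHEM's chemistry:

```python
import numpy as np

def puff_concentration(q, x, y, z, xc, sx, sy, sz):
    """Concentration at (x, y, z) from one Gaussian puff: a textbook kernel,
    not SCICHEM's second-order closure formulation."""
    norm = q / ((2 * np.pi) ** 1.5 * sx * sy * sz)
    return norm * np.exp(-((x - xc) ** 2) / (2 * sx ** 2)
                         - (y ** 2) / (2 * sy ** 2)
                         - (z ** 2) / (2 * sz ** 2))

# Concentration peaks at the puff centre and decays away from it
peak = puff_concentration(1.0, 100.0, 0.0, 0.0, 100.0, 50.0, 50.0, 20.0)
off = puff_concentration(1.0, 200.0, 0.0, 0.0, 100.0, 50.0, 50.0, 20.0)
```

A puff model advects the centre with the wind and grows the spreads over time; total concentration is the sum over all puffs.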

  18. High-performance phase-field modeling

    KAUST Repository

    Vignal, Philippe

    2015-04-27

    Many processes in engineering and sciences involve the evolution of interfaces. Among the mathematical frameworks developed to model these types of problems, the phase-field method has emerged as a possible solution. Phase-fields nonetheless lead to complex nonlinear, high-order partial differential equations, whose solution poses mathematical and computational challenges. Guaranteeing some of the physical properties of the equations has lead to the development of efficient algorithms and discretizations capable of recovering said properties by construction [2, 5]. This work builds-up on these ideas, and proposes novel discretization strategies that guarantee numerical energy dissipation for both conserved and non-conserved phase-field models. The temporal discretization is based on a novel method which relies on Taylor series and ensures strong energy stability. It is second-order accurate, and can also be rendered linear to speed-up the solution process [4]. The spatial discretization relies on Isogeometric Analysis, a finite element method that possesses the k-refinement technology and enables the generation of high-order, high-continuity basis functions. These basis functions are well suited to handle the high-order operators present in phase-field models. Two-dimensional and three dimensional results of the Allen-Cahn, Cahn-Hilliard, Swift-Hohenberg and phase-field crystal equation will be presented, which corroborate the theoretical findings, and illustrate the robustness of the method. Results related to more challenging examples, namely the Navier-Stokes Cahn-Hilliard and a diusion-reaction Cahn-Hilliard system, will also be presented. The implementation was done in PetIGA and PetIGA-MF, high-performance Isogeometric Analysis frameworks [1, 3], designed to handle non-linear, time-dependent problems.

  19. Significance of Kinetics for Sorption on Inorganic Colloids: Modeling and Data Interpretation Issues

    Science.gov (United States)

    Painter, S.; Cvetkovic, V.; Pickett, D.; Turner, D.

    2001-12-01

    Irreversible or slowly reversible attachment to inorganic colloids is a process that may enhance radionuclide transport in the environment. An understanding of sorption kinetics is critical in evaluating this process. A two-site kinetic model for sorption on inorganic colloids is developed and used to evaluate laboratory data. This model was developed as an alternative to the equilibrium colloid sorption model employed by the U.S. Department of Energy (DOE) in their performance assessment for the proposed repository for high-level nuclear waste at Yucca Mountain, Nevada. The model quantifies linear first-order sorption on two types of hypothetical sites (fast and slow) characterized by two pairs of rates (forward and reverse). We use the model to explore data requirements for long-term predictive calculations and to evaluate the laboratory kinetic sorption data of Lu et al. Five batch sorption data sets are considered, with Pu(V) as the tracer and montmorillonite, hematite, silica, and smectite as colloids. Using asymptotic results applicable on the 240-hour time scale of the experiments, a robust estimation procedure is developed for the fast-site partition coefficient and the slow forward rate. The estimated range for the partition coefficient is 1.1-76 L/g; the range for the slow forward rate is 0.0017-0.02 L/h. Comparison of one-site and two-site sorption interpretations reveals the difficulty in discriminating between the two models for montmorillonite and, to a lesser extent, for hematite. For silica and smectite the two-site model clearly provides a better representation of the data than a single-site model. Kinetic data for silica are available for different colloid concentrations (0.2 g/L and 1.0 g/L). For the range of experimental conditions considered, the forward rate appears to be independent of the colloid concentration. The slow reverse rate cannot be estimated on the time scale of the experiments; we estimate the detection limits for the
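
The two-site linear first-order scheme described above can be sketched as a simple ODE integration: aqueous tracer exchanging with fast and slow colloid sites, each with a forward and reverse rate. The rate values below are illustrative picks within the ranges quoted in the record, not fitted parameters, and the scheme is my reading of "linear first-order sorption on two types of sites":

```python
def two_site_sorption(c0, m_col, kf_fwd, kf_rev, ks_fwd, ks_rev, t_end, dt=0.01):
    """Forward-Euler sketch of linear first-order sorption onto fast and slow
    colloid sites. c: aqueous tracer; sf/ss: sorbed amount per gram of colloid;
    m_col: colloid concentration (g/L). Illustrative, not the record's code."""
    c, sf, ss = c0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        d_sf = kf_fwd * c - kf_rev * sf   # fast-site kinetics
        d_ss = ks_fwd * c - ks_rev * ss   # slow-site kinetics
        c -= m_col * (d_sf + d_ss) * dt   # mass leaves/returns to solution
        sf += d_sf * dt
        ss += d_ss * dt
    return c, sf, ss

# 0.2 g/L colloid; slow forward rate 0.002 L/h, near the estimated range;
# slow reverse rate set to zero (below detection on the experimental time scale)
c, sf, ss = two_site_sorption(1.0, 0.2, kf_fwd=1.0, kf_rev=0.5,
                              ks_fwd=0.002, ks_rev=0.0, t_end=240.0)
```

Total tracer mass (aqueous plus sorbed) is conserved by construction, which is a useful sanity check on any such scheme.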

  20. Significantly improved HIV inhibitor efficacy prediction employing proteochemometric models generated from antivirogram data.

    Directory of Open Access Journals (Sweden)

    Gerard J P van Westen

    Full Text Available Infection with HIV cannot currently be cured; however, it can be controlled by combination treatment with multiple anti-retroviral drugs. Given different viral genotypes for virtually each individual patient, the question arises which drug combination to use to achieve effective treatment. With the availability of viral genotypic data and clinical phenotypic data, it has become possible to create computational models able to predict an optimal treatment regimen for an individual patient. Current models are based only on sequence data derived from viral genotyping; chemical similarity of drugs is not considered. To explore the added value of including chemical similarity we applied proteochemometric models, combining chemical and protein target properties in a single bioactivity model. Our dataset was a large-scale clinical database of genotypic and phenotypic information (in total ca. 300,000 drug-mutant bioactivity data points; 4 (NNRTI), 8 (NRTI) or 9 (PI) drugs; and 10,700 (NNRTI), 10,500 (NRTI) or 27,000 (PI) mutants). Our models achieved a prediction error below 0.5 Log Fold Change. Moreover, when directly compared with previously published models derived from sequence data, PCM performed better in resistance classification and prediction of Log Fold Change (0.76 log units versus 0.91). Furthermore, we were able to successfully confirm known, and identify previously unpublished, resistance-conferring mutations of HIV Reverse Transcriptase (e.g. K102Y, T216M) and HIV Protease (e.g. Q18N, N88G) from our dataset. Finally, we applied our models prospectively to the public HIV resistance database from Stanford University, obtaining a correct resistance prediction rate of 84% on the full set (compared to 80% in previous work on a high-quality subset). We conclude that proteochemometric models are able to accurately predict phenotypic resistance based on genotypic data, even for novel mutants and mixtures. Furthermore, we add an applicability domain to

  1. Comprehensive Simulation Lifecycle Management for High Performance Computing Modeling and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — There are significant logistical barriers to entry-level high performance computing (HPC) modeling and simulation (M&S) users. Performing large-scale, massively...

  2. Evaluating performances of simplified physically based models for landslide susceptibility

    Directory of Open Access Journals (Sweden)

    G. Formetta

    2015-12-01

    Full Text Available Rainfall-induced shallow landslides cause loss of life and significant damage to private and public properties, transportation systems, etc. Prediction of shallow-landslide-susceptible locations is a complex task involving many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Usually two main approaches are used to accomplish this task: statistical or physically based models. Reliable model application involves automatic parameter calibration, objective quantification of the quality of susceptibility maps, and model sensitivity analysis. This paper presents a methodology to systematically and objectively calibrate, verify and compare different models and different model performance indicators, in order to identify and select the models whose behaviour is most reliable for a given case study. The procedure was implemented in a package of models for landslide susceptibility analysis integrated in the NewAge-JGrass hydrological model. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit indices by comparing model results and measurement data pixel by pixel. Moreover, the package's integration in NewAge-JGrass allows the use of other components, such as geographic information system tools to manage input-output processes and automatic calibration algorithms to estimate model parameters. The system was applied to a case study in Calabria (Italy), along the Salerno-Reggio Calabria highway between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, optimization of the index "distance to perfect classification" in the receiver operating characteristic plane (D2PC), coupled with model M3, is the best modeling solution for our test case.
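
The D2PC index named above has a compact definition: the Euclidean distance in the ROC plane from the operating point (FPR, TPR) to the perfect classifier at (0, 1). A minimal sketch, with hypothetical confusion-matrix counts:

```python
import math

def d2pc(tp, fn, fp, tn):
    """Distance to perfect classification in the ROC plane: Euclidean distance
    from (FPR, TPR) to the ideal corner (0, 1). Smaller is better."""
    tpr = tp / (tp + fn)   # true positive rate (hit rate)
    fpr = fp / (fp + tn)   # false positive rate (false alarm rate)
    return math.sqrt((1.0 - tpr) ** 2 + fpr ** 2)

# Hypothetical pixel counts for two candidate susceptibility maps
score_a = d2pc(tp=90, fn=10, fp=15, tn=85)   # closer to the (0, 1) corner
score_b = d2pc(tp=60, fn=40, fp=30, tn=70)
```

Minimizing D2PC during calibration trades hits against false alarms symmetrically, which is why it works as a single objective for map comparison.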

  3. Evaluating performances of simplified physically based models for landslide susceptibility

    Science.gov (United States)

    Formetta, G.; Capparelli, G.; Versace, P.

    2015-12-01

    Rainfall-induced shallow landslides cause loss of life and significant damage to private and public properties, transportation systems, etc. Prediction of shallow-landslide-susceptible locations is a complex task involving many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Usually two main approaches are used to accomplish this task: statistical or physically based models. Reliable model application involves automatic parameter calibration, objective quantification of the quality of susceptibility maps, and model sensitivity analysis. This paper presents a methodology to systematically and objectively calibrate, verify and compare different models and different model performance indicators, in order to identify and select the models whose behaviour is most reliable for a given case study. The procedure was implemented in a package of models for landslide susceptibility analysis integrated in the NewAge-JGrass hydrological model. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit indices by comparing model results and measurement data pixel by pixel. Moreover, the package's integration in NewAge-JGrass allows the use of other components, such as geographic information system tools to manage input-output processes and automatic calibration algorithms to estimate model parameters. The system was applied to a case study in Calabria (Italy), along the Salerno-Reggio Calabria highway between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, optimization of the index distance to perfect classification in the receiver operating characteristic plane (D2PC), coupled with model M3, is the best modeling solution for our test case.

  4. Significant glial alterations in response to iron loading in a novel organotypic hippocampal slice culture model

    Science.gov (United States)

    Healy, Sinead; McMahon, Jill; Owens, Peter; FitzGerald, Una

    2016-01-01

    Aberrant iron deposition in the brain is associated with neurodegenerative disorders including Multiple Sclerosis, Alzheimer’s disease and Parkinson’s disease. To study the collective response to iron loading, we have used hippocampal organotypic slices as a platform to develop a novel ex vivo model of iron accumulation. We demonstrated differential uptake and toxicity of iron after 12 h exposure to 10 μM ferrous ammonium sulphate, ferric citrate or ferrocene. Having established the supremacy of ferrocene in this model, the cultures were then loaded with 0.1–100 μM ferrocene for 12 h. One μM ferrocene exposure produced the maximal 1.6-fold increase in iron compared with vehicle. This was accompanied by a 1.4-fold increase in ferritin transcripts and mild toxicity. Using dual-immunohistochemistry, we detected ferritin in oligodendrocytes, microglia, but rarely in astrocytes and never in neurons in iron-loaded slice cultures. Moreover, iron loading led to a 15% loss of olig2-positive cells and a 16% increase in number and greater activation of microglia compared with vehicle. However, there was no appreciable effect of iron loading on astrocytes. In what we believe is a significant advance on traditional mono- or dual-cultures, our novel ex vivo slice-culture model allows characterization of the collective response of brain cells to iron-loading. PMID:27808258

  5. A Parallelized Pumpless Artificial Placenta System Significantly Prolonged Survival Time in a Preterm Lamb Model.

    Science.gov (United States)

    Miura, Yuichiro; Matsuda, Tadashi; Usuda, Haruo; Watanabe, Shimpei; Kitanishi, Ryuta; Saito, Masatoshi; Hanita, Takushi; Kobayashi, Yoshiyasu

    2016-05-01

    An artificial placenta (AP) is an arterio-venous extracorporeal life support system that is connected to the fetal circulation via the umbilical vasculature. Previously, we published an article describing a pumpless AP system with a small priming volume. We subsequently developed a parallelized system, hypothesizing that the reduced circuit resistance conveyed by this modification would enable healthy fetal survival time to be prolonged. We conducted experiments using a premature lamb model to test this hypothesis. As a result, the fetal survival period was significantly prolonged (60.4 ± 3.8 vs. 18.2 ± 3.2 h, P < 0.01), and circuit resistance and minimal blood lactate levels were significantly lower in the parallel circuit group, compared with our previous single circuit group. Fetal physiological parameters remained stable until the conclusion of the experiments. In summary, parallelization of the AP system was associated with reduced circuit resistance and lactate levels and allowed preterm lamb fetuses to survive for a significantly longer period when compared with previous studies.

  6. The performance of FLake in the Met Office Unified Model

    Directory of Open Access Journals (Sweden)

    Gabriel Gerard Rooney

    2013-12-01

    Full Text Available We present results from the coupling of FLake to the Met Office Unified Model (MetUM. The coupling and initialisation are first described, and the results of testing the coupled model in local and global model configurations are presented. These show that FLake has a small statistical impact on screen temperature, but has the potential to modify the weather in the vicinity of areas of significant inland water. Examination of FLake lake ice has revealed that the behaviour of lakes in the coupled model is unrealistic in some areas of significant sub-grid orography. Tests of various modifications to ameliorate this behaviour are presented. The results indicate which of the possible model changes best improve the annual cycle of lake ice. As FLake has been developed and tuned entirely outside the Unified Model system, these results can be interpreted as a useful objective measure of the performance of the Unified Model in terms of its near-surface characteristics.

  7. Phasic firing in vasopressin cells: understanding its functional significance through computational models.

    Directory of Open Access Journals (Sweden)

    Duncan J MacGregor

    Full Text Available Vasopressin neurons, responding to input generated by osmotic pressure, use an intrinsic mechanism to shift from slow irregular firing to a distinct phasic pattern, consisting of long bursts and silences lasting tens of seconds. With increased input, bursts lengthen, eventually shifting to continuous firing. The phasic activity remains asynchronous across the cells and is not reflected in the population output signal. Here we have used a computational vasopressin neuron model to investigate the functional significance of the phasic firing pattern. We generated a concise model of the synaptic input driven spike firing mechanism that gives a close quantitative match to vasopressin neuron spike activity recorded in vivo, tested against endogenous activity and experimental interventions. The integrate-and-fire based model provides a simple physiological explanation of the phasic firing mechanism involving an activity-dependent slow depolarising afterpotential (DAP generated by a calcium-inactivated potassium leak current. This is modulated by the slower, opposing, action of activity-dependent dendritic dynorphin release, which inactivates the DAP, the opposing effects generating successive periods of bursting and silence. Model cells are not spontaneously active, but fire when perturbed by random perturbations mimicking synaptic input. We constructed one population of such phasic neurons, and another population of similar cells but which lacked the ability to fire phasically. We then studied how these two populations differed in the way that they encoded changes in afferent inputs. By comparison with the non-phasic population, the phasic population responds linearly to increases in tonic synaptic input. 
Non-phasic cells respond to transient elevations in synaptic input in a way that strongly depends on background activity levels, whereas phasic cells respond in a way that is independent of background levels, and show a similar strong linearization of the response
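
The burst mechanism described above, a spike-boosted depolarising afterpotential (DAP) opposed by slower dynorphin-like inactivation in an integrate-and-fire cell, can be caricatured in a few lines. All constants here are illustrative, not the fitted values of the published model:

```python
import numpy as np

def phasic_neuron(t_end=200.0, dt=1.0, rng=None):
    """Toy integrate-and-fire cell: each spike boosts a slowly decaying DAP
    (promoting more spikes) and releases dynorphin, which inactivates the DAP
    and eventually silences the cell. Loosely follows the mechanism in the
    record; all constants are illustrative."""
    if rng is None:
        rng = np.random.default_rng(1)
    v, dap, dyn = 0.0, 0.0, 0.0
    spikes = []
    for step in range(int(t_end / dt)):
        t = step * dt
        noise = rng.normal(0.0, 1.5)              # random synaptic perturbation
        v += (-0.1 * v + dap * (1.0 - dyn) + noise) * dt
        dap *= 0.98                               # DAP decays slowly
        dyn *= 0.995                              # dynorphin decays even more slowly
        if v > 5.0:                               # threshold crossing -> spike
            spikes.append(t)
            v = 0.0
            dap += 0.5                            # spike boosts the DAP...
            dyn = min(1.0, dyn + 0.02)            # ...and releases dynorphin
    return spikes

spikes = phasic_neuron()
```

The interplay of the two opposing activity-dependent variables is what generates alternating bursts and silences on a time scale much longer than single spikes.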

  8. Significant manipulation of output performance of a bridge-structured spin valve magnetoresistance sensor via an electric field

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yue; Yan, Baiqian; Ou-Yang, Jun; Zhu, Benpeng; Chen, Shi; Yang, Xiaofei, E-mail: hust-yangxiaofei@163.com [School of Optical and Electronic Information, Huazhong University of Science and Technology, Wuhan 430074 (China); Wang, Xianghao [School of Information Engineering, Wuhan University of Technology, Wuhan 430070 (China)

    2016-01-28

    Based on the principles of the spin-valve giant magnetoresistance (SV-GMR) effect and its application in magnetic sensors, we have investigated electric-field control of the output performance of a bridge-structured Co/Cu/NiFe/IrMn SV-GMR sensor on a PZN-PT piezoelectric substrate using micro-magnetic simulation. We focused on the influence of the variation of the uniaxial magnetic anisotropy constant (K) of Co on the output of the bridge, where K was manipulated via the stress of Co, which is generated by the strain of the piezoelectric substrate under an electric field. The results indicate that when K varies between 2 × 10^4 J/m^3 and 10 × 10^4 J/m^3, the output performance can be significantly manipulated: the linear range widens from between −330 Oe and 330 Oe to between −650 Oe and 650 Oe, and the sensitivity is tuned by almost 7 times, making it possible to measure magnetic fields with very different ranges. According to the converse piezoelectric effect, this variation of K can be realized by applying an electric field with a magnitude of about 2–20 kV/cm to a PZN-PT piezoelectric substrate, which is realistic in application. This result means that electric control of the SV-GMR effect has potential application in developing SV-GMR sensors with improved performance.

  9. Performance of a New Rapid Immunoassay Test Kit for Point-of-Care Diagnosis of Significant Bacteriuria.

    Science.gov (United States)

    Stapleton, Ann E; Cox, Marsha E; DiNello, Robert K; Geisberg, Mark; Abbott, April; Roberts, Pacita L; Hooton, Thomas M

    2015-09-01

    Urinary tract infections (UTIs) are frequently encountered in clinical practice and most commonly caused by Escherichia coli and other Gram-negative uropathogens. We tested RapidBac, a rapid immunoassay for bacteriuria developed by Silver Lake Research Corporation (SLRC), compared with standard bacterial culture using 966 clean-catch urine specimens submitted to a clinical microbiology laboratory in an urban academic medical center. RapidBac was performed in accordance with instructions, providing a positive or negative result in 20 min. RapidBac identified as positive 245/285 (sensitivity 86%) samples with significant bacteriuria, defined as the presence of a Gram-negative uropathogen or Staphylococcus saprophyticus at ≥10^3 CFU/ml. The sensitivities for Gram-negative bacteriuria at ≥10^4 CFU/ml and ≥10^5 CFU/ml were 96% and 99%, respectively. The specificity of the test, detecting the absence of significant bacteriuria, was 94%. The sensitivity and specificity of RapidBac were similar on samples from inpatient and outpatient settings, from male and female patients, and across age groups from 18 to 89 years old, although specificity was higher in men (100%) compared with that in women (92%). The RapidBac test for bacteriuria may be effective as an aid in the point-of-care diagnosis of UTIs especially in emergency and primary care settings. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
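
The reported sensitivity follows directly from the counts in the record (245 of 285 culture-positive samples flagged). A minimal sketch of the two metrics; the negative-sample counts below are hypothetical, chosen only to match the reported 94% specificity:

```python
def sensitivity(tp, fn):
    """Fraction of culture-positive samples the assay flags (true positive rate)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of culture-negative samples the assay clears (true negative rate)."""
    return tn / (tn + fp)

# From the record: 245 of 285 samples with significant bacteriuria detected
sens = sensitivity(245, 285 - 245)
# Hypothetical counts consistent with the reported 94% specificity
spec = specificity(94, 6)
```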

  10. An integrative modeling approach to elucidate suction-feeding performance.

    Science.gov (United States)

    Holzman, Roi; Collar, David C; Mehta, Rita S; Wainwright, Peter C

    2012-01-01

    Research on suction-feeding performance has mostly focused on measuring individual underlying components such as suction pressure, flow velocity, ram or the effects of suction-induced forces on prey movement during feeding. Although this body of work has advanced our understanding of aquatic feeding, no consensus has yet emerged on how to combine all of these variables to predict prey-capture performance. Here, we treated the aquatic predator-prey encounter as a hydrodynamic interaction between a solid particle (representing the prey) and the unsteady suction flows around it, to integrate the effects of morphology, physiology, skull kinematics, ram and fluid mechanics on suction-feeding performance. We developed the suction-induced force-field (SIFF) model to study suction-feeding performance in 18 species of centrarchid fishes, and asked what morphological and functional traits underlie the evolution of feeding performance on three types of prey. Performance gradients obtained using SIFF revealed that different trait combinations contribute to the ability to feed on attached, evasive and (strain-sensitive) zooplanktonic prey because these prey types impose different challenges on the predator. The low overlap in the importance of different traits in determining performance also indicated that the evolution of suction-feeding ability along different ecological axes is largely unconstrained. SIFF also yielded estimates of feeding ability that performed better than kinematic traits in explaining natural patterns of prey use. When compared with principal components describing variation in the kinematics of suction-feeding events, SIFF output explained significantly more variation in centrarchid diets, suggesting that the inclusion of more mechanistic hydrodynamic models holds promise for gaining insight into the evolution of aquatic feeding performance.

  11. The clinical significance of incidental intra-abdominal findings on positron emission tomography performed to investigate pulmonary nodules

    Directory of Open Access Journals (Sweden)

    Gill Richdeep S

    2012-01-01

    Full Text Available Abstract Background Lung cancer is a common cause of cancer-related death. Staging typically includes positron emission tomography (PET) scanning, in which 18F-fluoro-2-deoxy-D-glucose (FDG) is taken up by cells in proportion to metabolic activity, thus aiding in differentiating benign and malignant pulmonary nodules. Uptake of FDG can also occur in the abdomen. The clinical significance of incidental intra-abdominal FDG uptake in the setting of pulmonary nodules is not well established. Our objective was to report on the clinical significance of incidental intra-abdominal FDG activity in the setting of lung cancer. Methods Fifteen hundred FDG-PET reports for studies performed for lung cancer were retrospectively reviewed for the presence of incidental FDG-positive intra-abdominal findings. Patient charts with positive findings were then reviewed and information extracted. Results Twenty-five patients (25/1500) demonstrated incidental intra-abdominal FDG uptake thought to be significant (1.7%), with a mean patient age of 71 years. Colonic uptake was most common (n = 17), with 9 (52%) being investigated further. Of these 9 cases, a diagnosis of malignancy was made in 3 patients, pre-malignant adenomas in 2 patients, a benign lipoma in 1 patient, and no abnormal findings in the remaining patients. The other 8 patients were not investigated further (3 diagnosed with metastatic lung cancer and 2 of advanced age), secondary to poor prognosis. Conclusion Incidental abdominal findings in the colon on FDG-PET scans performed for work-up of pulmonary nodules need to be further investigated by colonoscopy.

  12. A systematic experimental investigation of significant parameters affecting model tire hydroplaning

    Science.gov (United States)

    Wray, G. A.; Ehrlich, I. R.

    1973-01-01

    The results of a comprehensive parametric study of model and small pneumatic tires operating on a wet surface are presented. Hydroplaning inception (spin down) and rolling restoration (spin up) are discussed. Conclusions indicate that hydroplaning inception occurs at a speed significantly higher than the rolling restoration speed. Hydroplaning speed increases considerably with tread depth, surface roughness, and tire inflation pressure or footprint pressure, and only moderately with increased load. Water film thickness affects spin-down speed only slightly: spin-down speed varies inversely as approximately the one-sixth power of film thickness. Empirical equations relating tire inflation pressure, normal load, tire diameter and water film thickness have been generated for various tire tread and surface configurations.
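
The quoted one-sixth-power dependence implies that even large changes in water depth move the spin-down speed only modestly. A sketch of the scaling, with a hypothetical reference measurement (the speed units and values are illustrative, not data from the record):

```python
def spin_down_speed(v_ref, h_ref, h):
    """Scale a measured spin-down (hydroplaning inception) speed to another
    water-film thickness using the record's empirical one-sixth-power law:
    v ~ h**(-1/6). The reference point (v_ref at h_ref) is hypothetical."""
    return v_ref * (h_ref / h) ** (1.0 / 6.0)

# Doubling the film thickness lowers the spin-down speed by only ~11%
v_thin = spin_down_speed(80.0, 2.0, 2.0)   # reference: 80 speed units at 2 mm
v_thick = spin_down_speed(80.0, 2.0, 4.0)
```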

  13. Cognition and procedure representational requirements for predictive human performance models

    Science.gov (United States)

    Corker, K.

    1992-01-01

    Models and modeling environments for human performance are becoming significant contributors to early system design and analysis procedures. Issues of levels of automation, physical environment, informational environment, and manning requirements are being addressed by such man/machine analysis systems. The research reported here investigates the close interaction between models of human cognition and models that describe procedural performance. We describe a methodology for the decomposition of aircrew procedures that supports interaction with models of cognition on the basis of procedures observed; that serves to identify cockpit/avionics information sources and crew information requirements; and that provides the structure to support methods for function allocation among crew and aiding systems. Our approach is to develop an object-oriented, modular, executable software representation of the aircrew, the aircraft, and the procedures necessary to satisfy flight-phase goals. We then encode, in a time-based language, taxonomies of the conceptual, relational, and procedural constraints among the cockpit avionics and control system and the aircrew. We have designed and implemented a goals/procedures hierarchic representation sufficient to describe procedural flow in the cockpit. We then execute the procedural representation in simulation software and calculate the values of the flight instruments, aircraft state variables and crew resources using the constraints available from the relationship taxonomies. The system provides a flexible, extensible, manipulable and executable representation of aircrew and procedures that is generally applicable to crew/procedure task analysis. The representation supports developed methods of intent inference, and is extensible to include issues of information requirements and functional allocation. We are attempting to link the procedural representation to models of cognitive functions to establish several intent inference methods

  14. Computer modeling of gastric parietal cell: significance of canalicular space, gland lumen, and variable canalicular [K+].

    Science.gov (United States)

    Crothers, James M; Forte, John G; Machen, Terry E

    2016-05-01

    A computer model, constructed for evaluation of integrated functioning of cellular components involved in acid secretion by the gastric parietal cell, has provided new interpretations of older experimental evidence, showing the functional significance of a canalicular space separated from a mucosal bath by a gland lumen and also shedding light on basolateral Cl- transport. The model shows 1) changes in levels of parietal cell secretion (with stimulation or H-K-ATPase inhibitors) result mainly from changes in electrochemical driving forces for apical K+ and Cl- efflux, as canalicular [K+] ([K+]can) increases or decreases with changes in apical H+/K+ exchange rate; 2) H-K-ATPase inhibition in frog gastric mucosa would increase [K+]can similarly with low or high mucosal [K+], depolarizing apical membrane voltage similarly, so electrogenic H+ pumping is not indicated by inhibition causing similar increase in transepithelial potential difference (Vt) with 4 and 80 mM mucosal K+; 3) decreased H+ secretion during strongly mucosal-positive voltage clamping is consistent with an electroneutral H-K-ATPase being inhibited by greatly decreased [K+]can (Michaelis-Menten mechanism); 4) slow initial change ("long time-constant transient") in current or Vt with clamping of Vt or current involves slow change in [K+]can; 5) the Na+-K+-2Cl- symporter (NKCC) is likely to have a significant role in Cl- influx, despite evidence that it is not necessary for acid secretion; and 6) relative contributions of the Cl-/HCO3- exchanger (AE2) and NKCC to Cl- influx would differ greatly between resting and stimulated states, possibly explaining reported differences in physiological characteristics of stimulated open-circuit Cl- secretion (≈H+) and resting short-circuit Cl- secretion (>H+).
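
Point 3 invokes a Michaelis-Menten dependence of pump turnover on canalicular [K+]: as [K+]can falls far below the half-saturation constant, the pump rate collapses without requiring any electrogenic mechanism. A minimal sketch, with illustrative constants (vmax and km are not from the record):

```python
def pump_rate(k_can, vmax=1.0, km=1.0):
    """Michaelis-Menten dependence of H-K-ATPase turnover on canalicular [K+].
    vmax and km are illustrative placeholders, not fitted model values.
    Low [K+]can starves the pump; the rate saturates toward vmax at high [K+]can."""
    return vmax * k_can / (km + k_can)

# Well below km the rate is roughly proportional to [K+]can
low, high = pump_rate(0.1), pump_rate(10.0)
```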

  15. Significant Features Found in Simulated Tropical Climates Using a Cloud Resolving Model

    Science.gov (United States)

    Shie, C.-L.; Tao, W.-K.; Simpson, J.; Sui, C.-H.

    2000-01-01

    Cloud-resolving models (CRMs) have been widely used in recent years for simulations involving studies of radiative-convective systems and their role in determining the tropical regional climate. The growing popularity of CRMs can be credited to their inclusion of crucial and realistic features such as explicit cloud-scale dynamics, sophisticated microphysical processes, and explicit radiative-convective interaction. For example, a study using a two-dimensional cloud model with a radiative-convective interaction process found a QBO-like (quasi-biennial oscillation) oscillation of the mean zonal wind that affected the convective system. Accordingly, the model-generated rain band corresponding to convective activity propagated in the direction of the low-level zonal mean winds; however, the precipitation became "localized" (limited to a small portion of the domain) when the zonal mean winds were removed. Two other CRM simulations, by S94 and Grabowski et al. (1996, hereafter G96), produced distinctive quasi-equilibrium ("climate") states for both tropical water and energy, i.e., a cold/dry state in S94 and a warm/wet state in G96; these were later investigated by T99, who found that the pattern of the imposed large-scale horizontal wind and the magnitude of the imposed surface fluxes were the two crucial mechanisms determining the tropical climate states. The warm/wet climate was associated with prescribed strong surface winds, or with maintained strong vertical wind shears under which well-organized convective systems prevailed. On the other hand, the cold/dry climate was produced by imposed weak surface winds and weak wind shears through a vertical mixing process by convection. In this study, considered a sequel to T99, the model simulations to be presented are generally similar to those of T99 (where a detailed model setup can be found), except for a more detailed discussion along with a few more simulated experiments. There are twelve major

  16. Pomalidomide shows significant therapeutic activity against CNS lymphoma with a major impact on the tumor microenvironment in murine models.

    Science.gov (United States)

    Li, Zhimin; Qiu, Yushi; Personett, David; Huang, Peng; Edenfield, Brandy; Katz, Jason; Babusis, Darius; Tang, Yang; Shirely, Michael A; Moghaddam, Mehran F; Copland, John A; Tun, Han W

    2013-01-01

    Primary CNS lymphoma carries a poor prognosis. Novel therapeutic agents are urgently needed. Pomalidomide (POM) is a novel immunomodulatory drug with anti-lymphoma activity. CNS pharmacokinetic analysis was performed in rats to assess the CNS penetration of POM. Preclinical evaluation of POM was performed in two murine models to assess its therapeutic activity against CNS lymphoma. The impact of POM on the CNS lymphoma immune microenvironment was evaluated by immunohistochemistry and immunofluorescence. In vitro cell culture experiments were carried out to further investigate the impact of POM on the biology of macrophages. POM crosses the blood-brain barrier with CNS penetration of ~39%. Preclinical evaluations showed that it had significant therapeutic activity against CNS lymphoma with significant reduction in tumor growth rate and prolongation of survival, that it had a major impact on the tumor microenvironment with an increase in macrophages and natural killer cells, and that it decreased M2-polarized tumor-associated macrophages and increased M1-polarized macrophages when macrophages were evaluated based on polarization status. In vitro studies using various macrophage models showed that POM converted the polarization status of IL4-stimulated macrophages from M2 to M1, that M2 to M1 conversion by POM in the polarization status of lymphoma-associated macrophages is dependent on the presence of NK cells, that POM induced M2 to M1 conversion in the polarization of macrophages by inactivating STAT6 signaling and activating STAT1 signaling, and that POM functionally increased the phagocytic activity of macrophages. Based on our findings, POM is a promising therapeutic agent for CNS lymphoma with excellent CNS penetration, significant preclinical therapeutic activity, and a major impact on the tumor microenvironment.
It can induce significant biological changes in tumor-associated macrophages, which likely play a major role in its therapeutic activity against CNS

  17. Describing Assay Precision-Reciprocal of Variance Is Correct, Not CV Percent: Its Use Should Significantly Improve Laboratory Performance.

    Science.gov (United States)

    Jelliffe, Roger W; Schumitzky, Alan; Bayard, David; Fu, Xiaowei; Neely, Michael

    2015-06-01

    Describing assay error as percent coefficient of variation (CV%) fails as measurements approach zero. Results are censored if below some arbitrarily chosen lower limit of quantification (LLOQ). CV% gives incorrect weighting to data obtained by therapeutic drug monitoring, with incorrect parameter values in the resulting pharmacokinetic models, and incorrect dosage regimens for patient care. CV% was compared with the reciprocal of the variance (1/var) of each assay measurement. This method has not been considered by the laboratory community. A simple description of assay standard deviation (SD) as a polynomial function of the assay measurement over its working range was developed, the reciprocal of the assay variance determined, and its results compared with CV%. CV% does not provide correct weighting of measured serum concentrations as required for optimal therapeutic drug monitoring. It does not permit optimally individualized models of the behavior of a drug in a patient, resulting in incorrect dosage regimens. The assay error polynomial described here, using 1/var, provides correct weighting of such data, all the way down to and including zero. There is no need to censor low results, and no need to set any arbitrary LLOQ. Reciprocal of variance is the correct measure of assay precision and should replace CV%. The information is easily stored as an assay error polynomial. The laboratory can serve the medical community better. There is no longer any need for LLOQ, a significant improvement. Regulatory agencies should implement this more informed policy.
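
    The assay error polynomial and reciprocal-variance weighting described above can be sketched as follows; the calibration concentrations and replicate SDs are hypothetical values invented for illustration:

```python
import numpy as np

# Hypothetical calibration data: replicate SDs measured at several
# known concentrations across the assay's working range.
conc = np.array([0.0, 2.0, 5.0, 10.0, 20.0])
sd   = np.array([0.15, 0.18, 0.25, 0.40, 0.75])

# Fit SD as a second-order polynomial of the measurement:
# SD(C) = a0 + a1*C + a2*C^2  (the "assay error polynomial").
a2, a1, a0 = np.polyfit(conc, sd, 2)

def weight(measurement):
    """Weight of a measurement = 1/variance = 1/SD(C)^2."""
    s = a0 + a1 * measurement + a2 * measurement ** 2
    return 1.0 / s ** 2

# Because SD(0) = a0 > 0, a zero measurement still gets a finite,
# well-defined weight -- no LLOQ censoring is needed.
print(weight(0.0))
```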

  18. A New Model to Simulate Energy Performance of VRF Systems

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Tianzhen; Pang, Xiufeng; Schetrit, Oren; Wang, Liping; Kasahara, Shinichi; Yura, Yoshinori; Hinokuma, Ryohei

    2014-03-30

    This paper presents a new model to simulate energy performance of variable refrigerant flow (VRF) systems in heat pump operation mode (either cooling or heating is provided but not simultaneously). The main improvement of the new model is the introduction of the evaporating and condensing temperature in the indoor and outdoor unit capacity modifier functions. The independent variables in the capacity modifier functions of the existing VRF model in EnergyPlus are mainly room wet-bulb temperature and outdoor dry-bulb temperature in cooling mode and room dry-bulb temperature and outdoor wet-bulb temperature in heating mode. The new approach allows compliance with different specifications of each indoor unit so that the modeling accuracy is improved. The new VRF model was implemented in a custom version of EnergyPlus 7.2. This paper first describes the algorithm for the new VRF model, which is then used to simulate the energy performance of a VRF system in a Prototype House in California that complies with the requirements of Title 24, the California Building Energy Efficiency Standards. The VRF system performance is then compared with three other types of HVAC systems: the Title 24-2005 Baseline system, the traditional High Efficiency system, and the EnergyStar Heat Pump system in three typical California climates: Sunnyvale, Pasadena and Fresno. Calculated energy savings from the VRF systems are significant. The HVAC site energy savings range from 51 to 85 percent, while the TDV (Time Dependent Valuation) energy savings range from 31 to 66 percent compared to the Title 24 Baseline Systems across the three climates. The largest energy savings are in the Fresno climate, followed by Sunnyvale and Pasadena. The paper discusses various characteristics of the VRF systems contributing to the energy savings. It should be noted that these savings are calculated using the Title 24 prototype House D under standard operating conditions.
Actual performance of the VRF systems for real
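
    The modeling change can be sketched with an EnergyPlus-style biquadratic performance curve whose temperature inputs are swapped; the coefficients below are hypothetical placeholders, not values from the paper or from any EnergyPlus dataset:

```python
def biquadratic(x, y, c):
    """EnergyPlus-style biquadratic performance curve:
    c1 + c2*x + c3*x^2 + c4*y + c5*y^2 + c6*x*y."""
    c1, c2, c3, c4, c5, c6 = c
    return c1 + c2 * x + c3 * x * x + c4 * y + c5 * y * y + c6 * x * y

# Hypothetical coefficients -- real curves are fit to manufacturer data.
COOL_CAP_FT = (0.9, 0.01, 0.0002, 0.005, 0.0001, -0.0003)

# Existing model: x = room wet-bulb, y = outdoor dry-bulb (cooling mode).
# New model: the indoor-unit modifier instead takes the evaporating
# temperature, so each indoor unit's own specification can be honored.
mod = biquadratic(19.0, 6.0, COOL_CAP_FT)  # deg C wet-bulb, deg C evaporating
print(round(mod, 4))
```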

  19. PV Performance Modeling Methods and Practices: Results from the 4th PV Performance Modeling Collaborative Workshop.

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    In 2014, the IEA PVPS Task 13 added the PVPMC as a formal activity to its technical work plan for 2014-2017. The goal of this activity is to expand the reach of the PVPMC to a broader international audience and help to reduce PV performance modeling uncertainties worldwide. One of the main deliverables of this activity is to host one or more PVPMC workshops outside the US to foster more international participation within this collaborative group. This report reviews the results of the first in a series of these joint IEA PVPS Task 13/PVPMC workshops. The 4th PV Performance Modeling Collaborative Workshop was held in Cologne, Germany at the headquarters of TÜV Rheinland on October 22-23, 2015.

  20. Performance Improvement/HPT Model: Guiding the Process

    Science.gov (United States)

    Dessinger, Joan Conway; Moseley, James L.; Van Tiem, Darlene M.

    2012-01-01

    This commentary is part of an ongoing dialogue that began in the October 2011 special issue of "Performance Improvement"--Exploring a Universal Performance Model for HPT: Notes From the Field. The performance improvement/HPT (human performance technology) model represents a unifying process that helps accomplish successful change, create…

  1. Synthesised model of market orientation-business performance relationship

    Directory of Open Access Journals (Sweden)

    G. Nwokah

    2006-12-01

    Full Text Available Purpose: The purpose of this paper is to assess the impact of market orientation on the performance of the organisation. While much empirical work has centered on market orientation, the generalisability of its impact on the performance of Food and Beverages organisations in the Nigerian context has been under-researched. Design/Methodology/Approach: The study adopted a triangulation methodology (quantitative and qualitative approach). Data was collected from key informants using a research instrument. Returned instruments were analyzed using nonparametric correlation through the use of the Statistical Package for Social Sciences (SPSS version 10). Findings: The study validated the earlier instruments but did not find any strong association between market orientation and business performance in the Nigerian context using the food and beverages organisations for the study. The reasons underlying the weak relationship between market orientation and business performance of the Food and Beverages organisations are government policies, new product development, diversification, innovation and devaluation of the Nigerian currency. One important finding of this study is that market orientation leads to business performance through some moderating variables. Implications: The study recommends that the Nigerian Government should ensure a stable economy and make economic policies that will enhance existing business development in the country. Also, organisations should have performance measurement systems to detect the impact of investment on market orientation with the aim of knowing how the organisation works. Originality/Value: This study significantly refines the body of knowledge concerning the impact of market orientation on the performance of the organisation, and thereby offers a model of market orientation and business performance in the Nigerian context for marketing scholars and practitioners. This model will, no doubt, contribute to the body of

  2. HANDOVER MANAGEABILITY AND PERFORMANCE MODELING IN

    African Journals Online (AJOL)

    SOFTLINKS DIGITAL

    April 2014. 103 engineering and development including ... formal model management team must rely on guess work. ... model provides a systematic method for comparing ...... In 18th Annual Software Engineering and Knowledge. Engineering ...

  3. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels;

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water-/steam side has been formulated. The model has been formulated as a number of sub-models that are merged into an overall model for the complete boiler. Sub-models have been defined for the furnace, the convection zone (split in 2: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel) and 2 models for, respectively, the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a number of Differential-Algebraic-Equation (DAE) systems. Subsequently MatLab/Simulink has been applied for carrying out the simulations. To be able to verify the simulated results, experiments have been carried out on a full-scale boiler plant.

  4. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, Kim; Karstensen, Claus; Condra, Thomas Joseph;

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water-/steam side has been formulated. The model has been formulated as a number of sub-models that are merged into an overall model for the complete boiler. Sub-models have been defined for the furnace, the convection zone (split in 2: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel) and 2 models for, respectively, the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a number of Differential-Algebraic-Equation (DAE) systems. Subsequently MatLab/Simulink has been applied for carrying out the simulations. To be able to verify the simulated results, an experiment has been carried out on a full-scale boiler plant.

  5. Probability and Statistics in Sensor Performance Modeling

    Science.gov (United States)

    2010-12-01

    Acoustic or electromagnetic waves are scattered by both objects and turbulent wind. A version of the Rice-Nakagami model is considered, along with several statistical models: Gaussian, lognormal, exponential, gamma, and the transformed Rice-Nakagami, as well as a discrete model. (Other examples of statistical models

  6. The predictive performance and stability of six species distribution models.

    Science.gov (United States)

    Duan, Ren-Yan; Kong, Xiao-Quan; Huang, Min-Yi; Fan, Wei-Yi; Wang, Zhi-Gao

    2014-01-01

    Predicting species' potential geographical range by species distribution models (SDMs) is central to understanding their ecological requirements. However, the effects of using different modeling techniques need further investigation. In order to improve the prediction effect, we need to assess the predictive performance and stability of different SDMs. We collected the distribution data of five common tree species (Pinus massoniana, Betula platyphylla, Quercus wutaishanica, Quercus mongolica and Quercus variabilis) and simulated their potential distribution area using 13 environmental variables and six widely used SDMs: BIOCLIM, DOMAIN, MAHAL, RF, MAXENT, and SVM. Each model run was repeated 100 times (trials). We compared the predictive performance by testing the consistency between observations and simulated distributions and assessed the stability by the standard deviation, coefficient of variation, and the 99% confidence interval of Kappa and AUC values. The mean values of AUC and Kappa from MAHAL, RF, MAXENT, and SVM trials were similar and significantly higher than those from BIOCLIM and DOMAIN trials (p<0.05), while the associated standard deviations and coefficients of variation were larger for BIOCLIM and DOMAIN trials (p<0.05), and the 99% confidence intervals for AUC and Kappa values were narrower for MAHAL, RF, MAXENT, and SVM. Compared to BIOCLIM and DOMAIN, the other SDMs (MAHAL, RF, MAXENT, and SVM) had higher prediction accuracy, smaller confidence intervals, and were more stable and less affected by the random variable (randomly selected pseudo-absence points). According to the prediction performance and stability of SDMs, we can divide these six SDMs into two categories: a high performance and stability group including MAHAL, RF, MAXENT, and SVM, and a low performance and stability group consisting of BIOCLIM and DOMAIN. We highlight that choosing appropriate SDMs to address a specific problem is an important part of the modeling process.
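
    The stability statistics used here (standard deviation, coefficient of variation, and 99% confidence interval across 100 trials) can be computed as below; the AUC values are simulated stand-ins, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical AUC values from 100 repeated model runs ("trials").
auc = rng.normal(loc=0.92, scale=0.01, size=100)

mean = auc.mean()
sd = auc.std(ddof=1)           # standard deviation across trials
cv = sd / mean                 # coefficient of variation
# 99% confidence interval for the mean (normal approximation).
half = 2.576 * sd / np.sqrt(auc.size)
ci99 = (mean - half, mean + half)

print(f"mean={mean:.3f} sd={sd:.4f} cv={cv:.4f} "
      f"CI99=({ci99[0]:.3f}, {ci99[1]:.3f})")
```

A narrower CI99 and smaller CV across the 100 trials is exactly what distinguishes the stable group (MAHAL, RF, MAXENT, SVM) from the unstable one (BIOCLIM, DOMAIN).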

  7. The predictive performance and stability of six species distribution models.

    Directory of Open Access Journals (Sweden)

    Ren-Yan Duan

    Full Text Available Predicting species' potential geographical range by species distribution models (SDMs) is central to understanding their ecological requirements. However, the effects of using different modeling techniques need further investigation. In order to improve the prediction effect, we need to assess the predictive performance and stability of different SDMs. We collected the distribution data of five common tree species (Pinus massoniana, Betula platyphylla, Quercus wutaishanica, Quercus mongolica and Quercus variabilis) and simulated their potential distribution area using 13 environmental variables and six widely used SDMs: BIOCLIM, DOMAIN, MAHAL, RF, MAXENT, and SVM. Each model run was repeated 100 times (trials). We compared the predictive performance by testing the consistency between observations and simulated distributions and assessed the stability by the standard deviation, coefficient of variation, and the 99% confidence interval of Kappa and AUC values. The mean values of AUC and Kappa from MAHAL, RF, MAXENT, and SVM trials were similar and significantly higher than those from BIOCLIM and DOMAIN trials (p<0.05), while the associated standard deviations and coefficients of variation were larger for BIOCLIM and DOMAIN trials (p<0.05), and the 99% confidence intervals for AUC and Kappa values were narrower for MAHAL, RF, MAXENT, and SVM. Compared to BIOCLIM and DOMAIN, the other SDMs (MAHAL, RF, MAXENT, and SVM) had higher prediction accuracy, smaller confidence intervals, and were more stable and less affected by the random variable (randomly selected pseudo-absence points). According to the prediction performance and stability of SDMs, we can divide these six SDMs into two categories: a high performance and stability group including MAHAL, RF, MAXENT, and SVM, and a low performance and stability group consisting of BIOCLIM and DOMAIN. We highlight that choosing appropriate SDMs to address a specific problem is an important part of the modeling process.

  8. Performance Appraisal: A New Model for Academic Advisement.

    Science.gov (United States)

    Hazleton, Vincent; Tuttle, George E.

    1981-01-01

    Presents the performance appraisal model for student advisement, a centralized developmental model that focuses on the content and process of advisement. The model has three content objectives: job definition, performance assessment, and goal setting. Operation of the model is described. Benefits and potential limitations are identified. (Author)

  9. New Metacognitive Model for Human Performance Technology

    Science.gov (United States)

    Turner, John R.

    2011-01-01

    Addressing metacognitive functions has been shown to improve performance at the individual, team, group, and organizational levels. Metacognition is beginning to surface as an added cognate discipline for the field of human performance technology (HPT). Advances from research in the fields of cognition and metacognition offer a place for HPT to…

  11. Building performance modelling for sustainable building design

    Directory of Open Access Journals (Sweden)

    Olufolahan Oduyemi

    2016-12-01

    The output revealed that BPM delivers the information needed for enhanced design and building performance. Recommendations such as the establishment of proper mechanisms to monitor the performance of BPM-related construction are suggested to allow for its continuous implementation. This research consolidates collective movements towards wider implementation of BPM and forms a base for developing a sound BIM strategy and guidance.

  12. Performance Evaluation of 3D Modeling Software for UAV Photogrammetry

    Science.gov (United States)

    Yanagi, H.; Chikatsu, H.

    2016-06-01

    UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAVs and freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed. Consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modeling software are black boxes. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motivation, Smart3DCapture and Pix4Dmapper were downloaded from the Internet and the commercial software PhotoScan was also employed; investigations were performed in this paper using check points and images obtained from a UAV.

  13. Pharmacological kynurenine 3-monooxygenase enzyme inhibition significantly reduces neuropathic pain in a rat model.

    Science.gov (United States)

    Rojewska, Ewelina; Piotrowska, Anna; Makuch, Wioletta; Przewlocka, Barbara; Mika, Joanna

    2016-03-01

    Recent studies have highlighted the involvement of the kynurenine pathway in the pathology of neurodegenerative diseases, but the role of this system in neuropathic pain requires further extensive research. Therefore, the aim of our study was to examine the role of kynurenine 3-monooxygenase (Kmo), an enzyme that is important in this pathway, in a rat model of neuropathy after chronic constriction injury (CCI) to the sciatic nerve. For the first time, we demonstrated that the injury-induced increase in the Kmo mRNA levels in the spinal cord and the dorsal root ganglia (DRG) was reduced by chronic administration of the microglial inhibitor minocycline and that this effect paralleled a decrease in the intensity of neuropathy. Further, minocycline administration alleviated the lipopolysaccharide (LPS)-induced upregulation of Kmo mRNA expression in microglial cell cultures. Moreover, we demonstrated that not only indirect inhibition of Kmo using minocycline but also direct inhibition using Kmo inhibitors (Ro61-6048 and JM6) decreased neuropathic pain intensity on the third and the seventh days after CCI. Chronic Ro61-6048 administration diminished the protein levels of IBA-1, IL-6, IL-1β and NOS2 in the spinal cord and/or the DRG. Both Kmo inhibitors potentiated the analgesic properties of morphine. In summary, our data suggest that in a neuropathic pain model, inhibiting Kmo function significantly reduces pain symptoms and enhances the effectiveness of morphine. The results of our studies show that the kynurenine pathway is an important mediator of neuropathic pain pathology and indicate that Kmo represents a novel pharmacological target for the treatment of neuropathy.

  14. Cyclosporin A significantly improves preeclampsia signs and suppresses inflammation in a rat model.

    Science.gov (United States)

    Hu, Bihui; Yang, Jinying; Huang, Qian; Bao, Junjie; Brennecke, Shaun Patrick; Liu, Huishu

    2016-05-01

    Preeclampsia is associated with an increased inflammatory response. Immune suppression might be an effective treatment. The aim of this study was to examine whether Cyclosporin A (CsA), an immunosuppressant, improves the clinical characteristics of preeclampsia and suppresses inflammation in a lipopolysaccharide (LPS) induced preeclampsia rat model. Pregnant rats were randomly divided into 4 groups: group 1 (PE) rats each received LPS via the tail vein on gestational day (GD) 14; group 2 (PE+CsA5) rats were pretreated with LPS (1.0 μg/kg) on GD 14 and were then treated with CsA (5 mg/kg, ip) on GDs 16, 17 and 18; group 3 (PE+CsA10) rats were pretreated with LPS (1.0 μg/kg) on GD 14 and were then treated with CsA (10 mg/kg, ip) on GDs 16, 17 and 18; group 4 (pregnant control, PC) rats were treated with the vehicle (saline) used for groups 1, 2 and 3. Systolic blood pressure, urinary albumin, biometric parameters and the levels of serum cytokines were measured on day 20. CsA treatment significantly reduced LPS-induced systolic blood pressure and the mean 24-h urinary albumin excretion. The pro-inflammatory cytokines IL-6, IL-17, IFN-γ and TNF-α were increased in the LPS treatment group but were reduced in the (LPS+CsA) group (P<0.05). Cyclosporin A improved preeclampsia signs and attenuated inflammatory responses in the LPS-induced preeclampsia rat model, which suggests that immunosuppressants might be an alternative management option for preeclampsia. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Prostate cancer risk and DNA damage: translational significance of selenium supplementation in a canine model.

    Science.gov (United States)

    Waters, David J; Shen, Shuren; Glickman, Lawrence T; Cooley, Dawn M; Bostwick, David G; Qian, Junqi; Combs, Gerald F; Morris, J Steven

    2005-07-01

    Daily supplementation with the essential trace mineral selenium significantly reduced prostate cancer risk in men in the Nutritional Prevention of Cancer Trial. However, the optimal intake of selenium for prostate cancer prevention is unknown. We hypothesized that selenium significantly regulates the extent of genotoxic damage within the aging prostate and that the relationship between dietary selenium intake and DNA damage is non-linear, i.e. more selenium is not necessarily better. To test this hypothesis, we conducted a randomized feeding trial in which 49 elderly beagle dogs (physiologically equivalent to 62-69-year-old men) received nutritionally adequate or supranutritional levels of selenium for 7 months, in order to mimic the range of dietary selenium intake of men in the United States. Our results demonstrate an intriguing U-shaped dose-response relationship between selenium status (toenail selenium concentration) and the extent of DNA damage (alkaline Comet assay) within the prostate. Further, we demonstrate that the concentration of selenium that minimizes DNA damage in the aging dog prostate remarkably parallels the selenium concentration in men that minimizes prostate cancer risk. By studying elderly dogs, the only non-human animal model of spontaneous prostate cancer, we have established a new approach to bridge the gap between laboratory and human studies that can be used to select the appropriate dose of anticancer agents for large-scale human cancer prevention trials. From the U-shaped dose-response, it follows that not all men will necessarily benefit from increasing their selenium intake and that measurement of baseline nutrient status should be required for all individuals in prevention trials to avoid oversupplementation.
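
    The U-shaped dose-response described above implies an interior optimum, which a quadratic fit locates at its vertex. The selenium and DNA-damage values below are invented solely to illustrate the calculation:

```python
import numpy as np

# Hypothetical toenail selenium status (ppm) vs. prostate DNA damage
# scores, chosen only to produce a U-shaped dose-response.
selenium = np.array([0.4, 0.6, 0.8, 1.0, 1.2, 1.4])
damage   = np.array([9.0, 6.5, 5.2, 5.0, 6.0, 8.5])

# Quadratic fit; the vertex -b/(2a) estimates the selenium status that
# minimizes DNA damage.
a, b, c = np.polyfit(selenium, damage, 2)
optimum = -b / (2 * a)
print(round(optimum, 2))
```

A positive leading coefficient confirms the U-shape, and intakes on either side of the vertex predict more damage, which is why "more selenium" is not necessarily better.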

  16. Kernel density surface modelling as a means to identify significant concentrations of vulnerable marine ecosystem indicators.

    Directory of Open Access Journals (Sweden)

    Ellen Kenchington

    Full Text Available The United Nations General Assembly Resolution 61/105, concerning sustainable fisheries in the marine ecosystem, calls for the protection of vulnerable marine ecosystems (VMEs) from destructive fishing practices. Subsequently, the Food and Agriculture Organization (FAO) produced guidelines for identification of VME indicator species/taxa to assist in the implementation of the resolution, but recommended the development of case-specific operational definitions for their application. We applied kernel density estimation (KDE) to research vessel trawl survey data from inside the fishing footprint of the Northwest Atlantic Fisheries Organization (NAFO) Regulatory Area in the high seas of the northwest Atlantic to create biomass density surfaces for four VME indicator taxa: large-sized sponges, sea pens, and small and large gorgonian corals. These VME indicator taxa were identified previously by NAFO using the fragility, life history characteristics and structural complexity criteria presented by FAO, along with an evaluation of their recovery trajectories. KDE, a non-parametric neighbour-based smoothing function, has been used previously in ecology to identify hotspots, that is, areas of relatively high biomass/abundance. We present a novel approach of examining relative changes in area under polygons created from encircling successive biomass categories on the KDE surface to identify "significant concentrations" of biomass, which we equate to VMEs. This allows identification of the VMEs from the broader distribution of the species in the study area. We provide independent assessments of the VMEs so identified using underwater images, benthic sampling with other gear types (dredges, cores), and/or published species distribution models of probability of occurrence, as available.
For each VME indicator taxon we provide a brief review of their ecological function which will be important in future assessments of significant adverse impact on these habitats here
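
    A minimal sketch of the KDE-and-successive-area idea, using invented station positions and biomass values and a simple unnormalized Gaussian kernel (a stand-in for the study's actual KDE implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical trawl stations: positions (km) and sponge biomass (kg).
pts = rng.uniform(0, 100, size=(50, 2))
biomass = rng.lognormal(mean=1.0, sigma=1.5, size=50)

# Biomass-weighted Gaussian kernel density on a grid (bandwidth h).
h = 10.0
xs = ys = np.linspace(0, 100, 101)
gx, gy = np.meshgrid(xs, ys)
dens = np.zeros_like(gx)
for (px, py), w in zip(pts, biomass):
    dens += w * np.exp(-((gx - px) ** 2 + (gy - py) ** 2) / (2 * h * h))

# Area (grid-cell count) enclosed by successive biomass categories; a
# sharp drop in area between categories marks a "significant
# concentration" of biomass on the KDE surface.
thresholds = np.percentile(dens, [50, 75, 90, 95])
areas = [int((dens >= t).sum()) for t in thresholds]
print(areas)
```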

  17. Significance of settling model structures and parameter subsets in modelling WWTPs under wet-weather flow and filamentous bulking conditions

    DEFF Research Database (Denmark)

    Ramin, Elham; Sin, Gürkan; Mikkelsen, Peter Steen;

    2014-01-01

    Current research focuses on predicting and mitigating the impacts of high hydraulic loadings on centralized wastewater treatment plants (WWTPs) under wet-weather conditions. The maximum permissible inflow to WWTPs depends not only on the settleability of activated sludge in secondary settling tanks (SSTs) but also on the hydraulic behaviour of SSTs. The present study investigates the impacts of ideal and non-ideal flow (dry and wet weather) and settling (good settling and bulking) boundary conditions on the sensitivity of WWTP model outputs to uncertainties intrinsic to the one-dimensional (1-D) SST models. … The contribution of settling parameters to the total variance of the key WWTP process outputs significantly depends on the influent flow and settling conditions. The magnitude of the impact is found to vary, depending on which type of 1-D SST model is used. Therefore, we identify and recommend potential parameter subsets…

  18. Methylphenidate significantly improves driving performance of adults with attention-deficit hyperactivity disorder: a randomized crossover trial.

    NARCIS (Netherlands)

    Verster, J.C.; Bekker, E.M.; Roos, M.; Minova, A.; Eijken, E.J.; Kooij, J.J.; Buitelaar, J.K.; Kenemans, J.L.; Verbaten, M.N.; Olivier, B.; Volkerts, E.R.

    2008-01-01

    Although patients with attention-deficit hyperactivity disorder (ADHD) have reported improved driving performance on methylphenidate, limited evidence exists to support an effect of treatment on driving performance and some regions prohibit driving on methylphenidate. A randomized, crossover trial e

  19. High-fat diet induces significant metabolic disorders in a mouse model of polycystic ovary syndrome.

    Science.gov (United States)

    Lai, Hao; Jia, Xiao; Yu, Qiuxiao; Zhang, Chenglu; Qiao, Jie; Guan, Youfei; Kang, Jihong

    2014-11-01

Polycystic ovary syndrome (PCOS) is the most common female endocrinopathy, associated with both reproductive and metabolic disorders. Dehydroepiandrosterone (DHEA) is currently used to induce a PCOS mouse model. A high-fat diet (HFD) has been shown to cause obesity and infertility in female mice. The possible effect of an HFD on the phenotype of DHEA-induced PCOS mice is unknown. The aim of the present study was to investigate both reproductive and metabolic features of DHEA-induced PCOS mice fed a normal chow or a 60% HFD. Prepubertal C57BL/6 mice (age 25 days) on the normal chow or an HFD were injected (s.c.) daily with the vehicle sesame oil or DHEA for 20 consecutive days. At the end of the experiment, both reproductive and metabolic characteristics were assessed. Our data show that an HFD did not affect the reproductive phenotype of DHEA-treated mice. HFD feeding, however, caused significant metabolic alterations in DHEA-treated mice, including obesity, glucose intolerance, dyslipidemia, and pronounced liver steatosis. These findings suggest that an HFD induces distinct metabolic features in DHEA-induced PCOS mice. The combined DHEA and HFD treatment may thus serve as a means of studying the mechanisms involved in metabolic derangements of this syndrome, particularly the high prevalence of hepatic steatosis in women with PCOS.

  20. Individualized Biomathematical Modeling of Fatigue and Performance

    Science.gov (United States)

    2008-05-29

prior information about the initial state parameters may be acquired by other means, though. For instance, actigraphy could be used to track sleep. … J., Saper C. B. Neurobiology of the sleep-wake cycle: Sleep architecture, circadian regulation, and regulatory feedback. J. Biol. Rhythms 21, 482 … Sleep and Performance Research Center, Washington State University, Spokane, P.O. Box 1495, Spokane, WA

  1. Detailed Performance Model for Photovoltaic Systems: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Tian, H.; Mancilla-David, F.; Ellis, K.; Muljadi, E.; Jenkins, P.

    2012-07-01

    This paper presents a modified current-voltage relationship for the single diode model. The single-diode model has been derived from the well-known equivalent circuit for a single photovoltaic cell. The modification presented in this paper accounts for both parallel and series connections in an array.
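
The implicit single-diode relation and its extension to series/parallel connections can be sketched numerically. The cell equation below is the standard form I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh; the parameter values, the damped fixed-point solver, and the simple per-cell scaling are illustrative assumptions, not the modified relationship the paper derives.

```python
import math

def cell_current(v, iph=5.0, i0=1e-9, rs=0.02, rsh=50.0, n=1.3, vt=0.02585):
    """Solve the implicit single-diode equation for one cell,
    I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh,
    by a damped fixed-point iteration (illustrative parameter values)."""
    i = iph
    for _ in range(200):
        vd = v + i * rs                        # diode junction voltage
        i_new = iph - i0 * math.expm1(vd / (n * vt)) - vd / rsh
        i = 0.5 * i + 0.5 * i_new              # damping keeps the iteration stable
    return i

def array_current(v_array, ns=36, np_=2, **cell_params):
    """Scale one cell to an ns-series x np_-parallel array: each cell sees
    v_array/ns, and np_ parallel strings multiply the current."""
    return np_ * cell_current(v_array / ns, **cell_params)
```

At short circuit the array delivers roughly np_ times the cell photocurrent, and the current falls off as the voltage approaches the open-circuit point.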

  2. Complex Systems and Human Performance Modeling

    Science.gov (United States)

    2013-12-01

constitute a cognitive architecture or decomposing the work flows and resource constraints that characterize human-system interactions, the modeler … also explored the generation of so-called "fractal" series from simple task network models where task times are calculated by way of a moving

  3. HANDOVER MANAGEABILITY AND PERFORMANCE MODELING IN

    African Journals Online (AJOL)

    SOFTLINKS DIGITAL

West African Journal of Industrial & Academic Research Vol. 10 No. 1 April … sketches out a proactive and reactive mitigation response model for individuals … valuable asset to all firms … information is shared only among … ability to ensure that a party to a contract or … organizations only react to security threats.

  4. Comparative performance of high-fidelity training models for flexible ureteroscopy: Are all models effective?

    Directory of Open Access Journals (Sweden)

    Shashikant Mishra

    2011-01-01

Full Text Available Objective: We performed a comparative study of high-fidelity training models for flexible ureteroscopy (URS). Our objective was to determine whether high-fidelity non-virtual reality (VR) models are as effective as the VR model in teaching flexible URS skills. Materials and Methods: Twenty-one trained urologists without clinical experience of flexible URS underwent dry lab simulation practice. After a warm-up period of 2 h, tasks were performed on high-fidelity non-VR models (Uro-scopic Trainer™; Endo-Urologie-Modell™) and a high-fidelity VR model (URO Mentor™). The participants were divided equally into three batches with rotation on each of the three stations for 30 min. Performance of the trainees was evaluated by an expert ureteroscopist using pass rating and global rating score (GRS). The participants rated a face validity questionnaire at the end of each session. Results: The GRS improved significantly at the evaluation performed after the second rotation (P<0.001) for batches 1, 2 and 3. Pass ratings also improved significantly for all training models when the third and first rotations were compared (P<0.05). The batch that was trained on the VR-based model showed more improvement in pass ratings on the second rotation but did not achieve statistical significance. Most of the realism domains were rated higher for the VR model as compared with the non-VR model, except the realism of the flexible endoscope. Conclusions: All the models used for training flexible URS were effective in increasing the GRS and pass ratings irrespective of VR status.

  5. An automated nowcasting model of significant instability events in the flight terminal area of Rio de Janeiro, Brazil

    Science.gov (United States)

    Borges França, Gutemberg; Valdonel de Almeida, Manoel; Rosette, Alessana C.

    2016-05-01

This paper presents a novel model, based on neural network techniques, to produce short-term and location-specific forecasts of significant instability for flights in the terminal area of Galeão Airport, Rio de Janeiro, Brazil. Twelve years of data were used for neural network training/validation and testing. Data come from four sources: (1) hourly meteorological observations from surface meteorological stations at five airports distributed around the study area; (2) atmospheric profiles collected twice a day at the meteorological station at Galeão Airport; (3) rain rate data collected from a network of 29 rain gauges in the study area; and (4) lightning data regularly collected by national detection networks. An investigation was undertaken regarding the capability of a neural network to produce early warning signs, i.e. to serve as a nowcasting tool, for significant instability events in the study area. The automated nowcasting model was evaluated with five categorical statistics, whose values for forecasts of the first, second, and third hours, respectively, are given in parentheses: proportion correct (0.99, 0.97, and 0.94), BIAS (1.10, 1.42, and 2.31), probability of detection (0.79, 0.78, and 0.67), false-alarm ratio (0.28, 0.45, and 0.73), and threat score (0.61, 0.47, and 0.25). Possible sources of error related to the test procedure are presented and discussed. The test showed that the proposed neural network can capture the physical content of the data set, and its performance for the first and second hours is quite encouraging for nowcasting significant instability events in the study area.
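
The five categorical statistics quoted above are standard scores computed from a 2x2 contingency table of forecast/observation pairs. A minimal sketch (the counts are invented, not the paper's data):

```python
def categorical_scores(hits, misses, false_alarms, correct_negatives):
    """The five verification scores used in the abstract, from a 2x2
    contingency table of forecast vs. observed instability events."""
    n = hits + misses + false_alarms + correct_negatives
    return {
        "proportion_correct": (hits + correct_negatives) / n,
        "bias": (hits + false_alarms) / (hits + misses),   # >1 means over-forecasting
        "pod": hits / (hits + misses),                     # probability of detection
        "far": false_alarms / (hits + false_alarms),       # false-alarm ratio
        "threat_score": hits / (hits + misses + false_alarms),
    }
```

Note that BIAS above 1 together with a rising false-alarm ratio, as the abstract reports for the third hour, indicates increasing over-forecasting at longer lead times.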

  6. Performance evaluation of quality monitor models in spot welding

    Institute of Scientific and Technical Information of China (English)

    Zhang Zhongdian; Li Dongqing; Wang Kai

    2005-01-01

Performance of quality monitor models in spot welding directly determines monitoring precision, so it is crucial to evaluate it. Previously, mean square error (MSE) has often been used to evaluate the performance of models, but it can only show the total error over a finite set of specimens, and cannot show whether the quality information inferred from a model is accurate and reliable enough. For this reason, by means of measurement error theory, a new way to evaluate model performance according to the error distributions is developed: the quality information inferred from a model is accurate and reliable only if the model's error distribution is sufficiently correct and precise.
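
The abstract's point, that equal total error can hide very different error distributions, can be illustrated with a toy example (the residual values are invented):

```python
import statistics

def mse(errors):
    """Mean square error over a finite set of model residuals."""
    return sum(e * e for e in errors) / len(errors)

# Two hypothetical monitor models with identical MSE:
errors_a = [0.1, -0.1, 0.1, -0.1]   # small, zero-centered errors
errors_b = [0.0, 0.0, 0.0, 0.2]     # mostly exact, but systematically biased

assert abs(mse(errors_a) - mse(errors_b)) < 1e-12  # MSE cannot tell them apart
bias_a = statistics.mean(errors_a)  # exactly 0.0: unbiased
bias_b = statistics.mean(errors_b)  # positive: a bias that MSE does not reveal
assert bias_a == 0.0 and bias_b > 0.0
```

Examining the distribution (here just its mean) separates the two models even though their MSE values coincide, which is the motivation for the distribution-based evaluation proposed in the abstract.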

  7. A unified tool for performance modelling and prediction

    Energy Technology Data Exchange (ETDEWEB)

Gilmore, Stephen [Laboratory for Foundations of Computer Science, University of Edinburgh, King's Buildings, Mayfield Road, Edinburgh, Scotland EH9 3JZ (United Kingdom)]. E-mail: stg@inf.ed.ac.uk; Kloul, Leila [Laboratory for Foundations of Computer Science, University of Edinburgh, King's Buildings, Mayfield Road, Edinburgh, Scotland EH9 3JZ (United Kingdom)

    2005-07-01

    We describe a novel performability modelling approach, which facilitates the efficient solution of performance models extracted from high-level descriptions of systems. The notation which we use for our high-level designs is the Unified Modelling Language (UML) graphical modelling language. The technology which provides the efficient representation capability for the underlying performance model is the multi-terminal binary decision diagram (MTBDD)-based PRISM probabilistic model checker. The UML models are compiled through an intermediate language, the stochastic process algebra PEPA, before translation into MTBDDs for solution. We illustrate our approach on a real-world analysis problem from the domain of mobile telephony.

  8. Modeling and optimization of LCD optical performance

    CERN Document Server

    Yakovlev, Dmitry A; Kwok, Hoi-Sing

    2015-01-01

    The aim of this book is to present the theoretical foundations of modeling the optical characteristics of liquid crystal displays, critically reviewing modern modeling methods and examining areas of applicability. The modern matrix formalisms of optics of anisotropic stratified media, most convenient for solving problems of numerical modeling and optimization of LCD, will be considered in detail. The benefits of combined use of the matrix methods will be shown, which generally provides the best compromise between physical adequacy and accuracy with computational efficiency and optimization fac

  9. Significant impacts of irrigation water sources and methods on modeling irrigation effects in the ACME Land Model

    Energy Technology Data Exchange (ETDEWEB)

    Leng, Guoyong; Leung, Lai-Yung; Huang, Maoyi

    2017-07-01

An irrigation module that considers both irrigation water sources and irrigation methods has been incorporated into the ACME Land Model (ALM). Global numerical experiments were conducted to evaluate the impacts of irrigation water sources and irrigation methods on the simulated irrigation effects. All simulations shared the same irrigation soil moisture target, constrained by a global census dataset of irrigation amounts. Irrigation has large impacts on terrestrial water balances, especially in regions with extensive irrigation. Such effects depend on the irrigation water sources: surface-water-fed irrigation leads to decreases in runoff and water table depth, while groundwater-fed irrigation increases water table depth, with positive or negative effects on runoff depending on the pumping intensity. Irrigation effects also depend significantly on the irrigation methods. Flood irrigation applies water in large volumes within short durations, resulting in much larger impacts on runoff and water table depth than drip and sprinkler irrigation. Differentiating the irrigation water sources and methods is important not only for representing the distinct pathways by which irrigation influences the terrestrial water balances, but also for estimating irrigation water use efficiency. Specifically, groundwater pumping has lower irrigation water use efficiency due to enhanced recharge rates. Different irrigation methods also affect water use efficiency, with drip irrigation the most efficient, followed by sprinkler and flood irrigation. Our results highlight the importance of explicitly accounting for irrigation sources and irrigation methods, which are the least understood and constrained aspects of modeling irrigation water demand, water scarcity and irrigation effects in Earth System Models.

  10. Ranking streamflow model performance based on Information theory metrics

    Science.gov (United States)

    Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas

    2016-04-01

Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to assess whether information theory-based metrics can be used as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing the time series as strings of symbols, where different symbols corresponded to different quantiles of the probability distribution of streamflow. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series: streamflow was less random and more complex than precipitation, reflecting the fact that the watershed acts as an information filter in the hydrologic conversion process from precipitation to streamflow. The Nash-Sutcliffe efficiency increased as model complexity increased, but in many cases several models had efficiency values that were not statistically distinguishable from each other. In such cases, ranking models by the closeness of the information theory-based metrics of simulated and measured streamflow time series can provide an additional criterion for the evaluation of hydrologic model performance.
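
The symbolization step and one of the three metrics (mean information gain, computed as a difference of block entropies) can be sketched as follows; the 4-symbol alphabet size and the equal-count quantile binning are assumptions for illustration:

```python
from collections import Counter
from math import log2

def symbolize(series, n_symbols=4):
    """Map each value to the index of its quantile bin (equal-count bins)."""
    ranked = sorted(series)
    def symbol(x):
        k = sum(1 for r in ranked if r <= x)          # rank of x in the series
        return min(n_symbols - 1, (k - 1) * n_symbols // len(ranked))
    return [symbol(x) for x in series]

def block_entropy(symbols, block_len):
    """Shannon entropy (bits) of overlapping blocks of length block_len."""
    blocks = [tuple(symbols[i:i + block_len])
              for i in range(len(symbols) - block_len + 1)]
    counts = Counter(blocks)
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values())

def mean_information_gain(symbols, block_len=1):
    """H(block_len + 1) - H(block_len): average new information per symbol."""
    return block_entropy(symbols, block_len + 1) - block_entropy(symbols, block_len)
```

A perfectly periodic series yields a mean information gain near zero (each symbol is predictable from the previous one), while a random series approaches the full log2(alphabet size) bits per symbol.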

  11. Hydrologic Evaluation of Landfill Performance (HELP) Model

    Science.gov (United States)

    The program models rainfall, runoff, infiltration, and other water pathways to estimate how much water builds up above each landfill liner. It can incorporate data on vegetation, soil types, geosynthetic materials, initial moisture conditions, slopes, etc.
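
The kind of bookkeeping such a water-balance model performs can be illustrated with a drastically simplified daily bucket model; the fixed runoff fraction, constant evapotranspiration demand, and overflow percolation below are illustrative assumptions, not HELP's actual algorithms:

```python
def daily_water_balance(rain, soil_capacity=100.0, runoff_frac=0.2, et_rate=3.0):
    """Toy daily water balance for a landfill cover (all depths in mm):
    a fraction of rain runs off, the rest infiltrates; a constant ET demand
    is removed; storage above capacity percolates down toward the liner.
    Returns the daily percolation series."""
    storage, percolation = 0.0, []
    for r in rain:
        infiltration = (1.0 - runoff_frac) * r
        storage = max(0.0, storage + infiltration - et_rate)
        perc = max(0.0, storage - soil_capacity)   # overflow reaches the liner
        storage -= perc
        percolation.append(perc)
    return percolation
```

Even this crude sketch shows the qualitative behavior the program quantifies: no water reaches the liner until the cover soil's storage fills, after which percolation tracks net infiltration.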

  12. Integrated thermodynamic model for ignition target performance

    Directory of Open Access Journals (Sweden)

    Springer P.T.

    2013-11-01

Full Text Available We have derived a 3-dimensional synthetic model for NIF implosion conditions by predicting and optimizing fits to a broad set of x-ray and nuclear diagnostics obtained on each shot. By matching x-ray images, burn width, neutron time-of-flight ion temperature, yield, and fuel ρr, we obtain nearly unique constraints on conditions in the hotspot and fuel in a model that is entirely consistent with the observables. This model allows us to determine hotspot density, pressure, areal density (ρr), total energy, and other ignition-relevant parameters not available from any single diagnostic. This article describes the model and its application to National Ignition Facility (NIF) tritium-hydrogen-deuterium (THD) and DT implosion data, and provides an explanation for the large yield and ρr degradation compared to numerical code predictions.

  13. Modeling logistic performance in quantitative microbial risk assessment.

    Science.gov (United States)

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times-mutually dependent in successive steps in the chain-cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
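
A minimal discrete-event sketch of the kind of logistic queue described (FIFO shelf stock, periodic replenishment, Poisson demand), with invented parameters; a QMRA would feed the resulting storage-time distribution, especially its tail, into a microbial growth model:

```python
import heapq
import random

def simulate_shelf(sim_hours=500.0, order_qty=20, reorder_interval=24.0,
                   demand_rate=0.7, seed=1):
    """Toy discrete-event model of a retail shelf (hypothetical parameters):
    a batch of order_qty items arrives every reorder_interval hours, customers
    arrive as a Poisson process, and items are sold first-in-first-out.
    Returns the storage time of each item sold."""
    rng = random.Random(seed)
    shelf = []            # FIFO queue of item arrival times
    storage_times = []
    events = []           # priority queue of (time, kind) events
    t = 0.0
    while t < sim_hours:
        heapq.heappush(events, (t, "delivery"))
        t += reorder_interval
    t = rng.expovariate(demand_rate)
    while t < sim_hours:
        heapq.heappush(events, (t, "demand"))
        t += rng.expovariate(demand_rate)
    while events:
        now, kind = heapq.heappop(events)
        if kind == "delivery":
            shelf.extend([now] * order_qty)
        elif shelf:                         # demand is met only if stock remains
            storage_times.append(now - shelf.pop(0))
    return storage_times
```

Because successive storage steps in a chain are coupled through such queues, the tail of the storage-time distribution here differs from what independent per-step storage-time draws would give, which is the abstract's argument for discrete-event modeling.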

  14. Manufacturing Excellence Approach to Business Performance Model

    Directory of Open Access Journals (Sweden)

    Jesus Cruz Alvarez

    2015-03-01

Full Text Available Six Sigma, lean manufacturing, total quality management, quality control, and quality function deployment are the fundamental set of tools to enhance productivity in organizations. Some research outlines the benefit of each tool in the particular context of a firm's productivity, but not in the broader context of a firm's competitiveness, which is achieved through business performance. The aim of this theoretical research paper is to contribute to this end and propose a manufacturing excellence approach that links productivity tools to the broader context of business performance.

  15. Safety performance models for urban intersections in Brazil.

    Science.gov (United States)

    Barbosa, Heloisa; Cunto, Flávio; Bezerra, Bárbara; Nodari, Christine; Jacques, Maria Alice

    2014-09-01

This paper presents a modeling effort for developing safety performance models (SPMs) for urban intersections in three major Brazilian cities. The proposed methodology for calibrating SPMs comprises the following steps: defining the safety study objective, choosing predictive variables and sample size, data acquisition, defining model expression and model parameters, and model evaluation. Among the predictive variables explored in the calibration phase were exposure variables (AADT), number of lanes, number of approaches, and central median status. SPMs were obtained for three cities: Fortaleza, Belo Horizonte and Brasília. The SPMs developed for signalized intersections in Fortaleza and Belo Horizonte had the same structure and the same most significant independent variables, namely AADT entering the intersection and number of lanes; in addition, the coefficients of the best models were in the same range of values. For Brasília, because of the sample size, the signalized and unsignalized intersections were grouped, and the AADT was split into minor and major approaches, which were the most significant variables. This paper also evaluated SPM transferability to other jurisdictions. The SPMs for signalized intersections from Fortaleza and Belo Horizonte were recalibrated (in terms of the calibration factor Cx) to the city of Porto Alegre. The models were adjusted following the Highway Safety Manual (HSM) calibration procedure and yielded Cx values of 0.65 and 2.06 for the Fortaleza and Belo Horizonte SPMs, respectively. This paper presents the experience and future challenges of initiatives on the development of SPMs in Brazil, which can serve as a guide for other countries at the same stage in this subject.
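
The HSM-style recalibration mentioned above computes a calibration factor as the ratio of observed to SPM-predicted crash counts. A sketch with a hypothetical SPM functional form and made-up coefficients (not the paper's fitted models):

```python
import math

def spm_predicted_crashes(aadt, n_lanes, a=-8.0, b=0.8, c=0.15):
    """Hypothetical SPM of the usual regression form for intersections:
    N = exp(a) * AADT^b * exp(c * lanes). Coefficients are illustrative only."""
    return math.exp(a) * aadt ** b * math.exp(c * n_lanes)

def calibration_factor(observed_counts, sites):
    """HSM-style calibration factor: Cx = sum(observed) / sum(predicted),
    where sites is a list of (AADT, number_of_lanes) tuples."""
    predicted = sum(spm_predicted_crashes(aadt, lanes) for aadt, lanes in sites)
    return sum(observed_counts) / predicted
```

A Cx near 1 means the transferred SPM fits the new jurisdiction as-is; values like the 2.06 reported for the Belo Horizonte model indicate it underpredicts crashes at the new sites by roughly a factor of two.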

  16. Developing an Energy Performance Modeling Startup Kit

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2012-10-01

    In 2011, the NAHB Research Center began assessing the needs and motivations of residential remodelers regarding energy performance remodeling. This report outlines: the current remodeling industry and the role of energy efficiency; gaps and barriers to adding energy efficiency into remodeling; and support needs of professional remodelers to increase sales and projects involving improving home energy efficiency.

  17. Forecasting Performance of Asymmetric GARCH Stock Market Volatility Models

    Directory of Open Access Journals (Sweden)

    Hojin Lee

    2009-12-01

Full Text Available We investigate the asymmetry between positive and negative returns in their effect on the conditional variance of the stock market index and incorporate these characteristics to form an out-of-sample volatility forecast. Contrary to prior evidence, however, the results in this paper suggest that no asymmetric GARCH model is superior to the basic GARCH(1,1) model. It is our prior knowledge that, for equity returns, it is unlikely that positive and negative shocks have the same impact on volatility. In order to reflect this intuition, we implement three diagnostic tests for volatility models (the Sign Bias Test, the Negative Size Bias Test, and the Positive Size Bias Test) and the tests against the alternatives of QGARCH and GJR-GARCH. The asymmetry test results indicate that the sign and the size of the unexpected return shock do not influence current volatility differently, which contradicts our presumption that there are asymmetric effects in stock market volatility. This result is in line with various diagnostic tests which are designed to determine whether the GARCH(1,1) volatility estimates adequately represent the data. The diagnostic tests in section 2 indicate that the GARCH(1,1) model for weekly KOSPI returns is robust to the misspecification test. We also investigate two representative asymmetric GARCH models, the QGARCH and GJR-GARCH models, for out-of-sample forecasting performance. The out-of-sample forecasting ability test reveals that no single model clearly outperforms the others. The GJR-GARCH and QGARCH models give mixed results in forecasting ability on all four criteria across all forecast horizons considered. Also, the predictive accuracy test of Diebold and Mariano based on both absolute and squared prediction errors suggests that the forecasts from the linear and asymmetric GARCH models need not be significantly different from each other.
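
The variance recursions being compared can be sketched directly; the parameter values below are illustrative, and a real application would estimate them by maximum likelihood:

```python
def garch11_variance(returns, omega=1e-6, alpha=0.08, beta=0.9):
    """Symmetric GARCH(1,1): sigma2_t = omega + alpha*r_{t-1}^2 + beta*sigma2_{t-1}.
    The recursion starts at the unconditional variance omega/(1 - alpha - beta)."""
    sigma2 = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2

def gjr_variance(returns, omega=1e-6, alpha=0.05, gamma=0.06, beta=0.9):
    """GJR-GARCH: an extra gamma*r^2 term fires only after negative returns,
    so bad news raises next-period variance more than good news (leverage effect)."""
    sigma2 = [omega / (1.0 - alpha - gamma / 2.0 - beta)]
    for r in returns[:-1]:
        leverage = gamma * r * r if r < 0 else 0.0
        sigma2.append(omega + alpha * r * r + leverage + beta * sigma2[-1])
    return sigma2
```

The symmetric model assigns the same next-period variance to a +2% and a -2% return, while the GJR form assigns a larger variance after the negative shock; the paper's finding is that, for weekly KOSPI returns, this extra flexibility does not yield significantly better forecasts.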

  18. Inconsistent strategies to spin up models in CMIP5: implications for ocean biogeochemical model performance assessment

    Science.gov (United States)

    Seferian, R.; Gehlen, M.; Bopp, L.; Resplandy, L.; Orr, J. C.; Marti, O.

    2016-12-01

During the fifth phase of the Coupled Model Intercomparison Project (CMIP5), substantial efforts were made to systematically assess the skill of Earth system models against available modern observations. However, most of these skill-assessment approaches can be considered "blind", given that they were applied without considering models' specific characteristics and treat models a priori as independent of observations. Indeed, since these models are typically initialized from observations, the spin-up procedure (e.g. the length of time for which the model has been run since initialization, and therefore the degree to which it has approached its own equilibrium) has the potential to exert significant control over the skill-assessment metrics calculated for each model. Here, we explore how the large diversity in spin-up protocols used for marine biogeochemistry in CMIP5 Earth system models (ESMs) contributes to model-to-model differences in the simulated fields. We focus on the amplification of biases in selected biogeochemical fields (O2, NO3, Alk-DIC) as a function of spin-up duration in a dedicated 500-year-long spin-up simulation performed with IPSL-CM5A-LR, as well as in an ensemble of 24 CMIP5 ESMs. We demonstrate that a relationship between spin-up duration and skill-assessment metrics emerges from the results of a single model and holds when confronted with a larger ensemble of CMIP5 models. This shows that drift in biogeochemical fields has implications for performance assessment, in addition to possibly influencing estimates of climate change impact. Our study suggests that differences in spin-up protocols could explain a substantial part of model disparities, constituting a source of model-to-model uncertainty. This requires more attention in future model intercomparison exercises in order to provide quantitatively more reliable ESM results on marine biogeochemistry and carbon cycle feedbacks.
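
The dependence of skill metrics on spin-up duration can be illustrated with a toy drift model, in which a field initialized from observations relaxes toward the model's own (biased) equilibrium; all numbers below are invented:

```python
import math

def simulated_field(spinup_years, initial=1.0, equilibrium=0.7, tau=150.0):
    """Toy drift: a biogeochemical field initialized at the observed value (1.0)
    relaxes exponentially toward the model's own equilibrium (0.7) with
    e-folding time tau (years)."""
    return equilibrium + (initial - equilibrium) * math.exp(-spinup_years / tau)

def bias_vs_obs(spinup_years, obs=1.0):
    """Absolute bias against the observation used for initialization."""
    return abs(simulated_field(spinup_years) - obs)
```

Under this caricature, a short spin-up scores well simply because the model has not yet drifted away from its observed initial state, while a long spin-up exposes the model's true equilibrium bias, which is the confounding effect the abstract describes.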

  19. Duct thermal performance models for large commercial buildings

    Energy Technology Data Exchange (ETDEWEB)

    Wray, Craig P.

    2003-10-01

Despite the potential for significant energy savings by reducing duct leakage or other thermal losses from duct systems in large commercial buildings, California Title 24 has no provisions to credit energy-efficient duct systems in these buildings. A substantial reason is the lack of readily available simulation tools to demonstrate the energy-saving benefits associated with efficient duct systems in large commercial buildings. The overall goal of the Efficient Distribution Systems (EDS) project within the PIER High Performance Commercial Building Systems Program is to bridge the gaps in current duct thermal performance modeling capabilities, and to expand our understanding of duct thermal performance in California large commercial buildings. As steps toward this goal, our strategy in the EDS project involves two parts: (1) developing a whole-building energy simulation approach for analyzing duct thermal performance in large commercial buildings, and (2) using the tool to identify the energy impacts of duct leakage in California large commercial buildings, in support of future recommendations to address duct performance in the Title 24 Energy Efficiency Standards for Nonresidential Buildings. The specific technical objectives for the EDS project were to: (1) Identify a near-term whole-building energy simulation approach that can be used in the impacts analysis task of this project (see Objective 3), with little or no modification. A secondary objective is to recommend how to proceed with long-term development of an improved compliance tool for Title 24 that addresses duct thermal performance. (2) Develop an Alternative Calculation Method (ACM) change proposal to include a new metric for thermal distribution system efficiency in the reporting requirements for the 2005 Title 24 Standards. The metric will facilitate future comparisons of different system types using a common "yardstick". (3) Using the selected near-term simulation approach

  20. Space Station Freedom electrical performance model

    Science.gov (United States)

    Hojnicki, Jeffrey S.; Green, Robert D.; Kerslake, Thomas W.; Mckissock, David B.; Trudell, Jeffrey J.

    1993-01-01

    The baseline Space Station Freedom electric power system (EPS) employs photovoltaic (PV) arrays and nickel hydrogen (NiH2) batteries to supply power to housekeeping and user electrical loads via a direct current (dc) distribution system. The EPS was originally designed for an operating life of 30 years through orbital replacement of components. As the design and development of the EPS continues, accurate EPS performance predictions are needed to assess design options, operating scenarios, and resource allocations. To meet these needs, NASA Lewis Research Center (LeRC) has, over a 10 year period, developed SPACE (Station Power Analysis for Capability Evaluation), a computer code designed to predict EPS performance. This paper describes SPACE, its functionality, and its capabilities.

  1. Everyday cognitive functioning in cardiac patients: relationships between self-report, report of a significant other and cognitive test performance.

    Science.gov (United States)

    Elliott, Peter C; Smith, Geoff; Ernest, Christine S; Murphy, Barbara M; Worcester, Marian U C; Higgins, Rosemary O; Le Grande, Michael R; Goble, Alan J; Andrewes, David; Tatoulis, James

    2010-01-01

    Candidates for cardiac bypass surgery often experience cognitive decline. Such decline is likely to affect their everyday cognitive functioning. The aim of the present study was to compare cardiac patients' ratings of their everyday cognitive functioning against significant others' ratings and selected neuropsychological tests. Sixty-nine patients completed a battery of standardised cognitive tests. Patients and significant others also completed the Everyday Function Questionnaire independently of each other. Patient and significant other ratings of patients' everyday cognitive difficulties were found to be similar. Despite the similarities in ratings of difficulties, some everyday cognitive tasks were attributed to different processes. Patients' and significant others' ratings were most closely associated with the neuropsychological test of visual memory. Tests of the patients' verbal memory and fluency were only related to significant others' ratings. Test scores of attention and planning were largely unrelated to ratings by either patients or their significant others.

  2. Manufacturing Excellence Approach to Business Performance Model

    OpenAIRE

    Jesus Cruz Alvarez; Carlos Monge Perry

    2015-01-01

Six Sigma, lean manufacturing, total quality management, quality control, and quality function deployment are the fundamental set of tools to enhance productivity in organizations. There is some research that outlines the benefit of each tool in the particular context of a firm's productivity, but not in the broader context of a firm's competitiveness that is achieved through business performance. The aim of this theoretical research paper is to contribute to this end and propose a manufacturing ex...

  3. An Outline Course on Human Performance Modeling

    Science.gov (United States)

    2006-01-01

complementary or competing tasks: Dario Salvucci … 46. Bonnie John, David Kieras 47. ecological interface design 48. More into modeling human … Alarcon 70. Ben Knott 71. Evelyn Rozanski … Pete Khooshabeh. Optional: If you would like to be on a mailing list for further seminars please enter your email

  4. Persistence Modeling for Assessing Marketing Strategy Performance

    NARCIS (Netherlands)

    M.G. Dekimpe (Marnik); D.M. Hanssens (Dominique)

    2003-01-01

    textabstractThe question of long-run market response lies at the heart of any marketing strategy that tries to create a sustainable competitive advantage for the firm or brand. A key challenge, however, is that only short-run results of marketing actions are readily observable. Persistence modeling

  6. An Empirical Study of a Solo Performance Assessment Model

    Science.gov (United States)

    Russell, Brian E.

    2015-01-01

    The purpose of this study was to test a hypothesized model of solo music performance assessment. Specifically, this study investigates the influence of technique and musical expression on perceptions of overall performance quality. The Aural Musical Performance Quality (AMPQ) measure was created to measure overall performance quality, technique,…

  7. Modeling the marketing strategy-performance relationship : towards an hierarchical marketing performance framework

    NARCIS (Netherlands)

    Huizingh, Eelko K.R.E.; Zengerink, Evelien

    2001-01-01

    Accurate measurement of marketing performance is an important topic for both marketing academics and marketing managers. Many researchers have recognized that marketing performance measurement should go beyond financial measurement. In this paper we propose a conceptual framework that models

  9. Dual learning processes underlying human decision-making in reversal learning tasks: Functional significance and evidence from the model fit to human behavior

    Directory of Open Access Journals (Sweden)

    Yu eBai

    2014-08-01

    Humans are capable of correcting their actions based on actions performed in the past, and this ability enables them to adapt to a changing environment. The computational field of reinforcement learning (RL) has provided a powerful explanation for understanding such processes. Recently, the dual learning system, modeled as a hybrid model that incorporates value update based on reward-prediction error and learning-rate modulation based on a surprise signal, has gained attention as a model for explaining various neural signals. However, the functional significance of the hybrid model has not been established. In the present study, we used computer simulation of a reversal learning task to address functional significance. The hybrid model was found to perform better than the standard RL model over a large parameter range. These results suggest that the hybrid model is more robust against mistuning of parameters compared with the standard RL model when decision makers continue to learn stimulus-reward contingencies that change abruptly. The parameter-fitting results also indicated that the hybrid model fit better than the standard RL model for more than 50% of the participants, which suggests that the hybrid model has more explanatory power for the behavioral data than the standard RL model.
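    The dual update rule described in the abstract can be sketched in a few lines. This is a generic illustration of a surprise-modulated ("hybrid") learner versus a fixed-learning-rate learner in a two-option reversal task; all parameter values and the Pearce-Hall-style associability rule are illustrative assumptions, not the paper's exact model.

```python
import random

def run_reversal_task(alpha_fixed=0.3, eta=0.3, kappa=1.0, n_trials=400, seed=0):
    """Compare a fixed-learning-rate RL model with a hybrid model whose
    learning rate is modulated by a surprise signal (Pearce-Hall style).
    Parameter values are illustrative only."""
    rng = random.Random(seed)
    p_reward = 0.8            # reward probability of the "good" option
    good = 0                  # identity of the good option; reversed halfway
    v_std = [0.5, 0.5]        # option values, standard model
    v_hyb = [0.5, 0.5]        # option values, hybrid model
    assoc = [1.0, 1.0]        # associability (surprise-tracking learning rate)
    correct_std = correct_hyb = 0
    for t in range(n_trials):
        if t == n_trials // 2:
            good = 1 - good   # abrupt change in stimulus-reward contingency
        # greedy choices for both models
        c_std = 0 if v_std[0] >= v_std[1] else 1
        c_hyb = 0 if v_hyb[0] >= v_hyb[1] else 1
        correct_std += (c_std == good)
        correct_hyb += (c_hyb == good)
        for model, c in (("std", c_std), ("hyb", c_hyb)):
            p = p_reward if c == good else 1 - p_reward
            r = 1.0 if rng.random() < p else 0.0
            if model == "std":
                v_std[c] += alpha_fixed * (r - v_std[c])      # fixed-rate update
            else:
                delta = r - v_hyb[c]                          # prediction error
                v_hyb[c] += kappa * assoc[c] * delta          # surprise-scaled update
                assoc[c] += eta * (abs(delta) - assoc[c])     # update associability
    return correct_std / n_trials, correct_hyb / n_trials
```

    The point of the sketch is the structural difference: the hybrid learner raises its effective learning rate when prediction errors are large (e.g. right after the reversal) and lowers it once the contingency is re-learned.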

  10. Clinical laboratory as an economic model for business performance analysis.

    Science.gov (United States)

    Buljanović, Vikica; Patajac, Hrvoje; Petrovecki, Mladen

    2011-08-15

    To perform SWOT (strengths, weaknesses, opportunities, and threats) analysis of a clinical laboratory as an economic model that may be used to improve business performance of laboratories by removing weaknesses, minimizing threats, and using external opportunities and internal strengths. The impact of possible threats and weaknesses on the business performance of the Clinical Laboratory at Našice General County Hospital, and the use of strengths and opportunities to improve operating profit, were simulated using models created on the basis of SWOT analysis results. The operating profit as a measure of profitability of the clinical laboratory was defined as total revenue minus total expenses and presented using a profit and loss account. Changes in the input parameters in the profit and loss account for 2008 were determined using opportunities and potential threats, and economic sensitivity analysis was made by using changes in the key parameters. The profit and loss account and economic sensitivity analysis were tools for quantifying the impact of changes in the revenues and expenses on the business operations of the clinical laboratory. Results of simulation models showed that operational profit of €470 723 in 2008 could be reduced to only €21 542 if all possible threats became a reality and current weaknesses remained the same. Also, operational gain could be increased to €535 804 if laboratory strengths and opportunities were utilized. If both the opportunities and threats became a reality, the operational profit would decrease by €384 465. The operational profit of the clinical laboratory could be significantly reduced if all threats became a reality and the current weaknesses remained the same. The operational profit could be increased by utilizing strengths and opportunities as much as possible. This type of modeling may be used to monitor business operations of any clinical laboratory and improve its financial situation by implementing changes in the next fiscal
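    The profit-and-loss arithmetic behind this kind of simulation is straightforward to reproduce. The sketch below uses invented revenue and expense items (chosen only so that the baseline equals the reported €470 723; the item breakdown and the percentage changes are hypothetical), and mirrors just the structure described in the abstract: operating profit as total revenue minus total expenses, plus a sensitivity step applying percentage changes to key parameters.

```python
def operating_profit(revenues, expenses):
    """Operating profit = total revenue - total expenses (profit & loss account)."""
    return sum(revenues.values()) - sum(expenses.values())

def sensitivity(revenues, expenses, changes):
    """Apply fractional changes (e.g. {'reagents': 0.15} = +15%) to P&L items
    and return the resulting operating profit. Item names are illustrative."""
    rev = dict(revenues)
    exp = dict(expenses)
    for item, pct in changes.items():
        if item in rev:
            rev[item] *= 1 + pct
        if item in exp:
            exp[item] *= 1 + pct
    return operating_profit(rev, exp)

# Hypothetical breakdown, scaled so the baseline matches the reported figure.
revenues = {"tests_insured": 900_000, "tests_private": 150_000}
expenses = {"reagents": 350_000, "salaries": 200_000, "equipment": 29_277}
base = operating_profit(revenues, expenses)   # 470_723 in this toy setup
threat = sensitivity(revenues, expenses,
                     {"tests_insured": -0.20, "reagents": 0.15})
```

    A "threat" scenario (fewer insured tests, costlier reagents) then quantifies how far the operating profit falls, which is exactly the kind of question the study's sensitivity analysis answers.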

  12. Modeling the Mechanical Performance of Die Casting Dies

    Energy Technology Data Exchange (ETDEWEB)

    R. Allen Miller

    2004-02-27

    The following report covers work performed at Ohio State on modeling the mechanical performance of dies. The focus of the project was the development and, particularly, verification of finite element techniques used to model and predict displacements and stresses in die casting dies. The work entails a major case study performed with an industrial partner on a production die and laboratory experiments performed at Ohio State.

  13. ICT evaluation models and performance of medium and small enterprises

    Directory of Open Access Journals (Sweden)

    Bayaga Anass

    2014-01-01

    Building on prior research related to (1) the impact of information communication technology (ICT) and (2) operational risk management (ORM) in the context of medium and small enterprises (MSEs), the focus of this study was to investigate the relationship between (1) ICT operational risk management (ORM) and (2) the performance of MSEs. To achieve this focus, the research investigated evaluation models for understanding the value of ICT ORM in MSEs. Multiple regression, repeated-measures analysis of variance (RM-ANOVA) and repeated-measures multivariate analysis of variance (RM-MANOVA) were performed. The findings revealed that only one variable made a significant percentage contribution to the level of ICT operation in MSEs, the Payback method (β = 0.410, p < .000). It may thus be inferred that the Payback method is the prominent variable explaining the variation in evaluation models affecting ICT adoption within MSEs. Conclusively, in answering the two questions, (1) degree of variability explained and (2) predictors, the results revealed that the variable contributed approximately 88.4% of the variations in evaluation models affecting ICT adoption within MSEs. The analysis of variance also revealed that the regression coefficients were real and did not occur by chance.
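    Since the Payback method emerges as the dominant evaluation model, a minimal sketch of it may help. The payback period is the time until cumulative cash inflows recover the initial ICT investment; the cash-flow figures below are hypothetical.

```python
def payback_period(initial_cost, annual_cash_flows):
    """Payback method: years until cumulative inflows recover the initial
    investment, interpolating a fractional year within the recovery year.
    Returns None if the investment is never recovered."""
    cumulative = 0.0
    for year, cash in enumerate(annual_cash_flows, start=1):
        if cumulative + cash >= initial_cost:
            # fraction of this year needed to cover the remaining balance
            return year - 1 + (initial_cost - cumulative) / cash
        cumulative += cash
    return None

# e.g. a 10,000 ICT investment returning 4,000 / 4,000 / 3,000 / 3,000 per year
# is recovered two thirds of the way through year 3.
period = payback_period(10_000, [4_000, 4_000, 3_000, 3_000])
```

    Its appeal for MSEs is exactly this simplicity: a single intuitive number, at the cost of ignoring cash flows after recovery and the time value of money.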

  14. Developing Performance Management in State Government: An Exploratory Model for Danish State Institutions

    DEFF Research Database (Denmark)

    Nielsen, Steen; Rikhardsson, Pall M.

    management model. The findings are built on a questionnaire study of 45 high-level accounting officers in central governmental institutions. Our statistical model consists of five explored constructs: improvements; initiatives and reforms, incentives and contracts, the use of management accounting practices......, and cost allocations and their relations to performance management. Findings based on structural equation modelling and partial least squares regression (PLS) indicate a positive effect on the latent dependent variable, called performance management results. The models/theories explain a significant...

  15. Developing an Energy Performance Modeling Startup Kit

    Energy Technology Data Exchange (ETDEWEB)

    Wood, A.

    2012-10-01

    In 2011, the NAHB Research Center began the first part of the multi-year effort by assessing the needs and motivations of residential remodelers regarding energy performance remodeling. The scope is multifaceted - all perspectives will be sought related to remodeling firms ranging in size from small-scale, sole proprietor to national. This will allow the Research Center to gain a deeper understanding of the remodeling and energy retrofit business and the needs of contractors when offering energy upgrade services. To determine the gaps and the motivation for energy performance remodeling, the NAHB Research Center conducted (1) an initial series of focus groups with remodelers at the 2011 International Builders' Show, (2) a second series of focus groups with remodelers at the NAHB Research Center in conjunction with the NAHB Spring Board meeting in DC, and (3) quantitative market research with remodelers based on the findings from the focus groups. The goal was threefold, to: Understand the current remodeling industry and the role of energy efficiency; Identify the gaps and barriers to adding energy efficiency into remodeling; and Quantify and prioritize the support needs of professional remodelers to increase sales and projects involving improving home energy efficiency. This report outlines all three of these tasks with remodelers.

  16. Significant Performance Advantage of Electroabsorption Modulator Integrated Distributed Feedback Laser (EML) Transmitter in Transporting Multicarrier QAM Signals

    NARCIS (Netherlands)

    Chand, Naresh; Bakker, Laurens; Veen, van Dora; Yadvish, Robert D.

    2001-01-01

    Data are presented that show that, for transporting quadrature amplitude modulated (QAM) radiofrequency (RF) subcarriers in a suboctave frequency range, electroabsorption modulator integrated distributed feedback lasers (EMLs) can be modulated with significantly higher (2.5 times) modulation index wit

  17. Performance of turbulence models for transonic flows in a diffuser

    Science.gov (United States)

    Liu, Yangwei; Wu, Jianuo; Lu, Lipeng

    2016-09-01

    Eight turbulence models frequently used in aerodynamics have been employed in detailed numerical investigations of transonic flows in the Sajben diffuser, to assess the predictive capabilities of the turbulence models for shock wave/turbulent boundary layer interactions (SWTBLI) in internal flows. The eight turbulence models include: the Spalart-Allmaras model, the standard k-ε model, the RNG k-ε model, the realizable k-ε model, the standard k-ω model, the SST k-ω model, the v2-f model and the Reynolds stress model. The performance of the different turbulence models has been systematically assessed by comparing the numerical results with the available experimental data. The comparisons show that the predictive performance becomes worse as the shock wave becomes stronger. The v2-f model and the SST k-ω model perform much better than the other models; the SST k-ω model predicts a little better than the v2-f model for pressure on walls and velocity profiles, whereas the v2-f model predicts a little better than the SST k-ω model for separation location, reattachment location and separation length in the strong shock case.

  18. Determinants of business model performance in software firms

    OpenAIRE

    Rajala, Risto

    2009-01-01

    The antecedents and consequences of business model design have gained increasing interest among information system (IS) scholars and business practitioners alike. Based on an extensive literature review and empirical research, this study investigates the factors that drive business model design and the performance effects generated by the different kinds of business models in software firms. The main research question is: “What are the determinants of business model performance in the softwar...

  19. High Performance Geostatistical Modeling of Biospheric Resources

    Science.gov (United States)

    Pedelty, J. A.; Morisette, J. T.; Smith, J. A.; Schnase, J. L.; Crosier, C. S.; Stohlgren, T. J.

    2004-12-01

    We are using parallel geostatistical codes to study spatial relationships among biospheric resources in several study areas. For example, spatial statistical models based on large- and small-scale variability have been used to predict species richness of both native and exotic plants (hot spots of diversity) and patterns of exotic plant invasion. However, broader use of geostatistics in natural resource modeling, especially at regional and national scales, has been limited due to the large computing requirements of these applications. To address this problem, we implemented parallel versions of the kriging spatial interpolation algorithm. The first uses the Message Passing Interface (MPI) in a master/slave paradigm on an open source Linux Beowulf cluster, while the second is implemented with the new proprietary Xgrid distributed processing system on an Xserve G5 cluster from Apple Computer, Inc. These techniques are proving effective and provide the basis for a national decision support capability for invasive species management that is being jointly developed by NASA and the US Geological Survey.

  20. Significance of hydrological model choice and land use changes when doing climate change impact assessment

    Science.gov (United States)

    Bjørnholt Karlsson, Ida; Obel Sonnenborg, Torben; Refsgaard, Jens Christian; Høgh Jensen, Karsten

    2014-05-01

    Uncertainty in impact studies arises from Global Climate Models (GCM), emission projections, statistical downscaling, Regional Climate Models (RCM), hydrological models and calibration techniques (Refsgaard et al. 2013). Some of these uncertainties have been evaluated several times in the literature; however, few studies have investigated the effect of hydrological model choice on the assessment results (Boorman & Sefton 1997; Jiang et al. 2007; Bastola et al. 2011). These studies have found that model choice results in large differences, up to 70%, in the predicted discharge changes depending on the climate input. The objective of the study is to investigate the impact of climate change on the hydrology of the Odense catchment, Denmark, in response to (a) different climate projections (GCM-RCM combinations); (b) different hydrological models and (c) different land use scenarios. This includes: 1. Separation of the climate model signal, the hydrological model signal and the land use signal; 2. How do the different hydrological components react under different climate and land use conditions for the different models; 3. What land use scenario seems to provide the best adaptation for the challenges of the different future climate change scenarios from a hydrological perspective? Four climate models from the ENSEMBLES project (Hewitt & Griggs 2004) are used: ECHAM5 - HIRHAM5, ECHAM5 - RCA3, ARPEGE - RM5.1 and HadCM3 - HadRM3, assessing the climate change impact in three periods: 1991-2010 (present), 2041-2060 (near future) and 2081-2100 (far future). The four climate models are used in combination with three hydrological models with different conceptual layouts: NAM, SWAT and MIKE SHE. Bastola, S., C. Murphy and J. Sweeney (2011). "The role of hydrological modelling uncertainties in climate change impact assessments of Irish river catchments." Advances in Water Resources 34: 562-576. Boorman, D. B. and C. E. M. Sefton (1997). "Recognising the uncertainty in the

  1. Advanced Performance Modeling with Combined Passive and Active Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Dovrolis, Constantine [Georgia Inst. of Technology, Atlanta, GA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-04-15

    To improve the efficiency of resource utilization and scheduling of scientific data transfers on high-speed networks, the "Advanced Performance Modeling with combined passive and active monitoring" (APM) project investigates and models a general-purpose, reusable and expandable network performance estimation framework. The predictive estimation model and the framework will be helpful in optimizing the performance and utilization of networks as well as sharing resources with predictable performance for scientific collaborations, especially in data intensive applications. Our prediction model utilizes historical network performance information from various network activity logs as well as live streaming measurements from network peering devices. Historical network performance information is used without putting extra load on the resources by active measurement collection. Performance measurements collected by active probing are used judiciously to improve the accuracy of predictions.

  2. WISC-R Verbal and Performance IQ Discrepancy in an Unselected Cohort: Clinical Significance and Longitudinal Stability.

    Science.gov (United States)

    Moffitt, Terrie E.; Silva, P. A.

    1987-01-01

    Examined children whose Wechsler Intelligence Scale for Children-Revised (WISC-R) verbal and performance Intelligence Quotient discrepancies placed them beyond the 90th percentile. Longitudinal study showed 23 percent of the discrepant cases to be discrepant at two or more ages. Studied frequency of perinatal difficulties, early childhood…

  3. Comparative performance of Fungichrom I, Candifast and API 20C Aux systems in the identification of clinically significant yeasts.

    Science.gov (United States)

    Gündeş, S G; Gulenc, S; Bingol, R

    2001-12-01

    To compare the performance of current chromogenic yeast identification methods, three commercial systems (API 20C Aux, Fungichrom I and Candifast) were evaluated in parallel, along with conventional tests to identify yeasts commonly isolated in this clinical microbiology laboratory. In all, 116 clinical isolates, (68 Candida albicans, 12 C. parapsilosis, 12 C. glabrata and 24 other yeasts) were tested. Germ-tube production, microscopical morphology and other conventional methods were used as standards to definitively identify yeast isolates. The percentage of isolates identified correctly varied between 82.7% and 95.6%. Overall, the performance obtained with Fungichrom I was highest with 95.6% identification (111 of 116 isolates). The performance of API 20C Aux was higher with 87% (101 of 116 isolates) than that of Candifast with 82.7% (96 of 116). The Fungichrom I method was found to be rapid, as 90% of strains were identified after incubation for 24 h at 30 degrees C. Both of the chromogenic yeast identification systems provided a simple, accurate alternative to API 20C Aux and conventional assimilation methods for the rapid identification of most commonly encountered isolates of Candida spp. Fungichrom seemed to be the most appropriate system for use in a clinical microbiology laboratory, due to its good performance with regard to sensitivity, ease of use and reading, rapidity and the cost per test.

  4. Evaluation of Techniques to Detect Significant Network Performance Problems using End-to-End Active Network Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Cottrell, R.Les; Logg, Connie; Chhaparia, Mahesh; /SLAC; Grigoriev, Maxim; /Fermilab; Haro, Felipe; /Chile U., Catolica; Nazir, Fawad; /NUST, Rawalpindi; Sandford, Mark

    2006-01-25

    End-to-end fault and performance problem detection in wide area production networks is becoming increasingly hard as the complexity of the paths, the diversity of the performance, and dependency on the network increase. Several monitoring infrastructures are built to monitor different network metrics and collect monitoring information from thousands of hosts around the globe. Typically there are hundreds to thousands of time-series plots of network metrics which need to be looked at to identify network performance problems or anomalous variations in the traffic. Furthermore, most commercial products rely on a comparison with user-configured static thresholds and often require access to SNMP-MIB information, to which a typical end-user does not usually have access. In our paper we propose new techniques to detect network performance problems proactively in close to real time, without relying on static thresholds or SNMP-MIB information. We describe and compare the use of several different algorithms that we have implemented to detect persistent network problems using anomalous variation analysis in real end-to-end Internet performance measurements. We also provide methods and/or guidance for how to set the user-settable parameters. The measurements are based on active probes running on 40 production network paths with bottlenecks varying from 0.5 Mbit/s to 1000 Mbit/s. For well-behaved data (no missed measurements and no very large outliers) with small seasonal changes most algorithms identify similar events. We compare the algorithms' robustness with respect to false positives and missed events, especially when there are large seasonal effects in the data. Our proposed techniques cover a wide variety of network paths and traffic patterns. We also discuss the applicability of the algorithms in terms of their intuitiveness, their speed of execution as implemented, and areas of applicability. Our encouraging results compare and evaluate the accuracy of our
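    The idea of flagging anomalous variations without static thresholds can be illustrated with a toy rolling z-score detector: the "threshold" adapts to the recent mean and spread of the series. This is an illustrative stand-in, not one of the paper's algorithms, and it ignores the seasonal effects the authors discuss.

```python
from collections import deque

def detect_anomalies(series, window=20, k=3.0):
    """Flag points deviating from a rolling baseline by more than k adaptive
    standard deviations. A simplified sketch: no static threshold, but also
    no seasonality handling. Pure Python, stdlib only."""
    history = deque(maxlen=window)
    flagged = []
    for i, x in enumerate(series):
        if len(history) == window:
            mean = sum(history) / window
            var = sum((h - mean) ** 2 for h in history) / window
            std = var ** 0.5
            if std > 0 and abs(x - mean) > k * std:
                flagged.append(i)
        history.append(x)
    return flagged

# e.g. a throughput series hovering around 100 Mbit/s that suddenly drops to 50
series = [100.0, 101.0, 99.0] * 10 + [50.0]
drops = detect_anomalies(series)   # flags the drop at index 30
```

    Real detectors of this kind must additionally decide whether a flagged point is a transient spike or the start of a persistent change, which is where the paper's comparison of algorithms comes in.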

  5. 75 FR 29587 - Notice of Availability of Revised Model Proposed No Significant Hazards Consideration...

    Science.gov (United States)

    2010-05-26

    ... of Nuclear Reactor Regulation, U.S. Nuclear Regulatory Commission, Washington, DC, 20555-0001... Processes Branch, Division of Policy and Rulemaking, Office of Nuclear Reactor Regulation. Revised Model... with the confidence in the ability of the fission product barriers (i.e., fuel cladding,...

  6. An Ecological-Transactional Model of Significant Risk Factors for Child Psychopathology in Outer Mongolia

    Science.gov (United States)

    Kohrt, Holbrook E.; Kohrt, Brandon A.; Waldman, Irwin; Saltzman, Kasey; Carrion, Victor G.

    2004-01-01

    The present study examined significant risk factors, including child maltreatment, for child psychopathology in a cross-cultural setting. Ninety-nine Mongolian boys, ages 3-10 years, were assessed. Primary caregivers (PCG) completed structured interviews including the Emory Combined Rating Scale (ECRS) and the Mood and Feelings Questionnaire…

  7. Magnitude, modeling and significance of swelling and shrinkage processes in clay soils.

    NARCIS (Netherlands)

    Bronswijk, J.J.B.

    1991-01-01

    The dynamic process of swelling and shrinkage in clay soils has significant practical consequences, such as the rapid transport of water and solutes via shrinkage cracks to the subsoil, and the destruction of buildings and roads on clay soils. In order to develop measuring methods and computer simul

  8. Performance of Air Pollution Models on Massively Parallel Computers

    DEFF Research Database (Denmark)

    Brown, John; Hansen, Per Christian; Wasniewski, Jerzy

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on the computers. Using a realistic large-scale model, we gain detailed insight about the performance of the three computers when used to solve large-scale scientific problems...

  9. Significance of different animal species in experimental models for in vivo investigations of hematopoiesis

    Directory of Open Access Journals (Sweden)

    Kovačević-Filipović Milica

    2004-01-01

    Numerous discoveries in medicine are results of experiments on different animal species. The most frequently used animals in hematopoiesis investigations are laboratory mice and rats, but so-called big animals, such as pigs, sheep, cats, dogs, and monkeys, which are evolutionarily closer to humans, have a place in experimental hematology as well. The specific problems of a certain animal species can lead to fundamental knowledge on certain aspects of the process of hematopoiesis and the biology of stem cells in hematopoiesis. Furthermore, comparative investigations of certain phenomena in different species help in the recognition of the general rules in the living world. In the area of preclinical investigations, animal models are an inevitable step in studies of the transplantation biology of stem cells in hematopoiesis, as well as in studies of biologically active molecules which have an effect on the hematopoietic system. Knowledge acquired on animal models is applied in both human and veterinary medicine.

  10. Recent advances in mechanical characterisation of biofilm and their significance for material modelling.

    Science.gov (United States)

    Böl, Markus; Ehret, Alexander E; Bolea Albero, Antonio; Hellriegel, Jan; Krull, Rainer

    2013-06-01

    In recent years, the advances in microbiology show that biofilms are structurally complex, dynamic and adaptable systems including attributes of multicellular organisms and miscellaneous ecosystems. One may distinguish between beneficial and harmful biofilms appearing in daily life as well as various industrial processes. In order to advance the growth of the former or prevent the latter type of biofilm, a detailed understanding of its properties is indispensable. Besides microbiological aspects, this concerns the determination of mechanical characteristics, which provides the basis for material modelling. In the present paper the existing experimental methods that have been proposed since the 1980s are reviewed and critically discussed with respect to their usefulness and applicability to develop numerical modelling approaches.

  11. Modelling of green roofs' hydrologic performance using EPA's SWMM.

    Science.gov (United States)

    Burszta-Adamiak, E; Mrowiec, M

    2013-01-01

    Green roofs significantly affect the increase in water retention and thus the management of rain water in urban areas. In Poland, as in many other European countries, excess rainwater resulting from snowmelt and heavy rainfall contributes to the development of local flooding in urban areas. Opportunities to reduce surface runoff and reduce flood risks are among the reasons why green roofs are more likely to be used also in this country. However, there are relatively few data on their in situ performance. In this study the storm water performance was simulated for the green roof experimental plots using the Storm Water Management Model (SWMM) with the Low Impact Development (LID) Controls module (version 5.0.022). The model includes many parameters for each green roof layer, but the simulation results were unsatisfactory with respect to the hydrologic response of the green roofs. For the majority of the tested rain events, the Nash coefficient had negative values, indicating a weak fit between observed and simulated flow rates. The complexity of the LID module therefore does not translate into greater accuracy. Further research at a technical scale is needed to determine the role of the green roof slope, vegetation cover and drying process during the inter-event periods.
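    The Nash coefficient (Nash-Sutcliffe efficiency) used here to judge goodness of fit is easy to compute: 1 means a perfect fit, 0 means the model is no better than always predicting the observed mean, and negative values, as reported in the study, mean it is worse than the mean.

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / sum of squared deviations from
    the observed mean. Inputs are equal-length sequences of flow rates."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_mean = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - sse / ss_mean
```

    For example, a simulation that is exactly right scores 1.0, while one that constantly predicts a high flow against a varying observed series scores below zero, the situation reported for most of the tested rain events.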

  12. Development of the PCAD Model to Assess Biological Significance of Acoustic Disturbance

    Science.gov (United States)

    2015-09-30

    mother-calf separation as a function of body mass index (BMI) and proportion of lipid in blubber. We have also quantified the relationship between those...the approach. This is best accomplished by selecting species that are as similar as possible to target species and are also extremely well-studied...We identified northern elephant seals and Atlantic bottlenose dolphins as the best species to parameterize the PCAD model. These species represent

  13. System Level Modelling and Performance Estimation of Embedded Systems

    DEFF Research Database (Denmark)

    Tranberg-Hansen, Anders Sejer

    is simulation based and allows performance estimation to be carried out throughout all design phases ranging from early functional to cycle accurate and bit true descriptions of the system, modelling both hardware and software components in a unified way. Design space exploration and performance estimation...... an efficient system level design methodology, a modelling framework for performance estimation and design space exploration at the system level is required. This thesis presents a novel component based modelling framework for system level modelling and performance estimation of embedded systems. The framework...... is performed by having the framework produce detailed quantitative information about the system model under investigation. The project is part of the national Danish research project, Danish Network of Embedded Systems (DaNES), which is funded by the Danish National Advanced Technology Foundation. The project...

  14. Performance Predictable ServiceBSP Model for Grid Computing

    Institute of Scientific and Technical Information of China (English)

    TONG Weiqin; MIAO Weikai

    2007-01-01

    This paper proposes a performance prediction model for the grid computing model ServiceBSP to support the development of high-quality applications in grid environments. In the ServiceBSP model, the agents carrying computing tasks are dispatched to the local domain of the selected computation services. Using an IP (integer programming) approach, the Service Selection Agent selects the computation services with globally optimized QoS (quality of service). The performance of a ServiceBSP application can be predicted with the performance prediction model based on the QoS of the selected services. The performance prediction model can help users analyze their applications and improve them by optimizing the factors that affect performance. The experiment shows that the Service Selection Agent can provide ServiceBSP users with satisfactory application QoS.
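    The flavour of a BSP-style performance predictor can be sketched with the classic BSP cost formula, where each superstep costs the maximum local work plus h·g communication plus the barrier latency l. Treating this as ServiceBSP's predictor is an assumption for illustration; the paper's model additionally folds in service QoS.

```python
def bsp_superstep_cost(work_per_agent, h_relation, g, l):
    """Cost of one BSP superstep: max local work + h*g + l, where h is the
    max number of messages sent or received by any agent, g the per-message
    gap (inverse bandwidth) and l the barrier synchronization latency."""
    return max(work_per_agent) + h_relation * g + l

def bsp_program_cost(supersteps, g, l):
    """Total predicted cost: sum over supersteps of (work, h) pairs."""
    return sum(bsp_superstep_cost(w, h, g, l) for w, h in supersteps)

# e.g. two supersteps on three agents, with g=2 and l=5 (arbitrary units)
total = bsp_program_cost([([10, 12, 8], 4), ([6, 6, 6], 0)], g=2, l=5)
```

    The predictive value comes from the same structure the abstract describes: once the per-service work and communication parameters (here g, l, h) are estimated from QoS data, the total cost of an application follows by summation.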

  15. Significance of genetic information in risk assessment and individual classification using silicosis as a case model

    Energy Technology Data Exchange (ETDEWEB)

    McCanlies, E.; Landsittel, D.P.; Yucesoy, B.; Vallyathan, V.; Luster, M.L.; Sharp, D.S. [NIOSH, Morgantown, WV (United States)

    2002-06-01

    Over the last decade the role of genetic data in epidemiological research has expanded considerably. The authors recently published a case-control study that evaluated the interaction between silica exposure and minor variants in the genes coding for interleukin-1 alpha (IL-1alpha), interleukin-1 receptor antagonist (IL-1RA) and tumor necrosis factor alpha (TNFalpha) as risk factors associated with silicosis, a fibrotic lung disease. In contrast, this report uses data generated from these studies to illustrate the utility of genetic information for the purposes of risk assessment and clinical prediction. Specifically, this study addresses how, given a known exposure, genetic information affects the characterization of risk groups. Relative operating characteristic (ROC) curves were then used to determine the impact of genetic information on individual classification. Logistic regression modeling procedures were used to estimate the predicted probability of developing silicosis. This probability was then used to construct predicted risk deciles, first for a model with occupational exposure only and then for a model containing occupational exposure and genetic main effects and interactions. The results indicate that genetic information plays a valuable role in effectively characterizing risk groups and mechanisms of disease operating in a substantial proportion of the population. However, in the case of fibrotic lung disease caused by silica exposure, information about the presence or absence of the minor variants of IL-1alpha, IL-1RA and TNFalpha is unlikely to be a useful tool for individual classification.
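Computationally, the ROC analysis the study relies on reduces to asking how well each model's predicted probabilities rank cases above controls. A minimal sketch, with made-up probabilities standing in for the exposure-only and exposure-plus-genotype models:

```python
def auc(scores, labels):
    # Area under the ROC curve via the Mann-Whitney statistic: the
    # probability that a randomly chosen case (label 1) is scored
    # higher than a randomly chosen control (label 0).
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 0, 0, 1, 0]                      # silicosis yes/no (toy data)
exposure_only = [0.6, 0.5, 0.4, 0.5, 0.7, 0.3]   # model without genotype
with_genes = [0.7, 0.6, 0.3, 0.4, 0.8, 0.2]      # model with gene effects

auc_base = auc(exposure_only, labels)
auc_gene = auc(with_genes, labels)
```

An AUC gain this large would indicate useful individual classification; the paper's point is that for these particular variants the real-world gain was negligible.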

  16. Crumbled or mashed feed had no significant effect on the performance of lactating sows or their offspring

    OpenAIRE

    Kim, S. C.; H.L. Li; Park, J H; Kim, I. H.

    2015-01-01

    Background Physical and chemical properties of feedstuffs can be changed by feed processing. Moreover, through various mechanisms, feed processing can affect the growth performance and feed efficiency of swine as well as the nutritional value of the feed. Weaning-to-service intervals (WSI), subsequent farrowing rates, and total-born litter sizes were determined by feed intake and metabolic state during lactation. Methods A total of 20 sows (Landrace × Yorkshire) with an average body weight (BW) of 266.1 kg 4 d b...

  17. Predicting optimum vortex tube performance using a simplified CFD model

    Energy Technology Data Exchange (ETDEWEB)

    Karimi-Esfahani, M; Fartaj, A.; Rankin, G.W. [Univ. of Windsor, Dept. of Mechanical, Automotive and Materials Engineering, Windsor, Ontario (Canada)]. E-mail: mki_60@hotmail.com

    2004-07-01

    The Ranque-Hilsch tube is a particular type of vortex tube device. The flow enters the device tangentially near one end and exits from the open ends of the tube. The inlet air is of a uniform temperature throughout while the outputs are of different temperatures. One outlet is hotter and the other is colder than the inlet air. This device has no moving parts and does not require any additional power for its operation other than that supplied to the device to compress the inlet air. It has, however, not been widely used, mainly because of its low efficiency. In this paper, a simplified 2-dimensional computational fluid dynamics model for the flow in the vortex tube is developed using FLUENT. This model makes use of the assumption of axial symmetry throughout the entire flow domain. Compared to a three-dimensional computational solution, the simplified model requires significantly less computational time. This is important because the model is to be used for an optimization study. A user-defined function is generated to implement a modified version of the k-epsilon model to account for turbulence. This model is validated by comparing a particular solution with available experimental data. The variation of cold temperature drop and efficiency of the device with orifice diameter, inlet pressure and cold mass flow ratio qualitatively agree with experimental results. Variation of these performance indices with tube length did not agree with the experiments for small values of tube length. However, it did agree qualitatively for large values. (author)

  18. Performance measurement and modeling of component applications in a high performance computing environment : a case study.

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Robert C.; Ray, Jaideep; Malony, A. (University of Oregon, Eugene, OR); Shende, Sameer (University of Oregon, Eugene, OR); Trebon, Nicholas D.

    2003-11-01

    We present a case study of performance measurement and modeling of a CCA (Common Component Architecture) component-based application in a high performance computing environment. We explore issues peculiar to component-based HPC applications and propose a performance measurement infrastructure for HPC based loosely on recent work done for Grid environments. A prototypical implementation of the infrastructure is used to collect data for three components in a scientific application and construct performance models for two of them. Both computational and message-passing performance are addressed.

  19. Performance Modeling of Communication Networks with Markov Chains

    CERN Document Server

    Mo, Jeonghoon

    2010-01-01

    This book is an introduction to Markov chain modeling with applications to communication networks. It begins with a general introduction to performance modeling in Chapter 1 where we introduce different performance models. We then introduce basic ideas of Markov chain modeling: Markov property, discrete time Markov chain (DTMC) and continuous time Markov chain (CTMC). We also discuss how to find the steady state distributions from these Markov chains and how they can be used to compute the system performance metric. The solution methodologies include a balance equation technique, limiting probab
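The steady-state computation such books build on can be illustrated with a two-state DTMC; power iteration (a numerical alternative to solving the balance equations directly) suffices for small ergodic chains. The transition probabilities below are illustrative:

```python
def steady_state(P, iters=200):
    # Power iteration on a row-stochastic transition matrix P:
    # repeatedly apply pi <- pi P until it converges to the
    # stationary distribution (chain assumed ergodic).
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Two-state chain, e.g. an on/off traffic source.
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = steady_state(P)
```

For this chain the balance equation pi0 * 0.1 = pi1 * 0.5 gives pi = (5/6, 1/6), which the iteration recovers.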

  20. Hierarchical Bulk Synchronous Parallel Model and Performance Optimization

    Institute of Scientific and Technical Information of China (English)

    HUANG Linpeng; SUN Yongqiang; YUAN Wei

    1999-01-01

    Based on the framework of BSP, a Hierarchical Bulk Synchronous Parallel (HBSP) performance model is introduced in this paper to capture the performance optimization problem for various stages in parallel program development and to accurately predict the performance of a parallel program by considering factors causing variance at local computation and global communication. The related methodology has been applied to several real applications and the results show that HBSP is a suitable model for optimizing parallel programs.
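The flavor of BSP-style cost prediction that HBSP refines can be sketched with the classical per-superstep cost, max_i(w_i) + g*h + l: the slowest processor's local work, communication over an h-relation, and the barrier latency. The processor workloads and machine parameters below are illustrative:

```python
def superstep_cost(work, h_relation, g, l):
    # Classical BSP cost of one superstep: the slowest processor's
    # local computation, plus communication (g time units per message
    # unit times the h-relation), plus barrier synchronization cost l.
    return max(work) + g * h_relation + l

def program_cost(supersteps, g, l):
    # Total predicted cost: sum over all supersteps.
    return sum(superstep_cost(w, h, g, l) for w, h in supersteps)

# Two supersteps on 4 processors; g and l are machine parameters.
cost = program_cost([([100, 120, 90, 110], 30),
                     ([80, 80, 85, 80], 10)], g=2.0, l=50.0)
```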

  1. Postoperative Ambulatory Performance Status Significantly Affects Implant Failure Rate Among Surgical Treatment Strategies in Patients With Proximal Femur Metastasis.

    Science.gov (United States)

    Tsai, Shang-Wen; Wu, Po-Kuei; Chen, Cheng-Fong; Chang, Ming-Chau; Chen, Wei-Ming

    2016-11-08

    Surgical treatment strategies for proximal femur metastasis have been reported with mixed results. Little is known about risk factors for implant failure other than longer patient survival. Therefore, we determined whether implant survivorship differed among treatment strategies, as well as risk factors for implant failure. We retrospectively reviewed 106 consecutive patients with proximal femur metastasis treated with prosthesis replacement (n = 38), intramedullary nail (n = 32), or dynamic hip screw (DHS) (n = 36). The Eastern Cooperative Oncology Group (ECOG) scale and Karnofsky index were used to evaluate functional outcome. Patient characteristics and postoperative ambulatory performance status were assessed for their value in determining implant failure. The overall implant failure rate was 11.3% (12 of 106). Prosthesis replacement was associated with better implant survivorship (P = 0.041), with no mechanical failures. By contrast, 7 of the 10 implant failures in the fixation group were mechanical failures. Better postoperative ambulatory status (ECOG ≤ 2) was a risk factor for implant failure (P = 0.03). Notably, for patients with poor ambulatory status (ECOG ≥ 3), implant survivorship did not differ among implants. In conclusion, prosthesis replacement would be a more durable option in the treatment of proximal femur metastasis, and postoperative ambulatory status could be an additional consideration. For patients with poor expected ambulatory performance status, fixation with an intramedullary nail or DHS might be considered as a less technically demanding procedure.

  2. Cost and Performance Assumptions for Modeling Electricity Generation Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Tidball, Rick [ICF International, Fairfax, VA (United States); Bluestein, Joel [ICF International, Fairfax, VA (United States); Rodriguez, Nick [ICF International, Fairfax, VA (United States); Knoke, Stu [ICF International, Fairfax, VA (United States)

    2010-11-01

    The goal of this project was to compare and contrast utility scale power plant characteristics used in data sets that support energy market models. Characteristics include both technology cost and technology performance projections to the year 2050. Cost parameters include installed capital costs and operation and maintenance (O&M) costs. Performance parameters include plant size, heat rate, capacity factor or availability factor, and plant lifetime. Conventional, renewable, and emerging electricity generating technologies were considered. Six data sets, each associated with a different model, were selected. Two of the data sets represent modeled results, not direct model inputs. These two data sets include cost and performance improvements that result from increased deployment as well as resulting capacity factors estimated from particular model runs; other data sets represent model input data. For the technologies contained in each data set, the levelized cost of energy (LCOE) was also evaluated, according to published cost, performance, and fuel assumptions.
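The LCOE evaluation mentioned at the end combines discounted lifetime costs with discounted lifetime generation. A minimal sketch with illustrative plant numbers (not values from any of the six data sets):

```python
def lcoe(capital, om_per_year, fuel_per_mwh, mwh_per_year, lifetime, rate):
    # Levelized cost of energy ($/MWh): discounted lifetime costs
    # divided by discounted lifetime generation.
    disc = [(1 + rate) ** -t for t in range(1, lifetime + 1)]
    annual_cost = om_per_year + fuel_per_mwh * mwh_per_year
    costs = capital + sum(annual_cost * d for d in disc)
    energy = sum(mwh_per_year * d for d in disc)
    return costs / energy

# Illustrative plant: $1M installed, $20k/yr fixed O&M, $10/MWh fuel,
# 5000 MWh/yr, 7% discount rate, 20-year lifetime.
cost_per_mwh = lcoe(1_000_000, 20_000, 10.0, 5000.0, 20, 0.07)
```

Sensitivity of the result to capacity factor (which sets mwh_per_year) and capital cost is exactly what makes the choice of data set matter for energy market models.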

  4. Compound fuzzy model for thermal performance of refrigeration compressors

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The fuzzy method is introduced to the calculation of thermal performance of refrigeration compressors. A compound model combining classical thermodynamic theory and fuzzy theory is presented and compared with a simple fuzzy model without classical thermodynamic fundamentals. A case study of refrigeration compressors shows that the compound fuzzy model and the simple fuzzy model are both more efficient than the classical thermodynamic method. However, the compound fuzzy model offers better precision and adaptability.

  5. LCP- LIFETIME COST AND PERFORMANCE MODEL FOR DISTRIBUTED PHOTOVOLTAIC SYSTEMS

    Science.gov (United States)

    Borden, C. S.

    1994-01-01

    The Lifetime Cost and Performance (LCP) Model was developed to assist in the assessment of Photovoltaic (PV) system design options. LCP is a simulation of the performance, cost, and revenue streams associated with distributed PV power systems. LCP provides the user with substantial flexibility in specifying the technical and economic environment of the PV application. User-specified input parameters are available to describe PV system characteristics, site climatic conditions, utility purchase and sellback rate structures, discount and escalation rates, construction timing, and lifetime of the system. Such details as PV array orientation and tilt angle, PV module and balance-of-system performance attributes, and the mode of utility interconnection are user-specified. LCP assumes that the distributed PV system is utility grid interactive without dedicated electrical storage. In combination with a suitable economic model, LCP can provide an estimate of the expected net present worth of a PV system to the owner, as compared to electricity purchased from a utility grid. Similarly, LCP might be used to perform sensitivity analyses to identify those PV system parameters having significant impact on net worth. The user describes the PV system configuration to LCP via the basic electrical components. The module is the smallest entity in the PV system which is modeled. A PV module is defined in the simulation by its short circuit current, which varies over the system lifetime due to degradation and failure. Modules are wired in series to form a branch circuit. Bypass diodes are allowed between modules in the branch circuits. Branch circuits are then connected in parallel to form a bus. A collection of buses is connected in parallel to form an increment to capacity of the system. By choosing the appropriate series-parallel wiring design, the user can specify the current, voltage, and reliability characteristics of the system. LCP simulation of system performance is site

  6. A Spectral Evaluation of Models Performances in Mediterranean Oak Woodlands

    Science.gov (United States)

    Vargas, R.; Baldocchi, D. D.; Abramowitz, G.; Carrara, A.; Correia, A.; Kobayashi, H.; Papale, D.; Pearson, D.; Pereira, J.; Piao, S.; Rambal, S.; Sonnentag, O.

    2009-12-01

    Ecosystem processes are influenced by climatic trends at multiple temporal scales including diel patterns and other mid-term climatic modes, such as interannual and seasonal variability. Because interactions between biophysical components of ecosystem processes are complex, it is important to test how models perform in frequency (e.g. hours, days, weeks, months, years) and time (i.e. day of the year) domains in addition to traditional tests of annual or monthly sums. Here we present a spectral evaluation using wavelet time series analysis of model performance in seven Mediterranean Oak Woodlands that encompass three deciduous and four evergreen sites. We tested the performance of five models (CABLE, ORCHIDEE, BEPS, Biome-BGC, and JULES) on measured variables of gross primary production (GPP) and evapotranspiration (ET). In general, model performance fails at intermediate periods (e.g. weeks to months), likely because these models do not represent the water pulse dynamics that influence GPP and ET in these Mediterranean systems. To improve the performance of a model it is critical to identify first where and when the model fails. Only by identifying where a model fails can we improve its performance, use models as prognostic tools, and generate further hypotheses that can be tested by new experiments and measurements.

  7. Performance of Modeling wireless networks in realistic environment

    CERN Document Server

    Siraj, M

    2012-01-01

    A wireless network is realized by mobile devices which communicate over radio channels. Since experiments on real-life problems with real devices are very difficult, simulation is used very often. Among the many important properties that have to be defined for simulative experiments, the mobility model and the radio propagation model have to be selected carefully. Both have a strong impact on the performance of mobile wireless networks; e.g., the performance of routing protocols varies with these models. Many mobility and radio propagation models have been proposed in the literature. Each of them was developed with different objectives and is not suited for every physical scenario. In common wireless network simulators, researchers generally use simple radio propagation models and neglect obstacles in the propagation environment. In this paper, we study the performance of wireless network simulation by considering different radio propagation models with considering obstacles i...

  8. Performance modeling and prediction for linear algebra algorithms

    OpenAIRE

    Iakymchuk, Roman

    2012-01-01

    This dissertation incorporates two research projects: performance modeling and prediction for dense linear algebra algorithms, and high-performance computing on clouds. The first project is focused on dense matrix computations, which are often used as computational kernels for numerous scientific applications. To solve a particular mathematical operation, linear algebra libraries provide a variety of algorithms. The algorithm of choice depends, obviously, on its performance. Performance of su...

  9. Breast cancer-associated metastasis is significantly increased in a model of autoimmune arthritis

    Science.gov (United States)

    Das Roy, Lopamudra; Pathangey, Latha B; Tinder, Teresa L; Schettini, Jorge L; Gruber, Helen E; Mukherjee, Pinku

    2009-01-01

    Introduction Sites of chronic inflammation are often associated with the establishment and growth of various malignancies including breast cancer. A common inflammatory condition in humans is autoimmune arthritis (AA) that causes inflammation and deformity of the joints. Other systemic effects associated with arthritis include increased cellular infiltration and inflammation of the lungs. Several studies have reported statistically significant risk ratios between AA and breast cancer. Despite this knowledge, available for a decade, it has never been asked whether the chronic inflammation linked to AA creates a milieu that attracts tumor cells to home to and grow in the inflamed bones and lungs, which are frequent sites of breast cancer metastasis. Methods To determine if chronic inflammation induced by autoimmune arthritis contributes to increased breast cancer-associated metastasis, we generated mammary gland tumors in SKG mice that were genetically prone to develop AA. Two breast cancer cell lines, one highly metastatic (4T1) and the other non-metastatic (TUBO), were used to generate the tumors in the mammary fat pad. Lung and bone metastasis and the associated inflammatory milieu were evaluated in the arthritic versus the non-arthritic mice. Results We report a three-fold increase in lung metastasis and a significant increase in the incidence of bone metastasis in the pro-arthritic and arthritic mice compared to non-arthritic control mice. We also report that the metastatic breast cancer cells augment the severity of arthritis resulting in a vicious cycle that increases both bone destruction and metastasis. Enhanced neutrophilic and granulocytic infiltration in lungs and bone of the pro-arthritic and arthritic mice and subsequent increase in circulating levels of proinflammatory cytokines, such as macrophage colony stimulating factor (M-CSF), interleukin-17 (IL-17), interleukin-6 (IL-6), vascular endothelial growth factor (VEGF), and tumor necrosis factor

  10. Comparisons of Faulting-Based Pavement Performance Prediction Models

    Directory of Open Access Journals (Sweden)

    Weina Wang

    2017-01-01

    Faulting prediction is the core of concrete pavement maintenance and design. Highway agencies are always faced with low prediction accuracy, which leads to costly maintenance. Although many researchers have developed performance prediction models, prediction accuracy has remained a challenge. This paper reviews performance prediction models and JPCP faulting models that have been used in past research. Three models, the multivariate nonlinear regression (MNLR) model, the artificial neural network (ANN) model, and the Markov chain (MC) model, are then tested and compared using a set of actual pavement survey data taken on an interstate highway with varying design features, traffic, and climate data. It is found that the MNLR model needs further recalibration, while the ANN model needs more data for training the network. The MC model seems a good tool for pavement performance prediction when data are limited, but it is based on visual inspections and is not explicitly related to quantitative physical parameters. The paper then suggests that the further direction for developing performance prediction models is to combine the advantages and disadvantages of the different models to obtain better accuracy.
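Of the three models compared, the Markov chain approach is the simplest to sketch: pavement sections move through discrete condition states under an annual transition matrix estimated from visual inspections. The states and probabilities below are illustrative, not calibrated values:

```python
def condition_distribution(initial, P, years):
    # Markov-chain deterioration model: propagate the probability
    # distribution over condition states through the annual
    # transition matrix P for the given number of years.
    dist = list(initial)
    n = len(P)
    for _ in range(years):
        dist = [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]
    return dist

P = [[0.8, 0.2, 0.0],   # good -> good / fair
     [0.0, 0.7, 0.3],   # fair -> fair / poor
     [0.0, 0.0, 1.0]]   # poor is absorbing (until maintenance)
dist = condition_distribution([1.0, 0.0, 0.0], P, years=2)
```

This captures the paper's caveat directly: the model runs on inspection-derived state labels, with no explicit link to physical parameters such as load transfer or joint spacing.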

  11. Atomic scale simulations for improved CRUD and fuel performance modeling

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Anders David Ragnar [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Cooper, Michael William Donald [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-06

    A more mechanistic description of fuel performance codes can be achieved by deriving models and parameters from atomistic scale simulations rather than fitting models empirically to experimental data. The same argument applies to modeling deposition of corrosion products on fuel rods (CRUD). Here are some results from publications in 2016 carried out using the CASL allocation at LANL.

  12. Support for significant evolutions of the user data model in ROOT files

    Energy Technology Data Exchange (ETDEWEB)

    Canal, Ph; Russo, P [Fermilab, Batavia, IL (United States); Brun, R; Janyst, L [CERN, Geneva (Switzerland); Fine, V; Lauret, J, E-mail: pcanal@fnal.go [Brookhaven National Laboratory, Upton, NY (United States)

    2010-04-01

    One of the main strengths of ROOT input and output (I/O) is its inherent support for schema evolution. Two distinct modes are supported, one manual via a hand coded streamer function and one fully automatic via the ROOT StreamerInfo. One drawback of the streamer functions is that they are not usable by TTree objects in split mode. Until now, the user could not customize the automatic schema evolution mechanism and the only mechanism to go beyond the default rules was to revert to using the streamer function. In ROOT 5.22/00, we introduced a new mechanism which allows user provided extensions of the automatic schema evolution that can be used in object-wise, member-wise and split modes. This paper will describe the many possibilities ranging from the simple assignment of transient members to the complex reorganization of the user's object model.

  13. Trait impressions as overgeneralized responses to adaptively significant facial qualities: evidence from connectionist modeling.

    Science.gov (United States)

    Zebrowitz, Leslie A; Fellous, Jean-Marc; Mignault, Alain; Andreoletti, Carrie

    2003-01-01

    Connectionist modeling experiments tested anomalous-face and baby-face overgeneralization hypotheses proposed to explain consensual trait impressions of faces. Activation of a neural network unit trained to respond to anomalous faces predicted impressions of normal adult faces varying in attractiveness as well as several elderly stereotypes. Activation of a neural network unit trained to respond to babies' faces predicted impressions of adults varying in babyfaceness as well as 1 elderly stereotype. Thus, similarities of normal adult faces to anomalous faces or babies' faces contribute to impressions of them quite apart from knowledge of overlapping social stereotypes. The evolutionary importance of appropriate responses to unfit individuals or babies is presumed to produce a strong response preparedness that is overgeneralized to faces resembling the unfit or babies.

  14. Mouse models of lipodystrophy and their significance in understanding fat regulation.

    Science.gov (United States)

    Rochford, Justin J

    2014-01-01

    Adipose tissue plays a critical role in human metabolic health. This is most dramatically illustrated by the severe metabolic disease that occurs in syndromes of lipodystrophy where individuals fail to develop or maintain appropriate adipose tissue mass. The most severe form of this disorder is congenital generalized lipodystrophy (CGL). Individuals with CGL have a striking paucity of adipose tissue and typically display severe metabolic disease with insulin resistance and dyslipidemia. Understanding of the metabolic consequences of lipodystrophies and their underlying molecular mechanisms will provide new information regarding the development and function of human adipose tissue. Mouse models of these conditions offer key resources to investigate this in vivo. Adipocyte dysfunction is believed to underlie the development of metabolic disease in obesity. Hence, understanding how one might beneficially manipulate adipose tissue by studying genes whose disruption causes lipodystrophy is likely to suggest novel means to improve metabolic health in common obesity.

  16. Preconditioning Provides Neuroprotection in Models of CNS Disease: Paradigms and Clinical Significance

    Science.gov (United States)

    Stetler, R. Anne; Leak, Rehana K.; Gan, Yu; Li, Peiying; Hu, Xiaoming; Jing, Zheng; Chen, Jun; Zigmond, Michael J.; Gao, Yanqin

    2014-01-01

    Preconditioning is a phenomenon in which brief episodes of a sublethal insult induce robust protection against subsequent lethal injuries. Preconditioning has been observed in multiple organisms and can occur in the brain as well as other tissues. Extensive animal studies suggest that the brain can be preconditioned to resist acute injuries, such as ischemic stroke, neonatal hypoxia/ischemia, trauma, and agents that are used in models of neurodegenerative diseases, such as Parkinson’s disease and Alzheimer’s disease. Effective preconditioning stimuli are numerous and diverse, ranging from transient ischemia, hypoxia, hyperbaric oxygen, hypothermia and hyperthermia, to exposure to neurotoxins and pharmacological agents. The phenomenon of “cross-tolerance,” in which a sublethal stress protects against a different type of injury, suggests that different preconditioning stimuli may confer protection against a wide range of injuries. Research conducted over the past few decades indicates that brain preconditioning is complex, involving multiple effectors such as metabolic inhibition, activation of extra- and intracellular defense mechanisms, a shift in the neuronal excitatory/inhibitory balance, and reduction in inflammatory sequelae. An improved understanding of brain preconditioning should help us identify innovative therapeutic strategies that prevent or at least reduce neuronal damage in susceptible patients. In this review, we focus on the experimental evidence of preconditioning in the brain and systematically survey the models used to develop paradigms for neuroprotection, and then discuss the clinical potential of brain preconditioning. In a subsequent component of this two-part series, we will discuss the cellular and molecular events that are likely to underlie these phenomena. PMID:24389580

  17. Thermophysical modeling of asteroids from WISE thermal infrared data - Significance of the shape model and the pole orientation uncertainties

    CERN Document Server

    Hanuš, Josef; Ďurech, Josef; Alí-Lagoa, Victor

    2015-01-01

    In the analysis of thermal infrared data of asteroids by means of thermophysical models (TPMs) it is a common practice to neglect the uncertainty of the shape model and the rotational state, which are taken as an input for the model. Here, we present a novel method of investigating the importance of the shape model and the pole orientation uncertainties in the thermophysical modeling - the varied shape TPM (VS-TPM). Our method uses optical photometric data to generate various shape models that map the uncertainty in the shape and the rotational state. The TPM procedure is then run for all these shape models. We apply the implementation of the classical TPM as well as our VS-TPM to the convex shape models of several asteroids together with their thermal infrared data acquired by the NASA's Wide-field Infrared Survey Explorer (WISE) and compare the results. These show that the uncertainties of the shape model and the pole orientation can be very important (e.g., for the determination of the thermal inertia) and...

  18. A modelling study of long term green roof retention performance.

    Science.gov (United States)

    Stovin, Virginia; Poë, Simon; Berretta, Christian

    2013-12-15

    This paper outlines the development of a conceptual hydrological flux model for the long term continuous simulation of runoff and drought risk for green roof systems. A green roof's retention capacity depends upon its physical configuration, but it is also strongly influenced by local climatic controls, including the rainfall characteristics and the restoration of retention capacity associated with evapotranspiration during dry weather periods. The model includes a function that links evapotranspiration rates to substrate moisture content, and is validated against observed runoff data. The model's application to typical extensive green roof configurations is demonstrated with reference to four UK locations characterised by contrasting climatic regimes, using 30-year rainfall time-series inputs at hourly simulation time steps. It is shown that retention performance is dependent upon local climatic conditions. Volumetric retention ranges from 0.19 (cool, wet climate) to 0.59 (warm, dry climate). Per event retention is also considered, and it is demonstrated that retention performance decreases significantly when high return period events are considered in isolation. For example, in Sheffield the median per-event retention is 1.00 (many small events), but the median retention for events exceeding a 1 in 1 yr return period threshold is only 0.10. The simulation tool also provides useful information about the likelihood of drought periods, for which irrigation may be required. A sensitivity study suggests that green roofs with reduced moisture-holding capacity and/or low evapotranspiration rates will tend to offer reduced levels of retention, whilst high moisture-holding capacity and low evapotranspiration rates offer the strongest drought resistance.
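The conceptual flux model's two key ingredients, a finite substrate retention capacity and evapotranspiration that declines with substrate moisture, can be sketched as an hourly bucket model. Depths are in mm; all parameter values are illustrative, not the paper's calibrated ones:

```python
def simulate(rain, capacity, et_max):
    # Hourly bucket model: rainfall beyond the remaining storage
    # capacity becomes runoff; evapotranspiration scales linearly
    # with substrate moisture content (one of the paper's key
    # features, restoring retention capacity between storms).
    storage, runoff = 0.0, 0.0
    for r in rain:
        storage += r
        if storage > capacity:
            runoff += storage - capacity
            storage = capacity
        storage -= et_max * (storage / capacity)  # moisture-limited ET
    return storage, runoff

# 5 h series with one 20 mm storm; 15 mm capacity, 0.5 mm/h max ET.
storage, runoff = simulate([5.0, 0.0, 0.0, 20.0, 0.0], 15.0, 0.5)
retention = 1.0 - runoff / 25.0   # fraction of total rainfall retained
```

Driving such a model with a 30-year hourly rainfall series, as the paper does, is what reveals the gap between median per-event retention and retention for high-return-period storms.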

  19. Performance model for grid-connected photovoltaic inverters.

    Energy Technology Data Exchange (ETDEWEB)

    Boyson, William Earl; Galbraith, Gary M.; King, David L.; Gonzalez, Sigifredo

    2007-09-01

    This document provides an empirically based performance model for grid-connected photovoltaic inverters, used for system performance (energy) modeling and for continuous monitoring of inverter performance during system operation. The versatility and accuracy of the model were validated for a variety of both residential- and commercial-size inverters. Default parameters for the model can be obtained from manufacturers' specification sheets, and the accuracy of the model can be further refined using either well-instrumented field measurements in operational systems or detailed measurements from a recognized testing laboratory. An initial database of inverter performance parameters was developed based on measurements conducted at Sandia National Laboratories and at laboratories supporting the solar programs of the California Energy Commission.
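The core of this empirical approach is commonly written as a quadratic mapping from DC power and voltage to AC power, with voltage-dependent coefficients (the widely cited Sandia inverter model form). The coefficient values below are illustrative placeholders, not entries from the report's parameter database.

```python
def sandia_inverter_ac_power(p_dc, v_dc, *, paco, pdco, vdco, pso,
                             c0, c1, c2, c3):
    """AC output power from the empirical grid-connected inverter model.

    paco/pdco: rated AC/DC power; vdco: nominal DC voltage;
    pso: DC power for self-consumption; c0..c3: empirical coefficients.
    """
    a = pdco * (1 + c1 * (v_dc - vdco))
    b = pso * (1 + c2 * (v_dc - vdco))
    c = c0 * (1 + c3 * (v_dc - vdco))
    return (paco / (a - b) - c * (a - b)) * (p_dc - b) + c * (p_dc - b) ** 2

# Hypothetical 2.5 kW inverter operating at its nominal DC voltage:
# by construction, p_dc = pdco yields exactly the rated AC power paco.
p_ac = sandia_inverter_ac_power(2600.0, 400.0, paco=2500.0, pdco=2600.0,
                                vdco=400.0, pso=20.0, c0=-1e-5, c1=0.0,
                                c2=0.0, c3=0.0)
```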

  20. Performance Modeling for Heterogeneous Wireless Networks with Multiservice Overflow Traffic

    DEFF Research Database (Denmark)

    Huang, Qian; Ko, King-Tim; Iversen, Villy Bæk

    2009-01-01

    Performance modeling is important for the purpose of developing efficient dimensioning tools for large, complicated networks. But it is difficult to achieve in heterogeneous wireless networks, where different networks have different statistical characteristics in service and traffic models. Multiservice loss analysis based on multi-dimensional Markov chains becomes intractable in these networks due to the intensive computations required. This paper focuses on performance modeling for heterogeneous wireless networks based on a hierarchical overlay infrastructure. A method based on decomposition of the correlated traffic is used to achieve an approximate performance model for multiservice in hierarchical heterogeneous wireless networks with overflow traffic. The accuracy of the approximate performance obtained by our proposed modeling is verified by simulations.
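The paper's decomposition method is not reproduced here, but its building block is the classical loss system: calls blocked at a lower tier become offered overflow traffic at the next tier up. A minimal background sketch, using the standard Erlang-B recursion with illustrative numbers:

```python
def erlang_b(servers, offered_erlangs):
    """Blocking probability B(n, A) via the stable recursion
    B(0) = 1;  B(k) = A*B(k-1) / (k + A*B(k-1))."""
    b = 1.0
    for k in range(1, servers + 1):
        b = offered_erlangs * b / (k + offered_erlangs * b)
    return b

# A hypothetical microcell with 10 channels offered 6 Erlangs; calls it
# blocks overflow to an umbrella macrocell with mean A * B(n, A).
blocking = erlang_b(10, 6.0)
overflow_mean = 6.0 * blocking
```

Note that overflow traffic is burstier than Poisson traffic (its variance exceeds its mean), which is exactly why the correlated-traffic decomposition in the paper is needed rather than naive reuse of Erlang B at the upper tier.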

  1. Concave Pit-Containing Scaffold Surfaces Improve Stem Cell-Derived Osteoblast Performance and Lead to Significant Bone Tissue Formation

    Science.gov (United States)

    Cusella-De Angelis, Maria Gabriella; Laino, Gregorio; Piattelli, Adriano; Pacifici, Maurizio; De Rosa, Alfredo; Papaccio, Gianpaolo

    2007-01-01

    Background Scaffold surface features are thought to be important regulators of stem cell performance and endurance in tissue engineering applications, but details about these fundamental aspects of stem cell biology remain largely unclear. Methodology and Findings In the present study, smooth clinical-grade poly(lactide-co-glycolic acid) 85:15 (PLGA) scaffolds were carved as membranes and treated with NMP (N-methyl-pyrrolidone) to create controlled subtractive pits or microcavities. Scanning electron and confocal microscopy revealed that the NMP-treated membranes contained: (i) large microcavities of 80–120 µm in diameter and 40–100 µm in depth, which we termed primary; and (ii) smaller microcavities of 10–20 µm in diameter and 3–10 µm in depth located within the primary cavities, which we termed secondary. We asked whether a microcavity-rich scaffold had distinct bone-forming capabilities compared to a smooth one. To do so, mesenchymal stem cells derived from human dental pulp were seeded onto the two types of scaffold and monitored over time for cytoarchitectural characteristics, differentiation status and production of important factors, including bone morphogenetic protein-2 (BMP-2) and vascular endothelial growth factor (VEGF). We found that the microcavity-rich scaffold enhanced cell adhesion: the cells created intimate contact with secondary microcavities and were polarized. These cytological responses were not seen with the smooth-surface scaffold. Moreover, cells on the microcavity-rich scaffold released larger amounts of BMP-2 and VEGF into the culture medium and expressed higher alkaline phosphatase activity. When this type of scaffold was transplanted into rats, superior bone formation was elicited compared to cells seeded on the smooth scaffold. Conclusion In conclusion, surface microcavities appear to support a more vigorous osteogenic response of stem cells and should be used in the design of therapeutic substrates to improve bone repair and

  2. Towards the Significance of Decision Aid in Building Information Modeling (BIM) Software Selection Process

    Directory of Open Access Journals (Sweden)

    Omar Mohd Faizal

    2014-01-01

    Full Text Available Building Information Modeling (BIM) has been considered a solution to numerous problems in the construction industry, such as delays, increased lead times and increased costs. This is due to the concept and characteristics of BIM, which reshape the way construction project teams work together to increase productivity and improve final project outcomes (cost, time, quality, safety, functionality, maintainability, etc.). As a result, the construction industry has witnessed numerous BIM software packages become available on the market, each offering different functions and features. Furthermore, the adoption of BIM requires high investment in software, hardware and training. Thus, there is a need for decision aid in selecting the BIM software appropriate to project needs. However, research indicates that few studies attempt to guide decisions in the BIM software selection problem. This paper therefore highlights the importance of decision making and support for BIM software selection, as it is vital to increasing productivity throughout the building lifecycle.

  3. Towards an Accurate Performance Modeling of Parallel Sparse Factorization

    Energy Technology Data Exchange (ETDEWEB)

    Grigori, Laura; Li, Xiaoye S.

    2006-05-26

    We present a performance model to analyze a parallel sparse LU factorization algorithm on modern cache-based, high-end parallel architectures. Our model characterizes the algorithmic behavior by taking into account the underlying processor speed, memory system performance, as well as the interconnect speed. The model is validated using the SuperLU_DIST linear system solver, sparse matrices from real applications, and an IBM POWER3 parallel machine. Our modeling methodology can be easily adapted to study the performance of other types of sparse factorizations, such as Cholesky or QR.

  4. Myriocin significantly increases the mortality of a non-mammalian model host during Candida pathogenesis.

    Directory of Open Access Journals (Sweden)

    Nadja Rodrigues de Melo

    Full Text Available Candida albicans is a major human pathogen whose treatment is challenging due to antifungal drug toxicity, drug resistance and the paucity of available antifungal agents. Myriocin (MYR) inhibits the synthesis of sphingosine, a precursor of sphingolipids, which are important cell membrane and signaling components. MYR also has dual immunosuppressive and antifungal properties, potentially modulating mammalian immunity while simultaneously reducing the risk of fungal infection. Wax moth (Galleria mellonella) larvae, an alternative to mice, were used to establish whether MYR suppressed insect immunity and increased survival of C. albicans-infected insects. MYR effects were studied in vivo and in vitro, alone and combined with the approved antifungal drugs fluconazole (FLC) and amphotericin B (AMPH). Insect immune defenses failed to inhibit C. albicans, with high mortalities. In insects pretreated with the drug followed by C. albicans inoculation, MYR+C. albicans significantly increased mortality to 93% from 67% with C. albicans alone at 48 h post-infection, whilst AMPH+C. albicans and FLC+C. albicans showed only 26% and 0% mortalities, respectively. MYR combinations with other antifungal drugs in vivo also enhanced larval mortalities, contrasting with the synergistic antifungal effect of the MYR+AMPH combination in vitro. MYR treatment influenced immunity and stress-management gene expression during C. albicans pathogenesis, modulating transcripts putatively associated with signal transduction/regulation of cytokines, the I-kappaB kinase/NF-kappaB cascade, G-protein-coupled receptors and inflammation. In contrast, all stress-management gene expression was down-regulated in FLC- and AMPH-pretreated C. albicans-infected insects. The results are discussed with their implications for clinical use of MYR to treat sphingolipid-associated disorders.

  5. Performance Comparison of the European Storm Surge Models and Chaotic Model in Forecasting Extreme Storm Surges

    Science.gov (United States)

    Siek, M. B.; Solomatine, D. P.

    2009-04-01

    Storm surge modeling has developed considerably over the past 30 years. A number of significant advances in operational storm surge models have been implemented and tested, consisting of: refining computational grids, calibrating the model, using better numerical schemes (i.e. more realistic model physics for air-sea interaction), and implementing data assimilation and ensemble model forecasts. This paper addresses the performance comparison between existing European storm surge models and recently developed methods from nonlinear dynamics and chaos theory in forecasting storm surge dynamics. The chaotic model is built using adaptive local models based on the dynamical neighbours in the reconstructed phase space of observed time series data. The comparison focused on model accuracy in forecasting a recent extreme storm surge in the North Sea on November 9th, 2007 that hit the coastlines of several European countries. The combination of a high tide, north-westerly winds exceeding 50 mph and low pressure produced an exceptional storm tide, with tide levels exceeding normal sea levels by 3 meters. Flood warnings were issued for the east coast of Britain and the entire Dutch coast. The Maeslant barrier's two arc-shaped steel doors in Europe's biggest port of Rotterdam were closed for the first time since its construction in 1997 due to this storm surge. For comparison with the chaotic model's performance, forecast data from several European physically-based storm surge models were provided by: BSH Germany, DMI Denmark, DNMI Norway, KNMI Netherlands and MUMM Belgium. The performance comparison was made over testing datasets for two periods/conditions: a non-stormy period (1-Sep-2007 till 14-Oct-2007) and a stormy period (15-Oct-2007 till 20-Nov-2007). A scalar chaotic model with optimized parameters was developed by utilizing an hourly training dataset of observations (11-Sep-2005 till 31-Aug-2007). The comparison results indicated the chaotic
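The "adaptive local models in reconstructed phase space" idea can be sketched as a nearest-neighbour predictor on a time-delay embedding. The embedding settings, neighbour count, and synthetic signal below are illustrative assumptions, not the calibrated choices of the study.

```python
import math

def embed(series, dim=3, delay=1):
    """Time-delay embedding: map the series to state vectors
    (x(t), x(t-delay), ..., x(t-(dim-1)*delay))."""
    start = (dim - 1) * delay
    return [tuple(series[t - j * delay] for j in range(dim))
            for t in range(start, len(series))]

def local_forecast(series, dim=3, delay=1, k=3):
    """Predict the next value as the mean of the values that followed
    the k nearest neighbours of the current state."""
    vectors = embed(series, dim, delay)
    current = vectors[-1]
    # candidates exclude the last vector (it has no known successor)
    dists = sorted((math.dist(v, current), i)
                   for i, v in enumerate(vectors[:-1]))
    start = (dim - 1) * delay
    followers = [series[start + i + 1] for _, i in dists[:k]]
    return sum(followers) / k

# Synthetic periodic "tide-like" signal: the local model should
# continue the cycle from states that visited the same phase before.
signal = [math.sin(0.3 * t) for t in range(200)]
pred = local_forecast(signal, dim=3, delay=2, k=3)
true_next = math.sin(0.3 * 200)
```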

  6. Emerging Carbon Nanotube Electronic Circuits, Modeling, and Performance

    OpenAIRE

    Yao Xu; Ashok Srivastava; Sharma, Ashwani K.

    2010-01-01

    Current transport and dynamic models of carbon nanotube field-effect transistors are presented. A model of single-walled carbon nanotube as interconnect is also presented and extended in modeling of single-walled carbon nanotube bundles. These models are applied in studying the performances of circuits such as the complementary carbon nanotube inverter pair and carbon nanotube as interconnect. Cadence/Spectre simulations show that carbon nanotube field-effect transistor circuits can operate a...

  7. Planetary Suit Hip Bearing Model for Predicting Design vs. Performance

    Science.gov (United States)

    Cowley, Matthew S.; Margerum, Sarah; Harvil, Lauren; Rajulu, Sudhakar

    2011-01-01

    Designing a planetary suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. In order to verify that new suit designs meet requirements, full prototypes must eventually be built and tested with human subjects. Using computer models early in the design phase of new hardware development can be advantageous, allowing virtual prototyping to take place. Having easily modifiable models of the suit hard sections may reduce the time it takes to make changes to the hardware designs and then to understand their impact on suit and human performance. A virtual design environment gives designers the ability to think outside the box and exhaust design possibilities before building and testing physical prototypes with human subjects. Reductions in prototyping and testing may eventually reduce development costs. This study is an attempt to develop computer models of the hard components of the suit with known physical characteristics, supplemented with human subject performance data. Objectives: The primary objective was to develop an articulating solid model of the Mark III hip bearings to be used for evaluating suit design performance of the hip joint. Methods: Solid models of a planetary prototype (Mark III) suit's hip bearings and brief section were reverse-engineered from the prototype. The performance of the models was then compared by evaluating the mobility performance differences between the nominal hardware configuration and hardware modifications. This was accomplished by gathering data from specific suited tasks. Subjects performed maximum flexion and abduction tasks while in a nominal suit bearing configuration and in three off-nominal configurations. Performance data for the hip were recorded using state-of-the-art motion capture technology. Results: The results demonstrate that using solid models of planetary suit hard segments as a performance design tool is feasible. From a general trend perspective

  8. Models used to assess the performance of photovoltaic systems.

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua S.; Klise, Geoffrey T.

    2009-12-01

    This report documents the various photovoltaic (PV) performance models and software developed and utilized by researchers at Sandia National Laboratories (SNL) in support of the Photovoltaics and Grid Integration Department. In addition to PV performance models, hybrid system and battery storage models are discussed. A hybrid system using other distributed sources and energy storage can help reduce the variability inherent in PV generation, and due to the complexity of combining multiple generation sources and system loads, these models are invaluable for system design and optimization. Energy storage plays an important role in reducing PV intermittency, and battery storage models are used to understand the best configurations and technologies to store PV-generated electricity. Other researchers' models used by SNL are discussed, including some widely known models that incorporate algorithms developed at SNL. Other models included in the discussion were not used by or adopted from SNL research but may provide some benefit to researchers working on PV array performance, hybrid system models and energy storage. The paper is organized into three sections describing the different software models as applied to photovoltaic performance, hybrid systems, and battery storage. For each model, there is a description that includes where to find the model, whether it is currently maintained and any references that may be available. Modeling improvements underway at SNL include quantifying the uncertainty of individual system components, the overall uncertainty in modeled vs. measured results and modeling large PV systems. SNL is also conducting research into the overall reliability of PV systems.

  9. Repeat fine-needle aspiration can be performed at 6 months or more after initial atypia of undetermined significance or follicular lesion of undetermined significance results for thyroid nodules 10 mm or larger.

    Science.gov (United States)

    Koh, Jieun; Kim, Eun-Kyung; Kwak, Jin Young; Yoon, Jung Hyun; Moon, Hee Jung

    2016-12-01

    To investigate whether repeat ultrasound-guided fine-needle aspiration (US-FNA) could be performed 6 months or more after initial atypia of undetermined significance or follicular lesion of undetermined significance (AUS/FLUS) results. A total of 221 AUS/FLUS nodules ≥10 mm with any follow-up were grouped according to the first follow-up interval: less than 6 months (group 1, n = 87) and 6 months or more (group 2, n = 134). Clinical features, final assessment by ultrasound (US) or the Thyroid Imaging Reporting and Data System (TIRADS), tumour size, extrathyroidal extension and lymph node metastasis in malignancies were compared. Thirty-four (15.4 %) were malignant. Age, gender, size, final assessment, TIRADS and malignancy rate were not significantly different between the two groups (p = 0.660, 0.691, 0.502, 0.237, 0.819 and 0.420). Tumour size, extrathyroidal extension and lymph node metastasis were not significantly different between the two malignancy groups (p = 0.770, 0.611 and 0.068). Two of 10 nodules with increased size were malignancies, found at 7.1 and 25.0 months. None of 33 nodules (14.9 %) with decreased size at a median 10 months were malignant. Repeat US-FNA performed on nodules ≥10 mm at 6 months or more after initial AUS/FLUS results can reduce unnecessary repeat US-FNAs without progression of malignancy. • Follow-up intervals of AUS/FLUS did not affect the malignancy rate • Tumour stage was not different according to the follow-up intervals • None of the nodules with decreased size were malignant • Repeat US-FNA can be performed at ≥6 months after initial AUS/FLUS.

  10. Comparison of performance of simulation models for floor heating

    DEFF Research Database (Denmark)

    Weitzmann, Peter; Svendsen, Svend

    2005-01-01

    This paper describes the comparison of performance of simulation models for floor heating with different levels of detail in the modelling process. The models are compared in an otherwise identical simulation model containing room model, walls, windows, ceiling and ventilation system. By exchanging only the floor construction, the differences can be directly compared. In this comparison, a two-dimensional model of a slab-on-grade floor including foundation is used as reference. The other models include a one-dimensional model and a thermal network model including the linear thermal transmittance of the foundation. The result can also be found in the energy consumption of the building, since up to half the energy consumption is lost through the ground. Looking at the different implementations it is also found that including a 1 m ground volume below the floor construction under a one-dimensional model...

  11. Selecting Optimal Subset of Features for Student Performance Model

    Directory of Open Access Journals (Sweden)

    Hany M. Harb

    2012-09-01

    Full Text Available Educational data mining (EDM) is a new, growing research area in which core data mining concepts are applied in the educational field for the purpose of extracting useful information on student behavior in the learning process. Classification methods like decision trees, rule mining and Bayesian networks can be applied to educational data for predicting student behavior, such as performance in an examination. This prediction may help in student evaluation. As feature selection influences the predictive accuracy of any performance model, it is essential to study the effectiveness of a student performance model in connection with feature selection techniques. The main objective of this work is to achieve high predictive performance by adopting various feature selection techniques to increase predictive accuracy with the least number of features. The outcomes show a reduction in computational time and construction cost in both the training and classification phases of the student performance model.
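As a toy illustration of filter-style feature selection, features can be ranked by absolute correlation with the outcome and the top k kept. This is a generic sketch, not one of the specific techniques evaluated in the paper; the student records and feature names are invented.

```python
def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def select_features(rows, target, names, k=2):
    """Keep the k features most correlated (in absolute value) with target."""
    scores = {name: abs(pearson([r[j] for r in rows], target))
              for j, name in enumerate(names)}
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Hypothetical records: (attendance %, assignment average, shoe size)
rows = [(95, 88, 7), (60, 55, 9), (80, 75, 8), (40, 42, 7),
        (99, 91, 10), (70, 66, 6), (55, 50, 8), (85, 80, 9)]
exam = [90, 52, 78, 40, 95, 68, 50, 82]
best = select_features(rows, exam, ["attendance", "assignments", "shoe_size"])
```

The irrelevant feature (shoe size) is discarded, reducing the input dimension before a classifier such as a decision tree is trained.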

  12. Integrated Main Propulsion System Performance Reconstruction Process/Models

    Science.gov (United States)

    Lopez, Eduardo; Elliott, Katie; Snell, Steven; Evans, Michael

    2013-01-01

    The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for postflight reporting to the project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process utilizes several methodologies, including multiple software programs, to model integrated propulsion system performance through space shuttle ascent. It is used to evaluate integrated propulsion systems, including propellant tanks, feed systems, rocket engine, and pressurization systems performance throughout ascent based on flight pressure and temperature data. The latest revision incorporates new methods based on main engine power balance model updates to model higher mixture ratio operation at lower engine power levels.

  13. Thermal Model Predictions of Advanced Stirling Radioisotope Generator Performance

    Science.gov (United States)

    Wang, Xiao-Yen J.; Fabanich, William Anthony; Schmitz, Paul C.

    2014-01-01

    This presentation describes the capabilities of a three-dimensional thermal power model of the Advanced Stirling Radioisotope Generator (ASRG). The performance of the ASRG is presented for different scenarios, such as a Venus flyby with or without the auxiliary cooling system.

  14. Performance modeling of data dissemination in vehicular ad hoc networks

    DEFF Research Database (Denmark)

    Chaqfeh, Moumena; Lakas, Abderrahmane; Lazarova-Molnar, Sanja

    2013-01-01

    Vehicular ad hoc networks (VANETs) are characterized by an ad hoc nature that does not require fixed infrastructure or centralized administration. However, designing scalable information dissemination techniques for VANET applications remains a challenging task due to the inherent nature of such highly dynamic environments. Existing dissemination techniques often resort to simulation for performance evaluation, and only a few studies offer mathematical modeling. In this paper we provide a comparative study of existing performance modeling approaches for data dissemination techniques designed for different VANET applications.

  15. Modeling radial flow ion exchange performance for condensate polisher conditions

    Energy Technology Data Exchange (ETDEWEB)

    Shallcross, D. [University of Melbourne, Melbourne, VIC (Australia). Department of Chemical Engineering; Renouf, P.

    2001-11-01

    A theoretical model is developed which simulates ion exchange performance within an annular resin bed. Flow within the mixed ion exchange bed is diverging, with the solution flowing outwards away from the bed's axis. The model is used to simulate performance of a mixed annular bed operating under condensate polisher conditions. The simulation predictions are used to develop design envelope curves for practical radial flow beds and to estimate potential cost savings flowing from less expensive polisher vessels. (orig.)

  16. Performance of Air Pollution Models on Massively Parallel Computers

    DEFF Research Database (Denmark)

    Brown, John; Hansen, Per Christian; Wasniewski, Jerzy

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on the computers. Using a realistic large-scale model, we gain detailed insight about the performance of the three computers when used to solve large-scale scientific problems...... that involve several types of numerical computations. The computers considered in our study are the Connection Machines CM-200 and CM-5, and the MasPar MP-2216...

  17. A Formal Comparison of Model Variants for Performance Prediction

    Science.gov (United States)

    2009-12-01

    [Figures: human vs. model performance scores for "Mission Team Performance in UAS Predator Simulation" (CERI, 2005) and "Team Performance in F-16 Simulator Missions" (DMO Testbed, Mesa); Table 2 reports cross-validation RMSD.] The authors would like to thank the Cognitive Engineering Research Institute (CERI) and researchers from Mesa's Warfighter Readiness Research Division.

  18. Human Performance Modeling and Simulation for Launch Team Applications

    Science.gov (United States)

    Peaden, Cary J.; Payne, Stephen J.; Hoblitzell, Richard M., Jr.; Chandler, Faith T.; LaVine, Nils D.; Bagnall, Timothy M.

    2006-01-01

    This paper describes ongoing research into modeling and simulation of humans for launch team analysis, training, and evaluation. The initial research is sponsored by the National Aeronautics and Space Administration's (NASA)'s Office of Safety and Mission Assurance (OSMA) and NASA's Exploration Program and is focused on current and future launch team operations at Kennedy Space Center (KSC). The paper begins with a description of existing KSC launch team environments and procedures. It then describes the goals of new Simulation and Analysis of Launch Teams (SALT) research. The majority of this paper describes products from the SALT team's initial proof-of-concept effort. These products include a nominal case task analysis and a discrete event model and simulation of launch team performance during the final phase of a shuttle countdown; and a first proof-of-concept training demonstration of launch team communications in which the computer plays most roles, and the trainee plays a role of the trainee's choice. This paper then describes possible next steps for the research team and provides conclusions. This research is expected to have significant value to NASA's Exploration Program.

  19. Using Machine Learning to Create Turbine Performance Models (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Clifton, A.

    2013-04-01

    Wind turbine power output is known to be a strong function of wind speed, but is also affected by turbulence and shear. In this work, new aerostructural simulations of a generic 1.5 MW turbine are used to explore atmospheric influences on power output. Most significant is the hub height wind speed, followed by hub height turbulence intensity and then wind speed shear across the rotor disk. These simulation data are used to train regression trees that predict the turbine response for any combination of wind speed, turbulence intensity, and wind shear that might be expected at a turbine site. For a randomly selected atmospheric condition, the accuracy of the regression tree power predictions is three times higher than that of the traditional power curve methodology. The regression tree method can also be applied to turbine test data and used to predict turbine performance at a new site. No new data is required in comparison to the data that are usually collected for a wind resource assessment. Implementing the method requires turbine manufacturers to create a turbine regression tree model from test site data. Such an approach could significantly reduce bias in power predictions that arise because of different turbulence and shear at the new site, compared to the test site.
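The mechanics of such a regression tree can be sketched in a few lines: recursively choose the feature/threshold split that minimizes squared error, and predict the leaf mean. The toy turbine response (cubic in wind speed, damped by turbulence intensity) and the tree settings below are illustrative; the simulator-trained trees described above are far larger.

```python
def fit_tree(X, y, depth=3, min_leaf=5):
    """Grow a tiny CART-style regression tree; leaves store the mean."""
    if depth == 0 or len(y) < 2 * min_leaf:
        return sum(y) / len(y)
    best = None
    for j in range(len(X[0])):
        for t in sorted(set(row[j] for row in X)):
            left = [i for i, row in enumerate(X) if row[j] <= t]
            left_set = set(left)
            right = [i for i in range(len(X)) if i not in left_set]
            if len(left) < min_leaf or len(right) < min_leaf:
                continue
            sse = 0.0                       # squared error of this split
            for idx in (left, right):
                m = sum(y[i] for i in idx) / len(idx)
                sse += sum((y[i] - m) ** 2 for i in idx)
            if best is None or sse < best[0]:
                best = (sse, j, t, left, right)
    if best is None:
        return sum(y) / len(y)
    _, j, t, left, right = best
    return (j, t,
            fit_tree([X[i] for i in left], [y[i] for i in left], depth - 1, min_leaf),
            fit_tree([X[i] for i in right], [y[i] for i in right], depth - 1, min_leaf))

def predict(node, x):
    while isinstance(node, tuple):
        j, t, lo, hi = node
        node = lo if x[j] <= t else hi
    return node

# Toy response: power ~ (wind speed)^3, reduced by turbulence intensity
data = [((ws / 10.0, ti), (ws / 10.0) ** 3 * (1 - 0.5 * ti))
        for ws in range(30, 121, 2) for ti in (0.05, 0.10, 0.20)]
X = [x for x, _ in data]
y = [p for _, p in data]
tree = fit_tree(X, y, depth=4)
```

Unlike a binned power curve in wind speed alone, the tree is free to split on turbulence intensity (or shear, if added as a feature) wherever that reduces prediction error.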

  20. A Mathematical Model to Improve the Performance of Logistics Network

    Directory of Open Access Journals (Sweden)

    Muhammad Izman Herdiansyah

    2012-01-01

    Full Text Available The role of logistics nowadays is expanding from just providing transportation and warehousing to offering total integrated logistics. To remain competitive in the global market environment, business enterprises need to improve their logistics operations performance. The improvement will be achieved when we can provide a comprehensive analysis and optimize network performance. In this paper, a mixed integer linear model for optimizing logistics network performance is developed. It provides a single-product multi-period multi-facility model, as well as the multi-product concept. The problem is modeled in the form of a network flow problem with the main objective of minimizing total logistics cost. The problem can be solved using commercial linear programming packages like CPLEX or LINDO. Even for small cases, the solver in Excel may also be used to solve such a model. Keywords: logistics network, integrated model, mathematical programming, network optimization
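The single-product network-flow core of such a mixed integer linear model can be sketched as follows. The notation is generic, not the paper's own: $f_j$ is the fixed cost of opening facility $j$, $y_j$ its binary open/close decision, $x_{ijt}$ the flow on arc $(i,j)$ in period $t$, $c_{ij}$ the unit shipping cost, $d_{jt}$ the net demand at node $j$, and $K_j$ the facility capacity.

```latex
\begin{align*}
\min\quad & \sum_{j} f_j\, y_j \;+\; \sum_{t}\sum_{(i,j)\in A} c_{ij}\, x_{ijt} \\
\text{s.t.}\quad
& \sum_{i:(i,j)\in A} x_{ijt} \;-\; \sum_{k:(j,k)\in A} x_{jkt} \;=\; d_{jt}
  && \forall j,\, t \quad \text{(flow conservation)} \\
& \sum_{i:(i,j)\in A} x_{ijt} \;\le\; K_j\, y_j
  && \forall j,\, t \quad \text{(capacity only if open)} \\
& x_{ijt} \ge 0, \qquad y_j \in \{0,1\}.
\end{align*}
```

The multi-product extension adds a product index to $x$ and $d$ and sums products inside the capacity constraint.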


  2. Performance analysis of FXLMS algorithm with secondary path modeling error

    Institute of Scientific and Technical Information of China (English)

    SUN Xu; CHEN Duanshi

    2003-01-01

    Performance analysis of the filtered-X LMS (FXLMS) algorithm with secondary path modeling error is carried out in both the time and frequency domains. It is first shown that the effects of secondary path modeling error on the performance of the FXLMS algorithm are determined by the distribution of the relative error of the secondary path model over frequency. If the relative error is uniformly distributed, the modeling error of the secondary path has no effect on the performance of the algorithm. In addition, a limitation property of the FXLMS algorithm is proved, which implies that the negative effects of secondary path modeling error can be compensated for by increasing the adaptive filter length. Finally, some insights into the "spillover" phenomenon of the FXLMS algorithm are given.
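A minimal FXLMS sketch makes the role of the secondary path model concrete: the reference is filtered through the *model* of the secondary path before the LMS update, while the anti-noise reaches the error sensor through the *true* secondary path. The paths, tone, and step size below are synthetic illustrations, not the conditions analysed in the paper; note the deliberately imperfect model.

```python
import math

def fxlms(reference, disturbance, sec_path, sec_model, taps=8, mu=0.01):
    """Filtered-X LMS active-noise-control loop (single channel)."""
    w = [0.0] * taps                  # adaptive control filter
    xbuf = [0.0] * taps               # reference history
    fxbuf = [0.0] * taps              # filtered-reference history
    ybuf = [0.0] * len(sec_path)      # control-output history
    errors = []
    for n in range(len(reference)):
        xbuf = [reference[n]] + xbuf[:-1]
        y = sum(wi * xi for wi, xi in zip(w, xbuf))
        ybuf = [y] + ybuf[:-1]
        # anti-noise reaching the error sensor via the TRUE secondary path
        anti = sum(s * yi for s, yi in zip(sec_path, ybuf))
        e = disturbance[n] + anti
        # reference filtered through the (imperfect) secondary-path MODEL
        fx = sum(s * xi for s, xi in zip(sec_model, xbuf))
        fxbuf = [fx] + fxbuf[:-1]
        w = [wi - mu * e * fxi for wi, fxi in zip(w, fxbuf)]
        errors.append(e)
    return errors

N = 4000
x = [math.sin(0.2 * n) for n in range(N)]
d = [-0.8 * math.sin(0.2 * (n - 3)) for n in range(N)]   # correlated disturbance
errors = fxlms(x, d, sec_path=[0.0, 0.9, 0.3], sec_model=[0.0, 0.8, 0.35])
```

Despite the model mismatch, the small phase error between `sec_model` and `sec_path` keeps the update direction valid and the residual error decays, consistent with the robustness result analysed above.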

  3. Performance evaluation of quantum well infrared phototransistor instrumentation through modeling

    Science.gov (United States)

    El-Tokhy, Mohamed S.; Mahmoud, Imbaby I.

    2014-05-01

    This paper presents a theoretical analysis of the characteristics of quantum well infrared phototransistors (QWIPTs). A mathematical model describing this device is introduced under a nonuniform distribution of quantum wells (QWs). The MATLAB environment is used to devise this model. Furthermore, block diagram models in the VisSim environment were used to describe the device characteristics. The developed models are used to investigate the behavior of the device for different values of performance parameters such as bias voltage, spacing between QWs, and temperature. These parameters are tuned to enhance the performance of these quantum phototransistors through the presented modeling. Moreover, the resultant performance characteristics and a comparison between QWIPTs and quantum wire infrared phototransistors are investigated. The obtained results are validated against published experimental work and full agreement is obtained.

  4. Configuration of Distributed Message Converter Systems using Performance Modeling

    NARCIS (Netherlands)

    Aberer, Karl; Risse, Thomas; Wombacher, Andreas

    2001-01-01

    To find a configuration of a distributed system satisfying performance goals is a complex search problem that involves many design parameters, like hardware selection, job distribution and process configuration. Performance models are powerful tools to analyse potential system configurations, however...

  5. A Composite Model for Employees' Performance Appraisal and Improvement

    Science.gov (United States)

    Manoharan, T. R.; Muralidharan, C.; Deshmukh, S. G.

    2012-01-01

    Purpose: The purpose of this paper is to develop an innovative method of performance appraisal that will be useful for designing a structured training programme. Design/methodology/approach: Employees' performance appraisals are conducted using new approaches, namely data envelopment analysis and an integrated fuzzy model. Interpretive structural…

  6. Performance Implications of Business Model Change: A Case Study

    Directory of Open Access Journals (Sweden)

    Jana Poláková

    2015-01-01

    Full Text Available The paper deals with changes in performance level introduced by a change of business model. The selected case is a small family business undergoing substantial changes in response to structural changes in its markets. The authors use the concept of the business model to describe the value creation processes within the selected family business and, by contrasting the value creation processes before and after the change, demonstrate the role of the business model as a performance differentiator. This is illustrated with business model canvases constructed on the basis of interviews, observations and document analysis. The two business model canvases allow the cause-and-effect relationships within the business that led to the change in performance to be explained. The change in performance is assessed by a financial analysis of the business over the period 2006–2012: ROA, ROE and ROS reached their lowest levels before the change of business model was introduced and grew after its introduction, and the activity indicators of the family business developed similarly. The described case study contributes to the concept of business modeling with arguments supporting its value as a strategic tool facilitating decisions related to value creation within the business.
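    The profitability indicators tracked in the case (ROA, ROE, ROS) are simple financial-statement ratios. A minimal sketch, with illustrative figures that are not the studied business's data:

```python
# Profitability ratios used in the case's financial analysis: return on
# assets, return on equity and return on sales. All figures below are
# illustrative placeholders, not data from the studied family business.

def ratios(net_income, assets, equity, sales):
    return {"ROA": net_income / assets,
            "ROE": net_income / equity,
            "ROS": net_income / sales}

before = ratios(net_income=20.0, assets=1000.0, equity=400.0, sales=800.0)
after  = ratios(net_income=60.0, assets=1100.0, equity=450.0, sales=900.0)
```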

  7. High-Performance Work Systems: American Models of Workplace Transformation.

    Science.gov (United States)

    Appelbaum, Eileen; Batt, Rosemary

    Rising competition in world and domestic markets for the past 2 decades has necessitated that U.S. companies undergo significant transformations to improve their performance with respect to a wide array of efficiency and quality indicators. Research on the transformations recently undertaken by some U.S. companies to boost performance revealed two…

  8. Weighted Feature Significance: A Simple, Interpretable Model of Compound Toxicity Based on the Statistical Enrichment of Structural Features

    OpenAIRE

    Huang, Ruili; Southall, Noel; Xia, Menghang; Cho, Ming-Hsuang; Jadhav, Ajit; Nguyen, Dac-Trung; Inglese, James; Tice, Raymond R.; Austin, Christopher P.

    2009-01-01

    In support of the U.S. Tox21 program, we have developed a simple and chemically intuitive model we call weighted feature significance (WFS) to predict the toxicological activity of compounds, based on the statistical enrichment of structural features in toxic compounds. We trained and tested the model on the following: (1) data from quantitative high–throughput screening cytotoxicity and caspase activation assays conducted at the National Institutes of Health Chemical Genomics Center, (2) dat...
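    The scoring idea described above can be sketched simply: each structural feature gets a weight from its enrichment in toxic versus non-toxic training compounds, and a compound's score sums the weights of its features. The actual WFS weighting is based on statistical significance; the smoothed log-odds weight below is a simplified stand-in, and the features and compounds are toy data.

```python
# Simplified weighted-feature-significance sketch: features enriched in
# toxic compounds get positive weights, features enriched in non-toxic
# compounds get negative weights (smoothed log-odds as a stand-in for the
# paper's significance-based weights). All data are toy.
import math

def feature_weights(toxic_sets, nontoxic_sets):
    feats = set().union(*toxic_sets, *nontoxic_sets)
    w = {}
    for f in feats:
        t = sum(f in s for s in toxic_sets) + 0.5       # smoothed counts
        nt = sum(f in s for s in nontoxic_sets) + 0.5
        w[f] = math.log(t / nt)
    return w

def wfs_score(compound_feats, w):
    """Sum the weights of the features present in a compound."""
    return sum(w.get(f, 0.0) for f in compound_feats)

toxic = [{"nitro", "aromatic"}, {"nitro", "halide"}, {"nitro"}]
benign = [{"aromatic"}, {"halide", "aromatic"}, {"ester"}]
w = feature_weights(toxic, benign)
```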

  9. Activity-Based Costing Model for Assessing Economic Performance.

    Science.gov (United States)

    DeHayes, Daniel W.; Lovrinic, Joseph G.

    1994-01-01

    An economic model for evaluating the cost performance of academic and administrative programs in higher education is described. Examples from its application at Indiana University-Purdue University Indianapolis are used to illustrate how the model has been used to control costs and reengineer processes. (Author/MSE)

  10. Null Objects in Second Language Acquisition: Grammatical vs. Performance Models

    Science.gov (United States)

    Zyzik, Eve C.

    2008-01-01

    Null direct objects provide a favourable testing ground for grammatical and performance models of argument omission. This article examines both types of models in order to determine which gives a more plausible account of the second language data. The data were collected from second language (L2) learners of Spanish by means of four oral…

  11. Modelling the Performance of Product Integrated Photovoltaic (PIPV) Cells Indoors

    NARCIS (Netherlands)

    Apostolou, G.; Verwaal, M.; Reinders, Angelina H.M.E.

    2014-01-01

    In this paper we present a model that has been developed to estimate the performance of PV product cells in an indoor environment. The model computes the efficiency and power production of PV technologies as a function of distance from natural and artificial light sources. It intends…

  12. The Use of Neural Network Technology to Model Swimming Performance

    Science.gov (United States)

    Silva, António José; Costa, Aldo Manuel; Oliveira, Paulo Moura; Reis, Victor Machado; Saavedra, José; Perl, Jurgen; Rouboa, Abel; Marinho, Daniel Almeida

    2007-01-01

    The aims of the present study were: to identify the factors which are able to explain the performance in the 200 meters individual medley and 400 meters front crawl events in young swimmers, to model the performance in those events using non-linear mathematic methods through artificial neural networks (multi-layer perceptrons) and to assess the neural network models' precision in predicting the performance. A sample of 138 young swimmers (65 males and 73 females) of national level was submitted to a test battery comprising four different domains: kinanthropometric evaluation, dry land functional evaluation (strength and flexibility), swimming functional evaluation (hydrodynamics, hydrostatic and bioenergetics characteristics) and swimming technique evaluation. To establish a profile of the young swimmer, non-linear combinations between preponderant variables for each gender and swim performance in the 200 meters medley and 400 meters front crawl events were developed. For this purpose a feed forward neural network was used (Multilayer Perceptron) with three neurons in a single hidden layer. The prognosis precision of the model (error lower than 0.8% between true and estimated performances) is supported by recent evidence. Therefore, we consider that the neural network tool can be a good approach in the resolution of complex problems such as performance modeling and talent identification in swimming and, possibly, in a wide variety of sports.
    Key points: The non-linear analysis resulting from the use of a feed-forward neural network allowed the development of four performance models. The mean difference between the true and estimated results produced by each of the four neural network models was low. The neural network tool can be a good approach to performance modeling, as an alternative to standard statistical models that presume well-defined distributions and independence among all inputs. The use of neural networks for sports…
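    The architecture described above, a feed-forward network with a single hidden layer of three neurons, can be sketched as a forward pass. The weights below are arbitrary placeholders rather than fitted values, and the inputs are illustrative stand-ins for the test-battery scores.

```python
# Forward pass of a multilayer perceptron with one hidden layer of three
# tanh neurons, mapping a swimmer's test-battery variables to a predicted
# performance time. Weights and inputs are illustrative placeholders.
import math

def mlp_predict(x, W_hidden, b_hidden, w_out, b_out):
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W_hidden, b_hidden)]
    return sum(w * h for w, h in zip(w_out, hidden)) + b_out

# 4 inputs (e.g. anthropometric and functional scores) -> 3 hidden -> 1 output
W_hidden = [[0.2, -0.1, 0.4, 0.05], [0.1, 0.3, -0.2, 0.0], [-0.3, 0.2, 0.1, 0.1]]
b_hidden = [0.0, 0.1, -0.1]
w_out, b_out = [10.0, -5.0, 8.0], 150.0     # seconds scale for a 200 m event
x = [0.5, 1.2, -0.3, 0.8]
t_pred = mlp_predict(x, W_hidden, b_hidden, w_out, b_out)
```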

  13. Hybrid Corporate Performance Prediction Model Considering Technical Capability

    Directory of Open Access Journals (Sweden)

    Joonhyuck Lee

    2016-07-01

    Full Text Available Many studies have tried to predict corporate performance and stock prices to enhance investment profitability using qualitative approaches such as the Delphi method. However, developments in data processing technology and machine-learning algorithms have resulted in efforts to develop quantitative prediction models in various managerial subject areas. We propose a quantitative corporate performance prediction model that applies the support vector regression (SVR) algorithm to solve the problem of the overfitting of training data and can be applied to regression problems. The proposed model optimizes the SVR training parameters based on the training data, using the genetic algorithm to achieve sustainable predictability in changeable markets and managerial environments. Technology-intensive companies represent an increasing share of the total economy. The performance and stock prices of these companies are affected by their financial standing and their technological capabilities. Therefore, we apply both financial indicators and technical indicators to establish the proposed prediction model. Here, we use time series data, including financial, patent, and corporate performance information of 44 electronic and IT companies. Then, we predict the performance of these companies as an empirical verification of the prediction performance of the proposed model.
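    The genetic-algorithm search over SVR training parameters can be sketched without the SVR itself. In the toy loop below, a cheap stand-in objective replaces "train SVR with (C, epsilon) and return validation error", so the GA machinery (selection, crossover, mutation, elitism) is the focus; all names and the toy landscape are illustrative, not the paper's setup.

```python
# GA sketch for tuning two SVR-style hyperparameters (C, epsilon).
# validation_error is a stand-in objective minimized at C=10, eps=0.1.
import random

def validation_error(C, eps):
    # Placeholder for actual SVR training + validation.
    return (C - 10.0) ** 2 + 50.0 * (eps - 0.1) ** 2

def ga_tune(pop_size=20, generations=40, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 50), rng.uniform(0, 1)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: validation_error(*p))
        parents = pop[: pop_size // 2]            # truncation selection + elitism
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            c0 = (a[0] + b[0]) / 2                # averaging crossover
            c1 = (a[1] + b[1]) / 2
            c0 += rng.gauss(0, 0.5)               # Gaussian mutation
            c1 = min(1.0, max(0.0, c1 + rng.gauss(0, 0.02)))
            children.append((c0, c1))
        pop = parents + children
    return min(pop, key=lambda p: validation_error(*p))

C_best, eps_best = ga_tune()
```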

  14. Facial Performance Transfer via Deformable Models and Parametric Correspondence.

    Science.gov (United States)

    Asthana, Akshay; de la Hunty, Miles; Dhall, Abhinav; Goecke, Roland

    2012-09-01

    The issue of transferring facial performance from one person's face to another's has been an area of interest for the movie industry and the computer graphics community for quite some time. In recent years, deformable face models, such as the Active Appearance Model (AAM), have made it possible to track and synthesize faces in real time. Not surprisingly, deformable face model-based approaches for facial performance transfer have gained tremendous interest in the computer vision and graphics community. In this paper, we focus on the problem of real-time facial performance transfer using the AAM framework. We propose a novel approach of learning the mapping between the parameters of two completely independent AAMs, using them to facilitate the facial performance transfer in a more realistic manner than previous approaches. The main advantage of modeling this parametric correspondence is that it allows a "meaningful" transfer of both the nonrigid shape and texture across faces irrespective of the speakers' gender, shape, and size of the faces, and illumination conditions. We explore linear and nonlinear methods for modeling the parametric correspondence between the AAMs and show that the sparse linear regression method performs the best. Moreover, we show the utility of the proposed framework for a cross-language facial performance transfer that is an area of interest for the movie dubbing industry.
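    The linear variant of the parametric correspondence above amounts to learning a linear map between two parameter spaces from paired frames and then applying it to a new performance. The toy sketch below fits a 2x2 map by gradient-descent least squares; the real AAM parameter vectors are much larger and the paper's best method is sparse linear regression, so this is only an illustration of the idea.

```python
# Toy parametric-correspondence sketch: learn a linear map W such that
# target_params ~= W @ source_params from paired training frames.
# Dimensions, data and the ground-truth map are illustrative.

def learn_linear_map(src, tgt, dim, steps=2000, lr=0.05):
    """Gradient-descent least squares for tgt ~= W @ src (row-major W)."""
    W = [[0.0] * dim for _ in range(dim)]
    n = len(src)
    for _ in range(steps):
        for i in range(dim):
            for j in range(dim):
                g = 0.0
                for s, t in zip(src, tgt):
                    pred = sum(W[i][k] * s[k] for k in range(dim))
                    g += (pred - t[i]) * s[j]
                W[i][j] -= lr * g / n
    return W

# Paired parameter tracks generated by a known ground-truth map.
true_W = [[0.8, 0.1], [-0.2, 1.1]]
src = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, -0.5], [-1.0, 0.3]]
tgt = [[sum(true_W[i][k] * s[k] for k in range(2)) for i in range(2)] for s in src]
W = learn_linear_map(src, tgt, 2)
```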

  15. MODEL-BASED PERFORMANCE EVALUATION APPROACH FOR MOBILE AGENT SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    Li Xin; Mi Zhengkun; Meng Xudong

    2004-01-01

    Claimed as the next generation programming paradigm, mobile agent technology has attracted extensive interests in recent years. However, up to now, limited research efforts have been devoted to the performance study of mobile agent system and most of these researches focus on agent behavior analysis resulting in that models are hard to apply to mobile agent systems. To bridge the gap, a new performance evaluation model derived from operation mechanisms of mobile agent platforms is proposed. Details are discussed for the design of companion simulation software, which can provide the system performance such as response time of platform to mobile agent. Further investigation is followed on the determination of model parameters. Finally comparison is made between the model-based simulation results and measurement-based real performance of mobile agent systems. The results show that the proposed model and designed software are effective in evaluating performance characteristics of mobile agent systems. The proposed approach can also be considered as the basis of performance analysis for large systems composed of multiple mobile agent platforms.

  16. Observer analysis and its impact on task performance modeling

    Science.gov (United States)

    Jacobs, Eddie L.; Brown, Jeremy B.

    2014-05-01

    Fire fighters use relatively low cost thermal imaging cameras to locate hot spots and fire hazards in buildings. This research describes the analyses performed to study the impact of thermal image quality on fire fighter fire hazard detection task performance. Using human perception data collected by the National Institute of Standards and Technology (NIST) for fire fighters detecting hazards in a thermal image, an observer analysis was performed to quantify the sensitivity and bias of each observer. Using this analysis, the subjects were divided into three groups representing three different levels of performance. The top-performing group was used for the remainder of the modeling. Models were developed which related image quality factors such as contrast, brightness, spatial resolution, and noise to task performance probabilities. The models were fitted to the human perception data using logistic regression, as well as probit regression. Probit regression was found to yield superior fits and showed that models with not only 2nd order parameter interactions, but also 3rd order parameter interactions performed the best.
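    The regression step above relates an image-quality factor to a detection probability. A minimal sketch of the logistic version, fitted by gradient ascent on the log-likelihood (a probit fit would swap the link function); the contrast values and detection outcomes are toy data, not the NIST perception data.

```python
# Logistic model of detection probability vs. one image-quality factor
# (contrast), fitted by gradient ascent on the log-likelihood. Toy data.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, steps=5000, lr=0.1):
    a, b = 0.0, 0.0                       # P(detect) = sigmoid(a + b * x)
    for _ in range(steps):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            err = y - sigmoid(a + b * x)  # log-likelihood gradient terms
            ga += err
            gb += err * x
        a += lr * ga / len(xs)
        b += lr * gb / len(xs)
    return a, b

contrast = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
detected = [0, 0, 0, 1, 0, 1, 1, 1]
a, b = fit_logistic(contrast, detected)
```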

  17. Disaggregation of Rainy Hours: Compared Performance of Various Models.

    Science.gov (United States)

    Ben Haha, M.; Hingray, B.; Musy, A.

    In the urban environment, the response times of catchments are usually short. To design or to diagnose waterworks in that context, it is necessary to describe rainfall events with a good time resolution: a 10mn time step is often necessary. Such information is not always available. Rainfall disaggregation models thus have to be applied to produce that short-time-resolution information from rough rainfall data. The communication will present the performance obtained with several rainfall disaggregation models that allow for the disaggregation of rainy hours into six 10mn rainfall amounts. The ability of the models to reproduce some statistical characteristics of rainfall (mean, variance, overall distribution of 10mn rainfall amounts; extreme values of maximal rainfall amounts over different durations) is evaluated thanks to different graphical and numerical criteria. The performance of simple models presented in some scientific papers or developed in the Hydram laboratory, as well as the performance of more sophisticated ones, is compared with the performance of the basic constant disaggregation model. The compared models are either deterministic or stochastic; for some of them the disaggregation is based on scaling properties of rainfall. The compared models are, in increasing complexity order: constant model, linear model (Ben Haha, 2001), Ormsbee Deterministic model (Ormsbee, 1989), Artificial Neural Network based model (Burian et al. 2000), Hydram Stochastic 1 and Hydram Stochastic 2 (Ben Haha, 2001), Multiplicative Cascade based model (Olsson and Berndtsson, 1998), Ormsbee Stochastic model (Ormsbee, 1989). The 625 rainy hours used for that evaluation (with an hourly rainfall amount greater than 5mm) were extracted from the 21-year chronological rainfall series (10mn time step) observed at the Pully meteorological station, Switzerland.
The models were also evaluated when applied to different rainfall classes depending on the season first and on the
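    The baseline "constant" model that the other models are compared against splits each rainy hour into six equal 10mn amounts; a random-proportions split is one simple stochastic alternative. Both conserve the hourly mass exactly, as the sketch below shows with toy values (not the Pully data).

```python
# Two elementary disaggregation schemes for rainy hours -> six 10mn amounts.
# constant: equal split (the reference model); stochastic: uniform random
# weights normalized per hour. Values are toy, not the Pully series.
import random

def constant_disaggregation(hourly_mm):
    """Six equal 10mn amounts per hour; conserves mass, flattens variability."""
    return [[h / 6.0] * 6 for h in hourly_mm]

def stochastic_disaggregation(hourly_mm, seed=0):
    """Uniform random weights, normalized so each hour's mass is conserved."""
    rng = random.Random(seed)
    out = []
    for h in hourly_mm:
        w = [rng.random() for _ in range(6)]
        s = sum(w)
        out.append([h * wi / s for wi in w])
    return out

rainy_hours = [6.3, 12.0, 7.5]          # mm per rainy hour (toy, all > 5 mm)
const = constant_disaggregation(rainy_hours)
stoch = stochastic_disaggregation(rainy_hours)
```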

  18. The Relationship between Shared Mental Models and Task Performance in an Online Team- Based Learning Environment

    Science.gov (United States)

    Johnson, Tristan E.; Lee, Youngmin

    2008-01-01

    In an effort to better understand learning teams, this study examines the effects of shared mental models on team and individual performance. The results indicate that each team's shared mental model changed significantly over the time that subjects participated in team-based learning activities. The results also showed that the shared mental…

  19. Causal Analysis for Performance Modeling of Computer Programs

    Directory of Open Access Journals (Sweden)

    Jan Lemeire

    2007-01-01

    Full Text Available Causal modeling and the accompanying learning algorithms provide useful extensions for in-depth statistical investigation and automation of performance modeling. We enlarged the scope of existing causal structure learning algorithms by using the form-free information-theoretic concept of mutual information and by introducing the complexity criterion for selecting direct relations among equivalent relations. The underlying probability distribution of experimental data is estimated by kernel density estimation. We then reported on the benefits of a dependency analysis and the decompositional capacities of causal models. Useful qualitative models, providing insight into the role of every performance factor, were inferred from experimental data. This paper reports on the results for a LU decomposition algorithm and on the study of the parameter sensitivity of the Kakadu implementation of the JPEG-2000 standard. Next, the analysis was used to search for generic performance characteristics of the applications.
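    The form-free dependency measure at the core of the analysis above is mutual information. The paper estimates densities with kernel density estimation for continuous data; the discrete toy version below shows the quantity itself, computed from joint counts.

```python
# Mutual information between two discrete performance factors, estimated
# from joint counts (a discrete stand-in for the paper's KDE-based
# estimation on continuous data). Sequences below are toy.
import math
from collections import Counter

def mutual_information(xs, ys):
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p = c / n
        mi += p * math.log2(p * n * n / (px[x] * py[y]))
    return mi

a = [0, 0, 1, 1, 0, 1, 0, 1]
b = a[:]                        # perfectly dependent copy: MI = H(a) = 1 bit
c = [0, 1, 0, 1, 0, 1, 0, 1]   # weakly related sequence: MI well below 1
```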

  20. Impact of reactive settler models on simulated WWTP performance.

    Science.gov (United States)

    Gernaey, K V; Jeppsson, U; Batstone, D J; Ingildsen, P

    2006-01-01

    Including a reactive settler model in a wastewater treatment plant model allows representation of the biological reactions taking place in the sludge blanket in the settler, something that is neglected in many simulation studies. The idea of including a reactive settler model is investigated for an ASM1 case study. Simulations with a whole plant model including the non-reactive Takács settler model are used as a reference, and are compared to simulation results considering two reactive settler models. The first is a return sludge model block removing oxygen and a user-defined fraction of nitrate, combined with a non-reactive Takács settler. The second is a fully reactive ASM1 Takács settler model. Simulations with the ASM1 reactive settler model predicted a 15.3% and 7.4% improvement of the simulated N removal performance, for constant (steady-state) and dynamic influent conditions respectively. The oxygen/nitrate return sludge model block predicts a 10% improvement of N removal performance under dynamic conditions, and might be the better modelling option for ASM1 plants: it is computationally more efficient and it will not overrate the importance of decay processes in the settler.

  1. Ecological niche modeling in Maxent: the importance of model complexity and the performance of model selection criteria.

    Science.gov (United States)

    Warren, Dan L; Seifert, Stephanie N

    2011-03-01

    Maxent, one of the most commonly used methods for inferring species distributions and environmental tolerances from occurrence data, allows users to fit models of arbitrary complexity. Model complexity is typically constrained via a process known as L1 regularization, but at present little guidance is available for setting the appropriate level of regularization, and the effects of inappropriately complex or simple models are largely unknown. In this study, we demonstrate the use of information criterion approaches to setting regularization in Maxent, and we compare models selected using information criteria to models selected using other criteria that are common in the literature. We evaluate model performance using occurrence data generated from a known "true" initial Maxent model, using several different metrics for model quality and transferability. We demonstrate that models that are inappropriately complex or inappropriately simple show reduced ability to infer habitat quality, reduced ability to infer the relative importance of variables in constraining species' distributions, and reduced transferability to other time periods. We also demonstrate that information criteria may offer significant advantages over the methods commonly used in the literature.
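    The information-criterion approach above reduces to comparing AIC (or its small-sample correction AICc) across candidate models of different complexity: the criterion trades log-likelihood against parameter count, penalizing both the inappropriately simple and the inappropriately complex model. The sketch below uses illustrative likelihood values, not Maxent output.

```python
# AIC / AICc model selection over candidates of increasing complexity.
# Log-likelihoods and parameter counts below are illustrative.
import math

def aic(log_likelihood, k):
    return 2 * k - 2 * log_likelihood

def aicc(log_likelihood, k, n):
    """Small-sample corrected AIC (requires n > k + 1)."""
    return aic(log_likelihood, k) + 2 * k * (k + 1) / (n - k - 1)

# Three candidate models fitted to n = 50 occurrence records:
candidates = {"simple": (-120.0, 3), "medium": (-112.0, 8), "complex": (-110.5, 20)}
scores = {name: aicc(ll, k, 50) for name, (ll, k) in candidates.items()}
best = min(scores, key=scores.get)   # the intermediate model wins here
```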

  2. Comparative Performance of Volatility Models for Oil Price

    Directory of Open Access Journals (Sweden)

    Afees A. Salisu

    2012-07-01

    Full Text Available In this paper, we compare the performance of volatility models for the oil price using daily returns of WTI. The innovations of this paper are twofold: (i) we analyse the oil price across three subsamples, namely the periods before, during and after the global financial crisis; (ii) we also analyse the comparative performance of both symmetric and asymmetric volatility models for the oil price. We find that the oil price was most volatile during the global financial crisis compared to the other subsamples. Based on the appropriate model selection criteria, the asymmetric GARCH models appear superior to the symmetric ones in dealing with oil price volatility. This finding indicates evidence of leverage effects in the oil market, and ignoring these effects in oil price modelling will lead to serious biases and misleading results.
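    The symmetric GARCH(1,1) recursion underlying the compared models is short enough to sketch; an asymmetric variant (e.g. GJR-GARCH) would add a term that activates only for negative returns, capturing the leverage effect. The parameters and return series below are illustrative, not estimates from WTI data.

```python
# GARCH(1,1) conditional-variance recursion:
#   h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1}
# Parameters and the toy return series are illustrative.

def garch11_variance(returns, omega=0.00001, alpha=0.08, beta=0.90):
    h = [omega / (1 - alpha - beta)]        # start at the unconditional variance
    for r in returns[:-1]:
        h.append(omega + alpha * r * r + beta * h[-1])
    return h

# A calm stretch followed by a crisis-like shock: variance jumps after it.
returns = [0.001] * 30 + [0.08] + [0.001] * 10
h = garch11_variance(returns)
```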

  3. Modeling and performance analysis of QoS data

    Science.gov (United States)

    Strzeciwilk, Dariusz; Zuberek, Włodzimierz M.

    2016-09-01

    The article presents the results of modeling and analysis of data transmission performance in systems that support quality of service. Models are designed and tested taking into account a multiservice network architecture, i.e. one supporting the transmission of data belonging to different traffic classes. The traffic shaping mechanisms studied are based on Priority Queuing, with both an integrated data source and various generated data sources. The basic problems of QoS-supporting architectures and queuing systems are discussed. Models based on Petri nets, supported by temporal logics, were designed and built. Simulation tools were used to verify the traffic shaping mechanisms with the applied queuing algorithms. It is shown that temporal Petri net models can be effectively used in modeling and analyzing the performance of computer networks.
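    The Priority Queuing discipline studied above can be illustrated with a tiny discrete-event loop: two traffic classes share one server, the high-priority class is always served first, and per-class waiting times are measured. This is a plain simulation sketch with toy arrivals, not the paper's Petri net models.

```python
# Non-preemptive Priority Queuing sketch: one server, two classes,
# lower priority value served first. Arrivals and service times are toy.
import heapq

def simulate_pq(jobs):
    """jobs: list of (arrival_time, priority, service_time)."""
    jobs = sorted(jobs)
    queue, waits, clock, i = [], {0: [], 1: []}, 0.0, 0
    while i < len(jobs) or queue:
        if not queue:                      # idle: jump to the next arrival
            clock = max(clock, jobs[i][0])
        while i < len(jobs) and jobs[i][0] <= clock:
            arr, prio, svc = jobs[i]
            heapq.heappush(queue, (prio, arr, svc))
            i += 1
        prio, arr, svc = heapq.heappop(queue)   # highest priority first
        waits[prio].append(clock - arr)
        clock += svc
    return waits

# Overloaded toy stream: classes alternate, service exceeds interarrival,
# so the low-priority class accumulates most of the waiting time.
jobs = [(float(t), t % 2, 1.5) for t in range(10)]
waits = simulate_pq(jobs)
```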

  4. Performance Models for Split-execution Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; McCaskey, Alex [ORNL; Schrock, Jonathan [ORNL; Seddiqi, Hadayat [ORNL; Britt, Keith A [ORNL; Imam, Neena [ORNL

    2016-01-01

    Split-execution computing leverages the capabilities of multiple computational models to solve problems, but splitting program execution across different computational models incurs costs associated with the translation between domains. We analyze the performance of a split-execution computing system developed from conventional and quantum processing units (QPUs) by using behavioral models that track resource usage. We focus on asymmetric processing models built using conventional CPUs and a family of special-purpose QPUs that employ quantum computing principles. Our performance models account for the translation of a classical optimization problem into the physical representation required by the quantum processor while also accounting for hardware limitations and conventional processor speed and memory. We conclude that the bottleneck in this split-execution computing system lies at the quantum-classical interface and that the primary time cost is independent of quantum processor behavior.
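    A behavioral timing model in the spirit described above can be sketched as an additive budget: classical translation (embedding) cost, programming/queueing overhead at the interface, and per-anneal quantum time. With the illustrative constants below (not measured values from the paper), the classical-side translation dominates, consistent with the conclusion that the bottleneck lies at the quantum-classical interface.

```python
# Additive timing-budget sketch for one split-execution solve. All
# constants are illustrative assumptions, not measured hardware figures.

def split_execution_time(n_vars, n_reads, t_embed_per_var=0.004,
                         t_program=0.01, t_anneal=20e-6):
    t_translate = t_embed_per_var * n_vars   # classical embedding/translation
    t_quantum = n_reads * t_anneal           # total anneal time on the QPU
    return {"translate": t_translate, "program": t_program,
            "quantum": t_quantum,
            "total": t_translate + t_program + t_quantum}

t = split_execution_time(n_vars=500, n_reads=1000)
```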

  5. Performance Assessment of Hydrological Models Considering Acceptable Forecast Error Threshold

    Directory of Open Access Journals (Sweden)

    Qianjin Dong

    2015-11-01

    Full Text Available It is essential to consider an acceptable threshold in the assessment of a hydrological model, both because of the scarcity of such research in the hydrology community and because errors do not necessarily cause risk. Two forecast errors, the rainfall forecast error and the peak flood forecast error, have been studied based on reliability theory. The first order second moment (FOSM) and bound methods are used to identify the reliability. Through the case study of the Dahuofang (DHF) Reservoir, it is shown that the correlation between these two errors has a great influence on the reliability index of the hydrological model. In particular, the reliability index of the DHF hydrological model decreases with increasing correlation. Based on reliability theory, the proposed performance evaluation framework, incorporating the acceptable forecast error threshold and the correlation among the multiple errors, can be used to evaluate the performance of a hydrological model and to quantify the uncertainties of a hydrological model output.
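    The FOSM reliability index for a linear limit state is simple to sketch. Assuming a limit state of the form g = threshold - (e1 + e2) with two correlated forecast errors (an assumed form for illustration, not the paper's exact formulation), beta = mean(g) / std(g), and it decreases as the correlation grows, matching the finding above. Numbers are illustrative.

```python
# FOSM reliability index for g = threshold - (e1 + e2) with correlated
# errors e1, e2. The limit-state form and all numbers are illustrative.
import math

def fosm_beta(threshold, mu, sigma, rho):
    mean_g = threshold - (mu[0] + mu[1])
    var_g = sigma[0] ** 2 + sigma[1] ** 2 + 2 * rho * sigma[0] * sigma[1]
    return mean_g / math.sqrt(var_g)

mu, sigma = (0.0, 0.0), (1.0, 1.5)
betas = [fosm_beta(5.0, mu, sigma, rho) for rho in (0.0, 0.5, 0.9)]
# beta shrinks as the error correlation rho increases.
```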

  6. Comparison of Simple Versus Performance-Based Fall Prediction Models

    Directory of Open Access Journals (Sweden)

    Shekhar K. Gadkaree BS

    2015-05-01

    Full Text Available Objective: To compare the predictive ability of standard falls prediction models based on physical performance assessments with more parsimonious prediction models based on self-reported data. Design: We developed a series of fall prediction models progressing in complexity and compared area under the receiver operating characteristic curve (AUC) across models. Setting: National Health and Aging Trends Study (NHATS), which surveyed a nationally representative sample of Medicare enrollees (age ≥65) at baseline (Round 1: 2011-2012) and 1-year follow-up (Round 2: 2012-2013). Participants: In all, 6,056 community-dwelling individuals participated in Rounds 1 and 2 of NHATS. Measurements: Primary outcomes were 1-year incidence of “any fall” and “recurrent falls.” Prediction models were compared and validated in development and validation sets, respectively. Results: A prediction model that included demographic information, self-reported problems with balance and coordination, and previous fall history was the most parsimonious model that optimized AUC for both any fall (AUC = 0.69, 95% confidence interval [CI] = [0.67, 0.71]) and recurrent falls (AUC = 0.77, 95% CI = [0.74, 0.79]) in the development set. Physical performance testing provided a marginal additional predictive value. Conclusion: A simple clinical prediction model that does not include physical performance testing could facilitate routine, widespread falls risk screening in the ambulatory care setting.
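    The AUC comparison above has a simple rank-based interpretation: the probability that a randomly chosen faller receives a higher risk score than a randomly chosen non-faller. The sketch below computes it directly from toy scores (not NHATS data).

```python
# Rank-based (Mann-Whitney) AUC: P(score_pos > score_neg), ties count half.
# Scores below are toy risk-model outputs, not NHATS data.

def auc(scores_pos, scores_neg):
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

fallers     = [0.9, 0.8, 0.7, 0.6, 0.4]
non_fallers = [0.5, 0.4, 0.3, 0.2, 0.1]
```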

  7. Human performance modeling for system of systems analytics.

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, Kevin R.; Lawton, Craig R.; Basilico, Justin Derrick; Longsine, Dennis E. (INTERA, Inc., Austin, TX); Forsythe, James Chris; Gauthier, John Henry; Le, Hai D.

    2008-10-01

    A Laboratory-Directed Research and Development project was initiated in 2005 to investigate Human Performance Modeling in a System of Systems analytic environment. SAND2006-6569 and SAND2006-7911 document interim results from this effort; this report documents the final results. The problem is difficult because of the number of humans involved in a System of Systems environment and the generally poorly defined nature of the tasks that each human must perform. A two-pronged strategy was followed: one prong was to develop human models using a probability-based method similar to that first developed for relatively well-understood probability-based performance modeling; the other prong was to investigate more state-of-the-art human cognition models. The probability-based modeling resulted in a comprehensive addition of human-modeling capability to the existing SoSAT computer program. The cognitive modeling resulted in an increased understanding of what is necessary to incorporate cognition-based models into a System of Systems analytic environment.

  9. THE USE OF NEURAL NETWORK TECHNOLOGY TO MODEL SWIMMING PERFORMANCE

    Directory of Open Access Journals (Sweden)

    António José Silva

    2007-03-01

    Full Text Available The aims of the present study were: to identify the factors which are able to explain the performance in the 200 meters individual medley and 400 meters front crawl events in young swimmers, to model the performance in those events using non-linear mathematic methods through artificial neural networks (multi-layer perceptrons) and to assess the neural network models' precision in predicting the performance. A sample of 138 young swimmers (65 males and 73 females) of national level was submitted to a test battery comprising four different domains: kinanthropometric evaluation, dry land functional evaluation (strength and flexibility), swimming functional evaluation (hydrodynamics, hydrostatic and bioenergetics characteristics) and swimming technique evaluation. To establish a profile of the young swimmer, non-linear combinations between preponderant variables for each gender and swim performance in the 200 meters medley and 400 meters front crawl events were developed. For this purpose a feed forward neural network was used (Multilayer Perceptron) with three neurons in a single hidden layer. The prognosis precision of the model (error lower than 0.8% between true and estimated performances) is supported by recent evidence. Therefore, we consider that the neural network tool can be a good approach in the resolution of complex problems such as performance modeling and talent identification in swimming and, possibly, in a wide variety of sports.

  10. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  11. Product Data Model for Performance-driven Design

    Science.gov (United States)

    Hu, Guang-Zhong; Xu, Xin-Jian; Xiao, Shou-Ne; Yang, Guang-Wu; Pu, Fan

    2017-09-01

    When designing large-sized complex machinery products, the design focus is always on the overall performance; however, no performance-driven design theory and method yet exists. In view of this deficiency in existing design theory, and according to the performance features of complex mechanical products, performance indices are introduced into the traditional design theory of "Requirement-Function-Structure" to construct a new five-domain design theory of "Client Requirement-Function-Performance-Structure-Design Parameter". To support design practice based on this new theory, a product data model is established using the performance indices and the mapping relationships between them and the other four domains. When the product data model is applied to high-speed train design, combined with existing research results and relevant standards, the corresponding data model and its structure involving the five domains of high-speed trains are established, which can provide technical support for studying the relationships between typical performance indices and design parameters and for quickly achieving a high-speed train scheme design. The five domains provide a reference for the design specification and evaluation criteria of high-speed trains and a new idea for the train's parameter design.

  12. End-to-end models for marine ecosystems: Are we on the precipice of a significant advance or just putting lipstick on a pig?

    Directory of Open Access Journals (Sweden)

    Kenneth A. Rose

    2012-02-01

    Full Text Available There has been a rapid rise in the development of end-to-end models for marine ecosystems over the past decade. Some reasons for this rise include the need to predict the effects of climate change on biota and dissatisfaction with existing models. While the benefits of a well-implemented end-to-end model are straightforward, there are many challenges. In the short term, my view is that the major role of end-to-end models is to push the modelling community forward, and to identify critical data so that these data can be collected now and thus be available for the next generation of end-to-end models. I think we should emulate physicists and build theoretically-oriented models first, and then collect the data. In the long term, end-to-end models will increase their skill, data collection will catch up, and end-to-end models will move towards site-specific applications with forecasting and management capabilities. One pathway into the future is individual efforts, over-promise, and repackaging of poorly performing component submodels (“lipstick on a pig”). The other pathway is a community-based collaborative effort, with appropriate caution and thoughtfulness, so that the needed improvements are achieved (“significant advance”). The promise of end-to-end modelling is great. We should act now to avoid missing a great opportunity.

  13. A CHAID Based Performance Prediction Model in Educational Data Mining

    Directory of Open Access Journals (Sweden)

    R. Bhaskaran

    2010-01-01

    Full Text Available The performance in higher secondary school education in India is a turning point in the academic lives of all students. As this academic performance is influenced by many factors, it is essential to develop a predictive data mining model for students' performance so as to identify the slow learners and study the influence of the dominant factors on their academic performance. In the present investigation, a survey-cum-experimental methodology was adopted to generate a database, which was constructed from a primary and a secondary source. While the primary data was collected from the regular students, the secondary data was gathered from the school and the office of the Chief Educational Officer (CEO). A total of 1000 datasets of the year 2006 from five different schools in three different districts of Tamilnadu were collected. The raw data was preprocessed in terms of filling up missing values, transforming values from one form into another and relevant attribute/variable selection. As a result, we had 772 student records, which were used for CHAID prediction model construction. A set of prediction rules was extracted from the CHAID prediction model and the efficiency of the generated CHAID prediction model was found. The accuracy of the present model was compared with other models and it has been found to be satisfactory.
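The core CHAID step is selecting, at each node, the predictor with the most significant chi-squared association with the outcome. A hedged sketch of that selection step follows; the attribute names and data are invented, and CHAID's category merging and stopping rules are omitted.

```python
import numpy as np
from scipy.stats import chi2_contingency

def best_chaid_split(X_cat, y, names):
    """Return the categorical predictor with the smallest chi-squared p-value."""
    best_name, best_p = None, 1.0
    for j, name in enumerate(names):
        cats = np.unique(X_cat[:, j])
        classes = np.unique(y)
        # Contingency table: predictor categories (rows) vs. outcome classes (cols)
        table = np.array([[np.sum((X_cat[:, j] == c) & (y == k))
                           for k in classes] for c in cats])
        _, p, _, _ = chi2_contingency(table)
        if p < best_p:
            best_name, best_p = name, p
    return best_name, best_p

rng = np.random.default_rng(1)
n = 400
study_hours = rng.integers(0, 2, n)   # strongly related to the outcome
school_type = rng.integers(0, 3, n)   # unrelated noise attribute
outcome = (study_hours == 1).astype(int)
outcome[rng.random(n) < 0.1] ^= 1     # 10% label noise
X = np.column_stack([study_hours, school_type])
name, p = best_chaid_split(X, outcome, ["study_hours", "school_type"])
print(name, p)
```

The informative attribute wins the split by a wide margin, which is exactly the behaviour a CHAID tree relies on when extracting prediction rules.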

  14. An ambient agent model for analyzing managers' performance during stress

    Science.gov (United States)

    ChePa, Noraziah; Aziz, Azizi Ab; Gratim, Haned

    2016-08-01

    Stress at work has been reported everywhere. Work-related performance during stress is a pattern of reactions that occurs when managers are presented with work demands that are not matched to their knowledge, skills, or abilities, and which challenge their ability to cope. Although there are many prior findings explaining the development of manager performance during stress, less attention has been given to explaining the same concept through computational models. As such, the descriptive nature of psychological theories about managers' performance during stress can be transformed into a causal-mechanistic stage that explains the relationship between a series of observed phenomena. This paper proposes an ambient agent model for analyzing managers' performance during stress. A set of properties and variables is identified from past literature to construct the model. Differential equations have been used to formalize the model, and the set of equations reflecting the relations involved in the proposed model is presented. The proposed model is essential and can be encapsulated within intelligent agents or robots that can be used to support managers during stress.
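As a rough illustration of how such a differential-equation formalization might look, the sketch below integrates two coupled rate equations with Euler steps. The equations, rates, and variable names are invented for the sketch and are not the authors' model.

```python
def simulate(demand, hours, dt=0.1, coping=0.6):
    """Toy dynamics: stress S builds under demand; performance P tracks 1 - S."""
    S, P = 0.0, 1.0
    for _ in range(int(hours / dt)):
        dS = (demand - coping * S) * dt        # stress accumulation vs. coping
        dP = ((1.0 - S) - P) * 0.5 * dt        # performance relaxes toward 1 - S
        S = min(max(S + dS, 0.0), 1.0)         # keep both in [0, 1]
        P = min(max(P + dP, 0.0), 1.0)
    return P

low = simulate(demand=0.1, hours=40)   # light workload
high = simulate(demand=0.9, hours=40)  # sustained heavy workload
print(round(low, 2), round(high, 2))
```

Even this toy version reproduces the qualitative pattern the paper formalizes: sustained high demand erodes performance relative to a low-demand baseline.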

  15. CORPORATE FORESIGHT AND PERFORMANCE: A CHAIN-OF-EFFECTS MODEL

    DEFF Research Database (Denmark)

    Jissink, Tymen; Huizingh, Eelko K.R.E.; Rohrbeck, René

    2015-01-01

    In this paper we develop and validate a measurement scale for corporate foresight and examine its impact on performance in a chain-of-effects model. We conceptualize corporate foresight as an organizational ability consisting of five distinct dimensions: information scope, method usage, people, formal organization, and culture. We investigate the relation of corporate foresight with three innovation performance dimensions – new product success, new product innovativeness, and financial performance. We use partial-least-squares structural equations modelling to assess our measurement models and test our research hypotheses. Using a cross-industry sample of 153 innovative firms, we find that corporate foresight can be validly and reliably measured by our measurement instrument. The results of the structural model support the hypothesized positive effects of corporate foresight on all...

  16. A multiserver multiqueue network: modeling and performance analysis

    Institute of Scientific and Technical Information of China (English)

    Zhiguang Shan; Yang Yang; et al.

    2002-01-01

    A new category of system model, the multiserver multiqueue network (MSMQN), is proposed for distributed systems such as geographically distributed web-server clusters. A MSMQN comprises multiple multiserver multiqueue (MSMQ) nodes distributed over the network, and every node consists of a number of servers that each contain multiple priority queues for waiting customers. An incoming request can be distributed to a waiting queue of any server in any node, according to the routing policy integrated by the node-selection policy at network level, the request-dispatching policy at node level, and the request-scheduling policy at server level. The model is investigated using stochastic high-level Petri net (SHLPN) modeling and performance analysis techniques. The performance metrics concerned include the delay time of requests in the MSMQ node and the response time perceived by the users. A numerical example shows the efficiency of the performance analysis technique.

  17. Aircraft Anomaly Detection Using Performance Models Trained on Fleet Data

    Science.gov (United States)

    Gorinevsky, Dimitry; Matthews, Bryan L.; Martin, Rodney

    2012-01-01

    This paper describes an application of data mining technology called Distributed Fleet Monitoring (DFM) to Flight Operational Quality Assurance (FOQA) data collected from a fleet of commercial aircraft. DFM transforms the data into aircraft performance models, flight-to-flight trends, and individual flight anomalies by fitting a multi-level regression model to the data. The model represents aircraft flight performance and takes into account fixed effects: flight-to-flight and vehicle-to-vehicle variability. The regression parameters include aerodynamic coefficients and other aircraft performance parameters that are usually identified by aircraft manufacturers in flight tests. Using DFM, the multi-terabyte FOQA data set with half a million flights was processed in a few hours. The anomalies found include wrong values of computed variables (e.g., aircraft weight), sensor failures and biases, and trends in flight actuators. These anomalies were missed by the existing airline monitoring of FOQA data exceedances.
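The fit-then-flag idea can be illustrated at toy scale: fit a fleet-wide regression of a performance variable, then flag flights whose residuals are outliers. The variable names, synthetic data, and 3-sigma threshold are assumptions of this sketch, not details taken from DFM.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
airspeed = rng.normal(230.0, 10.0, n)   # knots, synthetic fleet data
weight = rng.normal(60.0, 3.0, n)       # tonnes, synthetic fleet data
fuel_flow = 0.02 * airspeed + 0.05 * weight + rng.normal(0.0, 0.1, n)
fuel_flow[42] += 2.0                    # inject one anomalous flight

# Least-squares fit of the fleet-wide performance model
A = np.column_stack([airspeed, weight, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, fuel_flow, rcond=None)
residuals = fuel_flow - A @ coef
z = (residuals - residuals.mean()) / residuals.std()
anomalies = np.flatnonzero(np.abs(z) > 3.0)
print(anomalies)
```

The injected flight stands far outside the residual distribution and is flagged, while ordinary fleet variability stays below the threshold.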

  18. A Bibliometric Analysis and Review on Performance Modeling Literature

    Directory of Open Access Journals (Sweden)

    Barbara Livieri

    2015-04-01

    Full Text Available In management practice, performance indicators are considered a prerequisite to making informed decisions in line with the organization’s goals. On the other hand, indicators summarize compound phenomena in a few digits, which can lead to inadequate decisions, biased by information loss and conflicting values. Model-driven approaches in enterprise engineering can be very effective for avoiding these pitfalls, or keeping them under control. For that reason, “performance modeling” has the numbers to play a primary role in the “model-driven enterprise” scenario, together with process, information and other enterprise-related aspects. In this perspective, we propose a systematic review of the literature on performance modeling in order to retrieve, classify, and summarize existing research, identify the core authors and define areas and opportunities for future research.

  19. Testing a Model of Work Performance in an Academic Environment

    Directory of Open Access Journals (Sweden)

    B. Charles Tatum

    2012-04-01

    Full Text Available In modern society, people both work and study. The intersection between organizational and educational research suggests that a common model should apply to both academic and job performance. The purpose of this study was to apply a model of work and job performance (based on general expectancy theory) to a classroom setting, and test the predicted relationships using a causal/path model methodology. The findings revealed that motivation and ability predicted student expectations and self-efficacy, and that expectations and efficacy predicted class performance. Limitations, implications, and future research directions are discussed. This study showed how research in industrial and organizational psychology is relevant to education. It was concluded that greater effort should be made to integrate knowledge across a wider set of domains.

  20. Performance modeling of a feature-aided tracker

    Science.gov (United States)

    Goley, G. Steven; Nolan, Adam R.

    2012-06-01

    In order to provide actionable intelligence in a layered sensing paradigm, exploitation algorithms should produce a confidence estimate in addition to the inference variable. This article presents a methodology and results of one such algorithm for feature-aided tracking of vehicles in wide area motion imagery. To perform experiments, a synthetic environment was developed, which provided explicit knowledge of ground truth, tracker prediction accuracy, and control of operating conditions. This synthetic environment leveraged physics-based modeling simulations to re-create traffic flow, vehicle reflectance, obscuration, and shadowing. With the ability to control operating conditions as well as the availability of ground truth, several experiments were conducted to test both the tracker and expected performance. The results show that the performance model produces a meaningful estimate of the tracker performance over the subset of operating conditions.

  1. On Performance Modeling of Ad Hoc Routing Protocols

    Directory of Open Access Journals (Sweden)

    Khayam SyedAli

    2010-01-01

    Full Text Available Simulation studies have been the predominant method of evaluating ad hoc routing algorithms. Despite their wide use and merits, simulations are generally time consuming. Furthermore, several prominent ad hoc simulations report inconsistent and unrepeatable results. We, therefore, argue that simulation-based evaluation of ad hoc routing protocols should be complemented with mathematical verification and comparison. In this paper, we propose a performance evaluation framework that can be used to model two key performance metrics of an ad hoc routing algorithm, namely, routing overhead and route optimality. We also evaluate derivatives of the two metrics, namely, total energy consumption and route discovery latency. Using the proposed framework, we evaluate the performance of four prominent ad hoc routing algorithms: DSDV, DSR, AODV-LL, and Gossiping. We show that the modeled metrics not only allow unbiased performance comparison but also provide interesting insight about the impact of different parameters on the behavior of these protocols.

  2. Performance on the Luria-Nebraska Neuropsychological Test Battery-Children's Revision: A Comparison of Children with and without Significant WISC-R VIQ-PIQ Discrepancies.

    Science.gov (United States)

    Gilger, J. W.; Geary, D. C.

    1985-01-01

    Compared the performance of 56 children on the 11 subscales of the Luria-Nebraska Neuropsychological Battery-Children's Revision. Results revealed significant differences on Receptive Speech and Expressive Language subscales, suggesting a possible differential sensitivity of the children's Luria-Nebraska to verbal and nonverbal cognitive deficits.…

  4. Bounding SAR ATR performance based on model similarity

    Science.gov (United States)

    Boshra, Michael; Bhanu, Bir

    1999-08-01

    Similarity between model targets plays a fundamental role in determining the performance of target recognition. We analyze the effect of model similarity on the performance of a vote-based approach for target recognition from SAR images. In such an approach, each model target is represented by a set of SAR views sampled at a variety of azimuth angles and a specific depression angle. Both model and data views are represented by locations of scattering centers, which are peak features. The model hypothesis (view of a specific target and associated location) corresponding to a given data view is chosen to be the one with the highest number of data-supported model features (votes). We address three issues in this paper. Firstly, we present a quantitative measure of the similarity between a pair of model views. Such a measure depends on the degree of structural overlap between the two views, and the amount of uncertainty. Secondly, we describe a similarity-based framework for predicting an upper bound on recognition performance in the presence of uncertainty, occlusion and clutter. Thirdly, we validate the proposed framework using MSTAR public data, which are obtained under different depression angles, configurations and articulations.

  5. Performance verification tests of JT-60SA CS model coil

    Energy Technology Data Exchange (ETDEWEB)

    Obana, Tetsuhiro, E-mail: obana.tetsuhiro@LHD.nifs.ac.jp [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Murakami, Haruyuki [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan); Takahata, Kazuya; Hamaguchi, Shinji; Chikaraishi, Hirotaka; Mito, Toshiyuki; Imagawa, Shinsaku [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Kizu, Kaname; Natsume, Kyohei; Yoshida, Kiyoshi [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan)

    2015-11-15

    Highlights: • The performance of the JT-60SA CS model coil was verified. • The CS model coil comprised a quad-pancake wound with a Nb{sub 3}Sn CIC conductor. • The CS model coil met the design requirements. - Abstract: As a final check of the coil manufacturing method of the JT-60 Super Advanced (JT-60SA) central solenoid (CS), we verified the performance of a CS model coil. The model coil comprised a quad-pancake wound with a Nb{sub 3}Sn cable-in-conduit conductor. Measurements of the critical current, joint resistance, pressure drop, and magnetic field were conducted in the verification tests. In the critical-current measurement, the critical current of the model coil coincided with the estimation derived from a strain of −0.62% for the Nb{sub 3}Sn strands. As a result, critical-current degradation caused by the coil manufacturing process was not observed. The results of the performance verification tests indicate that the model coil met the design requirements. Consequently, the manufacturing process of the JT-60SA CS was established.

  6. Thermal performance modeling of NASA's scientific balloons

    Science.gov (United States)

    Franco, H.; Cathey, H.

    The flight performance of a scientific balloon is highly dependent on the interaction between the balloon and its environment; the balloon is a thermal vehicle. Modeling a scientific balloon's thermal performance has proven to be a difficult analytical task. Most previous thermal models have attempted these analyses using either a bulk thermal model approach or simplified representations of the balloon. These approaches have to date provided reasonable, but not very accurate, results. Improvements have been made in recent years using thermal analysis tools developed for the thermal modeling of spacecraft and other sophisticated heat transfer problems. These tools, which now allow for accurate modeling of highly transmissive materials, have been applied to the thermal analysis of NASA's scientific balloons. A research effort has been started that utilizes the "Thermal Desktop" addition to AutoCAD. This paper will discuss the development of thermal models for both conventional and Ultra Long Duration super-pressure balloons. This research effort has focused on incremental analysis stages of development to assess the accuracy of the tool and the model resolution required to produce usable data. The first-stage balloon thermal analyses started with simple spherical balloon models with a limited number of nodes, and expanded the number of nodes to determine the required model resolution. These models were then modified to include additional details such as load tapes. The second-stage analyses looked at natural-shaped zero-pressure balloons. Load tapes were then added to these shapes, again with the goal of determining the required modeling accuracy by varying the number of gores. The third stage, following the same steps as the zero-pressure balloon efforts, was directed at modeling super-pressure pumpkin-shaped balloons. The results were then used to develop analysis guidelines and an approach for modeling balloons for both simple first-order estimates and detailed

  7. Comparison of Predictive Models for PV Module Performance (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Marion, B.

    2008-05-01

    This paper examines three models used to estimate the maximum power (P{sub m}) of PV modules when the irradiance and PV cell temperature are known: (1) the power temperature coefficient model, (2) the PVFORM model, and (3) the bilinear interpolation model. A variation of the power temperature coefficient model is also presented that improved model accuracy. For modeling values of P{sub m}, an 'effective' plane-of-array (POA) irradiance (E{sub e}) and the PV cell temperature (T) are used as model inputs. Using E{sub e} essentially removes the effects of variations in solar spectrum and reflectance losses, and permits the influence of irradiance and temperature on model performance for P{sub m} to be more easily studied. Eq. 1 is used to determine E{sub e} from T and the PV module's measured short-circuit current (I{sub sc}). Zero subscripts denote performance at Standard Reporting Conditions (SRC).
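A power temperature coefficient model of the general form discussed here scales rated power by effective irradiance and derates it linearly with cell temperature. The sketch below uses illustrative placeholder values for the rated power and temperature coefficient, not values from the paper.

```python
def p_max(e_e, t_cell, p_m0=220.0, gamma=-0.0045, e0=1000.0, t0=25.0):
    """Estimated maximum power P_m (W) from effective plane-of-array
    irradiance e_e (W/m^2) and PV cell temperature t_cell (deg C).
    p_m0, e0, t0 are the ratings at Standard Reporting Conditions;
    gamma is the power temperature coefficient (1/deg C)."""
    return p_m0 * (e_e / e0) * (1.0 + gamma * (t_cell - t0))

print(p_max(1000.0, 25.0))  # at SRC the model returns the rated power
print(p_max(800.0, 50.0))   # lower irradiance and a hot cell derate the output
```

Because the irradiance term uses the "effective" irradiance described in the abstract, spectral and reflectance effects are already folded into the input rather than modeled separately.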

  8. Generalized Models: An Application to Identify Environmental Variables That Significantly Affect the Abundance of Three Tree Species

    Directory of Open Access Journals (Sweden)

    Pablo Antúnez

    2017-02-01

    Full Text Available In defining the environmental preferences of plant species, statistical models are among the essential tools of modern ecology. However, conventional linear models require compliance with certain parametric assumptions, and if these requirements are not met, this implies a serious limitation of the applied model. In this study, the effectiveness of linear and nonlinear generalized models was examined to identify the unitary effect of the principal environmental variables on the abundance of three tree species growing in the natural temperate forests of Oaxaca, Mexico. The covariates that showed a significant effect on the distribution of the tree species were the maximum and minimum temperatures and the precipitation during specific periods. The results suggest that the generalized models, particularly smoothed models, were able to detect increases or decreases in abundance against changes in an environmental variable; they also revealed the inflection of the regression. In addition, these models allow partial characterization of the realized niche of a given species according to specific variables, regardless of the type of relationship.

  9. Developing a model of forecasting information systems performance

    Directory of Open Access Journals (Sweden)

    G. N. Isaev

    2017-01-01

    Full Text Available Research aim: to develop a model to forecast the performance of information systems, as a mechanism for preliminary assessment of information system effectiveness before financing of the information system project begins. Materials and methods: the starting material was the results of studying the parameters of the statistical structure of information system data processing defects. Methods of cluster analysis and regression analysis were applied. Results: in order to reduce financial risks, information system customers try to make decisions on the basis of preliminary calculations of the effectiveness of future information systems. However, the assumptions of a techno-economic justification of the project can only be obtained when funding for design work is already open. An evaluation can be made before project development starts using a model forecasting information system performance. The model is developed using regression analysis in the form of a multiple linear regression: the information system performance is the predicted variable in the regression equation, and the values of data processing defects in the classes of accuracy, completeness and timeliness are the predictor variables. Measurement and evaluation of the parameters of the statistical structure of defects were done through programmes of cluster analysis and regression analysis. Calculations determining the actual and forecast values of information system performance were conducted. Conclusion: to implement the model, a study of information systems was carried out, together with the development of a forecasting model of information system performance. The experimental work showed the adequacy of the model. The model is implemented in the complex task of designing information systems in education and industry.
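The regression setup described can be sketched as follows: performance is regressed on defect rates in the three quality classes, and the fitted equation is then used to forecast performance for a hypothetical system. The synthetic defect rates and coefficients below are placeholders, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 60
# Synthetic defect rates per class: accuracy, completeness, timeliness
defects = rng.uniform(0.0, 0.2, size=(n, 3))
# Assumed "true" relation: performance drops as defect rates rise
performance = 1.0 - defects @ np.array([1.5, 1.0, 0.8]) + rng.normal(0.0, 0.01, n)

# Fit the multiple linear regression (with intercept) by least squares
A = np.column_stack([defects, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, performance, rcond=None)

# Forecast performance for a proposed system with given defect rates
forecast = float(np.array([0.05, 0.02, 0.10, 1.0]) @ coef)
print(coef.round(2), round(forecast, 3))
```

The fitted coefficients recover the assumed defect penalties, which is the property that makes such a model usable before project funding is committed.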

  10. A Real-Time Performance Analysis Model for Cryptographic Protocols

    Directory of Open Access Journals (Sweden)

    Amos Olagunju

    2012-12-01

    Full Text Available Several encryption algorithms exist today for securing data in storage and transmission over network systems. The choice of encryption algorithm must weigh performance requirements against the need to protect sensitive data. This research investigated the processing times of alternative encryption algorithms under specific conditions. The paper presents the architecture of a model multiplatform tool for the evaluation of candidate encryption algorithms based on different data and key sizes. The model software was used to appraise the real-time performance of the DES, AES, 3DES, MD5, SHA1, and SHA2 algorithms.
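The timing-comparison idea can be reproduced in miniature with Python's standard library, here using the hash algorithms from the paper's list (the ciphers would need a third-party library). The payload size and repetition counts are arbitrary choices, and absolute times are machine-dependent; only the relative comparison is meaningful.

```python
import hashlib
import timeit

payload = b"x" * 1_000_000  # 1 MB test message

results = {}
for name in ("md5", "sha1", "sha256"):
    algo = getattr(hashlib, name)
    # Best of 3 runs of 20 digests each, to reduce scheduling noise
    results[name] = min(timeit.repeat(lambda algo=algo: algo(payload).digest(),
                                      number=20, repeat=3))

for name, t in sorted(results.items(), key=lambda kv: kv[1]):
    print(f"{name}: {t:.4f} s for 20 x 1 MB")
```

Taking the minimum over repeats is the conventional way to estimate the algorithm's intrinsic cost, since background load only ever inflates a timing.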

  11. CFD modelling of hydrogen stratification in enclosures: Model validation and application to PAR performance

    Energy Technology Data Exchange (ETDEWEB)

    Hoyes, J.R., E-mail: james.hoyes@hsl.gsi.gov.uk; Ivings, M.J.

    2016-12-15

    Highlights: • The ability of CFD to predict hydrogen stratification phenomena is investigated. • Contrary to expectation, simulations on tetrahedral meshes under-predict mixing. • Simulations on structured meshes give good agreement with experimental data. • CFD model used to investigate the effects of stratification on PAR performance. • Results show stratification can have a significant effect on PAR performance. - Abstract: Computational Fluid Dynamics (CFD) models are maturing into useful tools for supporting safety analyses. This paper investigates the capabilities of CFD models for predicting hydrogen stratification in a containment vessel using data from the NEA/OECD SETH2 MISTRA experiments. Further simulations are then carried out to illustrate the qualitative effects of hydrogen stratification on the performance of Passive Autocatalytic Recombiner (PAR) units. The MISTRA experiments have well-defined initial and boundary conditions which makes them well suited for use in a validation study. Results are presented for the sensitivity to mesh resolution and mesh type. Whilst the predictions are shown to be largely insensitive to the mesh resolution they are surprisingly sensitive to the mesh type. In particular, tetrahedral meshes are found to induce small unphysical convection currents that result in molecular diffusion and turbulent mixing being under-predicted. This behaviour is not unique to the CFD model used here (ANSYS CFX) and furthermore, it may affect simulations run on other non-aligned meshes (meshes that are not aligned perpendicular to gravity), including non-aligned structured meshes. Following existing best practice guidelines can help to identify potential unphysical predictions, but as an additional precaution consideration should be given to using gravity-aligned meshes for modelling stratified flows. CFD simulations of hydrogen recombination in the Becker Technologies THAI facility are presented with high and low PAR positions

  12. Model of service-oriented catering supply chain performance evaluation

    Directory of Open Access Journals (Sweden)

    Juanqiong Gou

    2013-03-01

    Full Text Available Purpose: The aim of this paper is to construct a performance evaluation model for the service-oriented catering supply chain. Design/methodology/approach: Based on research into the current situation of the catering industry, this paper summarizes the characteristics of the catering supply chain and then presents a service-oriented catering supply chain model based on a logistics and information platform. Finally, the fuzzy AHP method is used to evaluate the performance of the service-oriented catering supply chain. Findings: From the analysis of the characteristics of the catering supply chain, we construct the performance evaluation model in order to guarantee food safety, logistics efficiency, price stability and so on. Practical implications: In order to evolve an efficient and effective service supply chain, the model can be used not only for an enterprise's own improvement but also for selecting different customers and choosing a different model of development. Originality/value: This paper gives a new definition of the service-oriented catering supply chain and offers a model to evaluate its performance.
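Fuzzy AHP builds on the classical AHP weighting step. The sketch below shows that underlying step, deriving criterion weights from a pairwise-comparison matrix by the geometric-mean method; the criteria and judgments are invented for the example, and the fuzzification layer is omitted.

```python
import numpy as np

criteria = ["food safety", "logistics efficiency", "price stability"]
# Saaty-scale pairwise judgments: entry [i, j] = importance of criterion i over j
pairwise = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Geometric mean of each row, normalized to sum to 1, gives the weights
gm = pairwise.prod(axis=1) ** (1.0 / len(criteria))
weights = gm / gm.sum()
for c, w in zip(criteria, weights):
    print(f"{c}: {w:.3f}")
```

In a fuzzy AHP the crisp judgments above are replaced by triangular fuzzy numbers and the weights are defuzzified at the end, but the ranking logic is the same.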

  13. Performance based Ranking Model for Cloud SaaS Services

    Directory of Open Access Journals (Sweden)

    Sahar Abdalla Elmubarak

    2017-01-01

    Full Text Available Cloud computing systems provide virtualized resources that can be provisioned on demand. An enormous number of cloud providers offer a diverse range of services. The performance of these services is a critical factor for clients in determining which cloud provider they will choose. However, identifying a provider with efficient and effective services is a challenging task, so there is a need for an efficient model that helps clients select the best provider based on performance attributes and measurements. Cloud service ranking is a standard method for this task: the process of arranging and classifying several cloud services within the cloud, then computing their relative ranking values based on the quality of service required by clients and the features of the cloud services. The objective of this study is to propose an enhanced performance-based ranking model to help users choose the best service for their needs. The proposed model combines attributes and measurements from the cloud computing field with those of the well-defined and established software engineering field. The SMICloud Toolkit has been used to test the applicability of the proposed model. The experimentation results of the proposed model were promising.

  14. Neural Network Based Model for Predicting Housing Market Performance

    Institute of Scientific and Technical Information of China (English)

    Ahmed Khalafallah

    2008-01-01

    The United States real estate market is currently facing its worst hit in two decades due to the slowdown of housing sales. The most affected by this decline are real estate investors and home developers who are currently struggling to break even financially on their investments. For these investors, it is of utmost importance to evaluate the current status of the market and predict its performance over the short term in order to make appropriate financial decisions. This paper presents the development of artificial neural network based models to support real estate investors and home developers in this critical task. The paper describes the decision variables, design methodology, and the implementation of these models. The models utilize historical market performance data sets to train the artificial neural networks in order to predict unforeseen future performances. An application example is analyzed to demonstrate the model capabilities in analyzing and predicting the market performance. The model testing and validation showed that the error in prediction is in the range between -2% and +2%.

  15. New performance evaluation models for character detection in images

    Science.gov (United States)

    Wang, YanWei; Ding, XiaoQing; Liu, ChangSong; Wang, Kongqiao

    2010-02-01

    Detection of character regions is meaningful research for both highlighting regions of interest and recognition for further information processing. Much research has been performed on character localization and extraction, which creates a great need for performance evaluation schemes to assess detection algorithms. In this paper, two probability models are established to accomplish evaluation tasks for different applications. For highlighting regions of interest, a Gaussian probability model, which simulates the low-pass Gaussian filtering property of the human vision system (HVS), is constructed to allocate different weights to different character parts. It shows great potential for describing detector performance, especially when the detected result is an incomplete character, where other methods cannot work effectively. For the recognition objective, we also introduce a weighted probability model to give an appropriate description of the contribution of detection results to final recognition results. The validity of the proposed performance evaluation models is demonstrated by experiments on web images and natural scene images. These models may also be applicable to evaluating algorithms for locating other objects, such as faces, though wider experiments are needed to examine this assumption.
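A Gaussian weighting of this kind can be sketched as below. The weight map, sigma choice and score definition are assumptions for illustration, not the paper's exact model; the point is that a detection missing only a character's fringe scores higher than its plain area overlap would suggest:

```python
import math

# Sketch of a Gaussian-weighted match score (assumed form): pixels near the
# character centre get higher weight, so a detection that misses only the
# character's fringe still scores highly.

def gaussian_weights(w, h, sigma_frac=0.35):
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    sx, sy = sigma_frac * w, sigma_frac * h
    return [[math.exp(-((x - cx) ** 2 / (2 * sx ** 2) +
                        (y - cy) ** 2 / (2 * sy ** 2)))
             for x in range(w)] for y in range(h)]

def detection_score(gt_box, det_box):
    """gt_box/det_box: (x0, y0, x1, y1), half-open pixel ranges."""
    x0, y0, x1, y1 = gt_box
    wmap = gaussian_weights(x1 - x0, y1 - y0)
    total = sum(sum(row) for row in wmap)
    covered = 0.0
    for y in range(y0, y1):
        for x in range(x0, x1):
            if det_box[0] <= x < det_box[2] and det_box[1] <= y < det_box[3]:
                covered += wmap[y - y0][x - x0]
    return covered / total

full = detection_score((0, 0, 20, 30), (0, 0, 20, 30))
partial = detection_score((0, 0, 20, 30), (2, 3, 18, 27))   # misses the fringe
```

For the toy boxes above, the partial detection covers only 64% of the ground-truth area yet scores noticeably higher, because the missed border pixels carry low weight.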

  16. Model-based approach for elevator performance estimation

    Science.gov (United States)

    Esteban, E.; Salgado, O.; Iturrospe, A.; Isasa, I.

    2016-02-01

    In this paper, a dynamic model for an elevator installation is presented in the state space domain. The model comprises both the mechanical and the electrical subsystems, including the electrical machine and a closed-loop field-oriented control. The proposed model is employed for monitoring the condition of the elevator installation. The adopted model-based monitoring approach employs the Kalman filter as an observer. The Kalman observer estimates the elevator car acceleration, which determines the elevator ride quality, based solely on the machine control signature and the encoder signal. Five elevator key performance indicators are then calculated from the estimated car acceleration. The proposed procedure is evaluated experimentally by comparing the key performance indicators calculated from the estimated car acceleration with the values obtained from actual acceleration measurements in a test bench. Finally, the proposed procedure is compared with a sliding mode observer.
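A generic sketch of the observer idea, reduced to a textbook constant-acceleration Kalman filter fed by a noisy position signal (standing in for the encoder). The paper's observer is built on the full electromechanical elevator model; all numbers here are assumptions:

```python
import random

random.seed(1)

# 1-D Kalman observer sketch: estimate car acceleration from a noisy
# encoder-like position signal using a constant-acceleration state model.

dt = 0.01
F = [[1.0, dt, 0.5 * dt * dt],    # state transition for [pos, vel, acc]
     [0.0, 1.0, dt],
     [0.0, 0.0, 1.0]]
Q = [[0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 1e-4]]  # process noise on acc
R = 1e-6                           # encoder noise variance (std 1 mm)

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def transpose(A):
    return [list(col) for col in zip(*A)]

x = [[0.0], [0.0], [0.0]]          # state estimate: position, velocity, acceleration
P = [[1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]

true_acc = 2.0
pos, vel = 0.0, 0.0
for _ in range(500):
    # simulate the true car and a noisy encoder reading
    pos += vel * dt + 0.5 * true_acc * dt * dt
    vel += true_acc * dt
    z = pos + random.gauss(0.0, 1e-3)

    # predict
    x = mat_mul(F, x)
    P = mat_add(mat_mul(mat_mul(F, P), transpose(F)), Q)

    # update with the scalar position measurement (H = [1, 0, 0])
    S = P[0][0] + R
    K = [P[0][0] / S, P[1][0] / S, P[2][0] / S]
    innov = z - x[0][0]
    for i in range(3):
        x[i][0] += K[i] * innov
    row0 = P[0][:]                 # P = (I - K H) P, with H selecting row 0
    for i in range(3):
        for j in range(3):
            P[i][j] -= K[i] * row0[j]

estimated_acc = x[2][0]
```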

  17. Model for performance prediction in multi-axis machining

    CERN Document Server

    Lavernhe, Sylvain; Lartigue, Claire; 10.1007/s00170-007-1001-4

    2009-01-01

    This paper deals with a predictive model of kinematical performance in 5-axis milling within the context of High Speed Machining. Indeed, 5-axis high speed milling makes it possible to improve quality and productivity thanks to the degrees of freedom brought by the tool axis orientation. The tool axis orientation can be set efficiently in terms of productivity by considering the kinematical constraints resulting from the machine-tool/NC unit combination. The capacities of each axis, as well as some NC unit functions, can be expressed as limiting constraints. The proposed model relies on each axis displacement in the joint space of the machine-tool and predicts the most limiting axis for each trajectory segment. Thus, the tool feedrate can be calculated, highlighting zones for which the programmed feedrate is not reached. This constitutes an indicator for trajectory optimization. The efficiency of the model is illustrated through examples. Finally, the model could be used for optimizing process planning.
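The limiting-axis logic can be sketched as follows, with hypothetical axis velocity limits and joint displacements; the paper's model additionally accounts for acceleration, jerk and NC-unit functions:

```python
# Limiting-axis feedrate sketch (hypothetical limits): for one trajectory
# segment of tool-path length ds, each axis can sustain at most
# v_axis_max * ds / |dq|; the slowest axis caps the achievable feedrate.

axis_vmax = {"X": 0.5, "Y": 0.5, "Z": 0.4, "A": 0.3, "C": 0.6}  # m/s or rad/s

def achievable_feedrate(ds, dq):
    """ds: tool-path length of the segment (m); dq: joint displacement per axis."""
    limits = {}
    for axis, dqi in dq.items():
        if abs(dqi) > 1e-12:
            limits[axis] = axis_vmax[axis] * ds / abs(dqi)
    limiting_axis = min(limits, key=limits.get)
    return limits[limiting_axis], limiting_axis

programmed = 0.25  # programmed feedrate, m/s
feed, axis = achievable_feedrate(0.002, {"X": 0.0015, "Y": 0.0004, "Z": 0.0,
                                         "A": 0.0030, "C": 0.0010})
reached = min(programmed, feed)
```

Here the rotary A axis is the limiting one, so the programmed feedrate is not reached on this segment, which is exactly the kind of zone the model flags for optimization.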

  18. PORFLOW Modeling Supporting The H-Tank Farm Performance Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, J. M.; Flach, G. P.; Westbrook, M. L.

    2012-08-31

    Numerical simulations of groundwater flow and contaminant transport in the vadose and saturated zones have been conducted using the PORFLOW code in support of an overall Performance Assessment (PA) of the H-Tank Farm. This report provides technical detail on selected aspects of PORFLOW model development and describes the structure of the associated electronic files. The PORFLOW models for the H-Tank Farm PA, Rev. 1 were updated with grout, solubility, and inventory changes. The aquifer model was refined. In addition, a set of flow sensitivity runs were performed to allow flow to be varied in the related probabilistic GoldSim models. The final PORFLOW concentration values are used as input into a GoldSim dose calculator.

  19. Procedure for assessing the performance of a rockfall fragmentation model

    Science.gov (United States)

    Matas, Gerard; Lantada, Nieves; Corominas, Jordi; Gili, Josep Antoni; Ruiz-Carulla, Roger; Prades, Albert

    2017-04-01

    Rockfall is a mass instability process frequently observed in road cuts, open-pit mines and quarries, steep slopes and cliffs. The detached rock mass frequently becomes fragmented when it impacts the slope surface. Considering the fragmentation of the rockfall mass is critical for calculating the blocks' trajectories and impact energies, and further for assessing their potential to cause damage and designing adequate preventive structures. We present here the performance of the RockGIS model, a GIS-based tool that stochastically simulates the fragmentation of rockfalls based on a lumped-mass approach. In RockGIS, fragmentation initiates with the disaggregation of the detached rock mass through the pre-existing discontinuities just before impact with the ground. An energy threshold is defined in order to determine whether the impacting blocks break or not. The distribution of the initial mass among a set of newly generated rock fragments is carried out stochastically following a power law, and the trajectories of the new rock fragments are distributed within a cone. The model requires calibration of both the runout of the resultant blocks and the spatial distribution of the volumes of fragments generated by breakage during their propagation. As this is a coupled process controlled by several parameters, a set of performance criteria to be met by the simulation has been defined. The criteria include: position of the centre of gravity of the whole block distribution, histogram of block runout, extent and boundaries of the young debris cover over the slope surface, lateral dispersion of trajectories, total number of blocks generated after fragmentation, volume distribution of the generated fragments, the number of blocks and volume passages past a reference line, and the maximum runout distance. Since the number of parameters to fit increases significantly when considering fragmentation, the

  20. Software life cycle dynamic simulation model: The organizational performance submodel

    Science.gov (United States)

    Tausworthe, Robert C.

    1985-01-01

    The submodel structure of a software life cycle dynamic simulation model is described. The software process is divided into seven phases, each with product, staff, and funding flows. The model is subdivided into an organizational response submodel, a management submodel, a management influence interface, and a model analyst interface. The concentration here is on the organizational response model, which simulates the performance characteristics of a software development subject to external and internal influences. These influences emanate from two sources: the model analyst interface, which configures the model to simulate the response of an implementing organization subject to its own internal influences, and the management submodel that exerts external dynamic control over the production process. A complete characterization is given of the organizational response submodel in the form of parameterized differential equations governing product, staffing, and funding levels. The parameter values and functions are allocated to the two interfaces.

  1. Rethinking board role performance: Towards an integrative model

    Directory of Open Access Journals (Sweden)

    Babić Verica M.

    2011-01-01

    Full Text Available This research focuses on the board role evolution analysis which took place simultaneously with the development of different corporate governance theories and perspectives. The purpose of this paper is to provide understanding of key factors that make a board effective in the performance of its role. We argue that analysis of board role performance should incorporate both structural and process variables. This paper’s contribution is the development of an integrative model that aims to establish the relationship between the board structure and processes on the one hand, and board role performance on the other.

  2. Performance analysis of IP QoS provision model

    Institute of Scientific and Technical Information of China (English)

    SUN Danning; Moonsik Kang

    2006-01-01

    The performance of a heterogeneous IP QoS provision service model was analyzed. This model utilized the RSVP technique to set up a dynamic resource reservation interface between the user and the network, while the DiffServ technique was utilized to transmit class-based packets with different per-hop behaviors. Furthermore, corresponding queue management and packet scheduling mechanisms were presented for end-to-end QoS guarantees and appropriate cooperation of network elements.

  3. Does better rainfall interpolation improve hydrological model performance?

    Science.gov (United States)

    Bàrdossy, Andràs; Kilsby, Chris; Lewis, Elisabeth

    2017-04-01

    High spatial variability of precipitation is one of the main sources of uncertainty in rainfall/runoff modelling. Spatially distributed models require detailed space-time information on precipitation as input. In the past decades, much effort has been spent on improving precipitation interpolation from point observations. Different geostatistical methods such as Ordinary Kriging, External Drift Kriging or Copula-based interpolation can be used to find the best estimators for unsampled locations. The purpose of this work is to investigate to what extent more sophisticated precipitation estimation methods can improve model performance. For this purpose the Wye catchment in Wales was selected. The physically based, spatially distributed hydrological model SHETRAN is used to describe the hydrological processes in the catchment. 31 rain gauges with 1-hourly temporal resolution are available for a time period of 6 years. In order to avoid the effect of model uncertainty, model parameters were not altered in this study. Instead, 100 random subsets of 14 stations each were selected. For each configuration, precipitation was interpolated for each time step using nearest neighbour (NN), inverse distance (ID) and Ordinary Kriging (OK). The variogram was obtained using the temporal correlation of the time series measured at different locations. The interpolated data were used as input for the spatially distributed model. Performance was evaluated for daily mean discharges using the Nash-Sutcliffe coefficient, temporal correlations, flow volumes and flow duration curves. The results show that the simplest method (NN) and the sophisticated OK perform practically equally well, while ID performed worse. NN was often better for high flows. The reason for this is that NN does not reduce the variance, while OK and ID yield smooth precipitation fields. 
The study points out the importance of precipitation variability and suggests the use of conditional spatial simulation as
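As a sketch of the two simplest interpolators compared above, together with the Nash-Sutcliffe efficiency used for evaluation (toy gauge data with assumed coordinates and values):

```python
import math

# Nearest-neighbour and inverse-distance interpolation from point gauges,
# plus the Nash-Sutcliffe efficiency (toy data, hypothetical gauges).

gauges = [((0.0, 0.0), 2.0), ((1.0, 0.0), 4.0), ((0.0, 1.0), 6.0)]

def nearest_neighbour(pt):
    return min(gauges, key=lambda g: math.dist(pt, g[0]))[1]

def inverse_distance(pt, power=2.0):
    num = den = 0.0
    for loc, val in gauges:
        d = math.dist(pt, loc)
        if d < 1e-12:
            return val                      # exactly at a gauge
        w = d ** -power
        num += w * val
        den += w
    return num / den

def nash_sutcliffe(obs, sim):
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

nn_val = nearest_neighbour((0.2, 0.1))
idw_val = inverse_distance((0.2, 0.1))
nse_perfect = nash_sutcliffe([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
```

Note how NN returns an observed value unchanged (preserving variance) while ID averages the gauges into a smoother field, which is the behaviour the study identifies as the reason NN can beat ID for high flows.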

  4. Kinetic models in industrial biotechnology - Improving cell factory performance.

    Science.gov (United States)

    Almquist, Joachim; Cvijovic, Marija; Hatzimanikatis, Vassily; Nielsen, Jens; Jirstrand, Mats

    2014-07-01

    An increasing number of industrial bioprocesses capitalize on living cells by using them as cell factories that convert sugars into chemicals. These processes range from the production of bulk chemicals in yeasts and bacteria to the synthesis of therapeutic proteins in mammalian cell lines. One of the tools in the continuous search for improved performance of such production systems is the development and application of mathematical models. To be of value for industrial biotechnology, mathematical models should be able to assist in the rational design of cell factory properties or of the production processes in which they are utilized. Kinetic models are particularly suitable towards this end because they are capable of representing the complex biochemistry of cells more completely than most other types of models. They can, at least in principle, be used to understand, predict, and evaluate in detail the effects of adding, removing, or modifying molecular components of a cell factory, and to support the design of the bioreactor or fermentation process. However, several challenges remain before kinetic modeling reaches the degree of maturity required for routine application in industry. Here we review the current status of kinetic cell factory modeling. Emphasis is on modeling methodology concepts, including model network structure, kinetic rate expressions, parameter estimation, optimization methods, identifiability analysis, model reduction, and model validation, but several applications of kinetic models for the improvement of cell factories are also discussed.
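As a minimal example of the kinetic rate expressions such models are built from, the sketch below integrates a single Michaelis-Menten conversion S → P with explicit Euler; the parameter values are hypothetical:

```python
# One-reaction kinetic model sketch (illustrative, not from the review):
# Michaelis-Menten conversion S -> P, integrated with explicit Euler.

vmax, km = 1.0, 0.5        # hypothetical kinetic parameters
s, p = 10.0, 0.0           # initial substrate / product concentrations
dt = 0.01
for _ in range(5000):      # simulate 50 time units
    rate = vmax * s / (km + s)   # Michaelis-Menten rate expression
    s -= rate * dt
    p += rate * dt
    s = max(s, 0.0)        # guard against tiny negative round-off
```

Real cell-factory models couple hundreds of such rate expressions into an ODE system, which is where the parameter estimation and identifiability issues discussed in the review arise.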

  5. Model for determining and optimizing delivery performance in industrial systems

    Directory of Open Access Journals (Sweden)

    Fechete Flavia

    2017-01-01

    Full Text Available Performance means achieving organizational objectives regardless of their nature and variety, and even exceeding them. Improving performance is one of the major goals of any company. Achieving global performance means not only obtaining economic performance; other functions must also be taken into account, such as quality, delivery, costs and even employee satisfaction. This paper aims to improve the delivery performance of an industrial system, given its very low current results. The delivery performance took into account all categories of performance indicators, such as on-time delivery, backlog efficiency and transport efficiency. The research focused on optimizing the delivery performance of the industrial system using linear programming. Modelling the delivery function with linear programming yielded the precise quantities to be produced and delivered each month by the industrial system in order to minimize transport cost, satisfy customers' orders and control stock. The optimization led to a substantial improvement in all four performance indicators that concern deliveries.
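A toy version of the delivery-optimization formulation (hypothetical orders, costs and stock limit): the paper solves a linear program, but on a small discrete grid the same objective and constraints can be searched exhaustively:

```python
from itertools import product

# Choose monthly delivered quantities to minimize transport cost while
# meeting customer orders and a stock limit (hypothetical numbers; the
# paper uses a proper LP solver instead of this brute-force search).

orders = [30, 50, 40]          # units ordered per month
unit_cost = [2.0, 3.0, 2.5]    # transport cost per unit in each month
max_stock = 20                 # units that may be carried over

best = None
for plan in product(range(0, 101, 10), repeat=3):   # candidate deliveries
    stock = 0
    feasible = True
    for q, d in zip(plan, orders):
        stock += q - d
        if stock < 0 or stock > max_stock:          # orders met, stock capped
            feasible = False
            break
    if feasible:
        cost = sum(q * c for q, c in zip(plan, unit_cost))
        if best is None or cost < best[0]:
            best = (cost, plan)

best_cost, best_plan = best
```

The optimum pre-delivers 20 units in the cheap first month to avoid the expensive second month, exactly the kind of trade-off a delivery LP encodes.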

  6. The photon dose calculation algorithm used in breast radiotherapy has significant impact on the parameters of radiobiological models.

    Science.gov (United States)

    Petillion, Saskia; Swinnen, Ans; Defraene, Gilles; Verhoeven, Karolien; Weltens, Caroline; Van den Heuvel, Frank

    2014-07-08

    The comparison of the pencil beam dose calculation algorithm with modified Batho heterogeneity correction (PBC-MB) with the analytical anisotropic algorithm (AAA), and the mutual comparison of advanced dose calculation algorithms used in breast radiotherapy, have focused on differences between the physical dose distributions. Studies on the radiobiological impact of the algorithm (on both tumor control and moderate breast fibrosis prediction) are lacking. We therefore investigated the radiobiological impact of the dose calculation algorithm in whole breast radiotherapy. The clinical dose distributions of 30 breast cancer patients, calculated with PBC-MB, were recalculated with fixed monitor units using more advanced algorithms: AAA and Acuros XB. For the latter, both dose reporting modes were used (i.e., dose-to-medium and dose-to-water). Next, the tumor control probability (TCP) and the normal tissue complication probability (NTCP) of each dose distribution were calculated with the Poisson model and with the relative seriality model, respectively. The endpoint for the NTCP calculation was moderate breast fibrosis five years post treatment. The differences were checked for significance with the paired t-test. The more advanced algorithms predicted a significantly lower TCP and NTCP of moderate breast fibrosis than found in the corresponding clinical follow-up study based on PBC calculations. The differences varied between 1% and 2.1% for the TCP and between 2.9% and 5.5% for the NTCP of moderate breast fibrosis. The significant differences were eliminated by determining algorithm-specific model parameters using least squares fitting. Application of the new parameters to a second group of 30 breast cancer patients proved their appropriateness. In this study, we assessed the impact of the dose calculation algorithms used in whole breast radiotherapy on the parameters of the radiobiological models. The radiobiological impact was eliminated by
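The Poisson TCP model referred to above has a standard textbook form, sketched below with linear-quadratic cell survival; the parameter values are generic illustrations, not the study's fitted values:

```python
import math

# Poisson TCP sketch: TCP = exp(-N0 * SF(D)), with linear-quadratic cell
# survival after n fractions of dose d (hypothetical N0, alpha, beta).

def poisson_tcp(total_dose, dose_per_fraction, n0=1e7, alpha=0.3, beta=0.03):
    n_fractions = total_dose / dose_per_fraction
    log_sf = -n_fractions * (alpha * dose_per_fraction
                             + beta * dose_per_fraction ** 2)
    return math.exp(-n0 * math.exp(log_sf))        # probability no clonogen survives

tcp_50gy = poisson_tcp(50.0, 2.0)
tcp_45gy = poisson_tcp(45.0, 2.0)
```

Even this generic parameterization shows how steeply TCP responds to modest dose differences, which is why systematic dose shifts between algorithms translate into significant TCP differences.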

  7. New Mechanical Model for the Transmutation Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller

    2008-04-01

    A new mechanical model has been developed for implementation into the TRU fuel performance code. The new model differs from the existing FRAPCON 3 model, which it is intended to replace, in that it includes structural deformations (elasticity, plasticity, and creep) of the fuel. Also, the plasticity algorithm is based on the “plastic strain–total strain” approach, which should allow more rapid and assured convergence. The model treats three situations relative to interaction between the fuel and cladding: (1) an open gap between the fuel and cladding, such that there is no contact; (2) contact between the fuel and cladding where the contact pressure is below a threshold value, such that axial slippage occurs at the interface; and (3) contact between the fuel and cladding where the contact pressure is above a threshold value, such that axial slippage is prevented at the interface. The first stage of development included only the fuel. In this stage, results obtained from the model were compared with those from finite element analysis using ABAQUS on a problem involving elastic, plastic, and thermal strains; the two analyses showed essentially exact agreement through both loading and unloading of the fuel. After the cladding and fuel/clad contact were added, the model demonstrated the expected behavior through all potential phases of fuel/clad interaction, and convergence was achieved without difficulty in all plastic analyses performed. The code is currently in stand-alone form. Prior to implementation into the TRU fuel performance code, creep strains will have to be added to the model, and the model will have to be verified against an ABAQUS analysis that involves contact between the fuel and cladding.

  8. Waterflooding performance using Dykstra-Parsons as compared with numerical model performance

    Energy Technology Data Exchange (ETDEWEB)

    Mobarak, S.

    1975-01-01

    Multilayered models have been used by a number of investigators to represent heterogeneous reservoirs. The purpose of this note is to present waterflood performance for multilayered systems using the standard Dykstra-Parsons method, as compared with that predicted by the modified form using the equations given, and with results obtained using a numerical model. The predicted oil recovery using the Johnson charts or the standard Dykstra-Parsons recovery modulus chart is always conservative, if not overly pessimistic. The modified Dykstra-Parsons method, as explained in the text, shows good agreement with the numerical model.
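The Dykstra-Parsons method starts from the permeability-variation coefficient V = (k50 − k84.1)/k50; a sketch of its computation from a synthetic layer sample, assuming a log-normal permeability distribution:

```python
import math
import statistics

# Dykstra-Parsons coefficient of permeability variation from a (synthetic)
# layer sample: fit a log-normal, take the median k50 and the value one
# standard deviation below it, k84.1. The recovery charts then map V and
# mobility ratio to waterflood performance.

perms = [12.0, 25.0, 40.0, 55.0, 80.0, 120.0, 200.0, 310.0]  # md, hypothetical

log_k = [math.log(k) for k in perms]
mu = statistics.mean(log_k)
sigma = statistics.stdev(log_k)
k50 = math.exp(mu)                 # median of the fitted log-normal
k841 = math.exp(mu - sigma)        # permeability one std dev below the median
v_dp = (k50 - k841) / k50          # 0 = homogeneous, -> 1 = highly heterogeneous
```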

  9. Modeling performance measurement applications and implementation issues in DEA

    CERN Document Server

    Cook, Wade D

    2005-01-01

    Addresses advanced and new DEA methodology and techniques developed for modeling unique and new performance evaluation issues. Presents new DEA methodology and techniques via discussions on how to solve managerial problems. Provides an easy-to-use DEA software package, DEAFrontier (www.deafrontier.com), which is an excellent tool for both DEA researchers and practitioners.

  10. High Performance Computing tools for the Integrated Tokamak Modelling project

    Energy Technology Data Exchange (ETDEWEB)

    Guillerminet, B., E-mail: bernard.guillerminet@cea.f [Association Euratom-CEA sur la Fusion, IRFM, DSM, CEA Cadarache (France); Plasencia, I. Campos [Instituto de Fisica de Cantabria (IFCA), CSIC, Santander (Spain); Haefele, M. [Universite Louis Pasteur, Strasbourg (France); Iannone, F. [EURATOM/ENEA Fusion Association, Frascati (Italy); Jackson, A. [University of Edinburgh (EPCC) (United Kingdom); Manduchi, G. [EURATOM/ENEA Fusion Association, Padova (Italy); Plociennik, M. [Poznan Supercomputing and Networking Center (PSNC) (Poland); Sonnendrucker, E. [Universite Louis Pasteur, Strasbourg (France); Strand, P. [Chalmers University of Technology (Sweden); Owsiak, M. [Poznan Supercomputing and Networking Center (PSNC) (Poland)

    2010-07-15

    Fusion modelling and simulation are very challenging, and the High Performance Computing issues are addressed here. Toolsets for job launching and scheduling, data communication and visualization have been developed by the EUFORIA project and used with a plasma edge simulation code.

  11. Range-dependent sonar performance modelling during Battlespace Preparation 2007

    NARCIS (Netherlands)

    Raa, L.A. te; Lam, F.P.A.; Schouten M.W.; Janmaat, J.

    2009-01-01

    Spatial and temporal variations in sound speed can have substantial effects on sound propagation and hence on sonar performance. Operational oceanographic models can provide forecasts of oceanographic variables such as temperature, salinity and sound speed up to several days ahead. These four-dimensional fo

  12. Towards a Social Networks Model for Online Learning & Performance

    Science.gov (United States)

    Chung, Kon Shing Kenneth; Paredes, Walter Christian

    2015-01-01

    In this study, we develop a theoretical model to investigate the association between social network properties, "content richness" (CR) in academic learning discourse, and performance. CR is the extent to which one contributes content that is meaningful, insightful and constructive to aid learning and by social network properties we…

  13. Introducing Model Predictive Control for Improving Power Plant Portfolio Performance

    DEFF Research Database (Denmark)

    Edlund, Kristian Skjoldborg; Bendtsen, Jan Dimon; Børresen, Simon

    2008-01-01

    This paper introduces a model predictive control (MPC) approach for construction of a controller for balancing the power generation against consumption in a power system. The objective of the controller is to coordinate a portfolio consisting of multiple power plant units in the effort to perform...

  14. Evaluating hydrological model performance using information theory-based metrics

    Science.gov (United States)

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...

  15. Performance evaluation:= (process algebra + model checking) x Markov chains

    NARCIS (Netherlands)

    Hermanns, H.; Katoen, J.P.; Larsen, Kim G.; Nielsen, Mogens

    2001-01-01

    Markov chains are widely used in practice to determine system performance and reliability characteristics. The vast majority of applications considers continuous-time Markov chains (CTMCs). This tutorial paper shows how successful model specification and analysis techniques from concurrency theory c

  16. Performance in model transformations: experiments with ATL and QVT

    NARCIS (Netherlands)

    van Amstel, Marcel; Bosems, S.; Ivanov, Ivan; Ferreira Pires, Luis; Cabot, Jordi; Visser, Eelco

    Model transformations are increasingly being incorporated in software development processes. However, as systems being developed with transformations grow in size and complexity, the performance of the transformations tends to degrade. In this paper we investigate the factors that have an impact on

  17. An e-Procurement Model for Logistic Performance Increase

    NARCIS (Netherlands)

    Toma, Cristina; Vasilescu, Bogdan; Popescu, Catalin; Soliman, KS

    2009-01-01

    This paper discusses the suitability of an e-procurement system for increasing logistic performance, given the growth in fast Internet availability. In consequence, a model is derived and submitted for analysis. The scope of the research is limited to the intermediary goods importing sector for a be

  18. Performance Analysis of OFDM with Frequency Offset and Correction Model

    Institute of Scientific and Technical Information of China (English)

    QIN Sheng-ping; YIN Chang-chuan; LUO Tao; YUE Guang-xin

    2003-01-01

    The performance of OFDM with frequency offset is analyzed and simulated in this paper. It is concluded that the SIR degradation is very large and that the BER of an OFDM system with frequency offset is strongly affected. A BER calculation method is introduced and simulated. Assuming that the frequency offset is known, a frequency offset correction model is discussed.
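The inter-carrier interference behind this degradation can be reproduced in a few lines (generic OFDM model; the subcarrier count and offset are assumed values): transmit one subcarrier, apply a fractional frequency offset, and measure how much power leaks off the intended DFT bin:

```python
import cmath
import math

# ICI from carrier frequency offset in OFDM (toy simulation): a fractional
# offset eps spreads the power of subcarrier k0 over the other DFT bins.

N = 64          # number of subcarriers (assumed)
k0 = 10         # transmitted subcarrier index
eps = 0.1       # frequency offset as a fraction of the subcarrier spacing

time_signal = [cmath.exp(2j * math.pi * (k0 + eps) * n / N) for n in range(N)]
spectrum = [sum(s * cmath.exp(-2j * math.pi * k * n / N)
                for n, s in enumerate(time_signal)) / N
            for k in range(N)]

signal_power = abs(spectrum[k0]) ** 2                      # power left on the bin
ici_power = sum(abs(X) ** 2 for k, X in enumerate(spectrum) if k != k0)
sir_db = 10 * math.log10(signal_power / ici_power)
```

Even a 10% offset already pulls the per-subcarrier SIR down to roughly 15 dB in this toy setup, which is why offset estimation and correction are essential.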

  20. Stutter-Step Models of Performance in School

    Science.gov (United States)

    Morgan, Stephen L.; Leenman, Theodore S.; Todd, Jennifer J.; Kentucky; Weeden, Kim A.

    2013-01-01

    To evaluate a stutter-step model of academic performance in high school, this article adopts a unique measure of the beliefs of 12,591 high school sophomores from the Education Longitudinal Study, 2002-2006. Verbatim responses to questions on occupational plans are coded to capture specific job titles, the listing of multiple jobs, and the listing…

  1. Performance Analysis of MANET Routing Protocols in Different Mobility Models

    Directory of Open Access Journals (Sweden)

    Anuj K. Gupta

    2013-05-01

    Full Text Available A mobile ad-hoc network (MANET) is a network without any central administration or fixed infrastructure. It consists of a number of mobile nodes that send data packets through a wireless medium. There is always a need for a good routing protocol to establish connections between mobile nodes, since they possess a dynamically changing topology. Further, in all existing routing protocols, node mobility has always been one of the important characteristics determining the overall performance of the ad hoc network. Thus, it is essential to know about the various mobility models and their effect on routing protocols. In this paper, we compare different mobility models and provide an overview of their current research status. The main focus is on Random Mobility Models and Group Mobility Models. First, we present a survey of the characteristics, drawbacks and research challenges of mobility modeling. Finally, we present simulation results that illustrate the importance of choosing a mobility model in the simulation of an ad hoc network protocol, and we show how the performance results of an ad hoc network protocol change drastically as a result of changing the simulated mobility model.
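The Random Waypoint model, the most common of the random mobility models surveyed, can be sketched as follows; the area, speed range and step count are arbitrary illustration values:

```python
import math
import random

random.seed(42)

# Random Waypoint mobility sketch: each node repeatedly picks a random
# destination in the simulation area and moves toward it at a random speed.

AREA = 1000.0            # square side, metres (assumed)
V_MIN, V_MAX = 1.0, 20.0  # speed range, m/s (assumed)

def random_waypoint(steps, dt=1.0):
    x, y = random.uniform(0, AREA), random.uniform(0, AREA)
    path = [(x, y)]
    while len(path) < steps:
        dest = (random.uniform(0, AREA), random.uniform(0, AREA))
        speed = random.uniform(V_MIN, V_MAX)
        while len(path) < steps:
            dx, dy = dest[0] - x, dest[1] - y
            dist = math.hypot(dx, dy)
            if dist <= speed * dt:           # waypoint reached: pick a new one
                x, y = dest
                path.append((x, y))
                break
            x += dx / dist * speed * dt
            y += dy / dist * speed * dt
            path.append((x, y))
    return path

trace = random_waypoint(500)
```

Feeding traces like this one (versus a group mobility trace) into the same routing protocol simulation is what produces the drastically different performance results the paper reports.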

  2. PERFORMANCE EVALUATION METHOD FOR BUSINESS PROCESS OF MACHINERY MANUFACTURER BASED ON DEA/AHP HYBRID MODEL

    Institute of Scientific and Technical Information of China (English)

    WANG Ting; YI Shuping; YANG Yuanzhao

    2007-01-01

    A set of indices is proposed for performance evaluation of business processes with multiple inputs and multiple outputs, as found in machinery manufacturers. Based on the traditional methods of data envelopment analysis (DEA) and the analytic hierarchy process (AHP), a hybrid model called the DEA/AHP model is proposed for evaluating business process performance. With the proposed method, DEA is first used to develop a pairwise comparison matrix, and then AHP is applied to evaluate the performance of the business process using this matrix. The significant advantage of this hybrid model is the use of objective data instead of subjective human judgment for performance evaluation. In the case study, a business process reengineering (BPR) project with a hydraulic machinery manufacturer is used to demonstrate the effectiveness of the DEA/AHP model.
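The AHP half of the hybrid can be sketched as below: given a pairwise comparison matrix (here filled with hypothetical entries; in the DEA/AHP model they come from DEA results rather than human judgment), the priority weights are the principal eigenvector, obtainable by power iteration:

```python
# AHP priority weights from a pairwise comparison matrix via power
# iteration (hypothetical 3x3 matrix comparing three business processes).

A = [
    [1.0, 3.0, 5.0],     # process 1 vs processes 1..3
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

w = [1.0 / len(A)] * len(A)
for _ in range(100):                        # power iteration on A
    w_new = [sum(row[j] * w[j] for j in range(len(w))) for row in A]
    norm = sum(w_new)
    w = [v / norm for v in w_new]

priorities = w   # sums to 1; a larger weight means a better-performing process
```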

  3. Temporal diagnostic analysis of the SWAT model to detect dominant periods of poor model performance

    Science.gov (United States)

    Guse, Björn; Reusser, Dominik E.; Fohrer, Nicola

    2013-04-01

    Hydrological models generally include thresholds and non-linearities, such as snow-rain-temperature thresholds, non-linear reservoirs, infiltration thresholds and the like. When relating observed variables to modelling results, formal methods often calculate performance metrics over long periods, reporting model performance with only a few numbers. Such approaches are not well suited to comparing dominant processes between reality and model, or to better understanding when thresholds and non-linearities are driving model results. We present a combination of two temporally resolved model diagnostic tools to answer when a model is performing (not so) well and what the dominant processes are during these periods. We look at the temporal dynamics of parameter sensitivities and model performance to answer this question. For this, the eco-hydrological SWAT model is applied in the Treene lowland catchment in Northern Germany. As a first step, the temporal dynamics of parameter sensitivities are analyzed using the Fourier Amplitude Sensitivity Test (FAST). The sensitivities of the eight model parameters investigated show strong temporal variations. High sensitivities were detected most of the time for two groundwater parameters (GW_DELAY, ALPHA_BF) and one evaporation parameter (ESCO). The periods of high parameter sensitivity can be related to different phases of the hydrograph, with dominance of the groundwater parameters in the recession phases and of ESCO in baseflow and resaturation periods. Surface runoff parameters show high sensitivities during precipitation events in combination with high soil water contents. The dominant parameters indicate the controlling processes during a given period for the hydrological catchment. The second step comprised the temporal analysis of model performance. For each time step, model performance was characterized with a "finger print" consisting of a large set of performance measures. 
These finger prints were clustered into

  4. Gender consequences of a national performance-based funding model

    DEFF Research Database (Denmark)

    Nielsen, Mathias Wullum

    2015-01-01

    This article investigates the extent to which the Danish Bibliometric Research Indicator (BRI) reflects the performance of men and women differently. The model is based on a differentiated counting of peer-reviewed publications, awarding three and eight points for contributions to ‘well-regarded’ and highly selective journals and book publishers, and one and five points for equivalent scientific contributions via ‘normal level’ channels. On the basis of bibliometric data, the study shows that the BRI considerably widens the existing gender gap in researcher performance, since men on average receive more… and the indicator privileges collaborative research, which disadvantages women due to gender differences in collaborative network relations.

  5. Performance optimization of Jatropha biodiesel engine model using Taguchi approach

    Energy Technology Data Exchange (ETDEWEB)

    Ganapathy, T.; Murugesan, K.; Gakkhar, R.P. [Mechanical and Industrial Engineering Department, Indian Institute of Technology Roorkee, Roorkee 247 667 (India)

    2009-11-15

    This paper proposes a methodology for thermodynamic model analysis of a Jatropha biodiesel engine in combination with Taguchi's optimization approach to determine the optimum engine design and operating parameters. A thermodynamic model based on a two-zone Wiebe heat release function has been employed to simulate Jatropha biodiesel engine performance. Among the important engine design and operating parameters, 10 critical parameters were selected, assuming interactions between pairs of parameters. Using linear graph theory and the Taguchi method, an L16 orthogonal array was used to determine the layout of the engine test trials. To maximize the performance of the Jatropha biodiesel engine, the signal-to-noise ratio (SNR) for the higher-the-better (HTB) quality characteristic was used. The methodology correctly identified the compression ratio, the Wiebe heat release constants, and the combustion zone duration as the critical parameters affecting engine performance, compared to other parameters. (author)
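
The higher-the-better signal-to-noise ratio used in Taguchi analysis has a standard closed form, SNR = -10 log10((1/n) Σ 1/y_i²). The sketch below applies it to invented repeat-run data; the trial names and engine responses are hypothetical, not the paper's.

```python
import numpy as np

def snr_higher_the_better(y):
    """Taguchi SNR for 'higher the better' responses:
    SNR = -10 * log10( (1/n) * sum(1 / y_i^2) )."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# Hypothetical brake thermal efficiency (%) from repeat runs per trial.
trials = {"trial_1": [28.1, 28.4, 27.9], "trial_2": [30.2, 30.0, 30.5]}
snrs = {name: snr_higher_the_better(y) for name, y in trials.items()}
best = max(snrs, key=snrs.get)  # trial with the largest SNR wins
```

Ranking the orthogonal-array trials by this SNR is what identifies the parameter levels that maximize performance.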

  6. Performance Models and Risk Management in Communications Systems

    CERN Document Server

    Harrison, Peter; Rüstem, Berç

    2011-01-01

    This volume covers recent developments in the design, operation, and management of telecommunication and computer network systems in performance engineering and addresses issues of uncertainty, robustness, and risk. Uncertainty regarding loading and system parameters leads to challenging optimization and robustness issues. Stochastic modeling combined with optimization theory ensures the optimum end-to-end performance of telecommunication or computer network systems. In view of the diverse design options possible, supporting models have many adjustable parameters and choosing the best set for a particular performance objective is delicate and time-consuming. An optimization based approach determines the optimal possible allocation for these parameters. Researchers and graduate students working at the interface of telecommunications and operations research will benefit from this book. Due to the practical approach, this book will also serve as a reference tool for scientists and engineers in telecommunication ...

  7. Human task animation from performance models and natural language input

    Science.gov (United States)

    Esakov, Jeffrey; Badler, Norman I.; Jung, Moon

    1989-01-01

    Graphical manipulation of human figures is essential for certain types of human factors analyses such as reach, clearance, fit, and view. In many situations, however, the animation of simulated people performing various tasks may be based on more complicated functions involving multiple simultaneous reaches, critical timing, resource availability, and human performance capabilities. One rather effective means for creating such a simulation is through a natural language description of the tasks to be carried out. Given an anthropometrically sized figure and a geometric workplace environment, various simple actions such as reach, turn, and view can be effectively controlled from language commands or standard NASA checklist procedures. The commands may also be generated by external simulation tools. Task timing is determined from actual performance models, if available, such as strength models or Fitts' Law. The resulting action specifications are animated on a Silicon Graphics Iris workstation in real time.
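
Fitts' Law, mentioned as one of the performance models supplying task timing, predicts movement time from target distance and width as MT = a + b log2(2D/W). The fitted constants below are illustrative assumptions, not values from the system.

```python
import math

def fitts_movement_time(a, b, distance, width):
    """Fitts' Law: MT = a + b * log2(2D / W).
    a and b are empirically fitted constants (assumed values here)."""
    index_of_difficulty = math.log2(2.0 * distance / width)
    return a + b * index_of_difficulty

# Hypothetical constants: a = 0.1 s intercept, b = 0.15 s/bit.
mt_near = fitts_movement_time(0.1, 0.15, distance=0.2, width=0.05)  # ID = 3 bits
mt_far = fitts_movement_time(0.1, 0.15, distance=0.8, width=0.05)   # ID = 5 bits
```

A reach to a farther or smaller target gets a proportionally longer animated duration, which is exactly the kind of timing input the animation system needs.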

  8. Modeling the seakeeping performance of luxury cruise ships

    Science.gov (United States)

    Cao, Yu; Yu, Bao-Jun; Wang, Jian-Fang

    2010-09-01

    The seakeeping performance of a luxury cruise ship was evaluated during the concept design phase. By comparing numerical predictions based on 3-D linear potential flow theory in the frequency domain with the results of model tests, it was shown that the 3-D method predicted the seakeeping performance of the luxury cruise ship well. Based on the model, the seakeeping features of the luxury cruise ship were analyzed, and the influence of changes to the primary design parameters (center of gravity, inertial radius, etc.) was then examined. Based on the results, suggestions were proposed for choosing parameters that improve the seakeeping performance of luxury cruise ships during the concept design phase.

  9. Performance of GeantV EM Physics Models

    Energy Technology Data Exchange (ETDEWEB)

    Amadio, G.; et al.

    2016-10-14

    The recent progress in parallel hardware architectures with deeper vector pipelines or many-core technologies brings opportunities for HEP experiments to take advantage of SIMD and SIMT computing models. Launched in 2013, the GeantV project studies performance gains from propagating multiple particles in parallel, improving instruction throughput and data locality in HEP event simulation on modern parallel hardware architectures. Due to the complexity of the geometry description and physics algorithms of a typical HEP application, performance analysis is indispensable for identifying factors limiting parallel execution. In this report, we present design considerations and preliminary computing performance of GeantV physics models on coprocessors (Intel Xeon Phi and NVIDIA GPUs) as well as on mainstream CPUs.

  10. A performance model for the communication in fast multipole methods on high-performance computing platforms

    KAUST Repository

    Ibeid, Huda

    2016-03-04

    Exascale systems are predicted to have approximately 1 billion cores, assuming gigahertz cores. Limitations on affordable network topologies for distributed memory systems of such massive scale bring new challenges to the currently dominant parallel programming model. Currently, there are many efforts to evaluate the hardware and software bottlenecks of exascale designs. It is therefore of interest to model application performance and to understand what changes need to be made to ensure extrapolated scalability. The fast multipole method (FMM) was originally developed for accelerating N-body problems in astrophysics and molecular dynamics but has recently been extended to a wider range of problems. Its high arithmetic intensity combined with its linear complexity and asynchronous communication patterns makes it a promising algorithm for exascale systems. In this paper, we discuss the challenges for FMM on current parallel computers and future exascale architectures, with a focus on internode communication. We focus on the communication part only; the efficiency of the computational kernels is beyond the scope of the present study. We develop a performance model that considers the communication patterns of the FMM and observe a good match between our model and the actual communication time on four high-performance computing (HPC) systems, when latency, bandwidth, network topology, and multicore penalties are all taken into account. To our knowledge, this is the first formal characterization of internode communication in FMM that validates the model against actual measurements of communication time. The ultimate communication model is predictive in an absolute sense; however, on complex systems, this objective is often out of reach or of a difficulty out of proportion to its benefit when there exists a simpler model that is inexpensive and sufficient to guide coding decisions leading to improved scaling. The current model provides such guidance.
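
A minimal latency-bandwidth ("postal") model conveys the flavor of such a communication model: T = α·(number of messages) + β·(bytes sent). The paper's model additionally accounts for network topology and multicore penalties; the constants and message sizes below are illustrative assumptions.

```python
def comm_time(n_messages, total_bytes, latency=1e-6, inv_bandwidth=1e-10):
    """Simplified postal model of internode communication:
    T = alpha * n_messages + beta * total_bytes.
    alpha (s/message) and beta (s/byte) are assumed constants."""
    return latency * n_messages + inv_bandwidth * total_bytes

# Exchanging 8 KB with each of 26 neighbours: many small messages vs
# one aggregated message of the same total volume.
t_many = comm_time(26, 26 * 8192)
t_bulk = comm_time(1, 26 * 8192)
```

Even this toy model exposes the latency-dominated regime that motivates message aggregation in FMM communication phases.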

  11. Reference Manual for the System Advisor Model's Wind Power Performance Model

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, J.; Jorgenson, J.; Gilman, P.; Ferguson, T.

    2014-08-01

    This manual describes the National Renewable Energy Laboratory's System Advisor Model (SAM) wind power performance model. The model calculates the hourly electrical output of a single wind turbine or of a wind farm. The wind power performance model requires information about the wind resource, wind turbine specifications, wind farm layout (if applicable), and costs. In SAM, the performance model can be coupled to one of the financial models to calculate economic metrics for residential, commercial, or utility-scale wind projects. This manual describes the algorithms used by the wind power performance model, which is available in the SAM user interface and as part of the SAM Simulation Core (SSC) library, and is intended to supplement the user documentation that comes with the software.
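
The core of an hourly wind power calculation of this kind is interpolation of each hour's wind speed on the turbine's power curve. The curve and cut-in/cut-out limits below are hypothetical; SAM's actual algorithm, per the manual, also accounts for effects such as wind farm layout (wake) losses.

```python
import numpy as np

# Hypothetical turbine power curve: wind speed (m/s) -> power (kW).
curve_speed = np.array([3.0, 5.0, 8.0, 11.0, 13.0, 25.0])
curve_power = np.array([0.0, 150.0, 900.0, 1900.0, 2000.0, 2000.0])

def turbine_output(wind_speed, cut_in=3.0, cut_out=25.0):
    """Linear interpolation on the power curve, zero outside the
    cut-in/cut-out range. A simplified sketch of one turbine's
    hourly output, not SAM's full algorithm."""
    w = np.asarray(wind_speed, dtype=float)
    p = np.interp(w, curve_speed, curve_power)
    return np.where((w < cut_in) | (w > cut_out), 0.0, p)

hourly_wind = np.array([2.0, 6.5, 12.0, 26.0])   # m/s
hourly_kw = turbine_output(hourly_wind)
```

Summing `hourly_kw` over a year gives the annual energy figure that the financial models then consume.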

  12. Models for the energy performance of low-energy houses

    DEFF Research Database (Denmark)

    Andersen, Philip Hvidthøft Delff

    …such as mechanical ventilation, floor heating, and control of the lighting effect, the heat dynamics must be taken into account. Hence, this thesis provides methods for data-driven modeling of the heat dynamics of modern buildings. While most of the work in this thesis is related to characterization of heat dynamics - referred to as "grey-box" modeling - one-step predictions can be generated and used for model validation by testing statistically whether the model describes all variation and dynamics observed in the data. The possibility of validating the model dynamics is a great advantage of the use of stochastic… The building is well-insulated and features large, modern, energy-efficient windows and floor heating. These features lead to increased non-linear responses to solar radiation and longer time constants. The building is equipped with advanced control and measuring equipment. Experiments are designed and performed…
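
One-step prediction from the simplest grey-box candidate, a first-order RC model C dT/dt = (T_amb - T)/R + Q, can be sketched as follows. The parameter values and simulated data are illustrative assumptions, not estimates from the thesis.

```python
import numpy as np

def one_step_predict(T0, T_amb, Q, R=5.0, C=10.0, dt=1.0):
    """One-step-ahead indoor temperature from a first-order RC model:
    C dT/dt = (T_amb - T)/R + Q.
    R (K/kW), C (kWh/K), and the model order are assumed values."""
    return T0 + (dt / C) * ((T_amb - T0) / R + Q)

# Simulate a "true" building (model + small noise), then check that
# one-step residuals look like white noise -- the validation idea
# described in the abstract.
rng = np.random.default_rng(1)
T = [20.0]
T_amb = 5.0
for k in range(48):
    T.append(one_step_predict(T[-1], T_amb, Q=1.5) + rng.normal(0, 0.01))
T = np.array(T)
residuals = T[1:] - one_step_predict(T[:-1], T_amb, Q=1.5)
```

If the residuals showed autocorrelation or unexplained variance, the model order or inputs (e.g. solar radiation) would need to be extended.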

  13. Acoustic performance of industrial mufflers with CAE modeling and simulation

    Science.gov (United States)

    Jeon, Soohong; Kim, Daehwan; Hong, Chinsuk; Jeong, Weuibong

    2014-12-01

    This paper investigates the noise transmission performance of industrial mufflers widely used in ships, based on CAE modeling and simulation. Since industrial mufflers have very complicated internal structures, the conventional Transfer Matrix Method (TMM) is of limited use. CAE modeling and simulation is therefore required, incorporating commercial software: CATIA for geometry modeling, MSC/PATRAN for FE meshing, and LMS/SYSNOISE for analysis. The main difficulties in this study arise from the complicated arrangement of reactive elements, perforated walls, and absorption materials. The reactive elements and absorbent materials are modeled by applying boundary conditions given by impedance. The perforated walls are modeled by applying the transfer impedance on the duplicated node mesh. The CAE approach presented in this paper is verified by comparison with the theoretical solution for a concentric-tube resonator and is then applied to industrial mufflers.
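
As a sanity check of the kind the paper performs, the plane-wave transmission loss of a simple expansion chamber, a classical benchmark closely related to the concentric-tube resonator, has a closed form the CAE result can be compared against. The dimensions below are invented.

```python
import math

def expansion_chamber_tl(freq, area_ratio, length, c=343.0):
    """Classical plane-wave transmission loss (dB) of a simple
    expansion chamber of area ratio m and length L:
    TL = 10*log10(1 + 0.25*(m - 1/m)^2 * sin^2(kL))."""
    k = 2.0 * math.pi * freq / c
    m = area_ratio
    return 10.0 * math.log10(1.0 + 0.25 * (m - 1.0 / m) ** 2
                             * math.sin(k * length) ** 2)

# TL peaks where sin(kL) = 1 (f = c/4L) and vanishes where kL = pi.
peak_tl = expansion_chamber_tl(343.0 / (4 * 0.5), area_ratio=9.0, length=0.5)
trough_tl = expansion_chamber_tl(343.0 / (2 * 0.5), area_ratio=9.0, length=0.5)
```

A CAE model that reproduces these peaks and pass-through frequencies for the simple geometry gains credibility before being applied to the full muffler.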

  14. Acoustic performance of industrial mufflers with CAE modeling and simulation

    Directory of Open Access Journals (Sweden)

    Jeon Soohong

    2014-12-01

    Full Text Available This paper investigates the noise transmission performance of industrial mufflers widely used in ships, based on CAE modeling and simulation. Since industrial mufflers have very complicated internal structures, the conventional Transfer Matrix Method (TMM) is of limited use. CAE modeling and simulation is therefore required, incorporating commercial software: CATIA for geometry modeling, MSC/PATRAN for FE meshing, and LMS/SYSNOISE for analysis. The main difficulties in this study arise from the complicated arrangement of reactive elements, perforated walls, and absorption materials. The reactive elements and absorbent materials are modeled by applying boundary conditions given by impedance. The perforated walls are modeled by applying the transfer impedance on the duplicated node mesh. The CAE approach presented in this paper is verified by comparison with the theoretical solution for a concentric-tube resonator and is then applied to industrial mufflers.

  15. Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance

    Energy Technology Data Exchange (ETDEWEB)

    Estep, Donald [Colorado State Univ., Fort Collins, CO (United States); El-Azab, Anter [Florida State Univ., Tallahassee, FL (United States); Pernice, Michael [Idaho National Lab. (INL), Idaho Falls, ID (United States); Peterson, John W. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Polyakov, Peter [Univ. of Wyoming, Laramie, WY (United States); Tavener, Simon [Colorado State Univ., Fort Collins, CO (United States); Xiu, Dongbin [Purdue Univ., West Lafayette, IN (United States); Univ. of Utah, Salt Lake City, UT (United States)

    2017-03-23

    In this project, we will address the challenges associated with constructing high-fidelity multiscale models of nuclear fuel performance. We (*) propose a novel approach for coupling mesoscale and macroscale models, (*) devise efficient numerical methods for simulating the coupled system, and (*) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis of computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratory on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.

  16. Modeling time-lagged reciprocal psychological empowerment-performance relationships.

    Science.gov (United States)

    Maynard, M Travis; Luciano, Margaret M; D'Innocenzo, Lauren; Mathieu, John E; Dean, Matthew D

    2014-11-01

    Employee psychological empowerment is widely accepted as a means for organizations to compete in increasingly dynamic environments. Previous empirical research and meta-analyses have demonstrated that employee psychological empowerment is positively related to several attitudinal and behavioral outcomes including job performance. While this research positions psychological empowerment as an antecedent influencing such outcomes, a close examination of the literature reveals that this relationship is primarily based on cross-sectional research. Notably, evidence supporting the presumed benefits of empowerment has failed to account for potential reciprocal relationships and endogeneity effects. Accordingly, using a multiwave, time-lagged design, we model reciprocal relationships between psychological empowerment and job performance using a sample of 441 nurses from 5 hospitals. Incorporating temporal effects in a staggered research design and using structural equation modeling techniques, our findings provide support for the conventional positive correlation between empowerment and subsequent performance. Moreover, accounting for the temporal stability of variables over time, we found support for empowerment levels as positive influences on subsequent changes in performance. Finally, we also found support for the reciprocal relationship, as performance levels were shown to relate positively to changes in empowerment over time. Theoretical and practical implications of the reciprocal psychological empowerment-performance relationships are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  17. A PERFORMANCE MANAGEMENT MODEL FOR PHYSICAL ASSET MANAGEMENT

    Directory of Open Access Journals (Sweden)

    J.L. Jooste

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: There has been an emphasis shift from maintenance management towards asset management, where the focus is on reliable and operational equipment and on effective assets at optimum life-cycle costs. A challenge in the manufacturing industry is to develop an asset performance management model that is integrated with business processes and strategies. The authors developed the APM2 model to satisfy that requirement. The model has a generic reference structure and is supported by operational protocols to assist in operations management. It facilitates performance measurement, business integration and continuous improvement, whilst exposing industry to the latest developments in asset performance management.

    AFRIKAANS ABSTRACT (translated): There has been a shift in emphasis from maintenance management to asset management, with the focus on reliable and operational equipment, as well as effective assets at optimum life-cycle cost. A challenge in the manufacturing industry is the development of a performance model for assets that is integrated with business processes and strategies. The authors developed the APM2 model to meet this need. The model has a generic reference structure, supported by operational instructions that promote operations management. It facilitates performance management, business integration, and continuous improvement, while also exposing the industry to the latest developments in the performance management of assets.

  18. A personality trait-based interactionist model of job performance.

    Science.gov (United States)

    Tett, Robert P; Burnett, Dawn D

    2003-06-01

    Evidence for situational specificity of personality-job performance relations calls for better understanding of how personality is expressed as valued work behavior. On the basis of an interactionist principle of trait activation (R. P. Tett & H. A. Guterman, 2000), a model is proposed that distinguishes among 5 situational features relevant to trait expression (job demands, distracters, constraints, releasers, and facilitators), operating at task, social, and organizational levels. Trait-expressive work behavior is distinguished from (valued) job performance in clarifying the conditions favoring personality use in selection efforts. The model frames linkages between situational taxonomies (e.g., J. L. Holland's [1985] RIASEC model) and the Big Five and promotes useful discussion of critical issues, including situational specificity, personality-oriented job analysis, team building, and work motivation.

  19. PHARAO Laser Source Flight Model: Design and Performances

    CERN Document Server

    Lévèque, Thomas; Esnault, François-Xavier; Delaroche, Christophe; Massonnet, Didier; Grosjean, Olivier; Buffe, Fabrice; Torresi, Patrizia; Bomer, Thierry; Pichon, Alexandre; Béraud, Pascal; Lelay, Jean-Pierre; Thomin, Stéphane; Laurent, Philippe

    2015-01-01

    In this paper, we describe the design and the main performances of the PHARAO laser source flight model. PHARAO is a laser cooled cesium clock specially designed for operation in space and the laser source is one of the main sub-systems. The flight model presented in this work is the first remote-controlled laser system designed for spaceborne cold atom manipulation. The main challenges arise from mechanical compatibility with space constraints, which impose a high level of compactness, a low electric power consumption, a wide range of operating temperature and a vacuum environment. We describe the main functions of the laser source and give an overview of the main technologies developed for this instrument. We present some results of the qualification process. The characteristics of the laser source flight model, and their impact on the clock performances, have been verified in operational conditions.

  20. Performance Comparison of Sub Phonetic Model with Input Signal Processing

    Directory of Open Access Journals (Sweden)

    Dr E. Ramaraj

    2006-01-01

    Full Text Available The quest for a better model of signal transformation for speech has driven the development of better signal representations and algorithms. This article explores the word model, a concatenation of state-dependent senones, as an alternative to the phoneme. The work aims to combine the senone with Input Signal Processing (ISP), an algorithm that has been applied to phonemes with considerable success, to compare the performance of senones with ISP against phonemes with ISP, and to present the resulting analysis. The research model is implemented on the SPHINX IV [4] speech engine owing to its flexibility toward new algorithms, its robustness, and its performance.

  1. A multiserver multiqueue network: modeling and performance analysis

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A new category of system model, the multiserver multiqueue network (MSMQN), is proposed for distributed systems such as geographically distributed Web-server clusters. An MSMQN comprises multiple multiserver multiqueue (MSMQ) nodes distributed over the network, and every node consists of a number of servers that each contain multiple priority queues for waiting customers. An incoming request can be distributed to a waiting queue of any server in any node, according to a routing policy integrating the node-selection policy at network level, the request-dispatching policy at node level, and the request-scheduling policy at server level. The model is investigated using stochastic high-level Petri net (SHLPN) modeling and performance analysis techniques. The performance metrics concerned include the delay time of requests in an MSMQ node and the response time perceived by the users. A numerical example shows the efficiency of the performance analysis technique.
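
Analytic queueing results provide reference points for such models. For a single node approximated as an M/M/c queue, the Erlang C formula gives the probability an arriving request waits and the mean delay; the approximation and the rates below are illustrative, not taken from the paper.

```python
import math

def erlang_c(c, load):
    """Erlang C: probability an arrival waits in an M/M/c queue,
    where load = lambda / mu and stability requires load < c."""
    s = sum(load ** k / math.factorial(k) for k in range(c))
    last = load ** c / math.factorial(c) * c / (c - load)
    return last / (s + last)

def mean_wait(c, lam, mu):
    """Mean waiting time in queue: W_q = P(wait) / (c*mu - lambda)."""
    return erlang_c(c, lam / mu) / (c * mu - lam)

# Two servers, arrivals at 1.5 req/s, service rate 1 req/s per server.
p_wait = erlang_c(2, 1.5)
w_q = mean_wait(2, 1.5, 1.0)
```

Comparing such closed-form delays against the SHLPN results is a standard way to validate a Petri-net performance model on a simple special case.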

  2. Frequency modulated continuous wave lidar performance model for target detection

    Science.gov (United States)

    Du Bosq, Todd W.; Preece, Bradley L.

    2017-05-01

    The desire to provide the warfighter with both range and reflected-intensity information is growing to meet expanding operational needs. LIDAR imaging systems can provide the user with intensity, range, and even velocity information about a scene. The ability to predict the performance of LIDAR systems is critical for the development of future designs without the need to conduct time-consuming and costly field studies. Performance modeling of a frequency modulated continuous wave (FMCW) LIDAR system is challenging due to the addition of the chirped laser source and waveform mixing. The FMCW LIDAR model is implemented in the NV-IPM framework using the custom component generation tool. This paper presents an overview of FMCW LIDAR, the customized LIDAR components, and a series of trade studies using the LIDAR model.
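
In an FMCW lidar, the range of a stationary target follows from the beat frequency between the transmitted and received chirps, R = c f_b T / (2B) for a sawtooth chirp of period T and bandwidth B. The waveform parameters below are illustrative assumptions.

```python
def fmcw_range(beat_freq, chirp_period, bandwidth, c=3.0e8):
    """Range from FMCW beat frequency (stationary target, sawtooth
    chirp): R = c * f_b * T / (2 * B)."""
    return c * beat_freq * chirp_period / (2.0 * bandwidth)

# A 1 GHz chirp swept over 1 ms: a 500 kHz beat maps to 75 m.
r = fmcw_range(beat_freq=500e3, chirp_period=1e-3, bandwidth=1e9)
```

The range resolution c/(2B), here 15 cm, is the other first-order quantity a performance model of such a system must capture.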

  3. Tiling for Performance Tuning on Different Models of GPUs

    CERN Document Server

    Xu, Chang; Jenkins, Samantha

    2010-01-01

    The strategy of using CUDA-compatible GPUs as a parallel computation solution to improve program performance has become more and more widely adopted in the two years since the CUDA platform was released. Its benefit extends from the graphics domain to many other computationally intensive domains. Tiling, as the most general and important technique, is widely used for optimization in CUDA programs. New models of GPUs with better compute capabilities have, however, been released, and new versions of the CUDA SDK were also released. These updated compute capabilities must be considered when optimizing with the tiling technique. In this paper, we implement image interpolation algorithms as a test case to discuss how different tiling strategies affect a program's performance. We especially focus on how different models of GPUs affect tiling's effectiveness by executing the same program on test platforms equipped with two different models of GPUs. The results demonstrate that an optimized tiling...

  4. An improved model for TPV performance predictions and optimization

    Science.gov (United States)

    Schroeder, K. L.; Rose, M. F.; Burkhalter, J. E.

    1997-03-01

    A model has previously been presented for calculating the performance of a TPV system. This model has been revised into a general-purpose algorithm, improved in fidelity, and is presented here. The basic model is an energy-based formulation and evaluates both the radiant and heat-source elements of a combustion-based system. Improvements in the radiant calculations include the use of ray-tracing formulations and view factors for evaluating various flat-plate and cylindrical configurations. Calculation of photocell temperature and performance parameters as a function of position and incident power has also been incorporated. Heat source calculations have been fully integrated into the code by incorporating a modified version of the NASA Complex Chemical Equilibrium Compositions and Applications (CEA) code. Additionally, coding has been incorporated to allow optimization of various system parameters and configurations. Several example cases are presented and compared, and an optimum flat-plate emitter/filter/photovoltaic configuration is also described.

  5. PHARAO laser source flight model: Design and performances

    Energy Technology Data Exchange (ETDEWEB)

    Lévèque, T., E-mail: thomas.leveque@cnes.fr; Faure, B.; Esnault, F. X.; Delaroche, C.; Massonnet, D.; Grosjean, O.; Buffe, F.; Torresi, P. [Centre National d’Etudes Spatiales, 18 avenue Edouard Belin, 31400 Toulouse (France); Bomer, T.; Pichon, A.; Béraud, P.; Lelay, J. P.; Thomin, S. [Sodern, 20 Avenue Descartes, 94451 Limeil-Brévannes (France); Laurent, Ph. [LNE-SYRTE, CNRS, UPMC, Observatoire de Paris, 61 avenue de l’Observatoire, 75014 Paris (France)

    2015-03-15

    In this paper, we describe the design and the main performances of the PHARAO laser source flight model. PHARAO is a laser cooled cesium clock specially designed for operation in space and the laser source is one of the main sub-systems. The flight model presented in this work is the first remote-controlled laser system designed for spaceborne cold atom manipulation. The main challenges arise from mechanical compatibility with space constraints, which impose a high level of compactness, a low electric power consumption, a wide range of operating temperature, and a vacuum environment. We describe the main functions of the laser source and give an overview of the main technologies developed for this instrument. We present some results of the qualification process. The characteristics of the laser source flight model, and their impact on the clock performances, have been verified in operational conditions.

  6. Performance and Prediction: Bayesian Modelling of Fallible Choice in Chess

    Science.gov (United States)

    Haworth, Guy; Regan, Ken; di Fatta, Giuseppe

    Evaluating agents in decision-making applications requires assessing their skill and predicting their behaviour. Both are well developed in Poker-like situations, but less so in more complex game and model domains. This paper addresses both tasks by using Bayesian inference in a benchmark space of reference agents. The concepts are explained and demonstrated using the game of chess but the model applies generically to any domain with quantifiable options and fallible choice. Demonstration applications address questions frequently asked by the chess community regarding the stability of the rating scale, the comparison of players of different eras and/or leagues, and controversial incidents possibly involving fraud. The last include alleged under-performance, fabrication of tournament results, and clandestine use of computer advice during competition. Beyond the model world of games, the aim is to improve fallible human performance in complex, high-value tasks.
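
The essence of Bayesian inference over a benchmark space of reference agents can be sketched as a discrete posterior update based on whether each observed move matches an agent's preferred move. The agents and their match probabilities below are invented for illustration; the paper's reference agents are engine-based and more finely graded.

```python
# Hypothetical reference agents: P(player's move matches the agent's choice).
agents = {"weak": 0.35, "medium": 0.50, "strong": 0.65}
prior = {name: 1.0 / len(agents) for name in agents}

def update(belief, matched):
    """One Bayes step: multiply by the likelihood of the observation
    under each agent, then renormalize to a probability distribution."""
    post = {a: p * (agents[a] if matched else 1.0 - agents[a])
            for a, p in belief.items()}
    z = sum(post.values())
    return {a: p / z for a, p in post.items()}

# A player matching the reference choice on 8 of 10 moves.
belief = prior
for matched in [True] * 8 + [False] * 2:
    belief = update(belief, matched)
best = max(belief, key=belief.get)
```

The posterior concentrates on the agent whose fallibility profile best explains the observed choices, which is the basis for both skill assessment and fraud screening.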

  7. A conceptual model to improve performance in virtual teams

    Directory of Open Access Journals (Sweden)

    Shopee Dube

    2016-04-01

    Full Text Available Background: The vast improvement in communication technologies and sophisticated project management tools, methods and techniques has allowed geographically and culturally diverse groups to operate and function in a virtual environment. To succeed in this virtual environment, where time and space are becoming increasingly irrelevant, organisations must define new ways of implementing initiatives. This virtual environment phenomenon has brought about the formation of virtual project teams that allow organisations to harness the skills and know-how of the best resources, irrespective of their location. Objectives: The aim of this article was to investigate performance criteria and develop a conceptual model which can be applied to enhance the success of virtual project teams. There are no clear guidelines on the performance criteria for managing virtual project teams. Method: A qualitative research methodology was used in this article. The purpose of the content analysis was to explore the literature to understand the concept of performance in virtual project teams and to summarise the findings of the literature reviewed. Results: The research identified a set of performance criteria for virtual project teams: leadership, trust, communication, team cooperation, reliability, motivation, comfort and social interaction. These were used to conceptualise the model. Conclusion: The conceptual model can be used in a holistic way to determine the overall performance of the virtual project team, but each factor can be analysed individually to determine its impact on the overall performance. Knowledge of the performance criteria for virtual project teams could aid project managers in enhancing the success of these teams and in taking a different approach to better manage and coordinate them.

  8. A Neural Network Model for the Correlation between Sprinters’ Pre-competition Anxiety and Competition Performance

    Directory of Open Access Journals (Sweden)

    Jiwei Yao

    2013-06-01

    Full Text Available The sprint is an important event in track and field competition, in which athletes' pre-competition anxiety strongly affects how well they bring their abilities into play, and thereby their final performance. For this reason, studying the correlation between sprinters' pre-competition anxiety and their competition performance is of great significance for predicting athletes' performance under different anxiety states. After analyzing domestic and foreign research on sport anxiety and sport performance, the study applied the CSAI-2 (1994) questionnaire to investigate athletes' anxiety in the sprint competition of a university sports meeting in Changsha. Moreover, based on a neural network model, the study constructed models relating athletes' pre-competition anxiety to their competition performance. In addition, curves relating athletes' pre-competition anxiety to specific performance are also presented.

  9. Improving winter leaf area index estimation in evergreen coniferous forests and its significance in carbon and water fluxes modeling

    Science.gov (United States)

    Wang, R.; Chen, J. M.; Luo, X.

    2016-12-01

    Modeling of carbon and water fluxes at continental and global scales requires remotely sensed LAI as input. For evergreen coniferous forests (ENF), severely underestimated winter LAI has been an issue for most available remote sensing products, which can introduce negative bias into modeled Gross Primary Productivity (GPP) and evapotranspiration (ET). Unlike deciduous trees, which shed all their leaves in winter, conifers retain part of their needles, and the proportion retained depends on needle longevity. In this work, the Boreal Ecosystem Productivity Simulator (BEPS) was used to model GPP and ET at eight FLUXNET Canada ENF sites. Two sets of LAI were used as model inputs: the 250 m, 10-day University of Toronto (U of T) LAI product Version 2, and a corrected LAI based on the U of T product and the needle longevity of the corresponding tree species at individual sites. Validating modeled daily GPP (gC/m2) against site measurements, the mean RMSE over the eight sites decreases from 1.85 to 1.15, and the bias changes from -0.99 to -0.19. For daily ET (mm), the mean RMSE decreases from 0.63 to 0.33, and the bias changes from -0.31 to -0.16. Most of the improvement occurs at the beginning and end of the growing season, when the LAI correction is large and temperature is still suitable for photosynthesis and transpiration. For the dormant season, the improvement in simulated ET mostly comes from the increased interception of precipitation brought by the elevated LAI. The results indicate that model performance can be improved by applying the corrected LAI, and that improving winter remotely sensed LAI can have a large impact on land surface carbon and energy budgets.
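
The abstract does not give the correction formula itself. As a hedged illustration, one plausible needle-longevity-based correction raises winter LAI toward a floor implied by needle retention (a species keeping needles for n years sheds roughly 1/n of them annually); the fraction and the floor rule here are assumptions, not the paper's method:

```python
def correct_winter_lai(lai_series, needle_longevity_years):
    """Raise winter LAI values toward a floor implied by needle retention.

    Hypothetical correction: a conifer keeping needles for n years sheds
    roughly 1/n of them each year, so winter LAI should not drop below
    (n - 1)/n of the growing-season maximum.
    """
    retained_fraction = (needle_longevity_years - 1) / needle_longevity_years
    lai_floor = retained_fraction * max(lai_series)
    return [max(lai, lai_floor) for lai in lai_series]

# Hypothetical 10-day composites for one year; winter values are underestimated.
raw_lai = [0.8, 0.9, 1.5, 3.2, 4.0, 3.8, 2.5, 1.0, 0.7]
corrected = correct_winter_lai(raw_lai, needle_longevity_years=4)
```

With a 4-year needle longevity, winter values are lifted to 75% of the summer maximum while growing-season values are left untouched.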

  10. Outdoor FSO Communications Under Fog: Attenuation Modeling and Performance Evaluation

    KAUST Repository

    Esmail, Maged Abdullah

    2016-07-18

    Fog is considered to be a primary challenge for free space optics (FSO) systems. It may cause attenuation of up to hundreds of decibels per kilometer. Hence, accurate modeling of fog attenuation will help telecommunication operators to engineer and appropriately manage their networks. In this paper, we examine fog measurement data from several locations in Europe and the United States and derive a unified channel attenuation model. Compared with existing attenuation models, our proposed model reduces the average root-mean-square error (RMSE) by at least 9 dB. Moreover, we have investigated the statistical behavior of the channel and developed a probabilistic model under stochastic fog conditions. Furthermore, we studied the performance of the FSO system in terms of various performance metrics, including signal-to-noise ratio (SNR), bit-error rate (BER), and channel capacity. Our results show that in communication environments with frequent fog, FSO is typically a short-range data transmission technology. Therefore, FSO will have its preferred market segment in future wireless fifth-generation/sixth-generation (5G/6G) networks with cell diameters below 1 km. Moreover, the results of our modeling and analysis can be applied in determining the switching/thresholding conditions in highly reliable hybrid FSO/radio-frequency (RF) networks.
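
The unified model itself is not reproduced in the abstract. For context, the classical Kim model (not the paper's model) relates specific fog attenuation to visibility, with a piecewise wavelength exponent; a minimal sketch, taking visibility in km and wavelength in nm:

```python
def kim_attenuation_db_per_km(visibility_km, wavelength_nm=1550):
    """Specific attenuation (dB/km) from visibility, per the classical Kim model."""
    v = visibility_km
    if v > 50:
        q = 1.6
    elif v > 6:
        q = 1.3
    elif v > 1:
        q = 0.16 * v + 0.34
    elif v > 0.5:
        q = v - 0.5
    else:
        q = 0.0  # dense fog: attenuation becomes independent of wavelength
    return (3.91 / v) * (wavelength_nm / 550.0) ** (-q)
```

For 50 m visibility this gives roughly 78 dB/km regardless of wavelength, which illustrates why FSO under fog is a short-range technology.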

  11. Significant Effect of a Pre-Exercise High-Fat Meal after a 3-Day High-Carbohydrate Diet on Endurance Performance

    Directory of Open Access Journals (Sweden)

    Ikuma Murakami

    2012-06-01

    Full Text Available We investigated the effect of the macronutrient composition of pre-exercise meals on endurance performance. Subjects consumed a high-carbohydrate diet at each meal for 3 days, followed by a high-fat meal (HFM; 1007 ± 21 kcal, 30% CHO, 55% F and 15% P) or a high-carbohydrate meal (HCM; 1007 ± 21 kcal, 71% CHO, 20% F and 9% P) 4 h before exercise. Furthermore, just prior to the test, subjects in the HFM group ingested either a maltodextrin jelly (M) or a placebo jelly (P), while subjects in the HCM group ingested a placebo jelly. Endurance performance was measured as running time until exhaustion at a speed between the lactate threshold and the onset of blood lactate accumulation. All subjects participated in each trial, randomly assigned at weekly intervals. We observed that the time until exhaustion was significantly longer in the HFM + M condition (p < 0.05) than in the HFM + P and HCM + P conditions. Furthermore, the total amount of fat oxidation during exercise was significantly higher in HFM + M and HFM + P than in HCM + P (p < 0.05). These results suggest that ingestion of an HFM prior to exercise is more favorable for endurance performance than an HCM, and that HFM plus maltodextrin ingestion following 3 days of carbohydrate loading enhances endurance running performance.

  12. A puzzle form of a non-verbal intelligence test gives significantly higher performance measures in children with severe intellectual disability

    Directory of Open Access Journals (Sweden)

    Crewther Sheila G

    2008-08-01

    Full Text Available Abstract Background Assessment of the 'potential intellectual ability' of children with severe intellectual disability (ID) is limited, as current tests designed for normal children do not maintain their interest. Thus a manual puzzle version of the Raven's Coloured Progressive Matrices (RCPM) was devised to appeal to the attentional and sensory preferences and language limitations of children with ID. It was hypothesized that performance on the book and manual puzzle forms would not differ for typically developing children but that children with ID would perform better on the puzzle form. Methods The first study assessed the validity of this puzzle form of the RCPM for 76 typically developing children in a test-retest crossover design, with a 3-week interval between tests. A second study tested performance and completion rate for the puzzle form compared to the book form in a sample of 164 children with ID. Results In the first study, no significant difference was found between performance on the puzzle and book forms in typically developing children, irrespective of the order of completion. The second study demonstrated a significantly higher performance and completion rate for the puzzle form compared to the book form in the ID population. Conclusion Similar performance on the book and puzzle forms of the RCPM by typically developing children suggests that both forms measure the same construct. These findings suggest that the puzzle form does not require greater cognitive ability but demands sensory-motor attention and limits distraction in children with severe ID. Thus, we suggest the puzzle form of the RCPM is a more reliable measure of the non-verbal mentation of children with severe ID than the book form.

  13. Integrated healthcare networks' performance: a growth curve modeling approach.

    Science.gov (United States)

    Wan, Thomas T H; Wang, Bill B L

    2003-05-01

    This study examines the effects of integration on the performance ratings of the top 100 integrated healthcare networks (IHNs) in the United States. A strategic-contingency theory is used to identify the relationship of IHNs' performance to their structural and operational characteristics and integration strategies. To create a database for the panel study, the top 100 IHNs selected by the SMG Marketing Group in 1998 were followed up in 1999 and 2000. The data were merged with the Dorenfest data on information system integration. A growth curve model was developed and validated by the Mplus statistical program. Factors influencing the top 100 IHNs' performance in 1998 and their subsequent rankings in the consecutive years were analyzed. IHNs' initial performance scores were positively influenced by network size, number of affiliated physicians and profit margin, and were negatively associated with average length of stay and technical efficiency. The continuing high performance, judged by maintaining higher performance scores, tended to be enhanced by the use of more managerial or executive decision-support systems. Future studies should include time-varying operational indicators to serve as predictors of network performance.

  14. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  15. An Integrated Model to Explain How Corporate Social Responsibility Affects Corporate Financial Performance

    Directory of Open Access Journals (Sweden)

    Chin-Shien Lin

    2015-06-01

    Full Text Available The effect of corporate social responsibility (CSR) on financial performance has important implications for enterprises, communities, and countries, and the significance of this issue cannot be ignored. Therefore, this paper proposes an integrated model to explain the influence of CSR on financial performance, with intellectual capital as a mediator and industry type as a moderator. Empirical results indicate that intellectual capital mediates the relationship between CSR and financial performance, and industry type moderates the direct influence of CSR on financial performance. These results have critical implications for both academia and practice.
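
The abstract gives no estimation details. A minimal sketch of the standard product-of-coefficients approach to mediation, on synthetic data with hypothetical effect sizes (none of the numbers come from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
csr = rng.normal(size=n)                        # corporate social responsibility score
ic = 0.6 * csr + rng.normal(scale=0.5, size=n)  # intellectual capital (mediator)
fp = 0.5 * ic + 0.1 * csr + rng.normal(scale=0.5, size=n)  # financial performance

# Path a: CSR -> intellectual capital.
a = np.linalg.lstsq(np.column_stack([np.ones(n), csr]), ic, rcond=None)[0][1]
# Paths b and c': financial performance regressed on mediator and CSR jointly.
coefs = np.linalg.lstsq(np.column_stack([np.ones(n), ic, csr]), fp, rcond=None)[0]
b, c_direct = coefs[1], coefs[2]

indirect_effect = a * b  # portion of CSR's effect carried through the mediator
```

A nonzero `a * b` (in practice, bootstrap-tested) is the usual evidence that the mediator carries part of the effect; moderation by industry type would add an interaction term.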

  16. Implicit Value Updating Explains Transitive Inference Performance: The Betasort Model.

    Directory of Open Access Journals (Sweden)

    Greg Jensen

    Full Text Available Transitive inference (the ability to infer that B > D given that B > C and C > D) is a widespread characteristic of serial learning, observed in dozens of species. Despite these robust behavioral effects, reinforcement learning models reliant on reward prediction error or associative strength routinely fail to perform these inferences. We propose an algorithm called betasort, inspired by cognitive processes, which performs transitive inference at low computational cost. This is accomplished by (1) representing stimulus positions along a unit span using beta distributions, (2) treating positive and negative feedback asymmetrically, and (3) updating the position of every stimulus during every trial, whether that stimulus was visible or not. Performance was compared for rhesus macaques, humans, and the betasort algorithm, as well as Q-learning, an established reward prediction error (RPE) model. Of these, only Q-learning failed to respond above chance during critical test trials. Betasort's success (when compared to RPE models) and its computational efficiency (when compared to full Markov decision process implementations) suggest that the study of reinforcement learning in organisms will be best served by a feature-driven approach to comparing formal models.
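
The three listed ingredients can be sketched in a much-simplified form; the published betasort algorithm has additional bookkeeping, so this is only an illustration of the idea (beta-distributed positions, asymmetric feedback, updates to every stimulus on every trial):

```python
import random

random.seed(7)

class BetaSort:
    """Simplified sketch of a betasort-style learner (not the published algorithm)."""

    def __init__(self, stimuli):
        self.params = {s: [1.0, 1.0] for s in stimuli}  # alpha, beta per stimulus

    def position(self, s):
        a, b = self.params[s]
        return a / (a + b)  # mean of Beta(a, b) on the unit span

    def choose(self, x, y):
        # Thompson-style choice: sample a position for each stimulus.
        sx = random.betavariate(*self.params[x])
        sy = random.betavariate(*self.params[y])
        return x if sx > sy else y

    def feedback(self, chosen, other, correct):
        if correct:
            # Positive feedback: consolidate both estimates (mean-preserving,
            # concentration-increasing) -- every shown stimulus is updated.
            for s in (chosen, other):
                a, b = self.params[s]
                m = a / (a + b)
                self.params[s] = [a + m, b + (1 - m)]
        else:
            # Negative feedback: strong asymmetric shift apart.
            self.params[other][0] += 1.0   # push the correct item up
            self.params[chosen][1] += 1.0  # push the wrongly chosen item down

model = BetaSort("ABCDE")
order = "ABCDE"  # A highest ... E lowest
for _ in range(400):
    i = random.randrange(4)
    hi, lo = order[i], order[i + 1]   # train only on adjacent pairs
    pick = model.choose(hi, lo)
    model.feedback(pick, lo if pick == hi else hi, correct=(pick == hi))
```

After training on adjacent pairs only, the learned positions spread the end items apart, which is what supports above-chance choices on novel pairs such as B vs D.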

  17. Towards Modeling Realistic Mobility for Performance Evaluations in MANET

    Science.gov (United States)

    Aravind, Alex; Tahir, Hassan

    Simulation modeling plays a crucial role in research on complex dynamic systems such as mobile ad hoc networks, and is often the only viable approach. Simulation has been applied successfully in MANET research for more than two decades. Several recent studies observe that the credibility of simulation results in the field has decreased even as the use of simulation has steadily increased. Part of this credibility crisis has been attributed to the simulation of node mobility, which has a fundamental influence on the behavior and performance of mobile ad hoc networks. Accurate modeling and knowledge of node mobility is not only helpful but essential for understanding and interpreting the performance of the system under study. Several ideas, mostly in isolation, have been proposed in the literature to infuse realism into node mobility. In this paper, we attempt a holistic analysis of creating realistic mobility models and then demonstrate the creation and analysis of such models using a software tool we have developed. With this tool, the desired mobility of the nodes in the system can be specified, generated, and analyzed, and the resulting trace can be exported for use in performance studies of proposed algorithms or systems.
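
As a baseline for the kind of synthetic mobility such tools aim to improve on, the widely used random waypoint model can be sketched in a few lines (area, speed range, and pause length are hypothetical parameters):

```python
import random

def random_waypoint_trace(steps, area=(1000.0, 1000.0), speed=(1.0, 5.0),
                          pause=2, seed=42):
    """Generate one node's (x, y) trace under the random waypoint model.

    The node repeatedly picks a uniform destination in the area, moves
    toward it at a uniformly drawn speed, then pauses. Real traces add
    obstacles, group behavior, and non-uniform hotspots, which is exactly
    where simple synthetic models lose realism.
    """
    rng = random.Random(seed)
    x, y = rng.uniform(0, area[0]), rng.uniform(0, area[1])
    trace = [(x, y)]
    while len(trace) < steps:
        dx, dy = rng.uniform(0, area[0]) - x, rng.uniform(0, area[1]) - y
        dist = (dx * dx + dy * dy) ** 0.5
        v = rng.uniform(*speed)
        n = max(1, int(dist / v))          # time steps to reach the waypoint
        for i in range(1, n + 1):
            trace.append((x + dx * i / n, y + dy * i / n))
            if len(trace) == steps:
                return trace
        x, y = trace[-1]
        for _ in range(pause):             # pause at the waypoint
            trace.append((x, y))
            if len(trace) == steps:
                return trace
    return trace

trace = random_waypoint_trace(500)
```

Exporting such a trace (one position per time step) is the typical interchange format for feeding mobility into a network simulator.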

  18. Performance benchmarks for a next generation numerical dynamo model

    Science.gov (United States)

    Matsui, Hiroaki; Heien, Eric; Aubert, Julien; Aurnou, Jonathan M.; Avery, Margaret; Brown, Ben; Buffett, Bruce A.; Busse, Friedrich; Christensen, Ulrich R.; Davies, Christopher J.; Featherstone, Nicholas; Gastine, Thomas; Glatzmaier, Gary A.; Gubbins, David; Guermond, Jean-Luc; Hayashi, Yoshi-Yuki; Hollerbach, Rainer; Hwang, Lorraine J.; Jackson, Andrew; Jones, Chris A.; Jiang, Weiyuan; Kellogg, Louise H.; Kuang, Weijia; Landeau, Maylis; Marti, Philippe; Olson, Peter; Ribeiro, Adolfo; Sasaki, Youhei; Schaeffer, Nathanaël.; Simitev, Radostin D.; Sheyko, Andrey; Silva, Luis; Stanley, Sabine; Takahashi, Futoshi; Takehiro, Shin-ichi; Wicht, Johannes; Willis, Ashley P.

    2016-05-01

    Numerical simulations of the geodynamo have successfully represented many observable characteristics of the geomagnetic field, yielding insight into the fundamental processes that generate magnetic fields in the Earth's core. Because of limited spatial resolution, however, the diffusivities in numerical dynamo models are much larger than those in the Earth's core, and consequently, questions remain about how realistic these models are. The typical strategy used to address this issue has been to continue to increase the resolution of these quasi-laminar models with increasing computational resources, thus pushing them toward more realistic parameter regimes. We assess which methods are most promising for the next generation of supercomputers, which will offer access to O(10^6) processor cores for large problems. Here we report performance and accuracy benchmarks from 15 dynamo codes that employ a range of numerical and parallelization methods. Computational performance is assessed on the basis of weak and strong scaling behavior up to 16,384 processor cores. Extrapolations of our weak-scaling results indicate that dynamo codes that employ two-dimensional or three-dimensional domain decompositions can perform efficiently on up to ~10^6 processor cores, paving the way for more realistic simulations in the next model generation.
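
The scaling metrics behind such a benchmark reduce to simple ratios of wall-clock times; a minimal sketch with hypothetical timings (not taken from the paper) for one code's strong-scaling series:

```python
def strong_scaling_efficiency(cores, runtimes):
    """Speedup and parallel efficiency relative to the smallest core count."""
    c0, t0 = cores[0], runtimes[0]
    speedup = [t0 / t for t in runtimes]
    efficiency = [t0 * c0 / (t * c) for t, c in zip(runtimes, cores)]
    return speedup, efficiency

# Hypothetical wall-clock times (s) per timestep at a fixed problem size.
cores = [1024, 2048, 4096, 8192, 16384]
times = [8.0, 4.1, 2.2, 1.3, 0.9]
speedup, eff = strong_scaling_efficiency(cores, times)
```

Weak scaling is analyzed the same way except the problem size grows with the core count, so ideal efficiency stays at 1 rather than ideal speedup growing linearly.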

  19. Instruction-level performance modeling and characterization of multimedia applications

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Y. [Los Alamos National Lab., NM (United States). Scientific Computing Group; Cameron, K.W. [Louisiana State Univ., Baton Rouge, LA (United States). Dept. of Computer Science

    1999-06-01

    One of the challenges for characterizing and modeling realistic multimedia applications is the lack of access to source code. On-chip performance counters effectively resolve this problem by monitoring run-time behavior at the instruction level. This paper presents a novel technique for characterizing and modeling workloads at the instruction level for realistic multimedia applications using hardware performance counters. A variety of instruction counts are collected from multimedia applications such as RealPlayer, GSM Vocoder, MPEG encoder/decoder, and a speech synthesizer. These instruction counts can be used to form a set of abstract characteristic parameters directly related to a processor's architectural features. Based on microprocessor architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. Meanwhile, the bottleneck estimation can provide suggestions about viable architectural/functional improvements for certain workloads. The biggest advantage of this new characterization technique is a better understanding of processor utilization efficiency and the architectural bottleneck for each application. The technique also provides predictive insight into future architectural enhancements and their effect on current codes. In this paper the authors also attempt to model architectural effects on processor utilization without memory influence. They derive formulas for calculating CPI_0, the CPI without memory effect, and they quantify the utilization of architectural parameters. These equations are architecturally diagnostic and predictive in nature. The results show promise for code characterization and empirical/analytical modeling.
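
The paper's exact counter set and formulas are not given in the abstract; a minimal sketch of splitting measured CPI into a memory-free CPI_0 under a simple additive model (counter names and values are hypothetical) is:

```python
def cpi_decomposition(cycles, instructions, mem_stall_cycles):
    """Split measured CPI into a memory-free CPI_0 and a memory component.

    Assumes the additive model CPI = CPI_0 + CPI_mem, with all three
    quantities derived from hardware counter readings; this is a
    simplification of the paper's formulation.
    """
    cpi = cycles / instructions
    cpi_mem = mem_stall_cycles / instructions
    cpi_0 = cpi - cpi_mem
    return cpi, cpi_0, cpi_mem

# Hypothetical counter readings for one multimedia run.
cpi, cpi_0, cpi_mem = cpi_decomposition(
    cycles=9.6e9, instructions=6.0e9, mem_stall_cycles=2.4e9)
```

Comparing CPI_0 against the core's issue-width limit is what flags whether the bottleneck is architectural (functional units, dependencies) rather than the memory hierarchy.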

  20. Modeling the Performance of Fast Multipole Method on HPC platforms

    KAUST Repository

    Ibeid, Huda

    2012-04-06

    The current trend in high performance computing is pushing towards exascale computing. To achieve this exascale performance, future systems will have between 100 million and 1 billion cores, assuming gigahertz cores. Currently, there are many efforts studying the hardware and software bottlenecks for building an exascale system. It is important to understand and meet these bottlenecks in order to attain 10 PFLOPS performance. On the applications side, there is an urgent need to model application performance and to understand what changes need to be made to ensure continued scalability at this scale. Fast multipole methods (FMM) were originally developed for accelerating N-body problems in particle-based methods. Nowadays, FMM is more than an N-body solver; recent trends in HPC have been to use FMMs in unconventional application areas. FMM is likely to be a main player at exascale due to its hierarchical nature and the techniques used to access the data via a tree structure, which allow many operations to happen simultaneously at each level of the hierarchy. In this thesis, we discuss the challenges for FMM on current parallel computers and future exascale architectures. Furthermore, we develop a novel performance model for FMM. The ultimate aim of this thesis is to ensure the scalability of FMM on future exascale machines.
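
To illustrate why the tree structure matters for such a performance model, a toy operation-count estimate comparing FMM against direct O(N^2) summation can be sketched as follows; all constants (neighbor and interaction-list sizes, the O(p^3) translation cost) are hypothetical stand-ins, not the thesis's model:

```python
def fmm_cost_estimate(n_particles, particles_per_leaf=64, p_terms=10):
    """Rough operation-count model for one FMM evaluation (toy constants).

    Assumes ~27 neighbor leaves for near-field P2P, ~189 interaction-list
    cells per box for far-field M2L, and O(p^3) work per translation.
    """
    n_leaves = max(1, n_particles // particles_per_leaf)
    p2p = 27 * n_leaves * particles_per_leaf ** 2   # near-field direct sums
    n_boxes = 2 * n_leaves                          # whole octree ~ 2x the leaves
    m2l = 189 * n_boxes * p_terms ** 3              # far-field translations
    return p2p + m2l

def direct_cost(n_particles):
    """All-pairs direct summation."""
    return n_particles * n_particles
```

Even this crude model shows the near-linear growth of FMM work, and a real performance model would weight each term by per-level communication and memory costs.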

  1. A model for microbial phosphorus cycling in bioturbated marine sediments: Significance for phosphorus burial in the early Paleozoic

    Science.gov (United States)

    Dale, Andrew W.; Boyle, Richard A.; Lenton, Timothy M.; Ingall, Ellery D.; Wallmann, Klaus

    2016-09-01

    A diagenetic model is used to simulate the diagenesis and burial of particulate organic carbon (Corg) and phosphorus (P) in marine sediments underlying anoxic versus oxic bottom waters. The latter are physically mixed by animals moving through the surface sediment (bioturbation) and ventilated by burrowing, tube-dwelling organisms (bioirrigation). The model is constrained using an empirical database including burial ratios of Corg with respect to organic P (Corg:Porg) and total reactive P (Corg:Preac), burial efficiencies of Corg and Porg, and inorganic carbon-to-phosphorus regeneration ratios. If Porg is preferentially mineralized relative to Corg during aerobic respiration, as many previous studies suggest, then the simulated Porg pool is found to be completely depleted. A modified model that incorporates the redox-dependent microbial synthesis of polyphosphates and Porg (termed the microbial P pump) allows preferential mineralization of the bulk Porg pool relative to Corg during both aerobic and anaerobic respiration and is consistent with the database. Results with this model show that P burial is strongly enhanced in sediments hosting fauna. Animals mix highly labile Porg away from the aerobic sediment layers where mineralization rates are highest, thereby mitigating diffusive PO₄³⁻ fluxes to the bottom water. They also expand the redox niche where microbial P uptake occurs. The model was applied to a hypothetical shelf setting in the early Paleozoic; a time of the first radiation of benthic fauna. Results show that even shallow bioturbation at that time may have had a significant impact on P burial. Our model provides support for a recent study that proposed that faunal radiation in ocean sediments led to enhanced P burial and, possibly, a stabilization of atmospheric O2 levels. The results also help to explain Corg:Porg ratios in the geological record and the persistence of Porg in ancient marine sediments.

  2. Circuit modeling and performance analysis of photoconductive antenna

    Science.gov (United States)

    Prajapati, Jitendra; Bharadwaj, Mrinmoy; Chatterjee, Amitabh; Bhattacharjee, Ratnajit

    2017-07-01

    In recent years, several experimental and simulation studies have been reported on terahertz (THz) generation using a photoconductive antenna (PCA). The major problem with the PCA is its low overall efficiency, which depends on several parameters related to the semiconductor material, the antenna geometry, and the characteristics of the laser beam. To analyze the effect of different parameters on PCA efficiency, accurate circuit modeling, based on the physics at work in the device, is necessary. Although a few equivalent circuit models have been proposed in the literature, these models do not adequately capture the semiconductor physics in the PCA. This paper presents an equivalent electrical circuit model of the PCA incorporating basic semiconductor device physics. The proposed equivalent circuit model is validated using the Sentaurus TCAD device-level modeling tool as well as with experimental results available in the literature. The results obtained from the proposed circuit model are in close agreement with the TCAD results as well as the available experimental results. The proposed circuit model is expected to contribute towards future research efforts aimed at optimizing the performance of the PCA system.

  3. Correlation between human observer performance and model observer performance in differential phase contrast CT

    Energy Technology Data Exchange (ETDEWEB)

    Li, Ke; Garrett, John [Department of Medical Physics, University of Wisconsin-Madison, 1111 Highland Avenue, Madison, Wisconsin 53705 (United States); Chen, Guang-Hong [Department of Medical Physics, University of Wisconsin-Madison, 1111 Highland Avenue, Madison, Wisconsin 53705 and Department of Radiology, University of Wisconsin-Madison, 600 Highland Avenue, Madison, Wisconsin 53792 (United States)

    2013-11-15

    Purpose: With the recently expanding interest and developments in x-ray differential phase contrast CT (DPC-CT), the evaluation of its task-specific detection performance and comparison with the corresponding absorption CT under a given radiation dose constraint become increasingly important. Mathematical model observers are often used to quantify the performance of imaging systems, but their correlations with actual human observers need to be confirmed for each new imaging method. This work is an investigation of the effects of stochastic DPC-CT noise on the correlation of detection performance between model and human observers with signal-known-exactly (SKE) detection tasks. Methods: The detectabilities of different objects (five disks with different diameters and two breast lesion masses) embedded in an experimental DPC-CT noise background were assessed using both model and human observers. The detectability of the disk and lesion signals was then measured using five types of model observers including the prewhitening ideal observer, the nonprewhitening (NPW) observer, the nonprewhitening observer with eye filter and internal noise (NPWEi), the prewhitening observer with eye filter and internal noise (PWEi), and the channelized Hotelling observer (CHO). The same objects were also evaluated by four human observers using the two-alternative forced choice method. The results from the model observer experiment were quantitatively compared to the human observer results to assess the correlation between the two techniques. Results: The contrast-to-detail (CD) curve generated by the human observers for the disk-detection experiments shows that the required contrast to detect a disk is inversely proportional to the square root of the disk size. Based on the CD curves, the ideal and NPW observers tend to systematically overestimate the performance of the human observers. 
The NPWEi and PWEi observers did not predict human performance well either, as the slopes of their CD
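
The reported inverse-square-root relation between required contrast and disk size can be fit directly from CD-curve data; a small sketch with hypothetical observer thresholds (not the paper's measurements):

```python
def fit_cd_constant(diameters, thresholds):
    """Least-squares fit of k in c = k / sqrt(d) for a contrast-detail curve.

    Minimizing sum_i (c_i - k / sqrt(d_i))^2 over k gives the closed form
    k = sum(c_i / sqrt(d_i)) / sum(1 / d_i).
    """
    num = sum(c / d ** 0.5 for c, d in zip(thresholds, diameters))
    den = sum(1.0 / d for d in diameters)
    return num / den

# Hypothetical human-observer contrast thresholds vs disk diameter (mm).
diam = [1.0, 2.0, 4.0, 8.0, 16.0]
thr = [0.40, 0.28, 0.20, 0.14, 0.10]
k = fit_cd_constant(diam, thr)
```

Comparing the fitted k (and the curve's slope on log-log axes) between human and model observers is one compact way to quantify how well a model observer tracks human performance.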

  4. Using everyday technology to compensate for difficulties in task performance in daily life: experiences in persons with acquired brain injury and their significant others.

    Science.gov (United States)

    Larsson Lund, Maria; Lövgren-Engström, Ann-Louice; Lexell, Jan

    2011-01-01

    PURPOSE. The purpose of this study is to illuminate how persons with acquired brain injury (ABI) and their significant others experienced individualised occupation-based interventions using commonly available everyday technology (ET) to compensate for perceived difficulties with performance of tasks in daily life. METHOD. Qualitative research interviews were conducted with 10 persons with ABI and with one of their significant others. The data were analysed according to qualitative content analysis. RESULTS. The persons with ABI experienced that they mastered their lives in a better way by the compensatory use of ET. They became capable of doing tasks independently and experienced themselves as being a new person. During the intervention process, persons with ABI became aware of the compensatory potential of familiar ET, and they were supported to use effective compensatory strategies and incorporate them into their habits. Their significant others felt a relief in daily life, and their mood was positively affected as they experienced reduced responsibility and need of control. CONCLUSIONS. This qualitative study has shown that persons with ABI, as well as their significant others, experienced a multitude of benefits from occupation-based interventions using commonly available ET to compensate for their difficulties in the performance of tasks in daily life and that the goals achieved affected their overall contentment with life.

  5. Stochastic Modeling and Performance Analysis of Multimedia SoCs

    DEFF Research Database (Denmark)

    Raman, Balaji; Nouri, Ayoub; Gangadharan, Deepak

    2013-01-01

    decoder. The results show that, for our stochastic design metric, the analytical framework provides upper bounds (and is relatively accurate) compared to the statistical model checking technique. We also observed a significant reduction in resource usage (such as output buffer size) with tolerable loss in output...

  6. Compact models and performance investigations for subthreshold interconnects

    CERN Document Server

    Dhiman, Rohit

    2014-01-01

    The book provides a detailed analysis of issues related to sub-threshold interconnect performance from the perspective of analytical approaches and design techniques. Particular emphasis is laid on the performance analysis of coupling noise and variability issues in the sub-threshold domain in order to develop efficient compact models. The proposed analytical approach gives physical insight into the parameters affecting the transient behavior of coupled interconnects. Remedial design techniques are also suggested to mitigate the effect of coupling noise. The effects of wire width, spacing between the wires, wi

  7. 3D Massive MIMO Systems: Channel Modeling and Performance Analysis

    KAUST Repository

    Nadeem, Qurrat-Ul-Ain

    2015-03-01

    Multiple-input-multiple-output (MIMO) systems of current LTE releases are capable of adaptation in the azimuth only. More recently, the trend is to enhance system performance by exploiting the channel's degrees of freedom in the elevation through dynamic adaptation of the vertical antenna beam pattern. This necessitates the derivation and characterization of three-dimensional (3D) channels. Over the years, channel models have evolved to address the challenges of wireless communication technologies. In parallel to theoretical studies on channel modeling, many standardized channel models, such as COST-based models, 3GPP SCM, WINNER, and ITU, have emerged that act as references for industries and telecommunication companies to assess system-level and link-level performance of advanced signal processing techniques over realistic channels. Given that the existing channel models are only two-dimensional (2D) in nature, a large effort in channel modeling is needed to study the impact of the channel component in the elevation direction. The first part of this work sheds light on the current 3GPP activity around 3D channel modeling and beamforming, an aspect that to our knowledge has not been extensively covered by a research publication. The standardized MIMO channel model is presented, which incorporates both the propagation effects of the environment and the radio effects of the antennas. In order to facilitate future studies on the use of 3D beamforming, the main features of the proposed 3D channel model are discussed. A brief overview of the future 3GPP 3D channel model being outlined for the next generation of wireless networks is also provided. In the subsequent part of this work, we present an information-theoretic channel model for MIMO systems that supports the elevation dimension. 
The model is based on the principle of maximum entropy, which enables us to determine the distribution of the channel matrix consistent with the prior information on the angles of departure and

  8. Ecosystem performance monitoring of rangelands by integrating modeling and remote sensing

    Science.gov (United States)

    Wylie, Bruce K.; Boyte, Stephen P.; Major, Donald J.

    2012-01-01

    Monitoring rangeland ecosystem dynamics, production, and performance is valuable for researchers and land managers. However, ecosystem monitoring studies can be difficult to interpret and apply appropriately if management decisions and disturbances are inseparable from the ecosystem's climate signal. This study separates seasonal weather influences from influences caused by disturbances and management decisions, making interannual time-series analysis more consistent and interpretable. We compared the actual ecosystem performance (AEP) of five rangeland vegetation types in the Owyhee Uplands for 9 yr to their expected ecosystem performance (EEP). Integrated growing season Normalized Difference Vegetation Index data for each of the nine growing seasons served as a proxy for annual AEP. Regression-tree models used long-term site potential, seasonal weather, and land cover data sets to generate annual EEP, an estimate of ecosystem performance incorporating annual weather variations. The difference between AEP and EEP provided a performance measure for each pixel in the study area. Ecosystem performance anomalies occurred when the ecosystem performed significantly better or worse than the model predicted. About 14% of the Owyhee Uplands showed a trend of significant underperformance or overperformance (P<0.10). Land managers can use results from weather-based rangeland ecosystem performance models to help support adaptive management strategies.
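
The core of the approach is the per-pixel difference between actual and expected ecosystem performance; a minimal sketch, with a fixed cutoff standing in for the study's significance test and hypothetical NDVI-proxy values:

```python
def performance_anomalies(aep, eep, threshold=0.1):
    """Classify each pixel by its actual-minus-expected performance.

    'threshold' is a hypothetical fixed anomaly cutoff standing in for the
    study's statistical test (P < 0.10); the sign convention follows the
    text: AEP above EEP is overperformance, below is underperformance.
    """
    labels = []
    for a, e in zip(aep, eep):
        diff = a - e
        if diff > threshold:
            labels.append("over")
        elif diff < -threshold:
            labels.append("under")
        else:
            labels.append("expected")
    return labels

# Hypothetical integrated-growing-season NDVI proxies for five pixels.
aep = [0.52, 0.31, 0.44, 0.60, 0.25]
eep = [0.50, 0.45, 0.42, 0.48, 0.27]
labels = performance_anomalies(aep, eep)
```

Because EEP already absorbs seasonal weather via the regression-tree model, the residual labels can be read as management or disturbance effects rather than climate signal.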

  9. Employing Second-Order Circular Suprasegmental Hidden Markov Models to Enhance Speaker Identification Performance in Shouted Talking Environments

    OpenAIRE

    Ismail Shahin

    2010-01-01

    Speaker identification performance is almost perfect in neutral talking environments. However, performance deteriorates significantly in shouted talking environments. This work is devoted to proposing, implementing, and evaluating new models called Second-Order Circular Suprasegmental Hidden Markov Models (CSPHMM2s) to alleviate the deteriorated performance in shouted talking environments. These proposed models possess the characteristics of both Circular Suprasegmental Hidden Mark...

  10. Key performance indicators in hospital based on balanced scorecard model

    Directory of Open Access Journals (Sweden)

    Hamed Rahimi

    2017-01-01

    Introduction: Performance measurement is receiving increasing attention all over the world. Nowadays in many organizations, irrespective of their type or size, performance evaluation is the main concern and a key issue for top administrators. The purpose of this study is to organize suitable key performance indicators (KPIs) for hospitals' performance evaluation based on the balanced scorecard (BSC). Method: This is a mixed method study. In order to identify the hospital performance indicators (HPIs), related literature was first reviewed and then an experts' panel and the Delphi method were used. In this study, two rounds were needed to reach the desired level of consensus. The experts rated the importance of the indicators on a five-point Likert scale. In the consensus calculation, the consensus percentage was calculated by classifying the values 1-3 as not important (0) and 4-5 as important (1). The simple additive weighting technique was used to rank the indicators and select the hospitals' KPIs. The data were analyzed with Excel 2010. Results: A total of 218 indicators were obtained from a review of the selected literature. Through an internal expert panel, 77 indicators were selected. Finally, 22 were selected as KPIs for hospitals: ten indicators in the internal process perspective, and 5, 4, and 3 indicators in the finance, learning and growth, and customer perspectives, respectively. Conclusion: This model can be a useful tool for evaluating and comparing the performance of hospitals. However, the model is flexible and can be adjusted according to differences in the target hospitals. This study can be beneficial for hospital administrators and can help them change their perspective on performance evaluation.
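
    The two scoring steps described in the Method section can be sketched as follows: (1) the Delphi consensus percentage with Likert ratings 1-3 coded 0 (not important) and 4-5 coded 1 (important), and (2) simple additive weighting (SAW) to rank indicators. Indicator names, ratings, and criteria below are invented for illustration.

```python
# Sketch of the Delphi consensus calculation and SAW ranking described above.
# All names and numbers are illustrative.

def consensus_percentage(ratings):
    """Share of experts rating the indicator 4 or 5 on a 5-point scale."""
    important = [1 if r >= 4 else 0 for r in ratings]
    return 100.0 * sum(important) / len(important)

def saw_rank(scores):
    """Simple additive weighting: normalise each criterion by its column
    maximum, sum the rows, and rank indicators from best to worst."""
    n_criteria = len(next(iter(scores.values())))
    col_max = [max(row[j] for row in scores.values()) for j in range(n_criteria)]
    totals = {name: sum(v / m for v, m in zip(row, col_max))
              for name, row in scores.items()}
    return sorted(totals, key=totals.get, reverse=True)

panel = {"bed occupancy rate": [5, 4, 4, 5, 3],
         "average length of stay": [4, 4, 3, 3, 2],
         "staff turnover": [3, 2, 3, 2, 4]}
for name, ratings in panel.items():
    print(name, consensus_percentage(ratings))

# SAW over two hypothetical criteria (mean importance, feasibility)
print(saw_rank({"bed occupancy rate": [4.6, 4.0],
                "average length of stay": [3.2, 4.4],
                "staff turnover": [2.8, 3.0]}))
```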

  11. Evaluation of the performance of DIAS ionospheric forecasting models

    Directory of Open Access Journals (Sweden)

    Tsagouri Ioanna

    2011-08-01

    Nowcasting and forecasting ionospheric products and services for the European region have been provided regularly since August 2006 through the European Digital upper Atmosphere Server (DIAS, http://dias.space.noa.gr). Currently, DIAS ionospheric forecasts are based on the online implementation of two models: (i) the solar wind driven autoregression model for ionospheric short-term forecast (SWIF), which combines historical and real-time ionospheric observations with solar-wind parameters obtained in real time at the L1 point from the NASA ACE spacecraft, and (ii) the geomagnetically correlated autoregression model (GCAM), which is a time series forecasting method driven by a synthetic geomagnetic index. In this paper we investigate the operational ability and the accuracy of both DIAS models by carrying out a metrics-based evaluation of their performance under all possible conditions. The analysis was based on a systematic comparison between the models' predictions and actual observations obtained over almost one solar cycle (1998-2007) at four European ionospheric locations (Athens, Chilton, Juliusruh and Rome), and on a comparison of the models' performance against two simple prediction strategies, median- and persistence-based predictions, during storm conditions. The results verify the operational validity of both models and quantify their prediction accuracy under all possible conditions, in support of operational applications as well as comparative studies assessing or expanding current ionospheric forecasting capabilities.
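
    The baseline comparison described above can be sketched with an RMSE-based skill score against persistence and median forecasts. The data values below are synthetic foF2-like numbers, purely illustrative; the actual DIAS evaluation uses its own metrics over a full solar cycle.

```python
# Minimal sketch of model-vs-baseline skill: fractional RMSE improvement over
# persistence- and median-based predictions. Data are synthetic.
import math

def rmse(pred, obs):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def skill(model_pred, baseline_pred, obs):
    """Fractional RMSE improvement over a baseline (1 = perfect, 0 = no gain)."""
    return 1.0 - rmse(model_pred, obs) / rmse(baseline_pred, obs)

obs =   [6.1, 6.4, 5.9, 5.2, 4.8, 5.5, 6.0]
model = [6.0, 6.3, 6.0, 5.4, 4.9, 5.4, 5.9]

persistence = [obs[0]] + obs[:-1]                      # previous value as forecast
median_clim = [sorted(obs)[len(obs) // 2]] * len(obs)  # median as forecast

print("skill vs persistence:", round(skill(model, persistence, obs), 3))
print("skill vs median:", round(skill(model, median_clim, obs), 3))
```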

  12. Lightweight ZERODUR: Validation of Mirror Performance and Mirror Modeling Predictions

    Science.gov (United States)

    Hull, Tony; Stahl, H. Philip; Westerhoff, Thomas; Valente, Martin; Brooks, Thomas; Eng, Ron

    2017-01-01

    Upcoming spaceborne missions, both moderate and large in scale, require extreme dimensional stability while relying both upon established lightweight mirror materials and upon accurate modeling methods to predict performance under varying boundary conditions. We describe tests, recently performed at NASA's XRCF chambers and laboratories in Huntsville, Alabama, during which a 1.2 m diameter, f/1.29, 88% lightweighted SCHOTT ZERODUR(TradeMark) mirror was tested for thermal stability under static loads in steps down to 230 K. Test results are compared to model predictions based upon recently published data on ZERODUR(TradeMark). In addition to monitoring the mirror surface for thermal perturbations in XRCF thermal vacuum tests, static load gravity deformations have been measured and compared to model predictions, and the modal response (dynamic disturbance) was measured and compared to the model. We discuss the fabrication approach and optomechanical design of the ZERODUR(TradeMark) mirror substrate by SCHOTT and its optical preparation for test by Arizona Optical Systems (AOS), and summarize the outcome of NASA's XRCF tests and model validations.

  13. Performance Analysis of Transposition Models Simulating Solar Radiation on Inclined Surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Xie, Yu; Sengupta, Manajit

    2016-06-02

    Transposition models have been widely used in the solar energy industry to simulate solar radiation on inclined photovoltaic panels. Following numerous studies comparing the performance of transposition models, this work aims to understand the quantitative uncertainty in state-of-the-art transposition models and the sources of that uncertainty. Our results show significant differences between two widely used isotropic transposition models, with one substantially underestimating the diffuse plane-of-array irradiance when diffuse radiation is perfectly isotropic. In the empirical transposition models, the selection of the empirical coefficients and the land surface albedo can both result in uncertainty in the output. This study can be used as a guide for the future development of physics-based transposition models and evaluations of system performance.
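
    An isotropic transposition model of the kind compared in such studies can be sketched as follows: plane-of-array (POA) irradiance is the sum of direct, sky-diffuse, and ground-reflected components on a panel tilted at angle beta, with the isotropic-sky assumption giving the (1 + cos(beta))/2 view factor. The input values are illustrative, not from the study.

```python
# Sketch of an isotropic transposition model: POA irradiance from direct,
# sky-diffuse, and ground-reflected components. Inputs are illustrative.
import math

def poa_isotropic(dni, dhi, ghi, tilt_deg, aoi_deg, albedo=0.2):
    """POA irradiance (W/m^2) under the isotropic-sky assumption."""
    beta = math.radians(tilt_deg)
    beam = dni * max(math.cos(math.radians(aoi_deg)), 0.0)   # direct on panel
    sky = dhi * (1 + math.cos(beta)) / 2                     # isotropic diffuse
    ground = ghi * albedo * (1 - math.cos(beta)) / 2         # ground-reflected
    return beam + sky + ground

# Panel tilted 30 deg, sun at 20 deg incidence on the panel
print(round(poa_isotropic(dni=800, dhi=120, ghi=700, tilt_deg=30, aoi_deg=20), 1))
```

    Empirical (anisotropic) models differ mainly in how the sky-diffuse term is split into circumsolar, horizon-brightening, and isotropic parts, which is where the coefficient and albedo uncertainties discussed above enter.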

  14. Significant improvement of olfactory performance in sleep apnea patients after three months of nasal CPAP therapy – Observational study and randomized trial

    Science.gov (United States)

    Boerner, Bettina; Tini, Gabrielo M.; Fachinger, Patrick; Graber, Sereina M.; Irani, Sarosh

    2017-01-01

    Objectives The olfactory function highly impacts quality of life (QoL). Continuous positive airway pressure is an effective treatment for obstructive sleep apnea (OSA) and is often applied by nasal masks (nCPAP). The influence of nCPAP on the olfactory performance of OSA patients is unknown. The aim of this study was to assess the sense of smell before initiation of nCPAP and after three months of treatment, in moderate and severe OSA patients. Methods The sense of smell was assessed in 35 patients suffering from daytime sleepiness and moderate to severe OSA (apnea/hypopnea index ≥ 15/h), with the aid of a validated test battery (Sniffin’ Sticks) before initiation of nCPAP therapy and after three months of treatment. Additionally, adherent subjects were included in a double-blind randomized three-week CPAP-withdrawal trial (sub-therapeutic CPAP pressure). Results Twenty-five of the 35 patients used the nCPAP therapy for more than four hours per night, and for more than 70% of nights (adherent group). The olfactory performance of these patients improved significantly (p = 0.007) after three months of nCPAP therapy. When considering the entire group of patients, olfaction also improved significantly (p = 0.001). In the randomized phase, the sense of smell of six patients deteriorated under sub-therapeutic CPAP pressure (p = 0.046), whereas five patients in the maintenance CPAP group showed no significant difference (p = 0.501). Conclusions Olfactory performance improved significantly after three months of nCPAP therapy in patients suffering from moderate and severe OSA. It seems that this effect of nCPAP is reversible under sub-therapeutic CPAP pressure. Trial registration ISRCTN11128866 PMID:28158212

  15. From Performance Measurement to Strategic Management Model: Balanced Scorecard

    Directory of Open Access Journals (Sweden)

    Cihat Savsar

    2015-03-01

    Abstract: In today's competitive markets, one of the main conditions for the survival of enterprises is an effective performance management system. Decisions must be taken by management according to the performance of assets. In the transition from the industrial society to the information society, the structure of businesses has changed and the value of non-financial assets has increased. As a result, systems have emerged that are based on intangible assets and their measurement rather than on tangible assets. With economic and technological development, one-dimensional evaluation of a business is no longer sufficient. Performance evaluation methods can be applied in a business with an integrated approach through their accordance with business strategy, their link to the reward system, and the cause-effect links established between performance measures. The balanced scorecard is one of the most commonly used measurement methods. While it was first used in 1992 as a performance measurement tool, today it also serves as a strategic management model beyond its conventional uses. The BSC contains a customer perspective, an internal perspective, and a learning and growth perspective in addition to the financial perspective, with the learning and growth perspective being a determinant of the others. The model emphasizes what must be accomplished in the other dimensions in order to achieve the objectives set out in the financial perspective, and strategy maps describe how specified goals are to be achieved by establishing causal links between performance measures and targets.

  16. Performance model for Micro Tunnelling Boring Machines (MTBM)

    Directory of Open Access Journals (Sweden)

    J. Gallo

    2017-06-01

    Since the last decades of the 20th century, various formulae have been proposed to estimate the tunnelling performance of disc cutters, mainly employed in Tunnelling Boring Machines (TBM). Nevertheless, their suitability has not been verified for Micro Tunnelling Boring Machines (MTBM), with smaller excavation diameters, between 1,000 and 2,500 mm, and smaller cutter tools, where parameters like joint spacing may have a different influence. This paper analyzes the models proposed for TBM. After having observed very low correlation with data obtained in 15 microtunnels, a new performance model is developed, adapted to the geomechanical data available in this type of works. Moreover, a method is proposed to calculate the total number of hours necessary to carry out a microtunnel, including all tasks of the excavation cycle as well as installation and uninstallation.

  17. Model for magnetostrictive performance in soft/hard coupled bilayers

    Energy Technology Data Exchange (ETDEWEB)

    Jianjun, Li, E-mail: ljj8081@gmail.com [National Key Laboratory of Science and Technology on Advanced Composites in Special Environments, Harbin Institute of Technology, Harbin 150080 (China); Laboratoire de Magnétisme de Bretagne, Université de Bretagne Occidentale, 29238 Brest Cedex 3 (France); Beibei, Duan; Minglun, Li [National Key Laboratory of Science and Technology on Advanced Composites in Special Environments, Harbin Institute of Technology, Harbin 150080 (China)

    2015-11-01

    A model is set up to investigate the magnetostrictive performance and spin response in soft/hard magnetostrictive coupled bilayers. Direct coupling between soft ferromagnet and hard TbFe{sub 2} at the interface is assumed. The magnetostriction results from the rotation of ferromagnetic vector and TbFe{sub 2} vectors from the easy axis driven by applied magnetic field. Dependence of magnetostriction on TbFe{sub 2} layer thickness and interfacial exchange interaction is studied. The simulated results reveal the compromise between interfacial exchange interaction and anisotropy of TbFe{sub 2} hard layer. - Highlights: • A model for magnetostrictive performance in soft/hard coupled bilayers. • Simulated magnetostriction loop and corresponding spin response. • Competition and compromise between interfacial interaction and TbFe{sub 2} anisotropy. • Dependence of saturated magnetostriction on different parameters.
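
    As a hedged illustration of the kind of energy balance such a model minimizes (the paper's exact formulation is not reproduced here; the symbols and terms below are assumptions), a Stoner-Wohlfarth-type free energy for the coupled bilayer can be written as:

```latex
E(\theta_s, \theta_h) = K_s \sin^2\theta_s + K_h \sin^2\theta_h
    - \mu_0 M_s H \cos(\theta_H - \theta_s)
    - \mu_0 M_h H \cos(\theta_H - \theta_h)
    - J \cos(\theta_s - \theta_h),
```

    where \theta_s and \theta_h are the magnetization angles of the soft and hard (TbFe2) layers measured from the easy axis, K_s and K_h are their anisotropy constants, H is the applied field at angle \theta_H, and J is the interfacial exchange coupling. Minimizing E with respect to both angles at each field step yields the vector rotations that drive the magnetostriction loop, and the competition between interfacial exchange and hard-layer anisotropy noted in the highlights appears directly as the trade-off between the last term and K_h \sin^2\theta_h.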

  18. Does model performance improve with complexity? A case study with three hydrological models

    Science.gov (United States)

    Orth, Rene; Staudinger, Maria; Seneviratne, Sonia I.; Seibert, Jan; Zappa, Massimiliano

    2015-04-01

    In recent decades considerable progress has been made in climate model development. Following the massive increase in computational power, models became more sophisticated. At the same time also simple conceptual models have advanced. In this study we validate and compare three hydrological models of different complexity to investigate whether their performance varies accordingly. For this purpose we use runoff and also soil moisture measurements, which allow a truly independent validation, from several sites across Switzerland. The models are calibrated in similar ways with the same runoff data. Our results show that the more complex models HBV and PREVAH outperform the simple water balance model (SWBM) in case of runoff but not for soil moisture. Furthermore the most sophisticated PREVAH model shows an added value compared to the HBV model only in case of soil moisture. Focusing on extreme events we find generally improved performance of the SWBM during drought conditions and degraded agreement with observations during wet extremes. For the more complex models we find the opposite behavior, probably because they were primarily developed for prediction of runoff extremes. As expected given their complexity, HBV and PREVAH have more problems with over-fitting. All models show a tendency towards better performance in lower altitudes as opposed to (pre-) alpine sites. The results vary considerably across the investigated sites. In contrast, the different metrics we consider to estimate the agreement between models and observations lead to similar conclusions, indicating that the performance of the considered models is similar at different time scales as well as for anomalies and long-term means. We conclude that added complexity does not necessarily lead to improved performance of hydrological models, and that performance can vary greatly depending on the considered hydrological variable (e.g. runoff vs. soil moisture) or hydrological conditions (floods vs. droughts).
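
    A standard agreement metric for this kind of runoff and soil-moisture validation is the Nash-Sutcliffe efficiency (NSE), where 1 is perfect agreement and values below 0 mean the model performs worse than simply predicting the observed mean. The series below are synthetic, purely to show the computation; the study's own metric choices are not reproduced here.

```python
# Sketch of the Nash-Sutcliffe efficiency (NSE) used to compare simulated and
# observed series. Data are synthetic.

def nse(sim, obs):
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

obs = [1.2, 1.5, 2.8, 4.0, 2.1, 1.3, 1.1]   # observed runoff (mm/day)
sim = [1.0, 1.6, 2.5, 3.6, 2.4, 1.4, 1.0]   # simulated runoff

print(round(nse(sim, obs), 3))
```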

  19. Thermal performance modeling of cross-flow heat exchangers

    CERN Document Server

    Cabezas-Gómez, Luben; Saíz-Jabardo, José Maria

    2014-01-01

    This monograph introduces a numerical computational methodology for thermal performance modeling of cross-flow heat exchangers, with applications in chemical, refrigeration and automobile industries. This methodology allows obtaining effectiveness-number of transfer units (e-NTU) data and has been used for simulating several standard and complex flow arrangements configurations of cross-flow heat exchangers. Simulated results have been validated through comparisons with results from available exact and approximate analytical solutions. Very accurate results have been obtained over wide ranges
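
    The effectiveness-NTU quantities the monograph tabulates can be illustrated with the well-known approximate closed-form relation for a cross-flow exchanger with both fluids unmixed; the monograph's numerical methodology covers arrangements for which no such closed form exists. Inputs below are illustrative.

```python
# Sketch of the e-NTU relation for a cross-flow heat exchanger, both fluids
# unmixed (standard approximate formula; 0 < cr <= 1). Inputs illustrative.
import math

def effectiveness_crossflow_unmixed(ntu, cr):
    """Approximate effectiveness as a function of NTU and capacity ratio."""
    return 1.0 - math.exp((ntu ** 0.22 / cr) * (math.exp(-cr * ntu ** 0.78) - 1.0))

for ntu in (0.5, 1.0, 2.0, 5.0):
    print(ntu, round(effectiveness_crossflow_unmixed(ntu, cr=1.0), 3))
```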

  20. Towards an Improved Performance Measure for Language Models

    CERN Document Server

    Ueberla, J P

    1997-01-01

    In this paper, a first attempt at deriving an improved performance measure for language models, the probability ratio measure (PRM), is described. In a proof-of-concept experiment, it is shown that PRM correlates better with recognition accuracy and can lead to better recognition results when used as the optimisation criterion of a clustering algorithm. In spite of the approximations and limitations of this preliminary work, the results are very encouraging and should justify more work along the same lines.
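
    For context, the conventional performance measure that proposals like PRM aim to improve on is perplexity: lower perplexity means the model assigns higher probability to the test text, but it correlates only loosely with recognition accuracy. The tiny "model" below is just a per-word probability table; all numbers are invented.

```python
# Sketch of the conventional language-model measure (perplexity) that PRM is
# positioned against. All probabilities are invented.
import math

def perplexity(probs):
    """Perplexity of a sequence given the model's per-word probabilities."""
    log_sum = sum(math.log2(p) for p in probs)
    return 2 ** (-log_sum / len(probs))

# Probabilities a toy model assigns to each word of a 5-word test sentence
print(round(perplexity([0.2, 0.1, 0.25, 0.05, 0.4]), 2))
# A uniform model over a 1000-word vocabulary has perplexity 1000
print(round(perplexity([1 / 1000] * 5), 1))
```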

  1. A Fuzzy Knowledge Representation Model for Student Performance Assessment

    DEFF Research Database (Denmark)

    Badie, Farshad

    Knowledge representation models based on Fuzzy Description Logics (DLs) can provide a foundation for reasoning in intelligent learning environments. While basic DLs are suitable for expressing crisp concepts and binary relationships, Fuzzy DLs are capable of processing degrees of truth/completeness about vague or imprecise information. This paper tackles the issue of representing fuzzy classes using OWL2 in a dataset describing Performance Assessment Results of Students (PARS).

  2. Human Engineering Modeling and Performance Lab Study Project

    Science.gov (United States)

    Oliva-Buisson, Yvette J.

    2014-01-01

    The HEMAP (Human Engineering Modeling and Performance) Lab is a joint effort between the Industrial and Human Engineering group and the KAVE (Kennedy Advanced Visualizations Environment) group. The lab consists of a sixteen-camera system that is used to capture human motions and operational tasks through the use of a Velcro suit equipped with sensors, and then to simulate these tasks in an ergonomic software package known as Jack. The Jack software is able to identify potential risk hazards.

  3. Modeling and Simulation of Ceramic Arrays to Improve Ballistic Performance

    Science.gov (United States)

    2014-04-30

    Develop modeling and simulation tools, using depth of penetration (DOP) testing against 7.62 mm APM2 as the performance metric. Evaluate SiC tile on aluminum with material properties from the literature, and develop seam designs to improve ballistic performance, demonstrated with DOP experiments. Keywords: Al 5083, SiC, DOP experiments, AutoDyn.

  4. Towards Accreditation of Diagnostic Models for Improved Performance

    Science.gov (United States)

    2004-10-02

    Secondly, while performing testability analysis, the diagnostic algorithm itself is typically not included when assessing the diagnosis (Sheppard & Simpson, 1998). Considering these factors, the Interactive Diagnostic Modeling Evaluator (i-DME) (Kodali, Robinson, et al.) is proposed, allowing requirements set earlier to be revised to suit practical compulsions. This may lead to changing the basic principles and to refining the existing methods continuously.

  5. Model for the Analysis of the Company Performance

    Directory of Open Access Journals (Sweden)

    Mădălina DUMBRAVĂ

    2010-08-01

    The analysis of the performance of a firm (company) has a determinant role in setting the strategy to follow, and it is all the more necessary during a period of economic and financial crisis. In what follows, I have performed the analysis, based on balance sheet data, for SC DELTA SRL, using a system of indicators that have relevance and whose interpretation allows certain conclusions to be drawn, on the basis of which future development can be forecasted. I have tried to use a number of indicators, viewed as a system, which would define, in the end, a model for company performance analysis. The research focused on applying the system of indicators to the data from the balance sheet of SC DELTA SRL.

  6. Model helicopter performance degradation with simulated ice shapes

    Science.gov (United States)

    Tinetti, Ana F.; Korkan, Kenneth D.

    1987-01-01

    An experimental program using a commercially available model helicopter has been conducted in the Texas A&M University Subsonic Wind Tunnel to investigate main rotor performance degradation due to generic ice. The simulated ice, including both primary and secondary formations, was scaled by chord from previously documented artificial ice accretions. Base and iced performance data were gathered as functions of fuselage incidence, blade collective pitch, main rotor rotational velocity, and freestream velocity. It was observed that the presence of simulated ice tends to decrease the lift to equivalent drag ratio, as well as thrust coefficient for the range of velocity ratios tested. Also, increases in torque coefficient due to the generic ice formations were observed. Evaluation of the data has indicated that the addition of roughness due to secondary ice formations is crucial for proper evaluation of the degradation in main rotor performance.

  7. A performance measurement using balanced scorecard and structural equation modeling

    Directory of Open Access Journals (Sweden)

    Rosha Makvandi

    2014-02-01

    During the past few years, the balanced scorecard (BSC) has been widely used as a promising method for performance measurement. The BSC studies organizations in terms of four perspectives: customer, internal processes, learning and growth, and financial. This paper presents a hybrid of the BSC and structural equation modeling (SEM) to measure the performance of an Iranian university in the province of Alborz, Iran. The proposed study uses this conceptual method, designs a questionnaire, and distributes it among university students and professors. Using the SEM technique, the survey analyzes the data, and the results indicate that the university performed poorly in terms of all four perspectives. The survey also identifies the attributes necessary for performance improvement and extracts the corresponding improvement targets.

  8. The application of DEA model in enterprise environmental performance auditing

    Science.gov (United States)

    Li, F.; Zhu, L. Y.; Zhang, J. D.; Liu, C. Y.; Qu, Z. G.; Xiao, M. S.

    2017-01-01

    As a part of society, enterprises have an inescapable responsibility for environmental protection and governance. This article discusses the feasibility and necessity of enterprise environmental performance auditing and uses a DEA model to calculate the environmental performance of Haier as an example. Most of the reference data are selected and sorted from Haier's environmental reports published in 2008, 2009, 2011 and 2015, with some data taken from published articles and fieldwork. All results are calculated with the DEAP software and have high credibility. The analysis results of this article can give corporate managers an idea of how to use environmental performance auditing to adjust their corporate environmental investment quotas and change their companies' environmental strategies.
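
    The DEA idea can be sketched in its simplest form: with one input and one output per decision-making unit (DMU), CCR efficiency reduces to each unit's output/input ratio scaled by the best ratio. Real DEA (as in the DEAP software) solves a linear program per DMU over multiple inputs and outputs; the yearly figures below are invented, not Haier's actual report data.

```python
# Minimal single-input, single-output DEA sketch: efficiency relative to the
# best output/input ratio. All figures are hypothetical.

def dea_single(inputs, outputs):
    """Efficiency of each DMU relative to the best output/input ratio."""
    ratios = {k: outputs[k] / inputs[k] for k in inputs}
    best = max(ratios.values())
    return {k: r / best for k, r in ratios.items()}

# DMUs = reporting years; input = environmental spend, output = an
# emissions-reduction index (both hypothetical)
env_spend = {"2008": 10.0, "2009": 12.0, "2011": 9.0, "2015": 15.0}
reduction = {"2008": 6.0, "2009": 9.0, "2011": 7.2, "2015": 12.0}

for year, eff in dea_single(env_spend, reduction).items():
    print(year, round(eff, 3))
```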

  9. Advanced transport systems analysis, modeling, and evaluation of performances

    CERN Document Server

    Janić, Milan

    2014-01-01

    This book provides a systematic analysis, modeling and evaluation of the performance of advanced transport systems. It offers an innovative approach by presenting a multidimensional examination of the performance of advanced transport systems and transport modes, useful for both theoretical and practical purposes. Advanced transport systems for the twenty-first century are characterized by the superiority of one or several of their infrastructural, technical/technological, operational, economic, environmental, social, and policy performances as compared to their conventional counterparts. The advanced transport systems considered include: Bus Rapid Transit (BRT) and Personal Rapid Transit (PRT) systems in urban area(s), electric and fuel cell passenger cars, high speed tilting trains, High Speed Rail (HSR), Trans Rapid Maglev (TRM), Evacuated Tube Transport system (ETT), advanced commercial subsonic and Supersonic Transport Aircraft (STA), conventionally- and Liquid Hydrogen (LH2)-fuelled commercial air trans...

  11. 3D Massive MIMO Systems: Modeling and Performance Analysis

    KAUST Repository

    Nadeem, Qurrat-Ul-Ain

    2015-07-30

    Multiple-input-multiple-output (MIMO) systems of current LTE releases are capable of adaptation in the azimuth only. Recently, the trend is to enhance system performance by exploiting the channel's degrees of freedom in the elevation, which necessitates the characterization of 3D channels. We present an information-theoretic channel model for MIMO systems that supports the elevation dimension. The model is based on the principle of maximum entropy, which enables us to determine the distribution of the channel matrix consistent with the prior information on the angles. Based on this model, we provide an analytical expression for the cumulative distribution function (CDF) of the mutual information (MI) for systems with a single receive antenna and a finite number of transmit antennas in the general signal-to-interference-plus-noise-ratio (SINR) regime. The result is extended to systems with a finite number of receive antennas in the low SINR regime. A Gaussian approximation to the asymptotic behavior of the MI distribution is derived for the regime of a large number of transmit antennas and paths. We corroborate our analysis with simulations that study the performance gains realizable through meticulous selection of the transmit antenna downtilt angles, confirming the potential of elevation beamforming to enhance system performance. The results are directly applicable to the analysis of 5G 3D massive MIMO systems.
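
    The mutual-information quantity analysed above can be illustrated by Monte Carlo over a generic i.i.d. Gaussian channel: MI = log2 det(I + (SNR/Nt) H H^H). The 3D/elevation structure of the paper's maximum-entropy model is not reproduced here; this only shows the MI computation itself, with illustrative dimensions and SNR.

```python
# Sketch of MIMO mutual information, MI = log2 det(I + (SNR/Nt) H H^H),
# evaluated by Monte Carlo over an i.i.d. Gaussian channel (illustrative).
import numpy as np

def mimo_mi(h, snr):
    """Mutual information (bits/s/Hz) of a MIMO channel matrix h at given SNR."""
    nr, nt = h.shape
    gram = h @ h.conj().T
    return float(np.log2(np.linalg.det(np.eye(nr) + (snr / nt) * gram)).real)

rng = np.random.default_rng(0)
nt, nr, snr = 4, 2, 10.0
samples = [mimo_mi((rng.standard_normal((nr, nt)) +
                    1j * rng.standard_normal((nr, nt))) / np.sqrt(2), snr)
           for _ in range(2000)]
print("mean MI:", round(float(np.mean(samples)), 2))
```

    Sorting such samples gives an empirical CDF of the MI, the quantity for which the paper derives closed-form and Gaussian-approximation results.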

  12. Modelling the Progression of Male Swimmers’ Performances through Adolescence

    Directory of Open Access Journals (Sweden)

    Shilo J. Dormehl

    2016-01-01

    Insufficient data on adolescent athletes is contributing to the challenges facing youth athletic development and accurate talent identification. The purpose of this study was to model the progression of male sub-elite swimmers' performances during adolescence. The performances of 446 males (12-19 years old) competing in seven individual events (50, 100 and 200 m freestyle; 100 m backstroke, breaststroke and butterfly; 200 m individual medley) over an eight-year period at an annual international schools swimming championship, run under FINA regulations, were collected. Quadratic functions for each event were determined using mixed linear models. Thresholds of peak performance were achieved between the ages of 18.5 ± 0.1 years (50 m freestyle and 200 m individual medley) and 19.8 ± 0.1 years (100 m butterfly). The slowest rate of improvement was observed in the 200 m individual medley (20.7%) and the highest in the 100 m butterfly (26.2%). Butterfly does, however, appear to be one of the last strokes in which males specialise. The models may be useful as talent identification tools, as they predict the age at which an average sub-elite swimmer could potentially peak. The expected rate of improvement could serve as a tool with which to monitor and evaluate benchmarks.
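
    The modelling step can be sketched simply: fit a quadratic performance-vs-age curve and read the age at peak performance off its vertex, -b/(2a). A plain least-squares fit stands in for the paper's mixed linear models, and the times below are synthetic 100 m freestyle seconds around an assumed peak near age 18.5, not the study's data.

```python
# Sketch of the quadratic progression model: fit time = a*age^2 + b*age + c
# and return the vertex age (-b / 2a). Data are synthetic.
import numpy as np

def peak_age(ages, times):
    """Fit a quadratic and return the age at the curve's vertex."""
    a, b, _c = np.polyfit(ages, times, 2)
    return -b / (2 * a)

ages = np.array([12, 13, 14, 15, 16, 17, 18, 19], dtype=float)
# Synthetic times generated around a curve with its minimum near age 18.5
times = 0.12 * (ages - 18.5) ** 2 + 55.0

print(round(float(peak_age(ages, times)), 2))
```

    Since times fall with age, the fitted quadratic opens upward (a > 0) and the vertex is the minimum time, i.e. the peak performance.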

  13. Performance Evaluation Based on EFQM Excellence Model in Sport Organizations

    Directory of Open Access Journals (Sweden)

    Rasoul Faraji

    2012-06-01

    The present study aims to evaluate the performance of the physical education (P.E.) general office of Tehran province through the model of the European Foundation for Quality Management (EFQM). A questionnaire approach was used. The validity of the 50-item EFQM questionnaire was verified by experts, and its reliability was calculated in a pilot study (α=0.928). Ninety-five questionnaires were distributed among the subjects (N=n), and 80 were returned and included in the statistical analysis. Of the nine EFQM criteria, the highest score was gained in key performance results (37.62%) and the lowest in people results (27.94%). In total, the organization achieved 337.11 points out of 1000. Additionally, there was a strong relationship (r=0.827, p=0.001) between enablers and results (P<0.05). Based on the scores gained in the criteria, improvement measures in all criteria are essential for this organization, especially in the people criterion of the enablers domain and the people results criterion of the results domain. Furthermore, it is believed that physical education is one of the best fields for application of the excellence model towards performance excellence and better results, and hence the model seems to have high potential in responding to problems commonly seen in the sport sector.

  14. Performance of chromatographic systems to model soil-water sorption.

    Science.gov (United States)

    Hidalgo-Rodríguez, Marta; Fuguet, Elisabet; Ràfols, Clara; Rosés, Martí

    2012-08-24

    A systematic approach for evaluating the goodness of chromatographic systems to model the sorption of neutral organic compounds by soil from water is presented in this work. It is based on the examination of the three sources of error that determine the overall variance obtained when soil-water partition coefficients are correlated against chromatographic retention factors: the variance of the soil-water sorption data, the variance of the chromatographic data, and the variance attributed to the dissimilarity between the two systems. These contributions of variance are easily predicted through the characterization of the systems by the solvation parameter model. According to this method, several chromatographic systems besides the reference octanol-water partition system have been selected to test their performance in the emulation of soil-water sorption. The results from the experimental correlations agree with the predicted variances. The high-performance liquid chromatography system based on an immobilized artificial membrane and the micellar electrokinetic chromatography systems of sodium dodecylsulfate and sodium taurocholate provide the most precise correlation models. They have been shown to predict well the soil-water sorption coefficients of several tested herbicides. Octanol-water partitions and high-performance liquid chromatography measurements using C18 columns are less suited to the estimation of soil-water partition coefficients.
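
    The solvation parameter model used to characterize the systems is the linear free-energy relationship log SP = c + eE + sS + aA + bB + vV, fitted by multiple linear regression over a set of solutes. The sketch below fits it by least squares; the descriptor values and coefficients are invented for illustration (real Abraham descriptors would be taken from the literature).

```python
# Sketch of fitting the solvation parameter model
# log SP = c + eE + sS + aA + bB + vV by least squares.
# Descriptors and coefficients are hypothetical.
import numpy as np

# Columns: E, S, A, B, V (hypothetical descriptor values for 8 solutes)
descriptors = np.array([
    [0.61, 0.52, 0.00, 0.14, 0.72],
    [0.80, 0.89, 0.26, 0.33, 0.92],
    [0.32, 0.41, 0.05, 0.45, 0.65],
    [1.10, 0.95, 0.15, 0.50, 1.20],
    [0.45, 0.60, 0.30, 0.40, 0.85],
    [0.70, 0.30, 0.10, 0.20, 1.00],
    [0.20, 0.70, 0.40, 0.10, 0.55],
    [0.90, 0.20, 0.00, 0.60, 1.10],
])
true_coeffs = np.array([0.25, 0.50, -0.30, -1.40, 0.90, 2.10])  # c, e, s, a, b, v

X = np.hstack([np.ones((descriptors.shape[0], 1)), descriptors])
log_sp = X @ true_coeffs            # noiseless synthetic "measurements"

fitted, *_ = np.linalg.lstsq(X, log_sp, rcond=None)
print(np.round(fitted, 3))
```

    Comparing the fitted coefficient vectors of two systems (e.g. a chromatographic system against soil-water sorption) quantifies their dissimilarity, the third variance contribution discussed above.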

  15. Beamforming in Ad Hoc Networks: MAC Design and Performance Modeling

    Directory of Open Access Journals (Sweden)

    Khalil Fakih

    2009-01-01

    Full Text Available We examine in this paper the benefits of beamforming techniques in ad hoc networks. We first devise a novel MAC paradigm for ad hoc networks that use these techniques in a multipath fading environment. In such networks, the use of conventional directional antennas does not necessarily improve the system performance. On the other hand, exploiting the potential benefits of smart antenna systems, and especially beamforming techniques, requires prior knowledge of the physical channel. Our proposition performs channel estimation and radio resource sharing jointly. We validate the effectiveness of the proposed MAC and evaluate the effects of channel estimation on the network performance. We then present an accurate analytical model for the performance of the IEEE 802.11 MAC protocol. We extend the latter model, by introducing the fading probability, to derive the saturation throughput for our proposed MAC when the simplest beamforming strategy is used in real multipath fading ad hoc networks. Finally, numerical results validate our proposition.

  16. Hydrothermal Fe cycling and deep ocean organic carbon scavenging: Model-based evidence for significant POC supply to seafloor sediments

    Science.gov (United States)

    German, C. R.; Legendre, L. L.; Sander, S. G.; Niquil, N.; Luther, G. W.; Bharati, L.; Han, X.; Le Bris, N.

    2015-06-01

    Submarine hydrothermal venting has recently been identified to have the potential to impact ocean biogeochemistry at the global scale. This is the case because processes active in hydrothermal plumes are so vigorous that the residence time of the ocean, with respect to cycling through hydrothermal plumes, is comparable to that of deep ocean mixing caused by thermohaline circulation. Recently, it has been argued that seafloor venting may provide a significant source of bio-essential Fe to the oceans as the result of a close coupling between Fe and organic carbon in hydrothermal plumes. But a complementary question remains to be addressed: does this same intimate Fe-Corg association in hydrothermal plumes cause any related impact to the global C cycle? To address this, SCOR-InterRidge Working Group 135 developed a modeling approach to synthesize site-specific field data from the East Pacific Rise 9°50‧ N hydrothermal field, where the range of requisite data sets is most complete, and combine those inputs with global estimates for dissolved Fe inputs from venting to the oceans to establish a coherent model with which to investigate hydrothermal Corg cycling. The results place new constraints on submarine Fe vent fluxes worldwide, including an indication that the majority of Fe supplied to hydrothermal plumes should come from entrainment of diffuse flow. While this same entrainment is not predicted to enhance the supply of dissolved organic carbon to hydrothermal plumes by more than ∼10% over background values, what the model does indicate is that scavenging of carbon in association with Fe-rich hydrothermal plume particles should play a significant role in the delivery of particulate organic carbon to deep ocean sediments, worldwide.

  17. Changes of High Mobility Group box 1 in Serum of Pig Acute Hepatic Failure Model and Significance

    Institute of Scientific and Technical Information of China (English)

    Fan ZHANG; Yongwen HE; Zhongping DUAN

    2008-01-01

    The role of high mobility group box 1 (HMGB-1) in acute hepatic failure and the effect of artificial liver support system treatment on HMGB-1 levels were investigated. Pig models of acute hepatic failure were induced by D-galactosamine and randomly divided into two groups, with or without artificial liver support system treatment. Tumor necrosis factor-α (TNF-α) and interleukin-1β (IL-1β) levels were detected by enzyme-linked immunosorbent assay (ELISA), and the expression of HMGB-1 by Western blot; serum levels of HMGB-1, liver function and hepatic pathology were observed after artificial liver support system treatment. TNF-α and IL-1β levels increased and peaked at 24 h in the acute hepatic failure group, then quickly decreased. The serum level of HMGB-1 increased at 24 h in the acute hepatic failure group, peaked at 48 h, and then remained at a stable high level. Significant liver injury appeared at 24 h and worsened continuously in the pig models of acute hepatic failure. In contrast, liver injury was significantly alleviated and the serum HMGB-1 level significantly decreased in the group treated with the artificial liver support system (P<0.05). These results suggest that HMGB-1 may participate in the inflammatory response and liver injury in the late stage of acute liver failure. Artificial liver support system treatment can reduce the serum HMGB-1 level and relieve pathological liver damage.

  18. Modeling, Analysis, and Control of a Hypersonic Vehicle with Significant Aero-Thermo-Elastic-Propulsion Interactions: Elastic, Thermal and Mass Uncertainty

    Science.gov (United States)

    Khatri, Jaidev

    This thesis examines the modeling, analysis, and control system design issues for scramjet-powered hypersonic vehicles. A nonlinear three-degrees-of-freedom longitudinal model which includes aero-propulsion-elasticity effects was used for all analyses. This model is based upon classical compressible flow and Euler-Bernoulli structural concepts. Higher-fidelity computational fluid dynamics and finite element methods are needed for more precise intermediate and final evaluations. The methods presented within this thesis were shown to be useful for guiding initial control-relevant design. The model was used to examine the vehicle's static and dynamic characteristics over the vehicle's trimmable region. The vehicle has significant longitudinal coupling between the fuel equivalency ratio (FER) and the flight path angle (FPA). For control system design, a two-input two-output plant (FER-elevator to speed-FPA) with 11 states (including 3 flexible modes) was used. Velocity, FPA, and pitch were assumed to be available for feedback. Aerodynamic heat modeling and design for the assumed thermal protection system (TPS) were incorporated into Bolender's original model to study the change in static and dynamic properties. Stability, feasibility, and limitation issues of decentralized control were examined with respect to changes in TPS elasticity, mass, and physical dimensions. The impact of elasticity due to TPS mass, TPS physical dimensions, and prolonged heating was also analyzed to understand the performance limitations of a decentralized controller designed for the nominal model.

  19. Part A: Assessing the performance of the COMFA outdoor thermal comfort model on subjects performing physical activity

    Science.gov (United States)

    Kenny, Natasha A.; Warland, Jon S.; Brown, Robert D.; Gillespie, Terry G.

    2009-09-01

    This study assessed the performance of the COMFA outdoor thermal comfort model on subjects performing moderate to vigorous physical activity. Field tests were conducted on 27 subjects performing 30 min of steady-state activity (walking, running, and cycling) in an outdoor environment. The predicted COMFA budgets were compared to the actual thermal sensation (ATS) votes provided by participants during each 5-min interval. The results revealed a normal distribution in the subjects’ ATS votes, with 82% of votes received in categories 0 (neutral) to +2 (warm). The ATS votes were significantly dependent upon sex, air temperature, short and long-wave radiation, wind speed, and metabolic activity rate. There was a significant positive correlation between the ATS and predicted budgets (Spearman’s rho = 0.574, P < 0.01). However, the predicted budgets did not display a normal distribution, and the model produced erroneous estimates of the heat and moisture exchange between the human body and the ambient environment in 6% of the cases.
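
    The rank correlation reported above (Spearman's rho = 0.574) is simply the Pearson correlation of the rank-transformed data. A minimal pure-Python sketch of the statistic, using average ranks for ties; this illustrates the measure, not the authors' code:

    ```python
    def average_ranks(xs):
        """Ranks of xs (1-based), with tied values sharing their average rank."""
        order = sorted(range(len(xs)), key=lambda i: xs[i])
        ranks = [0.0] * len(xs)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
                j += 1  # extend over the run of tied values
            for k in range(i, j + 1):
                ranks[order[k]] = (i + j) / 2.0 + 1.0
            i = j + 1
        return ranks

    def spearman_rho(x, y):
        """Spearman's rho: Pearson correlation of the rank-transformed data."""
        rx, ry = average_ranks(x), average_ranks(y)
        n = len(x)
        mx, my = sum(rx) / n, sum(ry) / n
        num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
        return num / den
    ```

    A perfectly monotone increasing relationship yields rho = 1 and a decreasing one rho = -1, which is why rho suits ordinal votes such as the ATS categories.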

  20. A Wake Model for the Prediction of Propeller Performance at Low Advance Ratios

    Directory of Open Access Journals (Sweden)

    Ye Tian

    2012-01-01

    Full Text Available A low order panel method is used to predict the performance of propellers. A wake alignment model based on a pseudounsteady scheme is proposed and implemented. The results from this full wake alignment (FWA model are correlated with available experimental data, and results from RANS for some propellers at design and low advance ratios. Significant improvements have been found in the predicted integrated forces and pressure distributions.

  1. The 4th Release of GOCE Gravity Field Models - Overview and Performance Analysis

    Science.gov (United States)

    Gruber, Thomas; Rummel, Reiner

    2013-04-01

    New GOCE gravity field models based on about two years of completely reprocessed gradiometer data have recently been released to the user community. They were obtained using different processing strategies and reflect the state of the art of GOCE gravity field models. With the improved gravity gradients resulting from a number of updates implemented in the level 1B processor, and with the additional data, the performance of the resulting GOCE-based models is significantly improved compared with the previous solutions. The paper provides an overview of the available GOCE models and presents the results of their validation by different means.

  2. Effect of patient location on the performance of clinical models to predict pulmonary embolism.

    Science.gov (United States)

    Ollenberger, Glenn P; Worsley, Daniel F

    2006-01-01

    Current clinical likelihood models for predicting pulmonary embolism (PE) are used to categorize outpatients into low, intermediate and high clinical pre-test likelihood of PE. Since these clinical prediction rules were developed using outpatients, it is not known whether they can be applied universally to both inpatients and outpatients with suspected PE. Thus, the purpose of this study was to determine the effect of patient location on the performance of clinical models to predict PE. Two clinical models (Wells and Wicki) were applied to data from the multi-centered PIOPED study. The Wells score was applied to 1359 patients and the Wicki score to 998 patients; 361 patients (27%) from the PIOPED study did not have arterial blood gas measurements and were excluded from the Wicki score group. Patients were stratified by their location at the time of entry into the PIOPED study as follows: outpatient/emergency, surgical ward, medicine/coronary care unit, or intensive care unit. The diagnostic performance of the two clinical models at each patient location was evaluated using the area under a fitted receiver operating characteristic curve (AUC). The prevalence of PE in the three clinical probability categories was similar for the two scoring methods. Both clinical models yielded the lowest diagnostic performance in patients referred from surgical wards. The AUC for both clinical prediction rules decreased significantly when applied to inpatients in comparison to outpatients. Current clinical prediction rules for determining the pre-test likelihood of PE yielded different diagnostic performances depending upon patient location. The performance of the clinical prediction rules decreased significantly when applied to inpatients. In particular, the rules performed least well when applied to patients referred from surgical wards, suggesting these rules should not be used in this patient group. As expected the clinical
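
    The AUC used above to compare the two rules has a direct probabilistic reading: it is the probability that a randomly chosen patient with PE receives a higher score than one without. A minimal sketch of that definition (pairwise comparison rather than the fitted ROC curve used in the study, so purely an illustration of the measure):

    ```python
    def auc(scores, labels):
        """Area under the ROC curve via its probabilistic definition:
        P(score of a random positive > score of a random negative),
        with ties counted as one half. labels are 1 (PE) or 0 (no PE)."""
        pos = [s for s, l in zip(scores, labels) if l == 1]
        neg = [s for s, l in zip(scores, labels) if l == 0]
        wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
                   for p in pos for n in neg)
        return wins / (len(pos) * len(neg))
    ```

    An AUC of 1.0 means the score separates the groups perfectly, while 0.5 means it is no better than chance; the drop reported for inpatients is a drop in exactly this quantity.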

  3. Green roof hydrologic performance and modeling: a review.

    Science.gov (United States)

    Li, Yanling; Babcock, Roger W

    2014-01-01

    Green roofs reduce runoff from impervious surfaces in urban development. This paper reviews the technical literature on green roof hydrology. Laboratory experiments and field measurements have shown that green roofs can reduce stormwater runoff volume by 30 to 86%, reduce peak flow rate by 22 to 93% and delay the peak flow by 0 to 30 min, thereby decreasing pollution, flooding and erosion during precipitation events. However, the effectiveness can vary substantially due to design characteristics, making performance predictions difficult. Evaluation of the most recently published study findings indicates that the major factors affecting green roof hydrology are precipitation volume, precipitation dynamics, antecedent conditions, growth medium, plant species, and roof slope. This paper also evaluates the computer models commonly used to simulate hydrologic processes for green roofs, including the Storm Water Management Model (SWMM), Soil-Water-Atmosphere-Plant (SWAP), SWMS-2D, HYDRUS, and other models that are shown to be effective for predicting precipitation response and economic benefits. The review findings indicate that green roofs are effective for reduction of runoff volume and peak flow and for delay of peak flow; however, no tool or model is yet available to predict the expected performance of a given anticipated system from the design parameters that directly affect green roof hydrology.

  4. A Fluid Model for Performance Analysis in Cellular Networks

    Directory of Open Access Journals (Sweden)

    Coupechoux Marceau

    2010-01-01

    Full Text Available We propose a new framework to study the performance of cellular networks using a fluid model and we derive from this model analytical formulas for interference, outage probability, and spatial outage probability. The key idea of the fluid model is to consider the discrete base station (BS entities as a continuum of transmitters that are spatially distributed in the network. This model allows us to obtain simple analytical expressions to reveal main characteristics of the network. In this paper, we focus on the downlink other-cell interference factor (OCIF, which is defined for a given user as the ratio of its outer cell received power to its inner cell received power. A closed-form formula of the OCIF is provided in this paper. From this formula, we are able to obtain the global outage probability as well as the spatial outage probability, which depends on the location of a mobile station (MS initiating a new call. Our analytical results are compared to Monte Carlo simulations performed in a traditional hexagonal network. Furthermore, we demonstrate an application of the outage probability related to cell breathing and densification of cellular networks.
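
    For comparison with the fluid model's closed-form OCIF, the discrete definition (ratio of other-cell to serving-cell received power) can be computed directly on a hexagonal layout. The sketch below assumes pure power-law path loss with exponent eta, equal transmit powers, and no fading; these simplifications are choices of this illustration, not of the paper:

    ```python
    import math

    def hex_bs_positions(rings, d=1.0):
        """Base stations on a hexagonal lattice: the centre site plus `rings`
        surrounding rings, inter-site distance d (axial coordinates)."""
        pts = []
        for q in range(-rings, rings + 1):
            for r in range(-rings, rings + 1):
                if abs(q + r) > rings:
                    continue  # keep only the hexagonal region of the lattice
                pts.append((d * (q + r / 2.0), d * r * math.sqrt(3) / 2.0))
        return pts

    def ocif(user_xy, bs_positions, eta=3.5):
        """Other-cell interference factor for one user: summed received power
        from all non-serving cells divided by the serving-cell power."""
        powers = [(math.dist(user_xy, bs) or 1e-9) ** -eta for bs in bs_positions]
        serving = max(powers)  # user is served by the strongest (closest) BS
        return (sum(powers) - serving) / serving
    ```

    On this layout the OCIF grows as the user moves from the cell centre toward the cell edge, which is the qualitative behaviour the fluid model's closed-form expression captures without enumerating base stations.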

  5. Incorporating biologic measurements (SF(2), CFE) into a tumor control probability model increases their prognostic significance: a study in cervical carcinoma treated with radiation therapy.

    Science.gov (United States)

    Buffa, F M; Davidson, S E; Hunter, R D; Nahum, A E; West, C M

    2001-08-01

    To assess whether incorporation of measurements of surviving fraction at 2 Gy (SF(2)) and colony-forming efficiency (CFE) into a tumor control probability (tcp) model increases their prognostic significance. Measurements of SF(2) and CFE were available from a study on carcinoma of the cervix treated with radiation alone. These measurements, as well as tumor volume, dose, and treatment time, were incorporated into a Poisson tcp model (tcp(alpha,rho)). Regression analysis was performed to assess the prognostic power of tcp(alpha,rho) vs. the use of either tcp models with biologic parameters fixed to best-fit estimates (but incorporating individual dose, volume, and treatment time) or the use of SF(2) and CFE measurements alone. In a univariate regression analysis of 44 patients, tcp(alpha,rho) was a better prognostic factor for both local control and survival than CFE alone (p = 0.015 for local control, p = 0.38 for survival). In multivariate analysis, tcp(alpha,rho) emerged as the most important prognostic factor for local control; CFE was still a significant independent prognostic factor for local control, whereas SF(2) was not. The sensitivities of tcp(alpha,rho) and SF(2) as predictive tests for local control were 87% and 65%, respectively. Specificities were 70% and 77%, respectively. A Poisson tcp model incorporating individual SF(2), CFE, dose, tumor volume, and treatment time was found to be the best independent prognostic factor for local control and survival in cervical carcinoma patients.
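
    The core of a Poisson tcp model of this kind is the expected number of clonogens surviving fractionated irradiation. A minimal sketch, assuming uniform clonogen density and delivery in 2 Gy fractions; the published tcp(alpha,rho) model additionally accounts for treatment time and inter-patient variability, which are omitted here:

    ```python
    import math

    def poisson_tcp(volume_cm3, clonogen_density, sf2, dose_gy):
        """Poisson tumour control probability: tcp = exp(-N_s), where N_s is
        the expected number of clonogens surviving dose_gy given in 2 Gy
        fractions, each fraction sparing a fraction sf2 of cells."""
        n_clonogens = clonogen_density * volume_cm3
        surviving = n_clonogens * sf2 ** (dose_gy / 2.0)
        return math.exp(-surviving)
    ```

    The exponential dependence on sf2 raised to the number of fractions is what makes an individually measured SF(2), combined with individual dose and volume, more informative than SF(2) on its own.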

  6. Predictive models for population performance on real biological fitness landscapes.

    Science.gov (United States)

    Rowe, William; Wedge, David C; Platt, Mark; Kell, Douglas B; Knowles, Joshua

    2010-09-01

    Directed evolution, in addition to its principal application of obtaining novel biomolecules, offers significant potential as a vehicle for obtaining useful information about the topologies of biomolecular fitness landscapes. In this article, we make use of a special type of model of fitness landscapes, based on finite state machines, which can be inferred from directed evolution experiments. Importantly, the model is constructed only from the fitness data and phylogeny, not from sequence or structural information, which is often absent. The model, called a landscape state machine (LSM), has already been used successfully in the evolutionary computation literature to model the landscapes of artificial optimization problems. Here, we use the method for the first time to simulate a biological fitness landscape based on experimental evaluation. We demonstrate in this study that LSMs are capable not only of representing the structure of model fitness landscapes such as NK-landscapes, but also the fitness landscape of real DNA oligomers binding to a protein (allophycocyanin), data we derived from experimental evaluations on microarrays. The LSMs prove adept at modelling the progress of evolution as a function of various controlling parameters, as validated by evaluations on the real landscapes. Specifically, the ability of the model to 'predict' optimal mutation rates and other parameters of the evolution is demonstrated. A modification to the standard LSM also proves accurate at predicting the effects of recombination on the evolution.
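
    For context on the NK landscapes mentioned above: each locus contributes a fitness component that depends on its own state and on the states of k neighbouring loci, with component values drawn at random. A minimal sketch of one common variant (circular neighbourhoods, lazily sampled and memoized lookup tables); details differ across studies:

    ```python
    import random

    def nk_landscape(n, k, seed=0):
        """Random NK fitness landscape: locus i's contribution depends on its
        own state and the states of its k right-hand neighbours (circular
        genome). Returns a deterministic fitness function for genomes."""
        rng = random.Random(seed)
        tables = [{} for _ in range(n)]

        def fitness(genome):
            total = 0.0
            for i in range(n):
                key = tuple(genome[(i + j) % n] for j in range(k + 1))
                if key not in tables[i]:
                    tables[i][key] = rng.random()  # sample on first use, then reuse
                total += tables[i][key]
            return total / n  # mean of per-locus contributions, so in [0, 1]

        return fitness
    ```

    Increasing k increases the epistatic coupling between loci and hence the ruggedness of the landscape, which is why NK landscapes serve as tunable test cases for models such as the LSM.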

  7. Modeling and design of a high-performance hybrid actuator

    Science.gov (United States)

    Aloufi, Badr; Behdinan, Kamran; Zu, Jean

    2016-12-01

    This paper presents the model and design of a novel hybrid piezoelectric actuator which provides high active and passive performance for smart structural systems. The actuator is composed of a pair of curved pre-stressed piezoelectric actuators, commercially known as THUNDER actuators, installed opposite each other using two clamping mechanisms constructed of in-plane fixable hinges, grippers and solid links. A full mathematical model is developed to describe the active and passive dynamics of the actuator and investigate the effects of its geometrical parameters on the dynamic stiffness, free displacement and blocked force properties. Among the literature that deals with piezoelectric actuators in which THUNDER elements are used as a source of electromechanical power, the proposed study is unique in that it presents a mathematical model able to predict the actuator characteristics and capture other phenomena, such as resonances, mode shapes, phase shifts, dips, etc. For model validation, measurements of the free dynamic response per unit voltage and the passive acceleration transmissibility of a particular actuator design are used to check the accuracy of the results predicted by the model. The results reveal good agreement between the model and experiment. Another experiment is performed to test the linearity of the actuator system by examining the variation of the output dynamic responses with varying forces and voltages at different frequencies. From the results, it can be concluded that the actuator acts approximately as a linear system at frequencies up to 1000 Hz. A parametric study is then performed by applying the developed model to analyze the influence of the geometrical parameters of the fixable hinges on the active and passive actuator properties. The model predictions in the frequency range of 0-1000 Hz show that the hinge thickness, radius, and opening angle parameters have great effects on the frequency dynamic

  8. Tank System Integrated Model: A Cryogenic Tank Performance Prediction Program

    Science.gov (United States)

    Bolshinskiy, L. G.; Hedayat, A.; Hastings, L. J.; Sutherlin, S. G.; Schnell, A. R.; Moder, J. P.

    2017-01-01

    Accurate predictions of the thermodynamic state of the cryogenic propellants, pressurization rate, and performance of pressure control techniques in cryogenic tanks are required for development of cryogenic fluid long-duration storage technology and planning for future space exploration missions. This Technical Memorandum (TM) presents the analytical tool, Tank System Integrated Model (TankSIM), which can be used for modeling pressure control and predicting the behavior of cryogenic propellant for long-term storage for future space missions. Utilizing TankSIM, the following processes can be modeled: tank self-pressurization, boiloff, ullage venting, mixing, and condensation on the tank wall. This TM also includes comparisons of TankSIM program predictions with test data and examples of multiphase mission calculations.

  9. Toward a high performance distributed memory climate model

    Energy Technology Data Exchange (ETDEWEB)

    Wehner, M.F.; Ambrosiano, J.J.; Brown, J.C.; Dannevik, W.P.; Eltgroth, P.G.; Mirin, A.A. [Lawrence Livermore National Lab., CA (United States); Farrara, J.D.; Ma, C.C.; Mechoso, C.R.; Spahr, J.A. [Univ. of California, Los Angeles, CA (US). Dept. of Atmospheric Sciences

    1993-02-15

    As part of a long range plan to develop a comprehensive climate systems modeling capability, the authors have taken the Atmospheric General Circulation Model originally developed by Arakawa and collaborators at UCLA and have recast it in a portable, parallel form. The code uses an explicit time-advance procedure on a staggered three-dimensional Eulerian mesh. The authors have implemented a two-dimensional latitude/longitude domain decomposition message passing strategy. Both dynamic memory management and interprocessor communication are handled with macro constructs that are preprocessed prior to compilation. The code can be moved about a variety of platforms, including massively parallel processors, workstation clusters, and vector processors, with a mere change of three parameters. Performance on the various platforms as well as issues associated with coupling different models for major components of the climate system are discussed.
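
    The two-dimensional latitude/longitude decomposition described above amounts to mapping each process rank to a rectangular block of grid cells. A minimal sketch of such a mapping, with remainder rows and columns spread over the leading blocks; this illustrates the general technique, not the UCLA code:

    ```python
    def decompose_2d(nlat, nlon, pr, pc, rank):
        """Map a process rank onto its (lat, lon) block in a pr x pc
        decomposition of an nlat x nlon grid. Returns
        ((lat_start, lat_end), (lon_start, lon_end)), end-exclusive."""
        i, j = divmod(rank, pc)  # process row (latitude) and column (longitude)

        def block(n, p, k):
            base, extra = divmod(n, p)
            start = k * base + min(k, extra)  # first `extra` blocks get one more cell
            return start, start + base + (1 if k < extra else 0)

        return block(nlat, pr, i), block(nlon, pc, j)
    ```

    Each rank then allocates storage only for its own block plus halo rows, and the message-passing layer exchanges halos with the four neighbouring ranks each time step.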

  10. Cooperative cognitive radio networking system model, enabling techniques, and performance

    CERN Document Server

    Cao, Bin; Mark, Jon W

    2016-01-01

    This SpringerBrief examines the active cooperation between users of Cooperative Cognitive Radio Networking (CCRN), exploring the system model, enabling techniques, and performance. The brief provides a systematic study of active cooperation between primary users and secondary users, i.e., CCRN, followed by discussions of research issues and challenges in designing spectrum- and energy-efficient CCRN. In an effort to shed light on the design of spectrum- and energy-efficient CCRN, the authors model the CCRN based on orthogonal modulation and an orthogonally dual-polarized antenna (ODPA). The resource allocation issues are detailed with respect to both models, in terms of problem formulation, solution approach, and numerical results. Finally, the optimal communication strategies for both primary and secondary users to achieve spectrum- and energy-efficient CCRN are analyzed.

  11. Modelling of green roof hydrological performance for urban drainage applications

    DEFF Research Database (Denmark)

    Locatelli, Luca; Mark, Ole; Mikkelsen, Peter Steen

    2014-01-01

    Green roofs are being widely implemented for stormwater management and their impact on the urban hydrological cycle can be evaluated by incorporating them into urban drainage models. This paper presents a model of green roof long term and single event hydrological performance. The model includes...... from 3 different extensive sedum roofs in Denmark. These data consist of high-resolution measurements of runoff, precipitation and atmospheric variables in the period 2010–2012. The hydrological response of green roofs was quantified based on statistical analysis of the results of a 22-year (1989...... and that the mean annual runoff is not linearly related to the storage. Green roofs have therefore the potential to be important parts of future urban stormwater management plans....

  12. Modeling plant, microorganisms, and mineral surface competition for soil nitrogen and phosphorus: Competition representations and ecological significance

    Science.gov (United States)

    Zhu, Q.; Riley, W. J.; Chambers, J. Q.; Tang, J.

    2014-12-01

    It is widely accepted that terrestrial ecosystem carbon dynamics are strongly coupled to, and controlled by, soil nutrient status. Nutrient availability serves as an indicator of aboveground carbon productivity and ecosystem stability, especially when soils are infertile. In these conditions, plants have to outcompete microorganisms and mineral surfaces to acquire the nutrients required for photosynthesis, respiration, seed production, defense, etc. It is usually hypothesized that microbes are short-term winners but long-term losers in nutrient competition. Microbes quickly trap available soil nitrogen and phosphorus, thereby preventing nutrient loss through hydrological leaching and mineral surface adsorption. Over longer temporal scales, these nutrients are released into the soil and become available for plant uptake. Despite its ecological significance, nutrient competition is either absent or over-simplified (e.g., assuming all consumers are equally competitive) in terrestrial biogeochemistry models. Here, we aim to test representations of different competitive strategies and to investigate their ecological consequences with a newly developed biogeochemical model structure. The new model includes three major soil nutrients (ammonia, nitrate, and phosphate) and multiple consumers (plants, microbes, mineral surfaces, nitrifiers, and denitrifiers). We analyze predicted soil carbon, nitrogen, and phosphorus dynamics with three different competitive strategies: (1) plants compete poorly against microorganisms; (2) all consumers are equally competitive; and (3) an explicit Equilibrium Chemical Approximation (ECA; Tang and Riley (2013)) treatment. We find that very different ecosystem states are predicted under different competitive structures, and that the ECA approach provides the best match with a large suite of observational constraints from tropical experimental and transect studies. We conclude that terrestrial biogeochemical models should represent a
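
    The ECA treatment mentioned above approximates competitive uptake when several consumers draw on several substrates simultaneously. The following is a sketch of first-order ECA kinetics in the spirit of Tang and Riley (2013); the affinity matrix K is hypothetical and maximum-rate constants are omitted, so it shows the competitive discounting only:

    ```python
    def eca_uptake(S, C, K):
        """First-order ECA kinetics sketch: uptake of substrate i by consumer j,
        F[i][j] = S[i]*C[j] / (K[i][j] * (1 + sum_k S[k]/K[k][j]
                                            + sum_l C[l]/K[i][l])),
        so each flux is discounted by every competing substrate-consumer pair.
        S: substrate concentrations; C: consumer abundances; K: affinities."""
        nI, nJ = len(S), len(C)
        F = [[0.0] * nJ for _ in range(nI)]
        for i in range(nI):
            for j in range(nJ):
                denom = K[i][j] * (1.0
                                   + sum(S[k] / K[k][j] for k in range(nI))
                                   + sum(C[l] / K[i][l] for l in range(nJ)))
                F[i][j] = S[i] * C[j] / denom
        return F
    ```

    Because every consumer appears in every denominator, adding or enlarging one competitor automatically reduces the uptake of all others, which is the behaviour the "equally competitive" simplification cannot reproduce.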

  13. Modeling the significance of including C redistribution when determining changes in net carbon storage along a cultivated toposequence

    Science.gov (United States)

    Chirinda, Ngonidzashe; Olesen, Jørgen E.; Heckrath, Goswin; Paradelo Pérez, Marcos; Taghizadeh-Toosi, Arezoo

    2016-04-01

    Globally, soil carbon (C) reserves are second only to those in the ocean and constitute a significant C reservoir. In the case of arable soils, the quantity of stored C is influenced by various factors (e.g., management practices). Currently, the topography-related influences on in-field soil C dynamics remain largely unknown. However, topography is known to influence a multiplicity of factors that regulate C input, storage and redistribution. To understand the patterns and untangle the complexity of soil C dynamics in arable landscapes, our study was conducted with soils from shoulderslope and footslope positions on a 7.1 ha winter wheat field in western Denmark. We first collected soil samples from shoulderslope and footslope positions at various depth intervals down to 100 cm and analyzed them for physical and chemical properties including texture and soil organic C contents. In-situ carbon dioxide (CO2) concentrations were measured at different soil profile depths at both positions for a year. Soil moisture content and temperature at 5 and 40 cm depth were measured continuously. Additionally, surface soil CO2 fluxes at shoulderslope and footslope positions were measured. We then used the measurement data collected from the two landscape positions to calibrate the one-dimensional mechanistic SOILCO2 module of the HYDRUS-1D software package and obtained soil CO2 fluxes from the soil profile at the two landscape positions. Furthermore, we tested whether the inclusion of vertical and lateral soil C movement improved the modeling of C dynamics in cultivated landscapes. For that, soil profile CO2 fluxes were compared with those obtained using a simple process-based whole-profile soil C model, C-TOOL, which was modified to include vertical and lateral movement of C across the landscape. Our results highlight the need to consider vertical and lateral soil C movement in the modeling of C dynamics in cultivated landscapes for better quantification of net carbon storage.

  14. Surface tensions of multi-component mixed inorganic/organic aqueous systems of atmospheric significance: measurements, model predictions and importance for cloud activation predictions

    Directory of Open Access Journals (Sweden)

    D. O. Topping

    2006-11-01

    Full Text Available In order to predict the physical properties of aerosol particles, it is necessary to adequately capture the behaviour of the ubiquitous complex organic components. One of the key properties which may affect this behaviour is the contribution of the organic components to the surface tension of aqueous particles in the moist atmosphere. Whilst the qualitative effect of organic compounds on solution surface tensions has been widely reported, our quantitative understanding of mixed organic and mixed inorganic/organic systems is limited. Furthermore, it is unclear whether models that exist in the literature can reproduce the surface tension variability for binary and higher-order multi-component organic and mixed inorganic/organic systems of atmospheric significance. The current study aims to resolve both issues to some extent. Surface tensions of single and multiple solute aqueous solutions were measured and compared with predictions from a number of model treatments. For binary organic systems, two predictive models found in the literature provided a range of values resulting from sensitivity to calculations of pure component surface tensions. Results indicate that a fitted model can capture the variability of the measured data very well, producing the lowest average percentage deviation for all compounds studied. The performance of the other models varies with compound and choice of model parameters. The behaviour of ternary mixed inorganic/organic systems was unreliably captured by a predictive scheme, and this was composition dependent. For more "realistic" higher-order systems, entirely predictive schemes performed poorly. It was found that use of the binary data in a relatively simple mixing rule, or modification of an existing thermodynamic model with parameters derived from binary data, was able to accurately capture the surface tension variation with concentration. Thus, it would appear that in order to model

  15. Performance of several models for predicting budburst date of grapevine (Vitis vinifera L.).

    Science.gov (United States)

    García de Cortázar-Atauri, Iñaki; Brisson, Nadine; Gaudillere, Jean Pierre

    2009-07-01

    The budburst stage is a key phenological stage for grapevine (Vitis vinifera L.), with large site and cultivar variability. The objective of the present work was to provide a reliable agro-meteorological model for simulating grapevine budburst occurrence all over France. The study was conducted using data from ten cultivars of grapevine (Cabernet Sauvignon, Chasselas, Chardonnay, Grenache, Merlot, Pinot Noir, Riesling, Sauvignon, Syrah, Ugni Blanc) and five locations (Bordeaux, Colmar, Angers, Montpellier, Epernay). First, we tested two commonly used models that do not take dormancy into account: growing degree days with a base temperature of 10 degrees C (GDD(10)), and Riou's model (RIOU). The prediction errors of these models ranged between 9 and 21 days. Second, a new model (BRIN) was studied, relying on well-known formalisms for orchard trees and taking the dormancy period into account. The BRIN model showed better performance in predicting budburst date than previous grapevine models. Analysis of the components of the BRIN formalisms (calculation of dormancy, use of hourly temperatures, base temperature) explained the better performance obtained with the BRIN model. Base temperature was the main driver, while the dormancy period was not significant in simulating budburst date. For each cultivar, we provide the parameter estimates that showed the best performance for both the BRIN model and the GDD model with a base temperature of 5 degrees C.
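
    The GDD class of models in this comparison reduces to accumulating thermal time above a base temperature until a threshold is reached. A minimal sketch; the threshold is a hypothetical cultivar-specific parameter, not a value fitted in the study:

    ```python
    def growing_degree_days(daily_mean_temps, base_temp=10.0):
        """Thermal time: sum of daily mean temperature excess above base_temp."""
        return sum(max(0.0, t - base_temp) for t in daily_mean_temps)

    def predict_budburst_day(daily_mean_temps, base_temp, threshold):
        """First day index (0-based from the starting date) on which the
        accumulated degree days reach `threshold`; None if never reached."""
        total = 0.0
        for day, t in enumerate(daily_mean_temps):
            total += max(0.0, t - base_temp)
            if total >= threshold:
                return day
        return None
    ```

    The sensitivity of the predicted date to base_temp is the effect the abstract identifies as the main driver of model performance, which is why GDD(10) and a 5 degrees C variant can differ by days for the same temperature series.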

  17. Cross-Industry Performance Modeling: Toward Cooperative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    H. S. Blackman; W. J. Reece

    1998-10-01

    One of the current unsolved problems in human factors is the difficulty in acquiring information from lessons learned and data collected among human performance analysts in different domains. There are several common concerns and generally accepted issues of importance for human factors, psychology and industry analysts of performance and safety. Among these are the need to incorporate lessons learned in design, to carefully consider implementation of new designs and automation, and the need to reduce human performance-based contributions to risk. In spite of shared concerns, there are several roadblocks to widespread sharing of data and lessons learned from operating experience and simulation, including the fact that very few publicly accessible databases exist (Gertman & Blackman, 1994; Kirwan, 1997). There is a need to draw together analysts and analytic methodologies to comprise a centralized source of data with sufficient detail to be meaningful while ensuring source anonymity. We propose that a generic source of performance data and a multi-domain data store may provide the first steps toward cooperative performance modeling and analysis across industries.

  19. Significance of the spatial reconstruction based on mathematical modeling in the surgical treatment of giant intracranial aneurysms

    Directory of Open Access Journals (Sweden)

    Nikolić Igor M.

    2006-01-01

    Full Text Available Background. The use of computer models for 3-dimensional reconstruction could be a reliable method to overcome technical imperfections of diagnostic procedures for the microsurgical operation of giant intracranial aneurysms. Case report. We presented the case of a successfully operated 52-year-old woman with a giant intracranial aneurysm, in which the computer 3-dimensional reconstruction of blood vessels and the aneurysmal neck had been decisive for making the diagnosis. The model for 3-dimensional reconstruction of blood vessels was based on the two 2-dimensional projections of the conventional angiography. Standard neuroradiologic diagnostic procedures showed a giant aneurysm on the left middle cerebral artery, but the conventional subtraction and CT angiography did not reveal enough information. By the use of a personal computer, we performed a 3-dimensional spatial reconstruction of the left carotid artery to visualize the neck of the aneurysm and its supplying blood vessels. Conclusion. The 3-dimensional spatial reconstruction of the cerebral vessels of a giant aneurysm based on the conventional angiography could be useful for planning the surgical procedure.

  20. A Statistical Approach to Modeling Indian Classical Music Performance

    CERN Document Server

    Chakraborty, Soubhik; Roy, Sayan; Chauhan, Shivee; Tripathy, Sanjaya Shankar; Mahto, Kartik

    2008-01-01

    A raga is a melodic structure with fixed notes and a set of rules characterizing a certain mood endorsed through performance. By a vadi swar is meant that note which plays the most significant role in expressing the raga. A samvadi swar similarly is the second most significant note. However, the determination of their significance has an element of subjectivity and hence we are motivated to find some truths through an objective analysis. The paper proposes a probabilistic method of note detection and demonstrates how the relative frequency (relative number of occurrences of the pitch) of the more important notes stabilize far more quickly than that of others. In addition, a count for distinct transitory and similar looking non-transitory (fundamental) frequency movements (but possibly embedding distinct emotions!) between the notes is also taken depicting the varnalankars or musical ornaments decorating the notes and note sequences as rendered by the artist. They reflect certain structural properties of the r...

  1. Test of the classic model for predicting endurance running performance.

    Science.gov (United States)

    McLaughlin, James E; Howley, Edward T; Bassett, David R; Thompson, Dixie L; Fitzhugh, Eugene C

    2010-05-01

    To compare the classic physiological variables linked to endurance performance (VO2max, %VO2max at lactate threshold (LT), and running economy (RE)) with peak treadmill velocity (PTV) as predictors of performance in a 16-km time trial. Seventeen healthy, well-trained distance runners (10 males and 7 females) underwent laboratory testing to determine maximal oxygen uptake (VO2max), RE, percentage of maximal oxygen uptake at the LT (%VO2max at LT), running velocity at LT, and PTV. Velocity at VO2max (vVO2max) was calculated from RE and VO2max. Three stepwise regression models were used to determine the best predictors (classic vs treadmill performance protocols) for the 16-km running time trial. Simple Pearson correlations of the variables with 16-km performance showed vVO2max to have the highest correlation (r = -0.972) and %VO2max at the LT the lowest (r = 0.136). The correlation coefficients for LT, VO2max, and PTV were very similar in magnitude (r = -0.903 to r = -0.892). When VO2max, %VO2max at LT, RE, and PTV were entered into SPSS stepwise analysis, VO2max explained 81.3% of the total variance, and RE accounted for an additional 10.7%. vVO2max was shown to be the best predictor of the 16-km performance, accounting for 94.4% of the total variance. The measured velocity at VO2max (PTV) was highly correlated with the estimated velocity at vVO2max (r = 0.8867). Among well-trained subjects heterogeneous in VO2max and running performance, vVO2max is the best predictor of running performance because it integrates both maximal aerobic power and the economy of running. The PTV is linked to the same physiological variables that determine vVO2max.
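vVO2max above is calculated from VO2max and running economy. A minimal sketch, assuming the common convention of VO2max in mL·kg⁻¹·min⁻¹ and running economy expressed as an oxygen cost per kilometre in mL·kg⁻¹·km⁻¹; the abstract does not state its exact units:

```python
def velocity_at_vo2max(vo2max, running_economy):
    """vVO2max in km/h: maximal uptake per minute divided by oxygen cost
    per kilometre gives km/min, then converted to km/h.

    vo2max          -- mL·kg⁻¹·min⁻¹ (assumed units)
    running_economy -- mL·kg⁻¹·km⁻¹ (assumed units)
    """
    return (vo2max / running_economy) * 60.0

# A VO2max of 70 with an oxygen cost of 200 mL·kg⁻¹·km⁻¹ gives 21 km/h.
```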

  2. Evaluating performance of simplified physically based models for shallow landslide susceptibility

    Science.gov (United States)

    Formetta, Giuseppe; Capparelli, Giovanna; Versace, Pasquale

    2016-11-01

    Rainfall-induced shallow landslides can lead to loss of life and significant damage to private and public properties, transportation systems, etc. Predicting locations that might be susceptible to shallow landslides is a complex task and involves many disciplines: hydrology, geotechnical science, geology, hydrogeology, geomorphology, and statistics. Two main approaches are commonly used: statistical or physically based models. Reliable model applications involve automatic parameter calibration, objective quantification of the quality of susceptibility maps, and model sensitivity analyses. This paper presents a methodology to systematically and objectively calibrate, verify, and compare different models and model performance indicators in order to identify and select the models whose behavior is the most reliable for particular case studies. The procedure was implemented in a package of models for landslide susceptibility analysis and integrated in the NewAge-JGrass hydrological model. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit indices by comparing pixel-by-pixel model results and measurement data. The integration of the package in NewAge-JGrass uses other components, such as geographic information system tools, to manage input-output processes, and automatic calibration algorithms to estimate model parameters. The system was applied for a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and Altilia. The area is extensively subject to rainfall-induced shallow landslides mainly because of its complex geology and climatology. The analysis was carried out considering all the combinations of the eight optimized indices and the three models. Parameter calibration, verification, and model performance assessment were performed by a comparison with a detailed landslide inventory map for the
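The abstract counts eight goodness-of-fit indices without naming them. As a sketch of the pixel-by-pixel comparison it describes, two common indices for binary susceptibility maps (overall accuracy and the critical success index) can be computed from confusion counts; the choice of indices here is an assumption, not the paper's list:

```python
def binary_map_indices(model_map, observed_map):
    """Pixel-by-pixel comparison of two equal-length binary maps
    (1 = pixel predicted/observed unstable). Returns (accuracy, CSI)."""
    tp = sum(1 for m, o in zip(model_map, observed_map) if m == 1 and o == 1)
    tn = sum(1 for m, o in zip(model_map, observed_map) if m == 0 and o == 0)
    fp = sum(1 for m, o in zip(model_map, observed_map) if m == 1 and o == 0)
    fn = sum(1 for m, o in zip(model_map, observed_map) if m == 0 and o == 1)
    accuracy = (tp + tn) / len(model_map)
    csi = tp / (tp + fp + fn) if (tp + fp + fn) else 0.0  # ignores true negatives
    return accuracy, csi
```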

  3. Optical polarization tractography revealed significant fiber disarray in skeletal muscles of a mouse model for Duchenne muscular dystrophy.

    Science.gov (United States)

    Wang, Y; Zhang, K; Wasala, N B; Duan, D; Yao, G

    2015-02-01

    Optical polarization tractography (OPT) was recently developed to visualize tissue fiber architecture with cellular-level resolution and accuracy. In this study, we explored the feasibility of using OPT to study muscle disease in the mdx4cv mouse model of Duchenne muscular dystrophy. The freshly dissected tibialis anterior muscles of mdx4cv and normal mice were imaged. A "fiber disarray index" (FDI) was developed to quantify the myofiber disorganization. In necrotic muscle regions of the mdx4cv mice, the FDI was significantly elevated and can be used to segment the 3D necrotic regions for assessing the overall muscle damage. These results demonstrated OPT's capability for imaging microscopic fiber alterations in muscle research.

  4. ASYMMETRIC PRICE TRANSMISSION MODELING: THE IMPORTANCE OF MODEL COMPLEXITY AND THE PERFORMANCE OF THE SELECTION CRITERIA

    Directory of Open Access Journals (Sweden)

    Henry de-Graft Acquah

    2013-01-01

    Full Text Available Information criteria provide an attractive basis for selecting the best model from a set of competing asymmetric price transmission models or theories. However, little is understood about the sensitivity of the model selection methods to model complexity. This study therefore fits competing asymmetric price transmission models that differ in complexity to simulated data and evaluates the ability of the model selection methods to recover the true model. The results of Monte Carlo experimentation suggest that in general BIC, CAIC and DIC were superior to AIC when the true data generating process was the standard error correction model, whereas AIC was more successful when the true model was the complex error correction model. It is also shown that the model selection methods performed better in large samples for a complex asymmetric data generating process than with a standard asymmetric data generating process. Except for complex models, AIC's performance did not make substantial gains in recovery rates as sample size increased. The research findings demonstrate the influence of model complexity in asymmetric price transmission model comparison and selection.
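The criteria compared above trade goodness of fit against parameter count. A minimal sketch of AIC and BIC computed from a model's maximized log-likelihood (the study's CAIC and DIC variants are omitted here):

```python
import math

def aic(log_likelihood, k):
    """Akaike information criterion: 2k - 2·ln(L)."""
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    """Bayesian information criterion: k·ln(n) - 2·ln(L)."""
    return k * math.log(n) - 2 * log_likelihood

# BIC's per-parameter penalty ln(n) exceeds AIC's penalty of 2 once n > e^2 (about 7.4),
# which is consistent with BIC favoring the simpler model in larger samples.
```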

  5. FRAMEWORK AND APPLICATION FOR MODELING CONTROL ROOM CREW PERFORMANCE AT NUCLEAR POWER PLANTS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L Boring; David I Gertman; Tuan Q Tran; Brian F Gore

    2008-09-01

    This paper summarizes an emerging project regarding the utilization of high-fidelity MIDAS simulations for visualizing and modeling control room crew performance at nuclear power plants. The key envisioned uses for MIDAS-based control room simulations are: (i) the estimation of human error associated with advanced control room equipment and configurations, (ii) the investigative determination of contributory cognitive factors for risk significant scenarios involving control room operating crews, and (iii) the certification of reduced staffing levels in advanced control rooms. It is proposed that MIDAS serves as a key component for the effective modeling of cognition, elements of situation awareness, and risk associated with human performance in next generation control rooms.

  6. Voxel model in BNCT treatment planning: performance analysis and improvements

    Science.gov (United States)

    González, Sara J.; Carando, Daniel G.; Santa Cruz, Gustavo A.; Zamenhof, Robert G.

    2005-02-01

    In recent years, many efforts have been made to study the performance of treatment planning systems in deriving an accurate dosimetry of the complex radiation fields involved in boron neutron capture therapy (BNCT). The computational model of the patient's anatomy is one of the main factors involved in this subject. This work presents a detailed analysis of the performance of the 1 cm based voxel reconstruction approach. First, a new and improved material assignment algorithm implemented in NCTPlan treatment planning system for BNCT is described. Based on previous works, the performances of the 1 cm based voxel methods used in the MacNCTPlan and NCTPlan treatment planning systems are compared by standard simulation tests. In addition, the NCTPlan voxel model is benchmarked against in-phantom physical dosimetry of the RA-6 reactor of Argentina. This investigation shows the 1 cm resolution to be accurate enough for all reported tests, even in the extreme cases such as a parallelepiped phantom irradiated through one of its sharp edges. This accuracy can be degraded at very shallow depths in which, to improve the estimates, the anatomy images need to be positioned in a suitable way. Rules for this positioning are presented. The skin is considered one of the organs at risk in all BNCT treatments and, in the particular case of cutaneous melanoma of extremities, limits the delivered dose to the patient. Therefore, the performance of the voxel technique is deeply analysed in these shallow regions. A theoretical analysis is carried out to assess the distortion caused by homogenization and material percentage rounding processes. Then, a new strategy for the treatment of surface voxels is proposed and tested using two different irradiation problems. For a parallelepiped phantom perpendicularly irradiated with a 5 keV neutron source, the large thermal neutron fluence deviation present at shallow depths (from 54% at 0 mm depth to 5% at 4 mm depth) is reduced to 2% on average

  7. A Hybrid Fuzzy Model for Lean Product Development Performance Measurement

    Science.gov (United States)

    Osezua Aikhuele, Daniel; Mohd Turan, Faiz

    2016-02-01

    In the effort to meet emerging consumer demands for mass-customized products, many manufacturing companies are turning to the application of lean in their product development process, and this is gradually moving from being a competitive advantage to a necessity. However, due to a lack of clear understanding of lean performance measurements, many of these companies are unable to implement and fully integrate the lean principles into their product development process. The literature shows that only a few studies have focused systematically on lean product development performance (LPDP) evaluation. In order to fill this gap, the study proposes a novel hybrid model based on the Fuzzy Reasoning Approach (FRA) and extensions of the Fuzzy-AHP and Fuzzy-TOPSIS methods for the assessment of LPDP. Unlike existing methods, the model considers the importance weight of each decision maker (expert), since the performance criteria/attributes must be rated and these experts have different levels of expertise. The rating is done using a new fuzzy Likert rating scale (membership scale) designed to address problems resulting from information loss/distortion due to closed-form scaling and the ordinal nature of the existing Likert scale.

  8. Modelling of LOCA Tests with the BISON Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, Richard L [Idaho National Laboratory; Pastore, Giovanni [Idaho National Laboratory; Novascone, Stephen Rhead [Idaho National Laboratory; Spencer, Benjamin Whiting [Idaho National Laboratory; Hales, Jason Dean [Idaho National Laboratory

    2016-05-01

    BISON is a modern finite-element based, multidimensional nuclear fuel performance code that is under development at Idaho National Laboratory (USA). Recent advances of BISON include the extension of the code to the analysis of LWR fuel rod behaviour during loss-of-coolant accidents (LOCAs). In this work, BISON models for the phenomena relevant to LWR cladding behaviour during LOCAs are described, followed by presentation of code results for the simulation of LOCA tests. Analysed experiments include separate effects tests of cladding ballooning and burst, as well as the Halden IFA-650.2 fuel rod test. Two-dimensional modelling of the experiments is performed, and calculations are compared to available experimental data. Comparisons include cladding burst pressure and temperature in separate effects tests, as well as the evolution of fuel rod inner pressure during ballooning and time to cladding burst. Furthermore, BISON three-dimensional simulations of separate effects tests are performed, which demonstrate the capability to reproduce the effect of azimuthal temperature variations in the cladding. The work has been carried out in the frame of the collaboration between Idaho National Laboratory and Halden Reactor Project, and the IAEA Coordinated Research Project FUMAC.

  9. Performance Evaluation of the Prototype Model NEXT Ion Thruster

    Science.gov (United States)

    Herman, Daniel A.; Soulas, George C.; Patterson, Michael J.

    2008-01-01

    The performance testing results of the first prototype model NEXT ion engine, PM1, are presented. The NEXT program has developed the next generation ion propulsion system to enhance and enable Discovery, New Frontiers, and Flagship-type NASA missions. The PM1 thruster exhibits operational behavior consistent with its predecessors, the engineering model thrusters, with substantial mass savings, enhanced thermal margins, and design improvements for environmental testing compliance. The dry mass of PM1 is 12.7 kg. Modifications made in the thruster design have resulted in improved performance and operating margins, as anticipated. PM1 beginning-of-life performance satisfies all of the electric propulsion thruster mission-derived technical requirements. It demonstrates a wide range of throttleability by processing input power levels from 0.5 to 6.9 kW. At 6.9 kW, the PM1 thruster demonstrates a specific impulse of 4190 s, 237 mN of thrust, and a thrust efficiency of 0.71. The flat beam profile (flatness parameters vary from 0.66 at low power to 0.88 at full power) and advanced ion optics reduce localized accelerator grid erosion and increase margins for electron backstreaming, impingement-limited voltage, and screen grid ion transparency. The thruster throughput capability is predicted to exceed 750 kg of xenon, an equivalent of 36,500 hr of continuous operation at the full-power operating condition.
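As a consistency check on the full-power figures quoted above, the standard electric-propulsion relation for thrust efficiency, eta = T·g0·Isp / (2·P), reproduces the reported 0.71 (neglecting beam divergence and multiply-charged-ion corrections):

```python
G0 = 9.80665  # standard gravity, m/s^2

def thrust_efficiency(thrust_n, isp_s, power_w):
    """Jet power (T * v_e / 2, with v_e = g0 * Isp) divided by input power."""
    return 0.5 * thrust_n * G0 * isp_s / power_w

# Reported full-power point: 237 mN, 4190 s, 6.9 kW -> about 0.71
efficiency = thrust_efficiency(0.237, 4190.0, 6900.0)
```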

  10. Modeling the performance of coated LPG tanks engulfed in fires

    Energy Technology Data Exchange (ETDEWEB)

    Landucci, Gabriele [CONPRICI - Dipartimento di Ingegneria Chimica, Chimica Industriale e Scienza dei Materiali, Universita di Pisa, via Diotisalvi n.2, 56126 Pisa (Italy); Molag, Menso [Nederlandse Organisatie voor toegepast-natuurwetenschappelijk onderzoek TNO, Princetonlaan 6, 3584 CB Utrecht (Netherlands); Cozzani, Valerio, E-mail: valerio.cozzani@unibo.it [CONPRICI - Dipartimento di Ingegneria Chimica, Mineraria e delle Tecnologie Ambientali, Alma Mater Studiorum - Universita di Bologna, Via Terracini 28 - 40131 Bologna (Italy)

    2009-12-15

    The improvement of passive fire protection of storage vessels is a key factor to enhance safety among the LPG distribution chain. A thermal and mechanical model based on finite elements simulations was developed to assess the behaviour of full size tanks used for LPG storage and transportation in fire engulfment scenarios. The model was validated by experimental results. A specific analysis of the performance of four different reference coating materials was then carried out, also defining specific key performance indicators (KPIs) to assess design safety margins in near-miss simulations. The results confirmed the wide influence of coating application on the expected vessel time to failure due to fire engulfment. A quite different performance of the alternative coating materials was evidenced. General correlations were developed among the vessel time to failure and the effective coating thickness in full engulfment scenarios, providing a preliminary assessment of the coating thickness required to prevent tank rupture for a given time lapse. The KPIs defined allowed the assessment of the available safety margins in the reference scenarios analyzed and of the robustness of thermal protection design.

  12. Storage Capacity Modeling of Reservoir Systems Employing Performance Measures

    Directory of Open Access Journals (Sweden)

    Issa Saket Oskoui

    2014-12-01

    Full Text Available Developing a prediction relationship for the total (i.e. within-year plus over-year) storage capacity of reservoir systems is beneficial because it can be used as an alternative to full reservoir analysis during the design stage and gives planners an opportunity to examine and compare different cases in a fraction of the time required for a complete analysis, where detailed analysis is not necessary. Existing relationships mostly estimate over-year storage capacity; total storage capacity can be obtained only through relationships for adjusting over-year capacity, and there is no independent relationship for estimating total storage capacity. Moreover, these relationships do not involve a vulnerability performance criterion and have not been verified for Malaysian rivers. In this study, two different reservoirs in the southern part of Peninsular Malaysia, Melaka and Muar, are analyzed through a Monte Carlo simulation approach involving performance metrics. Subsequently, the storage capacity results of the simulation are compared with those of the well-known existing equations. It is observed that existing models may not predict total capacity appropriately for Malaysian reservoirs. Consequently, applying the simulation results, two separate regression equations are developed to model the total storage capacity of the study reservoirs, employing time-based reliability and vulnerability performance measures.
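The Monte Carlo approach above scores each simulated inflow trace with performance measures. A minimal behavior-analysis sketch, assuming a standard storage mass balance and the two measures the abstract names (time-based reliability and vulnerability):

```python
def simulate_reservoir(inflows, demand, capacity, initial_storage=0.0):
    """Step the storage balance through an inflow sequence and return
    (time-based reliability, vulnerability as worst single-period shortfall)."""
    storage = initial_storage
    shortfalls = []
    for inflow in inflows:
        storage = min(storage + inflow, capacity)  # spill anything above capacity
        release = min(demand, storage)
        storage -= release
        if release < demand:
            shortfalls.append(demand - release)
    reliability = 1.0 - len(shortfalls) / len(inflows)
    vulnerability = max(shortfalls) if shortfalls else 0.0
    return reliability, vulnerability
```

With a small capacity the carryover is lost, so alternating wet and dry periods fail every dry period; larger capacities raise reliability by storing the surplus.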

  13. Mixing Model Performance in Non-Premixed Turbulent Combustion

    Science.gov (United States)

    Pope, Stephen B.; Ren, Zhuyin

    2002-11-01

    In order to shed light on their qualitative and quantitative performance, three different turbulent mixing models are studied in application to non-premixed turbulent combustion. In previous works, PDF model calculations with detailed kinetics have been shown to agree well with experimental data for non-premixed piloted jet flames. The calculations from two different groups using different descriptions of the chemistry and turbulent mixing are capable of producing the correct levels of local extinction and reignition. The success of these calculations raises several questions, since it is not clear that the mixing models used contain an adequate description of the processes involved. To address these questions, three mixing models (IEM, modified Curl and EMST) are applied to a partially-stirred reactor burning hydrogen in air. The parameters varied are the residence time and the mixing time scale. For small relative values of the mixing time scale (approaching the perfectly-stirred limit) the models yield the same extinction behavior. But for larger values, the behavior is distinctly different, with EMST being the most resistant to extinction.
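Of the three models compared above, IEM is the simplest: each notional particle's composition relaxes toward the local mean at a rate set by the mixing time scale. A minimal single-scalar sketch (time step and model constant are illustrative):

```python
def iem_mixing_step(particles, dt, tau_mix, c_phi=2.0):
    """One explicit IEM update: d(phi)/dt = -(c_phi / (2 * tau_mix)) * (phi - mean).
    Mixing leaves the ensemble mean unchanged while shrinking the variance."""
    mean = sum(particles) / len(particles)
    rate = c_phi / (2.0 * tau_mix)
    return [phi - rate * (phi - mean) * dt for phi in particles]
```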

  14. Modeling impact of environmental factors on photovoltaic array performance

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Jie; Sun, Yize; Xu, Yang [College of Mechanical Engineering, Donghua University NO.2999, North Renmin Road, Shanghai (China)

    2013-07-01

    This paper presents a methodology to model and quantify the impact of three environmental factors, the ambient temperature, the incident irradiance and the wind speed, upon the performance of a photovoltaic array operating under outdoor conditions. First, a simple correlation relating operating temperature to the three environmental variables is validated for the range of wind speeds studied, 2-8 m/s, and for irradiance values between 200 and 1000 W/m2. The root mean square error (RMSE) between modeled operating temperature and measured values is 1.19% and the mean bias error (MBE) is -0.09%. The environmental factors studied influence the I-V curves, P-V curves, and maximum-power outputs of the photovoltaic array. The cell-to-module-to-array mathematical model for photovoltaic panels is established in this paper, and a method defined as segmented iteration is adopted to solve the I-V curve expression. The modeled I-V curves and P-V curves coincide well with the measured data points. The RMSE between numerically calculated maximum-power outputs and experimentally measured ones is 0.2307%, while the MBE is 0.0183%. In addition, a multivariable non-linear regression equation is proposed to eliminate the difference between numerically calculated values and measured ones of maximum power outputs over the range of high ambient temperature and irradiance at noon and in the early afternoon. In conclusion, the proposed method is reasonably simple and accurate.
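The RMSE and MBE figures above are standard error metrics; since they are quoted as percentages, some normalization is implied, here assumed to be by the mean of the measured values (the abstract does not state the normalization):

```python
def rmse_percent(modeled, measured):
    """Root mean square error as a percentage of the measured mean
    (assumed normalization)."""
    n = len(measured)
    mse = sum((m - o) ** 2 for m, o in zip(modeled, measured)) / n
    return 100.0 * mse ** 0.5 / (sum(measured) / n)

def mbe_percent(modeled, measured):
    """Mean bias error as a percentage of the measured mean; the sign shows
    systematic over- (+) or under- (-) prediction."""
    n = len(measured)
    bias = sum(m - o for m, o in zip(modeled, measured)) / n
    return 100.0 * bias / (sum(measured) / n)
```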

  15. Evaluation of multidimensional models of WAIS-IV subtest performance.

    Science.gov (United States)

    McFarland, Dennis J

    2017-04-21

    The present study examined the extent to which the covariance structure of the WAIS-IV is best accounted for by models that assume that test performance is the result of group-level factors and multiple independent general factors. Structural models with one to four general factors were evaluated with either four or five group-level factors. Simulations based on four general factors were run to clarify the adequacy of the estimates of the allocation of covariance by the models. Four independent general factors provided better fit than a single general factor for either model with four or five group-level factors. While one of the general factors had much larger loadings than all other factors, simulation results suggested that this might be an artifact of the statistical procedure rather than a reflection of the nature of individual differences in cognitive abilities. These results argue against the contention that clinical interpretation of cognitive test batteries should primarily be at the level of general intelligence. It is a fallacy to assume that factor analysis can reveal the structure of human abilities. Test validity should not be based solely on the results of modeling the covariance of test batteries.

  16. Urban Modelling Performance of Next Generation SAR Missions

    Science.gov (United States)

    Sefercik, U. G.; Yastikli, N.; Atalay, C.

    2017-09-01

    In synthetic aperture radar (SAR) technology, urban mapping and modelling have become possible with the revolutionary missions TerraSAR-X (TSX) and Cosmo-SkyMed (CSK) since 2007. These satellites offer 1 m spatial resolution in high-resolution spotlight imaging mode and are capable of high-quality digital surface model (DSM) acquisition for urban areas utilizing interferometric SAR (InSAR) technology. With the advantage of generation independent of seasonal weather conditions, TSX and CSK DSMs are much in demand by scientific users. The performance of SAR DSMs is influenced by distortions such as layover, foreshortening, shadow and double-bounce, which depend upon the imaging geometry. In this study, the potential of DSMs derived from convenient 1 m high-resolution spotlight (HS) InSAR pairs of CSK and TSX is validated by model-to-model absolute and relative accuracy estimations in an urban area. For the verification, an airborne laser scanning (ALS) DSM of the study area was used as the reference model. Results demonstrated that TSX and CSK urban DSMs are compatible in open, built-up and forest landforms with an absolute accuracy of 8-10 m. The relative accuracies based on the coherence of neighbouring pixels are superior to absolute accuracies both for CSK and TSX.

  17. Photovoltaic Pixels for Neural Stimulation: Circuit Models and Performance.

    Science.gov (United States)

    Boinagrov, David; Lei, Xin; Goetz, Georges; Kamins, Theodore I; Mathieson, Keith; Galambos, Ludwig; Harris, James S; Palanker, Daniel

    2016-02-01

    Photovoltaic conversion of pulsed light into pulsed electric current enables optically-activated neural stimulation with miniature wireless implants. In photovoltaic retinal prostheses, patterns of near-infrared light projected from video goggles onto subretinal arrays of photovoltaic pixels are converted into patterns of current to stimulate the inner retinal neurons. We describe a model of these devices and evaluate the performance of photovoltaic circuits, including the electrode-electrolyte interface. Characteristics of the electrodes measured in saline with various voltages, pulse durations, and polarities were modeled as voltage-dependent capacitances and Faradaic resistances. The resulting mathematical model of the circuit yielded dynamics of the electric current generated by the photovoltaic pixels illuminated by pulsed light. Voltages measured in saline with a pipette electrode above the pixel closely matched results of the model. Using the circuit model, our pixel design was optimized for maximum charge injection under various lighting conditions and for different stimulation thresholds. To speed discharge of the electrodes between the pulses of light, a shunt resistor was introduced and optimized for high frequency stimulation.
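    The circuit dynamics described above can be sketched with a minimal simulation (illustrative component values; the paper models voltage-dependent capacitances and Faradaic resistances, whereas this sketch uses a fixed capacitance). A photocurrent charges the electrode capacitance during each light pulse, and the shunt resistor discharges it between pulses so the pixel is ready for high-frequency stimulation:

    ```python
    import numpy as np

    # Illustrative component values (assumptions, not the paper's numbers).
    C = 100e-9        # electrode capacitance [F]
    R_shunt = 50e3    # shunt resistor [ohm]
    I_photo = 1e-6    # photocurrent during the light pulse [A]
    dt = 1e-6         # integration step [s]
    pulse_on, period = 4e-3, 25e-3   # 4 ms pulse at 40 Hz

    t = np.arange(0.0, 2 * period, dt)
    v = np.zeros_like(t)
    for k in range(1, t.size):
        lit = (t[k] % period) < pulse_on
        i_src = I_photo if lit else 0.0
        # dV/dt = (I_source - V/R_shunt) / C  (forward Euler)
        v[k] = v[k - 1] + dt * (i_src - v[k - 1] / R_shunt) / C

    v_peak = float(v.max())
    print(v_peak)
    ```

    With the time constant R_shunt*C = 5 ms, the electrode voltage rises during each 4 ms pulse and decays almost completely in the 21 ms dark interval, the behaviour the shunt resistor is optimized for.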

  18. Significance of stromal-1 and stromal-2 signatures and biologic prognostic model in diffuse large B-cell lymphoma

    Science.gov (United States)

    Abdou, Asmaa Gaber; Asaad, Nancy; Kandil, Mona; Shabaan, Mohammed; Shams, Asmaa

    2017-01-01

    Objective: Diffuse large B-cell lymphoma (DLBCL) is a heterogeneous group of tumors with different biological and clinical characteristics, diverse clinical outcomes, and varying response to therapy. The stromal-1 signature of the DLBCL tumor microenvironment represents extracellular matrix deposition and histiocytic infiltrate, whereas stromal-2 represents angiogenesis, which could affect tumor progression. Methods: The aim of the present study is to assess the significance of the stromal-1 signature using SPARC-1 and the stromal-2 signature using CD31 expression, and finally to construct a biologic prognostic model (BPM) in 60 cases of DLBCL via immunohistochemistry. Results: Microvessel density (P…); BPM showed that 42 cases (70%) were of low biologic score (0–1) and 18 cases (30%) were of high biologic score (2–3). Low BPM cases showed a lower probability of splenic involvement (P=0.04) and a higher rate of complete response to therapy compared with high-score cases (P=0.08). Conclusions: The DLBCL microenvironment could modulate tumor progression, since angiogenesis and SPARC-positive stromal cells promote dissemination, as shown by their association with spleen involvement and capsular invasion. Biologic prognostic models, including the modified BPM, which considers the cell of origin of DLBCL and stromal signature pathways, could determine DLBCL progression and response to therapy. PMID:28607806

  19. Attenuation of Rhes activity significantly delays the appearance of behavioral symptoms in a mouse model of Huntington's disease.

    Directory of Open Access Journals (Sweden)

    Brandon A Baiamonte

    Full Text Available Huntington's disease (HD) is a neuropsychiatric disorder characterized by choreiform movement of the limbs, cognitive disability, psychosis and dementia. It is invariably associated with an abnormally long CAG expansion within the IT15 gene on human chromosome 4. Although the mutant huntingtin protein is ubiquitously expressed in HD patients, cellular degeneration occurs predominantly in neurons within the corpus striatum and cerebral cortex. The Ras homolog Rhes is expressed very selectively in the precise brain areas affected by HD. Recent in vitro work suggests that Rhes may be a co-factor with mutant huntingtin in cell death. The objective of the present study was to examine whether the inhibition of Rhes would attenuate or delay the symptoms of HD in vivo. We used a transgenic mouse model of HD crossed with Rhes knockout mice to show that the behavioral symptoms of HD are regulated by Rhes. HD(+)/Rhes(-/-) mice showed significantly delayed expression of HD-like symptoms in this in vivo model. Drugs that block or inhibit the actions of Rhes may be useful as the first treatments for HD.

  20. A logistical model for performance evaluations of hybrid generation systems

    Energy Technology Data Exchange (ETDEWEB)

    Bonanno, F.; Consoli, A.; Raciti, A. [Univ. of Catania (Italy). Dept. of Electrical, Electronic, and Systems Engineering; Lombardo, S. [Schneider Electric SpA, Torino (Italy)

    1998-11-01

    In order to evaluate the fuel and energy savings, and to focus on the problems related to the exploitation of combined renewable and conventional energies, a logistical model for hybrid generation systems (HGSs) has been prepared. A software package written in ACSL, allowing easy handling of the models and data of the HGS components, is presented. A special feature of the proposed model is that an auxiliary fictitious source is introduced in order to obtain the electric power balance at the busbars during the simulation, including the case of ill-sized components. The observed power imbalances are then used to update the system design. As a case study, the simulation program is applied to evaluate the energy performance of a power plant serving a small isolated community, an island in the Mediterranean Sea, in order to establish the potential improvement achievable via an optimal integration of renewable energy sources in conventional plants. Evaluations and comparisons among different-sized wind, photovoltaic, and diesel groups, as well as among different management strategies, have been performed using the simulation package and are reported and discussed in order to present the track followed to select the final design.
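    The role of the auxiliary fictitious source can be sketched as follows (illustrative numbers and a deliberately simple dispatch rule, not the paper's ACSL model): at each interval the dispatchable diesel group covers what it can of the residual load, and the fictitious source records whatever imbalance remains, flagging ill-sized components for the next design iteration:

    ```python
    # Per-interval demand and renewable output [kW] (illustrative values).
    load = [90.0, 120.0, 150.0, 110.0]
    wind = [30.0, 10.0, 40.0, 20.0]
    pv = [0.0, 35.0, 45.0, 10.0]
    diesel_cap = 70.0  # diesel group rating [kW]

    fictitious = []
    for p_load, p_wind, p_pv in zip(load, wind, pv):
        residual = p_load - p_wind - p_pv
        # Diesel covers the residual up to its rating, never runs negative.
        p_diesel = min(max(residual, 0.0), diesel_cap)
        # Fictitious source closes the balance: positive entries flag unmet
        # demand (undersized plant); negative entries would flag surplus.
        fictitious.append(residual - p_diesel)

    print(fictitious)
    ```

    Nonzero entries in the fictitious-source record are precisely the imbalance powers the abstract describes as the signal used to update the system design.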