WorldWideScience

Sample records for high performance measures

  1. Building and measuring a high performance network architecture

    Kramer, William T.C.; Toole, Timothy; Fisher, Chuck; Dugan, Jon; Wheeler, David; Wing, William R; Nickless, William; Goddard, Gregory; Corbato, Steven; Love, E. Paul; Daspit, Paul; Edwards, Hal; Mercer, Linden; Koester, David; Decina, Basil; Dart, Eli; Reisinger, Paul; Kurihara, Riki; Zekauskas, Matthew J; Plesset, Eric; Wulf, Julie; Luce, Douglas; Rogers, James; Duncan, Rex; Mauth, Jeffery

    2001-04-20

    Once a year, the SC conferences present a unique opportunity to create and build one of the most complex and highest performance networks in the world. At SC2000, large-scale and complex local and wide area networking connections were demonstrated, including large-scale distributed applications running on different architectures. This project was designed to use the unique opportunity presented at SC2000 to create a testbed network environment and then use that network to demonstrate and evaluate high performance computational and communication applications. This testbed was designed to incorporate many interoperable systems and services and was designed for measurement from the very beginning. The end results were key insights into how to use novel, high performance networking technologies and a set of accumulated measurements that give insight into the networks of the future.

  2. Oxide thickness measurement for monitoring fuel performance at high burnup

    Jaeger, M.A.; Van Swam, L.F.P.; Brueck-Neufeld, K.

    1991-01-01

    For on-site monitoring of fuel performance at high burnup, Advanced Nuclear Fuels uses the linear scan eddy current method to determine the oxide thickness of irradiated Zircaloy fuel cans. Direct digital data acquisition methods are employed to collect the data on magnetic storage media. This field-proven methodology allows oxide thickness measurements and rapid interpretation of the data during reactor outages and makes it possible to immediately reinsert the assemblies for the next operating cycle. The accuracy of the poolside measurements and data acquisition/interpretation techniques has been verified through hot cell metallographic measurements of rods previously measured in the fuel pool. The accumulated data provide a valuable database against which oxide growth models have been benchmarked and allow for effective monitoring of fuel performance. (orig.)

  3. Design and experimentally measure a high performance metamaterial filter

    Xu, Ya-wen; Xu, Jing-cheng

    2018-03-01

    The metamaterial filter is a promising kind of optoelectronic device. In this paper, a metal/dielectric/metal (M/D/M) structure metamaterial filter is simulated and measured. Simulated results indicate that the perfect impedance matching condition between the metamaterial filter and free space leads to the transmission band. Measured results show that the proposed metamaterial filter achieves high performance transmission for both TM and TE polarization directions. Moreover, a high transmission rate can still be obtained when the incident angle reaches 45°. Further measured results show that the transmission band can be expanded by optimizing structural parameters. The central frequency of the transmission band can also be adjusted by optimizing structural parameters. The physical mechanism behind the central frequency shift is explained by establishing an equivalent resonant circuit model.
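
    The equivalent-circuit explanation above lends itself to a one-line check: for a series LC resonance the central frequency follows f0 = 1/(2*pi*sqrt(LC)), so reducing the effective capacitance (or inductance) of the M/D/M structure shifts the band upward. A minimal sketch with invented component values, not a model fitted to the measured filter:

```python
import math

def lc_resonant_frequency(L_henry: float, C_farad: float) -> float:
    """Resonant frequency of a series LC circuit: f0 = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L_henry * C_farad))

# Illustrative (not measured) values: shrinking the effective capacitance,
# e.g. by changing the structural parameters, shifts the centre frequency up.
L = 2.0e-12  # assumed effective inductance, 2 pH
for C in (1.0e-15, 0.8e-15, 0.6e-15):  # assumed effective capacitance, farads
    print(f"C = {C:.1e} F -> f0 = {lc_resonant_frequency(L, C) / 1e12:.2f} THz")
```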

  4. High Job Performance Through Co-Developing Performance Measures With Employees

    Groen, Bianca A.C.; Wilderom, Celeste P.M.; Wouters, Marc

    2017-01-01

    According to various studies, employee participation in the development of performance measures can increase job performance. This study focuses on how this job performance elevation occurs. We hypothesize that when employees have participated in the development of performance measures, they

  5. Performance Measurements in a High Throughput Computing Environment

    AUTHOR|(CDS)2145966; Gribaudo, Marco

    The IT infrastructures of companies and research centres are implementing new technologies to satisfy the increasing need of computing resources for big data analysis. In this context, resource profiling plays a crucial role in identifying areas where the improvement of the utilisation efficiency is needed. In order to deal with the profiling and optimisation of computing resources, two complementary approaches can be adopted: the measurement-based approach and the model-based approach. The measurement-based approach gathers and analyses performance metrics by executing benchmark applications on computing resources. Instead, the model-based approach implies the design and implementation of a model as an abstraction of the real system, selecting only those aspects relevant to the study. This Thesis originates from a project carried out by the author within the CERN IT department. CERN is an international scientific laboratory that conducts fundamental research in the domain of elementary particle physics. The p...

  6. High performance gamma measurements of equipment retrieved from Hanford high-level nuclear waste tanks

    Troyer, G.L.

    1997-03-17

    The cleanup of high level defense nuclear waste at the Hanford site presents several progressive challenges. Among these is the removal and disposal of various components from buried active waste tanks to allow new equipment insertion or hazards mitigation. A unique automated retrieval system at the tank provides for retrieval, high pressure washing, inventory measurement, and containment for disposal. Key to the inventory measurement is a three-detector HPGe high performance gamma spectroscopy system capable of recovering data at up to 90% saturation (200,000 counts per second). Data recovery is based on a unique embedded electronic pulser and specialized software to report the inventory. Each of the detectors has different shielding specified through Monte Carlo simulation with the MCNP program. This shielding provides performance over a dynamic range of eight orders of magnitude. System description, calibration issues and operational experiences are discussed.
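
    The pulser-based data recovery mentioned above can be illustrated with a generic rate-loss correction: the fraction of injected pulser events that survive into the recorded spectrum estimates the fraction of true events that survive the dead time and pile-up. A minimal sketch with invented numbers; this is not the specialized software referred to in the report.

```python
def pulser_corrected_rate(measured_counts: float, real_time_s: float,
                          pulses_injected: int, pulser_peak_counts: int) -> float:
    """Correct a measured count rate for dead-time and pile-up losses using an
    electronic pulser: the surviving pulser fraction approximates the surviving
    fraction of true events, so dividing by it restores the true rate."""
    survival_fraction = pulser_peak_counts / pulses_injected
    return (measured_counts / real_time_s) / survival_fraction

# Illustrative numbers only: near 90% saturation roughly one event in ten is
# recorded, so a surviving pulser fraction of 0.10 implies a ten-fold correction.
print(pulser_corrected_rate(measured_counts=2.0e4, real_time_s=1.0,
                            pulses_injected=10_000, pulser_peak_counts=1_000))
```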

  7. High performance gamma measurements of equipment retrieved from Hanford high-level nuclear waste tanks

    Troyer, G.L.

    1997-01-01

    The cleanup of high level defense nuclear waste at the Hanford site presents several progressive challenges. Among these is the removal and disposal of various components from buried active waste tanks to allow new equipment insertion or hazards mitigation. A unique automated retrieval system at the tank provides for retrieval, high pressure washing, inventory measurement, and containment for disposal. Key to the inventory measurement is a three-detector HPGe high performance gamma spectroscopy system capable of recovering data at up to 90% saturation (200,000 counts per second). Data recovery is based on a unique embedded electronic pulser and specialized software to report the inventory. Each of the detectors has different shielding specified through Monte Carlo simulation with the MCNP program. This shielding provides performance over a dynamic range of eight orders of magnitude. System description, calibration issues and operational experiences are discussed.

  8. Measure Guideline: Three High Performance Mineral Fiber Insulation Board Retrofit Solutions

    Neuhauser, Ken [Building Science Corporation, Westford, MA (United States)

    2015-01-01

    This Measure Guideline describes a high performance enclosure retrofit package that uses mineral fiber insulation board. The Measure Guideline describes retrofit assembly and details for wood frame roof and walls and for cast concrete foundations. This Measure Guideline is intended to serve contractors and designers seeking guidance for non-foam exterior insulation retrofit.

  9. Measure Guideline: Ventilation Guidance for Residential High-Performance New Construction - Multifamily

    Lstiburek, Joseph [Building Science Corporation, Westford, MA (United States)

    2017-01-01

    The measure guideline provides ventilation guidance for residential high performance multifamily construction that incorporates the requirements of the ASHRAE 62.2 ventilation and indoor air quality standard. The measure guideline focuses on the decision criteria for weighing cost and performance of various ventilation systems. The measure guideline is intended for contractors, builders, developers, designers and building code officials. The guide may also be helpful to building owners wishing to learn more about ventilation strategies available for their buildings. The measure guideline includes specific design and installation instructions for the most cost-effective and best-performing solutions for ventilation in multifamily units that satisfy the requirements of ASHRAE 62.2-2016.

  10. Measure Guideline: Three High Performance Mineral Fiber Insulation Board Retrofit Solutions

    Neuhauser, K. [Building Science Corporation, Westford, MA (United States)

    2015-01-01

    This Measure Guideline describes a high performance enclosure retrofit package that uses mineral fiber insulation board, and is intended to serve contractors and designers seeking guidance for non-foam exterior insulation retrofit processes. The guideline describes retrofit assembly and details for wood frame roof and walls and for cast concrete foundations.

  11. Optical performance evaluation of a solar furnace by measuring the highly concentrated solar flux

    Lee, Hyunjin; Chai, Kwankyo; Kim, Jongkyu; Lee, Sangnam; Yoon, Hwanki; Yu, Changkyun; Kang, Yongheack

    2014-01-01

    We evaluated the optical performance of a solar furnace at KIER (Korea Institute of Energy Research) by measuring the highly concentrated solar flux with the flux mapping method. We presented and analyzed optical performance in terms of concentrated solar flux distribution and power distribution. We investigated concentration ratio, stagnation temperature, total power, and concentration accuracy with the help of a modeling code based on the ray tracing method, and thereby compared with other solar furnaces. We also discussed flux changes by shutter opening angles and by position adjustment of reflector facets. In the course of flux analysis, we provided a better understanding of reference flux measurement for calibration, reflectivity measurement with a portable reflectometer, shadowing area consideration for effective irradiation, as well as accuracy and repeatability of flux measurements. The results of the present study will help proper utilization of a solar furnace by facilitating comparison between flux measurements at different conditions and flux estimation during operation.

  12. Between-game variation of physical soccer performance measures in highly trained youth soccer players.

    Doncaster, Greg; Unnithan, Viswanath

    2017-07-12

    To assess the between-game variation in measures of physical performance during 11 v 11 soccer match-play, over a short period of time, in highly trained youth soccer players. A single cohort observational study design was employed. Physical match performance data were collected from 17 male, highly trained youth soccer players (age: 13.3 ± 0.4 y) over three, 2 x 20 min, 11 v 11 matches. Using 10 Hz GPS, the variables selected for analyses were total distance (TD), high-speed running (HSR), very high-speed running (VHSR), number of high-speed running efforts (HSReff) and number of very high-speed running efforts (VHSReff). Match data were also separated into cumulative 5 min epochs, to identify the peak 5 min epoch and the mean of the cumulative 5 min epochs for each match. Variability was quantified using the coefficient of variation (CV), standard error of measurement (SEM) and intra-class correlation coefficient (ICC). Between- and within-player smallest worthwhile changes (SWC) were also calculated for each variable to aid in the interpretation of the data. Analysis of the variance between games reported a low CV for TD (3.8%) but larger CVs for HSR (33.3%), HSReff (35.4%) and VHSR and VHSReff (59.6 and 57.4%, respectively). Analysis of 5 min epochs (peak and average) found an increase in the CVs beyond the values reported for the whole match. Between-player SWC in high intensity physical performance data ranged from 24.7-42.4%, whereas within-player SWC ranged from 1.2-79.9%. The between-game variability of high and very high intensity activities in youth soccer players, across three soccer matches over a short period of time (2 weeks), is relatively 'large' and specific to the individual, thus highlighting the need for caution when interpreting physical performance data between games and players.
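
    For readers wanting to reproduce the variability statistics, the coefficient of variation and the smallest worthwhile change can be computed in a few lines; the 0.2 x SD convention for the SWC and the match data below are assumptions for illustration, not values from the study.

```python
import numpy as np

def between_game_cv(values: np.ndarray) -> float:
    """Coefficient of variation (%) of one player's metric across games."""
    return 100.0 * values.std(ddof=1) / values.mean()

def smallest_worthwhile_change(values: np.ndarray, factor: float = 0.2) -> float:
    """SWC taken as a fraction (default 0.2) of the between-game SD,
    expressed as a percentage of the mean."""
    return 100.0 * factor * values.std(ddof=1) / values.mean()

# Hypothetical total distance (m) and high-speed running (m) for one player
# across three matches; the values are invented for illustration.
td = np.array([4100.0, 4250.0, 3980.0])
hsr = np.array([310.0, 450.0, 260.0])

for name, data in (("TD", td), ("HSR", hsr)):
    print(f"{name}: CV = {between_game_cv(data):.1f}%, "
          f"SWC = {smallest_worthwhile_change(data):.1f}%")
```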

  13. A High Performance Sensor for Triaxial Cutting Force Measurement in Turning

    You Zhao

    2015-04-01

    This paper presents a high performance triaxial cutting force sensor with excellent accuracy, favorable natural frequency and acceptable cross-interference for the high speed turning process. An octagonal ring is selected as the sensitive element of the designed sensor, drawing inspiration from ring theory. A novel structure of two mutually perpendicular octagonal rings is proposed and three Wheatstone full bridge circuits are specially organized in order to obtain the triaxial cutting force components and restrain cross-interference. Firstly, the newly developed sensor is tested in static calibration; test results indicate that the sensor possesses outstanding accuracy in the range of 0.38%-0.83%. Secondly, impact modal tests are conducted to identify the natural frequencies of the sensor in the triaxial directions (i.e., 1147 Hz, 1122 Hz and 2035 Hz), which implies that the devised sensor can be used for cutting force measurement in a high speed lathe when the spindle speed does not exceed 17,205 rev/min in continuous cutting conditions. Finally, an application of the sensor in a turning process is presented to show its performance for real-time cutting force measurement; the measured cutting forces demonstrate good accordance with the variation of cutting parameters. Thus, the developed sensor possesses excellent properties and shows great potential for real-time cutting force measurement in turning.
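
    The quoted spindle-speed limit can be sanity-checked against the reported natural frequencies, assuming the common rule of thumb that the excitation (rotation) frequency in continuous cutting should stay below about one quarter of the sensor's natural frequency; the factor of 4 is an assumption, not a value stated in the abstract.

```python
# Rough consistency check (rule-of-thumb margin factor assumed, not from the paper).
natural_frequency_hz = 1147.0      # one of the reported modes
margin_factor = 4.0                # assumed frequency-ratio margin
max_rotation_hz = natural_frequency_hz / margin_factor
print(f"max spindle speed ~ {max_rotation_hz * 60:.0f} rev/min")  # ~17205 rev/min
```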

  14. Acceleration performance of individual European sea bass Dicentrarchus labrax measured with a sprint performance chamber: comparison with high-speed cinematography and correlates with ecological performance.

    Vandamm, Joshua P; Marras, Stefano; Claireaux, Guy; Handelsman, Corey A; Nelson, Jay A

    2012-01-01

    Locomotor performance can influence the ecological and evolutionary success of a species. For fish, favorable outcomes of predator-prey encounters are often presumably due to robust acceleration ability. Although escape-response or "fast-start" studies utilizing high-speed cinematography are prevalent, little is known about the contribution of relative acceleration performance to ecological or evolutionary success in a species. This dearth of knowledge may be due to the time-consuming nature of analyzing film, which imposes a practical limit on sample sizes. Herein, we present a high-throughput potential alternative for measuring fish acceleration performance using a sprint performance chamber (SPC). The acceleration performance of a large number of juvenile European sea bass (Dicentrarchus labrax) from two populations was analyzed. Animals from both hatchery and natural ontogenies were assessed, and animals of known acceleration ability had their ecological performance measured in a mesocosm environment. Individuals from one population also had their acceleration performance assessed by both high-speed cinematography and an SPC. Acceleration performance measured in an SPC was lower than that measured by classical high-speed video techniques. However, short-term repeatability and interindividual variation of acceleration performance were similar between the two techniques, and the SPC recorded higher sprint swimming velocities. Wild fish were quicker to accelerate in an SPC and had significantly greater accelerations than all groups of hatchery-raised fish. Acceleration performance had no significant effect on ecological performance (as assessed through animal growth and survival in the mesocosms). However, it is worth noting that wild animals did survive predation in the mesocosm better than farmed ones. Moreover, the hatchery-originated fish that survived the mesocosm experiment, when no predators were present, displayed significantly increased acceleration

  15. Validation of the high performance leadership competencies as measured by an assessment centre in-basket

    H. H. Spangenberg

    2003-10-01

    The purpose of this study was to validate Schroder's High Performance Leadership Competencies (HPLCs), measured by a specially designed in-basket, against multiple criteria. These consisted of six measures of managerial success, representing managerial advancement and salary progress criteria, and a newly developed comprehensive measure of work unit performance, the Performance Index. An environmental dynamism and complexity questionnaire served as moderator variable. Results indicated disappointing predictive validity quotients for the HPLCs as measured by an in-basket, in contrast to the satisfactory predictive and construct validity obtained in previous studies by means of a full assessment centre. The implications of the findings are discussed and suggestions are made for improving the validity of the in-basket.

  16. A high performance Time-of-Flight detector applied to isochronous mass measurement at CSRe

    Mei Bo; Tu Xiaolin; Wang Meng; Xu Hushan; Mao Ruishi; Hu Zhengguo; Ma Xinwen; Yuan Youjin; Zhang Xueying; Geng Peng; Shuai Peng; Zang Yongdong; Tang Shuwen; Ma Peng; Lu Wan; Yan Xinshuai; Xia Jiawen; Xiao Guoqing; Guo Zhongyan; Zhang Hongbin

    2010-01-01

    A high performance Time-of-Flight detector has been designed and constructed for isochronous mass spectrometry at the experimental Cooler Storage Ring (CSRe). The detector has been successfully used in an experiment to measure the masses of the N∼Z∼33 nuclides near the proton drip-line. Of particular interest is the mass of 65As. A maximum detection efficiency of 70% and a time resolution of 118±8 ps (FWHM) have been achieved in the experiment. The dependence of detection efficiency and signal average pulse height (APH) on atomic number Z has been studied. The potential of APH for Z identification has been discussed.

  17. Airborne and Ground-Based Measurements Using a High-Performance Raman Lidar

    Whiteman, David N.; Rush, Kurt; Rabenhorst, Scott; Welch, Wayne; Cadirola, Martin; McIntire, Gerry; Russo, Felicita; Adam, Mariana; Venable, Demetrius; Connell, Rasheen

    2010-01-01

    A high-performance Raman lidar operating in the UV portion of the spectrum has been used to acquire, for the first time using a single lidar, simultaneous airborne profiles of the water vapor mixing ratio, aerosol backscatter, aerosol extinction, aerosol depolarization and research mode measurements of cloud liquid water, cloud droplet radius, and number density. The Raman Airborne Spectroscopic Lidar (RASL) system was installed in a Beechcraft King Air B200 aircraft and was flown over the mid-Atlantic United States during July-August 2007 at altitudes ranging between 5 and 8 km. During these flights, despite suboptimal laser performance and subaperture use of the telescope, all RASL measurement expectations were met, except that of aerosol extinction. Following the Water Vapor Validation Experiment Satellite/Sondes (WAVES_2007) field campaign in the summer of 2007, RASL was installed in a mobile trailer for ground-based use during the Measurements of Humidity and Validation Experiment (MOHAVE-II) field campaign held during October 2007 at the Jet Propulsion Laboratory's Table Mountain Facility in southern California. This ground-based configuration of the lidar hardware is called Atmospheric Lidar for Validation, Interagency Collaboration and Education (ALVICE). During the MOHAVE-II field campaign, during which only nighttime measurements were made, ALVICE demonstrated significant sensitivity to lower-stratospheric water vapor. Numerical simulation and comparisons with a cryogenic frost-point hygrometer are used to demonstrate that a system with the performance characteristics of RASL/ALVICE should indeed be able to quantify water vapor well into the lower stratosphere with extended averaging from an elevated location like Table Mountain. The same design considerations that optimize Raman lidar for airborne use on a small research aircraft are, therefore, shown to yield significant dividends in the quantification of lower-stratospheric water vapor. The MOHAVE

  18. Measurement of Fast Voltage Transients in High-Performance Nb3Sn Magnets

    Lietzke, A. F.; Sabbi, G. L.; Ferracin, P.; Caspi, S.; Zimmerman, S.; Joseph, J.; Doering, D.; Lizarazo, J.

    2008-06-01

    The Superconducting Magnet group at Lawrence Berkeley National Laboratory has been developing Nb3Sn high-field accelerator magnet technology for the last fifteen years. In order to support the magnet R&D effort, we are developing a diagnostic system that can help identify the causes of performance limiting quenches by recording small flux-changes within the magnet prior to quench-onset. These analysis techniques were applied to the test results from recent Nb3Sn magnets. This paper will examine various types of events and their distinguishing characteristics. The present measurement techniques are discussed along with the design of a new data acquisition system that will substantially improve the quality of the recorded signals.

  19. Burn-up measurements on nuclear reactor fuels using high performance liquid chromatography

    Sivaraman, N.; Subramaniam, S.; Srinivasan, T.G.; Vasudeva Rao, P.R.

    2002-01-01

    Burn-up measurements on thermal as well as fast reactor fuels were carried out using high performance liquid chromatography (HPLC). A column chromatographic technique using di-(2-ethylhexyl) phosphoric acid (HDEHP) coated column was employed for the isolation of lanthanides from uranium, plutonium and other fission products. Ion-pair HPLC was used for the separation of individual lanthanides. The atom percent fissions were calculated from the concentrations of the lanthanide (neodymium in the case of thermal reactor and lanthanum for the fast reactor fuels) and from uranium and plutonium contents of the dissolver solutions. The HPLC method was also used for determining the fractional fissions from uranium and plutonium for the thermal reactor fuel. (author)
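
    The atom percent fission calculation from a lanthanide monitor follows the usual fission-monitor relation (fissions = monitor atoms divided by the effective fission yield; burn-up = fissions relative to fissions plus residual heavy atoms). The sketch below states that relation generically with invented numbers; the yields and corrections used in the actual analysis are not reproduced here.

```python
def burnup_atom_percent(n_monitor: float, fission_yield: float,
                        n_heavy_residual: float) -> float:
    """Atom-percent fissions from a stable fission-monitor nuclide (e.g. a
    lanthanide such as neodymium): fissions = N_monitor / Y_monitor, and
    burn-up = 100 * fissions / (residual heavy atoms + fissions).
    All quantities per unit mass of fuel, in consistent units."""
    fissions = n_monitor / fission_yield
    return 100.0 * fissions / (n_heavy_residual + fissions)

# Illustrative numbers only (not taken from the paper):
print(burnup_atom_percent(n_monitor=1.7e18, fission_yield=0.017,
                          n_heavy_residual=2.4e21))  # ~4 atom %
```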

  20. Ultrasonic attenuation measurements at very high SNR: Correlation, information theory and performance

    Challis, Richard; Ivchenko, Vladimir; Al-Lashi, Raied

    2013-01-01

    This paper describes a system for ultrasonic wave attenuation measurements which is based on pseudo-random binary codes as transmission signals combined with on-the-fly correlation for received signal detection. The apparatus can receive signals in the nanovolt range against a noise background in the order of hundreds of microvolts and an analogue to digital converter (ADC) bit-step also in the order of hundreds of microvolts. Very high signal to noise ratios (SNRs) are achieved without recourse to coherent averaging, with its associated requirement for long sampling times. The system works by a process of dithering, in which very low amplitude received signals enter the dynamic range of the ADC by 'riding' on electronic noise at the system input. The amplitude of this 'useful noise' has to be chosen with care for an optimised design. The process of optimisation is explained on the basis of classical information theory and is achieved through a simple noise model. The performance of the system is examined for different transmitted code lengths and gain settings in the receiver chain. Experimental results are shown to verify the expected operation when the system is applied to a very highly attenuating material, an aerated slurry.
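
    The detection principle, a weak echo recovered by correlating the received record against the transmitted code while input noise dithers it across the ADC bit-steps, can be reproduced with a small simulation. The sketch below is a toy illustration (a random ±1 code, white Gaussian noise and an idealised quantiser), not the authors' apparatus, code set or noise model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Transmit a binary +/-1 code (the paper uses proper pseudo-random binary codes).
code_length = 4095
code = rng.choice([-1.0, 1.0], size=code_length)

# Received record: a strongly attenuated, delayed echo buried in electronic noise.
received = np.zeros(3 * code_length)
delay = 1500
echo_amplitude = 0.03                 # well below the noise level and the bit-step
received[delay:delay + code_length] += echo_amplitude * code
received += 0.2 * rng.standard_normal(received.size)   # the "useful" dither noise

# Coarse ADC: alone, the echo would vanish below one bit-step, but the noise
# spreads it across quantisation levels so correlation can still recover it.
bit_step = 0.1
quantised = np.round(received / bit_step) * bit_step

correlation = np.correlate(quantised, code, mode="valid")
print("estimated delay:", int(np.argmax(np.abs(correlation))))   # expected ~1500
```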

  1. On the Performance of the Measure for Diagnosing Multiple High Leverage Collinearity-Reducing Observations

    Arezoo Bagheri

    2012-01-01

    There is strong evidence indicating that the existing measures which are designed to detect a single high leverage collinearity-reducing observation are not effective in the presence of multiple high leverage collinearity-reducing observations. In this paper, we propose a cutoff point for a newly developed high leverage collinearity-influential measure and for two existing measures to identify high leverage collinearity-reducing observations, the high leverage points which hide multicollinearity in a data set. It is important to detect these observations as they are responsible for misleading inferences about the fitting of the regression model. The merit of our proposed measure and cutoff point in detecting high leverage collinearity-reducing observations is investigated by using engineering data and Monte Carlo simulations.
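
    High leverage screening is conventionally based on the diagonal of the hat matrix; a minimal sketch is given below, using the common 2p/n rule of thumb as an assumed cutoff (not the cutoff proposed in the paper) and a toy dataset in which two points break an otherwise near-collinear pair of predictors, i.e. collinearity-reducing observations.

```python
import numpy as np

def leverages(X: np.ndarray) -> np.ndarray:
    """Diagonal of the hat matrix H = X (X'X)^-1 X' for a design matrix X
    (intercept column included); h_ii is the leverage of case i."""
    XtX_inv = np.linalg.inv(X.T @ X)
    return np.einsum("ij,jk,ik->i", X, XtX_inv, X)

rng = np.random.default_rng(1)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)      # nearly collinear predictors
x1[:2] += 8.0                            # two high leverage points that
x2[:2] -= 8.0                            # weaken the collinearity
X = np.column_stack([np.ones(n), x1, x2])

h = leverages(X)
cutoff = 2 * X.shape[1] / n              # the usual 2p/n rule of thumb (assumed)
print("flagged cases:", np.where(h > cutoff)[0])
```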

  2. Measurement of the Rheological Properties of High Performance Concrete: State of the Art Report

    Ferraris, Chiara F.

    1999-01-01

    The rheological or flow properties of concrete in general and of high performance concrete (HPC) in particular, are important because many factors such as ease of placement, consolidation, durability, and strength depend on the flow properties. Concrete that is not properly consolidated may have defects, such as honeycombs, air voids, and aggregate segregation. Such an important performance attribute has triggered the design of numerous test methods. Generally, the flow behavior of concrete approximates that of a Bingham fluid. Therefore, at least two parameters, yield stress and viscosity, are necessary to characterize the flow. Nevertheless, most methods measure only one parameter. Predictions of the flow properties of concrete from its composition or from the properties of its components are not easy. No general model exists, although some attempts have been made. This paper gives an overview of the flow properties of a fluid or a suspension, followed by a critical review of the most commonly used concrete rheology tests. Particular attention is given to tests that could be used for HPC. Tentative definitions of terms such as workability, consistency, and rheological parameters are provided. An overview of the most promising tests and models for cement paste is given.
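
    Since the Bingham description underlies most of the tests reviewed, a worked example of extracting the two parameters may help: a straight-line fit of shear stress against shear rate gives the plastic viscosity (slope) and the yield stress (intercept). The rheometer readings below are invented for illustration, not measured concrete data.

```python
import numpy as np

# Bingham model: tau = tau_0 + mu_p * gamma_dot (yield stress + plastic viscosity).
shear_rate = np.array([5.0, 10.0, 20.0, 40.0, 60.0])          # 1/s, illustrative
shear_stress = np.array([210.0, 260.0, 340.0, 530.0, 700.0])  # Pa, illustrative

mu_plastic, tau_yield = np.polyfit(shear_rate, shear_stress, 1)
print(f"yield stress ~ {tau_yield:.0f} Pa, plastic viscosity ~ {mu_plastic:.1f} Pa.s")
```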

  3. Current measurement in high-performance frequency converters; Strommessung in Hochleistungsumrichtern

    Marien, Jan; Hetzler, Ullrich [Isabellenhuette Heusler GmbH und Co. KG, Dillenburg (Germany); Hornung, Hans-Georg; Zwinger, Stefan [Sensor-Technik Wiedemann GmbH, Kaufbeuren (Germany)

    2011-04-15

    The load cycles (raising, lowering, accelerating, braking) of cranes, lift trucks and other off-road vehicles are ideally suited for the efficient deployment of hybrid or fully electrical drive technology. Current measurement is a key technology for advancing electrification. In its frequency converters, Sensor-Technik Wiedemann relies on a shunt-based current measurement module from Isabellenhuette Heusler which permits highly accurate measurements. (orig.)

  4. Measurement and simulation of the performance of high energy physics data grids

    Crosby, Paul Andrew

    This thesis describes a study of resource brokering in a computational Grid for high energy physics. Such systems are being devised in order to manage the unprecedented workload of the next generation particle physics experiments such as those at the Large Hadron Collider. A simulation of the European Data Grid has been constructed, and calibrated using logging data from a real Grid testbed. This model is then used to explore the Grid's middleware configuration, and suggest improvements to its scheduling policy. The expansion of the simulation to include data analysis of the type conducted by particle physicists is then described. A variety of job and data management policies are explored, in order to determine how well they meet the needs of physicists, as well as how efficiently they make use of CPU and network resources. Appropriate performance indicators are introduced in order to measure how well jobs and resources are managed from different perspectives. The effects of inefficiencies in Grid middleware are explored, as are methods of compensating for them. It is demonstrated that a scheduling algorithm should alter its weighting on load balancing and data distribution, depending on whether data transfer or CPU requirements dominate, and also on the level of job loading. It is also shown that an economic model for data management and replication can improve the efficiency of network use and job processing.
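
    The thesis' conclusion about shifting the scheduler's weighting between load balancing and data distribution can be illustrated with a toy site-ranking function; the site names, cost model and numbers below are invented and do not represent the European Data Grid resource broker.

```python
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    queue_length: int        # pending jobs, a proxy for CPU load
    data_staging_s: float    # estimated time to stage the job's input data

def rank_sites(sites, cpu_weight: float, data_weight: float):
    """Rank candidate sites by a weighted cost; the two weights are the knob
    that a broker could shift as the job mix becomes CPU- or data-dominated."""
    return sorted(sites, key=lambda s: cpu_weight * s.queue_length
                                       + data_weight * s.data_staging_s)

sites = [Site("SiteA", 12, 30.0), Site("SiteB", 40, 2.0), Site("SiteC", 5, 300.0)]
print([s.name for s in rank_sites(sites, cpu_weight=1.0, data_weight=0.1)])  # CPU-bound mix
print([s.name for s in rank_sites(sites, cpu_weight=0.1, data_weight=1.0)])  # data-bound mix
```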

  5. AGE DIFFERENCES IN MEASURES OF FUNCTIONAL MOVEMENT AND PERFORMANCE IN HIGHLY TRAINED YOUTH BASKETBALL PLAYERS.

    Gonzalo-Skok, Oliver; Serna, Jorge; Rhea, Matthew R; Marín, Pedro J

    2017-10-01

    There is a lack of information about the influence of age on functional movement tests (FMT) and performance tests, as well as on their relationships, in young basketball players. The purpose of the present study was to determine the variations in FMT and jump and/or sprint performance scores between age groups (U-14 vs. U-16) in highly trained young basketball players. The second purpose was to investigate the relationship between FMT for the lower body and jump and/or sprint performance in highly trained young (U-14 and U-16) male basketball players. Descriptive study. Thirty elite young (U-14 to U-16) male basketball players performed several FMT (weight-bearing dorsiflexion test [WB-DF] and a modified Star Excursion Balance Test [SEBT]) and performance tests including unilateral and bilateral countermovement jumps, unilateral horizontal jumping and linear sprinting. All anthropometric and performance tests showed a statistically significant advantage for the older group, except the horizontal jump with the left leg (p=0.127). Five out of the eight FMT performed, as well as the jump and/or sprint performance tests, showed statistically significant differences between age groups (U-16 vs. U-14). The findings of this study support the idea that the age of the player should be considered when interpreting FMT scores, which could have implications when implementing the FMT for injury risk prediction. Level of evidence: 2b.

  6. Developing Effective Performance Measures

    2014-10-14

    Fragmentary slide text from a Carnegie Mellon University presentation (Kasunic, October 14, 2014) on when performance measurement goes bad, listing pitfalls such as laziness, vanity, narcissism, too many measures, pettiness and inanity; narcissism is described as measuring performance from the organization's point of view, rather than from ...

  7. Performance Effects of Measurement and Analysis: Perspectives from CMMI High Maturity Organizations and Appraisers

    2010-06-01

    Fragmentary excerpts from the report (CMU/SEI-2010-TR), including survey and interview remarks on process performance, prediction intervals and measurement practices (e.g., "In nine of the 15 cases we had to completely rework their measurement ...").

  8. An optimized method for the measurement of acetaldehyde by high-performance liquid chromatography.

    Guan, Xiangying; Rubin, Emanuel; Anni, Helen

    2012-03-01

    Acetaldehyde is produced during ethanol metabolism predominantly in the liver by alcohol dehydrogenase and rapidly eliminated by oxidation to acetate via aldehyde dehydrogenase. Assessment of circulating acetaldehyde levels in biological matrices is performed by headspace gas chromatography and reverse phase high-performance liquid chromatography (RP-HPLC). We have developed an optimized method for the measurement of acetaldehyde by RP-HPLC in hepatoma cell culture medium, blood, and plasma. After sample deproteinization, acetaldehyde was derivatized with 2,4-dinitrophenylhydrazine (DNPH). The reaction was optimized for pH, amount of derivatization reagent, time, and temperature. Extraction methods for the stable acetaldehyde-hydrazone (AcH-DNP) derivative and product stability studies were carried out. Acetaldehyde was identified by its retention time in comparison with an AcH-DNP standard, using a new chromatography gradient program, and quantitated based on external reference standards and standard addition calibration curves in the presence and absence of ethanol. Derivatization of acetaldehyde was performed at pH 4.0 with an 80-fold molar excess of DNPH. The reaction was completed in 40 minutes at ambient temperature, and the product was stable for 2 days. A clear separation of AcH-DNP from DNPH was obtained with a new 11-minute chromatography program. Acetaldehyde detection was linear up to 80 μM. The recovery of acetaldehyde was >88% in culture media and >78% in plasma. We quantitatively determined the ethanol-derived acetaldehyde in hepatoma cells, rat blood and plasma with a detection limit around 3 μM. The accuracy of the method was confirmed with small-volume (70 μl) plasma sampling. An optimized method for the quantitative determination of acetaldehyde in biological systems was developed using derivatization with DNPH, followed by a short RP-HPLC separation of AcH-DNP. The method has an extended linear range, and is reproducible and applicable to small-volume sampling of culture
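
    Quantitation against standard-addition calibration curves, as mentioned above, reduces to a linear regression and an x-axis intercept. The sketch below shows only that arithmetic, with invented peak areas; it is not the authors' validated procedure.

```python
import numpy as np

def standard_addition_concentration(added_uM: np.ndarray, peak_area: np.ndarray) -> float:
    """Analyte concentration from a standard-addition series: regress peak area
    on added concentration; the unknown equals the magnitude of the x-intercept,
    i.e. intercept / slope for a positive calibration slope."""
    slope, intercept = np.polyfit(added_uM, peak_area, 1)
    return intercept / slope

# Invented standard-addition series for an acetaldehyde-DNPH peak.
added = np.array([0.0, 10.0, 20.0, 40.0])            # microM spiked acetaldehyde
area = np.array([1520.0, 2490.0, 3530.0, 5490.0])    # arbitrary detector units

print(f"estimated acetaldehyde ~ {standard_addition_concentration(added, area):.1f} uM")
```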

  9. Development of twin Ge detector for high energy photon measurement and its performance

    Shigetome, Yoshiaki; Harada, Hideo [Power Reactor and Nuclear Fuel Development Corp., Tokai, Ibaraki (Japan). Tokai Works

    1998-03-01

    A prototype twin HPGe detector composed of two large HPGe crystals was developed to obtain better detection efficiency (ε) and P/T ratio, which are required for high energy photon spectroscopy. In this work, the performance of the twin HPGe detector was evaluated by computer simulation employing the EGS4 code. (author)

  10. Performance Measurement und Environmental Performance Measurement

    Sturm, Anke

    2000-01-01

    The aim of this dissertation is to develop a systematic procedure, a controlling model, for company-internal environmental performance measurement. The Environmental Performance Measurement (EPM) model developed here comprises five stages: defining the objectives of the environmental performance measurement (stage 1), recording the environmental impacts according to an ecological splitting of results (stage 2), evaluating the environmental impacts on the basis of quality-target-related...

  11. Performance of high-resolution X-band radar for rainfall measurement in The Netherlands

    C. Z. van de Beek

    2010-02-01

    This study presents an analysis of 195 rainfall events gathered with the X-band weather radar SOLIDAR and a tipping bucket rain gauge network near Delft, The Netherlands, between May 1993 and April 1994. The aim of this paper is to present a thorough analysis of a climatological dataset using a high spatial (120 m) and temporal (16 s) resolution X-band radar. This makes it a study of the potential for high-resolution rainfall measurements with non-polarimetric X-band radar over flat terrain. An appropriate radar reflectivity – rain rate relation is derived from measurements of raindrop size distributions and compared with radar – rain gauge data. The radar calibration is assessed using a long-term comparison of rain gauge measurements with corresponding radar reflectivities as well as by analyzing the evolution of the stability of ground clutter areas over time. Three different methods for ground clutter correction as well as the effectiveness of forward and backward attenuation correction algorithms have been studied. Five individual rainfall events are discussed in detail to illustrate the strengths and weaknesses of high-resolution X-band radar and the effectiveness of the presented correction methods. X-band radar is found to be able to measure the space-time variation of rainfall at high resolution, far greater than what can be achieved by rain gauge networks or a typical operational C-band weather radar. On the other hand, SOLIDAR can suffer from receiver saturation, wet radome attenuation as well as signal loss along the path. During very strong convective situations the signal can even be lost completely. In combination with several rain gauges for quality control, high resolution X-band radar is considered to be suitable for rainfall monitoring over relatively small (urban) catchments. These results offer great prospects for the new high resolution polarimetric Doppler X-band radar IDRA.
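
    The reflectivity - rain rate step referred to above is a power law Z = aR^b. The snippet below applies such a relation with commonly quoted Marshall-Palmer-type coefficients as placeholders; the paper derives its own relation from measured drop size distributions, which is not reproduced here.

```python
import numpy as np

def rain_rate_from_dbz(dbz: np.ndarray, a: float = 200.0, b: float = 1.6) -> np.ndarray:
    """Convert radar reflectivity (dBZ) to rain rate R (mm/h) via Z = a * R**b.
    The default coefficients are generic placeholders, not the relation
    derived in the paper."""
    z_linear = 10.0 ** (dbz / 10.0)       # reflectivity factor in mm^6 m^-3
    return (z_linear / a) ** (1.0 / b)

print(rain_rate_from_dbz(np.array([20.0, 35.0, 50.0])))   # light, moderate, heavy rain
```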

  12. Measurement of natural radioactivity: Calibration and performance of a high-resolution gamma spectrometry facility

    Murray, A. S.; Helsted, L. M.; Autzen, M.

    2018-01-01

    Murray et al. (2015) described an international inter-comparison of dose rate measurements undertaken using a homogenised beach ridge sand from Jutland, Denmark. The measured concentrations for 226Ra, 232Th and 40K from different laboratories varied considerably, with relative standard deviations of 26% (n=8), 59% (n=23) and 15% (n=23), respectively. In contrast, the relative standard deviations observed internally within our laboratory were 9%, 11% and 7%, respectively (n=20), and in addition our mean values were consistent with the global 40K mean, but significantly different from the 232Th mean. These problems in both accuracy and precision have led us to examine both the long term performance of our analytical facility, and its calibration. Our approach to the preparation of new absolute 238U, 232Th and 40K standards is outlined and tested against international standards. We also report...

  13. Theoretical repeatability assessment without repetitive measurements in gradient high-performance liquid chromatography.

    Kotani, Akira; Tsutsumi, Risa; Shoji, Asaki; Hayashi, Yuzuru; Kusu, Fumiyo; Yamamoto, Kazuhiro; Hakamata, Hideki

    2016-07-08

    This paper puts forward a time and material-saving method for evaluating the repeatability of area measurements in gradient HPLC with UV detection (HPLC-UV), based on the function of mutual information (FUMI) theory which can theoretically provide the measurement standard deviation (SD) and detection limits through the stochastic properties of baseline noise with no recourse to repetitive measurements of real samples. The chromatographic determination of terbinafine hydrochloride and enalapril maleate is taken as an example. The best choice of the number of noise data points, inevitable for the theoretical evaluation, is shown to be 512 data points (10.24s at 50 point/s sampling rate of an A/D converter). Coupled with the relative SD (RSD) of sample injection variability in the instrument used, the theoretical evaluation is proved to give identical values of area measurement RSDs to those estimated by the usual repetitive method (n=6) over a wide concentration range of the analytes within the 95% confidence intervals of the latter RSD. The FUMI theory is not a statistical one, but the "statistical" reliability of its SD estimates (n=1) is observed to be as high as that attained by thirty-one measurements of the same samples (n=31). Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Effects of combined high intensity arm and leg training on performance and cardio-respiratory measures.

    Zinner, Christoph; Sperlich, Billy; Born, Dennis-Peter; Michels, Guido

    2017-01-01

    The purpose of this study was to investigate the effects of combined arm and leg high-intensity low-volume interval training (HIITarm+leg) on maximal oxygen uptake, myocardial measures (i.e. stroke volume, cardiac output, ejection fraction), the Tissue Oxygenation Index (TOI) of the vastus lateralis and triceps brachii, as well as power output, in comparison to leg-only HIIT (HIITleg). The 20 healthy, male and female volunteers completed six sessions of either HIITleg on a cycle ergometer or HIITarm+leg on an arm and leg cycle ergometer. During pre- and post-testing, the volunteers completed a submaximal and incremental test to exhaustion on a cycle ergometer. Magnitude-based inference revealed likely to very likely beneficial effects for HIITarm+leg compared to HIITleg in maximal oxygen uptake, cardiac measures as well as peak power output. The TOI following HIITarm+leg demonstrated likely to very likely increased oxygenation in the triceps brachii or the vastus lateralis when compared to HIITleg. The results suggest that six sessions of HIITarm+leg may likely to very likely improve maximal oxygen uptake and some inotropy-related cardiac measures, with improved tissue oxygenation of the triceps brachii and vastus lateralis muscles resulting in greater leg peak power output.

  15. Measurement of food flavonoids by high-performance liquid chromatography: A review.

    Merken, H M; Beecher, G R

    2000-03-01

    The flavonoids are plant polyphenols found frequently in fruits, vegetables, and grains. Divided into several subclasses, they include the anthocyanidins, pigments chiefly responsible for the red and blue colors in fruits, fruit juices, wines, and flowers; the catechins, concentrated in tea; the flavanones and flavanone glycosides, found in citrus and honey; and the flavones, flavonols, and flavonol glycosides, found in tea, fruits, vegetables, and honey. Known for their hydrogen-donating antioxidant activity as well as their ability to complex divalent transition metal cations, flavonoids are propitious to human health. Computer-controlled high-performance liquid chromatography (HPLC) has become the analytical method of choice. Many systems have been developed for the detection and quantification of flavonoids across one, two, or three subclasses. A summary of the various HPLC and sample preparation methods that have been employed to quantify individual flavonoids within a subclass or across several subclasses are tabulated in this review.

  16. Performance Measurement at Universities

    Lueg, Klarissa

    2014-01-01

    This paper proposes empirical approaches to testing the reliability, validity, and organizational effectiveness of student evaluations of teaching (SET) as a performance measurement instrument in knowledge management at the institutional level of universities. Departing from Weber’s concept...

  17. Sensitivity of monthly heart rate and psychometric measures for monitoring physical performance in highly trained young handball players.

    Buchheit, M

    2015-05-01

    The aim of the present study was to examine whether monthly resting heart rate (HR), HR variability (HRV) and psychometric measures can be used to monitor changes in physical performance in highly trained adolescent handball players. Data were collected from 37 adolescent players (training 10±2.1 h/wk) on 11 occasions from September to May during the in-season period, and included an estimation of training status (resting HR and HRV, the profile of mood state (POMS) questionnaire) and 3 physical performance tests (a 10-m sprint, a counter movement jump and a graded aerobic intermittent test, the 30-15 Intermittent Fitness Test). The sensitivity of HR and psychometric measures to changes in physical performance was poor, irrespective of the markers and the performance measures. Finally, the differences in physical performance between players with better vs. worse estimated training status were all almost certainly trivial. The present results highlight the limitation of monthly measures of resting HR, HRV and perceived mood and fatigue for predicting in-season changes in physical performance in highly trained adolescent handball players. This suggests that more frequent monitoring might be required, and/or that other markers might need to be considered. © Georg Thieme Verlag KG Stuttgart · New York.

  18. High performance conductometry

    Saha, B.

    2000-01-01

    Inexpensive but high performance systems have emerged progressively for basic and applied measurements in physical and analytical chemistry on one hand, and for on-line monitoring and leak detection in plants and facilities on the other. Salient features of the developments will be presented with specific examples

  19. The performance measurement manifesto.

    Eccles, R G

    1991-01-01

    The leading indicators of business performance cannot be found in financial data alone. Quality, customer satisfaction, innovation, market share--metrics like these often reflect a company's economic condition and growth prospects better than its reported earnings do. Depending on an accounting department to reveal a company's future will leave it hopelessly mired in the past. More and more managers are changing their company's performance measurement systems to track nonfinancial measures and reinforce new competitive strategies. Five activities are essential: developing an information architecture; putting the technology in place to support this architecture; aligning bonuses and other incentives with the new system; drawing on outside resources; and designing an internal process to ensure the other four activities occur. New technologies and more sophisticated databases have made the change to nonfinancial performance measurement systems possible and economically feasible. Industry and trade associations, consulting firms, and public accounting firms that already have well-developed methods for assessing market share and other performance metrics can add to the revolution's momentum--as well as profit from the business opportunities it presents. Every company will have its own key measures and distinctive process for implementing the change. But making it happen will always require careful preparation, perseverance, and the conviction of the CEO that it must be carried through. When one leading company can demonstrate the long-term advantage of its superior performance on quality or innovation or any other nonfinancial measure, it will change the rules for all its rivals forever.

  20. Enterprise performance measurement systems

    Milija Bogavac

    2014-10-01

    Performance measurement systems are an extremely important part of control and management actions, because in this way a company can determine its business potential, its market power, and its potential and current level of business efficiency. The significance of measurement lies in relating the results of reproduction (total volume of production, value of production, total revenue and profit) to the investments made to achieve these results (spending on factors of production and hiring of capital), in order to achieve the highest possible quality of the economy. (The relationship between the results of reproduction and the investment to achieve them quantitatively determines economic success as the quality of the economy.) Measuring performance allows the identification of the economic resources the company has, so looking at the key factors that affect its performance can help to determine the appropriate course of action.

  1. Productivity and Performance Measurement

    Hald, Kim Sundtoft; Spring, Martin

    This study explores conceptually how performance measurement, as discussed in the literature, enables or constrains the ability to manage and improve productivity. It uses an inter-disciplinary literature review to identify five areas of concern relating productivity accounting to the ability to improve productivity: “productivity representation”; “productivity incentives”; “productivity intervention”; “productivity trade-off or synergy”; and “productivity strategy and context”. The paper discusses these areas of concern and expands our knowledge of how productivity and performance measurement...

  2. High performance management: An illustrative example of sales departments’ productivity measurement

    Halkos, George E.; Tzeremes, Nickolaos G

    2009-01-01

    This paper describes a conceptual approach to measure and compare productivity of resource utilization at the firm level, adapting a set of techniques known as Data Envelopment Analysis (DEA). Within this approach, the paper addresses the issues of multiple inputs and multiple outputs of the sales departments of a firm. In particular, we focus on the resource management of sales departments. The proposed measurement methodology will allow assessment of the impact of different management polic...
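
    A minimal input-oriented, constant-returns-to-scale DEA model (the multiplier form) can be solved as a linear program; the sketch below uses hypothetical sales departments with two inputs and one output, and is not necessarily the DEA variant or data used in the paper.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X: np.ndarray, Y: np.ndarray, o: int) -> float:
    """Input-oriented CCR efficiency of unit o (multiplier form): maximise the
    weighted outputs of unit o with its weighted inputs fixed to 1, while no
    unit may score above 1.  X: (units, inputs), Y: (units, outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])                   # maximise u . y_o
    A_ub = np.hstack([Y, -X])                                  # u.y_j - v.x_j <= 0
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)  # v . x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun

# Hypothetical sales departments: inputs = (staff, budget), output = (sales).
X = np.array([[10.0, 50.0], [8.0, 45.0], [12.0, 40.0]])
Y = np.array([[100.0], [95.0], [90.0]])
print([round(ccr_efficiency(X, Y, o), 3) for o in range(len(X))])  # e.g. [0.936, 1.0, 1.0]
```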

  3. Measuring Firm Performance

    Assaf, A. George; Josiassen, Alexander; Gillen, David

    2014-01-01

    Set in the airport industry, this paper measures firm performance using both desirable and bad outputs (i.e. airport delays). We first estimate a model that does not include the bad outputs and then a model that includes bad outputs. The results show important differences in the efficiency...

  4. Benchmarking and Performance Measurement.

    Town, J. Stephen

    This paper defines benchmarking and its relationship to quality management, describes a project which applied the technique in a library context, and explores the relationship between performance measurement and benchmarking. Numerous benchmarking methods contain similar elements: deciding what to benchmark; identifying partners; gathering…

  5. Quantitative measurement of vocal fold vibration in male radio performers and healthy controls using high-speed videoendoscopy.

    Samantha Warhurst

    Acoustic and perceptual studies show a number of differences between the voices of radio performers and controls. Despite this, the vocal fold kinematics underlying these differences are largely unknown. Using high-speed videoendoscopy, this study sought to determine whether the vocal fold vibration features of radio performers differed from those of non-performing controls. Recordings of a mid-phonatory /i/ in 16 male radio performers (aged 25-52 years) and 16 age-matched controls (aged 25-52 years) were collected with high-speed videoendoscopy. Videos were extracted and analysed semi-automatically using the High-Speed Video Program, obtaining measures of fundamental frequency (f0), open quotient and speed quotient. Post-hoc analyses of sound pressure level (SPL) were also performed (n = 19). Pearson's correlations were calculated between SPL and both speed and open quotients. Male radio performers had a significantly higher speed quotient than their matched controls (t = 3.308, p = 0.005). No significant differences were found for f0 or open quotient. No significant correlation was found between either open or speed quotient and SPL. A higher speed quotient in male radio performers suggests that their vocal fold vibration was characterised by a higher ratio of glottal opening to closing times than controls. This result may explain findings of better voice quality, higher equivalent sound level and greater spectral tilt seen in previous research. Open quotient was not significantly different between groups, indicating that the durations of complete vocal fold closure were not different between the radio performers and controls. Further validation of these results is required to determine the aetiology of the higher speed quotient result and its implications for voice training and clinical management in performers.
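
    The two vibratory quotients reported above are simple ratios taken from the glottal-area waveform of each cycle: open quotient = open time / cycle period, speed quotient = opening time / closing time. The sketch below computes them from a synthetic, idealised cycle; it is not the High-Speed Video Program algorithm.

```python
import numpy as np

def open_and_speed_quotients(glottal_area: np.ndarray):
    """Quotients for one glottal-area cycle (samples spanning one period):
    open quotient  = open time / cycle period,
    speed quotient = opening time (rise to peak) / closing time (fall back)."""
    baseline = glottal_area.min()
    is_open = glottal_area > baseline
    peak = int(np.argmax(glottal_area))
    first_open = int(np.argmax(is_open))
    last_open = len(glottal_area) - 1 - int(np.argmax(is_open[::-1]))
    open_quotient = is_open.sum() / len(glottal_area)
    speed_quotient = (peak - first_open) / max(last_open - peak, 1)
    return open_quotient, speed_quotient

# Synthetic, idealised cycle: closed phase, fast opening, slower closing.
cycle = np.concatenate([np.zeros(40),               # closed phase
                        np.linspace(0.0, 1.0, 20),  # opening
                        np.linspace(1.0, 0.0, 40)]) # closing
oq, sq = open_and_speed_quotients(cycle)
print(f"open quotient ~ {oq:.2f}, speed quotient ~ {sq:.2f}")
```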

  6. Comparative performance measures of relational and object-oriented databases using High Energy Physics data

    Marstaller, J.

    1993-12-01

    The major experiments at the SSC are expected to produce up to 1 Petabyte of data per year. The use of database techniques can significantly reduce the time it takes to access data. The goal of this project was to test which underlying data model, the relational or the object-oriented, would be better suited for archiving and accessing high energy physics data. We describe the relational and the object-oriented data model and their implementation in commercial database management systems. To determine scalability we tested both implementations for 10-MB and 100-MB databases using storage and timing criteria.

  7. Real-time measurements of temperature, pressure and moisture profiles in High-Performance Concrete exposed to high temperatures during neutron radiography imaging

    Toropovs, N., E-mail: nikolajs.toropovs@rtu.lv [Empa, Swiss Federal Laboratories for Materials Science and Technology, Dübendorf (Switzerland); Riga Technical University, Institute of Materials and Structures, Riga (Latvia); Lo Monte, F. [Politecnico di Milano, Department of Civil and Environmental Engineering, Milan (Italy); Wyrzykowski, M. [Empa, Swiss Federal Laboratories for Materials Science and Technology, Dübendorf (Switzerland); Lodz University of Technology, Department of Building Physics and Building Materials, Lodz (Poland); Weber, B. [Empa, Swiss Federal Laboratories for Materials Science and Technology, Dübendorf (Switzerland); Sahmenko, G. [Riga Technical University, Institute of Materials and Structures, Riga (Latvia); Vontobel, P. [Paul Scherrer Institute, Laboratory for Neutron Scattering and Imaging, Villigen (Switzerland); Felicetti, R. [Politecnico di Milano, Department of Civil and Environmental Engineering, Milan (Italy); Lura, P. [Empa, Swiss Federal Laboratories for Materials Science and Technology, Dübendorf (Switzerland); ETH Zürich, Institute for Building Materials (IfB), Zürich (Switzerland)

    2015-02-15

    High-Performance Concrete (HPC) is particularly prone to explosive spalling when exposed to high temperature. Although the exact causes that lead to spalling are still being debated, moisture transport during heating plays an important role in all proposed mechanisms. In this study, slabs made of high-performance, low water-to-binder ratio mortars with addition of superabsorbent polymers (SAP) and polypropylene fibers (PP) were heated from one side on a temperature-controlled plate up to 550 °C. A combination of measurements was performed simultaneously on the same sample: moisture profiles via neutron radiography, temperature profiles with embedded thermocouples and pore pressure evolution with embedded pressure sensors. Spalling occurred in the sample with SAP, where sharp profiles of moisture and temperature were observed. No spalling occurred when PP-fibers were introduced in addition to SAP. The experimental procedure described here is essential for developing and verifying numerical models and studying measures against fire spalling risk in HPC.

  8. SU-E-I-40: New Method for Measurement of Task-Specific, High-Resolution Detector System Performance

    Loughran, B; Singh, V; Jain, A; Bednarek, D; Rudin, S [University at Buffalo, Buffalo, NY (United States)]

    2014-06-01

    Purpose: Although generalized linear system analytic metrics such as the GMTF and GDQE can evaluate performance of the whole imaging system including detector, scatter and focal-spot, a simplified task-specific measured metric may help to better compare detector systems. Methods: Low quantum-noise images of a neuro-vascular stent with a modified ANSI head phantom were obtained from the average of many exposures taken with the high-resolution Micro-Angiographic Fluoroscope (MAF) and with a Flat Panel Detector (FPD). The square of the Fourier Transform of each averaged image, equivalent to the measured product of the system GMTF and the object function in spatial-frequency space, was then divided by the normalized noise power spectra (NNPS) for each respective system to obtain a task-specific generalized signal-to-noise ratio. A generalized measured relative object detectability (GM-ROD) was obtained by taking the ratio of the integral of the resulting expressions for each detector system to give an overall metric that enables a realistic systems comparison for the given detection task. Results: The GM-ROD provides a comparison of the relative performance of detector systems from actual measurements of the object function as imaged by those detector systems. This metric includes noise correlations and spatial frequencies relevant to the specific object. Additionally, the integration bounds for the GM-ROD can be selected to emphasize the higher frequency band of each detector if high-resolution image details are to be evaluated. Examples of this new metric are discussed with a comparison of the MAF to the FPD for neuro-vascular interventional imaging. Conclusion: The GM-ROD is a new direct-measured task-specific metric that can provide clinically relevant comparison of the relative performance of imaging systems. Supported by NIH Grant: 2R01EB002873 and an equipment grant from Toshiba Medical Systems Corporation.
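
    The GM-ROD construction described above (squared Fourier transform of the averaged task image, divided by each detector's NNPS, integrated, then expressed as a ratio between detectors) can be sketched directly. The toy arrays below stand in for the averaged stent images and measured NNPS; band limits and normalisation details of the actual metric are omitted.

```python
import numpy as np

def gm_rod(avg_image_a: np.ndarray, nnps_a: np.ndarray,
           avg_image_b: np.ndarray, nnps_b: np.ndarray) -> float:
    """Sketch of a generalized measured relative object detectability: for each
    detector, integrate |FFT(averaged image)|^2 / NNPS over spatial frequency
    and return the ratio of detector A to detector B."""
    num = np.sum(np.abs(np.fft.fft2(avg_image_a)) ** 2 / nnps_a)
    den = np.sum(np.abs(np.fft.fft2(avg_image_b)) ** 2 / nnps_b)
    return num / den

# Toy 64x64 example with synthetic data (purely illustrative).
obj = np.zeros((64, 64))
obj[30:34, 10:54] = 1.0                       # a thin, stent-like object
nnps_detector_a = np.full((64, 64), 1.0e-6)   # assumed flat, low NNPS
nnps_detector_b = np.full((64, 64), 4.0e-6)   # assumed flat, higher NNPS
print(gm_rod(obj, nnps_detector_a, obj, nnps_detector_b))   # 4.0 in this toy case
```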

  9. High performance work practices, innovation and performance

    Jørgensen, Frances; Newton, Cameron; Johnston, Kim

    2013-01-01

    Research spanning nearly 20 years has provided considerable empirical evidence for relationships between High Performance Work Practices (HPWPs) and various measures of performance including increased productivity, improved customer service, and reduced turnover. What stands out from......, and Africa to examine these various questions relating to the HPWP-innovation-performance relationship. Each paper discusses a practice that has been identified in HPWP literature and potential variables that can facilitate or hinder the effects of these practices of innovation- and performance...

  10. Measurement of the Patulin toxicant using high performance liquid chromatography (HPLC) in apple juices supplied in Khorramabad City, Iran

    Elham Esmaeili Lashkarian

    2016-11-01

    Full Text Available The use of low-quality, moldy and worm-eaten fruits for juice production introduces hazardous compounds that can cause various disorders in the human body. Today, Patulin is one of the most important toxicants to be monitored in juices, particularly apple juices. This research aims to measure the amount of Patulin and to identify the molding agents in apple juices supplied in Khorramabad shops. After preparing a list of shops supplying and selling juices in Khorramabad, 64 apple juice packs were collected at random. Patulin was measured using high performance liquid chromatography (HPLC), and the molds were identified using macroscopic, microscopic and other necessary tests after the samples were cultured by the standard method. Of the 64 samples investigated for the presence or absence of mold, 61 (95.3%) lacked mold, 1 (1.6%) contained Aspergillus terreus and 2 (3.1%) contained Penicillium. The measured Patulin level was negative in 31 samples (48%) and positive in 33 (52%), in the range of 5.102-26.484 μg/l. The sample data compared well against external standards, with a correlation coefficient of 0.99. The results of this research indicate that the mean Patulin level measured in the apple juices studied was below the EU and Iranian standards.

  11. A new measurement tool for characterization of superconducting rf accelerator cavities using high-performance LTS SQUIDs

    Vodel, W [Friedrich-Schiller-University Jena, Helmholtzweg 5, 07743 Jena (Germany); Neubert, R [Friedrich-Schiller-University Jena, Helmholtzweg 5, 07743 Jena (Germany); Nietzsche, S [Friedrich-Schiller-University Jena, Helmholtzweg 5, 07743 Jena (Germany); Seidel, P [Friedrich-Schiller-University Jena, Helmholtzweg 5, 07743 Jena (Germany); Knaack, K [DESY Hamburg (Germany); Wittenburg, K [DESY Hamburg (Germany); Peters, A [Heidelberger Ionenstrahl-Therapiezentrum, Heidelberg (Germany)

    2007-11-15

    This paper presents a new system to measure very low currents in an accelerator environment, using a cryogenic current comparator (CCC). In principle, a CCC is a conventional current transformer that uses high-performance SQUID technology to sense the magnetic fields caused by the beam current. Since the system is sensitive at the pA level, it is an optimum device to detect dark currents of superconducting cavities. The system presented here is designed for the test facilities of the superconducting accelerator modules for the European XFEL at the Deutsches Elektronen-Synchrotron (DESY) in Hamburg. Measurements in a quiet environment showed that an intrinsic noise level of the CCC of 40 pA/√Hz could be achieved.
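
    For orientation only (this estimate is ours, not from the paper): assuming a white current-noise density S_I and an effective measurement bandwidth B, the rms current resolution is roughly

        \[
        I_{\mathrm{rms}} \approx S_I \sqrt{B}, \qquad
        S_I = 40\ \mathrm{pA}/\sqrt{\mathrm{Hz}},\; B = 1\ \mathrm{Hz}
        \;\Rightarrow\; I_{\mathrm{rms}} \approx 40\ \mathrm{pA},
        \]

    so narrowing the bandwidth (longer averaging) pushes the resolution further into the pA range needed for dark-current detection.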

  12. Performance of A Compact Multi-crystal High-purity Germanium Detector Array for Measuring Coincident Gamma-ray Emissions

    Howard, Chris [Univ. of North Carolina, Chapel Hill, NC (United States); Triangle Univ. Nuclear Lab., Durham, NC (United States); Daigle, Stephen [Univ. of North Carolina, Chapel Hill, NC (United States); Triangle Univ. Nuclear Lab., Durham, NC (United States); Buckner, Matt [Univ. of North Carolina, Chapel Hill, NC (United States); Triangle Univ. Nuclear Lab., Durham, NC (United States); Erikson, Luke E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Runkle, Robert C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stave, Sean C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Champagne, Art [Univ. of North Carolina, Chapel Hill, NC (United States); Triangle Univ. Nuclear Lab., Durham, NC (United States); Cooper, Andrew [Univ. of North Carolina, Chapel Hill, NC (United States); Triangle Univ. Nuclear Lab., Durham, NC (United States); Downen, Lori [Univ. of North Carolina, Chapel Hill, NC (United States); Triangle Univ. Nuclear Lab., Durham, NC (United States); Glasgow, Brian D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kelly, Keegan [Univ. of North Carolina, Chapel Hill, NC (United States); Triangle Univ. Nuclear Lab., Durham, NC (United States); Sallaska, Anne [Univ. of North Carolina, Chapel Hill, NC (United States); Triangle Univ. Nuclear Lab., Durham, NC (United States)

    2015-02-18

    The Multi-sensor Airborne Radiation Survey (MARS) detector is a 14-crystal array of high-purity germanium (HPGe) detectors housed in a single cryostat. The array was used to measure the astrophysical S-factor for the 14N(p,γ)15O* reaction for several transition energies at an effective center of mass energy of 163 keV. Owing to the segmented nature of the MARS detector, the effect of gamma-ray summing was greatly reduced in comparison to past experiments which utilized large, single-crystal detectors. The new S-factor values agree within the uncertainties with the past measurements. Details of the analysis and detector performance will be presented.
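
    The astrophysical S-factor referred to in this and the following record is, in the standard convention (a general definition, not specific to this measurement; Gaussian units),

        \[
        S(E) = E\,\sigma(E)\,e^{2\pi\eta(E)}, \qquad
        \eta(E) = \frac{Z_1 Z_2 e^2}{\hbar v},
        \]

    where σ(E) is the reaction cross section at center-of-mass energy E, v the relative velocity of the reactants, and η the Sommerfeld parameter; the exponential factors out the steep Coulomb-barrier suppression so that S(E) varies slowly with energy.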

  13. Performance of a compact multi-crystal high-purity germanium detector array for measuring coincident gamma-ray emissions

    Howard, Chris; Daigle, Stephen; Buckner, Matt [University of North Carolina at Chapel Hill, Chapel Hill, NC 27599 (United States); Triangle Universities Nuclear Laboratory, Durham, NC 27708 (United States); Erikson, Luke E.; Runkle, Robert C. [Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Stave, Sean C., E-mail: Sean.Stave@pnnl.gov [Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Champagne, Arthur E.; Cooper, Andrew; Downen, Lori [University of North Carolina at Chapel Hill, Chapel Hill, NC 27599 (United States); Triangle Universities Nuclear Laboratory, Durham, NC 27708 (United States); Glasgow, Brian D. [Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Kelly, Keegan; Sallaska, Anne [University of North Carolina at Chapel Hill, Chapel Hill, NC 27599 (United States); Triangle Universities Nuclear Laboratory, Durham, NC 27708 (United States)

    2015-05-21

    The Multi-sensor Airborne Radiation Survey (MARS) detector is a 14-crystal array of high-purity germanium (HPGe) detectors housed in a single cryostat. The array was used to measure the astrophysical S-factor for the 14N(p,γ)15O* reaction for several transition energies at an effective center-of-mass energy of 163 keV. Owing to the granular nature of the MARS detector, the effect of gamma-ray summing was greatly reduced in comparison to past experiments which utilized large, single-crystal detectors. The new S-factor values agree within their uncertainties with the past measurements. Details of the analysis and detector performance are presented.

  14. Measuring the Company Performance

    Ion Stancu

    2006-03-01

    Full Text Available According to the logic of efficient capital investment, the management of the investment of saved capital in the company’s assets must conclude, at the end of the financial year, with a surplus of real value (NPV > 0). From this point of view, in this paper we suggest the use of an investment valuation model for the assessment of the company’s managerial and technological performance. Assuming the book value is a proxy of the fair value (of assets and operational results) and assuming the cost of capital is correctly estimated, we evaluate the company’s performance both by the net present value model and by the company’s ability to create a surplus over the invested capital (NPV > 0). Our paper also aims to identify the financial breakeven point (for which NPV is at least equal to zero) as the minimum acceptable level for the company’s activity. Below this critical sales point, the company erodes shareholders’ wealth even if its sales are greater than the accounting breakeven point. The performance level of activity is the one at which managers recover and surpass the cost of capital, which stands as the benchmark for normal activity. The risks of applying our suggested model come down to the reliability of the accounting data and of the cost-of-capital estimate. In spite of all this, the use of a sensitivity analysis to search for an average NPV would lead to a valuation of the company’s performance within an investment logic with high information power.

  15. Measuring the Company Performance

    Ion Stancu

    2006-01-01

    Full Text Available According to the logic of efficient capital investment, the management of the investment of saved capital in the company’s assets must conclude, at the end of the financial year, with a surplus of real value (NPV > 0). From this point of view, in this paper we suggest the use of an investment valuation model for the assessment of the company’s managerial and technological performance. Assuming the book value is a proxy of the fair value (of assets and operational results) and assuming the cost of capital is correctly estimated, we evaluate the company’s performance both by the net present value model and by the company’s ability to create a surplus over the invested capital (NPV > 0). Our paper also aims to identify the financial breakeven point (for which NPV is at least equal to zero) as the minimum acceptable level for the company’s activity. Below this critical sales point, the company erodes shareholders’ wealth even if its sales are greater than the accounting breakeven point. The performance level of activity is the one at which managers recover and surpass the cost of capital, which stands as the benchmark for normal activity. The risks of applying our suggested model come down to the reliability of the accounting data and of the cost-of-capital estimate. In spite of all this, the use of a sensitivity analysis to search for an average NPV would lead to a valuation of the company’s performance within an investment logic with high information power.
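
    To make the two criteria in this abstract concrete (NPV > 0 as the performance test, and the financial breakeven as the sales level at which NPV = 0), here is a minimal sketch; the linear cash-flow model, the discount rate and all numbers are illustrative assumptions, not taken from the paper.

        # Illustrative sketch of the NPV criterion and the financial breakeven point
        # described above. The linear cash-flow model and all numbers are assumptions
        # made for demonstration only.

        def npv(rate, cash_flows):
            """Net present value of cash_flows[0..n], with cash_flows[0] at t = 0."""
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

        def yearly_cash_flow(sales, variable_cost_ratio=0.6, fixed_costs=200.0):
            """Very simple operating cash flow as a function of sales."""
            return sales * (1.0 - variable_cost_ratio) - fixed_costs

        def npv_at_sales(sales, investment=1000.0, rate=0.10, years=5):
            flows = [-investment] + [yearly_cash_flow(sales)] * years
            return npv(rate, flows)

        def financial_breakeven(rate=0.10, lo=0.0, hi=10000.0, tol=1e-6):
            """Sales level at which NPV = 0, found by bisection (NPV rises with sales)."""
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if npv_at_sales(mid, rate=rate) < 0.0:
                    lo = mid
                else:
                    hi = mid
            return 0.5 * (lo + hi)

        if __name__ == "__main__":
            s_star = financial_breakeven()
            print(f"financial breakeven sales ~ {s_star:.1f}")
            print(f"NPV at 1.2 x breakeven   ~ {npv_at_sales(1.2 * s_star):.1f}")

    Sales above the computed breakeven give NPV > 0 (value creation); the accounting breakeven of the abstract is lower, since it ignores the cost of the invested capital.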

  16. Using Technical Performance Measures

    Garrett, Christopher J.; Levack, Daniel J. H.; Rhodes, Russel E.

    2011-01-01

    All programs have requirements. For these requirements to be met, there must be a means of measurement. A Technical Performance Measure (TPM) is defined to produce a measured quantity that can be compared to the requirement. In practice, the TPM is often expressed as a maximum or minimum and a goal. Example TPMs for a rocket program are: vacuum or sea level specific impulse (Isp), weight, reliability (often expressed as a failure rate), schedule, operability (turn-around time), design and development cost, production cost, and operating cost. Program status is evaluated by comparing the TPMs against specified values of the requirements. During the program many design decisions are made and most of them affect some or all of the TPMs. Often, the same design decision changes some TPMs favorably while affecting other TPMs unfavorably. The problem then becomes how to compare the effects of a design decision on different TPMs. How much failure rate is one second of specific impulse worth? How many days of schedule is one pound of weight worth? In other words, how to compare dissimilar quantities in order to trade and manage the TPMs to meet all requirements. One method that has been used successfully and has a mathematical basis is Utility Analysis. Utility Analysis enables quantitative comparison among dissimilar attributes. It uses a mathematical model that maps decision maker preferences over the tradeable range of each attribute. It is capable of modeling both independent and dependent attributes. Utility Analysis is well supported in the literature on Decision Theory. It has been used at Pratt & Whitney Rocketdyne for internal programs and for contracted work such as the J-2X rocket engine program. This paper describes the construction of TPMs and describes Utility Analysis. It then discusses the use of TPMs in design trades and to manage margin during a program using Utility Analysis.
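
    As an illustration of how Utility Analysis puts dissimilar TPMs on a common scale, the sketch below scores two hypothetical engine designs with single-attribute utilities and weights. The attributes, tradeable ranges, linear utility shapes and weights are invented for illustration; the actual Pratt & Whitney Rocketdyne model also handles dependent attributes and non-linear preferences.

        # Minimal multi-attribute utility sketch for comparing design options against
        # dissimilar TPMs (Isp, weight, failure rate). Utility shapes, ranges and
        # weights are illustrative assumptions only.

        def linear_utility(value, worst, best):
            """Map a TPM value onto [0, 1] over its tradeable range (1 = best)."""
            u = (value - worst) / (best - worst)
            return max(0.0, min(1.0, u))

        # Tradeable ranges: (worst, best) for each attribute; reversed ranges mean
        # "lower is better".
        RANGES = {
            "isp_s":        (440.0, 452.0),    # higher is better
            "weight_lb":    (5500.0, 5000.0),  # lower is better
            "failure_rate": (1e-3, 1e-4),      # lower is better
        }
        WEIGHTS = {"isp_s": 0.4, "weight_lb": 0.3, "failure_rate": 0.3}

        def total_utility(design):
            return sum(
                WEIGHTS[k] * linear_utility(design[k], *RANGES[k]) for k in WEIGHTS
            )

        design_a = {"isp_s": 448.0, "weight_lb": 5300.0, "failure_rate": 4e-4}
        design_b = {"isp_s": 445.0, "weight_lb": 5100.0, "failure_rate": 6e-4}

        for name, d in [("A", design_a), ("B", design_b)]:
            print(name, round(total_utility(d), 3))

    The single scalar utility is what lets a decision maker answer questions like "how much failure rate is one second of Isp worth" consistently across trades.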

  17. Strategic Measures of Teacher Performance

    Milanowski, Anthony

    2011-01-01

    Managing the human capital in education requires measuring teacher performance. To measure performance, administrators need to combine measures of practice with measures of outcomes, such as value-added measures, and three measurement systems are needed: classroom observations, performance assessments or work samples, and classroom walkthroughs.…

  18. High performance homes

    Beim, Anne; Vibæk, Kasper Sánchez

    2014-01-01

    Can prefabrication contribute to the development of high performance homes? To answer this question, this chapter defines high performance in more broadly inclusive terms, acknowledging the technical, architectural, social and economic conditions under which energy consumption and production occur....... Consideration of all these factors is a precondition for a truly integrated practice and as this chapter demonstrates, innovative project delivery methods founded on the manufacturing of prefabricated buildings contribute to the production of high performance homes that are cost effective to construct, energy...

  19. Freight performance measures : approach analysis.

    2010-05-01

    This report reviews the existing state of the art and also the state of the practice of freight performance measurement. Most performance measures at the state level have aimed at evaluating highway or transit infrastructure performance with an empha...

  20. Winter maintenance performance measure.

    2016-01-01

    The Winter Performance Index is a method of quantifying winter storm events and the DOT's response to them. It is a valuable tool for evaluating the State's maintenance practices, performing post-storm analysis, and training maintenance personnel...

  1. Evaluation of hemoglobin A1c measurement from filter paper using high-performance liquid chromatography and immunoturbidimetric assay.

    Wu, Yonghua; Yang, Xu; Wang, Haining; Li, Zhenrong; Wang, Tiancheng

    2017-04-01

    Glycated hemoglobin (HbA1c) measurement from whole blood (WB) samples is inconvenient for epidemic surveillance and self-monitoring of glycemic level. We evaluated HbA1c measurement from WB blotted on filter paper (FP), which can be easily transported to central laboratories, with high-performance liquid chromatography (HPLC) and immunoturbidimetric assay (ITA). WB was applied to Whatman filter paper. Using HPLC and WB samples as reference methods, these FP samples were evaluated by HPLC and ITA. Inter- and intra-assay variation, WB vs. FP agreement and sample stability at 20-25 °C and -70 °C were assessed by statistical analysis. Results showed that the coefficients of variation (CV, %) of FP samples for HPLC and ITA were 0.44-1.02% and 1.47-2.72%, respectively (intra-assay), and 2.13-3.56% and 3.21-4.82%, respectively (inter-assay). The correlations of WB HPLC with FP analyzed using HPLC and ITA are both significant (p < 0.001). Sample stability testing showed that the FP method is accurate and reproducible for up to 5 days at 20-25 °C and 5 weeks at -70 °C. In conclusion, FP samples analyzed by HPLC and ITA can both provide an alternative to WB for HbA1c measurement, supporting the use of the FP method in epidemic surveillance and healthcare units.
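
    The intra- and inter-assay figures quoted above are ordinary coefficients of variation (CV%). A minimal sketch of how such values are typically computed from replicate results (the replicate HbA1c values below are invented for illustration):

        # Sketch of intra-/inter-assay CV% computation from replicate HbA1c results.
        # Replicate values are invented for illustration only.
        from statistics import mean, stdev

        def cv_percent(values):
            return 100.0 * stdev(values) / mean(values)

        # Intra-assay: repeated measurements of one sample within a single run.
        intra_run = [6.1, 6.2, 6.1, 6.0, 6.2, 6.1]        # % HbA1c
        # Inter-assay: the same control measured once per run over several days.
        inter_day = [6.0, 6.3, 6.1, 6.2, 5.9, 6.2, 6.1]   # % HbA1c

        print(f"intra-assay CV = {cv_percent(intra_run):.2f}%")
        print(f"inter-assay CV = {cv_percent(inter_day):.2f}%")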

  2. Automated high-performance cIMT measurement techniques using patented AtheroEdge™: a screening and home monitoring system.

    Molinari, Filippo; Meiburger, Kristen M; Suri, Jasjit

    2011-01-01

    The evaluation of the carotid artery wall is fundamental for the assessment of cardiovascular risk. This paper presents the general architecture of an automatic strategy, which segments the lumen-intima and media-adventitia borders, classified under a class of Patented AtheroEdge™ systems (Global Biomedical Technologies, Inc, CA, USA). Guidelines to produce accurate and repeatable measurements of the intima-media thickness are provided and the problem of the different distance metrics one can adopt is confronted. We compared the results of a completely automatic algorithm that we developed with those of a semi-automatic algorithm, and showed final segmentation results for both techniques. The overall rationale is to provide user-independent high-performance techniques suitable for screening and remote monitoring.

  3. High performance sapphire windows

    Bates, Stephen C.; Liou, Larry

    1993-02-01

    High-quality, wide-aperture optical access is usually required for the advanced laser diagnostics that can now make a wide variety of non-intrusive measurements of combustion processes. Specially processed and mounted sapphire windows are proposed to provide this optical access to extreme environments. Through surface treatments and proper thermal stress design, single crystal sapphire can be a mechanically equivalent replacement for high strength steel. A prototype sapphire window and mounting system have been developed in a successful NASA SBIR Phase 1 project. A large and reliable increase in sapphire design strength (as much as 10x) has been achieved, and the initial specifications necessary for these gains have been defined. Failure testing of small windows has conclusively demonstrated the increased sapphire strength, indicating that a nearly flawless surface polish is the primary cause of strengthening, while an unusual mounting arrangement also significantly contributes to a larger effective strength. Phase 2 work will complete specification and demonstration of these windows, and will fabricate a set for use at NASA. The enhanced capabilities of these high performance sapphire windows will lead to many diagnostic capabilities not previously possible, as well as new applications for sapphire.

  4. High Performance Marine Vessels

    Yun, Liang

    2012-01-01

    High Performance Marine Vessels (HPMVs) range from the Fast Ferries to the latest high speed Navy Craft, including competition power boats and hydroplanes, hydrofoils, hovercraft, catamarans and other multi-hull craft. High Performance Marine Vessels covers the main concepts of HPMVs and discusses historical background, design features, services that have been successful and not so successful, and some sample data of the range of HPMVs to date. Included is a comparison of all HPMVs craft and the differences between them and descriptions of performance (hydrodynamics and aerodynamics). Readers will find a comprehensive overview of the design, development and building of HPMVs. In summary, this book: Focuses on technology at the aero-marine interface Covers the full range of high performance marine vessel concepts Explains the historical development of various HPMVs Discusses ferries, racing and pleasure craft, as well as utility and military missions High Performance Marine Vessels is an ideal book for student...

  5. Performance Measurement Baseline Change Request

    Social Security Administration — The Performance Measurement Baseline Change Request template is used to document changes to scope, cost, schedule, or operational performance metrics for SSA's Major...

  6. High performance systems

    Vigil, M.B. [comp.

    1995-03-01

    This document provides a written compilation of the presentations and viewgraphs from the 1994 Conference on High Speed Computing given at the High Speed Computing Conference, "High Performance Systems," held at Gleneden Beach, Oregon, on April 18 through 21, 1994.

  7. Transit performance measures in California.

    2016-04-01

    This research is the result of a California Department of Transportation (Caltrans) request to assess the most commonly available transit performance measures in California. Caltrans wanted to understand performance measures and data used by Metr...

  8. Responsive design high performance

    Els, Dewald

    2015-01-01

    This book is ideal for developers who have experience in developing websites or possess minor knowledge of how responsive websites work. No experience of high-level website development or performance tweaking is required.

  9. High Performance Macromolecular Material

    Forest, M

    2002-01-01

    .... In essence, most commercial high-performance polymers are processed through fiber spinning, following Nature and spider silk, which is still pound-for-pound the toughest liquid crystalline polymer...

  10. Are NCLB's Measures, Incentives, and Improvement Strategies the Right Ones for the Nation's Low-Performing High Schools?

    Balfanz, Robert; Legters, Nettie; West, Thomas C.; Weber, Lisa M.

    2007-01-01

    This article examines the extent to which adequate yearly progress (AYP) is a valid and reliable indicator of improvement in low-performing high schools. For a random subsample of 202 high schools, the authors investigate the school characteristics and the federal and state policy contexts that influence their AYP status. Logistic regression…

  11. The Clinical Research Tool: a high-performance microdialysis-based system for reliably measuring interstitial fluid glucose concentration.

    Ocvirk, Gregor; Hajnsek, Martin; Gillen, Ralph; Guenther, Arnfried; Hochmuth, Gernot; Kamecke, Ulrike; Koelker, Karl-Heinz; Kraemer, Peter; Obermaier, Karin; Reinheimer, Cornelia; Jendrike, Nina; Freckmann, Guido

    2009-05-01

    A novel microdialysis-based continuous glucose monitoring system, the so-called Clinical Research Tool (CRT), is presented. The CRT was designed exclusively for investigational use to offer high analytical accuracy and reliability. The CRT was built to avoid signal artifacts due to catheter clogging, flow obstruction by air bubbles, and flow variation caused by inconstant pumping. For differentiation between physiological events and system artifacts, the sensor current, counter electrode and polarization voltage, battery voltage, sensor temperature, and flow rate are recorded at a rate of 1 Hz. In vitro characterization with buffered glucose solutions (c(glucose) = 0-26 x 10^-3 mol liter^-1) over 120 h yielded a mean absolute relative error (MARE) of 2.9 +/- 0.9% and a recorded mean flow rate of 330 +/- 48 nl/min with periodic flow rate variation amounting to 24 +/- 7%. The first 120 h of in vivo testing were conducted with five type 1 diabetes subjects wearing two systems each. A mean flow rate of 350 +/- 59 nl/min and a periodic variation of 22 +/- 6% were recorded. Utilizing 3 blood glucose measurements per day and a physical lag time of 1980 s, retrospective calibration of the 10 in vivo experiments yielded a MARE value of 12.4 +/- 5.7. Clarke error grid analysis resulted in 81.0%, 16.6%, 0.8%, 1.6%, and 0% in regions A, B, C, D, and E, respectively. The CRT demonstrates exceptional reliability of system operation and very good measurement performance. The ability to differentiate between artifacts and physiological effects suggests the use of the CRT as a reference tool in clinical investigations. 2009 Diabetes Technology Society.
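
    The mean absolute relative error (MARE) reported above is computed against paired reference blood glucose values; a minimal sketch (the paired readings below are invented for illustration):

        # Sketch of the mean absolute relative error (MARE) between sensor glucose
        # readings and paired reference blood glucose values. The data are invented.

        def mare_percent(measured, reference):
            pairs = list(zip(measured, reference))
            return 100.0 * sum(abs(m - r) / r for m, r in pairs) / len(pairs)

        cgm_mg_dl = [102, 145, 98, 180, 76, 130]   # sensor glucose
        ref_mg_dl = [110, 150, 92, 170, 80, 125]   # reference blood glucose

        print(f"MARE = {mare_percent(cgm_mg_dl, ref_mg_dl):.1f}%")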

  12. Economic measures of performance

    Anon.

    1992-01-01

    Cogeneration systems can reduce the total cost of utility service, and, in some instances where power is sold to an electric utility, can even produce a positive net revenue stream. That is, the total cogeneration revenue is greater than the cogeneration system's operating cost plus the cost of supplemental fuel and power. Whether it is sited at an existing facility or new construction, cogeneration systems do require an incremental investment over and above that which would be required if the end user were to utilize more conventional utility services. While the decision as to whether or not one should invest in cogeneration may consider such intangibles as predictability of future utility costs, reliability of electrical supply and the quality of that supply, the decision ultimately becomes one of basic economics. This chapter briefly reviews several economic measures with regard to ease of use, accuracy and financial objective.

  13. Identifying High Performance ERP Projects

    Stensrud, Erik; Myrtveit, Ingunn

    2002-01-01

    Learning from high performance projects is crucial for software process improvement. Therefore, we need to identify outstanding projects that may serve as role models. It is common to measure productivity as an indicator of performance. It is vital that productivity measurements deal correctly with variable returns to scale and multivariate data. Software projects generally exhibit variable returns to scale, and the output from ERP projects is multivariate. We propose to use Data Envelopment ...

  14. THE MEASURABILITY OF CONTROLLING PERFORMANCE

    V. Laval

    2017-04-01

    Full Text Available The urge to increase the performance of company processes is ongoing. Surveys indicate however, that many companies do not measure the controlling performance with a defined set of key performance indicators. This paper will analyze three categories of controlling key performance indicators based on their degree of measurability and their impact on the financial performance of a company. Potential measures to optimize the performance of the controlling department will be outlined and put in a logical order. The aligning of the controlling activity with the respective management expectation will be discussed as a key success factor of this improvement project.

  15. Measurement properties of performance-based outcome measures to assess physical function in young and middle-aged people known to be at high risk of hip and/or knee osteoarthritis

    Kroman, S L; Roos, Ewa M.; Bennell, K L

    2014-01-01

    To systematically appraise the evidence on measurement properties of performance-based outcome measures to assess physical function in young and middle-aged people known to be at high risk of hip and/or knee osteoarthritis (OA).

  16. Clojure high performance programming

    Kumar, Shantanu

    2013-01-01

    This is a short, practical guide that will teach you everything you need to know to start writing high performance Clojure code.This book is ideal for intermediate Clojure developers who are looking to get a good grip on how to achieve optimum performance. You should already have some experience with Clojure and it would help if you already know a little bit of Java. Knowledge of performance analysis and engineering is not required. For hands-on practice, you should have access to Clojure REPL with Leiningen.

  17. High Performance Concrete

    Traian Oneţ

    2009-01-01

    Full Text Available The paper presents the latest studies and research carried out in Cluj-Napoca on high performance concrete, high strength concrete and self-compacting concrete. The purpose of this paper is to review the advantages and drawbacks of using each particular concrete type. Two concrete recipes are presented, one for the concrete used in rigid road pavements and another for self-compacting concrete.

  18. High performance polymeric foams

    Gargiulo, M.; Sorrentino, L.; Iannace, S.

    2008-01-01

    The aim of this work was to investigate the foamability of high-performance polymers (polyethersulfone, polyphenylsulfone, polyetherimide and polyethylene naphthalate). Two different methods have been used to prepare the foam samples: high-temperature expansion and a two-stage batch process. The effects of processing parameters (saturation time and pressure, foaming temperature) on the densities and microcellular structures of these foams were analyzed by using scanning electron microscopy.

  19. Measuring and improving infrastructure performance

    Committee on Measuring and Improving Infrastructure Performance, National Research Council

    .... Developing a framework for guiding attempts at measuring the performance of infrastructure systems and grappling with the concept of defining good performance are the major themes of this book...

  20. Performance of a high-precision calorimeter for the measurement of the antineutrino-source strength in the SOX experiment

    Altenmueller, Konrad [Technische Universitaet Muenchen (Germany); Collaboration: BOREXINO-Collaboration

    2016-07-01

    A calorimeter was developed to measure the thermal power and thus the antineutrino-generation rate of a 144Ce-144Pr antineutrino source with < 1% overall accuracy for the SOX experiment. SOX is searching for neutrino oscillations at short baselines with the Borexino detector to investigate the existence of eV-scale sterile neutrinos. The calorimeter design is based on a copper heat exchanger with integrated water lines for the heat extraction, mounted around the source. A high precision measurement is possible thanks to an elaborate thermal insulation. In this talk, the design of the calorimeter is reviewed and results of calibration measurements are presented. The thermal insulation of the system was examined and heat losses were quantified. The methods to reconstruct the source power and the decay rate from measurements are described.
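
    Schematically (our notation, not taken from the SOX analysis), the calorimetric measurement constrains the source strength through

        \[
        A = \frac{P}{\bar{E}_{\mathrm{dep}}},
        \]

    where P is the measured thermal power, \bar{E}_{\mathrm{dep}} the mean energy deposited as heat per 144Ce decay (including the 144Pr daughter, taken from nuclear data), and A the decay rate; a < 1% power measurement therefore translates directly into a < 1% determination of the antineutrino-generation rate.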

  1. Danish High Performance Concretes

    Nielsen, M. P.; Christoffersen, J.; Frederiksen, J.

    1994-01-01

    In this paper the main results obtained in the research program High Performance Concretes in the 90's are presented. This program was financed by the Danish government and was carried out in cooperation between The Technical University of Denmark, several private companies, and Aalborg University...... concretes, workability, ductility, and confinement problems....

  2. High performance homes

    Beim, Anne; Vibæk, Kasper Sánchez

    2014-01-01

    . Consideration of all these factors is a precondition for a truly integrated practice and as this chapter demonstrates, innovative project delivery methods founded on the manufacturing of prefabricated buildings contribute to the production of high performance homes that are cost effective to construct, energy...

  3. The Politics of Performance Measurement

    Bjørnholt, Bente; Larsen, Flemming

    2014-01-01

    Performance measurements are meant to improve public decision making and organizational performance. But performance measurements are far from always rational tools for problem solving, they are also political instruments. The central question addressed in this article is how performance...... impact on the political decision making process, as the focus on performance goals entails a kind of reductionism (complex problems are simplified), sequential decision making processes (with a division in separate policy issues) and short-sighted decisions (based on the need for making operational goals)....... measurement affects public policy. The aim is to conceptualize the political consequences of performance measurements and of special concern is how performance systems influence how political decisions are made, what kind of political decisions are conceivable, and how they are implemented. The literature...

  4. Concept development and needs identification for INFLO : report on stakeholder input on transformative goals, performance measures and high level user needs for INFLO.

    2012-04-01

    The purpose of this report is to document the stakeholder input received at the February 8, 2012, stakeholder workshop at the Hall of States in Washington, D.C. on goals, performance measures, transformative performance targets, and high-level user n...

  5. Design and fabrication of a dead weight equipment to perform creep measurements on highly irradiated beryllium specimens

    Scibetta, M.; Pellettieri, A.; Wouters, P.; Leenaerts, A.; Verpoucke, G.

    2005-01-01

    Beryllium is an important material to be used in the blanket of fusion reactors. It acts as a neutron multiplier that allows tritium production. In order to use this material effectively, some data on creep and swelling behaviour are needed. This paper describes preliminary microstructural investigations and the qualification of a creep set-up that will be used to measure creep of highly irradiated beryllium from the BR2 research reactor matrix. (Author)

  6. Diagnostic colonoscopy: performance measurement study.

    Kuznets, Naomi

    2002-07-01

    This is the fifth of a series of best practices studies undertaken by the Performance Measurement Initiative (PMI), the centerpiece of the Institute for Quality Improvement (IQI), a not-for-profit quality improvement subsidiary of the Accreditation Association for Ambulatory Health Care (AAAHC) (Performance Measurement Initiative, 1999a, 1999b, 2000a, 2000b). The IQI was created to offer clinical performance measurement and improvement opportunities to ambulatory health care organizations and others interested in quality patient care. The purpose of the study was to provide opportunities to initiate clinical performance measurement on key processes and outcomes for this procedure and use this information for clinical quality improvement. This article provides performance measurement information on how organizations that have demonstrated and validated differences in clinical practice can have similar outcomes, but at a dramatically lower cost. The intent of the article is to provide organizations with alternatives in practice to provide a better value to their patients.

  7. High-Performance Networking

    CERN. Geneva

    2003-01-01

    The series will start with a historical introduction about what people saw as high performance message communication in their time and how that developed into what is known today as "standard computer network communication". It will be followed by a far more technical part that uses the High Performance Computer Network standards of the 90's, with 1 Gbit/sec systems, as an introduction to an in-depth explanation of the three new 10 Gbit/s network and interconnect technology standards that already exist or are emerging. Where necessary for a good understanding, some sidesteps will be included to explain important protocols as well as relevant details of the Wide Area Network (WAN) standards concerned, including some basics of wavelength multiplexing (DWDM). Some remarks will be made concerning the rapidly expanding applications of networked storage.

  8. High performance data transfer

    Cottrell, R.; Fang, C.; Hanushevsky, A.; Kreuger, W.; Yang, W.

    2017-10-01

    The exponentially increasing need for high speed data transfer is driven by big data and cloud computing, together with the needs of data intensive science, High Performance Computing (HPC), defense, the oil and gas industry, etc. We report on the Zettar ZX software. This has been developed since 2013 to meet these growing needs by providing high performance data transfer and encryption in a scalable, balanced way that is easy to deploy and use while minimizing power and space utilization. In collaboration with several commercial vendors, Proofs of Concept (PoC) consisting of clusters have been put together using off-the-shelf components to test the ZX scalability and its ability to balance services across multiple cores and links. The PoCs are based on SSD flash storage that is managed by a parallel file system. Each cluster occupies 4 rack units. Using the PoCs, we have achieved almost 200 Gbps memory to memory between clusters over two 100 Gbps links, and 70 Gbps parallel file to parallel file with encryption over a 5000 mile 100 Gbps link.

  9. Facilities projects performance measurement system

    Erben, J.F.

    1979-01-01

    The two DOE-owned facilities at Hanford, the Fuels and Materials Examination Facility (FMEF), and the Fusion Materials Irradiation Test Facility (FMIT), are described. The performance measurement systems used at these two facilities are next described

  10. Performance Measures, Benchmarking and Value.

    McGregor, Felicity

    This paper discusses performance measurement in university libraries, based on examples from the University of Wollongong (UoW) in Australia. The introduction highlights the integration of information literacy into the curriculum and the outcomes of a 1998 UoW student satisfaction survey. The first section considers performance indicators in…

  11. High frequency energy measurements

    Stotlar, S.C.

    1981-01-01

    High-frequency (> 100 MHz) energy measurements present special problems to the experimenter. Environment or available electronics often limit the applicability of a given detector type. The physical properties of many detectors are frequency dependent and in some cases, the physical effect employed can be frequency dependent. State-of-the-art measurements generally involve a detection scheme in association with high-speed electronics and a method of data recording. Events can be single-shot or repetitive, requiring real-time, sampling, or digitizing data recording. Potential modification of the pulse by the detector and the associated electronics should not be overlooked. This presentation will review typical applications, methods of choosing a detector, and high-speed detectors. Special considerations and limitations of some applications and devices will be described.

  12. Ultra-high-performance liquid chromatography-tandem mass spectrometry measurement of climbazole deposition from hair care products onto artificial skin and human scalp

    Chen, G.; Hoptroff, M.; Fei, X.; Su, Y.; Janssen, H.-G.

    2013-01-01

    A sensitive and specific ultra-high-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method was developed and validated for the measurement of climbazole deposition from hair care products onto artificial skin and human scalp. Deuterated climbazole was used as the internal

  13. Measurement of opioid peptides with combinations of reversed phase high performance liquid chromatography, radioimmunoassay, radioreceptorassay, and mass spectrometry

    Fridland, G.H.; Desiderio, D.M.

    1987-01-01

    As the first step, RP-HPLC gradient elution is performed of a Sep-Pak treated peptide-rich fraction from a tissue extract, and the eluent is monitored by a variety of post-HPLC detectors. In an effort to maximize the structural information that can be obtained from the analysis, UV provides the analog absorption trace; receptorassay analysis (RRA) data of all fractions that are collected are used to construct the profile of opioid-receptoractive peptides; radioimmunoassay (RIA) of selected HPLC fractions at retention times corresponding to the retention time of standards, or in some special cases of all 90-fractions, provides immunoreactivity information; and fast atom bombardment mass spectrometry (FAB-MS) in two modes - corroboration of the (M + H) + of the expected peptide, or MS/MS to monitor an amino acid sequence-determining fragment ion unique to that peptide in the selected ion monitoring (SIM) mode - provides structural information. As a demonstration of the level of quantification sensitivity that can be attained by these novel MS methods, FAB-MS-MS-SIM of solutions of synthetic leucine enkephalin was sensitive to the 70 femtomole level. This paper discusses RIA versus RRA data, and recent MS measurements of peptides in human tissues. 4 references, 1 figure

  14. Brush seal performance measurement system

    Aksoy, Serdar; Akşit, Mahmut Faruk; Aksit, Mahmut Faruk; Duran, Ertuğrul Tolga; Duran, Ertugrul Tolga

    2009-01-01

    Brush seals are rapidly replacing conventional labyrinth seals in turbomachinery applications. Upon pressure application, seal stiffness increases drastically due to frictional bristle interlocking. Operating stiffness is critical to determine seal wear life. Typically, seal stiffness is measured by pressing a curved shoe to brush bore. The static-unpressurized measurement is extrapolated to pressurized and high speed operating conditions. This work presents a seal stiffness measurement syste...

  15. Performance of biometric quality measures.

    Grother, Patrick; Tabassi, Elham

    2007-04-01

    We document methods for the quantitative evaluation of systems that produce a scalar summary of a biometric sample's quality. We are motivated by a need to test claims that quality measures are predictive of matching performance. We regard a quality measurement algorithm as a black box that converts an input sample to an output scalar. We evaluate it by quantifying the association between those values and observed matching results. We advance detection error trade-off and error versus reject characteristics as metrics for the comparative evaluation of sample quality measurement algorithms. We precede this with a definition of sample quality and a description of the operational use of quality measures. We emphasize the performance goal by including a procedure for annotating the samples of a reference corpus with quality values derived from empirical recognition scores.
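
    A minimal sketch of the error-versus-reject characteristic described above: genuine comparisons are ranked by quality, the lowest-quality fraction is rejected, and the false non-match rate (FNMR) is recomputed on the remainder. The data, the quality convention (lower of the two samples' qualities) and the match threshold are invented for illustration.

        # Sketch of an error-versus-reject curve: reject the lowest-quality fraction
        # of genuine comparisons and recompute FNMR on what remains. All data and the
        # match threshold are invented for illustration.

        def fnmr(scores, threshold):
            """False non-match rate of genuine comparison scores."""
            return sum(s < threshold for s in scores) / len(scores)

        def error_vs_reject(genuine, reject_fractions, threshold):
            # genuine: list of (quality, match_score) for genuine comparisons,
            # quality taken as the lower of the two samples' qualities.
            ranked = sorted(genuine, key=lambda qs: qs[0])  # worst quality first
            curve = []
            for frac in reject_fractions:
                kept = [s for _, s in ranked[int(frac * len(ranked)):]]
                curve.append((frac, fnmr(kept, threshold)))
            return curve

        genuine = [(0.2, 0.31), (0.9, 0.82), (0.5, 0.64), (0.3, 0.42),
                   (0.8, 0.77), (0.7, 0.71), (0.4, 0.58), (0.95, 0.88)]
        for frac, e in error_vs_reject(genuine, [0.0, 0.125, 0.25], threshold=0.6):
            print(f"reject {frac:.3f} -> FNMR {e:.3f}")

    A quality measure that is predictive of matching performance produces a curve whose FNMR falls as the rejected fraction grows.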

  16. A simple, rapid and validated high-performance liquid chromatography method suitable for clinical measurements of human mercaptalbumin and non-mercaptalbumin.

    Yasukawa, Keiko; Shimosawa, Tatsuo; Okubo, Shigeo; Yatomi, Yutaka

    2018-01-01

    Background Human mercaptalbumin and human non-mercaptalbumin have been reported as markers for various pathological conditions, such as kidney and liver diseases. These markers play important roles in redox regulations throughout the body. Despite the recognition of these markers in various pathophysiologic conditions, the measurements of human mercaptalbumin and non-mercaptalbumin have not been popular because of the technical complexity and long measurement time of conventional methods. Methods Based on previous reports, we explored the optimal analytical conditions for a high-performance liquid chromatography method using an anion-exchange column packed with a hydrophilic polyvinyl alcohol gel. The method was then validated using performance tests as well as measurements of various patients' serum samples. Results We successfully established a reliable high-performance liquid chromatography method with an analytical time of only 12 min per test. The repeatability (within-day variability) and reproducibility (day-to-day variability) were 0.30% and 0.27% (CV), respectively. A very good correlation was obtained with the results of the conventional method. Conclusions A practical method for the clinical measurement of human mercaptalbumin and non-mercaptalbumin was established. This high-performance liquid chromatography method is expected to be a powerful tool enabling the expansion of clinical usefulness and ensuring the elucidation of the roles of albumin in redox reactions throughout the human body.

  17. High-Intensity Cycling Training: The Effect of Work-to-Rest Intervals on Running Performance Measures.

    Kavaliauskas, Mykolas; Aspe, Rodrigo R; Babraj, John

    2015-08-01

    The work-to-rest ratio during cycling-based high-intensity interval training (HIT) could be important in regulating physiological and performance adaptations. We sought to determine the effectiveness of cycling-based HIT with different work-to-rest ratios for long-distance running. Thirty-two long-distance runners (age: 39 ± 8 years; sex: 14 men, 18 women; average weekly running training volume: 25 miles) underwent baseline testing (3-km time-trial, V̇O2peak and time to exhaustion, and Wingate test) before a 2-week matched-work cycling HIT of 6 × 10-second sprints with different rest periods (30 seconds [R30], 80 seconds [R80], 120 seconds [R120], or control). Three-kilometer time trial was significantly improved in the R30 group only (3.1 ± 4.0%, p = 0.04), whereas time to exhaustion was significantly increased in the 2 groups with a lower work-to-rest ratio (R30 group 6.4 ± 6.3%, p = 0.003 vs. R80 group 4.4 ± 2.7%, p = 0.03 vs. R120 group 1.9 ± 5.0%, p = 0.2). However, improvements in average power production were significantly greater with a higher work-to-rest ratio (R30 group 0.3 ± 4.1%, p = 0.8 vs. R80 group 4.6 ± 4.2%, p = 0.03 vs. R120 group 5.3 ± 5.9%, p = 0.02), whereas peak power significantly increased only in the R80 group (8.5 ± 8.2%, p = 0.04) but not in the R30 group (4.3 ± 6.1%, p = 0.3) or in the R120 group (7.1 ± 7.9%, p = 0.09). Therefore, cycling-based HIT is an effective way to improve running performance, and the type and magnitude of adaptation is dependent on the work-to-rest ratio.

  18. R high performance programming

    Lim, Aloysius

    2015-01-01

    This book is for programmers and developers who want to improve the performance of their R programs by making them run faster with large data sets or who are trying to solve a pesky performance problem.

  19. Scalable Performance Measurement and Analysis

    Gamblin, Todd [Univ. of North Carolina, Chapel Hill, NC (United States)

    2009-01-01

    Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.
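
    A minimal sketch of the statistical-sampling idea described above: estimate how many of the running processes must be traced so that the mean of a per-process metric stays within a target error bound at a given confidence, then trace only that random subset. The formula is the standard normal-approximation sample-size estimate with a finite-population correction; it is not Libra's actual implementation, and all parameters are invented.

        # Sketch of statistical sampling of processes for low-volume tracing:
        # pick a sample size that bounds the error on the mean of a per-process
        # metric, then trace only that random subset. Standard normal-approximation
        # sample size; not the actual Libra implementation.
        import math
        import random

        def sample_size(n_procs, stddev, max_error, z=1.96):
            """Processes to trace so the mean metric is within max_error (95% conf.)."""
            n0 = (z * stddev / max_error) ** 2
            return math.ceil(n0 / (1.0 + (n0 - 1.0) / n_procs))  # finite-population fix

        def choose_traced_ranks(n_procs, stddev, max_error, seed=0):
            k = sample_size(n_procs, stddev, max_error)
            rng = random.Random(seed)
            return sorted(rng.sample(range(n_procs), k))

        if __name__ == "__main__":
            ranks = choose_traced_ranks(n_procs=100_000, stddev=12.0, max_error=1.0)
            print(f"tracing {len(ranks)} of 100000 ranks, e.g. {ranks[:5]} ...")

    The point is that the number of traced processes grows with the variability of the metric, not with the total number of tasks, which is what keeps the data volume manageable at scale.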

  20. Measuring performance at trade shows

    Hansen, Kåre

    2004-01-01

    Trade shows are an increasingly important marketing activity for many companies, but current measures of trade show performance do not adequately capture dimensions important to exhibitors. Based on the marketing literature's outcome and behavior-based control system taxonomy, a model is built...... that captures an outcome-based sales dimension and four behavior-based dimensions (i.e. information-gathering, relationship building, image building, and motivation activities). A 16-item instrument is developed for assessing exhibitors' perceptions of their trade show performance. The paper presents evidence...

  1. Performance Measurement of Research Activitities

    Jakobsen, Morten; Jensen, Tina Blegind; Peyton, Margit Malmmose

    Performance measurements have made their entry into the world of universities. Every research activity is registered in a database and the output measures form the foundation for managerial decisions. The purpose of this chapter is to investigate the registration practices among researchers...... the registrations as a way to be promoted and to legitimise and account for their work; on the other hand, the economic incentives behind ranking lists and bibliographic research indicators threaten the individual researcher's freedom. The findings also show how managers have difficulties in translating back...

  2. From performance measurement to learning

    Lewis, Jenny; Triantafillou, Peter

    2012-01-01

    Over the last few decades accountability has accommodated an increasing number of different political, legal and administrative goals. This article focuses on the administrative aspect of accountability and explores the potential perils of a shift from performance measurement to learning. While...... overload. We conclude with some comments on limiting the undesirable consequences of such a move. Points for practitioners Public administrators need to identify and weigh the (human, political and economic) benefits and costs of accountability regimes. While output-focused performance measurement regimes...... to comply with accountability requirements, because of the first point. Third, the costs of compliance are likely to increase because learning requires more participation and dialogue. Fourth, accountability as learning may generate a ‘change for the sake of change’ mentality, creating further government...

  3. Evaluation of conventional and high-performance routine solar radiation measurements for improved solar resource, climatological trends, and radiative modeling

    Gueymard, Christian A. [Solar Consulting Services, P.O. Box 392, Colebrook, NH 03576 (United States); Myers, Daryl R. [National Renewable Energy Laboratory, 1617 Cole Blvd., Golden, CO 80401-3305 (United States)

    2009-02-15

    The solar renewable energy community depends on radiometric measurements and instrumentation for data to design and monitor solar energy systems, and develop and validate solar radiation models. This contribution evaluates the impact of instrument uncertainties contributing to data inaccuracies and their effect on short-term and long-term measurement series, and on radiation model validation studies. For the latter part, transposition (horizontal-to-tilt) models are used as an example. Confirming previous studies, it is found that a widely used pyranometer strongly underestimates diffuse and global radiation, particularly in winter, unless appropriate corrective measures are taken. Other types of measurement problems are also discussed, such as those involved in the indirect determination of direct or diffuse irradiance, and in shadowband correction methods. The sensitivity of the predictions from transposition models to inaccuracies in input radiation data is demonstrated. Caution is therefore issued to the whole community regarding drawing detailed conclusions about solar radiation data without due attention to the data quality issues only recently identified. (author)
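
    As an illustration of the transposition (horizontal-to-tilt) step whose sensitivity to input data errors is analyzed above, here is a minimal isotropic-sky model of the Liu-Jordan type; validation studies of the kind described typically use more elaborate anisotropic models, and all input values below are invented.

        # Minimal isotropic-sky transposition sketch (Liu-Jordan type): global
        # irradiance on a tilted plane from beam, sky-diffuse and ground-reflected
        # components. Inputs are invented; real studies use anisotropic sky models.
        import math

        def tilted_global(dni, dhi, sun_zenith_deg, incidence_deg, tilt_deg,
                          albedo=0.2):
            ghi = dni * math.cos(math.radians(sun_zenith_deg)) + dhi  # horizontal global
            beam_t = dni * max(0.0, math.cos(math.radians(incidence_deg)))
            diffuse_t = dhi * (1.0 + math.cos(math.radians(tilt_deg))) / 2.0
            ground_t = ghi * albedo * (1.0 - math.cos(math.radians(tilt_deg))) / 2.0
            return beam_t + diffuse_t + ground_t

        # Example: clear mid-morning values (W/m2), 40 deg tilt.
        gt = tilted_global(dni=750.0, dhi=120.0, sun_zenith_deg=48.0,
                           incidence_deg=25.0, tilt_deg=40.0)
        print(f"tilted global irradiance ~ {gt:.0f} W/m2")

    Because the tilted result is a weighted combination of the measured beam and diffuse components, any systematic underestimation of diffuse irradiance of the kind discussed above propagates directly into the modeled plane-of-array value.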

  4. High-performance liquid chromatography method with radiochemical detection for measurement of nitric oxide synthase, arginase, and arginine decarboxylase activities

    Volke, A; Wegener, Gregers; Vasar, E

    2006-01-01

    regulate NOS activity. We aimed to develop a HPLC-based method to measure simultaneously the products of these three enzymes. Traditionally, the separation of amino acids and related compounds with HPLC has been carried out with precolumn derivatization and reverse phase chromatography. We describe here...

  5. Python high performance programming

    Lanaro, Gabriele

    2013-01-01

    An exciting, easy-to-follow guide illustrating the techniques to boost the performance of Python code, and their applications with plenty of hands-on examples.If you are a programmer who likes the power and simplicity of Python and would like to use this language for performance-critical applications, this book is ideal for you. All that is required is a basic knowledge of the Python programming language. The book will cover basic and advanced topics so will be great for you whether you are a new or a seasoned Python developer.

  6. High performance germanium MOSFETs

    Saraswat, Krishna [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States)]. E-mail: saraswat@stanford.edu; Chui, Chi On [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Krishnamohan, Tejas [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Kim, Donghyun [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Nayfeh, Ammar [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Pethe, Abhijit [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States)

    2006-12-15

    Ge is a very promising material as future channel materials for nanoscale MOSFETs due to its high mobility and thus a higher source injection velocity, which translates into higher drive current and smaller gate delay. However, for Ge to become main-stream, surface passivation and heterogeneous integration of crystalline Ge layers on Si must be achieved. We have demonstrated growth of fully relaxed smooth single crystal Ge layers on Si using a novel multi-step growth and hydrogen anneal process without any graded buffer SiGe layer. Surface passivation of Ge has been achieved with its native oxynitride (GeOxNy) and high-permittivity (high-k) metal oxides of Al, Zr and Hf. High mobility MOSFETs have been demonstrated in bulk Ge with high-k gate dielectrics and metal gates. However, due to their smaller bandgap and higher dielectric constant, most high mobility materials suffer from large band-to-band tunneling (BTBT) leakage currents and worse short channel effects. We present novel, Si and Ge based heterostructure MOSFETs, which can significantly reduce the BTBT leakage currents while retaining high channel mobility, making them suitable for scaling into the sub-15 nm regime. Through full band Monte-Carlo, Poisson-Schrodinger and detailed BTBT simulations we show a dramatic reduction in BTBT and excellent electrostatic control of the channel, while maintaining very high drive currents in these highly scaled heterostructure DGFETs. Heterostructure MOSFETs with varying strained-Ge or SiGe thickness, Si cap thickness and Ge percentage were fabricated on bulk Si and SOI substrates. The ultra-thin (~2 nm) strained-Ge channel heterostructure MOSFETs exhibited >4x mobility enhancements over bulk Si devices and >10x BTBT reduction over surface channel strained SiGe devices.

  7. High performance germanium MOSFETs

    Saraswat, Krishna; Chui, Chi On; Krishnamohan, Tejas; Kim, Donghyun; Nayfeh, Ammar; Pethe, Abhijit

    2006-01-01

    Ge is a very promising material as future channel materials for nanoscale MOSFETs due to its high mobility and thus a higher source injection velocity, which translates into higher drive current and smaller gate delay. However, for Ge to become main-stream, surface passivation and heterogeneous integration of crystalline Ge layers on Si must be achieved. We have demonstrated growth of fully relaxed smooth single crystal Ge layers on Si using a novel multi-step growth and hydrogen anneal process without any graded buffer SiGe layer. Surface passivation of Ge has been achieved with its native oxynitride (GeO x N y ) and high-permittivity (high-k) metal oxides of Al, Zr and Hf. High mobility MOSFETs have been demonstrated in bulk Ge with high-k gate dielectrics and metal gates. However, due to their smaller bandgap and higher dielectric constant, most high mobility materials suffer from large band-to-band tunneling (BTBT) leakage currents and worse short channel effects. We present novel, Si and Ge based heterostructure MOSFETs, which can significantly reduce the BTBT leakage currents while retaining high channel mobility, making them suitable for scaling into the sub-15 nm regime. Through full band Monte-Carlo, Poisson-Schrodinger and detailed BTBT simulations we show a dramatic reduction in BTBT and excellent electrostatic control of the channel, while maintaining very high drive currents in these highly scaled heterostructure DGFETs. Heterostructure MOSFETs with varying strained-Ge or SiGe thickness, Si cap thickness and Ge percentage were fabricated on bulk Si and SOI substrates. The ultra-thin (∼2 nm) strained-Ge channel heterostructure MOSFETs exhibited >4x mobility enhancements over bulk Si devices and >10x BTBT reduction over surface channel strained SiGe devices

  8. A low-cost, high-performance, digital signal processor-based lock-in amplifier capable of measuring multiple frequency sweeps simultaneously

    Sonnaillon, Maximiliano Osvaldo; Bonetto, Fabian Jose

    2005-01-01

    A high-performance digital lock-in amplifier implemented in a low-cost digital signal processor (DSP) board is described. This lock in is capable of simultaneously measuring multiple frequencies that change in time as frequency sweeps (chirps). The 32-bit DSP used has enough computing power to generate N=3 simultaneous reference signals and accurately measure the N=3 responses, operating as three lock ins connected in parallel to a linear system. The lock in stores the measured values in memory until they are downloaded to a personal computer (PC). The lock in works in stand-alone mode and can be programmed and configured through the PC serial port. Downsampling and multiple filter stages were used in order to obtain a sharp roll-off and a long time constant in the filters. This makes measurements possible in the presence of high noise levels. Before each measurement, the lock in performs an autocalibration that measures the frequency response of the analog output and input circuitry in order to compensate for the departure from ideal operation. Improvements over previous lock-in implementations allow measuring the frequency response of a system in a short time. Furthermore, the proposed implementation can measure how the frequency response changes with time, a characteristic that is very important in our biotechnological application. The number of simultaneous components that the lock in can generate and measure can be extended, without reprogramming, simply by using other DSPs of the same family that are code compatible and work at higher clock frequencies.
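
    The core of the signal processing described above is standard digital lock-in demodulation: multiply the input by in-phase and quadrature references at each frequency and low-pass filter the products. The following Python/NumPy sketch illustrates that principle for N = 3 fixed frequencies; the sampling rate, frequencies, filter settings and simulated signal are illustrative assumptions, not the authors' DSP implementation.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 50_000.0                              # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)            # one second of data
ref_freqs = [1_000.0, 2_500.0, 7_000.0]    # the N = 3 reference frequencies (assumed)

# Simulated response of a linear system plus noise; these are the values to recover
true_amp = [0.50, 0.20, 0.05]
true_phase = [0.3, -1.0, 2.0]
rng = np.random.default_rng(0)
x = sum(a * np.cos(2 * np.pi * f * t + p)
        for a, f, p in zip(true_amp, ref_freqs, true_phase))
x = x + 0.1 * rng.standard_normal(t.size)

# Low-pass filter standing in for the lock-in's filter stages / time constant
b, a = butter(2, 50.0 / (fs / 2))

for f0, amp_true in zip(ref_freqs, true_amp):
    i_comp = filtfilt(b, a, x * np.cos(2 * np.pi * f0 * t))   # in-phase product
    q_comp = filtfilt(b, a, x * np.sin(2 * np.pi * f0 * t))   # quadrature product
    amp = 2.0 * np.hypot(i_comp.mean(), q_comp.mean())        # recovered amplitude
    phase = np.arctan2(-q_comp.mean(), i_comp.mean())         # recovered phase (rad)
    print(f"{f0:7.0f} Hz: amplitude ~ {amp:.3f} (true {amp_true}), phase ~ {phase:+.2f} rad")
```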

  9. A low-cost, high-performance, digital signal processor-based lock-in amplifier capable of measuring multiple frequency sweeps simultaneously

    Sonnaillon, Maximiliano Osvaldo; Bonetto, Fabian Jose [Laboratorio de Cavitacion y Biotecnologia, San Carlos de Bariloche (8400) (Argentina)

    2005-02-01

    A high-performance digital lock-in amplifier implemented in a low-cost digital signal processor (DSP) board is described. This lock in is capable of simultaneously measuring multiple frequencies that change in time as frequency sweeps (chirps). The 32-bit DSP used has enough computing power to generate N=3 simultaneous reference signals and accurately measure the N=3 responses, operating as three lock ins connected in parallel to a linear system. The lock in stores the measured values in memory until they are downloaded to a personal computer (PC). The lock in works in stand-alone mode and can be programmed and configured through the PC serial port. Downsampling and multiple filter stages were used in order to obtain a sharp roll-off and a long time constant in the filters. This makes measurements possible in the presence of high noise levels. Before each measurement, the lock in performs an autocalibration that measures the frequency response of the analog output and input circuitry in order to compensate for the departure from ideal operation. Improvements over previous lock-in implementations allow measuring the frequency response of a system in a short time. Furthermore, the proposed implementation can measure how the frequency response changes with time, a characteristic that is very important in our biotechnological application. The number of simultaneous components that the lock in can generate and measure can be extended, without reprogramming, simply by using other DSPs of the same family that are code compatible and work at higher clock frequencies.

  10. High Performance Computing Multicast

    2012-02-01

    Reference-list and acronym-glossary fragments only: “A History of the Virtual Synchrony Replication Model,” in Replication: Theory and Practice, Charron-Bost, B., Pedone, F., and Schiper, A. (Eds.); ... Performance Computing; IP / IPv4 Internet Protocol (version 4.0); IPMC Internet Protocol MultiCast; LAN Local Area Network; MCMD Dr. Multicast; MPI ...

  11. NGINX high performance

    Sharma, Rahul

    2015-01-01

    System administrators, developers, and engineers looking for ways to achieve maximum performance from NGINX will find this book beneficial. If you are looking for solutions such as how to handle more users from the same system or load your website pages faster, then this is the book for you.

  12. Comparison of turbulence measurements from DIII-D low-mode and high-performance plasmas to turbulence simulations and models

    Rhodes, T.L.; Leboeuf, J.-N.; Sydora, R.D.; Groebner, R.J.; Doyle, E.J.; McKee, G.R.; Peebles, W.A.; Rettig, C.L.; Zeng, L.; Wang, G.

    2002-01-01

    Measured turbulence characteristics (correlation lengths, spectra, etc.) in low-confinement (L-mode) and high-performance plasmas in the DIII-D tokamak [Luxon et al., Proceedings Plasma Physics and Controlled Nuclear Fusion Research 1986 (International Atomic Energy Agency, Vienna, 1987), Vol. I, p. 159] show many similarities with the characteristics determined from turbulence simulations. Radial correlation lengths Δr of density fluctuations from L-mode discharges are found to be numerically similar to the ion poloidal gyroradius ρ θ,s , or 5-10 times the ion gyroradius ρ s , over the radial region 0.2 ... To determine whether Δr scales with ρ θ,s or with 5-10 times ρ s , an experiment was performed which modified ρ θ,s while keeping other plasma parameters approximately fixed. It was found that the experimental Δr did not scale as ρ θ,s , which was similar to low-resolution UCAN simulations. Finally, both experimental measurements and gyrokinetic simulations indicate a significant reduction in the radial correlation length in high-performance quiescent double barrier discharges, as compared to normal L-mode, consistent with reduced transport in these high-performance plasmas.
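
    As a companion to the correlation-length comparison above, the sketch below shows one common way a radial correlation length Δr can be estimated: compute the cross-correlation between a reference channel and channels at increasing radial separation and take the 1/e decay point. The synthetic signals, channel spacings and the 1/e definition are illustrative assumptions, not the DIII-D analysis chain.

```python
import numpy as np

rng = np.random.default_rng(1)
n_t, dr_true = 20_000, 1.5                 # samples per channel; true length in cm
separations = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0])   # radial spacings in cm

# Synthetic fluctuation channels whose shared component decays with separation
common = rng.standard_normal(n_t)
signals = [np.exp(-s / dr_true) * common
           + np.sqrt(1.0 - np.exp(-2.0 * s / dr_true)) * rng.standard_normal(n_t)
           for s in separations]

# Cross-correlation of each channel with the reference channel at zero separation
coh = np.array([np.corrcoef(signals[0], sig)[0, 1] for sig in signals])

# Define Delta_r as the separation where the correlation falls to 1/e
delta_r = np.interp(1.0 / np.e, coh[::-1], separations[::-1])
print(f"estimated radial correlation length ~ {delta_r:.2f} cm (true {dr_true} cm)")
```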

  13. Microwave-assisted Derivatization of Fatty Acids for Its Measurement in Milk Using High-Performance Liquid Chromatography.

    Shrestha, Rojeet; Miura, Yusuke; Hirano, Ken-Ichi; Chen, Zhen; Okabe, Hiroaki; Chiba, Hitoshi; Hui, Shu-Ping

    2018-01-01

    Fatty acid (FA) profiling of milk has important applications in human health and nutrition. Conventional methods for the saponification and derivatization of FA are time-consuming and laborious. We aimed to develop a simple, rapid, and economical method for the determination of FA in milk. We applied a beneficial approach of microwave-assisted saponification (MAS) of milk fats and microwave-assisted derivatization (MAD) of FA to its hydrazides, integrated with HPLC-based analysis. The optimal conditions for MAS and MAD were determined. Microwave irradiation significantly reduced the sample preparation time from 80 min in the conventional method to less than 3 min. We used three internal standards for the measurement of short-, medium- and long-chain FA. The proposed method showed satisfactory analytical sensitivity, recovery and reproducibility. There was a significant correlation in the milk FA concentrations between the proposed and conventional methods. Being quick, economical, and convenient, the proposed method for milk FA measurement can be a substitute for the conventional method.

  14. High performance liquid chromatographic separation of beryllium from some transition metals produced in high energy proton irradiations of medium mass elements: measurement of (p,7Be) cross sections

    Fassbender, M.; Spellerberg, S.; Qaim, S.M.

    1996-01-01

    A high performance liquid chromatographic (HPLC) method was developed for the separation of 7Be formed in high energy proton irradiation of medium mass elements like Fe, Cu etc. The bulk of the target material was removed in a preseparation step. Thereafter beryllium was obtained in high purity within a few minutes elution time using a mixture of 5 mM citric acid and 1.0 mM pyridinedicarboxylic acid as eluent and a SYKAM KO2 analytical cation-exchange column. The effect of Be-carrier on the quality of separation was investigated. The quality of separation deteriorated with increasing Be-carrier column loading. A certain amount of Be-carrier was, however, necessary in order to quantitate the results. By using low Be-carrier amounts (∼100 μg) and determining the elution yield via a conductometric method, it was possible to obtain quantitative separation results. Besides the analytical column, a semi-preparative column was also used, and the Be separation yield determined gravimetrically. The cross sections for the (p,7Be) process on Cu obtained using the two separation columns (analytical and semipreparative) and the two separation yield determination methods agreed within 15%. (orig.)

  15. Performance measures for a dialysis setting.

    Gu, Xiuzhu; Itoh, Kenji

    2018-03-01

    This study from Japan extracted performance measures for dialysis unit management and investigated their characteristics from professional viewpoints. Two surveys were conducted using self-administered questionnaires, in which dialysis managers/staff were asked to rate the usefulness of 44 performance indicators. A total of 255 managers and 2,097 staff responded. Eight performance measures were elicited from the dialysis manager and staff responses: safety, operational efficiency, quality of working life, financial effectiveness, employee development, mortality, patient/employee satisfaction and patient-centred health care. These performance measures were largely compatible with those extracted for overall healthcare settings in a previous study. Internal reliability, content validity and construct validity of the performance measures for the dialysis setting were ensured to some extent. As a general trend, both dialysis managers and staff perceived the performance measures as highly useful, especially safety, mortality, operational efficiency and patient/employee satisfaction, but showed relatively low concern for patient-centred health care and employee development. However, dialysis managers' usefulness perceptions were significantly higher than those of staff. The study yielded important guidelines for designing a holistic hospital/clinic management system: performance measures must be balanced between outcomes and performance shaping factors (PSF); a common set of performance measures could be applied to all healthcare settings, although the performance indicators of each measure should be composed based on the application field and setting; and sound causal relationships between PSF and outcome measures/indicators should be explored for further improvement. © 2017 European Dialysis and Transplant Nurses Association/European Renal Care Association.

  16. Measurement of Capsaicinoids in Chiltepin Hot Pepper: A Comparison Study between Spectrophotometric Method and High Performance Liquid Chromatography Analysis

    Alberto González-Zamora

    2015-01-01

    Direct spectrophotometric determination of capsaicinoids content in Chiltepin pepper was investigated as a possible alternative to HPLC analysis. Capsaicinoids were extracted from Chiltepin in red ripe and green fruit with acetonitrile and evaluated quantitatively using the HPLC method with capsaicin and dihydrocapsaicin standards. Three samples of different treatment were analyzed successfully for their capsaicinoids content by these methods. HPLC-DAD revealed that capsaicin, dihydrocapsaicin, and nordihydrocapsaicin comprised up to 98% of total capsaicinoids detected. The absorbance of the diluted samples was read on a spectrophotometer at 215-300 nm and monitored at 280 nm. We report herein the comparison between traditional UV assays and HPLC-DAD methods for the determination of the molar absorptivity coefficient of capsaicin (ε280 = 3,410 and ε280 = 3,720 M(-1) cm(-1)) and dihydrocapsaicin (ε280 = 4,175 and ε280 = 4,350 M(-1) cm(-1)), respectively. Statistical comparisons were performed using regression analyses (ordinary linear regression and Deming regression) and Bland-Altman analysis. Comparative data for pungency were determined spectrophotometrically and by HPLC on samples ranging from 29.55 to 129 mg/g, with a correlation of 0.91. These results indicate that the two methods significantly agree. The described spectrophotometric method can be routinely used for total capsaicinoids analysis and quality control in agricultural and pharmaceutical analysis.
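
    For context, the spectrophotometric route above ultimately rests on the Beer-Lambert law, A = ε·c·l. A minimal worked example in Python, using the capsaicin molar absorptivity reported in the abstract together with an assumed 1 cm path length and a hypothetical absorbance reading:

```python
# Molar absorptivities at 280 nm reported in the abstract (UV assay values)
EPSILON_CAPSAICIN = 3410.0          # M^-1 cm^-1
EPSILON_DIHYDROCAPSAICIN = 4175.0   # M^-1 cm^-1

def concentration_molar(absorbance: float, epsilon: float, path_cm: float = 1.0) -> float:
    """Beer-Lambert law: A = epsilon * c * l  ->  c = A / (epsilon * l)."""
    return absorbance / (epsilon * path_cm)

A280 = 0.42                          # hypothetical absorbance of a diluted extract
c = concentration_molar(A280, EPSILON_CAPSAICIN)
print(f"capsaicin ~ {c * 1e6:.0f} micromolar in the diluted sample")   # ~123 uM
```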

  17. Opcode counting for performance measurement

    Gara, Alan; Satterfield, David L.; Walkup, Robert E.

    2018-03-20

    Methods, systems and computer program products are disclosed for measuring a performance of a program running on a processing unit of a processing system. In one embodiment, the method comprises informing a logic unit of each instruction in the program that is executed by the processing unit, assigning a weight to each instruction, assigning the instructions to a plurality of groups, and analyzing the plurality of groups to measure one or more metrics. In one embodiment, each instruction includes an operating code portion, and the assigning includes assigning the instructions to the groups based on the operating code portions of the instructions. In an embodiment, each type of instruction is assigned to a respective one of the plurality of groups. These groups may be combined into a plurality of sets of the groups.
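
    A minimal Python sketch of the opcode-counting idea in this abstract: each executed instruction is reported, assigned a weight, grouped by its operating-code portion, and the groups are analyzed for metrics. The opcodes, weights and instruction stream are hypothetical, not the patented hardware logic unit.

```python
from collections import Counter, defaultdict

# Hypothetical weights per instruction type (the "operating code portion")
OPCODE_WEIGHTS = {"load": 2.0, "store": 2.0, "add": 1.0, "mul": 3.0, "branch": 1.5}

def analyze(instruction_stream):
    counts = Counter()                 # one group per instruction type
    weighted = defaultdict(float)
    for instr in instruction_stream:
        opcode = instr.split()[0]      # group by the operating code portion
        counts[opcode] += 1
        weighted[opcode] += OPCODE_WEIGHTS.get(opcode, 1.0)
    total = sum(weighted.values())
    shares = {op: w / total for op, w in weighted.items()}   # an example metric
    memory_ops = counts["load"] + counts["store"]            # a combined set of groups
    return counts, shares, memory_ops

program = ["load r1", "load r2", "add r3", "mul r4", "store r3", "branch L1"]
print(analyze(program))
```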

  18. High performance proton accelerators

    Favale, A.J.

    1989-01-01

    In concert with this theme, this paper briefly outlines how Grumman, over the past 4 years, has evolved from a company that designed and fabricated a Radio Frequency Quadrupole (RFQ) accelerator from Los Alamos National Laboratory (LANL) physics and specifications into a company that, as prime contractor, is designing, fabricating, assembling and commissioning the US Army Strategic Defense Command's (USA SDC) Continuous Wave Deuterium Demonstrator (CWDD) accelerator as a turn-key operation. In the case of the RFQ, LANL scientists performed the physics analysis, established the specifications, supported Grumman on the mechanical design, conducted the RFQ tuning and tested the RFQ at their laboratory. For the CWDD Program, Grumman has the responsibility for the physics and engineering designs, assembly, testing and commissioning, albeit with the support of consultants from LANL, Lawrence Berkeley Laboratory (LBL) and Brookhaven National Laboratory. In addition, Culham Laboratory and LANL are team members on CWDD. The physics design has been reviewed by LANL scientists as well as by a USA SDC review board. 9 figs

  19. “Measure Your Gradient”: A New Way to Measure Gradients in High Performance Liquid Chromatography by Mass Spectrometric or Absorbance Detection

    Magee, Megan H.; Manulik, Joseph C.; Barnes, Brian B.; Abate-Pella, Daniel; Hewitt, Joshua T.; Boswell, Paul G.

    2014-01-01

    The gradient produced by an HPLC is never the same as the one it is programmed to produce, but non-idealities in the gradient can be taken into account if they are measured. Such measurements are routine, yet only one general approach has been described to make them: both HPLC solvents are replaced with water, solvent B is spiked with 0.1% acetone, and the gradient is measured by UV absorbance. Despite the widespread use of this procedure, we found a number of problems and complications with it, mostly stemming from the fact that it measures the gradient under abnormal conditions (e.g. both solvents are water). It is also generally not amenable to MS detection, leaving those with only an MS detector no way to accurately measure their gradients. We describe a new approach called “Measure Your Gradient” that potentially solves these problems. One runs a test mixture containing 20 standards on a standard stationary phase and enters their gradient retention times into open-source software available at www.measureyourgradient.org. The software uses the retention times to back-calculate the gradient that was truly produced by the HPLC. Here we present a preliminary investigation of the new approach. We found that gradients measured this way are comparable to those measured by a more accurate, albeit impractical, version of the conventional approach. The new procedure worked with different gradients, flow rates, column lengths, inner diameters, on two different HPLCs, and with six different batches of the standard stationary phase. PMID:25441073

  20. Measurement of surface contamination by certain antineoplastic drugs using high-performance liquid chromatography: applications in occupational hygiene investigations in hospital environments.

    Rubino, F M; Floridia, L; Pietropaolo, A M; Tavazzani, M; Colombi, A

    1999-01-01

    Within the context of continuing interest in the occupational hygiene of hospitals as workplaces, the authors report the results of a preliminary study on surface contamination by certain antineoplastic drugs (ANDs), recently performed in eight cancer departments of two large general hospitals in Milan, Italy. Since reliable quantitative information on the exposure levels to individual drugs is mandatory to establish a strong interpretative framework for correctly assessing the health risks associated with manipulation of ANDs and to rationally advise intervention priorities for exposure abatement, two automated analytical methods were set up using reverse-phase high-performance liquid chromatography for the measurement of contamination by 1) methotrexate (MTX) and 2) the three most important nucleoside analogue antineoplastic drugs (5-fluorouracil, 5FU; cytarabine, CYA; gemcitabine, GCA) on surfaces such as those of preparation hoods and work-benches in the pharmacies of cancer wards. The methods are characterized by a short analysis time (7 min) under isocratic conditions, by the use of a mobile phase with a minimal content of organic solvent and by high sensitivity, adequate to detect surface contamination in the 5-10 micrograms/m2 range. To exemplify the performance of the analytical methods in the assessment of contamination levels from the target analyte ANDs, data are reported on the contamination levels measured on various surfaces (such as handles, floor surfaces and window panes, even far from the preparation hood). Analyte concentrations corresponding to 0.8-1.5 micrograms of 5FU were measured on telephones, 0.85-28 micrograms/m2 of CYA were measured on tables, and 1.2-1150 micrograms/m2 of GCA on furniture and floors. Spillage fractions of 1-5% of the used ANDs (daily use: 5FU 7-13 g; CYA 0.1-7.1 g; GCA 0.2-5 g) were measured on the disposable polythene-backed paper cover sheet of the preparation hood.

  1. Performance measurement and pay for performance

    Tuijl, van H.F.J.M.; Kleingeld, P.A.M.; Algera, J.A.; Rutten, M.L.; Sonnentag, S.

    2002-01-01

    This chapter, which takes a (re)design perspective, focuses on the management of employees’ contributions to organisational goal attainment. The control loop for the self-regulation of task performance is used as a frame of reference. Several subsets of design requirements are described and related

  2. Measurement properties of performance-based outcome measures to assess physical function in young and middle-aged people known to be at high risk of hip and/or knee osteoarthritis: a systematic review.

    Kroman, S L; Roos, E M; Bennell, K L; Hinman, R S; Dobson, F

    2014-01-01

    To systematically appraise the evidence on measurement properties of performance-based outcome measures to assess physical function in young and middle-aged people known to be at high risk of hip and/or knee osteoarthritis (OA). Electronic searches were performed in MEDLINE, CINAHL, Scopus and SPORTDiscus in May 2013. Two reviewers independently rated the measurement properties using the 4-point COSMIN checklist. Best evidence synthesis was made using COSMIN quality, consistency and direction of findings and sample size. Twenty of 2736 papers were eligible for inclusion, and 24 different performance-based outcome measures in knee-injured or obese populations were evaluated. No tests related to hip populations were included. Twenty-five measurement properties, including reliability (nine studies), construct validity (hypothesis testing) (nine studies), measurement error (three studies), structural validity (two studies), interpretability (one study) and responsiveness (one study), were evaluated. A positive rating was given to 12.5% (30/240) of all possible measurement ratings. Tests were grouped into two categories based on the population characteristics. The one-legged hop for distance, followed by the 6-m timed hop and the crossover hop for distance, were the best-rated tests for the knee-injured population, whereas the 6-min walk test was the only included test for the obese population. This review highlights the many gaps in knowledge about the measurement properties of performance-based outcome measures for young and middle-aged people known to be at high risk of hip and/or knee OA. There is a need for consensus on which outcome measures should be used and/or combined when assessing physical function in this population. Further good quality research is required. Copyright © 2013 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  3. The importance of the radial electric field (Er) on interpretation of motional Stark effect measurements of the q profile in DIII-D high performance plasmas

    Rice, B.W.; Lao, L.L.; Burrell, K.H.; Greenfield, C.M.; Lin-Liu, Y.R.

    1997-06-01

    The development of enhanced confinement regimes such as negative central magnetic shear (NCS) and VH-mode illustrates the importance of the q profile and E×B velocity shear in improving stability and confinement in tokamak plasmas. Recently, it was realized that the large values of radial electric field observed in these high performance plasmas, up to 200 kV/m in DIII-D, have an effect on the interpretation of motional Stark effect (MSE) measurements of the q profile. It has also been shown that, with additional MSE measurements, one can extract a direct measurement of Er in addition to the usual poloidal field measurement. During a recent vent on DIII-D, 19 additional MSE channels with new viewing angles were added (for a total of 35 channels) in order to discriminate between the neutral beam vb × B electric field and the plasma Er field. In this paper, the system upgrade will be described and initial measurements demonstrating simultaneous measurement of the q and Er profiles will be presented.

  4. The Validity of Subjective Performance Measures

    Meier, Kenneth J.; Winter, Søren C.; O'Toole, Laurence J.

    2015-01-01

    to provide, and are highly policy-specific, rendering generalization difficult. But are perceptual performance measures valid, and do they generate unbiased findings? We examine these questions in a comparative study of middle managers in schools in Texas and Denmark. The findings are remarkably similar...

  5. On the Measurement of Morphine Level and Determination of Consumption of Different Drugs in People’s Urine at Different Ages through High-Performance Liquid Chromatography

    Saeed Shahabi

    2015-06-01

    Objective: Morphine is one of the important narcotics and constitutes one of the alkaloid components of opium. If this substance is prepared defectively, it will appear in a variety of colors; therefore, it is not possible to identify it by its color. Method: In this study, drug addicts were invited to take urine tests. After morphine extraction from the urine samples by the chromium toxicity method, different standard concentrations were injected into the HPLC device and the resulting chromatograms were analyzed. Then, some changes were made to the methodology to optimize the measurement process and the determination of morphine in human urine. Results: It was found that the amount of morphine present in the urine samples was measurable by high-performance liquid chromatography and that the amount of impurities added to the drugs could be determined. Conclusion: This method can be used for diagnosis.

  6. Online stable carbon isotope ratio measurement in formic acid, acetic acid, methanol and ethanol in water by high performance liquid chromatography-isotope ratio mass spectrometry

    Tagami, Keiko; Uchida, Shigeo

    2008-01-01

    A suitable analysis condition was determined for high performance liquid chromatography-isotope ratio mass spectrometry (HPLC-IRMS) for sequential measurements of stable carbon isotope ratios (δ13C) in formic acid, acetic acid, methanol and ethanol dissolved in water. For this online column separation method, organic reagents are not applicable due to carbon contamination; thus, water and KH2PO4 at low concentrations were tested as mobile phase in combination with a HyPURITY AQUASTAR™ column. Formic acid, acetic acid, methanol and ethanol were separated when 2 mM KH2PO4 aqueous solution was used. Under the determined analysis condition for HPLC-IRMS, carbon concentrations could be measured quantitatively, as well as the carbon isotope ratio, when the carbon concentration was higher than 0.4 mmol/L for each chemical.
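
    The quantity measured above is conventionally reported in delta notation. A minimal worked example in Python; the sample 13C/12C ratio and the approximate reference value used here are illustrative assumptions, not data from the paper.

```python
R_VPDB = 0.01118     # approximate 13C/12C of the VPDB reference standard (assumed here)

def delta13c_permil(r_sample: float, r_standard: float = R_VPDB) -> float:
    """delta13C (per mil) = (R_sample / R_standard - 1) * 1000, with R = 13C/12C."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A hypothetical sample ratio; the result is a typical per-mil value for organic carbon
print(f"delta13C ~ {delta13c_permil(0.01095):.1f} per mil")
```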

  7. Performance measurement for information systems: Industry perspectives

    Bishop, Peter C.; Yoes, Cissy; Hamilton, Kay

    1992-01-01

    Performance measurement has become a focal topic for information systems (IS) organizations. Historically, IS performance measures have dealt with the efficiency of the data processing function. Today, the function of most IS organizations goes beyond simple data processing. To understand how IS organizations have developed meaningful performance measures that reflect their objectives and activities, industry perspectives on IS performance measurement were studied. The objectives of the study were to understand the state of the practice in IS performance measurement techniques; to gather approaches and examples of actual performance measures used in industry; and to report patterns, trends, and lessons learned about performance measurement to NASA/JSC. Examples of how some of the most forward-looking companies are shaping their IS processes through measurement are provided. Thoughts on the presence of a life cycle in the development of performance measures and a suggested taxonomy for performance measurements are included in the appendices.

  8. Measurement of isotopic composition of lanthanides in reprocessing process solutions by high-performance liquid chromatography with inductively coupled plasma mass spectrometry (HPLC/ICP-MS)

    Okano, Masanori; Jitsukata, Shu; Kuno, Takehiko; Yamada, Keiji

    2011-01-01

    Isotopic compositions of fission products in process solutions and wastes in a reprocessing plant are valuable for safety studies of the solutions and for research and development concerning treatment and disposal of the wastes. The amount of neodymium-148 is a reliable indicator for evaluating irradiation history. The isotopic compositions of samarium and gadolinium in high radioactive wastes are referred to as essential data to evaluate environmental impact in geological repositories. However, pretreatment for analysis must be done with complicated chemical separation such as solvent extraction and ion exchange. Actual measurement data on the isotopic compositions of lanthanides, comparable to those of actinides in the spent fuel reprocessing process, have not been obtained in sufficient quantity. A rapid and highly sensitive analytical technique based on high-performance liquid chromatography (HPLC) with inductively coupled plasma mass spectrometry (ICP-MS) has been developed for the measurement of isotopic compositions of lanthanides in spent fuel reprocessing solutions. The HPLC/ICP-MS measurement system was customized for a glove-box so that it could be applied to the radioactive solutions. The cation-exchange chromatographic columns (Shim-pack IC-C1) and the injection valve (20 μL) were located inside the glove-box, with only the chromatographic pump outside. The elements of the lanthanide group were separated by a gradient program of HPLC with α-hydroxyisobutyric acid. The isotopic composition of lanthanides in the eluate was sequentially analyzed by a quadrupole ICP-MS. Optimization of the parameters of the HPLC and ICP-MS measurement system was examined with standard solutions containing 14 lanthanide elements. The elements of the lanthanides were separated by HPLC and detected by ICP-MS within 25 minutes. The detection limits of Nd-146, Sm-147 and Gd-157 were 0.37 μg L(-1), 0.69 μg L(-1) and 0.47 μg L(-1), respectively. The analytical precision of the above three isotopes was better than 10% for standard solutions of 100 μg L(-1) with

  9. 45 CFR 305.2 - Performance measures.

    2010-10-01

    ... PROGRAM PERFORMANCE MEASURES, STANDARDS, FINANCIAL INCENTIVES, AND PENALTIES § 305.2 Performance measures. (a) The child support incentive system measures State performance levels in five program areas...

  10. High Performance Networks for High Impact Science

    Scott, Mary A.; Bair, Raymond A.

    2003-02-13

    This workshop was the first major activity in developing a strategic plan for high-performance networking in the Office of Science. Held August 13 through 15, 2002, it brought together a selection of end users, especially representing the emerging, high-visibility initiatives, and network visionaries to identify opportunities and begin defining the path forward.

  11. Emission measurement of diesel vehicles in Hong Kong through on-road remote sensing: Performance review and identification of high-emitters.

    Huang, Yuhan; Organ, Bruce; Zhou, John L; Surawski, Nic C; Hong, Guang; Chan, Edward F C; Yam, Yat Shing

    2018-06-01

    A two-year remote sensing measurement program was carried out in Hong Kong to obtain a large dataset of on-road diesel vehicle emissions. Analysis was performed to evaluate the effect of vehicle manufacture year (1949-2015) and engine size (0.4-20 L) on the emission rates and high-emitters. The results showed that CO emission rates of larger engine size vehicles were higher than those of small vehicles during the study period, while HC and NO were higher before manufacture year 2006 and then became similar levels between manufacture years 2006 and 2015. CO, HC and NO of all vehicles showed an unexpectedly increasing trend during 1998-2004, in particular ≥6001 cc vehicles. However, they all decreased steadily in the last decade (2005-2015), except for NO of ≥6001 cc vehicles during 2013-2015. The distributions of CO and HC emission rates were highly skewed as the dirtiest 10% vehicles emitted much higher emissions than all the other vehicles. Moreover, this skewness became more significant for larger engine size or newer vehicles. The results indicated that remote sensing technology would be very effective to screen the CO and HC high-emitters and thus control the on-road vehicle emissions, but less effective for controlling NO emissions. No clear correlation was observed between the manufacture year and percentage of high-emitters for ≤3000 cc vehicles. However, the percentage of high-emitters decreased with newer manufacture year for larger vehicles. In addition, high-emitters of different pollutants were relatively independent, in particular NO emissions, indicating that high-emitter screening criteria should be defined on a CO-or-HC-or-NO basis, rather than a CO-and-HC-and-NO basis. Copyright © 2018 Elsevier Ltd. All rights reserved.
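
    A minimal Python sketch of the two analyses highlighted above, the contribution of the dirtiest 10% of vehicles and the "CO-or-HC-or-NO" versus "CO-and-HC-and-NO" screening bases, using synthetic, skewed emission data; the distributions and cut points are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
# Log-normal (skewed) emission rates mimic a fleet where a few vehicles dominate
co = rng.lognormal(mean=0.0, sigma=1.2, size=n)
hc = rng.lognormal(mean=-1.0, sigma=1.0, size=n)
no = rng.lognormal(mean=-0.5, sigma=0.8, size=n)

def share_of_dirtiest_decile(x: np.ndarray) -> float:
    cutoff = np.quantile(x, 0.9)
    return float(x[x >= cutoff].sum() / x.sum())

print(f"top 10% of CO emitters contribute {share_of_dirtiest_decile(co):.0%} of total CO")

# High-emitter flags: the "or" basis flags a vehicle if ANY pollutant exceeds its cut point
co_cut, hc_cut, no_cut = (np.quantile(v, 0.95) for v in (co, hc, no))
flag_or = (co > co_cut) | (hc > hc_cut) | (no > no_cut)
flag_and = (co > co_cut) & (hc > hc_cut) & (no > no_cut)
print(f"flagged on or-basis: {flag_or.mean():.1%}, on and-basis: {flag_and.mean():.1%}")
```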

  12. Measurement of K-27, an oxime-type cholinesterase reactivator by high-performance liquid chromatography with electrochemical detection from different biological samples.

    Gyenge, Melinda; Kalász, Huba; Petroianu, George A; Laufer, Rudolf; Kuca, Kamil; Tekes, Kornélia

    2007-08-17

    K-27 is a bisquaternary asymmetric pyridinium aldoxime-type cholinesterase reactivator used in the treatment of poisoning with organophosphorus esterase inhibitors. A sensitive, simple and reliable reverse-phase high-performance liquid chromatographic method with electrochemical detection was developed for the measurement of K-27 concentrations in rat brain, cerebrospinal fluid, serum and urine samples. Male Wistar rats were treated intramuscularly with K-27 and the samples were collected 60 min later. Separation was carried out on an octadecyl silica stationary phase, and a disodium phosphate solution (pH 3.7) containing citric acid, octane sulphonic acid and acetonitrile served as the mobile phase. Measurements were carried out at 30 degrees C at E(ox) 0.65 V. The calibration curve was linear through the range of 10-250 ng/mL. Accuracy, precision and the calculated limit of detection were satisfactory according to internationally accepted criteria. The limit of quantitation was 10 ng/mL. The method developed is reliable and sensitive enough for monitoring K-27 levels in different biological samples, including as little as 10 microL of cerebrospinal fluid. With a slight modification in the composition of the mobile phase, the method can also be used to measure a wide range of other related pyridinium aldoxime-type cholinesterase reactivators.

  13. RavenDB high performance

    Ritchie, Brian

    2013-01-01

    RavenDB High Performance is a comprehensive yet concise tutorial that developers can use to... This book is for developers & software architects who are designing systems in order to achieve high performance right from the start. A basic understanding of RavenDB is recommended, but not required. While the book focuses on advanced topics, it does not assume that the reader has a great deal of prior knowledge of working with RavenDB.

  14. Mobility and reliability performance measurement.

    2013-06-01

    This project grew out of the fact that mobility was identified early on as one of the key performance focus areas of NCDOT's strategic transformation effort. The Transformation Management Team (TMT) established a TMT Mobility Workstream Team in...

  15. High-Performance Operating Systems

    Sharp, Robin

    1999-01-01

    Notes prepared for the DTU course 49421 "High Performance Operating Systems". The notes deal with quantitative and qualitative techniques for use in the design and evaluation of operating systems in computer systems for which performance is an important parameter, such as real-time applications, communication systems and multimedia systems.

  16. Development and validation of an ultra-high performance liquid chromatography-tandem mass spectrometry method to measure creatinine in human urine.

    Fraselle, S; De Cremer, K; Coucke, W; Glorieux, G; Vanmassenhove, J; Schepers, E; Neirynck, N; Van Overmeire, I; Van Loco, J; Van Biesen, W; Vanholder, R

    2015-04-15

    Despite decades of creatinine measurement in biological fluids using a large variety of analytical methods, an accurate determination of this compound remains challenging. Especially with the novel trend to assess biomarkers on large sample sets preserved in biobanks, a simple and fast method that could cope with both a high sample throughput and a low volume of sample is still of interest. In answer to these challenges, a fast and accurate ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method was developed to measure creatinine in small volumes of human urine. In this method, urine samples are simply diluted with a basic mobile phase and injected directly under positive electrospray ionization (ESI) conditions, without further purification steps. The combination of an important diluting factor (10(4) times) due to the use of a very sensitive triple quadrupole mass spectrometer (XEVO TQ) and the addition of creatinine-d3 as internal standard completely eliminates matrix effects coming from the urine. The method was validated in-house in 2012 according to the EMA guideline on bioanalytical method validation using Certified Reference samples from the German External Quality Assessment Scheme (G-Equas) proficiency test. All obtained results for accuracy and recovery are within the authorized tolerance ranges defined by G-Equas. The method is linear between 0 and 5 g/L, with LOD and LOQ of 5 × 10(-3) g/L and 10(-2) g/L, respectively. The repeatability (CV(r) = 1.03-2.07%) and intra-laboratory reproducibility (CV(RW) = 1.97-2.40%) satisfy the EMA 2012 guideline. The validated method was firstly applied to perform the German G-Equas proficiency test rounds 51 and 53, in 2013 and 2014, respectively. The obtained results were again all within the accepted tolerance ranges and very close to the reference values defined by the organizers of the proficiency test scheme, demonstrating an excellent accuracy of the developed method. The
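
    The quantification principle above is isotope dilution: creatinine is read off a calibration line built from the ratio of its peak area to that of the creatinine-d3 internal standard. A minimal Python sketch with hypothetical calibrator levels and peak areas; calibrators and samples are assumed to be diluted identically, so the dilution factor cancels.

```python
import numpy as np

# Hypothetical calibration: analyte/IS peak-area ratios for calibrators in g/L
cal_conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0])           # within the 0-5 g/L range
cal_ratio = np.array([0.052, 0.26, 0.51, 1.03, 2.55])    # area(creatinine) / area(creatinine-d3)

slope, intercept = np.polyfit(cal_ratio, cal_conc, 1)     # fit concentration vs ratio

def creatinine_g_per_l(area_analyte: float, area_istd: float) -> float:
    """Back-calculate urine creatinine from the analyte/internal-standard area ratio."""
    return slope * (area_analyte / area_istd) + intercept

print(f"creatinine ~ {creatinine_g_per_l(123_000, 118_000):.2f} g/L")
```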

  17. Traffic Management Systems Performance Measurement: Final Report

    Banks, James H.; Kelly, Gregory

    1997-01-01

    This report documents a study of performance measurement for Transportation Management Centers (TMCs). Performance measurement requirements were analyzed, data collection and management techniques were investigated, and case study traffic data system improvement plans were prepared for two Caltrans districts.

  18. Performance measurement and insurance liabilities

    Plantinga, A; Huijgen, C

    2001-01-01

    In this article, the authors develop an attribution framework for evaluating the investment performance of institutional investors such as insurance companies. The model is useful in identifying the investment skills of insurance companies. This is accomplished by developing a dual benchmark for the

  19. Ultra-high-performance liquid chromatography-tandem mass spectrometry measurement of climbazole deposition from hair care products onto artificial skin and human scalp.

    Chen, Guoqiang; Hoptroff, Michael; Fei, Xiaoqing; Su, Ya; Janssen, Hans-Gerd

    2013-11-22

    A sensitive and specific ultra-high-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method was developed and validated for the measurement of climbazole deposition from hair care products onto artificial skin and human scalp. Deuterated climbazole was used as the internal standard. Atmospheric pressure chemical ionization (APCI) in positive mode was applied for the detection of climbazole. For quantification, multiple reaction monitoring (MRM) transition 293.0>69.0 was monitored for climbazole, and MRM transition 296.0>225.1 for the deuterated climbazole. The linear range ran from 4 to 2000 ng mL(-1). The limit of detection (LOD) and the limit of quantification (LOQ) were 1 ng mL(-1) and 4 ng mL(-1), respectively, which enabled quantification of climbazole on artificial skin and human scalp at ppb level (corresponding to 16 ng cm(-2)). For the sampling of climbazole from human scalp the buffer scrub method using a surfactant-modified phosphate buffered saline (PBS) solution was selected based on a performance comparison of tape stripping, the buffer scrub method and solvent extraction in in vitro studies. Using this method, climbazole deposition in in vitro and in vivo studies was successfully quantified. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. An improved high-performance liquid chromatography-tandem mass spectrometric method to measure atrazine and its metabolites in human urine.

    Panuwet, Parinya; Restrepo, Paula A; Magsumbol, Melina; Jung, Kyung Y; Montesano, M Angela; Needham, Larry L; Barr, Dana Boyd

    2010-04-15

    We report an improved solid-phase extraction-high-performance liquid chromatography-tandem mass spectrometry method with isotope dilution quantification to measure seven atrazine metabolites in urine. The metabolites measured were hydroxyatrazine (HA), diaminochloroatrazine (DACT), desisopropylatrazine (DIA), desethylatrazine (DEA), desethylatrazine mercapturate (DEAM), atrazine mercapturate (ATZM), and atrazine (ATZ). Using offline mixed-mode reversed-phase/cation-exchange solid-phase extraction dramatically increased recovery and sensitivity by reducing the influence of matrix components during separation and analysis. DACT extraction recovery improved to greater than 80% while the other analytes had similar extraction efficiencies as previously observed. Limits of detection were lower than our previous method (0.05-0.19 ng/mL) with relative standard deviations less than 10%. The total runtime was shorter (18 min) than the previous on-line method, thus it is suitable for large-scale sample analyses. We increased the throughput of our method twofold by using the newer extraction technique. Published by Elsevier B.V.

  1. Pilot Implementation of a Field Study Design to Evaluate the Impact of Source Control Measures on Indoor Air Quality in High Performance Homes

    Widder, Sarah H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chamness, Michele A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Petersen, Joseph M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Singer, Brett C. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Maddalena, Randy L. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Destaillats, Hugo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Russell, M. L. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-10-01

    To improve the indoor air quality in new, high performance homes, a variety of standards and rating programs have been introduced to identify building materials that are designed to have lower emission rates of key contaminants of concern and a number of building materials are being introduced that are certified to these standards. For example, the U.S. Department of Energy (DOE) Zero Energy Ready Home program requires certification under the U.S. Environmental Protection Agency (EPA) Indoor airPLUS (IaP) label, which requires the use of PS1 or PS2 certified plywood and OSB; low-formaldehyde emitting wood products; low- or no-VOC paints and coatings as certified by Green Seal Standard GS-11, GreenGuard, SCS Indoor Advantage Gold Standard, MPI Green Performance Standard, or another third party rating program; and Green Label-certified carpet and carpet cushions. However, little is known regarding the efficacy of the IAP requirements in measurably reducing contaminant exposures in homes. The goal of this project is to develop a robust experimental approach and collect preliminary data to support the evaluation of indoor air quality (IAQ) measures linked to IAP-approved low-emitting materials and finishes in new residential homes. To this end, the research team of Pacific Northwest National Laboratory (PNNL) and Lawrence Berkeley National Laboratory (LBNL) developed a detailed experimental plan to measure IAQ constituents and other parameters, over time, in new homes constructed with materials compliant with IAP’s low-emitting material and ventilation requirements (i.e., section 6.1, 6.2, 6.3, and 7.2) and similar homes constructed to the state building code with conventional materials. The IAQ in IAP and conventional homes of similar age, location, and construction style is quantified as the differences in the speciated VOC and aldehyde concentrations, normalized to dilution rates. The experimental plan consists of methods to evaluate the difference between low

  2. EVA versus Other Performance Measures

    Hechmi Soumaya

    2013-01-01

    Creating value is not only intended to satisfy shareholders. It is also the way for the company to ensure its sustainability and to finance its growth. The company will not attract new capital if it destroys value. "The concept of value creation is none other than the intersection of strategy (create value) and technique (evaluate the company)" (Powilewicz, 2002). The basic idea behind the different measures of value creation by a company is that a company creates value for its ...

  3. Internet Performance and Reliability Measurements

    Cottrell, Les

    2003-01-01

    Collaborative HEP research is dependent on good Internet connectivity. Although most local- and wide-area networks are carefully watched, there is little monitoring of connections that cross many networks. This paper describes work in progress at several sites to monitor Internet end-to-end performance between hundreds of HEP sites worldwide. At each collection site, ICMP ping packets are automatically sent periodically to sites of interest. The data is recorded and made available to analysis nodes, which collect the data from multiple collection sites and provide analysis and graphing. Future work includes improving the efficiency and accuracy of ping data collection
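
    A minimal sketch of the collection side of such monitoring, assuming a Unix-like ping command is available on PATH; the host names, ping count and interval are illustrative, and the real infrastructure records far more metadata than this.

```python
import re
import subprocess
import time

HOSTS = ["example.org", "example.net"]     # hypothetical monitored sites

def ping_once(host: str, count: int = 10) -> dict:
    """Run the system ping and parse packet loss and average round-trip time."""
    out = subprocess.run(["ping", "-c", str(count), host],
                         capture_output=True, text=True).stdout
    loss = re.search(r"([\d.]+)% packet loss", out)
    rtt = re.search(r"= ([\d.]+)/([\d.]+)/([\d.]+)", out)
    return {
        "host": host,
        "loss_pct": float(loss.group(1)) if loss else 100.0,
        "rtt_avg_ms": float(rtt.group(2)) if rtt else None,
        "timestamp": time.time(),
    }

# Collection loop (three rounds here); analysis and graphing happen elsewhere
for _ in range(3):
    for host in HOSTS:
        print(ping_once(host))
    time.sleep(60)                         # e.g. once a minute per host
```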

  4. INL High Performance Building Strategy

    Jennifer D. Morton

    2010-02-01

    High performance buildings, also known as sustainable buildings and green buildings, are resource efficient structures that minimize the impact on the environment by using less energy and water, reduce solid waste and pollutants, and limit the depletion of natural resources while also providing a thermally and visually comfortable working environment that increases productivity for building occupants. As Idaho National Laboratory (INL) becomes the nation’s premier nuclear energy research laboratory, the physical infrastructure will be established to help accomplish this mission. This infrastructure, particularly the buildings, should incorporate high performance sustainable design features in order to be environmentally responsible and reflect an image of progressiveness and innovation to the public and prospective employees. Additionally, INL is a large consumer of energy that contributes to both carbon emissions and resource inefficiency. In the current climate of rising energy prices and political pressure for carbon reduction, this guide will help new construction project teams to design facilities that are sustainable and reduce energy costs, thereby reducing carbon emissions. With these concerns in mind, the recommendations described in the INL High Performance Building Strategy (previously called the INL Green Building Strategy) are intended to form the INL foundation for high performance building standards. This revised strategy incorporates the latest federal and DOE orders (Executive Order [EO] 13514, “Federal Leadership in Environmental, Energy, and Economic Performance” [2009], EO 13423, “Strengthening Federal Environmental, Energy, and Transportation Management” [2007], and DOE Order 430.2B, “Departmental Energy, Renewable Energy, and Transportation Management” [2008]), the latest guidelines, trends, and observations in high performance building construction, and the latest changes to the Leadership in Energy and Environmental Design

  5. Key indicators for organizational performance measurement

    Firoozeh Haddadi

    2014-09-01

    Each organization, in order to assess the utility and desirability of its activities, especially in complex and dynamic environments, needs to determine and rank its vital performance indicators. Indicators provide essential links among strategy, execution and ultimate value creation. The aim of this paper is to develop a framework which identifies and prioritizes the Key Performance Indicators (KPIs) that a company should focus on to define and measure progress towards organizational objectives. For this purpose, an applied research study was conducted in 2013 in an Iranian telecommunication company. We first determined the objectives of the company with respect to the four perspectives of the BSC (Balanced Scorecard) framework. Next, performance indicators were listed and pairwise comparisons were carried out by the company's high-ranking employees through standard Analytic Hierarchy Process (AHP) questionnaires. This helped us establish the weight of each indicator and rank them accordingly.
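
    A minimal Python sketch of how AHP pairwise-comparison responses can be turned into indicator weights, as described above. The 4x4 comparison matrix (one illustrative indicator per BSC perspective), the geometric-mean prioritization and the consistency check are textbook AHP steps, not the company's actual data or procedure.

```python
import numpy as np

indicators = ["financial margin", "customer satisfaction",
              "process cycle time", "employee training hours"]

# A[i, j] = how much more important indicator i is than indicator j (Saaty 1-9 scale)
A = np.array([
    [1,   3,   5,   7],
    [1/3, 1,   3,   5],
    [1/5, 1/3, 1,   3],
    [1/7, 1/5, 1/3, 1],
], dtype=float)

# Priority weights from the geometric mean of each row (a common AHP approximation)
gm = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()

# Simple consistency check via the principal eigenvalue
lam_max = np.linalg.eigvals(A).real.max()
ci = (lam_max - A.shape[0]) / (A.shape[0] - 1)
cr = ci / 0.90        # 0.90 = tabulated random consistency index for n = 4

for name, w in sorted(zip(indicators, weights), key=lambda p: -p[1]):
    print(f"{name:25s} weight = {w:.3f}")
print(f"consistency ratio ~ {cr:.2f} (values below ~0.10 are usually accepted)")
```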

  6. PRINCIPLES OF THE SUPPLY CHAIN PERFORMANCE MEASUREMENT

    BEATA ŚLUSARCZYK; SEBASTIAN KOT

    2012-01-01

    Measurement of performance in any business is a crucial activity that allows effectiveness to be increased. The lack of suitable performance measurement is especially noticeable in complex systems such as supply chains. Those responsible cannot manage effectively without a suitable set of measures that serve as a basis for comparison with previous data or with the performance of other supply chains. The analysis shows that it is very hard to find a balanced set of supply chain performance measures that sh...

  7. High performance fuel technology development

    Koon, Yang Hyun; Kim, Keon Sik; Park, Jeong Yong; Yang, Yong Sik; In, Wang Kee; Kim, Hyung Kyu [KAERI, Daejeon (Korea, Republic of)

    2012-01-15

    - Development of High Plasticity and Annular Pellet: development of strong candidates of ultra high burn-up fuel pellets for a PCI remedy; development of fabrication technology for annular fuel pellets.
    - Development of High Performance Cladding Materials: irradiation test of HANA claddings in the Halden research reactor and evaluation of the in-pile performance; development of the final candidates for the next generation cladding materials; development of the manufacturing technology for the dual-cooled fuel cladding tubes.
    - Irradiated Fuel Performance Evaluation Technology Development: development of a performance analysis code system for the dual-cooled fuel; development of fuel performance-proving technology.
    - Feasibility Studies on a Dual-Cooled Annular Fuel Core: analysis of the properties of a reactor core with dual-cooled fuel; feasibility evaluation of the dual-cooled fuel core.
    - Development of Design Technology for Dual-Cooled Fuel Structure: definition of technical issues and invention of a concept for the dual-cooled fuel structure; basic design and development of main structure components for dual-cooled fuel; basic design of a dual-cooled fuel rod.

  8. High Performance Bulk Thermoelectric Materials

    Ren, Zhifeng [Boston College, Chestnut Hill, MA (United States)

    2013-03-31

    Over more than 13 years, we have carried out research on the electron pairing symmetry of superconductors, the growth and field emission properties of carbon nanotubes and semiconducting nanowires, high performance thermoelectric materials, and other interesting materials. As a result of this research, we have published 104 papers and have educated six undergraduate students, twenty graduate students, nine postdocs, nine visitors, and one technician.

  9. High performance in software development

    CERN. Geneva; Haapio, Petri; Liukkonen, Juha-Matti

    2015-01-01

    What are the ingredients of high-performing software? Software development, especially for large high-performance systems, is one of the most complex tasks mankind has ever attempted. Technological change leads to huge opportunities but challenges our old ways of working. Processing large data sets, possibly in real time or with other tight computational constraints, requires an efficient solution architecture. Efficiency requirements span from distributed storage and the large-scale organization of computation and data down to the lowest level of processor and data bus behavior. Integrating performance behavior over these levels is especially important when the computation is resource-bounded, as it is in numerics: physical simulation, machine learning, estimation of statistical models, etc. For example, memory locality and utilization of vector processing are essential for harnessing the computing power of modern processor architectures, due to the deep memory hierarchies of modern general-purpose computers. As a r...

  10. Measuring levels of biogenic amines and their metabolites in rat brain tissue using high-performance liquid chromatography with photodiode array detection.

    Gu, Min-Jung; Jeon, Ji-Hyun; Oh, Myung Sook; Hong, Seon-Pyo

    2016-01-01

    We developed a method to detect biogenic amines and their metabolites in rat brain tissue using high-performance liquid chromatography with photodiode array detection. Measurements were made using a Hypersil Gold C-18 column (250 × 2.1 mm, 5 µm). The mobile phase was 5 mM perchloric acid containing 5% acetonitrile. The correlation coefficient was 0.9995-0.9999. LODs (S/N = 3) and LOQs (S/N = 10) were as follows: dopamine 0.4 and 1.3 pg, 3,4-dihydroxyphenylacetic acid 8.4 and 28.0 pg, serotonin 0.4 and 1.3 pg, 5-hydroxyindoleacetic acid 3.4 and 11.3 pg, and homovanillic acid 8.4 and 28.0 pg. This method does not require derivatization steps and is more sensitive than the widely used HPLC-UV method.
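
    A minimal worked example in Python of the S/N-based limits quoted above: the LOD is the injected amount that would give S/N = 3 and the LOQ the amount giving S/N = 10, assuming the signal scales linearly with amount. The peak height and noise values are hypothetical; the dopamine figures from the abstract are used only as a cross-check.

```python
def limit_from_sn(amount_pg: float, signal: float, noise: float, target_sn: float) -> float:
    """Scale the injected amount to the amount that would give the target S/N."""
    return amount_pg * target_sn / (signal / noise)

amount = 10.0     # pg of dopamine injected (assumed)
signal = 75.0     # peak height in arbitrary units (assumed)
noise = 1.0       # baseline noise in the same units (assumed)

print(f"LOD (S/N = 3)  ~ {limit_from_sn(amount, signal, noise, 3):.2f} pg")
print(f"LOQ (S/N = 10) ~ {limit_from_sn(amount, signal, noise, 10):.2f} pg")
# LOQ/LOD = 10/3 ~ 3.3, consistent with the reported 1.3 pg vs 0.4 pg for dopamine
```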

  11. Simultaneous measurement of proline and related compounds in oak leaves by high-performance ligand-exchange chromatography and electrospray ionization mass spectrometry for environmental stress studies.

    Oufir, Mouhssin; Schulz, Nadine; Sha Vallikhan, Patan Shaik; Wilhelm, Eva; Burg, Kornel; Hausman, Jean-Francois; Hoffmann, Lucien; Guignard, Cedric

    2009-02-13

    A mass spectrometer was coupled to high-performance ligand-exchange liquid chromatography (HPLEC) for the simultaneous analysis of stress-associated solutes such as proline, hydroxyproline, methylproline, glycine betaine and trigonelline extracted from leaves of drought-stressed oaks, together with an internal standard, namely N-acetylproline. Methanol/chloroform/water extracts were analyzed using an Aminex HPX-87C column and specifically quantified by electrospray ionisation mass spectrometry (ESI-MS) in positive ion, single ion monitoring (SIM) mode. The recovery of N-acetylproline added to oak leaf extracts ranged from 85.2 to 122.1% in an intra-day study. Standard calibration curves showed good linearity in the measured range from 0.3125 to 10 micromol L(-1), with the lowest correlation coefficient of 0.99961 for trigonelline. The advantages of this alternative procedure, compared to previously published methods using fluorescence or amperometric detection, are the simultaneous and direct detection of osmoprotectants in a single chromatographic run, minimal sample preparation, good specificity and reduced limits of quantification, ranging from 0.1 to 0.6 micromol L(-1). Fifty-six days of water deficit exposure resulted in increased foliar free proline levels (2.4-fold, P<0.001, 155 micromol g(-1) FW) and glycine betaine contents (2.5-fold, P<0.05, 175 micromol g(-1) FW) in drought-stressed oak compared to control.

  12. Performance Measurement in Global Product Development

    Taylor, Thomas Paul; Ahmed-Kristensen, Saeema

    2013-01-01

    there is a requirement for the process to be monitored and measured relative to the business strategy of an organisation. It was found that performance measurement is a process that helps achieve sustainable business success, encouraging a learning culture within organisations. To this day, much of the research into how performance is measured has focussed on the process of product development. However, exploration of performance measurement related to global product development is relatively unexplored and a need for further research is evident. This paper contributes towards understanding how performance is measured...

  13. Determination of the structure and composition of Au-Ag bimetallic spherical nanoparticles using single particle ICP-MS measurements performed with normal and high temporal resolution.

    Kéri, Albert; Kálomista, Ildikó; Ungor, Ditta; Bélteki, Ádám; Csapó, Edit; Dékány, Imre; Prohaska, Thomas; Galbács, Gábor

    2018-03-01

    In this study, the information that can be obtained by combining normal and high resolution single particle ICP-MS (spICP-MS) measurements for spherical bimetallic nanoparticles (BNPs) was assessed. One commercial certified core-shell Au-Ag nanoparticle and three newly synthesized and fully characterized homogeneous alloy Au-Ag nanoparticle batches of different composition were used in the experiments as BNP samples. By scrutinizing the high resolution spICP-MS signal time profiles, it was revealed that the width of the signal peak correlates linearly with the diameter of the nanoparticles. It was also observed that the width of the peak for same-size nanoparticles is always significantly larger for Au than for Ag. It was further found that it can be reliably determined whether a BNP has a homogeneous alloy or a core-shell structure and, in the case of the latter, which element the core comprises. We also assessed the performance of several ICP-MS based analytical methods in the analysis of the quantitative composition of bimetallic nanoparticles. Out of the three methods (normal resolution spICP-MS, direct NP nebulization with solution-mode ICP-MS, and solution-mode ICP-MS after acid dissolution of the nanoparticles), the best accuracy and precision were achieved by spICP-MS. This method allows the determination of the composition with less than 10% relative inaccuracy and better than 3% precision. The analysis is fast and only requires the usual standard colloids for size calibration. Combining the results from both quantitative and structural analyses, the core diameter and shell thickness of core-shell particles can also be calculated. Copyright © 2017 Elsevier B.V. All rights reserved.
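
    A minimal Python sketch of how a single spICP-MS particle event can be converted to a composition and an equivalent spherical diameter, under the standard assumption that the event signal is proportional to the element mass in the particle. The sensitivities and event intensities are assumptions for illustration; only the bulk densities of Au and Ag are physical constants.

```python
import math

SENS_AU = 5.0e4      # counts per fg of Au per particle event (assumed calibration)
SENS_AG = 8.0e4      # counts per fg of Ag per particle event (assumed calibration)
RHO_AU = 1.93e-5     # density of Au in fg/nm^3 (19.3 g/cm^3)
RHO_AG = 1.05e-5     # density of Ag in fg/nm^3 (10.5 g/cm^3)

def particle_metrics(counts_au: float, counts_ag: float):
    m_au = counts_au / SENS_AU                 # fg of Au in this particle event
    m_ag = counts_ag / SENS_AG                 # fg of Ag in this particle event
    x_au = m_au / (m_au + m_ag)                # Au mass fraction (the composition)
    volume = m_au / RHO_AU + m_ag / RHO_AG     # nm^3, assuming ideal volume additivity
    diameter = (6.0 * volume / math.pi) ** (1.0 / 3.0)   # nm, for a spherical particle
    return x_au, diameter

x_au, d = particle_metrics(counts_au=2.0e5, counts_ag=1.0e5)
print(f"Au mass fraction ~ {x_au:.2f}, equivalent spherical diameter ~ {d:.0f} nm")
```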

  14. Health Plan Performance Measurement within Medicare Subvention.

    1998-06-01

    the causes of poor performance (Siren & Laffel, 1996). Although outcomes measures such as nosocomial infection rates, admission rates for select...defined. Traditional outcomes measures include infection rates, morbidity, and mortality. The problem with these traditional measures is... Maternal /Child Care Indicators Nursing Staffing Indicators Outcome Indicators Technical Outcomes Plan Performance Stability of Health Plan

  15. Interpreting Mini-Mental State Examination Performance in Highly Proficient Bilingual Spanish-English and Asian Indian-English Speakers: Demographic Adjustments, Item Analyses, and Supplemental Measures.

    Milman, Lisa H; Faroqi-Shah, Yasmeen; Corcoran, Chris D; Damele, Deanna M

    2018-04-17

    Performance on the Mini-Mental State Examination (MMSE), among the most widely used global screens of adult cognitive status, is affected by demographic variables including age, education, and ethnicity. This study extends prior research by examining the specific effects of bilingualism on MMSE performance. Sixty independent community-dwelling monolingual and bilingual adults were recruited from eastern and western regions of the United States in this cross-sectional group study. Independent sample t tests were used to compare 2 bilingual groups (Spanish-English and Asian Indian-English) with matched monolingual speakers on the MMSE, demographically adjusted MMSE scores, MMSE item scores, and a nonverbal cognitive measure. Regression analyses were also performed to determine whether language proficiency predicted MMSE performance in both groups of bilingual speakers. Group differences were evident on the MMSE, on demographically adjusted MMSE scores, and on a small subset of individual MMSE items. Scores on a standardized screen of language proficiency predicted a significant proportion of the variance in the MMSE scores of both bilingual groups. Bilingual speakers demonstrated distinct performance profiles on the MMSE. Results suggest that supplementing the MMSE with a language screen, administering a nonverbal measure, and/or evaluating item-based patterns of performance may assist with test interpretation for this population.

  16. Measuring Distribution Performance? Benchmarking Warrants Your Attention

    Ericson, Sean J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Alvarez, Paul [The Wired Group

    2018-04-13

    Identifying, designing, and measuring performance metrics is critical to securing customer value, but can be a difficult task. This article examines the use of benchmarks based on publicly available performance data to set challenging, yet fair, metrics and targets.

  17. Synthesis of work-zone performance measures.

    2013-09-01

    The main objective of this synthesis was to identify and summarize how agencies collect, analyze, and report different work-zone : traffic-performance measures, which include exposure, mobility, and safety measures. The researchers also examined comm...

  18. Neo4j high performance

    Raj, Sonal

    2015-01-01

    If you are a professional or enthusiast who has a basic understanding of graphs or has basic knowledge of Neo4j operations, this is the book for you. Although it is targeted at an advanced user base, this book can be used by beginners as it touches upon the basics. So, if you are passionate about taming complex data with the help of graphs and building high performance applications, you will be able to get valuable insights from this book.

  19. Measuring the performance of business incubators

    VANDERSTRAETEN, Johanna; MATTHYSSENS, Paul; VAN WITTELOOSTUIJN, Arjen

    2012-01-01

    This paper focuses on incubator performance measurement. First, we report the findings of an extensive literature review. Both existing individual measures and more comprehensive measurement systems are discussed. This literature review shows that most incubator researchers and practitioners only use one or a few indicators for performance evaluation, and that existing measurement systems do not recognize the importance of short, medium and long-term results, do not always include an incubato...

  20. PERFORMANCE OF HIGH SCHOOL FOOTBALL PLAYERS ON CLINICAL MEASURES OF DEEP CERVICAL FLEXOR ENDURANCE AND CERVICAL ACTIVE RANGE OF MOTION: IS HISTORY OF CONCUSSION A FACTOR?

    Smith, Laura; Ruediger, Thomas; Alsalaheen, Bara; Bean, Ryan

    2016-04-01

    More than one million adolescent athletes participated in organized, high school sanctioned football during the 2014-15 season. These athletes are at risk for sustaining concussion. Although cervical spine active range of motion (AROM) and deep neck flexor endurance may serve a preventative role in concussion, and despite widespread clinical use of measurements of these variables, reference values are not available for this population. Cost-effective, clinically relevant methods for measuring neck endurance are also well established for adolescent athletes. The purpose of this study was to report reference values for deep cervical flexor endurance and cervical AROM in adolescent football players and to examine whether differences in these measures exist between high school football players with and without a history of concussion. Concussion history, cervical AROM, and deep neck flexor endurance were measured in 122 high school football players. Reference values were calculated for AROM and endurance measures; associations were examined between various descriptive variables and concussion. No statistically significant differences were found between athletes with a history of concussion and those without. A modest inverse correlation was seen between body mass and AROM in the sagittal and transverse planes. The results of this study indicate that participants with larger body mass had less cervical AROM in some directions. Cervical AROM and endurance measurements may not be adequate to identify adolescents with a history of previous concussion among high school football players. However, if a concussion is sustained, these measures can offer a baseline against which to examine whether cervical AROM is affected as compared to healthy adolescents. 2c.

  1. The service of public services performance measurement

    Lystbæk, Christian Tang

    2014-01-01

    that performance measurement serves as “rituals of verification” which promotes the interests of political masters and their mistresses rather than public service. Another area of concern is the cost of performance measurement. Hood & Peters (2004:278) note that performance measurement is likely to “distract...... measurement suggests a range of contested and contradictory propositions. Its alleged benefits include public assurance, better functioning of supply markets for public services, and direct improvements of public services. But the literature also demonstrates the existence of significant concern about...... the actual impact, the costs and unintended consequences associated with performance measurement. This paper identifies the main rationales and rationalities in the scholarly discourse on public services performance measurement. It concludes with some suggestions on how to deal with the many rationales...

  2. The simple and sensitive measurement of malondialdehyde in selected specimens of biological origin and some feed by reversed phase high performance liquid chromatography

    Czauderna, M.; Kowalczyk, J.; Marounek, Milan

    2011-01-01

    Roč. 879, č. 23 (2011), s. 2251-2258 ISSN 1570-0232 Institutional research plan: CEZ:AV0Z50450515 Keywords : Malondialdehyde * 2,4-Dinitrophenylhydrazine * High performance liquid chromatography Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 2.888, year: 2011

  3. COMPANY PERFORMANCE MEASUREMENT AND REPORTING METHODS

    Nicu Ioana Elena

    2012-12-01

    Full Text Available One of the priorities of economic research has been, and remains, the re-evaluation of the notion of performance and, especially, the search for indicators that reflect as accurately as possible the subtleties of the economic entity. The main purpose of this paper is to highlight the main company performance measurement and reporting methods. Performance is a concept that raises many questions concerning the most accurate or the best method of reporting performance at the company level. The research methodology involved studying the Romanian and foreign specialized literature in the field, as well as journals specialized in company performance measurement. While financial performance measurement indicators are considered to offer an accurate image of the situation of the company, the modern approach through non-financial indicators offers a new perspective on performance measurement, one based on simplicity. In conclusion, after the theoretical study, I have noticed that the methods of performance measurement, reporting and interpretation are varied, that opinions regarding the best performance measurement methods are contradictory, and that companies still prefer financial indicators, which continue to play a more important role in company performance measurement than non-financial indicators do.

  4. Modification, calibration, and performance of the Ultra-High Sensitivity Aerosol Spectrometer for particle size distribution and volatility measurements during the Atmospheric Tomography Mission (ATom) airborne campaign

    Kupc, Agnieszka; Williamson, Christina; Wagner, Nicholas L.; Richardson, Mathews; Brock, Charles A.

    2018-01-01

    Atmospheric aerosol is a key component of the chemistry and climate of the Earth's atmosphere. Accurate measurement of the concentration of atmospheric particles as a function of their size is fundamental to investigations of particle microphysics, optical characteristics, and chemical processes. We describe the modification, calibration, and performance of two commercially available Ultra-High Sensitivity Aerosol Spectrometers (UHSASs) as used on the NASA DC-8 aircraft during the Atmospheric Tomography Mission (ATom). To avoid sample flow issues related to pressure variations during aircraft altitude changes, we installed a laminar flow meter on each instrument to measure sample flow directly at the inlet, as well as flow controllers to maintain constant volumetric sheath flows. In addition, we added a compact thermodenuder operating at 300 °C to the inlet line of one of the instruments. With these modifications, the instruments are capable of making accurate (ranging from 7 % for Dp < 0.07 µm to 1 % for Dp > 0.13 µm), precise (< ±1.2 %), and continuous (1 Hz) measurements of size-resolved particle number concentration over the diameter range of 0.063-1.0 µm at ambient pressures of > 1000 to 225 hPa, while simultaneously providing information on particle volatility. We assessed the effect of uncertainty in the refractive index (n) of ambient particles that are sized by the UHSAS assuming the refractive index of ammonium sulfate (n = 1.52). For calibration particles with n between 1.44 and 1.58, the UHSAS diameter varies by +4/-10 % relative to ammonium sulfate. This diameter uncertainty associated with the range of refractive indices (i.e., particle composition) translates to aerosol surface area and volume uncertainties of +8.4/-17.8 and +12.4/-27.5 %, respectively. In addition to sizing uncertainty, low counting statistics can lead to uncertainties of < 20 % for aerosol surface area and < 30 % for aerosol volume at number concentrations around 1000 cm-3. Examples of thermodenuded and non-thermodenuded aerosol number and volume size distributions as well as propagated uncertainties are shown for several cases encountered during the ATom project. Uncertainties in particle number concentration were limited by counting statistics.
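
    The surface-area and volume uncertainties quoted above follow from the D² and D³ scaling of spherical particles; a minimal sketch of that propagation (single-diameter approximation, not the full size-distribution calculation of the paper):

```python
# Propagate the quoted +4 % / -10 % diameter uncertainty (refractive index
# range 1.44-1.58) to surface area (~ D^2) and volume (~ D^3) of a sphere.
def propagate(diameter_rel_err, power):
    """Relative error of D**power for a given relative diameter error."""
    return (1.0 + diameter_rel_err) ** power - 1.0

for label, err in [("+4 %", +0.04), ("-10 %", -0.10)]:
    print(f"dD = {label}: surface {100*propagate(err, 2):+.1f} %, "
          f"volume {100*propagate(err, 3):+.1f} %")
# Prints roughly +8/-19 % (surface) and +12/-27 % (volume), consistent with the
# +8.4/-17.8 % and +12.4/-27.5 % quoted above, which account for the full size
# distribution rather than a single diameter.
```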

  5. High performance MEAs. Final report

    NONE

    2012-07-15

    The aim of the present project is, through modeling, material and process development, to obtain significantly better MEA performance and to attain the technology necessary to fabricate stable catalyst materials, thereby providing a viable alternative to the current industry standard. This project primarily focused on the development and characterization of novel catalyst materials for use in high temperature (HT) and low temperature (LT) proton-exchange membrane fuel cells (PEMFC). New catalysts are needed in order to improve fuel cell performance and reduce the cost of fuel cell systems. Additional tasks were the development of new, durable sealing materials to be used in PEMFC as well as the computational modeling of heat and mass transfer processes, predominantly in LT PEMFC, in order to improve fundamental understanding of the multi-phase flow issues and liquid water management in fuel cells. An improved fundamental understanding of these processes will lead to improved fuel cell performance and hence will also result in a reduced catalyst loading to achieve the same performance. The consortium has obtained significant research results and progress on new catalyst materials and substrates with promising enhanced performance, and on fabrication of the materials using novel methods. However, the new materials and synthesis methods explored are still in the early research and development phase. The project has contributed to improved MEA performance using less precious metal, which has been demonstrated for LT-PEM, DMFC and HT-PEM applications. The novel approach and the progress of the modelling activities have been extremely satisfactory, with numerous conference and journal publications along with two potential inventions concerning the catalyst layer. (LN)

  6. High Performance Proactive Digital Forensics

    Alharbi, Soltan; Traore, Issa; Moa, Belaid; Weber-Jahnke, Jens

    2012-01-01

    With the increase in the number of digital crimes and in their sophistication, High Performance Computing (HPC) is becoming a must in Digital Forensics (DF). According to the FBI annual report, the size of data processed during the 2010 fiscal year reached 3,086 TB (compared to 2,334 TB in 2009), and the number of agencies that requested Regional Computer Forensics Laboratory assistance increased from 689 in 2009 to 722 in 2010. Since most investigation tools are both I/O and CPU bound, the next-generation DF tools are required to be distributed and offer HPC capabilities. The need for HPC is even more evident in investigating crimes on clouds or when proactive DF analysis and on-site investigation, requiring semi-real time processing, are performed. Although overcoming the performance challenge is a major goal in DF, as far as we know, there is almost no research on HPC-DF except for a few papers. As such, in this work, we extend our work on the need for a proactive system and present a high performance automated proactive digital forensic system. The most expensive phase of the system, namely proactive analysis and detection, uses a parallel extension of the iterative z algorithm. It also implements new parallel information-based outlier detection algorithms to proactively and forensically handle suspicious activities. To analyse a large number of targets and events and continuously do so (to capture the dynamics of the system), we rely on a multi-resolution approach to explore the digital forensic space. A data set from the Honeynet Forensic Challenge in 2001 is used to evaluate the system from DF and HPC perspectives.
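
    As a generic illustration of the kind of CPU-bound, embarrassingly parallel scoring such a system distributes across cores (this is not the paper's iterative z algorithm; all batch values are invented):

```python
# Score batches of event counts for outliers in parallel using only the
# standard library; each worker handles one batch independently.
from multiprocessing import Pool
import statistics

def zscore_outliers(batch, threshold=3.0):
    """Return values lying more than `threshold` std. deviations from the batch mean."""
    mean = statistics.fmean(batch)
    spread = statistics.pstdev(batch) or 1.0      # guard against all-equal batches
    return [x for x in batch if abs(x - mean) / spread > threshold]

if __name__ == "__main__":
    # Hypothetical per-host event rates, split into one batch per worker
    batches = [
        [10, 12, 11, 9, 10, 11, 12, 10, 9, 11, 250, 10],
        [5, 6, 5, 7, 5, 6, 5, 6, 7, 5, 6, 5],
    ]
    with Pool() as pool:
        print(pool.map(zscore_outliers, batches))   # -> [[250], []]
```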

  7. A Critique of Health System Performance Measurement.

    Lynch, Thomas

    2015-01-01

    Health system performance measurement is a ubiquitous phenomenon. Many authors have identified multiple methodological and substantive problems with performance measurement practices. Despite the validity of these criticisms and their cross-national character, the practice of health system performance measurement persists. Theodore Marmor suggests that performance measurement invokes an "incantatory response" wrapped within "linguistic muddle." In this article, I expand upon Marmor's insights using Pierre Bourdieu's theoretical framework to suggest that, far from an aberration, the "linguistic muddle" identified by Marmor is an indicator of a broad struggle about the representation and classification of public health services as a public good. I present a case study of performance measurement from Alberta, Canada, examining how this representational struggle occurs and what the stakes are. © The Author(s) 2015.

  8. The Effect of Morphological Characteristic of Coarse Aggregates Measured with Fractal Dimension on Asphalt Mixture’s High-Temperature Performance

    Hainian Wang

    2016-01-01

    Full Text Available The morphological properties of coarse aggregates, such as shape, angularity, and surface texture, have a great influence on the mechanical performance of asphalt mixtures. This study aims to investigate the effect of coarse aggregate morphological properties on the high-temperature performance of asphalt mixtures. A modified Los Angeles (LA) abrasion test was employed to produce aggregates with various morphological properties by applying abrasion cycles of 0, 200, 400, 600, 800, 1000, and 1200 to crushed angular aggregates. Based on a laboratory-developed Morphology Analysis System for Coarse Aggregates (MASCA), the morphological properties of the coarse aggregate particles were quantified using the index of fractal dimension. The high-temperature performances of the dense-graded asphalt mixture (AC-16), gap-graded stone asphalt mixture (SAC-16), and stone mastic asphalt (SMA-16) mixtures containing aggregates with different fractal dimensions were evaluated through the dynamic stability (DS) test and the penetration shear test in the laboratory. Good linear correlations between the fractal dimension and high-temperature indexes were obtained for all three types of mixtures. Moreover, the results also indicated that higher coarse aggregate angularity leads to stronger high-temperature shear resistance of asphalt mixtures.
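
    For readers unfamiliar with the index, a generic box-counting estimate of fractal dimension (a sketch of the standard technique, not the MASCA implementation) looks like the following:

```python
# Estimate the fractal dimension of an aggregate outline stored as a binary
# image from the slope of log(occupied box count) versus log(1/box size).
import numpy as np

def box_counting_dimension(outline, box_sizes=(2, 4, 8, 16, 32)):
    """outline: 2-D boolean array where True marks outline pixels."""
    counts = []
    for s in box_sizes:
        h = (outline.shape[0] // s) * s               # trim to a multiple of s
        w = (outline.shape[1] // s) * s
        blocks = outline[:h, :w].reshape(h // s, s, w // s, s)
        occupied = blocks.any(axis=1).any(axis=-1)    # boxes containing outline pixels
        counts.append(np.count_nonzero(occupied))
    # N(s) ~ s**(-D)  =>  D is the slope of log N versus log(1/s)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
    return slope

# Toy check: the outline of a disk is a smooth curve, so D should be close to 1
yy, xx = np.mgrid[:256, :256]
disk = (xx - 128) ** 2 + (yy - 128) ** 2 <= 100 ** 2
outline = (disk ^ np.roll(disk, 1, axis=0)) | (disk ^ np.roll(disk, 1, axis=1))
print(f"estimated fractal dimension: {box_counting_dimension(outline):.2f}")
```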

  9. Does hospital financial performance measure up?

    Cleverley, W O; Harvey, R K

    1992-05-01

    Comparisons are continuously being made between the financial performance, products, and services of the healthcare industry and those of non-healthcare industries. Several useful measures of financial performance--profitability, liquidity, financial risk, asset management and replacement, and debt capacity--are used by the authors to compare the financial performance of the hospital industry with that of the industrial, transportation and utility sectors. Hospitals exhibit weaknesses in several areas. Goals are suggested for each measure to bring hospitals closer to competitive levels.

  10. On music performance, theories, measurement and diversity

    Timmers, R.; Honing, H.J.

    2002-01-01

    Measurement of musical performances is of interest to studies in musicology, music psychology and music performance practice, but in general it has not been considered the main issue: when analyzing Western classical music, these disciplines usually focus on the score rather than the performance.

  11. Modification, calibration, and performance of the Ultra-High Sensitivity Aerosol Spectrometer for particle size distribution and volatility measurements during the Atmospheric Tomography Mission (ATom airborne campaign

    A. Kupc

    2018-01-01

    Full Text Available Atmospheric aerosol is a key component of the chemistry and climate of the Earth's atmosphere. Accurate measurement of the concentration of atmospheric particles as a function of their size is fundamental to investigations of particle microphysics, optical characteristics, and chemical processes. We describe the modification, calibration, and performance of two commercially available Ultra-High Sensitivity Aerosol Spectrometers (UHSASs) as used on the NASA DC-8 aircraft during the Atmospheric Tomography Mission (ATom). To avoid sample flow issues related to pressure variations during aircraft altitude changes, we installed a laminar flow meter on each instrument to measure sample flow directly at the inlet as well as flow controllers to maintain constant volumetric sheath flows. In addition, we added a compact thermodenuder operating at 300 °C to the inlet line of one of the instruments. With these modifications, the instruments are capable of making accurate (ranging from 7 % for Dp < 0.07 µm to 1 % for Dp > 0.13 µm), precise (< ±1.2 %), and continuous (1 Hz) measurements of size-resolved particle number concentration over the diameter range of 0.063–1.0 µm at ambient pressures of > 1000 to 225 hPa, while simultaneously providing information on particle volatility. We assessed the effect of uncertainty in the refractive index (n) of ambient particles that are sized by the UHSAS assuming the refractive index of ammonium sulfate (n = 1.52). For calibration particles with n between 1.44 and 1.58, the UHSAS diameter varies by +4/−10 % relative to ammonium sulfate. This diameter uncertainty associated with the range of refractive indices (i.e., particle composition) translates to aerosol surface area and volume uncertainties of +8.4/−17.8 and +12.4/−27.5 %, respectively. In addition to sizing uncertainty, low counting statistics can lead to uncertainties of < 20 % for aerosol surface area and < 30

  12. Measuring the performance of maintenance service outsourcing.

    Cruz, Antonio Miguel; Rincon, Adriana Maria Rios; Haugan, Gregory L

    2013-01-01

    The aims of this paper are (1) to identify the characteristics of maintenance service providers that directly impact maintenance service quality, using 18 independent covariables; and (2) to quantify the change in risk these covariables present to service quality, measured in terms of equipment turnaround time (TAT). A survey was applied to every maintenance service provider (n = 19) for characterization purposes. The equipment inventory was characterized, and the TAT variable recorded and monitored for every work order of each service provider (N = 1,025). Finally, the research team conducted a statistical analysis to accomplish the research objectives. The results of this study offer strong empirical evidence that the most influential variables affecting the quality of maintenance service performance are the following: type of maintenance, availability of spare parts in the country, user training, technological complexity of the equipment, distance between the company and the hospital, and the number of maintenance visits performed by the company. The strength of the Cox model built is supported by R(2)p,e = 0.57, corresponding to Rp,e = 0.75. Thus, the model explained 57% of the variation in equipment TAT, with a moderately high positive correlation between the dependent variable (TAT) and the independent variables.
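
    A hedged sketch of fitting such a Cox proportional hazards model with the lifelines package is shown below; the column names and values are hypothetical and are not the study's data.

```python
# Fit a Cox model relating a provider covariable to equipment turnaround time
# (TAT), in the spirit of the model described above. Illustrative data only.
import pandas as pd
from lifelines import CoxPHFitter

# One row per work order: observed TAT in days, whether the repair was
# completed (event = 1) or censored, and the provider-to-hospital distance.
work_orders = pd.DataFrame({
    "tat_days":    [3, 10, 7, 21, 5, 14, 2, 30, 9, 12],
    "completed":   [1, 1, 1, 1, 1, 0, 1, 1, 1, 0],
    "distance_km": [12, 250, 40, 400, 8, 180, 15, 320, 60, 220],
})

cph = CoxPHFitter()
cph.fit(work_orders, duration_col="tat_days", event_col="completed")
cph.print_summary()   # the hazard ratio shows how distance shifts the chance of a quick TAT
```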

  13. Development of a Behavioral Performance Measure

    Marcelo Cabus Klotzle

    2012-09-01

    Full Text Available Since the fifties, several measures have been developed in order to measure the performance of investments or choices involving uncertain outcomes. Many of these measures are based on Expected Utility Theory, but since the nineties a number of measures have been proposed based on Non-Expected Utility Theory. Among the theories of Non-Expected Utility, Prospect Theory, which is the foundation of Behavioral Finance, stands out. Based on this theory, this study proposes a new performance measure that embeds loss aversion along with the distortion of probabilities in the choice of alternatives. A hypothetical example is presented in which various performance measures, including the new measure, are compared. The results showed that the ordering of the assets varied depending on the performance measure adopted. As expected, the new performance measure clearly captured the probability distortion and loss aversion of the decision maker, i.e., the assets with the greatest negative deviations from the target were those with the worst performance.
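
    The standard Tversky-Kahneman value and probability-weighting functions capture the two ingredients mentioned above (loss aversion and probability distortion). The sketch below uses the usual literature parameter estimates rather than the article's, and applies a simplified, non-cumulative weighting; the outcomes are invented.

```python
# Prospect-theory building blocks: an S-shaped value function with loss
# aversion and an inverse-S probability weighting (Tversky-Kahneman 1992 form).
ALPHA, BETA, LAMBDA, GAMMA = 0.88, 0.88, 2.25, 0.61

def value(x):
    """Concave for gains, convex and steeper (loss aversion) for losses."""
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** BETA

def weight(p):
    """Overweights small probabilities, underweights large ones."""
    return p ** GAMMA / (p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA)

# Deviations of an asset's returns from a target, with their probabilities
outcomes = [(-0.10, 0.2), (0.02, 0.5), (0.08, 0.3)]
prospect_value = sum(weight(p) * value(x) for x, p in outcomes)
print(f"prospect value = {prospect_value:.4f}")   # higher is better
```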

  14. MEASUREMENT: ACCOUNTING FOR RELIABILITY IN PERFORMANCE ESTIMATES.

    Waterman, Brian; Sutter, Robert; Burroughs, Thomas; Dunagan, W Claiborne

    2014-01-01

    When evaluating physician performance measures, physician leaders are faced with the quandary of determining whether departures from expected physician performance measurements represent a true signal or random error. This uncertainty impedes the physician leader's ability and confidence to take appropriate performance improvement actions based on physician performance measurements. Incorporating reliability adjustment into physician performance measurement is a valuable way of reducing the impact of random error in the measurements, such as those caused by small sample sizes. Consequently, the physician executive has more confidence that the results represent true performance and is positioned to make better physician performance improvement decisions. Applying reliability adjustment to physician-level performance data is relatively new. As others have noted previously, it's important to keep in mind that reliability adjustment adds significant complexity to the production, interpretation and utilization of results. Furthermore, the methods explored in this case study only scratch the surface of the range of available Bayesian methods that can be used for reliability adjustment; further study is needed to test and compare these methods in practice and to examine important extensions for handling specialty-specific concerns (e.g., average case volumes, which have been shown to be important in cardiac surgery outcomes). Moreover, it's important to note that the provider group average as a basis for shrinkage is one of several possible choices that could be employed in practice and deserves further exploration in future research. With these caveats, our results demonstrate that incorporating reliability adjustment into physician performance measurements is feasible and can notably reduce the incidence of "real" signals relative to what one would expect to see using more traditional approaches. A physician leader who is interested in catalyzing performance improvement
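
    A common form of reliability (shrinkage) adjustment, shown here as a hedged sketch rather than the article's exact Bayesian method, pulls an observed rate toward the group mean in proportion to how unreliable it is; low-volume physicians are shrunk heavily, high-volume physicians barely at all.

```python
# Reliability adjustment sketch: reliability = signal / (signal + noise),
# where the noise term shrinks as the physician's case volume grows.
def reliability_adjust(observed_rate, n_cases, group_mean, between_var, within_var):
    """Return (reliability, adjusted_rate)."""
    reliability = between_var / (between_var + within_var / n_cases)
    adjusted = reliability * observed_rate + (1 - reliability) * group_mean
    return reliability, adjusted

# Invented numbers: a 30 % complication rate against a 15 % group average
print(reliability_adjust(0.30, n_cases=20,  group_mean=0.15,
                         between_var=0.002, within_var=0.13))   # heavy shrinkage
print(reliability_adjust(0.30, n_cases=400, group_mean=0.15,
                         between_var=0.002, within_var=0.13))   # rate mostly stands
```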

  15. Towards integrating environmental performance in divisional performance measurement

    Collins C Ngwakwe

    2014-08-01

    Full Text Available This paper suggests an integration of environmental performance measurement (EPM) into conventional divisional financial performance measures as a catalyst to enhance managers' drive toward cleaner production and sustainable development. The approach is conceptual and normative; using a hypothetical firm, it suggests a model to integrate an environmental performance measure as an ancillary to conventional divisional financial performance measures. Vroom's motivation theory and other literature evidence indicate that corporate goals are achievable in an environment where managers' efforts are recognised and thus rewarded. Consequently, the paper suggests that environmentally motivated managers are important to propel corporate sustainability strategy toward desired corporate environmental governance and sustainable economic development. Thus, this suggested approach modestly adds to existing environmental management accounting (EMA) theory and literature. It is hoped that this paper may provide an agenda for further research toward a practical application of the suggested method in a firm.

  16. High performance light water reactor

    Squarer, D.; Schulenberg, T.; Struwe, D.; Oka, Y.; Bittermann, D.; Aksan, N.; Maraczy, C.; Kyrki-Rajamaeki, R.; Souyri, A.; Dumaz, P.

    2003-01-01

    The objective of the high performance light water reactor (HPLWR) project is to assess the merit and economic feasibility of a high efficiency LWR operating at thermodynamically supercritical regime. An efficiency of approximately 44% is expected. To accomplish this objective, a highly qualified team of European research institutes and industrial partners together with the University of Tokyo is assessing the major issues pertaining to a new reactor concept, under the co-sponsorship of the European Commission. The assessment has emphasized the recent advancement achieved in this area by Japan. Additionally, it accounts for advanced European reactor design requirements, recent improvements, practical design aspects, availability of plant components and the availability of high temperature materials. The final objective of this project is to reach a conclusion on the potential of the HPLWR to help sustain the nuclear option, by supplying competitively priced electricity, as well as to continue the nuclear competence in LWR technology. The following is a brief summary of the main project achievements:-A state-of-the-art review of supercritical water-cooled reactors has been performed for the HPLWR project.-Extensive studies have been performed in the last 10 years by the University of Tokyo. Therefore, a 'reference design', developed by the University of Tokyo, was selected in order to assess the available technological tools (i.e. computer codes, analyses, advanced materials, water chemistry, etc.). Design data and results of the analysis were supplied by the University of Tokyo. A benchmark problem, based on the 'reference design' was defined for neutronics calculations and several partners of the HPLWR project carried out independent analyses. The results of these analyses, which in addition help to 'calibrate' the codes, have guided the assessment of the core and the design of an improved HPLWR fuel assembly. Preliminary selection was made for the HPLWR scale

  17. Introduction to control system performance measurements

    Garner, K C

    1968-01-01

    Introduction to Control System Performance Measurements presents the methods of dynamic measurements, specifically as they apply to control system and component testing. This book provides an introduction to the concepts of statistical measurement methods.Organized into nine chapters, this book begins with an overview of the applications of automatic control systems that pervade almost every area of activity ranging from servomechanisms to electrical power distribution networks. This text then discusses the common measurement transducer functions. Other chapters consider the basic wave

  18. Internal Performance Measurement Systems: Problems and Solutions

    Jakobsen, Morten; Mitchell, Falconer; Nørreklit, Hanne

    2010-01-01

    This article pursues two aims: to identify problems and dangers related to the operational use of internal performance measurement systems of the Balanced Scorecard (BSC) type and to provide some guidance on how performance measurement systems may be designed to overcome these problems....... The analysis uses and extends Nørreklit's (2000) critique of the BSC by applying the concepts developed therein to contemporary research on the BSC and to the development of practice in performance measurement. The analysis is of relevance for many companies in the Asia-Pacific area as an increasing numbers...

  19. Measurement uncertainty analysis techniques applied to PV performance measurements

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: Increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates quality assurance and quality control measures have been accomplished; defines Valid Data as data having known and documented paths of: Origin, including theory; measurements; traceability to measurement standards; computations; uncertainty analysis of results.
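
    A minimal sketch of the kind of propagation such an analysis involves (not the presentation's own worksheet; all values are illustrative): voltage and current uncertainties are combined by root-sum-square into a power uncertainty, then expanded with a coverage factor.

```python
# Combined standard uncertainty of a measured PV power P = V * I, followed by
# an expanded uncertainty with coverage factor k = 2 (roughly a 95 % interval).
import math

V, I = 35.0, 8.2            # measured operating point (illustrative values)
u_V, u_I = 0.05, 0.02       # standard uncertainties of the V and I measurements

P = V * I
# Root-sum-square propagation for a product: (u_P/P)^2 = (u_V/V)^2 + (u_I/I)^2
u_P = P * math.sqrt((u_V / V) ** 2 + (u_I / I) ** 2)
U_P = 2 * u_P               # expanded uncertainty, k = 2

print(f"P = {P:.1f} W, u(P) = {u_P:.2f} W, result: {P:.1f} +/- {U_P:.2f} W")
```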

  20. Telerobotic system performance measurement - Motivation and methods

    Kondraske, George V.; Khoury, George J.

    1992-01-01

    A systems performance-based strategy for modeling and conducting experiments relevant to the design and performance characterization of telerobotic systems is described. A developmental testbed consisting of a distributed telerobotics network and initial efforts to implement the strategy described is presented. Consideration is given to the general systems performance theory (GSPT) to tackle human performance problems as a basis for: measurement of overall telerobotic system (TRS) performance; task decomposition; development of a generic TRS model; and the characterization of performance of subsystems comprising the generic model. GSPT employs a resource construct to model performance and resource economic principles to govern the interface of systems to tasks. It provides a comprehensive modeling/measurement strategy applicable to complex systems including both human and artificial components. Application is presented within the framework of a distributed telerobotics network as a testbed. Insight into the design of test protocols which elicit application-independent data is described.

  1. Performance measurement in transport sector analysis

    M. Išoraitė

    2004-06-01

    Full Text Available The article analyses the following issues: 1. Performance measurement in the literature. Performance measurement has an important role to play in the efficient and effective management of organizations. Kaplan and Johnson highlighted the failure of financial measures to reflect changes in the competitive circumstances and strategies of modern organizations. Many authors have focused attention on how organizations can design more appropriate measurement systems. Based on literature, consultancy experience and action research, numerous processes have been developed that organizations can follow in order to design and implement systems. Many frameworks have been proposed that support these processes. The objective of such frameworks is to help organizations define a set of measures that reflect their objectives and assess their performance appropriately. 2. Measuring transport sector performance and its impacts. The purpose of transport measurement is to identify opportunities for enhancing transport performance. Successful transport sector management requires a system to analyze its efficiency and effectiveness as well as plan interventions if transport sector performance needs improvement. Transport impacts must be measurable and monitorable so that the person responsible for the project intervention can decide when and how to influence them. Performance indicators provide a means to measure and monitor impacts. These indicators essentially reflect quantitative and qualitative aspects of impacts at given times and places. 3. Transport sector output and input. Transport sector inputs are the resources required to deliver transport sector outputs. Transport sector inputs are typically: human resources, particularly skilled resources (including specialist consulting inputs); technology processes such as equipment and work; and finance, both public and private. 4. Transport sector policy and institutional framework; 5. Cause – effect linkages; 6

  2. Road weather management performance measures : 2012 update.

    2013-08-01

    In 2007, the Road Weather Management Program (RWMP) conducted a study with stakeholders from the transportation and meteorological communities to define eleven performance measures that would enable the Federal Highway Administration (FHWA) to determ...

  3. Performance measures for metropolitan planning organizations.

    2012-04-01

    Performance measurement is a topic of increasing importance to transportation agencies, as issues with : funding shortfalls and concerns about transportation system efficiency lead to a shift in how transportation : decision making is carried out. In...

  4. Smart city performance measurement framework. CITYkeys

    Airaksinen, M.; Seppa, I.P.; Huovilla, A.; Neumann, H.M.; Iglar, B.; Bosch, P.R.

    2017-01-01

    This paper presents a holistic performance measurement framework for harmonized and transparent monitoring and comparability of the European cities activities during the implementation of Smart City solutions. The work methodology was based on extensive collaboration and communication with European

  5. Haemoglobin variants may cause significant differences in haemoglobin A1c as measured by high-performance liquid chromatography and enzymatic methods in diabetic patients: a cross-sectional study.

    Otabe, Shuichi; Nakayama, Hitomi; Ohki, Tsuyoshi; Soejima, Eri; Tajiri, Yuji; Yamada, Kentaro

    2017-07-01

    Background We aimed to determine whether the discrepancy between haemoglobin A1c values determined by high-performance liquid chromatography and enzymatic haemoglobin A1c measurements in diabetic patients was clinically relevant. Methods We randomly recruited 1421 outpatients undergoing diabetic treatment and follow-up who underwent at least three haemoglobin A1c measurements between April 2014 and March 2015 at our clinic. In 6369 samples, haemoglobin A1c was simultaneously measured by HA-8160 (high-performance liquid chromatography) and MetaboLead (enzymatic assay), and the values were compared. Results Haemoglobin A1c measurements by high-performance liquid chromatography and enzymatic assay were strongly correlated (correlation coefficient: 0.9828, linear approximation curve y = 0.9986x - 0.2507). Mean haemoglobin A1c (6.8 ± 1.0%) measured by high-performance liquid chromatography was significantly higher than that measured by enzymatic assay (6.5 ± 1.0%). Four subjects consistently showed discrepant haemoglobin A1c values between high-performance liquid chromatography and enzymatic assay. Of these, three had Hb Toranomon [β112 (G14) Cys→Trp], and the fourth had Hb Ube-2 [α68 (E17) Asn→Asp]. One other subject presented consistently higher haemoglobin A1c values (>1%) by high-performance liquid chromatography than those from enzymatic assay and was diagnosed with a -77 (T > C) mutation in the δ-globin gene. These unrelated asymptomatic subjects had normal erythrocyte profiles, without anaemia. Conclusions We showed that haemoglobin A1c values measured by high-performance liquid chromatography were significantly higher than those measured by enzymatic assay in diabetic subjects. However, when an oversized deviation (>0.7%) between glycaemic control status and haemoglobin A1c is apparent, clinicians should check the methods used to measure haemoglobin A1c and consider the possible presence of a haemoglobin variant.

  6. Development of high performance cladding

    Kiuchi, Kiyoshi

    2003-01-01

    The development of a superior next-generation light water reactor is required, from general viewpoints such as improved safety, economics, reduced radioactive waste and effective utilization of plutonium, by 2030, when conventional reactor plants should be renovated. At the Japan Atomic Energy Research Institute, work is being carried out on improving stainless steel cladding for conventional high burn-up reactors to more than 100 GWd/t, on developing manufacturing technology for the reduced-moderation light water reactor (RMWR) with a breeding ratio beyond 1.0, and on research into water-materials interaction in the supercritical-pressure water cooled reactor. A stable austenitic stainless steel has been selected for the fuel element cladding of the advanced boiling water reactor (ABWR). The austenitic stainless steel offers superior irradiation resistance, corrosion resistance and mechanical strength. A hard neutron spectrum, with energies above 0.1 MeV, occurs in the core of the reduced-moderation light water reactor, as in the liquid metal fast breeder reactor (LMFBR). High performance cladding for the RMWR fuel elements is likewise required to provide irradiation resistance, corrosion resistance and mechanical strength. Slow strain rate tests (SSRT) of SUS 304 and SUS 316 are carried out to study stress corrosion cracking (SCC). Irradiation tests in an LMFBR are intended to obtain data on irradiation damage to the cladding materials. (M. Suetake)

  7. New 30 kA power system at Fermilab and its use for measuring the effects of ripple current on the performance of superconducting high field magnets

    Carcagno, R.; Feher, S.; Garvey, J.; Jaskierny, W.; Lamm, M.; Makulski, A.; Orris, D.F.; Pfeffer, H.; Tartaglia, M.; Tompkins, J.; Wolff, D.; /Fermilab

    2004-12-01

    A new 30 kA, 30 V dc Power System was designed, built, and commissioned at Fermilab for testing Superconducting High Field Magnets. This system has been successfully supporting operations at the Fermilab Magnet Test Facility since April 2002. It is based on six commercial 150 kW Power Energy Industries power supply modules and the following in-house modules: six 720 Hz filters, two 15 kA/1 kV dc solid-state dump switches, and a 3 MJ/30 kA/1 kV dc dump resistor. Additional in-house electronic components were designed and built to provide precise regulation and distribution of the current and its rate of change. An industrial-type Programmable Logic Controller system was used to provide equipment interlocks and monitoring. This paper summarizes studies on the influence of characteristics of this new power system--such as ripple current--on the performance of High Field Superconducting magnets.

  8. Work zone performance measures pilot test.

    2011-04-01

    Currently, a well-defined and validated set of metrics to use in monitoring work zone performance do not : exist. This pilot test was conducted to assist state DOTs in identifying what work zone performance : measures can and should be targeted, what...

  9. ASUPT Automated Objective Performance Measurement System.

    Waag, Wayne L.; And Others

    To realize its full research potential, a need exists for the development of an automated objective pilot performance evaluation system for use in the Advanced Simulation in Undergraduate Pilot Training (ASUPT) facility. The present report documents the approach taken for the development of performance measures and also presents data collected…

  10. Environmental Measurements Laboratory 2002 Unit Performance Plan

    None

    2001-10-01

    This EML Unit Performance Plan provides the key goals and performance measures for FY 2002 and continuing to FY 2003. The purpose of the Plan is to inform EML's stakeholders and customers of the Laboratory's products and services, and its accomplishments and future challenges. Also incorporated in the Unit Performance Plan is EML's Communication Plan for FY 2002.

  11. High temperature measurement by noise thermometry

    Decreton, M.C.

    1982-06-01

    Noise thermometry has received a lot of attention for measurements of temperatures in the high range around 1000-2000 K. For these measurements, mostly laboratory-type experiments have been performed. These have shown the interest of the technique where long-term stability, high precision and insensitivity to external conditions are concerned. This is particularly true for measurements in nuclear reactors, where important drifts due to irradiation effects are experienced with other measurement techniques, such as thermocouples. Industrial noise thermometer experiments have not been performed extensively up to now. The subject of the present study is the development of an 1800 K noise thermometer for nuclear applications. The measurement method is based on a generalized noise power approach. The rms noise voltage (V_s) and noise current (I_s) are successively measured on the resistive sensor. The same quantities are also measured on a dummy short-circuited probe (V_d and I_d). The temperature is then deduced from these measured values by the following formula: c·T_s = (V_s^2 − V_d^2) / (V_s/I_s − V_d/I_d), where c is a constant and T_s the absolute temperature of the sensor. This approach has the particular advantage of greatly reducing the sensitivity to environmental perturbations on the leads and to the influence of amplifier noise sources. It also eliminates the necessity of a resistance measurement and keeps the electronic circuits as simple as possible.
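
    Translated directly into code with illustrative values; identifying the constant c with 4·k_B·Δf is an assumption for this sketch (based on the Johnson-noise relation V² = 4·k_B·T·R·Δf), not a statement about the instrument described, since in practice c comes from calibration.

```python
# Sensor temperature from sensor (s) and dummy-probe (d) noise readings,
# following the formula quoted above.
def noise_temperature(v_s, i_s, v_d, i_d, c):
    return (v_s ** 2 - v_d ** 2) / (c * (v_s / i_s - v_d / i_d))

k_B = 1.380649e-23                 # J/K
c = 4 * k_B * 100e3                # assumed 100 kHz measurement bandwidth
# Hypothetical rms readings (volts and amperes); result is on the order of 1900 K
print(f"T_s ~ {noise_temperature(1.0e-6, 1.0e-8, 1.0e-7, 2.0e-8, c):.0f} K")
```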

  12. Measuring performance in virtual reality phacoemulsification surgery

    Söderberg, Per; Laurell, Carl-Gustaf; Simawi, Wamidh; Skarman, Eva; Nordh, Leif; Nordqvist, Per

    2008-02-01

    We have developed a virtual reality (VR) simulator for phacoemulsification surgery. The current work aimed at developing a relative performance index that characterizes the performance of an individual trainee. We recorded measurements of 28 response variables during three iterated surgical sessions in 9 experienced cataract surgeons, separately for the sculpting phase and the evacuation phase of phacoemulsification surgery, and compared their outcome to that of a reference group of naive trainees. We defined an individual overall performance index, an individual class-specific performance index and an individual variable-specific performance index. We found that, on average, the experienced surgeons performed at a lower level than the reference group of naive trainees, but that this was largely attributable to a few surgeons. When their overall performance index was broken down into class-specific and variable-specific performance indices, it was found that the low-level performance was attributable to behavior that is acceptable for an experienced surgeon but not for a naive trainee. It was concluded that relative performance indices should use a reference group that corresponds to the measured individual, since the definition of optimal surgery may vary among trainee groups depending on their level of experience.
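
    One plausible construction of such indices, offered only as a hedged sketch and not the authors' definition, is to express each response variable as a z-score against the reference group (signed so that positive means better) and then average within and across variable classes; all variable names and numbers below are invented.

```python
# Relative performance indices against a reference group of trainees.
import statistics

def variable_index(trainee_value, reference_values, lower_is_better=True):
    mean = statistics.fmean(reference_values)
    spread = statistics.stdev(reference_values) or 1.0
    z = (trainee_value - mean) / spread
    return -z if lower_is_better else z          # positive = better than reference

classes = {
    "efficiency": [
        variable_index(310, [420, 510, 380, 460]),   # e.g. phaco time, s
        variable_index(12, [18, 22, 15, 20]),        # e.g. instrument path, cm
    ],
    "tissue_damage": [
        variable_index(0.9, [1.4, 2.0, 1.1, 1.7]),   # e.g. capsule contacts
    ],
}
class_indices = {name: statistics.fmean(vals) for name, vals in classes.items()}
overall_index = statistics.fmean(class_indices.values())
print(class_indices, round(overall_index, 2))
```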

  13. APPROXIMATIONS TO PERFORMANCE MEASURES IN QUEUING SYSTEMS

    Kambo, N. S.

    2012-11-01

    Full Text Available Approximations to various performance measures in queuing systems have received considerable attention because these measures have wide applicability. In this paper we propose two methods to approximate the queuing characteristics of a GI/M/1 system. The first method is non-parametric in nature, using only the first three moments of the arrival distribution. The second method treads the known path of approximating the arrival distribution by a mixture of two exponential distributions by matching the first three moments. Numerical examples and optimal analysis of performance measures of GI/M/1 queues are provided to illustrate the efficacy of the methods, and are compared with benchmark approximations.
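
    For context, the exact GI/M/1 quantities that such approximations target can be computed numerically from the interarrival-time Laplace-Stieltjes transform. The sketch below uses textbook GI/M/1 relations (not the paper's approximation formulas), with a mixture of two exponentials as the interarrival distribution, echoing the paper's second method; all parameter values are invented.

```python
# GI/M/1: with interarrival LST A*(s) and service rate mu, the root sigma of
# sigma = A*(mu*(1 - sigma)) on (0, 1) yields the standard performance measures.
from scipy.optimize import brentq

def gim1_measures(interarrival_lst, arrival_rate, mu):
    sigma = brentq(lambda s: interarrival_lst(mu * (1.0 - s)) - s, 1e-12, 1 - 1e-12)
    wq = sigma / (mu * (1.0 - sigma))              # mean waiting time in queue
    return {"sigma": sigma, "Wq": wq, "Lq": arrival_rate * wq, "P(wait)": sigma}

# Hyperexponential (H2) interarrivals: mixture of two exponentials
p, lam1, lam2, mu = 0.4, 2.0, 0.5, 1.5
h2_lst = lambda s: p * lam1 / (lam1 + s) + (1 - p) * lam2 / (lam2 + s)
arrival_rate = 1.0 / (p / lam1 + (1 - p) / lam2)   # reciprocal mean interarrival
print(gim1_measures(h2_lst, arrival_rate, mu))
```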

  14. Reconsidering the measurement of ancillary service performance.

    Griffin, D T; Rauscher, J A

    1987-08-01

    Prospective payment reimbursement systems have forced hospitals to review their costs more carefully. The result of the increased emphasis on costs is that many hospitals use costs, rather than margin, to judge the performance of ancillary services. However, arbitrary selection of performance measures for ancillary services can result in managerial decisions contrary to hospital objectives. Managerial accounting systems provide models which assist in the development of performance measures for ancillary services. Selection of appropriate performance measures provides managers with the incentive to pursue goals congruent with those of the hospital overall. This article reviews the design and implementation of managerial accounting systems, and considers the impact of prospective payment systems and proposed changes in capital reimbursement on this process.

  15. Measurement Of Shariah Stock Performance Using Risk Adjusted Performance

    Zuhairan Y Yunan

    2015-03-01

    Full Text Available The aim of this research is to analyze shariah stock performance using risk-adjusted performance methods. Three parameters are used to measure stock performance: Sharpe, Treynor, and Jensen. These measures take into account both the return and the risk of shariah stocks. The data used in this research are stocks listed in the Jakarta Islamic Index. The sampling method used in this paper is purposive sampling, with ten companies as the sample. The results show that, across the three parameters, the best-performing stocks are AALI, ANTM, ASII, CPIN, INDF, KLBF, LSIP, and UNTR. DOI: 10.15408/aiq.v7i1.1364
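
    The three measures in their textbook form (the figures below are invented, annualised decimal returns, not the paper's data):

```python
# Risk-adjusted performance measures: Sharpe, Treynor and Jensen's alpha.
def sharpe(r_p, r_f, sigma_p):
    return (r_p - r_f) / sigma_p                 # excess return per unit total risk

def treynor(r_p, r_f, beta_p):
    return (r_p - r_f) / beta_p                  # excess return per unit market risk

def jensen_alpha(r_p, r_f, r_m, beta_p):
    return r_p - (r_f + beta_p * (r_m - r_f))    # return above the CAPM prediction

r_p, r_f, r_m, sigma_p, beta_p = 0.14, 0.06, 0.11, 0.20, 1.1
print(f"Sharpe  = {sharpe(r_p, r_f, sigma_p):.3f}")
print(f"Treynor = {treynor(r_p, r_f, beta_p):.3f}")
print(f"Jensen  = {jensen_alpha(r_p, r_f, r_m, beta_p):.3f}")
```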

  16. Developing Human Performance Measures (PSAM8)

    Jeffrey C. Joe

    2006-01-01

    Through the reactor oversight process (ROP), the U.S. Nuclear Regulatory Commission (NRC) monitors the performance of utilities licensed to operate nuclear power plants. The process is designed to assure public health and safety by providing reasonable assurance that licensees are meeting the cornerstones of safety and designated crosscutting elements. The reactor inspection program, together with performance indicators (PIs), and enforcement activities form the basis for the NRC's risk-informed, performance based regulatory framework. While human performance is a key component in the safe operation of nuclear power plants and is a designated cross-cutting element of the ROP, there is currently no direct inspection or performance indicator for assessing human performance. Rather, when human performance is identified as a substantive cross cutting element in any 1 of 3 categories (resources, organizational or personnel), it is then evaluated for common themes to determine if follow-up actions are warranted. However, variability in human performance occurs from day to day, across activities that vary in complexity, and workgroups, contributing to the uncertainty in the outcomes of performance. While some variability in human performance may be random, much of the variability may be attributed to factors that are not currently assessed. There is a need to identify and assess aspects of human performance that relate to plant safety and to develop measures that can be used to successfully assure licensee performance and indicate when additional investigation may be required. This paper presents research that establishes a technical basis for developing human performance measures. In particular, we discuss: (1) how historical data already gives some indication of connection between human performance and overall plant performance, (2) how industry led efforts to measure and model human performance and organizational factors could serve as a data source and basis for a

  17. HIGH PT MEASUREMENT AT RHIC

    MIODUSZEWSKI, S.

    2003-01-01

    We present recent high transverse momentum measurements in Au+Au and p+p collisions at the Relativistic Heavy Ion Collider (RHIC). We define and show the nuclear modification factor for neutral pions and charged hadrons and discuss the particle species dependence. By means of the nuclear modification factor, we observe a suppression factor at high p_T of 5-6 for neutral pions and 3-4 for charged hadrons in central Au+Au collisions relative to the binary-scaled yields in p+p (or peripheral) collisions. Finally, we present strong evidence for the observation of jets in Au+Au collisions and the disappearance of the away-side jet in central Au+Au collisions.
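
    For reference, the nuclear modification factor is conventionally defined as (this is the standard definition, not a formula specific to this proceeding):

```latex
R_{AA}(p_T) = \frac{\mathrm{d}^2 N_{AA}/\mathrm{d}p_T\,\mathrm{d}\eta}
                   {\langle N_{\mathrm{coll}} \rangle \; \mathrm{d}^2 N_{pp}/\mathrm{d}p_T\,\mathrm{d}\eta},
```

    so a suppression factor of 5-6 corresponds roughly to R_AA ≈ 0.2 and a factor of 3-4 to R_AA ≈ 0.3.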

  18. NINJA: Java for High Performance Numerical Computing

    José E. Moreira

    2002-01-01

    Full Text Available When Java was first introduced, there was a perception that its many benefits came at a significant performance cost. In the particularly performance-sensitive field of numerical computing, initial measurements indicated a hundred-fold performance disadvantage between Java and more established languages such as Fortran and C. Although much progress has been made, and Java now can be competitive with C/C++ in many important situations, significant performance challenges remain. Existing Java virtual machines are not yet capable of performing the advanced loop transformations and automatic parallelization that are now common in state-of-the-art Fortran compilers. Java also has difficulties in implementing complex arithmetic efficiently. These performance deficiencies can be attacked with a combination of class libraries (packages, in Java) that implement truly multidimensional arrays and complex numbers, and new compiler techniques that exploit the properties of these class libraries to enable other, more conventional, optimizations. Two compiler techniques, versioning and semantic expansion, can be leveraged to allow fully automatic optimization and parallelization of Java code. Our measurements with the NINJA prototype Java environment show that Java can be competitive in performance with highly optimized and tuned Fortran code.

  19. Measurement uncertainty analysis techniques applied to PV performance measurements

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: Increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates quality assurance and quality control measures have been accomplished; defines Valid Data as data having known and documented paths of: Origin, including theory; measurements; traceability to measurement standards; computations; uncertainty analysis of results.

  20. Measurement uncertainty analysis techniques applied to PV performance measurements

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: Increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates quality assurance and quality control measures have been accomplished; defines Valid Data as data having known and documented paths of: Origin, including theory; measurements; traceability to measurement standards; computations; uncertainty analysis of results.

  1. Plasma L-ergothioneine measurement by high-performance liquid chromatography and capillary electrophoresis after a pre-column derivatization with 5-iodoacetamidofluorescein (5-IAF) and fluorescence detection.

    Sotgia, Salvatore; Pisanu, Elisabetta; Pintus, Gianfranco; Erre, Gian Luca; Pinna, Gerard Aime; Deiana, Luca; Carru, Ciriaco; Zinellu, Angelo

    2013-01-01

    Two sensitive and reproducible capillary electrophoresis and high-performance liquid chromatography-fluorescence procedures were established for the quantitative determination of L-ergothioneine in plasma. After derivatization of L-ergothioneine with 5-iodoacetamidofluorescein, the separation was carried out by HPLC on an ODS-2 C-18 Spherisorb column using linear gradient elution, and by HPCE on an uncoated fused-silica capillary, 50 µm i.d. and 60 cm in length. The methods were validated and found to be linear in the range of 0.3 to 10 µmol/l. The limit of quantification was 0.27 µmol/l for HPCE and 0.15 µmol/l for HPLC. Intra- and inter-assay precision (RSD) was around 6%, and the mean recovery was close to 100% (96.11%).
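
    The linearity and limit-of-quantification figures reported above can be illustrated with a small Python sketch; the calibration points, blank standard deviation, and the 10·σ(blank)/slope LOQ convention used here are assumptions for demonstration, not the authors' actual data or validation protocol.

      import numpy as np

      # Hypothetical calibration standards (µmol/l) and fluorescence responses (arbitrary units).
      conc = np.array([0.3, 1.0, 2.5, 5.0, 7.5, 10.0])
      signal = np.array([410.0, 1380.0, 3440.0, 6900.0, 10300.0, 13800.0])

      # Least-squares calibration line and correlation coefficient.
      slope, intercept = np.polyfit(conc, signal, 1)
      r = np.corrcoef(conc, signal)[0, 1]

      # One common LOQ convention: 10 x standard deviation of blank replicates / slope.
      blank_sd = 35.0                   # hypothetical SD of the blank signal
      loq = 10 * blank_sd / slope

      print(f"slope = {slope:.1f}, intercept = {intercept:.1f}, r = {r:.4f}")
      print(f"estimated LOQ ~ {loq:.2f} µmol/l")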

  2. Simple measurement of isepamicin, a new aminoglycoside antibiotic, in guinea pig and human plasma, using high-performance liquid chromatography with ultraviolet detection

    Dionisotti, S.; Bamonte, F.; Scaglione, F.; Ongini, E.

    1991-01-01

    Isepamicin, the 1-N-(S-alpha-hydroxy-beta-aminopropionyl) derivative of gentamicin B, is a new aminoglycoside antibiotic, which not only has most of the properties of amikacin but also is effective against several amikacin-resistant strains of bacteria. The drug was assayed in guinea-pig and human plasma with a high-performance liquid chromatographic procedure using precolumn derivatization with 1-fluoro-2,4-dinitrobenzene and ultraviolet detection. Linearity was established over the range 0.5-40 micrograms/ml using 50 microliters of plasma. Accuracy showed a mean relative error of less than 3%, and precision a mean coefficient of variation of 5%. Isepamicin was determined without interference from plasma constituents or other drugs commonly prescribed during aminoglycoside therapy. This procedure correlates well with radioimmunoassay and can be used either in experimental studies or therapeutic monitoring of plasma levels.
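
    For readers unfamiliar with the accuracy and precision figures quoted above, the following sketch shows one common way to compute them from replicate quality-control measurements; the nominal concentration and replicate values are invented.

      import numpy as np

      # Hypothetical replicate measurements of a 10 µg/ml isepamicin quality-control sample.
      nominal = 10.0
      replicates = np.array([9.8, 10.3, 9.6, 10.4, 10.1, 9.7])

      mean = replicates.mean()
      relative_error_pct = abs(mean - nominal) / nominal * 100    # accuracy (mean relative error)
      cv_pct = replicates.std(ddof=1) / mean * 100                # precision (coefficient of variation)

      print(f"mean relative error: {relative_error_pct:.1f}%")
      print(f"coefficient of variation: {cv_pct:.1f}%")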

  3. Environmental Uncertainty, Performance Measure Variety and Perceived Performance in Icelandic Companies

    Rikhardsson, Pall; Sigurjonsson, Throstur Olaf; Arnardottir, Audur Arna

    The use of performance measures and performance measurement frameworks has increased significantly in recent years. The type and variety of performance measures in use has been researched in various countries and linked to different variables such as the external environment, performance measurement frameworks, and management characteristics. This paper reports the results of a study carried out at year end 2013 of the use of performance measures by Icelandic companies and the links to perceived environmental uncertainty, management satisfaction with the performance measurement system, and the perceived performance of the company. The sample was the 300 largest companies in Iceland and the response rate was 27%. Compared to other studies, the majority of the respondents use a surprisingly high number of different measures – both financial and non-financial. This made testing of the three...

  4. New 30 kA power system at Fermilab and its use for measuring the effects of ripple current on the performance of superconducting high field magnets

    Carcagno, R.; Feher, S.; Garvey, J.; Jaskierny, W.; Lamm, M.; Makulski, A.; Orris, D.F.; Pfeffer, H.; Tartaglia, M.; Tompkins, J.; Wolff, D.

    2004-01-01

    A new 30 kA, 30 V dc Power System was designed, built, and commissioned at Fermilab for testing Superconducting High Field Magnets. This system has been successfully supporting operations at the Fermilab Magnet Test Facility since April 2002. It is based on six commercial 150 kW Power Energy Industries power supply modules and the following in-house modules: six 720 Hz filters, two 15 kA/1 kV dc solid-state dump switches, and a 3 MJ/30 kA/1 kV dc dump resistor. Additional in-house electronic components were designed and built to provide precise current regulation and distribution of current and current rate of change. An industrial-type Programmable Logic Controller system was used to provide equipment interlocks and monitoring. This paper summarizes studies on the influence of characteristics of this new power system--such as ripple current--on the performance of High Field Superconducting Magnets.

  5. Ambulatory care registered nurse performance measurement.

    Swan, Beth Ann; Haas, Sheila A; Chow, Marilyn

    2010-01-01

    On March 1-2, 2010, a state-of-the-science invitational conference titled "Ambulatory Care Registered Nurse Performance Measurement" was held to focus on measuring quality at the RN provider level in ambulatory care. The conference was devoted to ambulatory care RN performance measurement and quality of health care. The specific emphasis was on formulating a research agenda and developing a strategy to study the testable components of the RN role related to care coordination and care transitions, improving patient outcomes, decreasing health care costs, and promoting sustainable system change. The objectives were achieved through presentations and discussion among expert inter-professional participants from nursing, public health, managed care, research, practice, and policy. Conference speakers identified priority areas for a unified practice, policy, and research agenda. Crucial elements of the strategic dialogue focused on issues and implications for nursing and inter-professional practice, quality, and pay-for-performance.

  6. Procedure to Measure Indoor Lighting Energy Performance

    Deru, M.; Blair, N.; Torcellini, P.

    2005-10-01

    This document provides standard definitions of performance metrics and methods to determine them for the energy performance of building interior lighting systems. It can be used for existing buildings and for proposed buildings. The primary users for whom these documents are intended are building energy analysts and technicians who design, install, and operate data acquisition systems, and who analyze and report building energy performance data. Typical results from the use of this procedure are the monthly and annual energy used for lighting, energy savings from occupancy or daylighting controls, and the percent of the total building energy use that is used by the lighting system. The document is not specifically intended for retrofit applications. However, it does complement Measurement and Verification protocols that do not provide detailed performance metrics or measurement procedures.
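
    A minimal sketch of the kind of metrics the procedure describes (annual lighting energy, its share of total building energy, and savings against a baseline); the monthly values and the baseline figure below are hypothetical.

      # Hypothetical monthly metered data (kWh) for one building; all names and values are illustrative.
      lighting_kwh = [3200, 2950, 2800, 2400, 2100, 1900, 1950, 2050, 2300, 2700, 3000, 3250]
      total_building_kwh = [41000, 38500, 37000, 34000, 31000, 30000, 30500, 31000, 33000, 36000, 39000, 41500]
      baseline_lighting_kwh = 36000    # assumed annual lighting use before daylighting/occupancy controls

      annual_lighting = sum(lighting_kwh)
      annual_total = sum(total_building_kwh)

      print(f"annual lighting energy: {annual_lighting} kWh")
      print(f"lighting share of total building energy: {100 * annual_lighting / annual_total:.1f}%")
      print(f"savings versus baseline: {baseline_lighting_kwh - annual_lighting} kWh")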

  7. Performance measures for transform data coding.

    Pearl, J.; Andrews, H. C.; Pratt, W. K.

    1972-01-01

    This paper develops performance criteria for evaluating transform data coding schemes under computational constraints. Computational constraints that conform with the proposed basis-restricted model give rise to suboptimal coding efficiency characterized by a rate-distortion relation R(D) similar in form to the theoretical rate-distortion function. Numerical examples of this performance measure are presented for Fourier, Walsh, Haar, and Karhunen-Loeve transforms.
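
    A closely related and simpler figure of merit for transform coding is the coding gain, the ratio of the arithmetic to the geometric mean of the transform-coefficient variances; the sketch below computes it for a hypothetical set of coefficient variances and is not the exact basis-restricted R(D) criterion developed in the paper.

      import numpy as np

      # Hypothetical variances of the coefficients produced by some transform on an 8-sample block.
      coeff_var = np.array([52.0, 14.0, 6.5, 3.1, 1.8, 1.1, 0.8, 0.7])

      # Transform coding gain: arithmetic mean over geometric mean of the coefficient variances.
      # Equal variances (no energy compaction) give a gain of 1, i.e. 0 dB.
      arith = coeff_var.mean()
      geo = np.exp(np.log(coeff_var).mean())
      gain = arith / geo

      print(f"coding gain: {gain:.2f} ({10 * np.log10(gain):.2f} dB)")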

  8. English Value-Added Measures: Examining the Limitations of School Performance Measurement

    Perry, Thomas

    2016-01-01

    Value-added "Progress" measures are to be introduced for all English schools in 2016 as "headline" measures of school performance. This move comes despite research highlighting high levels of instability in value-added measures and concerns about the omission of contextual variables in the planned measure. This article studies…

  9. A novel extraction technique based on carbon nanotubes reinforced hollow fiber solid/liquid microextraction for the measurement of piroxicam and diclofenac combined with high performance liquid chromatography.

    Song, Xin-Yue; Shi, Yan-Ping; Chen, Juan

    2012-10-15

    A novel design of carbon nanotubes reinforced hollow fiber solid/liquid phase microextraction (CNTs-HF-SLPME) was developed to determine piroxicam and diclofenac in different real water samples. Functionalized multi-walled carbon nanotubes (MWCNTs) were held in the pores of hollow fiber with sol-gel technology. The pores and lumen of carbon nanotubes reinforced hollow fiber were subsequently filled with a μL volume of organic solvent (1-octanol), and then the whole assembly was used for the extraction of the target analytes in direct immersion sampling mode. The target analytes were extracted from the sample by two extractants, one of which is organic solvent placed inside the pores and lumen of hollow fiber and the other one is CNTs held in the pores of hollow fiber. After extraction, the analytes were desorbed in acetonitrile and analyzed using high performance liquid chromatography. This novel extraction mode showed more excellent extraction performance in comparison with conventional hollow fiber liquid microextraction (without adding CNTs) and carbon nanotubes reinforced hollow fiber solid microextraction (CNTs held in the pores of hollow fiber, but no organic solvents placed inside the lumen of hollow fiber) under the respective optimum conditions. This method provided 47- and 184-fold enrichment factors for piroxicam and diclofenac, respectively, good inter-fiber repeatability and batch-to-batch reproducibility. Linearity was observed in the range of 20-960 μg L(-1) for piroxicam, and 10-2560 μg L(-1) for diclofenac, with correlation coefficients of 0.9985 and 0.9989, respectively. The limits of detection were 4.58 μg L(-1) for piroxicam and 0.40 μg L(-1) for diclofenac. Copyright © 2012 Elsevier B.V. All rights reserved.
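
    The enrichment factor and detection limit quoted above can be approximated as in the sketch below; the peak areas, calibration slope, and the 3·σ/slope LOD convention are illustrative assumptions rather than the authors' procedure.

      # Hypothetical figures for a microextraction experiment; not the authors' data.
      area_after_extraction = 9200.0     # peak area of the enriched extract
      area_direct_injection = 50.0       # peak area for the same sample without enrichment
      enrichment_factor = area_after_extraction / area_direct_injection

      # One common LOD convention: 3 x SD of low-level replicate signals / calibration slope.
      slope = 18.5                       # peak area per (µg/L), hypothetical calibration slope
      sd_low = 2.5                       # SD of replicate areas near the blank, hypothetical
      lod = 3 * sd_low / slope

      print(f"apparent enrichment factor: {enrichment_factor:.0f}")
      print(f"estimated LOD ~ {lod:.2f} µg/L")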

  10. Corporate Social Performance: From Output Measurement to Impact Measurement

    K.E.H. Maas (Karen)

    2009-01-01

    textabstractAll organisations have social, environmental and economic impacts that effect people, their communities and the natural environment. Impacts include intended as well as unintended effects and negative as well as positive effects. Current practice in performance measurement tends to focus

  11. Application of data mining in performance measures

    Chan, Michael F. S.; Chung, Walter W.; Wong, Tai Sun

    2001-10-01

    This paper proposes a structured framework for exploiting data mining applications for performance measures; its use is illustrated in the context of an airline company. The framework considers how a knowledge worker interacts with performance information at the enterprise level to support informed decisions in managing the effectiveness of operations. A case study of applying data mining technology to performance data in an airline company is presented, where performance measures are used specifically to assist in the aircraft delay management process. The increasingly dispersed and complex nature of airline operations puts considerable strain on knowledge workers, who must search for, acquire, and analyze information to manage performance. One major problem knowledge workers face is the identification of root causes of performance deficiency: the large number of factors involved makes analyzing root causes time consuming, and the objective of applying data mining technology is to reduce the time and resources needed for this process. Increasing market competition for better performance management in various industries gives rise to the need for intelligent use of data. Because of this, the framework proposed here is readily generalizable to industries such as manufacturing. It can assist knowledge workers who are constantly looking for ways to improve operational effectiveness through new initiatives, where the effort must be completed quickly to gain competitive advantage in the marketplace.

  12. Frequency Control Performance Measurement and Requirements

    Illian, Howard F.

    2010-12-20

    Frequency control is an essential requirement of reliable electric power system operations. Determination of frequency control depends on frequency measurement and the practices based on these measurements that dictate acceptable frequency management. This report chronicles the evolution of these measurements and practices. As technology progresses from analog to digital for calculation, communication, and control, the technical basis for frequency control measurement and practices to determine acceptable performance continues to improve. Before the introduction of digital computing, practices were determined largely by prior experience. In anticipation of mandatory reliability rules, practices evolved from a focus primarily on commercial and equity issues to an increased focus on reliability. This evolution is expected to continue and place increased requirements for more precise measurements and a stronger scientific basis for future frequency management practices in support of reliability.
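
    As a toy example of a frequency-performance statistic of the general kind discussed in the report, the sketch below computes the RMS deviation from nominal frequency and the fraction of samples inside a tolerance band from synthetic data; it is not the NERC control performance standard formula.

      import numpy as np

      # One hour of synthetic 1 Hz frequency samples around a 60 Hz nominal (illustrative only).
      rng = np.random.default_rng(0)
      freq_hz = 60.0 + rng.normal(0.0, 0.012, size=3600)

      nominal_hz = 60.0
      band_mhz = 36.0                              # illustrative +/- 36 mHz tolerance band

      deviation_mhz = (freq_hz - nominal_hz) * 1000
      rms_mhz = np.sqrt(np.mean(deviation_mhz ** 2))
      within_band_pct = np.mean(np.abs(deviation_mhz) <= band_mhz) * 100

      print(f"RMS frequency deviation: {rms_mhz:.1f} mHz")
      print(f"samples within +/-{band_mhz:.0f} mHz: {within_band_pct:.1f}%")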

  13. CITYkeys Smart city performance measurement system

    Huovila, A.; Airaksinen, M.; Pinto-Seppa, I.; Piira, K.; Bosch, P.R.; Penttinen, T.; Neumann, H.M.; Kontinakis, N.

    2017-01-01

    Cities are tackling their economic, social and environmental challenges through smart city solutions. To demonstrate that these solutions achieve the desired impact, an indicator-based assessment system is needed. This paper presents the process of developing the CITYkeys performance measurement system.

  14. Performance Measurement in Helicopter Training and Operations.

    Prophet, Wallace W.

    For almost 15 years, HumRRO Division No. 6 has conducted an active research program on techniques for measuring the flight performance of helicopter trainees and pilots. This program addressed both the elemental aspects of flying (i.e., maneuvers) and the mission- or goal-oriented aspects. A variety of approaches has been investigated, with the…

  15. Testing for Distortions in Performance Measures

    Sloof, Randolph; Van Praag, Mirjam

    2015-01-01

    Distorted performance measures in compensation contracts elicit suboptimal behavioral responses that may even prove to be dysfunctional (gaming). This paper applies the empirical test developed by Courty and Marschke (Review of Economics and Statistics, 90, 428-441) to detect whether the widely...

  16. Testing for Distortions in Performance Measures

    Sloof, Randolph; Van Praag, Mirjam

    Distorted performance measures in compensation contracts elicit suboptimal behavioral responses that may even prove to be dysfunctional (gaming). This paper applies the empirical test developed by Courty and Marschke (2008) to detect whether the widely used class of Residual Income based performa...

  17. Performance measurement in industrial R&D

    Kerssens-van Drongelen, I.C.; Nixon, Bill; Pearson, Alan

    2000-01-01

    Currently, the need for R&D performance measurements that are both practically useful and theoretically sound seems to be generally acknowledged; indeed, the rising cost of R&D, greater emphasis on value management and a trend towards decentralization are escalating the need for ways of evaluating

  18. External Innovation Implementation Determinants and Performance Measurement

    Coates, Matthew; Bals, Lydia

    2013-01-01

    for innovation implementation based on a case study in the pharmaceutical industry. The results of 25 expert interviews and a survey with 67 respondents led to the resulting framework and a corresponding performance measurement system. The results reveal the importance of supporting systems and show differences...

  19. 20 CFR 638.302 - Performance measurement.

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Performance measurement. 638.302 Section 638.302 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR JOB CORPS PROGRAM UNDER TITLE IV-B OF THE JOB TRAINING PARTNERSHIP ACT Funding, Site Selection, and Facilities Management...

  20. Tools for Measuring and Improving Performance.

    Jurow, Susan

    1993-01-01

    Explains the need for meaningful performance measures in libraries and the Total Quality Management (TQM) approach to data collection. Five tools representing different stages of a TQM inquiry are covered (i.e., the Shewhart Cycle, flowcharts, cause-and-effect diagrams, Pareto charts, and control charts), and benchmarking is addressed. (Contains…
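
    Two of the TQM tools mentioned above lend themselves to a short worked sketch: a Pareto ranking of problem causes and simple mean +/- 3 sigma control-chart limits. The categories and counts are invented for illustration.

      from collections import Counter

      # Hypothetical counts of causes of interlibrary-loan delays (categories invented for illustration).
      causes = Counter({"missing record": 42, "staff shortage": 27, "courier delay": 18,
                        "wrong shelf": 9, "other": 4})

      # Pareto analysis: rank the causes and report the cumulative share they explain.
      total = sum(causes.values())
      cumulative = 0
      for cause, count in causes.most_common():
          cumulative += count
          print(f"{cause:15s} {count:3d}   cumulative {100 * cumulative / total:5.1f}%")

      # Control-chart limits for a daily count treated as roughly normal: mean +/- 3 sigma.
      daily_counts = [5, 7, 4, 6, 8, 5, 6, 9, 4, 6]
      mean = sum(daily_counts) / len(daily_counts)
      sigma = (sum((x - mean) ** 2 for x in daily_counts) / (len(daily_counts) - 1)) ** 0.5
      print(f"control limits: {mean - 3 * sigma:.1f} to {mean + 3 * sigma:.1f}")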

  1. Combination of high-performance liquid chromatography and radioimmunoassay for the measurement of urodilatin and α-hANP in the urine of healthy males

    Solc, J.; Bauer, K.; Timnik, A.; Doehlemann, C.; Strom, T.M.; Weil, J.; Solcova, A.

    1991-01-01

    Urodilatin (ANP-(95-126)), a natriuretic peptide in urine, and α-hANP (ANP-(99-126)) are cross-reactive in the radioimmunoassay of α-hANP (ANP-RIA). The authors therefore developed a method to separate physiological amounts of urodilatin and α-hANP in urine by high-performance liquid chromatography followed by ANP-RIA of the separated fractions. They studied urine samples of 10 healthy adult males with a plasma α-hANP level of 41 ± 21 pg/ml (mean ± SD) and a total urinary ANP-RIA reactivity of 40 ± 21 pg/ml. In all urine samples they found three peaks of ANP-RIA reactivity: the first coeluting with synthetic urodilatin, the second with the retention time of α-hANP, and a late-eluting ANP-RIA-reactive peak, possibly containing degradation products. The ratio of urodilatin to α-hANP was 0.77 ± 0.17.

  2. Performance measures for world class maintenance

    Labib, A.W.

    1998-01-01

    A main problem in maintenance in general, and in power plants and related equipment in particular, is the lack of a practical, consistent, and adaptive performance measure that provides a focused feedback and integrates preventive and corrective modes of maintenance. The presentation defines concepts of world class and benchmarking. Desirable features in an appropriate performance measure are identified. It then demonstrates current practices in maintenance and criticises their shortcomings. An alternative model is presented through a case study. The model monitors performance from a general view, and then offers a focused analysis. The main conclusion is that the proposed model offers an adaptive and a dynamic framework, and hence production and maintenance are integrated in a 'real time' environment. The system is also flexible in working with any other criteria whether they are of a quantitative or a qualitative nature. (orig.) 16 refs

  3. Performance measures for world class maintenance

    Labib, A.W. [Department of Mechanical Engineering, University of Manchester, Institute of Science and Technology, Manchester (United Kingdom)

    1998-12-31

    A main problem in maintenance in general, and in power plants and related equipment in particular, is the lack of a practical, consistent, and adaptive performance measure that provides a focused feedback and integrates preventive and corrective modes of maintenance. The presentation defines concepts of world class and benchmarking. Desirable features in an appropriate performance measure are identified. It then demonstrates current practices in maintenance and criticises their shortcomings. An alternative model is presented through a case study. The model monitors performance from a general view, and then offers a focused analysis. The main conclusion is that the proposed model offers an adaptive and a dynamic framework, and hence production and maintenance are integrated in a 'real time' environment. The system is also flexible in working with any other criteria whether they are of a quantitative or a qualitative nature. (orig.) 16 refs.

  4. Performance measures for world class maintenance

    Labib, A W [Department of Mechanical Engineering, University of Manchester, Institute of Science and Technology, Manchester (United Kingdom)

    1999-12-31

    A main problem in maintenance in general, and in power plants and related equipment in particular, is the lack of a practical, consistent, and adaptive performance measure that provides a focused feedback and integrates preventive and corrective modes of maintenance. The presentation defines concepts of world class and benchmarking. Desirable features in an appropriate performance measure are identified. It then demonstrates current practices in maintenance and criticises their shortcomings. An alternative model is presented through a case study. The model monitors performance from a general view, and then offers a focused analysis. The main conclusion is that the proposed model offers an adaptive and a dynamic framework, and hence production and maintenance are integrated in a 'real time' environment. The system is also flexible in working with any other criteria whether they are of a quantitative or a qualitative nature. (orig.) 16 refs.

  5. Learning Apache Solr high performance

    Mohan, Surendra

    2014-01-01

    This book is an easy-to-follow guide, full of hands-on, real-world examples. Each topic is explained and demonstrated in a specific and user-friendly flow, from search optimization using Solr to deployment of ZooKeeper applications. This book is ideal for Apache Solr developers who want to learn different techniques to optimize Solr performance with utmost efficiency, along with effectively troubleshooting the problems that usually occur while trying to boost performance. Familiarity with search servers and database querying is expected.

  6. High-performance composite chocolate

    Dean, Julian; Thomson, Katrin; Hollands, Lisa; Bates, Joanna; Carter, Melvyn; Freeman, Colin; Kapranos, Plato; Goodall, Russell

    2013-07-01

    The performance of any engineering component depends on and is limited by the properties of the material from which it is fabricated. It is crucial for engineering students to understand these material properties, interpret them and select the right material for the right application. In this paper we present a new method to engage students with the material selection process. In a competition-based practical, first-year undergraduate students design, cost and cast composite chocolate samples to maximize a particular performance criterion. The same activity could be adapted for any level of education to introduce the subject of materials properties and their effects on the material chosen for specific applications.

  7. A Study on Relationships between Functional Performance and Task Performance Measure through Experiments in NPP MCR

    Jang, In Seok; Seong, Poong Hyun; Park, Jin Kyun

    2011-01-01

    Further improvements in levels of organization, management, man-machine interfaces, education, training, etc. are required if the high operating reliability of operators in huge and complex plants, such as chemical plants and electrical power generating plants, is to be maintained. Improvement requires a good understanding of operators' behavior, including defining what constitutes good performance for operators, especially in emergency situations. Human performance measures, therefore, are important for enhancing performance and reducing the probability of incidents and accidents in Nuclear Power Plants (NPPs). Operators' performance measures serve multiple objectives, such as control room design, human-system interface evaluation, training, procedures, and so on. There are two representative kinds of methods to measure operators' performance, known as functional performance measures and task performance measures. Functional performance measures are based on plant process parameters and indicate how well the operators controlled selected critical parameters. The parameters selected in this paper are derived from the four Critical Safety Functions (CSFs) identified in the emergency operating procedures: achievement of subcriticality, maintenance of core cooling, maintenance of heat sink, and maintenance of containment integrity. Task performance measures are based on task analysis, which determines the tasks required and how they are performed. In this paper, task analysis is done using an ideal accident-response path compiled by experts and the Emergency Operation Procedure (EOP). However, most of the literature related to operators' performance has used only one of these measures, and there is no research investigating the relationships between the two. In this paper, the relationships between functional performance measures and task performance measures are investigated using experiments.
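
    A minimal sketch of how the two kinds of measures could be scored for one simulated scenario is given below; the parameter names, acceptable bands, procedure steps, and scoring rules are invented for illustration and are not the scheme used in the paper.

      # Hypothetical scoring of one simulated scenario; parameter names, bands and steps are invented.
      # Functional measure: whether selected critical safety parameters stayed inside acceptable bands.
      critical_params = {
          "subcriticality_margin": (0.95, (0.90, 1.00)),   # (observed value, acceptable band)
          "core_exit_temp_C":      (332.0, (280.0, 330.0)),
          "sg_level_pct":          (48.0, (40.0, 60.0)),
      }
      violations = sum(1 for value, (lo, hi) in critical_params.values() if not lo <= value <= hi)
      functional_score = 1.0 - violations / len(critical_params)

      # Task measure: fraction of expert-defined procedure steps actually completed.
      expected_steps = ["verify reactor trip", "check safety injection", "isolate faulted SG", "start AFW"]
      performed_steps = ["verify reactor trip", "check safety injection", "start AFW"]
      task_score = sum(step in performed_steps for step in expected_steps) / len(expected_steps)

      print(f"functional performance score: {functional_score:.2f}")
      print(f"task performance score: {task_score:.2f}")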

  8. High-Performance Composite Chocolate

    Dean, Julian; Thomson, Katrin; Hollands, Lisa; Bates, Joanna; Carter, Melvyn; Freeman, Colin; Kapranos, Plato; Goodall, Russell

    2013-01-01

    The performance of any engineering component depends on and is limited by the properties of the material from which it is fabricated. It is crucial for engineering students to understand these material properties, interpret them and select the right material for the right application. In this paper we present a new method to engage students with…

  9. Toward High-Performance Organizations.

    Lawler, Edward E., III

    2002-01-01

    Reviews management changes that companies have made over time in adopting or adapting four approaches to organizational performance: employee involvement, total quality management, re-engineering, and knowledge management. Considers future possibilities and defines a new view of what constitutes effective organizational design in management.…

  10. Optimization and validation of a reversed-phase high performance liquid chromatography method for the measurement of bovine liver methylmalonyl-coenzyme a mutase activity.

    Ouattara, Bazoumana; Duplessis, Mélissa; Girard, Christiane L

    2013-10-16

    Methylmalonyl-CoA mutase (MCM) is an adenosylcobalamin-dependent enzyme that catalyses the interconversion of (2R)-methylmalonyl-CoA to succinyl-CoA. In humans, a deficit in the activity of MCM, due to an impairment of the intracellular formation of adenosylcobalamin and methylcobalamin, results in a wide spectrum of clinical manifestations ranging from moderate to fatal. Consequently, MCM is the subject of abundant literature. However, there is a lack of consensus on a reliable method to monitor its activity. This metabolic pathway is heavily used in ruminants because it is essential for the utilization of propionate formed during ruminal fermentation. In lactating dairy cows, propionate is the major substrate for glucose formation. In the present study, a reversed-phase high-performance liquid chromatography (RP-HPLC) method was optimized and validated to evaluate MCM activity in bovine liver. The major aim of the study was to describe the conditions that optimize the reproducibility of the method and to determine the stability of the enzyme and its product during storage and processing of samples. Specificity of the method was good, as there was no interfering peak from liver extract at the retention times corresponding to methylmalonyl-CoA or succinyl-CoA. Repeatability of the method was improved compared with previously published RP-HPLC data. Using 66 μg of protein, the intra-assay coefficient of variation (CV) of specific activities ranged from 0.90 to 8.05%, and the inter-day CV was 7.40%. Storage and processing conditions (frozen homogenate of fresh tissue vs. fresh homogenate of tissue snap-frozen in liquid nitrogen) did not alter the enzyme activity. The analyte was also stable in liver crude extract for three freeze/thaw cycles (stored at -20°C and thawed to room temperature). The improved method provides a way of studying the effects of stage of lactation, diet composition, and physiology in cattle on MCM activity over long periods of time, such as a complete lactation period.
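
    For orientation, the sketch below shows one way a specific activity of the kind monitored in this assay might be computed from an HPLC peak area; the calibration factor, incubation time, and protein amount are invented and do not come from the paper.

      # Hypothetical HPLC result for one MCM assay; every number below is illustrative.
      succinyl_coa_peak_area = 125000.0    # detector counts for the succinyl-CoA product peak
      area_per_nmol = 25000.0              # calibration factor from succinyl-CoA standards
      incubation_min = 10.0                # assay incubation time, minutes
      protein_mg = 0.066                   # 66 µg of liver protein in the assay

      product_nmol = succinyl_coa_peak_area / area_per_nmol
      specific_activity = product_nmol / incubation_min / protein_mg   # nmol/min/mg protein

      print(f"specific activity: {specific_activity:.1f} nmol succinyl-CoA/min/mg protein")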

  11. Improvement of an enterprise's marketing performance measurement system

    Stanković Ljiljana

    2013-01-01

    The business conditions in which modern enterprises operate are increasingly complex. The complexity of the business environment is caused by the activities of external and internal factors, which imposes the need for a shift in management focus. One key shift relates to the need to adapt and develop new systems for evaluating business performance. Evaluating the contribution of marketing to business performance is very important, but it is also a complex task. Marketing theory and practice indicate the need to develop adequate standards and systems for evaluating the efficiency of marketing decisions. A better understanding of the marketing standards and methods that managers use is an important factor affecting the efficiency of strategic decision-making. The paper presents the results of research into the ways managers perceive and apply marketing performance measures. The data obtained from the field research sample enabled consideration of managers' attitudes toward practical ways of implementing marketing performance measurement and identification of the measures managers report using most in business practice.

  12. Total performance measurement and management: TPM2

    Sheather, G. [University of Technology, Sydney, NSW (Australia)

    1996-10-01

    As the rate of incremental improvement activities and business process re-engineering programs increases, product development times reduce, collaborative endeavours between OEMs and out-sourcing suppliers increase, and agile manufacturing responds to the demands of a global marketplace, the 'virtual' organisation is becoming a reality. In this context, customers, partners, suppliers and manufacturers are increasingly separated by geography, time zone, and availability, but linked by distributed information systems. Measuring and monitoring business performance in this environment requires an entirely different framework and set of key performance indicators (KPIs) from those usually associated with traditional financial accounting approaches. These approaches are critiqued, then the paper introduces a new concept, 'Total Performance Measurement and Management' (TPM2), to distinguish it from the conventional TPM (Total Productive Management). A model for combining both financial and non-financial KPIs relevant to real-time performance measures, stretching across strategic, business unit and operational levels, is presented. The results of the model confirm the hypothesis that it is feasible to develop a TPM2 framework for achieving enterprise-wide strategic objectives. (author). 6 tabs., 18 figs., refs.

  13. Performance expectations of measurement control programs

    Hammond, G.A.

    1985-01-01

    The principal index for designing and assessing the effectiveness of safeguards is the sensitivity and reliability of gauging the true status of material balances involving material flows, transfers, inventories, and process holdup. The measurement system must not only be capable of characterizing the material for gradation or intensity of protection, but also be responsive to needs for detection and localization of losses, provide confirmation that no diversion has occurred, and help meet requirements for process control, health and safety. Consequently, the judicious application of a measurement control and quality assurance program is vital to a complete understanding of the capabilities and limitations of the measurement system including systematic and random components of error for weight, volume, sampling, chemical, isotopic, and nondestructive determinations of material quantities in each material balance area. This paper describes performance expectations or criteria for a measurement control program in terms of 'what' is desired and 'why', relative to safeguards and security objectives.

  14. Functional High Performance Financial IT

    Berthold, Jost; Filinski, Andrzej; Henglein, Fritz

    2011-01-01

    The world of finance faces the computational performance challenge of massively expanding data volumes, extreme response time requirements, and compute-intensive complex (risk) analyses. Simultaneously, new international regulatory rules require considerably more transparency and external auditability of financial institutions, including their software systems. To top it off, increased product variety and customisation necessitates shorter software development cycles and higher development productivity. In this paper, we report about HIPERFIT, a recently established strategic research center at the University of Copenhagen that attacks this triple challenge of increased performance, transparency and productivity in the financial sector by a novel integration of financial mathematics, domain-specific language technology, parallel functional programming, and emerging massively parallel hardware. HIPERFIT...

  15. High performance Mo adsorbent PZC

    Anon,

    1998-10-01

    We have developed Mo adsorbents for natural Mo(n,γ)⁹⁹Mo-⁹⁹ᵐTc generators. Among them, the highest performance adsorbent, which we call PZC, can adsorb about 250 mg-Mo/g. In this report, we show the structure, the Mo adsorption mechanism, and other properties of PZC that are useful when examining Mo adsorption and the elution of ⁹⁹ᵐTc. (author)

  16. Indoor Air Quality in High Performance Schools

    High performance schools are facilities that improve the learning environment while saving energy, resources, and money. The key is understanding the lifetime value of high performance schools and effectively managing priorities, time, and budget.

  17. Coming up short on nonfinancial performance measurement.

    Ittner, Christopher D; Larcker, David F

    2003-11-01

    Companies in increasing numbers are measuring customer loyalty, employee satisfaction, and other nonfinancial areas of performance that they believe affect profitability. But they've failed to relate these measures to their strategic goals or establish a connection between activities undertaken and financial outcomes achieved. Failure to make such connections has led many companies to misdirect their investments and reward ineffective managers. Extensive field research now shows that businesses make some common mistakes when choosing, analyzing, and acting on their nonfinancial measures. Among these mistakes: They set the wrong performance targets because they focus too much on short-term financial results, and they use metrics that lack strong statistical validity and reliability. As a result, the companies can't demonstrate that improvements in nonfinancial measures actually affect their financial results. The authors lay out a series of steps that will allow companies to realize the genuine promise of nonfinancial performance measures. First, develop a model that proposes a causal relationship between the chosen nonfinancial drivers of strategic success and specific outcomes. Next, take careful inventory of all the data within your company. Then use established statistical methods for validating the assumed relationships and continue to test the model as market conditions evolve. Finally, base action plans on analysis of your findings, and determine whether those plans and their investments actually produce the desired results. Nonfinancial measures will offer little guidance unless you use a process for choosing and analyzing them that relies on sophisticated quantitative and qualitative inquiries into the factors actually contributing to economic results.
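
    The statistical validation step the authors recommend can be sketched as an ordinary least-squares model linking nonfinancial drivers to a financial outcome; the quarterly data below are invented purely to show the mechanics, and a real analysis would need far more data and care about causality.

      import numpy as np

      # Invented quarterly data: customer satisfaction, employee satisfaction, and operating margin (%).
      cust = np.array([71, 73, 74, 76, 78, 79, 81, 83], dtype=float)
      emp = np.array([64, 66, 65, 68, 70, 69, 72, 74], dtype=float)
      margin = np.array([8.1, 8.4, 8.3, 8.9, 9.4, 9.3, 9.9, 10.4])

      # Ordinary least squares for the assumed causal model: margin ~ b0 + b1*cust + b2*emp.
      X = np.column_stack([np.ones_like(cust), cust, emp])
      coef, *_ = np.linalg.lstsq(X, margin, rcond=None)
      pred = X @ coef
      r2 = 1 - np.sum((margin - pred) ** 2) / np.sum((margin - margin.mean()) ** 2)

      print(f"coefficients (intercept, customer, employee): {np.round(coef, 3)}")
      print(f"R^2 of the assumed model: {r2:.3f}")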

  18. Approaches towards airport economic performance measurement

    Ivana STRYČEKOVÁ

    2011-01-01

    The paper aims to assess how economic benchmarking is used by airports as a means of performance measurement and comparison of major international airports in the world. The study focuses on current benchmarking practices and methods, taking into account the different factors against which it is efficient to benchmark airport performance. The methods considered are mainly data envelopment analysis and stochastic frontier analysis; other approaches used by airports for economic benchmarking are also discussed. The main objective of this article is to evaluate the efficiency of airports and to answer some open questions involving economic benchmarking of airports.

  19. MODERN INSTRUMENTS FOR MEASURING ORGANIZATIONAL PERFORMANCE

    RADU CATALINA

    2010-12-01

    Any significant management action can be assessed both in terms of the success of its immediate goals and as an effect of the organization's ability to embrace change. Market competition intensifies with the development of Romanian society and its needs. Companies that offer different products and services need to establish certain advantages and to increase their performance. The paper presents modern tools for measuring and evaluating organizational performance, namely the Balanced Scorecard, the Deming model and the Baldrige model. We also present an example of a Balanced Scorecard for an organization belonging to the cosmetics industry.

  20. High performance inertial fusion targets

    Nuckolls, J.H.; Bangerter, R.O.; Lindl, J.D.; Mead, W.C.; Pan, Y.L.

    1977-01-01

    Inertial confinement fusion (ICF) designs are considered which may have very high gains (approximately 1000) and low power requirements (<100 TW) for input energies of approximately one megajoule. These include targets having very low density shells, ultra thin shells, central ignitors, magnetic insulation, and non-ablative acceleration

  1. High performance inertial fusion targets

    Nuckolls, J.H.; Bangerter, R.O.; Lindl, J.D.; Mead, W.C.; Pan, Y.L.

    1978-01-01

    Inertial confinement fusion (ICF) target designs are considered which may have very high gains (approximately 1000) and low power requirements (< 100 TW) for input energies of approximately one megajoule. These include targets having very low density shells, ultra thin shells, central ignitors, magnetic insulation, and non-ablative acceleration

  2. High performance nuclear fuel element

    Mordarski, W.J.; Zegler, S.T.

    1980-01-01

    A fuel-pellet composition is disclosed for use in fast breeder reactors. Uranium carbide particles are mixed with a powder of uranium-plutonium carbides having a stable microstructure. The resulting mixture is formed into fuel pellets. The pellets thus produced exhibit a relatively low propensity to swell while maintaining a high density.

  3. Strategic Performance Measurement of Research and Development

    Parisi, Cristiana; Rossi, Paola

    2015-01-01

    The paper used an in-depth case study to investigate how firms can integrate the strategic performance measurement of R&D with the Balanced Scorecard. Moreover, the paper investigated the crucial role of the controller in the decision-making process of this integration. The literature review of R......-financial ratio as the R&D measures to introduce in the Balanced Scorecard. In choosing our case study, we have selected the pharmaceutical industry because of its relevant R&D investment. Within the sector we chose the Italian affiliate of a traditional industry leader, Eli Lilly Italia, that was characterized...

  4. High Performance JavaScript

    Zakas, Nicholas

    2010-01-01

    If you're like most developers, you rely heavily on JavaScript to build interactive and quick-responding web applications. The problem is that all of those lines of JavaScript code can slow down your apps. This book reveals techniques and strategies to help you eliminate performance bottlenecks during development. You'll learn how to improve execution time, downloading, interaction with the DOM, page life cycle, and more. Yahoo! frontend engineer Nicholas C. Zakas and five other JavaScript experts -- Ross Harmes, Julien Lecomte, Steven Levithan, Stoyan Stefanov, and Matt Sweeney -- demonstra

  5. Performance measurement in healthcare: part II--state of the science findings by stage of the performance measurement process.

    Adair, Carol E; Simpson, Elizabeth; Casebeer, Ann L; Birdsell, Judith M; Hayden, Katharine A; Lewis, Steven

    2006-07-01

    This paper summarizes findings of a comprehensive, systematic review of the peer-reviewed and grey literature on performance measurement according to each stage of the performance measurement process--conceptualization, selection and development, data collection, and reporting and use. It also outlines implications for practice. Six hundred sixty-four articles about organizational performance measurement from the health and business literature were reviewed after systematic searches of the literature, multi-rater relevancy ratings, citation checks and expert author nominations. Key themes were extracted and summarized from the most highly rated papers for each performance measurement stage. Despite a virtually universal consensus on the potential benefits of performance measurement, little evidence currently exists to guide practice in healthcare. Issues in conceptualizing systems include strategic alignment and scope. There are debates on the criteria for selecting measures and on the types and quality of measures. Implementation of data collection and analysis systems is complex and costly, and challenges persist in reporting results, preventing unintended effects and putting findings for improvement into action. There is a need for further development and refinement of performance measures and measurement systems, with a particular focus on strategies to ensure that performance measurement leads to healthcare improvement.

  6. Carpet Aids Learning in High Performance Schools

    Hurd, Frank

    2009-01-01

    The Healthy and High Performance Schools Act of 2002 has set specific federal guidelines for school design, and developed a federal/state partnership program to assist local districts in their school planning. According to the Collaborative for High Performance Schools (CHPS), high-performance schools are, among other things, healthy, comfortable,…

  7. Measurement-based reliability/performability models

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
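
    One simple diagnostic behind the semi-Markov conclusion is the coefficient of variation of the observed holding times: an exponential holding time has a coefficient of variation of 1, so markedly larger values argue for a semi-Markov model. The sketch below applies this check to invented sojourn-time data.

      import numpy as np

      # Invented sojourn times (hours) observed in an "error recovery" state of a measured system.
      holding_times = np.array([0.2, 0.5, 0.3, 4.0, 0.4, 6.5, 0.3, 0.6, 5.2, 0.4])

      # For an exponential distribution the coefficient of variation equals 1; values well above 1
      # suggest a heavier-tailed holding-time model and hence a semi-Markov rather than Markov process.
      mean = holding_times.mean()
      cv = holding_times.std(ddof=1) / mean

      print(f"mean holding time: {mean:.2f} h, coefficient of variation: {cv:.2f}")
      print("exponential holding times look questionable" if cv > 1.2 else "exponential may be adequate")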

  8. Toward a theory of high performance.

    Kirby, Julia

    2005-01-01

    What does it mean to be a high-performance company? The process of measuring relative performance across industries and eras, declaring top performers, and finding the common drivers of their success is such a difficult one that it might seem a fool's errand to attempt. In fact, no one did for the first thousand or so years of business history. The question didn't even occur to many scholars until Tom Peters and Bob Waterman released In Search of Excellence in 1982. Twenty-three years later, we've witnessed several more attempts--and, just maybe, we're getting closer to answers. In this reported piece, HBR senior editor Julia Kirby explores why it's so difficult to study high performance and how various research efforts--including those from John Kotter and Jim Heskett; Jim Collins and Jerry Porras; Bill Joyce, Nitin Nohria, and Bruce Roberson; and several others outlined in a summary chart--have attacked the problem. The challenge starts with deciding which companies to study closely. Are the stars the ones with the highest market caps, the ones with the greatest sales growth, or simply the ones that remain standing at the end of the game? (And when's the end of the game?) Each major study differs in how it defines success, which companies it therefore declares to be worthy of emulation, and the patterns of activity and attitude it finds in common among them. Yet, Kirby concludes, as each study's method incrementally solves problems others have faced, we are progressing toward a consensus theory of high performance.

  9. High performance electromagnetic simulation tools

    Gedney, Stephen D.; Whites, Keith W.

    1994-10-01

    Army Research Office Grant #DAAH04-93-G-0453 has supported the purchase of 24 additional compute nodes that were installed in the Intel iPSC/860 hypercube at the University of Kentucky (UK), rendering a 32-node multiprocessor. This facility has allowed the investigators to explore and extend the boundaries of electromagnetic simulation for important areas of defense concern, including microwave monolithic integrated circuit (MMIC) design/analysis and electromagnetic materials research and development. The iPSC/860 has also provided an ideal platform for MMIC circuit simulations. A number of parallel methods based on direct time-domain solutions of Maxwell's equations have been developed on the iPSC/860, including a parallel finite-difference time-domain (FDTD) algorithm and a parallel planar generalized Yee-algorithm (PGY). The iPSC/860 has also provided an ideal platform on which to develop a 'virtual laboratory' to numerically analyze, scientifically study and develop new types of materials with beneficial electromagnetic properties. These materials simulations are capable of assembling hundreds of microscopic inclusions from which an electromagnetic full-wave solution is obtained in toto. This powerful simulation tool has enabled research on the full-wave analysis of complex multicomponent MMIC devices and the electromagnetic properties of many types of materials to be performed numerically rather than strictly in the laboratory.

  10. High-Performance Data Converters

    Steensgaard-Madsen, Jesper

    -resolution internal D/A converters are required. Unit-element mismatch-shaping D/A converters are analyzed, and the concept of mismatch-shaping is generalized to include scaled-element D/A converters. Several types of scaled-element mismatch-shaping D/A converters are proposed. Simulations show that, when implemented...... in a standard CMOS technology, they can be designed to yield 100 dB performance at 10 times oversampling. The proposed scaled-element mismatch-shaping D/A converters are well suited for use as the feedback stage in oversampled delta-sigma quantizers. It is, however, not easy to make full use of their potential......-order difference of the output signal from the loop filter's first integrator stage. This technique avoids the need for accurate matching of analog and digital filters that characterizes the MASH topology, and it preserves the signal-band suppression of quantization errors. Simulations show that quantizers...

  11. High performance soft magnetic materials

    2017-01-01

    This book provides comprehensive coverage of the current state-of-the-art in soft magnetic materials and related applications, with particular focus on amorphous and nanocrystalline magnetic wires and ribbons and sensor applications. Expert chapters cover preparation, processing, tuning of magnetic properties, modeling, and applications. Cost-effective soft magnetic materials are required in a range of industrial sectors, such as magnetic sensors and actuators, microelectronics, cell phones, security, automobiles, medicine, health monitoring, aerospace, informatics, and electrical engineering. This book presents both fundamentals and applications to enable academic and industry researchers to pursue further developments of these key materials. This highly interdisciplinary volume represents essential reading for researchers in materials science, magnetism, electrodynamics, and modeling who are interested in working with soft magnets. Covers magnetic microwires, sensor applications, amorphous and nanocrystalli...

  12. High performance polyethylene nanocomposite fibers

    A. Dorigato

    2012-12-01

    A high density polyethylene (HDPE) matrix was melt compounded with 2 vol% of dimethyldichlorosilane-treated fumed silica nanoparticles. Nanocomposite fibers were prepared by melt spinning through a co-rotating twin screw extruder and drawing at 125°C in air. Thermo-mechanical and morphological properties of the resulting fibers were then investigated. The introduction of nanosilica improved the drawability of the fibers, allowing the achievement of higher draw ratios with respect to the neat matrix. The elastic modulus and creep stability of the fibers were remarkably improved upon nanofiller addition, with retention of the pristine tensile properties at break. Transmission electron microscope (TEM) images evidenced that the original morphology of the silica aggregates was disrupted by the applied drawing.

  13. HIGH-PERFORMANCE COATING MATERIALS

    SUGAMA,T.

    2007-01-01

    Corrosion, erosion, oxidation, and fouling by scale deposits impose critical issues in selecting the metal components used at geothermal power plants operating at brine temperatures up to 300 C. Replacing these components is very costly and time consuming. Currently, components made of titanium alloy and stainless steel commonly are employed for dealing with these problems. However, another major consideration in using these metals is not only that they are considerably more expensive than carbon steel, but also the susceptibility of corrosion-preventing passive oxide layers that develop on their outermost surface sites to reactions with brine-induced scales, such as silicate, silica, and calcite. Such reactions lead to the formation of strong interfacial bonds between the scales and oxide layers, causing the accumulation of multiple layers of scales, and the impairment of the plant component's function and efficacy; furthermore, a substantial amount of time is entailed in removing them. This cleaning operation essential for reusing the components is one of the factors causing the increase in the plant's maintenance costs. If inexpensive carbon steel components could be coated and lined with cost-effective high-hydrothermal temperature stable, anti-corrosion, -oxidation, and -fouling materials, this would improve the power plant's economic factors by engendering a considerable reduction in capital investment, and a decrease in the costs of operations and maintenance through optimized maintenance schedules.

  14. Team performance measures for abnormal plant operations

    Montgomery, J.C.; Seaver, D.A.; Holmes, C.W.; Gaddy, C.D.; Toquam, J.L.

    1990-01-01

    In order to work effectively, control room crews need to possess well-developed team skills. Extensive research supports the notion that improved quality and effectiveness are possible when a group works together, rather than as individuals. The Nuclear Regulatory Commission (NRC) has recognized the role of team performance in plant safety and has attempted to evaluate licensee performance as part of audits, inspections, and reviews. However, reliable and valid criteria for team performance have not yet been adequately developed. The purpose of the present research was to develop such reliable and valid measures of team skills. Seven dimensions of team skill performance were developed on the basis of input from NRC operator licensing examiners and from the results of previous research and experience in the area. These dimensions included two-way communications, resource management, inquiry, advocacy, conflict resolution/decision-making, stress management, and team spirit. Several different types of rating formats were developed for use with these dimensions, including a modified Behaviorally Anchored Rating Scale (BARS) format and a Behavioral Frequency format. Following pilot-testing and revision, observer and control room crew ratings of team performance were obtained using 14 control room crews responding to simulator scenarios at a BWR and a PWR reactor. It is concluded, overall, that the Behavioral Frequency ratings appeared quite promising as a measure of team skills but that additional statistical analyses and other follow-up research are needed to refine several of the team skills dimensions and to make the scales fully functional in an applied setting

  15. Delivering high performance BWR fuel reliably

    Schardt, J.F.

    1998-01-01

    Utilities are under intense pressure to reduce their production costs in order to compete in the increasingly deregulated marketplace. They need fuel, which can deliver high performance to meet demanding operating strategies. GE's latest BWR fuel design, GE14, provides that high performance capability. GE's product introduction process assures that this performance will be delivered reliably, with little risk to the utility. (author)

  16. Performance measurement with fuzzy data envelopment analysis

    Tavana, Madjid

    2014-01-01

    The intensity of global competition and ever-increasing economic uncertainties have led organizations to search for more efficient and effective ways to manage their business operations. Data envelopment analysis (DEA) has been widely used as a conceptually simple yet powerful tool for evaluating organizational productivity and performance. Fuzzy DEA (FDEA) is a promising extension of conventional DEA proposed for dealing with imprecise and ambiguous data in performance measurement problems. This book is the first volume in the literature to present the state-of-the-art developments and applications of FDEA. It is designed for students, educators, researchers, consultants and practicing managers in business, industry, and government with a basic understanding of the DEA and fuzzy logic concepts.
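
    The crisp input-oriented CCR model that fuzzy DEA generalizes can be sketched in a few lines with a linear-programming solver; the unit data below are invented, and a fuzzy extension would replace them with intervals or membership functions.

      import numpy as np
      from scipy.optimize import linprog

      # Invented data: 5 units, 2 inputs (staff, budget) and 1 output (services delivered).
      X = np.array([[20, 300], [25, 280], [18, 340], [30, 410], [22, 260]], dtype=float)
      Y = np.array([[120], [130], [110], [150], [140]], dtype=float)
      n, m, s = X.shape[0], X.shape[1], Y.shape[1]

      # Input-oriented CCR envelopment model for each unit o:
      # minimise theta subject to sum_j lambda_j*x_j <= theta*x_o and sum_j lambda_j*y_j >= y_o.
      for o in range(n):
          c = np.concatenate(([1.0], np.zeros(n)))            # decision variables: [theta, lambda_1..n]
          A_in = np.hstack([-X[o].reshape(m, 1), X.T])        # X^T lambda - theta*x_o <= 0
          A_out = np.hstack([np.zeros((s, 1)), -Y.T])         # -Y^T lambda <= -y_o
          A_ub = np.vstack([A_in, A_out])
          b_ub = np.concatenate([np.zeros(m), -Y[o]])
          res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                        bounds=[(None, None)] + [(0, None)] * n, method="highs")
          print(f"unit {o + 1}: efficiency = {res.x[0]:.3f}")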

  17. IASI instrument: technical description and measured performances

    Hébert, Ph.; Blumstein, D.; Buil, C.; Carlier, T.; Chalon, G.; Astruc, P.; Clauss, A.; Siméoni, D.; Tournier, B.

    2017-11-01

    IASI is an infrared atmospheric sounder. It will provide meteorologists and the scientific community with atmospheric spectra. The IASI system includes three instruments that will be mounted on the Metop satellite series, a data processing software package integrated in the EPS (EUMETSAT Polar System) ground segment, and a technical expertise centre implemented at CNES Toulouse. The instrument is composed of a Fourier transform spectrometer and an associated infrared imager. The optical configuration is based on a Michelson interferometer, and the interferograms are processed by an on-board digital processing subsystem, which performs the inverse Fourier transforms and the radiometric calibration. The infrared imager co-registers the IASI soundings with the AVHRR imager (AVHRR is another instrument on the Metop satellite). The presentation will focus on the architecture of the instrument, the description of the implemented technologies, and the measured performance of the first flight model. CNES is leading the IASI program in association with EUMETSAT. The instrument prime contractor is ALCATEL SPACE.

  18. A simple high-performance liquid chromatography (HPLC) method for the measurement of pyridoxal-5-phosphate and 4-pyridoxic acid in human plasma.

    Cabo, Rona; Kozik, Karolina; Milanowski, Maciej; Hernes, Sigrunn; Slettan, Audun; Haugen, Margaretha; Ye, Shu; Blomhoff, Rune; Mansoor, M Azam

    2014-06-10

    Low concentration of plasma pyridoxal-5-phosphate (PLP) is associated with hyperhomocysteinemia and inflammation. Most methods for the measurement of plasma PLP require a large specimen volume and involve the use of toxic reagents. We have developed an HPLC method for the measurement of PLP and 4-pyridoxic acid (4-PA) in plasma which requires a small specimen volume; the samples are prepared without adding any toxic reagents. Furthermore, we examined whether intake of vitamin B6 affects the concentration of plasma PLP and 4-PA. The coefficient of variation of the method was 6% and the recovery of the added vitamin in plasma was about 100%. The concentrations of plasma PLP and 4-PA in 168 healthy subjects were 40.6 (8.4-165.0) nmol/L, median and (range), and 17.5 (3.7-114.79) nmol/L, median and (range), respectively. In the multiple regression analyses, the concentration of plasma PLP was associated with the concentration of plasma 4-PA (p < …), and the concentration of plasma 4-PA was associated with plasma PLP (p < …) … plasma PLP and 4-PA. Our findings demonstrate that plasma 4-PA, BMI and sex are the major determinants of plasma PLP in healthy individuals. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. CMS High Level Trigger Timing Measurements

    Richardson, Clint

    2015-01-01

    The two-level trigger system employed by CMS consists of the Level 1 (L1) Trigger, which is implemented using custom-built electronics, and the High Level Trigger (HLT), a farm of commercial CPUs running a streamlined version of the offline CMS reconstruction software. The operational L1 output rate of 100 kHz, together with the number of CPUs in the HLT farm, imposes a fundamental constraint on the amount of time available for the HLT to process events. Exceeding this limit impacts the experiment's ability to collect data efficiently. Hence, there is a critical need to characterize the performance of the HLT farm as well as the algorithms run prior to start up in order to ensure optimal data taking. Additional complications arise from the fact that the HLT farm consists of multiple generations of hardware and there can be subtleties in machine performance. We present our methods of measuring the timing performance of the CMS HLT, including the challenges of making such measurements. Results for the performance of various Intel Xeon architectures from 2009-2014 and different data taking scenarios are also presented. (paper)
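
    The constraint described above can be made concrete with a rough calculation: the average time available per event is approximately the number of HLT cores divided by the L1 rate. The sketch below uses an assumed core count and synthetic per-event timings; it is not the CMS timing tooling.

```python
# Generic timing sketch (not the CMS HLT tools): derive a per-event budget from
# the L1 rate and the number of HLT cores, then summarise measured event timings.
import numpy as np

l1_rate_hz = 100_000          # L1 output rate
hlt_cores = 26_000            # assumed number of HLT cores (illustrative)
budget_ms = 1000.0 * hlt_cores / l1_rate_hz   # average time available per event
print(f"per-event budget ~ {budget_ms:.0f} ms")

# stand-in per-event processing times in ms (e.g. extracted from log parsing)
rng = np.random.default_rng(0)
timings_ms = rng.lognormal(mean=4.8, sigma=0.6, size=10_000)

for q in (50, 90, 99):
    print(f"{q}th percentile: {np.percentile(timings_ms, q):.0f} ms")
print("mean:", round(float(timings_ms.mean()), 1), "ms",
      "| fraction over budget:", round(float((timings_ms > budget_ms).mean()), 3))
```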

  20. Measurement of sorafenib plasma concentration by high-performance liquid chromatography in patients with advanced hepatocellular carcinoma: is it useful the application in clinical practice? A pilot study.

    Fucile, Carmen; Marenco, Simona; Bazzica, Marco; Zuccoli, Maria Laura; Lantieri, Francesca; Robbiano, Luigi; Marini, Valeria; Di Gion, Paola; Pieri, Giulia; Stura, Paola; Martelli, Antonietta; Savarino, Vincenzo; Mattioli, Francesca; Picciotto, Antonino

    2015-01-01

    Pharmacokinetics and dose-finding studies on sorafenib were conducted on heterogeneous groups of patients with solid tumors. Portal hypertension, gut motility impairment and altered bile enterohepatic circulation may explain the different sorafenib toxicological profile in cirrhotic patients. This study evaluated sorafenib plasma concentration in a homogeneous group of cirrhotic patients with hepatocellular carcinoma (HCC). Sorafenib concentrations were determined by liquid chromatography in 12 consecutive patients. Data have been evaluated by the generalized estimating equations method (the p value statistical level was set at α = 0.05). (1) There were no significant differences between sorafenib concentrations in patients who tolerated the full dose and patients whose dose was reduced due to toxicity; (2) the average sorafenib concentrations measured 3 h after the morning dosing were lower than those measured 12 h after the evening dosing (p = 0.005); (3) sorafenib concentrations decrease over time (p < 10⁻⁴); (4) an association was found between the development of severe adverse reactions and sorafenib concentrations (p < 10⁻⁵). The relationship between dose and concentration of sorafenib in HCC patients is poor and not clinically predictable, confirming the variability both in the maximum tolerated dose and in plasma concentrations. Several factors may influence the pharmacokinetics in patients with liver disease. This may explain the inter-patient variability of concentrations and the lack of differences in concentration at different dosages. It could be interesting to extend the series of HCC patients to enhance information on the kinetics of the drug; furthermore, establishing a threshold of plasma sorafenib concentrations to predict severe adverse reactions would be clinically useful.
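
    The abstract states that the repeated concentration measurements were analysed with generalized estimating equations. The sketch below shows what such an analysis could look like with statsmodels; the file name, column names and model formula are hypothetical and do not reproduce the study's actual model.

```python
# Hedged sketch of a GEE analysis of repeated sorafenib concentrations
# (hypothetical file and column names; not the study's dataset or model).
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("sorafenib_tdm.csv")    # assumed columns: patient, week,
                                         # hours_post_dose, daily_dose_mg,
                                         # concentration_mg_L

model = sm.GEE.from_formula(
    "concentration_mg_L ~ daily_dose_mg + hours_post_dose + week",
    groups="patient",                    # repeated measures clustered by patient
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),
    family=sm.families.Gaussian(),
)
result = model.fit()
print(result.summary())
```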

  1. High performance carbon nanocomposites for ultracapacitors

    Lu, Wen

    2012-10-02

    The present invention relates to composite electrodes for electrochemical devices, particularly to carbon nanotube composite electrodes for high performance electrochemical devices, such as ultracapacitors.

  2. Strategies and Experiences Using High Performance Fortran

    Shires, Dale

    2001-01-01

    .... High Performance Fortran (HPF) is a relatively new addition to the Fortran dialect. It is an attempt to provide an efficient, high-level Fortran parallel programming language for the latest generation of parallel computers, although whether it has succeeded has been debatable...

  3. Human performance assessment: methods and measures

    Andresen, Gisle; Droeivoldsmo, Asgeir

    2000-10-01

    The Human Error Analysis Project (HEAP) was initiated in 1994. The aim of the project was to acquire insights on how and why cognitive errors occur when operators are engaged in problem solving in advanced integrated control rooms. Since human error had not been studied in the HAlden Man-Machine LABoratory (HAMMLAB) before, it was also necessary to carry out research in methodology. In retrospect, it is clear that much of the methodological work is relevant to human-machine research in general, and not only to research on human error. The purpose of this report is, therefore, to give practitioners and researchers an overview of the methodological parts of HEAP. The scope of the report is limited to methods used throughout the data acquisition process, i.e., data-collection methods, data-refinement methods, and measurement methods. The data-collection methods include various types of verbal protocols, simulator logs, questionnaires, and interviews. Data-refinement methods involve different applications of the Eyecon system, a flexible data-refinement tool, and small computer programs used for rearranging, reformatting, and aggregating raw data. Measurement methods involve assessment of diagnostic behaviour, erroneous actions, complexity, task/system performance, situation awareness, and workload. The report concludes that the data-collection methods are generally both reliable and efficient. The data-refinement methods, however, should be easier to use in order to facilitate explorative analyses. Although the series of experiments provided an opportunity for measurement validation, there are still uncertainties connected to several measures, due to their reliability still being unknown. (Author). 58 refs., 7 tabs

  4. Summary of the AccNet-EuCARD Workshop on Optics Measurements, Corrections and Modelling for High-Performance Storage Rings “OMCM”, CERN, Geneva, 20-22 June 2011

    Bartolini, R; Calaga, R; Einfeld, D; Giovannozzi, M; Koutchouk, J-P; Milardi, C; Safranek, J; Tomás, R; Wenninger, J; Zimmermann, F

    2012-01-01

    The LHC, its luminosity upgrade HL-LHC, its injectors upgrade LIU and other high performance storage rings around the world face challenging requirements for optics measurement, correction and modelling. The workshop set out to review the existing techniques for measuring and controlling linear and non-linear optics parameters; precise optics determination has proven to be a key ingredient in improving the performance of past and present accelerators. From 20 to 22 June 2011 an international workshop, “OMCM,” was held at CERN with the goal of assessing the limits of the present techniques and evaluating new paths for improvement. The OMCM workshop was sponsored and supported by CERN and by the European Commission under the FP7 “Research Infrastructures” project EuCARD, grant agreement no. 227579.

  5. High Performance Grinding and Advanced Cutting Tools

    Jackson, Mark J

    2013-01-01

    High Performance Grinding and Advanced Cutting Tools discusses the fundamentals and advances in high performance grinding processes, and provides a complete overview of newly-developing areas in the field. Topics covered are grinding tool formulation and structure, grinding wheel design and conditioning and applications using high performance grinding wheels. Also included are heat treatment strategies for grinding tools, using grinding tools for high speed applications, laser-based and diamond dressing techniques, high-efficiency deep grinding, VIPER grinding, and new grinding wheels.

  6. Strategy Guideline: High Performance Residential Lighting

    Holton, J.

    2012-02-01

    The Strategy Guideline: High Performance Residential Lighting has been developed to provide a tool for the understanding and application of high performance lighting in the home. The high performance lighting strategies featured in this guide are drawn from recent advances in commercial lighting for application to typical spaces found in residential buildings. This guide offers strategies to greatly reduce lighting energy use through the application of high quality fluorescent and light emitting diode (LED) technologies. It is important to note that these strategies not only save energy in the home but also serve to satisfy the homeowner's expectations for high quality lighting.

  7. Transport in JET high performance plasmas

    2001-01-01

    Two types of high performance scenario have been produced in JET during the DTE1 campaign. One of them is the well known ELM-free hot ion H-mode scenario, extensively used in the past, which has two distinct regions: the plasma core and the edge transport barrier. The results obtained during the DTE1 campaign with D, DT and pure T plasmas confirm our previous conclusion that the core transport scales as gyroBohm in the inner half of the plasma volume, recovers its Bohm nature closer to the separatrix and behaves as ion neoclassical in the transport barrier. Measurements at the top of the barrier suggest that the width of the barrier depends on the isotope, and moreover suggest that fast ions play a key role. The other high performance scenario is the relatively recently developed Optimised Shear Scenario with small or slightly negative magnetic shear in the plasma core. Different mechanisms of Internal Transport Barrier (ITB) formation have been tested by predictive modelling and the results are compared with experimentally observed phenomena. The experimentally observed non-penetration of heavy impurities through the strong ITB, which contradicts the prediction of conventional neoclassical theory, is discussed. (author)

  8. Transport in JET high performance plasmas

    1999-01-01

    Two types of high performance scenario have been produced in JET during the DTE1 campaign. One of them is the well known ELM-free hot ion H-mode scenario, extensively used in the past, which has two distinct regions: the plasma core and the edge transport barrier. The results obtained during the DTE1 campaign with D, DT and pure T plasmas confirm our previous conclusion that the core transport scales as gyroBohm in the inner half of the plasma volume, recovers its Bohm nature closer to the separatrix and behaves as ion neoclassical in the transport barrier. Measurements at the top of the barrier suggest that the width of the barrier depends on the isotope, and moreover suggest that fast ions play a key role. The other high performance scenario is the relatively recently developed Optimised Shear Scenario with small or slightly negative magnetic shear in the plasma core. Different mechanisms of Internal Transport Barrier (ITB) formation have been tested by predictive modelling and the results are compared with experimentally observed phenomena. The experimentally observed non-penetration of heavy impurities through the strong ITB, which contradicts the prediction of conventional neoclassical theory, is discussed. (author)

  9. Carbon nanomaterials for high-performance supercapacitors

    Tao Chen; Liming Dai

    2013-01-01

    Owing to their high energy density and power density, supercapacitors exhibit great potential as high-performance energy sources for advanced technologies. Recently, carbon nanomaterials (especially, carbon nanotubes and graphene) have been widely investigated as effective electrodes in supercapacitors due to their high specific surface area, excellent electrical and mechanical properties. This article summarizes the recent progresses on the development of high-performance supercapacitors bas...

  10. Measurements of operator performance - an experimental setup

    Netland, K.

    1980-01-01

    The human has to be considered as an important element in a process control system, even if the degree of automation is extremely high. Other elements, e.g. computer, displays, etc., can to a large extent be described and quantified. The human (operator), is difficult to describe in a precise way, and it is just as difficult to predict his thinking and acting in a control room environment. Many factors influence his performance, such as: experience, motivation, level of knowledge, training, control environment, job organization, etc. These factors have to a certain degree to be described before guidelines for design of the man-process interfaces and the control room layout can be developed. For decades, the psychological science has obtained knowledge of the human mind and behaviour. This knowledge should have the potential of a positive input on our effort to describe the factors influencing the operator performance. Even if the human is complex, a better understanding of his thinking and acting, and a more precise description of the factors influencing his performance can be obtained. At OECD Halden Reactor Project an experimental set-up for such studies has been developed and implemented in the computer laboratory. The present set-up includes elements as a computer- and display-based control room, a simulator representing a nuclear power plant, training programme for the subjects, and methods for the experiments. Set-up modules allow reconfiguration of experiments. (orig./HP)

  11. The design of high performance weak current integrated amplifier

    Chen Guojie; Cao Hui

    2005-01-01

    A design method for a high performance weak-current integrated amplifier using the ICL7650 operational amplifier is introduced. The operating principle of the circuit and the steps taken to improve the amplifier's performance are illustrated. Finally, the experimental results are given. The amplifier has a programmable measurement range of 10⁻⁹-10⁻¹² A, automatic zero-correction, accurate measurement, and good stability. (authors)

  12. Standardization of test conditions for gamma camera performance measurement

    Jordan, K.

    1982-02-01

    The way of measuring gamma camera performance is to use point sources or flood sources in air, often in combination with bar phantoms. This method has nothing in common with the use of a camera in clinical practice. Particularly in the case of low energy emitters, like Tc-99m, the influence of scattered radiation on the performance of cameras is very high. The IEC document 'Characteristics and test conditions of radionuclide imaging devices' is discussed

  13. Performance measurements of hybrid PIN diode arrays

    Jernigan, J.G.; Arens, J.F.; Collins, T.; Herring, J.; Shapiro, S.L.; Wilburn, C.D.

    1990-05-01

    We report on the successful effort to develop hybrid PIN diode arrays and to demonstrate their potential as components of vertex detectors. Hybrid pixel arrays have been fabricated by the Hughes Aircraft Co. by bump bonding readout chips developed by Hughes to an array of PIN diodes manufactured by Micron Semiconductor Inc. These hybrid pixel arrays were constructed in two configurations: one array format has 10 x 64 pixels, each 120 μm square, and the other has 256 x 256 pixels, each 30 μm square. In both cases, the thickness of the PIN diode layer is 300 μm. Measurements of detector performance show that excellent position resolution can be achieved by interpolation. By determining the centroid of the charge cloud, which spreads charge into a number of neighboring pixels, a spatial resolution of a few microns has been attained. The noise has been measured to be about 300 electrons (rms) at room temperature, as expected from kTC and dark current considerations, yielding a signal-to-noise ratio of about 100 for minimum ionizing particles. 4 refs., 13 figs
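
    The centroid interpolation mentioned above amounts to a charge-weighted average of pixel positions. A minimal sketch, with invented charge values and assuming the 30 um pixel pitch:

```python
# Charge-weighted centroid interpolation over a pixel neighbourhood - the kind of
# calculation the abstract refers to (illustrative values, 30 um pixels assumed).
import numpy as np

pitch_um = 30.0
# 3x3 neighbourhood of collected charge (electrons) around the seed pixel
q = np.array([[  40.,  310.,   55.],
              [ 120., 2400.,  260.],
              [  30.,  150.,   45.]])

rows, cols = np.indices(q.shape)
x_um = (cols - 1) * pitch_um          # local coordinates, seed pixel at (0, 0)
y_um = (rows - 1) * pitch_um

x_hit = (q * x_um).sum() / q.sum()    # charge-weighted centroid
y_hit = (q * y_um).sum() / q.sum()
print(f"reconstructed hit offset: x = {x_hit:+.1f} um, y = {y_hit:+.1f} um")
```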

  14. VPN (Virtual Private Network) Performance Measurements

    Calderon, Calixto; Goncalves, Joao G.M.; Sequeira, Vitor [Joint Research Centre, Ispra (Italy). Inst. for the Protection and Security of the Citizen; Vandaele, Roland; Meylemans, Paul [European Commission, DG-TREN (Luxembourg)

    2003-05-01

    Virtual Private Networks (VPNs) are an important technology allowing for secure communications through insecure transmission media (i.e., the Internet) by adding authentication and encryption to the existing protocols. This paper describes some VPN performance indicators measured over international communication links. An ISDN based VPN link was established between the Joint Research Centre, Ispra site, Italy, and EURATOM Safeguards in Luxembourg. This link connected two EURATOM Safeguards FAST surveillance stations, and used hardware from different vendors (Cisco 1720 router and Nokia CC-500 Gateway). To authenticate and secure this international link, we used several methods at different levels of the seven-layer ISO network protocol stack (e.g., the callback feature and CHAP, the Challenge Handshake Authentication Protocol). The tests made involved the use of different encryption algorithms and the way session secret keys are periodically renewed, considering that these elements significantly influence the transmission throughput. Future tests will include the use of a wide variety of wireless transmission media and terminal equipment technologies, in particular PDAs (Personal Digital Assistants) and notebook PCs. These tests aim at characterising the functionality of VPNs whenever field inspectors wish to contact headquarters to access information from a central archive database or transmit local measurements or documents. These technologies cover wireless transmission needs at different geographical scales: room level (Bluetooth), floor or building level (Wi-Fi) and region or country level (GPRS).

  15. Team Development for High Performance Management.

    Schermerhorn, John R., Jr.

    1986-01-01

    The author examines a team development approach to management that creates shared commitments to performance improvement by focusing the attention of managers on individual workers and their task accomplishments. It uses the "high-performance equation" to help managers confront shared beliefs and concerns about performance and develop realistic…

  16. Delivering high performance BWR fuel reliably

    Schardt, J.F. [GE Nuclear Energy, Wilmington, NC (United States)

    1998-07-01

    Utilities are under intense pressure to reduce their production costs in order to compete in the increasingly deregulated marketplace. They need fuel, which can deliver high performance to meet demanding operating strategies. GE's latest BWR fuel design, GE14, provides that high performance capability. GE's product introduction process assures that this performance will be delivered reliably, with little risk to the utility. (author)

  17. HPTA: High-Performance Text Analytics

    Vandierendonck, Hans; Murphy, Karen; Arif, Mahwish; Nikolopoulos, Dimitrios S.

    2017-01-01

    One of the main targets of data analytics is unstructured data, which primarily involves textual data. High-performance processing of textual data is non-trivial. We present the HPTA library for high-performance text analytics. The library helps programmers to map textual data to a dense numeric representation, which can be handled more efficiently. HPTA encapsulates three performance optimizations: (i) efficient memory management for textual data, (ii) parallel computation on associative dat...
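
    The abstract does not show the HPTA interface, so the sketch below illustrates only the general idea of mapping text to a numeric (document-term) representation in plain Python; it is not the HPTA API and makes no claim about how the library manages memory or parallelism.

```python
# The general idea of mapping text to a numeric representation (bag of words).
# Plain Python for illustration only - this is NOT the HPTA API.
from collections import Counter

docs = [
    "high performance text analytics",
    "text analytics maps text to numbers",
]

# build a vocabulary, then a dense document-term matrix
vocab = sorted({w for d in docs for w in d.lower().split()})

matrix = []
for d in docs:
    counts = Counter(d.lower().split())
    matrix.append([counts.get(w, 0) for w in vocab])

print(vocab)
for row in matrix:
    print(row)
```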

  18. Measuring performances of Linux hypervisors

    Chierici, A.; Veraldi, R.; Salomoni, D.

    2009-01-01

    Virtualisation is a now proven software technology that is rapidly transforming the IT landscape and fundamentally changing the way people make computations and implement services. Recently, all major software producers (e.g., Microsoft and Red Hat) developed or acquired virtualisation technologies. Our institute (http://www.CNAF.INFN.it) is a Tier 1 for experiments carried out at the Large Hadron Collider at CERN (http://lhc.web.CERN.ch/lhc/) and is experiencing several benefits from virtualisation technologies, like improved fault tolerance, efficient hardware resource usage and increased security. Currently, the virtualisation solution we adopted is Xen, which is well supported by the Scientific Linux distribution, widely used by the High-Energy Physics (HEP) community. Since Scientific Linux is based on Red Hat ES, we felt the need to investigate performance and usability differences with the new KVM technology, recently acquired by Red Hat. The case study of this work is the Tier2 site for the LHCb experiment hosted at our institute; all major grid elements for this Tier2 run smoothly on Xen virtual machines. We will investigate the impact on performance and stability that a migration to KVM would entail on the Tier2 site, as well as the effort required by a system administrator to deploy the migration.

  19. Performance Measurement of Management System Standards Using the Balanced Scorecard

    Jan Kopia

    2017-11-01

    Management system standards (MSS), such as ISO standards, TQM, etc., are widely used standards adopted by millions of organizations worldwide. It is still unclear whether these standards are beneficial for an organization, beyond the fact that they might be required or expected by law or by customers. The question of whether MSS increase the efficiency, the output, or the performance of an organization is still discussed in scientific research. One reason might be that performance measurement itself is not fully understood and is in constant development, ranging from purely financial evaluations over intellectual capital ratings to calculations of levels of environmental, social or economic expectations, known as the Triple Bottom Line. The Balanced Scorecard is one possible solution for performance measurement on a strategic and operational level and is therefore useful for measuring the influence of MSS within organizations. This study summarizes current research in the field of performance measurement in the context of MSS and IMS and the use of the BSC, and quantitatively and qualitatively tests the usefulness of the BSC in measuring the effect of MSS using the Execution Premium. It was found that the BSC is often used, that an average number of companies integrate the measurement initiatives of their MSS into the BSC process, and that a high degree of integration of MSS into the BSC improves organizational performance. This research is useful for researchers and practitioners in order to understand the benefits of using the BSC in the context of MSS or Integrated Management Systems.

  20. JT-60U high performance regimes

    Ishida, S.

    1999-01-01

    High performance regimes of JT-60U plasmas are presented with an emphasis upon the results from the use of a semi-closed pumped divertor with W-shaped geometry. Plasma performance in transient and quasi steady states has been significantly improved in the reversed shear and high-βp regimes. The reversed shear regime transiently reached an equivalent Q_DT^eq of up to 1.25 (n_D(0)·τ_E·T_i(0) = 8.6×10²⁰ m⁻³·s·keV) in a reactor-relevant, thermonuclear-dominant regime. Long sustainment of enhanced confinement with internal transport barriers (ITBs) and fully non-inductive current drive in a reversed shear discharge was successfully demonstrated with LH wave injection. Performance sustainment has been extended in the high-βp regime with high triangularity, achieving long sustainment of plasma conditions equivalent to Q_DT^eq ≈ 0.16 (n_D(0)·τ_E·T_i(0) ≈ 1.4×10²⁰ m⁻³·s·keV) for about 4.5 s with a large non-inductive current drive fraction of 60-70% of the plasma current. Thermal and particle transport analyses show significant reduction of thermal and particle diffusivities around the ITB, resulting in a strong Er shear in the ITB region. The W-shaped divertor is effective for He ash exhaust, demonstrating a steady exhaust capability of τ_He*/τ_E ≈ 3-10 in support of ITER. Suppression of neutral back flow and a chemical sputtering effect have been observed, while the MARFE onset density is somewhat decreased. Negative-ion based neutral beam injection (N-NBI) experiments have produced a clear H-mode transition. An enhanced ionization cross-section due to multi-step ionization processes was confirmed, as theoretically predicted. The current density profile driven by N-NBI was measured and is in good agreement with theoretical prediction. N-NBI induced TAE modes, characterized as persistent and bursting oscillations, have been observed from a low hot beta of β_h ≈ 0.1-0.2% without a significant loss of fast ions. (author)

  1. Performance Measurement of Complex Event Platforms

    Eva Zámečníková

    2016-12-01

    The aim of this paper is to find and compare existing solutions for complex event processing (CEP) platforms. CEP platforms generally serve for processing and/or predicting high frequency data. We intend to use a CEP platform for processing complex time series and to integrate a solution for a newly proposed decision-making method. The decision-making process will be described by a formal grammar. As there are many CEP solutions, we take the following characteristics into consideration: processing in real time, the ability to process high volumes of data from multiple sources, platform independence, the possibility of integrating the platform with a user's own solution, and an open license. First we discuss existing CEP tools and their specific uses in practice. Then we describe the design of a method for formalizing the business rules used for decision making. Afterwards, we focus on the two platforms which seem to be the best fit for integration of our solution and list the main pros and cons of each approach. The next part is devoted to benchmark platforms for CEP. The final part is devoted to experimental measurements of the platform with the integrated method for decision support.

  2. Strategy Guideline. Partnering for High Performance Homes

    Prahl, Duncan [IBACOS, Inc., Pittsburgh, PA (United States)

    2013-01-01

    High performance houses require a high degree of coordination and have significant interdependencies between various systems in order to perform properly, meet customer expectations, and minimize risks for the builder. Responsibility for the key performance attributes is shared across the project team and can be well coordinated through advanced partnering strategies. For high performance homes, traditional partnerships need to be matured to the next level and be expanded to all members of the project team including trades, suppliers, manufacturers, HERS raters, designers, architects, and building officials as appropriate. This guide is intended for use by all parties associated in the design and construction of high performance homes. It serves as a starting point and features initial tools and resources for teams to collaborate to continually improve the energy efficiency and durability of new houses.

  3. Performance analysis and evaluation of direct phase measuring deflectometry

    Zhao, Ping; Gao, Nan; Zhang, Zonghua; Gao, Feng; Jiang, Xiangqian

    2018-04-01

    Three-dimensional (3D) shape measurement of specular objects plays an important role in intelligent manufacturing applications. Phase measuring deflectometry (PMD)-based methods are widely used to obtain the 3D shapes of specular surfaces because they offer the advantages of a large dynamic range, high measurement accuracy, full-field and noncontact operation, and automatic data processing. To enable measurement of specular objects with discontinuous and/or isolated surfaces, a direct PMD (DPMD) method has been developed to build a direct relationship between phase and depth. In this paper, a new virtual measurement system is presented and is used to optimize the system parameters and evaluate the system's performance in DPMD applications. Four system parameters are analyzed to obtain accurate measurement results. Experiments are performed using simulated and actual data and the results confirm the effects of these four parameters on the measurement results. Researchers can therefore select suitable system parameters for actual DPMD (including PMD) measurement systems to obtain the 3D shapes of specular objects with high accuracy.
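
    PMD-type systems first recover a fringe phase from phase-shifted images and then map phase to geometry. The sketch below shows only the generic four-step phase-shifting retrieval on simulated fringes; the direct phase-to-depth relation that defines DPMD depends on the calibrated system geometry and is not reproduced here.

```python
# Generic four-step phase-shifting retrieval, the first stage of a PMD/DPMD pipeline.
# Fringe images are simulated; the subsequent phase-to-depth mapping is omitted.
import numpy as np

h, w = 480, 640
yy, xx = np.mgrid[0:h, 0:w]
true_phase = 2 * np.pi * xx / 64.0 + 0.3 * np.sin(2 * np.pi * yy / 200.0)

# four captured images with pi/2 phase shifts (plus bias and modulation)
shifts = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]
I = [128 + 100 * np.cos(true_phase + s) for s in shifts]

wrapped = np.arctan2(I[3] - I[1], I[0] - I[2])       # wrapped phase in (-pi, pi]
err = np.angle(np.exp(1j * (wrapped - true_phase)))  # compare modulo 2*pi
print("max wrapped-phase error:", np.abs(err).max())
```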

  4. High-performance ceramics. Fabrication, structure, properties

    Petzow, G.; Tobolski, J.; Telle, R.

    1996-01-01

    The program ''Ceramic High-performance Materials'' pursued the objective to understand the chaining of cause and effect in the development of high-performance ceramics. This chain of problems begins with the chemical reactions for the production of powders, comprises the characterization, processing, shaping and compacting of powders, structural optimization, heat treatment, production and finishing, and leads to issues of materials testing and of a design appropriate to the material. The program ''Ceramic High-performance Materials'' has resulted in contributions to the understanding of fundamental interrelationships in terms of materials science, which are summarized in the present volume - broken down into eight special aspects. (orig./RHM)

  5. High Burnup Fuel Performance and Safety Research

    Bang, Je Keun; Lee, Chan Bok; Kim, Dae Ho (and others)

    2007-03-15

    The worldwide trend in nuclear fuel development is to develop high burnup, high performance nuclear fuel with high economy and safety. Because the fuel performance evaluation code INFRA is patented, and because its superiority in predicting fuel performance was proven through the IAEA CRP FUMEX-II program, the INFRA code can be utilized commercially in industry. The INFRA code was provided to, and used productively by, domestic universities and relevant institutes, and it has served as a reference code in industry for the development of an indigenous fuel rod design code.

  6. The effect of subject measurement error on joint kinematics in the conventional gait model: Insights from the open-source pyCGM tool using high performance computing methods.

    Schwartz, Mathew; Dixon, Philippe C

    2018-01-01

    The conventional gait model (CGM) is a widely used biomechanical model which has been validated over many years. The CGM relies on retro-reflective markers placed along anatomical landmarks, a static calibration pose, and subject measurements as inputs for joint angle calculations. While past literature has shown the possible errors caused by improper marker placement, studies on the effects of inaccurate subject measurements are lacking. Moreover, as many laboratories rely on the commercial version of the CGM, released as the Plug-in Gait (Vicon Motion Systems Ltd, Oxford, UK), integrating improvements into the CGM code is not easily accomplished. This paper introduces a Python implementation for the CGM, referred to as pyCGM, which is an open-source, easily modifiable, cross platform, and high performance computational implementation. The aims of pyCGM are to (1) reproduce joint kinematic outputs from the Vicon CGM and (2) be implemented in a parallel approach to allow integration on a high performance computer. The aims of this paper are to (1) demonstrate that pyCGM can systematically and efficiently examine the effect of subject measurements on joint angles and (2) be updated to include new calculation methods suggested in the literature. The results show that the calculated joint angles from pyCGM agree with Vicon CGM outputs, with a maximum lower body joint angle difference of less than 10⁻⁵ degrees. Through the hierarchical system, the ankle joint is the most vulnerable to subject measurement error. Leg length has the greatest effect on all joints as a percentage of measurement error. When compared to the errors previously found through inter-laboratory measurements, the impact of subject measurements is minimal, and researchers should rather focus on marker placement. Finally, we showed that code modifications can be performed to include improved hip, knee, and ankle joint centre estimations suggested in the existing literature. The pyCGM code is provided
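
    The kind of sensitivity sweep described above can be illustrated with a toy example: perturb one subject measurement, recompute a joint angle, and farm the cases out in parallel. The sketch below is not the pyCGM API; the joint-centre coordinates, the simplified knee-angle definition and the way the error shifts the hip estimate are all invented for illustration.

```python
# Illustrative only - not the pyCGM API. Shows the general pattern of sweeping a
# subject-measurement error and recomputing a joint angle in parallel.
import numpy as np
from multiprocessing import Pool

def knee_flexion_deg(hip, knee, ankle):
    """Angle between thigh and shank vectors (a simplified stand-in for the CGM)."""
    thigh = hip - knee
    shank = ankle - knee
    c = np.dot(thigh, shank) / (np.linalg.norm(thigh) * np.linalg.norm(shank))
    return 180.0 - np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

def run_case(leg_length_error_mm):
    hip = np.array([0.0, 0.0, 900.0])      # made-up static joint centres [mm]
    knee = np.array([0.0, 20.0, 450.0])
    ankle = np.array([0.0, 0.0, 80.0])
    # crude stand-in for how a measurement error might shift an estimated joint centre
    hip_shifted = hip + np.array([0.0, 0.0, leg_length_error_mm])
    return leg_length_error_mm, knee_flexion_deg(hip_shifted, knee, ankle)

if __name__ == "__main__":
    errors_mm = np.arange(-30, 31, 10)
    with Pool() as pool:
        for err, angle in pool.map(run_case, errors_mm):
            print(f"leg-length error {int(err):+3d} mm -> knee flexion {angle:6.2f} deg")
```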

  7. 45 CFR 305.40 - Penalty performance measures and levels.

    2010-10-01

    ... HUMAN SERVICES PROGRAM PERFORMANCE MEASURES, STANDARDS, FINANCIAL INCENTIVES, AND PENALTIES § 305.40 Penalty performance measures and levels. (a) There are three performance measures for which States must...

  8. ADVANCED HIGH PERFORMANCE SOLID WALL BLANKET CONCEPTS

    WONG, CPC; MALANG, S; NISHIO, S; RAFFRAY, R; SAGARA, S

    2002-01-01

    First wall and blanket (FW/blanket) design is a crucial element in the performance and acceptance of a fusion power plant. High temperature structural and breeding materials are needed for high thermal performance. A suitable combination of structural design with the selected materials is necessary for D-T fuel sufficiency. Whenever possible, low afterheat, low chemical reactivity and low activation materials are desired to achieve passive safety and minimize the amount of high-level waste. Of course the selected fusion FW/blanket design will have to match the operational scenarios of high performance plasma. The key characteristics of eight advanced high performance FW/blanket concepts are presented in this paper. Design configurations, performance characteristics, unique advantages and issues are summarized. All reviewed designs can satisfy most of the necessary design goals. For further development, in concert with the advancement in plasma control and scrape off layer physics, additional emphasis will be needed in the areas of first wall coating material selection, design of plasma stabilization coils, consideration of reactor startup and transient events. To validate the projected performance of the advanced FW/blanket concepts the critical element is the need for 14 MeV neutron irradiation facilities for the generation of necessary engineering design data and the prediction of FW/blanket components lifetime and availability

  9. High performance liquid chromatography in pharmaceutical analyses

    Branko Nikolin

    2004-05-01

    In the testing, pre-sale procedures, marketing and control of drugs over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a complementary method to gas chromatography; today, however, it has almost completely replaced gas chromatography in pharmaceutical analysis. The use of a liquid mobile phase, with the possibility of changing the mobile phase polarity during chromatography and of making other modifications to the mobile phase depending upon the characteristics of the substance being tested, is a great advantage in the separation process compared to other methods. The wider choice of stationary phases is a further factor that enables good separation. The separation column connected to specific and sensitive detector systems - spectrofluorimeters, diode detectors, electrochemical detectors, as well as hyphenated systems such as HPLC-MS and HPLC-NMR - forms the basis of the wide and effective application of the HPLC method. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm the identity of the drug, provide quantitative results and monitor the progress of therapy of a disease. Fig. 1 shows a chromatogram obtained from the plasma of depressed patients 12 h before oral administration of dexamethasone. HPLC may also be used to further our understanding of normal and disease processes in the human body through biomedical and therapeutic research during investigations prior to drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or
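
    Quantitative HPLC assays of the kind discussed above ultimately reduce to a calibration curve. A minimal sketch with invented peak areas and standard concentrations:

```python
# Minimal external-calibration quantification, the core arithmetic behind
# quantitative HPLC assays (invented peak areas and concentrations).
import numpy as np

conc_std = np.array([0.5, 1.0, 2.0, 5.0, 10.0])        # standards [ug/mL]
area_std = np.array([1210, 2395, 4820, 12010, 24150])  # detector peak areas

slope, intercept = np.polyfit(conc_std, area_std, 1)   # linear calibration
r = np.corrcoef(conc_std, area_std)[0, 1]

area_sample = 7650                                      # unknown plasma extract
conc_sample = (area_sample - intercept) / slope
print(f"calibration: area = {slope:.0f}*conc + {intercept:.0f}  (r = {r:.4f})")
print(f"sample concentration ~ {conc_sample:.2f} ug/mL")
```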

  10. High Performance Walls in Hot-Dry Climates

    Hoeschele, Marc [National Renewable Energy Lab. (NREL), Golden, CO (United States); Springer, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dakin, Bill [National Renewable Energy Lab. (NREL), Golden, CO (United States); German, Alea [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-01-01

    High performance walls represent a high priority measure for moving the next generation of new homes to the Zero Net Energy performance level. The primary goal in improving wall thermal performance revolves around increasing the wall framing from 2x4 to 2x6, adding more cavity and exterior rigid insulation, achieving insulation installation criteria meeting ENERGY STAR's thermal bypass checklist, and reducing the amount of wood penetrating the wall cavity.

  11. High performance liquid chromatographic determination of ...

    STORAGESEVER

    2010-02-08

    ) high performance liquid chromatography (HPLC) grade .... applications. These are important requirements if the reagent is to be applicable to on-line pre or post column derivatisation in a possible automation of the analytical.

  12. Analog circuit design designing high performance amplifiers

    Feucht, Dennis

    2010-01-01

    The third volume Designing High Performance Amplifiers applies the concepts from the first two volumes. It is an advanced treatment of amplifier design/analysis emphasizing both wideband and precision amplification.

  13. High-performance computing using FPGAs

    Benkrid, Khaled

    2013-01-01

    This book is concerned with the emerging field of High Performance Reconfigurable Computing (HPRC), which aims to harness the high performance and relatively low power of reconfigurable hardware - in the form of Field Programmable Gate Arrays (FPGAs) - in High Performance Computing (HPC) applications. It presents the latest developments in this field from the applications, architecture, and tools and methodologies points of view. We hope that this work will form a reference for existing researchers in the field, and entice new researchers and developers to join the HPRC community. The book includes: thirteen application chapters which present the most important application areas tackled by high performance reconfigurable computers, namely: financial computing, bioinformatics and computational biology, data search and processing, stencil computation (e.g. computational fluid dynamics and seismic modeling), cryptanalysis, astronomical N-body simulation, and circuit simulation; and seven architecture chapters which...

  14. Embedded High Performance Scalable Computing Systems

    Ngo, David

    2003-01-01

    The Embedded High Performance Scalable Computing Systems (EHPSCS) program is a cooperative agreement between Sanders, A Lockheed Martin Company and DARPA that ran for three years, from Apr 1995 - Apr 1998...

  15. Gradient High Performance Liquid Chromatography Method ...

    Purpose: To develop a gradient high performance liquid chromatography (HPLC) method for the simultaneous determination of phenylephrine (PHE) and ibuprofen (IBU) in solid ..... nimesulide, phenylephrine. Hydrochloride, chlorpheniramine maleate and caffeine anhydrous in pharmaceutical dosage form. Acta Pol.

  16. Highlighting High Performance: Whitman Hanson Regional High School; Whitman, Massachusetts

    2006-06-01

    This brochure describes the key high-performance building features of the Whitman-Hanson Regional High School. The brochure was paid for by the Massachusetts Technology Collaborative as part of their Green Schools Initiative. High-performance features described are daylighting and energy-efficient lighting, indoor air quality, solar and wind energy, building envelope, heating and cooling systems, water conservation, and acoustics. Energy cost savings are also discussed.

  17. High temperature measurement of water vapor absorption

    Keefer, Dennis; Lewis, J. W. L.; Eskridge, Richard

    1985-01-01

    An investigation was undertaken to measure the absorption coefficient, at a wavelength of 10.6 microns, for mixtures of water vapor and a diluent gas at high temperature and pressure. The experimental concept was to create the desired conditions of temperature and pressure in a laser absorption wave, similar to that which would be created in a laser propulsion system. A simplified numerical model was developed to predict the characteristics of the absorption wave and to estimate the laser intensity threshold for initiation. A non-intrusive method for temperature measurement utilizing optical laser-beam deflection (OLD) and optical spark breakdown produced by an excimer laser, was thoroughly investigated and found suitable for the non-equilibrium conditions expected in the wave. Experiments were performed to verify the temperature measurement technique, to screen possible materials for surface initiation of the laser absorption wave and to attempt to initiate an absorption wave using the 1.5 kW carbon dioxide laser. The OLD technique was proven for air and for argon, but spark breakdown could not be produced in helium. It was not possible to initiate a laser absorption wave in mixtures of water and helium or water and argon using the 1.5 kW laser, a result which was consistent with the model prediction.

  18. High performance computing in Windows Azure cloud

    Ambruš, Dejan

    2013-01-01

    High performance, security, availability, scalability, flexibility and lower costs of maintenance have essentially contributed to the growing popularity of cloud computing in all spheres of life, especially in business. In fact cloud computing offers even more than this. With usage of virtual computing clusters a runtime environment for high performance computing can be efficiently implemented also in a cloud. There are many advantages but also some disadvantages of cloud computing, some ...

  19. High-performance computing — an overview

    Marksteiner, Peter

    1996-08-01

    An overview of high-performance computing (HPC) is given. Different types of computer architectures used in HPC are discussed: vector supercomputers, high-performance RISC processors, various parallel computers like symmetric multiprocessors, workstation clusters, massively parallel processors. Software tools and programming techniques used in HPC are reviewed: vectorizing compilers, optimization and vector tuning, optimization for RISC processors; parallel programming techniques like shared-memory parallelism, message passing and data parallelism; and numerical libraries.

  20. Energy harvesting in high voltage measuring techniques

    Żyłka, Pawel; Doliński, Marcin

    2016-01-01

    The paper discusses selected problems related to the application of energy harvesting (that is, generating electricity from surplus energy present in the environment) to supply autonomous ultra-low-power measurement systems applicable in high voltage engineering. As a practical example of such an implementation, a laboratory model of a remote temperature sensor is presented, which is self-powered by heat generated in a current-carrying busbar in HV switchgear. The presented system exploits a thermoelectric harvester based on a passively cooled Peltier module supplying a micro-power low-voltage dc-dc converter, which drives an energy-efficient temperature sensor, a microcontroller and a fibre-optic transmitter. The performance of the model under laboratory-simulated conditions is presented and discussed. (paper)
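
    A back-of-envelope estimate shows the order of magnitude of power such a harvester can deliver into a matched load. The Seebeck coefficient, internal resistance and temperature difference below are assumed values, not figures from the paper:

```python
# Back-of-envelope estimate of harvested power from a Peltier module under a
# matched resistive load. S, R and dT are assumed values, not from the paper.
S_module = 0.05      # module Seebeck coefficient [V/K]
R_internal = 2.0     # module internal resistance [ohm]
dT = 10.0            # temperature difference across the module [K]

v_open = S_module * dT                      # open-circuit voltage
p_matched = v_open ** 2 / (4 * R_internal)  # maximum power transfer into matched load
print(f"V_oc = {v_open:.2f} V, P_max ~ {p_matched * 1000:.1f} mW")
```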

  1. Governance among Malaysian high performing companies

    Asri Marsidi

    2016-07-01

    Well-performing companies have always been linked with effective governance, which is generally reflected through an effective board of directors. However, many issues concerning the attributes of an effective board of directors remain unresolved. Nowadays, diversity is perceived as being able to influence corporate performance because of the likelihood of meeting the variety of needs and demands of diverse customers and clients. The study therefore aims to provide a fundamental understanding of governance among high performing companies in Malaysia.

  2. High-performance OPCPA laser system

    Zuegel, J.D.; Bagnoud, V.; Bromage, J.; Begishev, I.A.; Puth, J.

    2006-01-01

    Optical parametric chirped-pulse amplification (OPCPA) is ideally suited for amplifying ultra-fast laser pulses since it provides broadband gain across a wide range of wavelengths without many of the disadvantages of regenerative amplification. A high-performance OPCPA system has been demonstrated as a prototype for the front end of the OMEGA Extended Performance (EP) Laser System. (authors)

  3. High-performance OPCPA laser system

    Zuegel, J.D.; Bagnoud, V.; Bromage, J.; Begishev, I.A.; Puth, J. [Rochester Univ., Lab. for Laser Energetics, NY (United States)

    2006-06-15

    Optical parametric chirped-pulse amplification (OPCPA) is ideally suited for amplifying ultra-fast laser pulses since it provides broadband gain across a wide range of wavelengths without many of the disadvantages of regenerative amplification. A high-performance OPCPA system has been demonstrated as a prototype for the front end of the OMEGA Extended Performance (EP) Laser System. (authors)

  4. Comparing Dutch and British high performing managers

    Waal, A.A. de; Heijden, B.I.J.M. van der; Selvarajah, C.; Meyer, D.

    2016-01-01

    National cultures have a strong influence on the performance of organizations and should be taken into account when studying the traits of high performing managers. At the same time, many studies that focus upon the attributes of successful managers show that there are attributes that are similar

  5. Sampling Approaches for Multi-Domain Internet Performance Measurement Infrastructures

    Calyam, Prasad

    2014-09-15

    The next-generation of high-performance networks being developed in DOE communities are critical for supporting current and emerging data-intensive science applications. The goal of this project is to investigate multi-domain network status sampling techniques and tools to measure/analyze performance, and thereby provide “network awareness” to end-users and network operators in DOE communities. We leverage the infrastructure and datasets available through perfSONAR, which is a multi-domain measurement framework that has been widely deployed in high-performance computing and networking communities; the DOE community is a core developer and the largest adopter of perfSONAR. Our investigations include development of semantic scheduling algorithms, measurement federation policies, and tools to sample multi-domain and multi-layer network status within perfSONAR deployments. We validate our algorithms and policies with end-to-end measurement analysis tools for various monitoring objectives such as network weather forecasting, anomaly detection, and fault-diagnosis. In addition, we develop a multi-domain architecture for an enterprise-specific perfSONAR deployment that can implement monitoring-objective based sampling and that adheres to any domain-specific measurement policies.

  6. High Performance Work Systems for Online Education

    Contacos-Sawyer, Jonna; Revels, Mark; Ciampa, Mark

    2010-01-01

    The purpose of this paper is to identify the key elements of a High Performance Work System (HPWS) and explore the possibility of implementation in an online institution of higher learning. With the projected rapid growth of the demand for online education and its importance in post-secondary education, providing high quality curriculum, excellent…

  7. Teacher Accountability at High Performing Charter Schools

    Aguirre, Moises G.

    2016-01-01

    This study will examine the teacher accountability and evaluation policies and practices at three high performing charter schools located in San Diego County, California. Charter schools are exempted from many laws, rules, and regulations that apply to traditional school systems. By examining the teacher accountability systems at high performing…

  8. CONSIDERATIONS ON MEASURING PERFORMANCE AND MARKET STRUCTURE

    Spiridon Cosmin Alexandru

    2011-01-01

    According to neoclassical theory, the relationship between price (or marginal cost) and market structure, and thus the methods for determining the performance of a firm or of an industry, deviate from the model of perfect competition. Assessing performance involves making comparisons against a reference level, which can be a standard value or a statistical value such as a national or regional average, the average of a homogeneous group, or an average value at the market level. Modern theor...

  9. High Power Flex-Propellant Arcjet Performance

    Litchford, Ron J.

    2011-01-01

    A MW-class electrothermal arcjet based on a water-cooled, wall-stabilized, constricted arc discharge configuration was subjected to extensive performance testing using hydrogen and simulated ammonia propellants with the deliberate aim of advancing technology readiness level for potential space propulsion applications. The breadboard design incorporates alternating conductor/insulator wafers to form a discharge barrel enclosure with a 2.5-cm internal bore diameter and an overall length of approximately 1 meter. Swirling propellant flow is introduced into the barrel, and a DC arc discharge mode is established between a backplate tungsten cathode button and a downstream ringanode/ spin-coil assembly. The arc-heated propellant then enters a short mixing plenum and is accelerated through a converging-diverging graphite nozzle. This innovative design configuration differs substantially from conventional arcjet thrusters, in which the throat functions as constrictor and the expansion nozzle serves as the anode, and permits the attainment of an equilibrium sonic throat (EST) condition. During the test program, applied electrical input power was varied between 0.5-1 MW with hydrogen and simulated ammonia flow rates in the range of 4-12 g/s and 15-35 g/s, respectively. The ranges of investigated specific input energy therefore fell between 50-250 MJ/kg for hydrogen and 10-60 MJ/kg for ammonia. In both cases, observed arc efficiencies were between 40-60 percent as determined via a simple heat balance method based on electrical input power and coolant water calorimeter measurements. These experimental results were found to be in excellent agreement with theoretical chemical equilibrium predictions, thereby validating the EST assumption and enabling the utilization of standard TDK nozzle expansion analyses to reliably infer baseline thruster performance characteristics. Inferred specific impulse performance accounting for recombination kinetics during the expansion process
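
    The quoted specific-input-energy ranges follow directly from the applied power divided by the propellant mass flow rate, and the arc efficiency from the heat balance between electrical input power and the coolant calorimetry. A quick check (the calorimetry numbers below are illustrative, not measured values):

```python
# Back-of-envelope check of the quoted specific-input-energy ranges and the
# heat-balance arc efficiency described in the abstract.
def specific_energy_MJ_per_kg(power_MW, mdot_g_s):
    return power_MW * 1e6 / (mdot_g_s * 1e-3) / 1e6   # J/kg converted to MJ/kg

print(f"H2 : {specific_energy_MJ_per_kg(0.5, 12):.0f}-"
      f"{specific_energy_MJ_per_kg(1.0, 4):.0f} MJ/kg")    # roughly the 50-250 range
print(f"NH3: {specific_energy_MJ_per_kg(0.5, 35):.0f}-"
      f"{specific_energy_MJ_per_kg(1.0, 15):.0f} MJ/kg")   # roughly the 10-60 range

# heat-balance arc efficiency: fraction of input power not carried off by the coolant
p_in_kW, p_coolant_kW = 800.0, 360.0                        # illustrative calorimetry
print("arc efficiency ~", round(1 - p_coolant_kW / p_in_kW, 2))
```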

  10. Advanced high performance solid wall blanket concepts

    Wong, C.P.C.; Malang, S.; Nishio, S.; Raffray, R.; Sagara, A.

    2002-01-01

    First wall and blanket (FW/blanket) design is a crucial element in the performance and acceptance of a fusion power plant. High temperature structural and breeding materials are needed for high thermal performance. A suitable combination of structural design with the selected materials is necessary for D-T fuel sufficiency. Whenever possible, low afterheat, low chemical reactivity and low activation materials are desired to achieve passive safety and minimize the amount of high-level waste. Of course the selected fusion FW/blanket design will have to match the operational scenarios of high performance plasma. The key characteristics of eight advanced high performance FW/blanket concepts are presented in this paper. Design configurations, performance characteristics, unique advantages and issues are summarized. All reviewed designs can satisfy most of the necessary design goals. For further development, in concert with the advancement in plasma control and scrape off layer physics, additional emphasis will be needed in the areas of first wall coating material selection, design of plasma stabilization coils, consideration of reactor startup and transient events. To validate the projected performance of the advanced FW/blanket concepts the critical element is the need for 14 MeV neutron irradiation facilities for the generation of necessary engineering design data and the prediction of FW/blanket components lifetime and availability

  11. Measuring individual work performance: Identifying and selecting indicators

    Koopmans, L.; Bernaards, C.M.; Hildebrandt, V.H.; de Vet, H.C.W.; van der Beek, A.J.

    2014-01-01

    BACKGROUND: Theoretically, individual work performance (IWP) can be divided into four dimensions: task performance, contextual performance, adaptive performance, and counterproductive work behavior. However, there is no consensus on the indicators used to measure these dimensions.

  12. Measuring individual work performance: identifying and selecting indicators

    Koopmans, L.; Bernaards, C.M.; Hildebrandt, V.H.; Vet, H.C de; Beek, A.J. van der

    2014-01-01

    BACKGROUND: Theoretically, individual work performance (IWP) can be divided into four dimensions: task performance, contextual performance, adaptive performance, and counterproductive work behavior. However, there is no consensus on the indicators used to measure these dimensions. OBJECTIVE: This

  13. Developing integrated benchmarks for DOE performance measurement

    Barancik, J.I.; Kramer, C.F.; Thode, Jr. H.C.

    1992-09-30

    The objectives of this task were to describe and evaluate selected existing sources of information on occupational safety and health, with emphasis on hazard and exposure assessment, abatement, training, reporting, and control, identifying exposure and outcome factors in preparation for developing DOE performance benchmarks. Existing resources and methodologies were assessed for their potential use as practical performance benchmarks. Strengths and limitations of current data resources were identified. Guidelines were outlined for developing new or improved performance factors, which then could become the basis for selecting performance benchmarks. Data bases for non-DOE comparison populations were identified so that DOE performance could be assessed relative to non-DOE occupational and industrial groups. Systems approaches were described which can be used to link hazards and exposure, event occurrence, and adverse outcome factors, as needed to generate valid, reliable, and predictive performance benchmarks. Data bases were identified which contain information relevant to one or more performance assessment categories. A list of 72 potential performance benchmarks was prepared to illustrate the kinds of information that can be produced through a benchmark development program. Current information resources which may be used to develop potential performance benchmarks are limited. There is a need to develop an occupational safety and health information and data system in DOE, which is capable of incorporating demonstrated and documented performance benchmarks prior to, or concurrent with, the development of hardware and software. A key to the success of this systems approach is rigorous development and demonstration of performance benchmark equivalents to users of such data before system hardware and software commitments are institutionalized.

  14. Measuring Student Performance in General Organic Chemistry

    Austin, Ara C.; Ben-Daat, Hagit; Zhu, Mary; Atkinson, Robert; Barrows, Nathan; Gould, Ian R.

    2015-01-01

    Student performance in general organic chemistry courses is determined by a wide range of factors including cognitive ability, motivation and cultural capital. Previous work on cognitive factors has tended to focus on specific areas rather than exploring performance across all problem types and cognitive skills. In this study, we have categorized…

  15. Contracts, Performance Measurement and Accountability in the Public Sector

    Drewry, Gavin; Greve, Carsten; Tanquerel, Thierry

    This book addresses issues to do with public accountability, audit and performance measurement that are both highly topical and of crucial importance to the theory and practice of public administration in an era of contractualized public management. The literature on public sector contracting - covering both 'hard' agreements (ones that are legally enforceable) and 'soft' agreements (enforced by negotiation and mutual trust) - has been growing for some time, and the present book adds a primarily European perspective on contracting, performance-based management and accountability. ...... of audit and accountability in a variety of countries and contexts; the third part offers some wider, cross-cutting perspectives. Based on the work of the EGPA permanent study group on the history of contractualization, Contracts, Performance Measurement and Accountability in the Public Sector draws upon...... One important...

  16. Standardization of test conditions for gamma camera performance measurement

    Jordan, K.

    1980-01-01

    The current way of measuring gamma camera performance is to use point sources or flood sources in air, often in combination with bar phantoms. This method mostly yields the best performance parameters for cameras, but it has nothing in common with the use of a camera in clinical practice. Particularly in the case of low energy emitters, like Tc-99m, the influence of scattered radiation on the performance of cameras is very high. Therefore it is important to have test conditions for radionuclide imaging devices that approach as closely as practicable the measuring conditions in clinical applications. It is therefore good news that the International Electrotechnical Commission (IEC) has prepared a draft 'Characteristics and test conditions of radionuclide imaging devices', which is now submitted to the national committees for formal approval under the Six Months' Rule. Some essential points of this document are discussed in the paper. (orig.) [de]

  17. Availability of high quality weather data measurements

    Andersen, Elsa; Johansen, Jakob Berg; Furbo, Simon

    In the period 2016-2017 the project “Availability of high quality weather data measurements” is being carried out at the Department of Civil Engineering at the Technical University of Denmark. The aim of the project is to establish measured high quality weather data which will be easily available for the building energy branch and the solar energy branch in their efforts to achieve energy savings, and for researchers and students carrying out projects where measured high quality weather data are needed.

  18. Frictional behaviour of high performance fibrous tows: Friction experiments

    Cornelissen, Bo; Rietman, Bert; Akkerman, Remko

    2013-01-01

    Tow friction is an important mechanism in the production and processing of high performance fibrous tows. The frictional behaviour of these tows is anisotropic due to the texture of the filaments as well as the tows. This work describes capstan experiments that were performed to measure the

  19. A high performance electrometer amplifier of hybrid design

    Rao, N.V.; Nazare, C.K.

    1979-01-01

    A high performance, reliable electrometer amplifier of hybrid design for low current measurements in mass spectrometers has been developed. The short term instability with a 5 x 10^11 ohm input resistor is less than 1 x 10^-15 A. The drift is better than 1 mV/hour. The design steps are illustrated with typical amplifier performance details. (auth.)

  20. High resolution simultaneous measurements of airborne radionuclides

    Abe, T.; Yamaguchi, Y.; Tanaka, K.; Komura, K.

    2006-01-01

    High resolution (2-3 hrs) simultaneous measurements of the airborne radionuclides 212Pb, 210Pb and 7Be have been performed using extremely low background Ge detectors at the Ogoya Underground Laboratory. We have measured the above radionuclides at three monitoring points: 1) the Low Level Radioactivity Laboratory (LLRL), Kanazawa University; 2) Shishiku Plateau (640 m MSL), located about 8 km from LLRL, to investigate vertical differences in activity levels; and 3) Hegura Island (10 m MSL), located about 50 km from the Noto Peninsula in the Sea of Japan, to evaluate the influence of the Asian continent or the mainland of Japan on the variation of the activity levels. Variations of the short-lived 212Pb concentration showed noticeable time lags between LLRL and Shishiku Plateau. These time lags might be caused by changes in the height of the planetary boundary layer. In contrast, variations of the long-lived 210Pb and 7Be were simultaneous at the three locations because these concentrations are homogeneous over the whole area. (author)

  1. Performance measures for public transit mobility management.

    2011-12-01

    "Mobility management is an innovative approach for managing and delivering coordinated public : transportation services that embraces the full family of public transit options. At a national level, there are : currently no industry recognized perform...

  2. Measured performance of the GTA rf systems

    Denney, P.M.; Jachim, S.P.

    1993-01-01

    This paper describes the performance of the RF systems on the Ground Test Accelerator (GTA). The RF system architecture is briefly described. Among the RF performance results presented are RF field flatness and stability, amplitude and phase control resolution, and control system bandwidth and stability. The rejection by the RF systems of beam-induced disturbances, such as transients and noise, is analyzed. The observed responses are also compared to computer-based simulations of the RF systems for validation.

  3. Performance Measurement Systems in Swedish Health Care Services

    Kollberg, Beata

    2007-01-01

    In the quality management literature, measurements are attributed great importance in improving products and processes. Systems for performance measurement assessing financial and non-financial measurements were developed in the late 1980s and early 1990s. The research on performance measurement systems has mainly been focused on the design of different performance measurement systems. Many authors are occupied with the study of the constructs of measures and developing prescriptive models of...

  4. ESRD - Clinical Performance Measures (CPM) Project

    U.S. Department of Health & Human Services — Section 4558 (b) of the Balanced Budget Act (BBA) requires CMS to develop and implement by January 1, 2000, a method to measure and report the quality of renal...

  5. Measuring collections effort improves cash performance.

    Shutts, Joe

    2009-09-01

    Having a satisfied work force can lead to an improved collections effort. Hiring the right people and training them ensures employee engagement. Measuring collections effort and offering incentives is key to revenue cycle success.

  6. Validation of an assay for quantification of free normetanephrine, metanephrine and methoxytyramine in plasma by high performance liquid chromatography with coulometric detection: Comparison of peak-area vs. peak-height measurements.

    Nieć, Dawid; Kunicki, Paweł K

    2015-10-01

    Measurements of plasma concentrations of free normetanephrine (NMN), metanephrine (MN) and methoxytyramine (MTY) constitute the most diagnostically accurate screening test for pheochromocytomas and paragangliomas. The aim of this article is to present the results from a validation of an analytical method utilizing high performance liquid chromatography with coulometric detection (HPLC-CD) for quantifying plasma free NMN, MN and MTY. Additionally, peak integration by height and by area, and the use of one calibration curve for all batches or an individual calibration curve for each batch of samples, were explored to determine the optimal approach with regard to accuracy and precision. The method was validated using charcoal-stripped plasma spiked with solutions of NMN, MN, MTY and internal standard (4-hydroxy-3-methoxybenzylamine), with the exception of selectivity, which was evaluated by analysis of real plasma samples. Calibration curve performance, accuracy, precision and recovery were determined following both peak-area and peak-height measurements and the obtained results were compared. The most accurate and precise method of calibration was evaluated by analyzing quality control samples at three concentration levels in 30 analytical runs. The detector response was linear over the entire tested concentration range from 10 to 2000 pg/mL with R² ≥ 0.9988. The LLOQ was 10 pg/mL for each analyte of interest. To improve accuracy for measurements at low concentrations, a weighted (1/amount) linear regression model was employed, which resulted in inaccuracies of -2.48 to 9.78% and 0.22 to 7.81% following peak-area and peak-height integration, respectively. The imprecisions ranged from 1.07 to 15.45% and from 0.70 to 11.65% for peak-area and peak-height measurements, respectively. The optimal approach to calibration was the one utilizing an individual calibration curve for each batch of samples and peak-height measurements. It was characterized by inaccuracies ranging from -3…
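
    To illustrate the weighted (1/amount) calibration model described above, the following minimal Python sketch fits a 1/x-weighted linear calibration curve and back-calculates a quality-control sample. It is an assumption-laden illustration, not the authors' code; all concentrations and responses are hypothetical placeholders.

        import numpy as np

        # Hypothetical calibration standards (pg/mL) and detector responses (peak heights).
        conc = np.array([10., 50., 100., 250., 500., 1000., 2000.])
        resp = np.array([0.9, 4.8, 10.1, 24.7, 50.3, 99.2, 201.5])

        # np.polyfit applies weights to the residuals before squaring, so passing
        # sqrt(1/x) yields 1/amount (1/x) weighted least squares.
        w = np.sqrt(1.0 / conc)
        slope, intercept = np.polyfit(conc, resp, 1, w=w)

        def back_calculate(response):
            # Convert a measured response into a concentration via the calibration line.
            return (response - intercept) / slope

        qc_nominal = 75.0                 # hypothetical QC level, pg/mL
        qc_found = back_calculate(7.6)    # hypothetical QC response
        print(f"found {qc_found:.1f} pg/mL, "
              f"inaccuracy {100 * (qc_found - qc_nominal) / qc_nominal:+.1f}%")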

  7. High performance bio-integrated devices

    Kim, Dae-Hyeong; Lee, Jongha; Park, Minjoon

    2014-06-01

    In recent years, personalized electronics for medical applications have attracted much attention with the rise of smartphones, because the coupling of such devices and smartphones enables continuous health monitoring in patients' daily lives. In particular, high performance biomedical electronics integrated with the human body are expected to open new opportunities in ubiquitous healthcare. However, the mechanical and geometrical constraints inherent in all standard forms of high performance rigid wafer-based electronics raise unique integration challenges with biotic entities. Here, we describe materials and design constructs for high performance skin-mountable bio-integrated electronic devices, which incorporate arrays of single crystalline inorganic nanomembranes. The resulting electronic devices include flexible and stretchable electrophysiology electrodes and sensors coupled with active electronic components. These advances in bio-integrated systems create new directions in personalized health monitoring and/or human-machine interfaces.

  8. Performance measurement system for training simulators. Interim report

    Bockhold, G. Jr.; Roth, D.R.

    1978-05-01

    In the first project phase, the project team has designed, installed, and test run on the Browns Ferry nuclear power plant training simulator a performance measurement system capable of automatic recording of statistical information on operator actions and plant response. Key plant variables and operator actions were monitored and analyzed by the simulator computer for a selected set of four operating and casualty drills. The project has the following objectives: (1) To provide an empirical data base for statistical analysis of operator reliability and for allocation of safety and control functions between operators and automated controls; (2) To develop a method for evaluation of the effectiveness of control room designs and operating procedures; and (3) To develop a system for scoring aspects of operator performance to assist in training evaluations and to support operator selection research. The performance measurement system has shown potential for meeting the research objectives. However, the cost of training simulator time is high; to keep research program costs reasonable, the measurement system is being designed to be an integral part of operator training programs. In the pilot implementation, participating instructors judged the measurement system to be a valuable and objective extension of their abilities to monitor trainee performance

  9. Designing a High Performance Parallel Personal Cluster

    Kapanova, K. G.; Sellier, J. M.

    2016-01-01

    Today, many scientific and engineering areas require high performance computing to perform computationally intensive experiments. For example, many advances in transport phenomena, thermodynamics, material properties, computational chemistry and physics are possible only because of the availability of such large scale computing infrastructures. Yet many challenges are still open. The cost of energy consumption, cooling, competition for resources have been some of the reasons why the scientifi...

  10. vSphere high performance cookbook

    Sarkar, Prasenjit

    2013-01-01

    vSphere High Performance Cookbook is written in a practical, helpful style with numerous recipes focusing on answering and providing solutions to common, and not-so-common, performance issues and problems. The book is primarily written for technical professionals with system administration skills and some VMware experience who wish to learn about advanced optimization and the configuration features and functions for vSphere 5.1.

  11. High performance parallel I/O

    Prabhat

    2014-01-01

    Gain Critical Insight into the Parallel I/O Ecosystem. Parallel I/O is an integral component of modern high performance computing (HPC), especially in storing and processing very large datasets to facilitate scientific discovery. Revealing the state of the art in this field, High Performance Parallel I/O draws on insights from leading practitioners, researchers, software architects, developers, and scientists who shed light on the parallel I/O ecosystem. The first part of the book explains how large-scale HPC facilities scope, configure, and operate systems, with an emphasis on choices of I/O har...

  12. Economic Adversity Transitions From Childhood to Older Adulthood Are Differentially Associated With Later-Life Physical Performance Measures in Men and Women in Middle and High-Income Sites.

    Hwang, Phoebe W; Dos Santos Gomes, Cristiano; Auais, Mohammad; Braun, Kathryn L; Guralnik, Jack M; Pirkle, Catherine M

    2017-10-01

    This study examines the relationship between economic adversity transitions from childhood to older adulthood and older adulthood physical performance among 1,998 community-dwelling older adults from five demographically diverse sites in middle and high-income countries. The principal exposure variable was the economic adversity transition: "no adversity" encompassed not experiencing poverty in either childhood or older adulthood, "improved" described having experienced poverty only in childhood, "worsened" captured having experienced poverty only in older adulthood, and "severe" described having experienced poverty in both childhood and older adulthood. The Short Physical Performance Battery (SPPB) was used for the outcome measures. Analyses of the continuous SPPB score used linear regression, while analysis of a binary outcome (SPPB < 8 vs. ≥8) used Poisson regression models with robust error variance, both adjusting for sex, education, and site location. In sex-stratified models, the SPPB < 8 prevalence rate ratio (PRR) was higher for the severe (PRR: 2.80, 95% confidence interval [CI] = [1.70, 4.61]), worsened (PRR: 2.40, 95% CI = [1.41, 4.09]), and improved (PRR: 1.82, 95% CI = [1.11, 3.01]) groups, compared with those with no adversity in childhood or as adults, but only for females. Findings from this study indicate that persistent economic adversity has a negative effect on older adult physical performance, especially among women.
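
    A minimal sketch of the modelling approach named above (Poisson regression with robust error variance for a binary SPPB outcome, reported as prevalence rate ratios) is shown below using the statsmodels library. The data frame is simulated placeholder data, not the study data, and the model is an illustration rather than the authors' analysis code.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Simulated placeholder data: binary outcome (1 if SPPB < 8) and covariates.
        rng = np.random.default_rng(0)
        n = 500
        df = pd.DataFrame({
            "low_sppb": rng.integers(0, 2, n),
            "adversity": rng.choice(["none", "improved", "worsened", "severe"], n),
            "education": rng.integers(0, 13, n),
            "site": rng.choice(["A", "B", "C", "D", "E"], n),
        })

        model = smf.glm(
            "low_sppb ~ C(adversity, Treatment(reference='none')) + education + C(site)",
            data=df, family=sm.families.Poisson())
        fit = model.fit(cov_type="HC0")   # robust ("sandwich") error variance
        print(np.exp(fit.params))         # prevalence rate ratios (PRR)
        print(np.exp(fit.conf_int()))     # 95% confidence intervals for the PRRs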

  13. 10 ps resolution, 160 ns full scale range and less than 1.5% differential non-linearity time-to-digital converter module for high performance timing measurements

    Markovic, B.; Tamborini, D.; Villa, F.; Tisa, S.; Tosi, A.; Zappa, F. [Politecnico di Milano, Dipartimento di Elettronica e Informazione, Piazza Leonardo da Vinci 32, 20133 Milano (Italy)

    2012-07-15

    We present a compact high performance time-to-digital converter (TDC) module that provides 10 ps timing resolution, a 160 ns dynamic range and a differential non-linearity better than 1.5% LSB rms. The TDC can be operated either as a general-purpose time-interval measurement device, when receiving external START and STOP pulses, or in photon-timing mode, when employing the on-chip SPAD (single photon avalanche diode) detector for detecting photons and time-tagging them. The instrument precision is 15 ps rms (i.e., 36 ps FWHM), and in photon-timing mode it is still better than 70 ps FWHM. The USB link to the remote PC allows easy setting of measurement parameters, fast download of acquired data, and their visualization and storage via a user-friendly software interface. The module proves to be the best candidate for a wide variety of applications such as: fluorescence lifetime imaging, time-of-flight ranging measurements, time-resolved positron emission tomography, single-molecule spectroscopy, fluorescence correlation spectroscopy, diffuse optical tomography, optical time-domain reflectometry, quantum optics, etc.
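
    To put the quoted figures in context (using standard converter definitions, not formulas taken from the paper): a 160 ns full-scale range at a 10 ps least significant bit corresponds to

        $$N_{\mathrm{codes}} = \frac{160\ \mathrm{ns}}{10\ \mathrm{ps}} = 16\,000, \qquad \mathrm{DNL}_k = \frac{W_k - W_{\mathrm{LSB}}}{W_{\mathrm{LSB}}},$$

    where $W_k$ is the measured width of code bin $k$ and $W_{\mathrm{LSB}}$ is the nominal 10 ps bin width; the quoted 1.5% LSB rms is then the rms value of $\mathrm{DNL}_k$ over all bins.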

  14. Pressurized planar electrochromatography, high-performance thin-layer chromatography and high-performance liquid chromatography--comparison of performance.

    Płocharz, Paweł; Klimek-Turek, Anna; Dzido, Tadeusz H

    2010-07-16

    Kinetic performance, measured by plate height, of High-Performance Thin-Layer Chromatography (HPTLC), High-Performance Liquid Chromatography (HPLC) and Pressurized Planar Electrochromatography (PPEC) was compared for systems with the adsorbent of the HPTLC RP18W plate from Merck as the stationary phase and a mobile phase composed of acetonitrile and buffer solution. The HPLC column was packed with adsorbent scraped from the chromatographic plate mentioned. An additional HPLC column was packed with an adsorbent of 5 µm particle diameter, C18-type silica based (LiChrosorb RP-18 from Merck). The dependence of plate height on the flow velocity of the mobile phase in both the HPLC and PPEC separating systems, and on the migration distance of the mobile phase in the TLC system, was presented using a test solute (prednisolone succinate). The highest performance among the systems investigated was obtained for the PPEC system. The separation efficiency of the investigated systems was additionally confirmed by the separation of a test mixture composed of six hormones.
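
    For reference, the plate height used as the performance metric above is conventionally derived from the separation length and the plate number estimated from the peak shape; these are the standard chromatographic definitions, not values from the study:

        $$H = \frac{L}{N}, \qquad N = 5.54\left(\frac{t_R}{w_{1/2}}\right)^{2},$$

    where $L$ is the column length (or solute migration distance in the planar systems), $t_R$ the retention time (or migration distance), and $w_{1/2}$ the peak width at half height.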

  15. Reduction in the balanced scorecard performance measurement ...

    In this paper, we compare PCA and ordinal logistic regression in ranking the manufacturing systems. In this regard we present an integrated framework for assessment and ranking of manufacturing systems based on management and organizational performance indicators. To achieve the objectives of this study, ...

  16. Measuring Institutional Performance in Higher Education.

    Meyerson, Joel W., Ed.; Massy, William F., Ed.

    This collection of seven essays from the Stanford Forum for Higher Education Futures focuses on how downsizing, quality management, and reengineering are affecting higher education. An introductory paper, "Introduction: Change in Higher Education: Its Effect on Institutional Performance," (Joel W. Meyerson and Sandra L. Johnson)…

  17. 75 FR 38725 - Service Performance Measurement

    2010-07-06

    ... proposed rules against the importance of the information that is being gathered. This would have provided... Commission is adopting a final rule on service performance measurement and customer satisfaction. The final... is effective on August 5, 2010. FOR FURTHER INFORMATION CONTACT: Stephen L. Sharfman, General Counsel...

  18. CONFOCAL MICROSCOPY SYSTEM PERFORMANCE: LASER POWER MEASUREMENTS

    The reliability of the confocal laser-scanning microscope (CLSM) to obtain intensity measurements and quantify fluorescence data is dependent on using a correctly aligned machine that contains a stable laser power. The laser power test appears to be one ...

  19. High Energy Measurement of the Deuteron Photodisintegration Differential Cross Section

    Schulte, Elaine [Univ. of Illinois, Urbana-Champaign, IL (United States)]

    2002-05-01

    New measurements of the high energy deuteron photodisintegration differential cross section were made at the Thomas Jefferson National Accelerator Facility in Newport News, Virginia. Two experiments were performed. Experiment E96-003 was performed in experimental Hall C. The measurements were designed to extend the highest energy differential cross section values to 5.5 GeV incident photon energy at forward angles. This builds upon previous high energy measurements in which scaling consistent with the pQCD constituent counting rules was observed at 90 degrees and 70 degrees in the center of mass. From the new measurements, a threshold for the onset of constituent counting rule scaling seems present at transverse momentum approximately 1.3 GeV/c. The second experiment, E99-008, was performed in experimental Hall A. The measurements were designed to explore the angular distribution of the differential cross section at constant energy. The measurements were made symmetric about 90 degrees

  20. Performance Measurement using KPKU- BUMN in X School Education Foundation

    Arijanto, Sugih; Harsono, Ambar; Taroepratjeka, Harsono

    2016-01-01

    The purpose of this research is to determine X School's strengths and opportunities for improvement through performance measurement using KPKU-BUMN (Kriteria Penilaian Kinerja Unggul - Kementerian Badan Usaha Milik Negara). KPKU-BUMN is developed based on the Malcolm Baldrige Criteria for Performance Excellence (MBCfPE). X School is an education foundation in Bandung that provides education from kindergarten and elementary school to junior and senior high school. The measurement covers two aspects, Process and Result. The Process is measured with the A-D-L-I approach (Approach-Deployment-Learning-Integration), while the Result is measured with the Le-T-C-I approach (Level-Trend-Comparison-Integration). Six process categories are measured: (1) Leadership, (2) Strategic Planning, (3) Customer Focus, (4) Measurement, Analysis and Knowledge Management, (5) Workforce Focus, and (6) Operations Focus. The results comprise (a) product & process outcomes, (b) customer-focused outcomes, (c) workforce-focused outcomes, (d) leadership & governance outcomes, and (e) financial & market outcomes. The overall score for X School is 284/1000, which places X School at the “early result” level with a “poor” global image.

  1. Strategy Guideline: Partnering for High Performance Homes

    Prahl, D.

    2013-01-01

    High performance houses require a high degree of coordination and have significant interdependencies between various systems in order to perform properly, meet customer expectations, and minimize risks for the builder. Responsibility for the key performance attributes is shared across the project team and can be well coordinated through advanced partnering strategies. For high performance homes, traditional partnerships need to be matured to the next level and be expanded to all members of the project team including trades, suppliers, manufacturers, HERS raters, designers, architects, and building officials as appropriate. In an environment where the builder is the only source of communication between trades and consultants and where relationships are, in general, adversarial as opposed to cooperative, the chances that any one building system will fail are greater. Furthermore, it is much harder for the builder to identify and capitalize on synergistic opportunities. Partnering can help bridge the cross-functional aspects of the systems approach and achieve performance-based criteria. Critical success factors for partnering include support from top management, mutual trust, effective and open communication, effective coordination around common goals, team building, appropriate use of an outside facilitator, a partnering charter, progress toward common goals, an effective problem-solving process, long-term commitment, continuous improvement, and a positive experience for all involved.

  2. Evaluating performance measures to determine training effectiveness

    Klemm, R.W.; Feiza, A.S.

    1987-01-01

    This research was conceived and dedicated to helping the CECo training organization become a more integrated part of the corporate business. The target population for this study was nuclear and fossil generating station employees who directly impacted the production of electricity. The target sample (n = 150) included: instrument, mechanical, and electrical maintenance personnel; control room operators; engineers, radiation chemists, and other technical specialists; and equipment operators and attendants. A total of four instruments were utilized by this study. Three instruments were administered to the generating station personnel. These included a demographic form, a learning style profile, and a motivational style profile. The focal instrument, a performance skills rating form, was administered to supervisory personnel. Data analysis consisted of three major parts. Part one established internal consistency through Cronbach alpha statistics. Part two provides summary statistics and breakdown tables for important variables. Part three provides inferential statistics responding to the research questions. All six Performance Skills variables discriminated significantly between the trained and non-trained groups (p < .001). In all cases, the mean value for the trained group exceeded the mean value for the non-trained group. Implications for further research indicate that training does have a quantifiable effect on job performance.

  3. 26 CFR 801.2 - Measuring organizational performance.

    2010-04-01

    26 CFR 801.2 (2010-04-01 edition), Internal Revenue Service, Department of the Treasury: Measuring organizational performance. The performance measures that comprise the...

  4. Long-term bridge performance high priority bridge performance issues.

    2014-10-01

    Bridge performance is a multifaceted issue involving performance of materials and protective systems, performance of individual components of the bridge, and performance of the structural system as a whole. The Long-Term Bridge Performance (LTBP)...

  5. Validated High Performance Liquid Chromatography Method for ...

    Purpose: To develop a simple, rapid and sensitive high performance liquid chromatography (HPLC) method for the determination of cefadroxil monohydrate in human plasma. Methods: A Shimadzu HPLC with LC Solution software was used with a Waters Spherisorb C18 (5 μm, 150 mm × 4.5 mm) column. The mobile phase ...

  6. An Introduction to High Performance Fortran

    John Merlin

    1995-01-01

    High Performance Fortran (HPF) is an informal standard for extensions to Fortran 90 to assist its implementation on parallel architectures, particularly for data-parallel computation. Among other things, it includes directives for specifying data distribution across multiple memories, and concurrent execution features. This article provides a tutorial introduction to the main features of HPF.

  7. High performance computing on vector systems

    Roller, Sabine

    2008-01-01

    Presents the developments in high-performance computing and simulation on modern supercomputer architectures. This book covers trends in hardware and software development in general and specifically the vector-based systems and heterogeneous architectures. It presents innovative fields like coupled multi-physics or multi-scale simulations.

  8. High Performance Electronics on Flexible Silicon

    Sevilla, Galo T.

    2016-09-01

    Over the last few years, flexible electronic systems have gained increased attention from researchers around the world because of their potential to create new applications such as flexible displays, flexible energy harvesters, artificial skin, and health monitoring systems that cannot be integrated with conventional wafer based complementary metal oxide semiconductor processes. Most of the current efforts to create flexible high performance devices are based on the use of organic semiconductors. However, inherent material limitations make them unsuitable for big data processing and high speed communications. The objective of my doctoral dissertation is to develop integration processes that allow the transformation of rigid high performance electronics into flexible ones while maintaining their performance and cost. In this work, two different techniques to transform inorganic complementary metal-oxide-semiconductor electronics into flexible ones have been developed using industry compatible processes. Furthermore, these techniques were used to realize flexible discrete devices and circuits which include metal-oxide-semiconductor field-effect-transistors, the first demonstration of flexible Fin-field-effect-transistors, and metal-oxide-semiconductor-based circuits. Finally, this thesis presents a new technique to package, integrate, and interconnect flexible high performance electronics using low cost additive manufacturing techniques such as 3D printing and inkjet printing. This thesis contains in depth studies on electrical, mechanical, and thermal properties of the fabricated devices.

  9. Debugging a high performance computing program

    Gooding, Thomas M.

    2013-08-20

    Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
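
    The grouping step described above can be pictured with a short sketch: collect each thread's list of calling-instruction addresses, then bucket threads whose address lists match, so that small, unusual groups stand out as likely defective threads. The sketch below is a generic Python illustration with made-up addresses, not the patented implementation.

        from collections import defaultdict

        # Hypothetical per-thread lists of calling-instruction addresses,
        # e.g. gathered from a parallel debugger.
        thread_stacks = {
            0: [0x4005d0, 0x400812, 0x400a44],
            1: [0x4005d0, 0x400812, 0x400a44],
            2: [0x4005d0, 0x400812, 0x400a44],
            3: [0x4005d0, 0x4009f0, 0x400c10],   # differs: possibly stuck elsewhere
        }

        # Threads with identical address lists share a group.
        groups = defaultdict(list)
        for tid, stack in thread_stacks.items():
            groups[tuple(stack)].append(tid)

        # Smallest groups first: these are the candidates for defective threads.
        for stack, tids in sorted(groups.items(), key=lambda kv: len(kv[1])):
            print(f"{len(tids)} thread(s) at {[hex(a) for a in stack]}: {tids}")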

  10. Technology Leadership in Malaysia's High Performance School

    Yieng, Wong Ai; Daud, Khadijah Binti

    2017-01-01

    The headmaster, as leader of the school, also plays a role as a technology leader. This applies to high performance school (HPS) headmasters as well. The HPS excel in all aspects of education. In this study, the researcher is interested in examining the role of the headmaster as a technology leader through interviews with three headmasters of high…

  11. Toward High Performance in Industrial Refrigeration Systems

    Thybo, C.; Izadi-Zamanabadi, Roozbeh; Niemann, H.

    2002-01-01

    Achieving high performance in complex industrial systems requires information manipulation at different system levels. The paper shows how different models of same subsystems, but using different quality of information/data, are used for fault diagnosis as well as robust control design...

  12. Towards high performance in industrial refrigeration systems

    Thybo, C.; Izadi-Zamanabadi, R.; Niemann, Hans Henrik

    2002-01-01

    Achieving high performance in complex industrial systems requires information manipulation at different system levels. The paper shows how different models of same subsystems, but using different quality of information/data, are used for fault diagnosis as well as robust control design...

  13. Validated high performance liquid chromatographic (HPLC) method ...

    2010-02-22

    Feb 22, 2010 ... specific and accurate high performance liquid chromatographic method for determination of ZER in micro-volumes ... traditional medicine as a cure for swelling, sores, loss of appetite and ... Receptor Activator for Nuclear Factor κB Ligand ... The effect of ... be suitable for preclinical pharmacokinetic studies. The...

  14. Validated High Performance Liquid Chromatography Method for ...

    Purpose: To develop a simple, rapid and sensitive high performance liquid ... response, tailing factor and resolution of six replicate injections was < 3 %. ... Cefadroxil monohydrate, Human plasma, Pharmacokinetics, Bioequivalence ... Drug-free plasma was obtained from the local ... Influence of probenecid on the renal...

  15. Integrated plasma control for high performance tokamaks

    Humphreys, D.A.; Deranian, R.D.; Ferron, J.R.; Johnson, R.D.; LaHaye, R.J.; Leuer, J.A.; Penaflor, B.G.; Walker, M.L.; Welander, A.S.; Jayakumar, R.J.; Makowski, M.A.; Khayrutdinov, R.R.

    2005-01-01

    Sustaining high performance in a tokamak requires controlling many equilibrium shape and profile characteristics simultaneously with high accuracy and reliability, while suppressing a variety of MHD instabilities. Integrated plasma control, the process of designing high-performance tokamak controllers based on validated system response models and confirming their performance in detailed simulations, provides a systematic method for achieving and ensuring good control performance. For present-day devices, this approach can greatly reduce the need for machine time traditionally dedicated to control optimization, and can allow determination of high-reliability controllers prior to ever producing the target equilibrium experimentally. A full set of tools needed for this approach has recently been completed and applied to present-day devices including DIII-D, NSTX and MAST. This approach has proven essential in the design of several next-generation devices including KSTAR, EAST, JT-60SC, and ITER. We describe the method, results of design and simulation tool development, and recent research producing novel approaches to equilibrium and MHD control in DIII-D. (author)

  16. Project materials [Commercial High Performance Buildings Project

    None

    2001-01-01

    The Consortium for High Performance Buildings (ChiPB) is an outgrowth of DOE's Commercial Whole Buildings Roadmapping initiatives. It is a team-driven public/private partnership that seeks to enable and demonstrate the benefit of buildings that are designed, built and operated to be energy efficient, environmentally sustainable, of superior quality, and cost effective.

  17. High performance structural ceramics for nuclear industry

    Pujari, Vimal K.; Faker, Paul

    2006-01-01

    A family of Saint-Gobain structural ceramic materials and products produced by its High Performance Refractory Division is described. Over the last fifty years or so, Saint-Gobain has been a leader in developing novel non-oxide ceramic based materials, processes and products for application in the nuclear, chemical, automotive, defense and mining industries.

  18. A new high performance current transducer

    Tang Lijun; Lu Songlin; Li Deming

    2003-01-01

    A DC-100 kHz current transducer is developed using a new technique based on the zero-flux detection principle. It was shown that the new current transducer is of high performance, that its magnetic core need not be selected very stringently, and that it is easy to manufacture.

  19. "Productivity performance measurement - follow-up"

    Kristensen, Troels

    2008-01-01

    The Danish Ministry of Health has published the third annual report on hospital productivity. This experience has contributed to policy goals becoming more detailed and ambitious. New policy goals are: to include hospital productivity measures at less aggregated levels, to include labour productivity and hospital psychiatric care, to provide web-based solutions that facilitate access to productivity data, and to develop new classifications of hospital levels related to structural reforms.

  20. Strategy Guideline. High Performance Residential Lighting

    Holton, J. [IBACOS, Inc., Pittsburgh, PA (United States)]

    2012-02-01

    This report has been developed to provide a tool for the understanding and application of high performance lighting in the home. The strategies featured in this guide are drawn from recent advances in commercial lighting for application to typical spaces found in residential buildings. This guide offers strategies to greatly reduce lighting energy use through the application of high quality fluorescent and light emitting diode (LED) technologies. It is important to note that these strategies not only save energy in the home but also serve to satisfy the homeowner’s expectations for high quality lighting.

  1. Growth performance, body measurements, carcass composition and ...

    Gizzard percentage was significantly greater in males, and heart percentage was significantly greater in females. Due to their high bodyweight, good FCR, and favourable carcass composition, SM3 Heavy male ducks are more useful as broiler ducks than females. Keywords: carcasses, digestive system, growth, Pekin duck, ...

  2. Architecting Web Sites for High Performance

    Arun Iyengar

    2002-01-01

    Web site applications are some of the most challenging high-performance applications currently being developed and deployed. The challenges emerge from the specific combination of high variability in workload characteristics and of high performance demands regarding the service level, scalability, availability, and costs. In recent years, a large body of research has addressed the Web site application domain, and a host of innovative software and hardware solutions have been proposed and deployed. This paper is an overview of recent solutions concerning the architectures and the software infrastructures used in building Web site applications. The presentation emphasizes three of the main functions in a complex Web site: the processing of client requests, the control of service levels, and the interaction with remote network caches.

  3. Measurement properties of continuous text reading performance tests.

    Brussee, Tamara; van Nispen, Ruth M A; van Rens, Ger H M B

    2014-11-01

    Measurement properties of tests to assess reading acuity or reading performance have not been extensively evaluated. This study aims to provide an overview of the literature on available continuous text reading tests and their measurement properties. A literature search was performed in PubMed, Embase and PsycInfo. Subsequently, information on the design and content of reading tests, study design and measurement properties was extracted using consensus-based standards for the selection of health measurement instruments. The quality of studies, reading tests and measurement properties was systematically assessed using pre-specified criteria. From 2334 identified articles, 20 relevant articles were found on the measurement properties of three reading tests in various languages: IReST, MNread Reading Test and Radner Reading Charts. All three reading tests scored high on content validity. Reproducibility studies (repeated measurements between different testing sessions) of the commercially available IReST and MNread tests in different languages were missing. The IReST scored best on inter-language comparison, the MNread scored well in repeatability studies (repeated measurements under the same conditions), and the Radner showed good reproducibility in studies. Although in daily practice there are other continuous text reading tests available that meet the criteria of this review, measurement properties were described in scientific studies for only three of them. Of the few available studies, the quality and content of the study design and methodology used varied. For testing existing reading tests and the development of new ones, for example in other languages, we make several recommendations, including careful description of patient characteristics, use of objective and subjective lighting levels, good control of working distance, documentation of the number of raters and their training, careful documentation of scoring rules and the use of Bland-Altman analyses or similar for…
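
    The Bland-Altman analysis recommended at the end of the abstract compares two measurement methods (for example, two reading-performance tests) through the mean difference and the 95% limits of agreement. The sketch below uses placeholder data and illustrates only the calculation, not any dataset from the review.

        import numpy as np

        # Placeholder paired measurements, e.g. reading speed in words per minute.
        test_a = np.array([142.0, 155.0, 120.0, 180.0, 160.0, 135.0])
        test_b = np.array([138.0, 150.0, 128.0, 172.0, 166.0, 131.0])

        diff = test_a - test_b
        bias = diff.mean()                          # mean difference between methods
        sd = diff.std(ddof=1)
        loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
        print(f"bias = {bias:.1f}, limits of agreement = {loa[0]:.1f} to {loa[1]:.1f}")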

  4. Thermal performance measurements on ultimate heat sinks--cooling ponds

    Hadlock, R.K.; Abbey, O.B.

    1977-12-01

    The primary objective of the studies described is to obtain the requisite data, with respect to modeling requirements, to characterize the thermal performance of heat sinks for nuclear facilities existing at elevated water temperatures as a result of experiencing a genuinely large heat load and responding to meteorological influence. The data should reflect thermal performance for combinations leading to worst-case meteorological influence. A geothermal water retention basin has been chosen as the site for the first measurement program, and data have been obtained in the first of several experiments scheduled to be performed there. These data illustrate the thermal and water budgets during episodes of cooling from an initially high pond water bulk temperature. Monitoring proceeded while the pond experienced only meteorological and seepage influence. The data are discussed and are presented as a data volume which may be used for calculation purposes. Suggestions for future measurement programs are stated with the intent to maintain and improve relevance to nuclear ultimate heat sinks while continuing to examine the performance of the analog geothermal pond. It is further suggested that the geothermal pond, with some modification, may be a suitable site for spray pond measurements.

  5. Measuring systemic performance of the Lithuanian government

    Nakrošis, Vitalis

    2008-01-01

    This paper seeks to assess the dynamics of Lithuania's governmental performance and to compare it with that of other countries. It draws on a simple logical model linking the inputs of the government to its outputs and outcomes. It was found that the performance of the Lithuanian government is average or even poor when compared with the EU average or with countries such as Estonia and Ireland. This is despite the fact that the public mode of production is rather expensive in Lithuania and the number of public empl...

  6. Measuring recent research performance for Chinese universities using bibliometric methods

    Zhu, Jia

    2014-07-29

    This paper focuses on measuring the academic research performance of Chinese universities by using the Scopus database from 2007 to 2010. We have provided meaningful indicators to measure the research performance of Chinese universities as compared to world class universities of the US and the European region. Using these indicators, we first measure the quantity and quality of the research outcomes of the universities and then examine the internationalization of research by using international collaborations, international citations and international impact metrics. Using all of this data, we finally present an overall score called the research performance point to measure the comprehensive research strength of the universities for the selected subject categories. The comparison identifies the gap between Chinese universities and top-tier universities from selected regions across various subject areas. We find that Chinese universities are doing well in terms of publication volume but receive fewer citations from their published work. We also find that the Chinese universities have a relatively low percentage of publications at high impact venues, which may be the reason that they are not receiving more citations. Therefore, a careful selection of publication venues may help the Chinese universities to compete with world class universities and increase their research internationalization. © 2014 Akadémiai Kiadó, Budapest, Hungary.
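
    The abstract does not give the formula behind the "research performance point", but composite scores of this kind are typically built by normalising each indicator and combining them with weights. The sketch below is a hypothetical illustration of that general idea, with made-up indicator values and weights; it is not the authors' scoring method.

        # Hypothetical composite score: min-max normalise each indicator, then
        # take a weighted sum. Values and weights below are invented.
        indicators = {  # university -> (publications, citations/paper, % international collaboration)
            "Univ A": (5200, 6.1, 22.0),
            "Univ B": (3100, 9.4, 35.0),
            "Univ C": (4100, 7.8, 28.0),
        }
        weights = (0.3, 0.4, 0.3)

        cols = list(zip(*indicators.values()))
        lo = [min(c) for c in cols]
        hi = [max(c) for c in cols]

        def score(values):
            # Scale each indicator to [0, 1], then combine with the chosen weights.
            normed = [(v - l) / (h - l) if h > l else 0.0
                      for v, l, h in zip(values, lo, hi)]
            return sum(w * n for w, n in zip(weights, normed))

        for name, vals in indicators.items():
            print(f"{name}: {score(vals):.2f}")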

  7. Evaluation of emergency department performance - a systematic review on recommended performance and quality-in-care measures.

    Sørup, Christian Michel; Jacobsen, Peter; Forberg, Jakob Lundager

    2013-08-09

    Evaluation of emergency department (ED) performance remains a difficult task due to the lack of consensus on performance measures that reflect high quality, efficiency, and sustainability. The objective was to describe, map, and critically evaluate which performance measures the published literature regards as being most relevant in assessing overall ED performance. Following the PRISMA guidelines, a systematic literature review of review articles reporting accentuated ED performance measures was conducted in the databases of PubMed, Cochrane Library, and Web of Science. Study eligibility criteria included: 1) the main purpose was to discuss, analyse, or promote performance measures best reflecting ED performance, 2) the article was a review article, and 3) the article reported macro-level performance measures, thus reflecting an overall departmental performance level. A number of articles address this study's objective (n = 14 of 46 unique hits). Time intervals and patient-related measures were dominant among the identified performance measures in review articles from the US, UK, Sweden and Canada. Length of stay (LOS), time between patient arrival and initial clinical assessment, and time between patient arrival and admission were highlighted by the majority of articles. Concurrently, "patients left without being seen" (LWBS), unplanned re-attendance within a maximum of 72 hours, mortality/morbidity, and the number of unintended incidents were the most highlighted performance measures that related directly to the patient. Performance measures related to employees were only stated in two of the 14 included articles. A total of 55 ED performance measures were identified. ED time intervals were the most recommended performance measures, followed by patient centeredness and safety performance measures. ED employee-related performance measures were rarely mentioned in the investigated literature. The study's results allow for advancement towards improved performance measurement and…

  8. High performance anode for advanced Li batteries

    Lake, Carla [Applied Sciences, Inc., Cedarville, OH (United States)

    2015-11-02

    The overall objective of this Phase I SBIR effort was to advance the manufacturing technology for ASI’s Si-CNF high-performance anode by creating a framework for large volume production and utilization of low-cost Si-coated carbon nanofibers (Si-CNF) for the battery industry. This project explores the use of nano-structured silicon which is deposited on a nano-scale carbon filament to achieve the benefits of high cycle life and high charge capacity without the consequent fading of, or failure in, capacity resulting from stress-induced fracturing of the Si particles and de-coupling from the electrode. ASI’s patented coating process distinguishes itself from others in that it is highly reproducible, readily scalable and results in a Si-CNF composite structure containing 25-30% silicon, with a compositionally graded interface at the Si-CNF interface that significantly improves cycling stability and enhances adhesion of silicon to the carbon fiber support. In Phase I, the team demonstrated that the production of the Si-CNF anode material can successfully be transitioned from a static bench-scale reactor into a fluidized bed reactor. In addition, ASI made significant progress in the development of low cost, quick testing methods which can be performed on silicon coated CNFs as a means of quality control. To date, weight change, density, and cycling performance were the key metrics used to validate the high performance anode material. Under this effort, ASI made strides to establish a quality control protocol for the large volume production of Si-CNFs and has identified several key technical thrusts for future work. Using the results of this Phase I effort as a foundation, ASI has defined a path forward to commercialize and deliver high volume and low-cost production of Si-CNF material for anodes in Li-ion batteries.

  9. High performance direct methanol fuel cell with thin electrolyte membrane

    Wan, Nianfang

    2017-06-01

    A high performance direct methanol fuel cell is achieved with a thin electrolyte membrane. A peak power density of 320 mW cm⁻², and over 260 mW cm⁻² at 0.4 V, are obtained when working at 90 °C with a normal-pressure air supply. It is revealed that the increase of anode half-cell performance with temperature contributes primarily to the enhanced performance at elevated temperature. From the comparison of the iR-compensated cathode potential of the methanol/air cell with that of an H2/air fuel cell, the impact of methanol crossover on cathode performance decreases with current density and becomes negligible at high current density. Current density is found to influence fuel efficiency and methanol crossover significantly, based on measurements of fuel efficiency at different current densities. At high current density, high fuel efficiency can be achieved even at high temperature, indicating decreased methanol crossover.

  10. Development of high performance cladding materials

    Park, Jeong Yong; Jeong, Y. H.; Park, S. Y.

    2010-04-01

    The irradiation test for HANA claddings was conducted, and a series of evaluations of next-generation HANA claddings as well as their in-pile and out-of-pile performance tests were also carried out at the Halden research reactor. The 6th irradiation test has been completed successfully in the Halden research reactor. As a result, HANA claddings showed high performance, such as corrosion resistance increased by 40% compared to Zircaloy-4. The high performance of HANA claddings in the Halden test has enabled a lead test rod program as the first step in the commercialization of HANA claddings. A database has been established for thermal and LOCA-related properties. It was confirmed from the thermal shock test that the integrity of HANA claddings was maintained over a wider region than required by the criteria regulated by the NRC. The manufacturing process for strips was established in order to apply the HANA alloys, which were originally developed for claddings, to spacer grids. 250 kinds of model alloys for the next-generation claddings were designed and manufactured over four rounds and used to select the preliminary candidate alloys for the next-generation claddings. The selected candidate alloys showed 50% better corrosion resistance and 20% improved high temperature oxidation resistance compared to foreign advanced claddings. We established the manufacturing conditions controlling the performance of the dual-cooled claddings by changing the reduction rate in the cold working steps.

  11. A Linux Workstation for High Performance Graphics

    Geist, Robert; Westall, James

    2000-01-01

    The primary goal of this effort was to provide a low-cost method of obtaining high-performance 3-D graphics using an industry standard library (OpenGL) on PC class computers. Previously, users interested in doing substantial visualization or graphical manipulation were constrained to using specialized, custom hardware most often found in computers from Silicon Graphics (SGI). We provided an alternative to expensive SGI hardware by taking advantage of third-party, 3-D graphics accelerators that have now become available at very affordable prices. To make use of this hardware our goal was to provide a free, redistributable, and fully-compatible OpenGL work-alike library so that existing bodies of code could simply be recompiled for PC class machines running a free version of Unix. This should allow substantial cost savings while greatly expanding the population of people with access to a serious graphics development and viewing environment. This should offer a means for NASA to provide a spectrum of graphics performance to its scientists, supplying high-end specialized SGI hardware for high-performance visualization while fulfilling the requirements of medium and lower performance applications with generic, off-the-shelf components and still maintaining compatibility between the two.

  12. The path toward HEP High Performance Computing

    Apostolakis, John; Carminati, Federico; Gheata, Andrei; Wenzel, Sandro

    2014-01-01

    High Energy Physics code has been known for making poor use of high performance computing architectures. Efforts in optimising HEP code on vector and RISC architectures have yielded limited results, and recent studies have shown that, on modern architectures, it achieves a performance between 10% and 50% of the peak one. Although several successful attempts have been made to port selected codes on GPUs, no major HEP code suite has a 'High Performance' implementation. With LHC undergoing a major upgrade and a number of challenging experiments on the drawing board, HEP can no longer neglect the less-than-optimal performance of its code and has to try to make the best use of the hardware. This activity is one of the foci of the SFT group at CERN, which hosts, among others, the Root and Geant4 projects. The activity of the experiments is shared and coordinated via a Concurrency Forum, where the experience in optimising HEP code is presented and discussed. Another activity is the Geant-V project, centred on th...

  13. Drivers of Performance Measurement Use: Empirical Evidence from Serbia

    Miloš Milosavljević

    2016-05-01

    In the last decades, the interest of academics and practitioners in the efficiency of performance measurement system use has grown rapidly. The aim of this paper is to examine, articulate and test the relationship between the maturity of performance measurement systems, the strategic compliance of performance measurement and managerial orientation, on one side, and the portfolio of performance measurement uses, on the other. Data were collected from 86 Serbian companies. The results indicate that the most influential factor for diversified use of performance measurement is the maturity of the system. The paper also discusses theoretical contributions, implications for managers and scholars, and recommendations for decision-makers.

  14. High precision mass measurements in Ψ and Υ families revisited

    Artamonov, A.S.; Baru, S.E.; Blinov, A.E.

    2000-01-01

    High precision mass measurements in Ψ and Υ families performed in 1980-1984 at the VEPP-4 collider with OLYA and MD-1 detectors are revisited. The corrections for the new value of the electron mass are presented. The effect of the updated radiative corrections has been calculated for the J/Ψ(1S) and Ψ(2S) mass measurements [ru]

  15. High Performance Commercial Fenestration Framing Systems

    Mike Manteghi; Sneh Kumar; Joshua Early; Bhaskar Adusumalli

    2010-01-31

    A major objective of the U.S. Department of Energy is to have a zero energy commercial building by the year 2025. Windows have a major influence on the energy performance of the building envelope as they control over 55% of the building energy load, and represent one important area where technologies can be developed to save energy. Aluminum framing systems are used in over 80% of commercial fenestration products (i.e. windows, curtain walls, store fronts, etc.). Aluminum framing systems are often required in commercial buildings because of their inherent good structural properties and long service life, which is required from commercial and architectural frames. At the same time, they are lightweight and durable, requiring very little maintenance, and offer design flexibility. An additional benefit of aluminum framing systems is their relatively low cost and easy manufacturability. Aluminum, being an easily recyclable material, also offers sustainable features. However, from an energy efficiency point of view, aluminum frames have lower thermal performance due to the very high thermal conductivity of aluminum. Fenestration systems constructed of aluminum alloys therefore have lower performance in terms of being an effective barrier to energy transfer (heat loss or gain). Despite the lower energy performance, aluminum is the material of choice for commercial framing systems and dominates the commercial/architectural fenestration market because of the reasons mentioned above. In addition, there is no other cost effective and energy efficient replacement material available to take the place of aluminum in the commercial/architectural market. Hence it is imperative to improve the performance of aluminum framing systems to improve the energy performance of commercial fenestration systems and in turn reduce the energy consumption of commercial buildings and achieve zero energy buildings by 2025. The objective of this project was to develop high performance, energy efficient commercial…

  16. Fracture toughness of ultra high performance concrete by flexural performance

    Manolova Emanuela

    2016-01-01

    This paper describes the fracture toughness of an innovative structural material - Ultra High Performance Concrete (UHPC) - evaluated by flexural performance. Adapted standard test methods for the flexural performance of fiber-reinforced concrete (ASTM C 1609 and ASTM C 1018) are used to determine the material behaviour under static loading. Fracture toughness is estimated by various deformation parameters derived from the load-deflection curve, obtained by testing a simply supported beam under third-point loading using a servo-controlled testing system. This method is used to estimate the contribution of the embedded fiber reinforcement to the improvement of the fracture behaviour of UHPC through changes in crack-resistance capacity, fracture toughness and energy absorption capacity by various mechanisms. The position of the first crack has been determined based on the P-δ (load-deflection) response and the P-ε (load-longitudinal deformation in the tensile zone) response, which are used for the calculation of the two toughness indices I5 and I10. The combination of steel fibres with different dimensions leads to a composite having, at the same time, increased crack resistance, improved first-crack behaviour, ductility and post-peak residual strength.
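
    For reference, the toughness indices I5 and I10 cited above are defined in ASTM C1018 as ratios of areas under the load-deflection curve; these are the standard definitions, restated here, not values from the study. With $A(x)$ the area under the curve up to deflection $x$ and $\delta$ the first-crack deflection:

        $$I_5 = \frac{A(3.0\,\delta)}{A(\delta)}, \qquad I_{10} = \frac{A(5.5\,\delta)}{A(\delta)},$$

    so an elastic-perfectly-plastic response gives $I_5 = 5$ and $I_{10} = 10$, while a brittle unreinforced matrix yields values close to 1.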

  17. Employee participation in developing performance measures and job performance: on the role of measurement properties and incentives

    Groen, B.; Wouters, M.; Wilderom, C.

    2013-01-01

    Involving employees in the development of performance measures often results in better employee job performance. Yet not all prior studies find such a direct effect. This study explains these inconsistent findings. It focuses on the measurement properties of performance measures and using them for

  18. A comparative study of performance measurement standards of railway operator

    Pongjirawut Siripong

    2017-01-01

    The European standard EN 13816 is one of the widely accepted standards for measuring the quality of public passenger transport (PPT) service. EN 13816 defines 8 measurement criteria, 29 sub-criteria and 193 Key Performance Indicators (KPIs) to be used to measure the performance of railway operators. Nowadays there are additional criteria beyond EN 13816, developed by various organisations. This research first explores the service performance measurement standards used by actual railway operators at the international level and in Thailand. After an intensive review of performance measurement standards, 9 standards are compiled and compared in terms of criteria, sub-criteria and KPIs using a cluster analysis methodology. The comparison identified 2 sub-criteria and 91 KPIs in addition to those of EN 13816. This research thus summarizes and compares different performance measurement standards for measuring the service quality of metro rail lines.

  19. High dose rate brachytherapy source measurement intercomparison.

    Poder, Joel; Smith, Ryan L; Shelton, Nikki; Whitaker, May; Butler, Duncan; Haworth, Annette

    2017-06-01

    This work presents a comparison of air kerma rate (AKR) measurements performed by multiple radiotherapy centres for a single HDR 192Ir source. Two separate groups (consisting of 15 centres) performed AKR measurements at one of two host centres in Australia. Each group travelled to one of the host centres and measured the AKR of a single 192Ir source using their own equipment and local protocols. Results were compared to the 192Ir source calibration certificate provided by the manufacturer by means of a ratio of measured to certified AKR. The comparisons showed remarkably consistent results, with the maximum deviation of a measurement from the decay-corrected source certificate value being 1.1%. The maximum percentage difference between any two measurements was less than 2%. The comparisons demonstrated the consistency of well-chambers used for 192Ir AKR measurements in Australia, despite the lack of a local calibration service, and served as a valuable focal point for the exchange of ideas and dosimetry methods.
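
    As an illustration of the comparison described above, the following is a minimal Python sketch of decay-correcting a certificate air kerma rate to the measurement date and forming the measured-to-certified ratio; the 192Ir half-life constant and all sample numbers are assumptions for this example and are not taken from the study.

        import math

        IR192_HALF_LIFE_DAYS = 73.83  # assumed nominal 192Ir half-life

        def decay_corrected_ratio(akr_measured, akr_certificate, days_since_certificate):
            """Ratio of measured AKR to the certificate AKR decayed to the measurement date."""
            decay_factor = math.exp(-math.log(2.0) * days_since_certificate / IR192_HALF_LIFE_DAYS)
            return akr_measured / (akr_certificate * decay_factor)

        # Hypothetical example: certificate AKR 40.5 mGy/h at 1 m, measured 30 days later
        print(decay_corrected_ratio(akr_measured=30.7, akr_certificate=40.5, days_since_certificate=30.0))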

  20. HIGH PERFORMANCE CERIA BASED OXYGEN MEMBRANE

    2014-01-01

    The invention describes a new class of highly stable mixed conducting materials based on acceptor-doped cerium oxide (CeO2-δ) in which the limiting electronic conductivity is significantly enhanced by co-doping with a second element (co-dopant), such as Nb, W and Zn, so that cerium and the co-dopant have an ionic size ratio between 0.5 and 1. These materials can thereby improve the performance and extend the range of operating conditions of oxygen permeation membranes (OPM) for different high temperature membrane reactor applications. The invention also relates to the manufacturing of supported...

  1. Playa: High-Performance Programmable Linear Algebra

    Victoria E. Howle

    2012-01-01

    This paper introduces Playa, a high-level user interface layer for composing algorithms for complex multiphysics problems out of objects from other Trilinos packages. Among other features, Playa provides very high-performance overloaded operators implemented through an expression template mechanism. In this paper, we give an overview of the central Playa objects from a user's perspective, show application to a sequence of increasingly complex solver algorithms, provide timing results for Playa's overloaded operators and other functions, and briefly survey some of the implementation issues involved.
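
    Playa's overloaded operators rely on C++ expression templates; as a language-neutral illustration of the underlying idea (operators that build an expression tree which is evaluated lazily), here is a small Python sketch. The class and method names are invented for this illustration and are not Playa's API.

        class Expr:
            """Node of a lazily evaluated vector expression tree built by overloaded operators."""
            def __add__(self, other):
                return Sum(self, other)
            def __rmul__(self, scalar):
                return Scaled(scalar, self)

        class Vec(Expr):
            def __init__(self, data):
                self.data = list(data)
            def evaluate(self):
                return self.data

        class Sum(Expr):
            def __init__(self, left, right):
                self.left, self.right = left, right
            def evaluate(self):
                return [a + b for a, b in zip(self.left.evaluate(), self.right.evaluate())]

        class Scaled(Expr):
            def __init__(self, alpha, operand):
                self.alpha, self.operand = alpha, operand
            def evaluate(self):
                return [self.alpha * x for x in self.operand.evaluate()]

        x, y = Vec([1.0, 2.0]), Vec([3.0, 4.0])
        expr = 2.0 * x + y          # cheap: only builds the expression tree
        print(expr.evaluate())      # [5.0, 8.0]: the actual work happens here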

  2. Optimizing the design of very high power, high performance converters

    Edwards, R.J.; Tiagha, E.A.; Ganetis, G.; Nawrocky, R.J.

    1980-01-01

    This paper describes how various technologies are used to achieve the desired performance in a high-current magnet power converter system. It is hoped that the discussion of the design approaches taken will be applicable to other power supply systems where stringent requirements in stability, accuracy and reliability must be met.

  3. Robust High Performance Aquaporin based Biomimetic Membranes

    Helix Nielsen, Claus; Zhao, Yichun; Qiu, C.

    2013-01-01

    Aquaporins are water channel proteins with high water permeability and solute rejection, which makes them promising for preparing high-performance biomimetic membranes. Despite the growing interest in aquaporin-based biomimetic membranes (ABMs), it is challenging to produce robust and defect-free ABMs ... on top of a support membrane. Control membranes, either without aquaporins or with the inactive AqpZ R189A mutant aquaporin, served as controls. The separation performance of the membranes was evaluated by cross-flow forward osmosis (FO) and reverse osmosis (RO) tests. In RO the ABM achieved a water permeability of ~4 L/(m2 h bar) with a NaCl rejection > 97% at an applied hydraulic pressure of 5 bar. The water permeability was ~40% higher compared to a commercial brackish water RO membrane (BW30) and an order of magnitude higher compared to a seawater RO membrane (SW30HR). In FO, the ABMs had > 90...

  4. Evaluation of high-performance computing software

    Browne, S.; Dongarra, J. [Univ. of Tennessee, Knoxville, TN (United States); Rowan, T. [Oak Ridge National Lab., TN (United States)

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations difficult to find elsewhere.

  5. High performance cloud auditing and applications

    Choi, Baek-Young; Song, Sejun

    2014-01-01

    This book mainly focuses on cloud security and high performance computing for cloud auditing. The book discusses emerging challenges and techniques developed for high performance semantic cloud auditing, and presents the state of the art in cloud auditing, computing and security techniques with focus on technical aspects and feasibility of auditing issues in federated cloud computing environments.   In summer 2011, the United States Air Force Research Laboratory (AFRL) CyberBAT Cloud Security and Auditing Team initiated the exploration of the cloud security challenges and future cloud auditing research directions that are covered in this book. This work was supported by the United States government funds from the Air Force Office of Scientific Research (AFOSR), the AFOSR Summer Faculty Fellowship Program (SFFP), the Air Force Research Laboratory (AFRL) Visiting Faculty Research Program (VFRP), the National Science Foundation (NSF) and the National Institute of Health (NIH). All chapters were partially suppor...

  6. Monitoring SLAC High Performance UNIX Computing Systems

    Lettsome, Annette K.

    2005-01-01

    Knowledge of the effectiveness and efficiency of computers is important when working with high performance systems. The monitoring of such systems is advantageous in order to foresee possible misfortunes or system failures. Ganglia is a software system designed for high performance computing systems to retrieve specific monitoring information. An alternative storage facility for Ganglia's collected data is needed since its default storage system, the round-robin database (RRD), struggles with data integrity. The creation of a script-driven MySQL database solves this dilemma. This paper describes the process followed in the creation and implementation of the MySQL database for use by Ganglia. Comparisons between data storage by both databases are made using gnuplot and Ganglia's real-time graphical user interface.
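
    A minimal sketch, under stated assumptions, of the kind of script-driven insertion described above: metric samples written into a MySQL table. The table schema, column names, and the way samples are obtained are hypothetical and do not reproduce the paper's actual script.

        import time
        import mysql.connector  # assumes MySQL Connector/Python is installed

        def store_samples(samples, db_config):
            """Insert (host, metric, value) samples with a timestamp into a hypothetical 'metrics' table."""
            conn = mysql.connector.connect(**db_config)
            cur = conn.cursor()
            cur.execute(
                "CREATE TABLE IF NOT EXISTS metrics ("
                " host VARCHAR(64), metric VARCHAR(64), value DOUBLE, ts BIGINT)"
            )
            now = int(time.time())
            cur.executemany(
                "INSERT INTO metrics (host, metric, value, ts) VALUES (%s, %s, %s, %s)",
                [(h, m, v, now) for (h, m, v) in samples],
            )
            conn.commit()
            cur.close()
            conn.close()

        # Hypothetical usage with samples already parsed from the monitoring source
        store_samples([("node01", "load_one", 0.42)],
                      {"host": "localhost", "user": "ganglia",
                       "password": "secret", "database": "monitoring"})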

  7. High performance parallel computers for science

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Cook, A.; Deppe, J.; Edel, M.; Fischler, M.; Gaines, I.; Hance, R.

    1989-01-01

    This paper reports that Fermilab's Advanced Computer Program (ACP) has been developing cost effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors, each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN- or C-programmable pipelined 20 Mflops (peak), 10 MByte single board computer. These are plugged into a 16-port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site-oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256 node, 5 GFlop system is under construction.

  8. High-performance phase-field modeling

    Vignal, Philippe; Sarmiento, Adel; Cortes, Adriano Mauricio; Dalcin, L.; Collier, N.; Calo, Victor M.

    2015-01-01

    ...and the phase-field crystal equation will be presented, which corroborate the theoretical findings and illustrate the robustness of the method. Results related to more challenging examples, namely the Navier-Stokes Cahn-Hilliard and a diffusion-reaction Cahn-Hilliard system, will also be presented. The implementation was done in PetIGA and PetIGA-MF, high-performance Isogeometric Analysis frameworks [1, 3], designed to handle non-linear, time-dependent problems.
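
    For reference, a commonly used form of the Cahn-Hilliard equation mentioned above is shown below; the notation is generic and not necessarily that of the PetIGA implementation:

        \frac{\partial c}{\partial t} = \nabla \cdot \left( M \, \nabla \left( \frac{\partial f}{\partial c} - \lambda \nabla^2 c \right) \right)

    where c is the order parameter (concentration), M a mobility, f(c) a double-well bulk free energy density, and λ an interface-energy coefficient.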

  9. AHPCRC - Army High Performance Computing Research Center

    2010-01-01

    ...Of particular interest is the ability of a distributed jamming network (DJN) to jam signals in all or part of a sensor or communications network. Other research areas noted include reasoning and assistive technologies. (The remainder of this record consists of AHPCRC staff listings and contact information; see www.ahpcrc.org.)

  10. Performance concerns for high duty fuel cycle

    Esposito, V.J.; Gutierrez, J.E.

    1999-01-01

    One of the goals of the nuclear industry is to achieve economic performance such that nuclear power plants are competitive in a de-regulated market. The manner in which nuclear fuel is designed and operated lies at the heart of economic viability. In this sense, reliability, operating flexibility and low costs are the three major requirements of the NPP today. The translation of these three requirements to the design is part of our work. The challenge today is to produce a fuel design which will operate with long operating cycles, high discharge burnup and power up-rating while still maintaining all design and safety margins. European Fuel Group (EFG) understands that to achieve the required performance, high duty/energy fuel designs are needed. The concerns for high duty design include, among other items, core design methods, advanced safety analysis methodologies, performance models, advanced materials and operational strategies. The operational aspects require the trade-off and evaluation of various parameters including coolant chemistry control, material corrosion, boiling duty, boron level impacts, etc. In this environment, MAEF is the design that EFG is now offering, based on ZIRLO alloy and a robust skeleton. This new design is able to achieve 70 GWd/tU, and Lead Test Programs are being executed to demonstrate this capability. A number of performance issues which have been a concern with current designs have been resolved, such as cladding corrosion and incomplete RCCA insertion (IRI). As the core duty becomes more aggressive, other new issues need to be addressed, such as Axial Offset Anomaly. These new issues are being addressed by a combination of the new design and advanced methodologies to meet the demanding needs of NPPs. The ability and strategy to meet high duty core requirements and flexibility of operation while maintaining an acceptable balance of all technical issues are discussed in this paper. (authors)

  11. DURIP: High Performance Computing in Biomathematics Applications

    2017-05-10

    The goal of this award was to enhance the capabilities of the Department of Applied Mathematics and Statistics (AMS) at the University of California, Santa Cruz (UCSC) to conduct research and research-related education in areas of high performance computing in biomathematics applications.

  12. High Performance Computing Operations Review Report

    Cupps, Kimberly C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-19

    The High Performance Computing Operations Review (HPCOR) meeting—requested by the ASC and ASCR program headquarters at DOE—was held November 5 and 6, 2013, at the Marriott Hotel in San Francisco, CA. The purpose of the review was to discuss the processes and practices for HPC integration and its related software and facilities. Experiences and lessons learned from the most recent systems deployed were covered in order to benefit the deployment of new systems.

  13. Planning for high performance project teams

    Reed, W.; Keeney, J.; Westney, R.

    1997-01-01

    Both industry-wide research and corporate benchmarking studies confirm the significant savings in cost and time that result from early planning of a project. Amoco's Team Planning Workshop combines long-term strategic project planning and short-term tactical planning with team building to provide the basis for high performing project teams, better project planning, and effective implementation of the Amoco Common Process for managing projects

  14. Performance measurement of workplace change: in two different cultural contexts

    Chaiwat Riratanaphong

    2014-01-01

    Nowadays, organisations must cope with the pressure of cost reduction and efficiency in order to succeed in a highly competitive business environment. However, drivers to improve social interaction and employees' performance, and as such to contribute to organisational goals and objectives, make it necessary to be concerned with other performance criteria as well, such as effectiveness, flexibility, employee satisfaction, productivity and creativity. There is a growing need for performance management and performance measurement that not only covers all aspects of an organisation, but which can be applied to various situations in a changing internal and external environment. Performance measurement methods which include an integrated perspective of performance have become essential. In addition, it has been realised that corporate real estate can contribute to organisational performance (Nourse and Roulac, 1993; De Vries et al., 2008; Lindholm, 2008; Den Heijer, 2011; Jensen et al., 2012). For this reason, organisations worldwide have started to implement new ways of working in more open and flexible work environments. Although there are various objectives and drivers of workplace change, the common objectives are to reduce costs and to increase efficiency. The changing organisational and external contexts, such as the increasing demand for talented knowledge workers and changing work patterns, have led to the development of new offices that can promote social networks and interaction among employees. The new workplace does not only aim at achieving cost efficiency, but should also support employee satisfaction and productivity. This PhD research focuses on both themes, i.e. performance measurement and workplace change. The aim of this research is to provide a conceptual framework that visualises the impact of workplace change on employees' responses to the new work environment and to present guidelines on performance measurement of workplace

  15. High Performance Single Nanowire Tunnel Diodes

    Wallentin, Jesper; Persson, Johan Mikael; Wagner, Jakob Birkedal

    ...is the tunnel (Esaki) diode, which provides a low-resistance connection between junctions. We demonstrate an InP-GaAs NW axial heterostructure with tunnel diode behavior. InP and GaAs can be readily n- and p-doped, respectively, and the heterointerface is expected to have an advantageous type II band alignment... NWs were contacted in a NW-FET setup. Electrical measurements at room temperature display typical tunnel diode behavior, with a Peak-to-Valley Current Ratio (PVCR) as high as 8.2 and a peak current density as high as 329 A/cm2. Low temperature measurements show an improved PVCR of up to 27.6.

  16. Computational Biology and High Performance Computing 2000

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  17. High-performance commercial building systems

    Selkowitz, Stephen

    2003-10-01

    This report summarizes key technical accomplishments resulting from the three year PIER-funded R&D program, "High Performance Commercial Building Systems" (HPCBS). The program targets the commercial building sector in California, an end-use sector that accounts for about one-third of all California electricity consumption and an even larger fraction of peak demand, at a cost of over $10B/year. Commercial buildings also have a major impact on occupant health, comfort and productivity. Building design and operations practices that influence energy use are deeply engrained in a fragmented, risk-averse industry that is slow to change. Although California's aggressive standards efforts have resulted in new buildings designed to use less energy than those constructed 20 years ago, the actual savings realized are still well below technical and economic potentials. The broad goal of this program is to develop and deploy a set of energy-saving technologies, strategies, and techniques, and improve processes for designing, commissioning, and operating commercial buildings, while improving health, comfort, and performance of occupants, all in a manner consistent with sound economic investment practices. Results are to be broadly applicable to the commercial sector for different building sizes and types, e.g. offices and schools, for different classes of ownership, both public and private, and for owner-occupied as well as speculative buildings. The program aims to facilitate significant electricity use savings in the California commercial sector by 2015, while assuring that these savings are affordable and promote high quality indoor environments. The five linked technical program elements contain 14 projects with 41 distinct R&D tasks. Collectively they form a comprehensive Research, Development, and Demonstration (RD&D) program with the potential to capture large savings in the commercial building sector, providing significant economic benefits to

  18. High performance separation of lanthanides and actinides

    Sivaraman, N.; Vasudeva Rao, P.R.

    2011-01-01

    The major advantage of High Performance Liquid Chromatography (HPLC) is its ability to provide rapid, high performance separations. It is evident from the Van Deemter curve relating particle size to resolution that packing materials with particle sizes less than 2 μm provide better resolution for high-speed separations and for resolving complex mixtures compared to 5 μm based supports. In the recent past, chromatographic support materials based on monoliths have been studied extensively at our laboratory. A monolith column consists of a single piece of porous, rigid material containing mesopores and micropores, which provide fast analyte mass transfer. Monolith supports provide significantly higher separation efficiency than particle-packed columns. A clear advantage of monoliths is that they can be operated at higher flow rates with lower back pressure. The higher column permeability allows higher operating flow rates, which drastically reduces analysis time while retaining high separation efficiency. The fast separation methods developed here were applied to assay lanthanides and actinides in dissolver solutions of nuclear reactor fuels.
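
    The Van Deemter relation referred to above expresses the plate height H as a function of the mobile-phase linear velocity u; in its standard simplified form (generic notation, not specific to this paper):

        H = A + \frac{B}{u} + C\,u, \qquad u_{\mathrm{opt}} = \sqrt{B/C}, \qquad H_{\min} = A + 2\sqrt{BC}

    where A accounts for eddy diffusion, B for longitudinal diffusion and C for resistance to mass transfer; smaller particles and monolithic supports reduce the A and C terms, which is why resolution can be retained at higher flow rates.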

  19. Measure Guideline. High Efficiency Natural Gas Furnaces

    Brand, L. [Partnership for Advanced Residential Retrofit (PARR), Des Plaines, IL (United States); Rose, W. [Partnership for Advanced Residential Retrofit (PARR), Des Plaines, IL (United States)

    2012-10-01

    This measure guideline covers installation of high-efficiency gas furnaces, including: when to install a high-efficiency gas furnace as a retrofit measure; how to identify and address risks; and the steps to be used in the selection and installation process. The guideline is written for Building America practitioners and HVAC contractors and installers. It includes a compilation of information provided by manufacturers, researchers, and the Department of Energy as well as recent research results from the Partnership for Advanced Residential Retrofit (PARR) Building America team.

  20. Measure Guideline: High Efficiency Natural Gas Furnaces

    Brand, L.; Rose, W.

    2012-10-01

    This Measure Guideline covers installation of high-efficiency gas furnaces. Topics covered include when to install a high-efficiency gas furnace as a retrofit measure, how to identify and address risks, and the steps to be used in the selection and installation process. The guideline is written for Building America practitioners and HVAC contractors and installers. It includes a compilation of information provided by manufacturers, researchers, and the Department of Energy as well as recent research results from the Partnership for Advanced Residential Retrofit (PARR) Building America team.

  1. Performance measurement, expectancy and agency theory: An experimental study

    Sloof, R.; van Praag, C.M.

    2008-01-01

    Theoretical analyses of (optimal) performance measures are typically performed within the realm of the linear agency model. This model implies that, for a given compensation scheme, the agent’s optimal effort is unrelated to the amount of noise in the performance measure. In contrast, expectancy

  2. High Performance OLED Panel and Luminaire

    Spindler, Jeffrey [OLEDWorks LLC, Rochester, NY (United States)

    2017-02-20

    In this project, OLEDWorks developed and demonstrated the technology required to produce OLED lighting panels with high energy efficiency and excellent light quality. OLED panels developed in this program produce high quality warm white light with CRI greater than 85 and efficacy up to 80 lumens per watt (LPW). An OLED luminaire employing 24 of the high performance panels produces practical levels of illumination for general lighting, with a flux of over 2200 lumens at 60 LPW. This is a significant advance in the state of the art for OLED solid-state lighting (SSL), which is expected to be a complementary light source to the more advanced LED SSL technology that is rapidly replacing all other traditional forms of lighting.

  3. The path toward HEP High Performance Computing

    Apostolakis, John; Brun, René; Gheata, Andrei; Wenzel, Sandro; Carminati, Federico

    2014-01-01

    High Energy Physics code has been known for making poor use of high performance computing architectures. Efforts in optimising HEP code on vector and RISC architectures have yielded limited results, and recent studies have shown that, on modern architectures, it achieves between 10% and 50% of peak performance. Although several successful attempts have been made to port selected codes to GPUs, no major HEP code suite has a 'High Performance' implementation. With the LHC undergoing a major upgrade and a number of challenging experiments on the drawing board, HEP can no longer neglect the less-than-optimal performance of its code and has to try to make the best use of the hardware. This activity is one of the foci of the SFT group at CERN, which hosts, among others, the ROOT and Geant4 projects. The activity of the experiments is shared and coordinated via a Concurrency Forum, where the experience in optimising HEP code is presented and discussed. Another activity is the Geant-V project, centred on the development of a high-performance prototype for particle transport. Achieving a good concurrency level on the emerging parallel architectures without a complete redesign of the framework can only be done by parallelizing at event level, or with a much larger effort at track level. Apart from the shareable data structures, this typically implies a multiplication factor in terms of memory consumption compared to the single-threaded version, together with sub-optimal handling of event processing tails. Besides this, the low-level instruction pipelining of modern processors cannot be used efficiently to speed up the program. We have implemented a framework that allows scheduling vectors of particles to an arbitrary number of computing resources in a fine-grain parallel approach. The talk will review the current optimisation activities within the SFT group with a particular emphasis on the development perspectives towards a simulation framework able to profit

  4. Miniaturized high performance sensors for space plasmas

    Young, D.T.

    1996-01-01

    Operating under ever more constrained budgets, NASA has turned to a new paradigm for instrumentation and mission development in which 'smaller, faster, better, cheaper' is the primary consideration for future space plasma investigations. The author presents several examples showing the influence of this new paradigm on sensor development and discusses certain implications for the scientific return from resource-constrained sensors. The author also discusses one way to improve space plasma sensor performance, which is to search out new technologies, measurement techniques and instrument analogs from related fields including, among others, laboratory plasma physics.

  5. High temperature hall effect measurement system design, measurement and analysis

    Berkun, Isil

    A reliable knowledge of the transport properties of semiconductor materials is essential for the development and understanding of a number of electronic devices. In this thesis, the work on developing a Hall effect measurement system with software-based data acquisition and control for a temperature range of 300 K-700 K is described. A system was developed for high temperature measurements of materials including single crystal diamond, polycrystalline diamond, and thermoelectric compounds. An added capability for monitoring the current versus voltage behavior of the contacts was used for studying the influence of ohmic and non-ohmic contacts on Hall effect measurements. The system has been primarily used for testing the transport properties of boron-doped single crystal diamond (SCD) deposited in a microwave plasma-assisted chemical vapor deposition (MPCVD) reactor [1]. Diamond has several outstanding properties that are of high interest for its development as an electronic material. These include a relatively wide band gap of 5.5 eV, high thermal conductivity, high mobility, high saturation velocity, and a high breakdown voltage. For a temperature range of 300 K-700 K, IV curves, Hall mobilities and carrier concentrations are shown. Temperature dependent Hall effect measurements have shown carrier concentrations from below 10^17 cm^-3 to approximately 10^21 cm^-3, with mobilities ranging from 763 cm^2/(V s) to 0.15 cm^2/(V s), respectively. Simulation results have shown the effects of single and mixed carrier models, activation energies, effective mass and doping concentrations. These studies have been helpful in the development of single crystal diamond for diode applications. Reference materials of Ge and GaAs were used to test the Hall effect system. The system was also used to characterize polycrystalline diamond deposited on glass for electrochemical applications, and Mg2(Si,Sn) compounds, which are promising candidates for low-cost, lightweight and non
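
    A minimal sketch of the standard single-carrier relations used to turn Hall measurements of the kind described above into a carrier concentration and Hall mobility; the function, variable names and sample values are illustrative assumptions and are not taken from the thesis.

        Q_E = 1.602176634e-19  # elementary charge, C

        def hall_analysis(v_hall, current, b_field, thickness, resistivity):
            """Single-carrier estimate of carrier density and Hall mobility (SI units)."""
            r_hall = v_hall * thickness / (current * b_field)   # Hall coefficient, m^3/C
            n = 1.0 / (abs(r_hall) * Q_E)                       # carrier concentration, m^-3
            mobility = abs(r_hall) / resistivity                # Hall mobility, m^2/(V s)
            return n, mobility

        # Hypothetical sample: 2 mV Hall voltage at 1 mA and 0.5 T, 300 um thick, resistivity 1 ohm*cm
        n, mu = hall_analysis(v_hall=2e-3, current=1e-3, b_field=0.5,
                              thickness=300e-6, resistivity=1e-2)
        print(f"n = {n / 1e6:.2e} cm^-3, mobility = {mu * 1e4:.0f} cm^2/(V s)")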

  6. A High Performance COTS Based Computer Architecture

    Patte, Mathieu; Grimoldi, Raoul; Trautner, Roland

    2014-08-01

    Using Commercial Off The Shelf (COTS) electronic components for space applications is a long-standing idea. Indeed, the difference in processing performance and energy efficiency between radiation-hardened components and COTS components is so important that COTS components are very attractive for use in mass- and power-constrained systems. However, using COTS components in space is not straightforward, as one must account for the effects of the space environment on the behavior of the COTS components. In the frame of the ESA-funded activity called High Performance COTS Based Computer, Airbus Defence and Space and its subcontractor OHB CGS have developed and prototyped a versatile COTS-based architecture for high performance processing. The rest of the paper is organized as follows: in the first section we recapitulate the benefits and constraints of using COTS components for space applications; then we briefly describe existing fault mitigation architectures and present our solution for fault mitigation based on a component called the SmartIO; in the last part of the paper we describe the prototyping activities executed during the HiP CBC project.

  7. Management issues for high performance storage systems

    Louis, S. [Lawrence Livermore National Lab., CA (United States); Burris, R. [Oak Ridge National Lab., TN (United States)

    1995-03-01

    Managing distributed high-performance storage systems is complex and, although sharing common ground with traditional network and systems management, presents unique storage-related issues. Integration technologies and frameworks exist to help manage distributed network and system environments. Industry-driven consortia provide open forums where vendors and users cooperate to leverage solutions. But these new approaches to open management fall short of addressing the needs of scalable, distributed storage. We discuss the motivation and requirements for storage system management (SSM) capabilities and describe how SSM manages distributed servers and storage resource objects in the High Performance Storage System (HPSS), a new storage facility for data-intensive applications and large-scale computing. Modern storage systems, such as HPSS, require many SSM capabilities, including server and resource configuration control, performance monitoring, quality of service, flexible policies, file migration, file repacking, accounting, and quotas. We present results of initial HPSS SSM development including design decisions and implementation trade-offs. We conclude with plans for follow-on work and provide storage-related recommendations for vendors and standards groups seeking enterprise-wide management solutions.

  8. Development of material measures for performance verifying surface topography measuring instruments

    Leach, Richard; Giusca, Claudiu; Rickens, Kai; Riemer, Oltmann; Rubert, Paul

    2014-01-01

    The development of two irregular-geometry material measures for performance verifying surface topography measuring instruments is described. The material measures are designed to be used to performance verify tactile and optical areal surface topography measuring instruments. The manufacture of the material measures using diamond turning followed by nickel electroforming is described in detail. Measurement results are then obtained using a traceable stylus instrument and a commercial coherence scanning interferometer, and the results are shown to agree to within the measurement uncertainties. The material measures are now commercially available as part of a suite of material measures aimed at the calibration and performance verification of areal surface topography measuring instruments

  9. Thermal and Hygric Expansion of High Performance Concrete

    J. Toman; R. Černý

    2001-01-01

    The linear thermal expansion coefficient of two types of high performance concrete was measured in the temperature range from 20 °C to 1000 °C, and the linear hygric expansion coefficient was determined in the moisture range from dry material to saturation water content. Comparative methods were applied for measurements of both coefficients. The experimental results show that both the effect of temperature on the values of linear thermal expansion coefficients and the effect of moisture on th...
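
    For clarity, the two coefficients measured in this kind of study are commonly defined as relative length changes per unit change of temperature or moisture content (generic notation, not necessarily the authors'):

        \alpha_T = \frac{1}{L_0}\,\frac{\Delta L}{\Delta T}, \qquad
        \alpha_u = \frac{1}{L_0}\,\frac{\Delta L}{\Delta u}

    where L0 is the reference length, ΔT the temperature change and Δu the change in moisture content.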

  10. Automatic Energy Schemes for High Performance Applications

    Sundriyal, Vaibhav [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Although high-performance computing traditionally focuses on the efficient execution of large-scale applications, both energy and power have become critical concerns when approaching exascale. Drastic increases in the power consumption of supercomputers affect significantly their operating costs and failure rates. In modern microprocessor architectures, equipped with dynamic voltage and frequency scaling (DVFS) and CPU clock modulation (throttling), the power consumption may be controlled in software. Additionally, network interconnect, such as Infiniband, may be exploited to maximize energy savings while the application performance loss and frequency switching overheads must be carefully balanced. This work first studies two important collective communication operations, all-to-all and allgather and proposes energy saving strategies on the per-call basis. Next, it targets point-to-point communications to group them into phases and apply frequency scaling to them to save energy by exploiting the architectural and communication stalls. Finally, it proposes an automatic runtime system which combines both collective and point-to-point communications into phases, and applies throttling to them apart from DVFS to maximize energy savings. The experimental results are presented for NAS parallel benchmark problems as well as for the realistic parallel electronic structure calculations performed by the widely used quantum chemistry package GAMESS. Close to the maximum energy savings were obtained with a substantially low performance loss on the given platform.
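
    As a rough illustration of software-controlled frequency scaling of the kind described above, the following Python sketch lowers the CPU frequency around a communication-bound phase through the Linux cpufreq sysfs interface; it assumes the 'userspace' governor and root permission, and it is not the runtime system developed in this work.

        import glob

        def set_cpu_khz(freq_khz):
            """Write a target frequency (kHz) to every core's cpufreq node (userspace governor, root required)."""
            for node in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_setspeed"):
                with open(node, "w") as f:
                    f.write(str(freq_khz))

        def run_communication_phase(low_khz, high_khz, do_communication):
            """Drop the frequency while the communication-bound callable runs, then restore it."""
            set_cpu_khz(low_khz)
            try:
                do_communication()   # e.g. a blocking collective such as all-to-all
            finally:
                set_cpu_khz(high_khz)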

  11. High-performance computing in seismology

    NONE

    1996-09-01

    The scientific, technical, and economic importance of the issues discussed here presents a clear agenda for future research in computational seismology. In this way these problems will drive advances in high-performance computing in the field of seismology. There is a broad community that will benefit from this work, including the petroleum industry, research geophysicists, engineers concerned with seismic hazard mitigation, and governments charged with enforcing a comprehensive test ban treaty. These advances may also lead to new applications for seismological research. The recent application of high-resolution seismic imaging of the shallow subsurface for the environmental remediation industry is an example of this activity. This report makes the following recommendations: (1) focused efforts to develop validated documented software for seismological computations should be supported, with special emphasis on scalable algorithms for parallel processors; (2) the education of seismologists in high-performance computing technologies and methodologies should be improved; (3) collaborations between seismologists and computational scientists and engineers should be increased; (4) the infrastructure for archiving, disseminating, and processing large volumes of seismological data should be improved.

  12. A high performance architecture for accelerator controls

    Allen, M.; Hunt, S.M; Lue, H.; Saltmarsh, C.G.; Parker, C.R.C.B.

    1991-01-01

    The demands placed on the Superconducting Super Collider (SSC) control system due to large distances, high bandwidth and fast response time required for operation will require a fresh approach to the data communications architecture of the accelerator. The prototype design effort aims at providing deterministic communication across the accelerator complex with a response time of < 100 ms and total bandwidth of 2 Gbits/sec. It will offer a consistent interface for a large number of equipment types, from vacuum pumps to beam position monitors, providing appropriate communications performance for each equipment type. It will consist of highly parallel links to all equipment: those with computing resources, non-intelligent direct control interfaces, and data concentrators. This system will give each piece of equipment a dedicated link of fixed bandwidth to the control system. Application programs will have access to all accelerator devices which will be memory mapped into a global virtual addressing scheme. Links to devices in the same geographical area will be multiplexed using commercial Time Division Multiplexing equipment. Low-level access will use reflective memory techniques, eliminating processing overhead and complexity of traditional data communication protocols. The use of commercial standards and equipment will enable a high performance system to be built at low cost

  13. A high performance architecture for accelerator controls

    Allen, M.; Hunt, S.M.; Lue, H.; Saltmarsh, C.G.; Parker, C.R.C.B.

    1991-03-01

    The demands placed on the Superconducting Super Collider (SSC) control system due to large distances, high bandwidth and fast response time required for operation will require a fresh approach to the data communications architecture of the accelerator. The prototype design effort aims at providing deterministic communication across the accelerator complex with a response time of <100 ms and total bandwidth of 2 Gbits/sec. It will offer a consistent interface for a large number of equipment types, from vacuum pumps to beam position monitors, providing appropriate communications performance for each equipment type. It will consist of highly parallel links to all equipment: those with computing resources, non-intelligent direct control interfaces, and data concentrators. This system will give each piece of equipment a dedicated link of fixed bandwidth to the control system. Application programs will have access to all accelerator devices which will be memory mapped into a global virtual addressing scheme. Links to devices in the same geographical area will be multiplexed using commercial Time Division Multiplexing equipment. Low-level access will use reflective memory techniques, eliminating processing overhead and complexity of traditional data communication protocols. The use of commercial standards and equipment will enable a high performance system to be built at low cost.

  14. High performance computing in linear control

    Datta, B.N.

    1993-01-01

    Remarkable progress has been made in both theory and applications of all important areas of control. The theory is rich and very sophisticated. Some beautiful applications of control theory are presently being made in aerospace, biomedical engineering, industrial engineering, robotics, economics, power systems, etc. Unfortunately, the same assessment of progress does not hold in general for computations in control theory. Control Theory is lagging behind other areas of science and engineering in this respect. Nowadays there is a revolution going on in the world of high performance scientific computing. Many powerful computers with vector and parallel processing have been built and have been available in recent years. These supercomputers offer very high speed in computations. Highly efficient software, based on powerful algorithms, has been developed to use on these advanced computers, and has also contributed to increased performance. While workers in many areas of science and engineering have taken great advantage of these hardware and software developments, control scientists and engineers, unfortunately, have not been able to take much advantage of these developments

  15. Building Trust in High-Performing Teams

    Aki Soudunsaari

    2012-06-01

    Facilitation of growth is more about good, trustworthy contacts than capital. Trust is a driving force for business creation, and to create a global business you need to build a team that is capable of meeting the challenge. Trust is a key factor in team building and a needed enabler for cooperation. In general, trust building is a slow process, but it can be accelerated with open interaction and good communication skills. The fast-growing and ever-changing nature of global business sets demands for cooperation and team building, especially for startup companies. Trust building needs personal knowledge and regular face-to-face interaction, but it also requires empathy, respect, and genuine listening. Trust increases communication, and rich and open communication is essential for the building of high-performing teams. Other building materials are a shared vision, clear roles and responsibilities, willingness for cooperation, and supporting and encouraging leadership. This study focuses on trust in high-performing teams. It asks whether it is possible to manage trust and which tools and operation models should be used to speed up the building of trust. In this article, preliminary results from the authors’ research are presented to highlight the importance of sharing critical information and having a high level of communication through constant interaction.

  16. Durability and Performance of High Performance Infiltration Cathodes

    Samson, Alfred Junio; Søgaard, Martin; Hjalmarsson, Per

    2013-01-01

    The performance and durability of solid oxide fuel cell (SOFC) cathodes consisting of porous Ce0.9Gd0.1O1.95 (CGO) infiltrated with nitrates corresponding to the nominal compositions La0.6Sr0.4Co1.05O3-δ (LSC), LaCoO3-δ (LC), and Co3O4 are discussed. At 600°C, the polarization resistance, Rp, varied as: LSC (0.062 Ω cm2) ... The performance of each cathode was found to depend on the infiltrate firing temperature and is suggested to originate ... of the infiltrate but also from a better surface exchange property. A 450 h test of an LSC-infiltrated CGO cathode showed an Rp with a final degradation rate of only 11 mΩ cm2 kh^-1. An SOFC with an LSC-infiltrated CGO cathode tested for 1,500 h at 700°C and 0.5 A cm^-2 (60% fuel, 20% air utilization) revealed no measurable...

  17. Improving UV Resistance of High Performance Fibers

    Hassanin, Ahmed

    High performance fibers are characterized by their superior properties compared to traditional textile fibers. High strength fibers have high moduli, high strength-to-weight ratios, high chemical resistance, and usually high temperature resistance. They are used in applications where superior properties are needed, such as bulletproof vests, ropes and cables, cut-resistant products, load tendons for giant scientific balloons, fishing rods, tennis racket strings, parachute cords, adhesives and sealants, protective apparel and tire cords. Unfortunately, ultraviolet (UV) radiation causes serious degradation to most high performance fibers. UV light, either natural or artificial, causes organic compounds to decompose and degrade, because the energy of the photons of UV light is high enough to break chemical bonds, causing chain scission. This work aims at achieving maximum protection of high performance fibers using sheathing approaches. The proposed sheaths are lightweight in order to maintain the key advantage of high performance fibers, their high strength-to-weight ratio. This study involves developing three different types of sheathing. The product of interest that needs to be protected from UV is a braid of PBO. The first approach is extruding a sheath of Low Density Polyethylene (LDPE) loaded with different percentages of rutile TiO2 nanoparticles around the PBO braid. The results of this approach showed that an LDPE sheath loaded with 10% TiO2 by weight achieved the highest protection compared to 0% and 5% TiO2; protection here is judged by the strength loss of the PBO. This trend was observed in different weathering environments, where the sheathed samples were exposed to UV-VIS radiation in different weatherometer equipment as well as to a high-altitude environment using a NASA BRDL balloon. The second approach focuses on developing a protective porous membrane from polyurethane loaded with rutile TiO2 nanoparticles. Membrane from polyurethane loaded with 4

  18. Intel Xeon Phi coprocessor high performance programming

    Jeffers, James

    2013-01-01

    Authors Jim Jeffers and James Reinders spent two years helping educate customers about the prototype and pre-production hardware before Intel introduced the first Intel Xeon Phi coprocessor. They have distilled their own experiences coupled with insights from many expert customers, Intel Field Engineers, Application Engineers and Technical Consulting Engineers, to create this authoritative first book on the essentials of programming for this new architecture and these new products. This book is useful even before you ever touch a system with an Intel Xeon Phi coprocessor. To ensure that your applications run at maximum efficiency, the authors emphasize key techniques for programming any modern parallel computing system whether based on Intel Xeon processors, Intel Xeon Phi coprocessors, or other high performance microprocessors. Applying these techniques will generally increase your program performance on any system, and better prepare you for Intel Xeon Phi coprocessors and the Intel MIC architecture. It off...

  19. Development of high-performance blended cements

    Wu, Zichao

    2000-10-01

    This thesis presents the development of high-performance blended cements from industrial by-products. To overcome the low early strength of blended cements, several chemicals were studied as activators for cement hydration. Sodium sulfate was found to be the best activator. The blending proportions were optimized by Taguchi experimental design. The optimized blended cements containing up to 80% fly ash performed better than Type I cement in strength development and durability. Maintaining a constant cement content, concrete produced from the optimized blended cements had equal or higher strength and higher durability than that produced from Type I cement alone. The key to the activation mechanism was the reaction between added SO4^2- and Ca2+ dissolved from the cement hydration products.

  20. PERFORMANCE MEASURES OF STUDENTS IN EXAMINATIONS: A STOCHASTIC APPROACH

    Goutam Saha

    2013-01-01

    Data on Secondary and Higher Secondary examination (science stream) results from Tripura (North-East India) schools are analyzed to measure the performance of students based on tests; performance measures of schools based on final results and continuous assessment processes are also obtained. The variation in results, in terms of grade points in the Secondary and Higher Secondary examinations, is analysed using different sets of performance measures. The transition probabilities from one g...

  1. Performance and measurements of the AGS and Booster beams

    Weng, W.T.

    1995-01-01

    Analyses of Hot Gas Stream Cleanup (HGSC) ashes and descriptions of filter performance were made to address the problems with filter operation that are apparently linked to the collected ash. This task is designed to generate a database of the key properties of ashes collected from operating advanced particle filters and to relate these ash properties to the operation and performance of these filters. Activities included initial formatting of the database and data entry, modification of the permeability model, and initial design of a high-temperature test device for measuring the uncompacted bulk porosity of ash aggregates (an indicator of the relative cohesivity of the ash and of filter cake porosity/permeability). Chemical analyses of hopper and filter cake ashes from Tidd showed that the degree of consolidation could not be accounted for by condensation/adsorption from the flue gas; the mechanism is likely physical rearrangement of the ash particles

  2. Procedure for Measuring and Reporting Commercial Building Energy Performance

    Barley, D.; Deru, M.; Pless, S.; Torcellini, P.

    2005-10-01

    This procedure is intended to provide a standard method for measuring and characterizing the energy performance of commercial buildings. The procedure determines the energy consumption, electrical energy demand, and on-site energy production in existing commercial buildings of all types. The performance metrics determined here may be compared against benchmarks to evaluate performance and verify that performance targets have been achieved.
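
    A minimal sketch of the kind of whole-building metric such a procedure yields, here a net site energy use intensity; the formula is a generic definition and the function and field names are assumptions rather than the procedure's exact terminology.

        def site_energy_use_intensity(purchased_kwh, onsite_generation_kwh, exported_kwh, floor_area_m2):
            """Net site energy per unit floor area over the reporting period (kWh/m^2)."""
            net_site_energy = purchased_kwh + onsite_generation_kwh - exported_kwh
            return net_site_energy / floor_area_m2

        # Hypothetical office building over one year: roughly 99 kWh/m^2
        print(site_energy_use_intensity(purchased_kwh=450_000, onsite_generation_kwh=60_000,
                                        exported_kwh=15_000, floor_area_m2=5_000))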

  3. The Aviation Performance Measuring System (APMS): An Integrated Suite of Tools for Measuring Performance and Safety

    Statler, Irving C.; Connor, Mary M. (Technical Monitor)

    1998-01-01

    This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS offers the air transport community an open, voluntary standard for flight-data-analysis software; a standard that will help to ensure suitable functionality and data interchangeability among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods, which are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of aircrews in mind. APMS tools must serve the needs of the government and air carriers, as well as aircrews, to fully support the FOQA and AQP programs. They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but also through

  4. Simultaneous measurement of proguanil and its metabolites in human plasma and urine by reversed-phase high-performance liquid chromatography, and its preliminary application in relation to genetically determined S-mephenytoin 4'-hydroxylation status.

    Kusaka, M; Setiabudy, R; Chiba, K; Ishizaki, T

    1996-02-01

    A simple high-performance liquid chromatographic (HPLC) assay method was developed for the measurement of proguanil (PG) and its major metabolites, cycloguanil (CG) and 4-chlorophenyl-biguanide (CPB), in human plasma and urine. The assay allowed the simultaneous determination of all analytes in 1 ml of plasma or 0.1 ml of urine. The detection limits of PG, CG, and CPB, defined as a signal-to-noise ratio of 3, were 1 and 5 ng/ml for plasma and urine samples, respectively. Recoveries of the analytes and the internal standard (pyrimethamine) were > 62% from plasma and > 77% from urine. Intra-assay and interassay coefficients of variation for all analytes in plasma and urine were low, except for CG and CPB, which ranged from 10% to 15% at one or two of the 4-5 concentrations studied. The clinical applicability of the method was assessed in a preliminary pharmacokinetic study of PG, CG, and CPB in six healthy volunteers with individually known phenotypes (extensive and poor metabolizers) of S-mephenytoin 4'-hydroxylation, suggesting that individuals with a poor metabolizer phenotype of S-mephenytoin have a much lower capacity to bioactivate PG to CG compared with extensive metabolizers.
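
    For readers unfamiliar with the figures of merit quoted above, a short sketch of how a coefficient of variation and a signal-to-noise based detection limit are commonly computed; the S/N = 3 criterion mirrors the abstract, but the helper functions and numbers are illustrative only.

        import statistics

        def coefficient_of_variation(replicate_values):
            """Intra-assay CV (%) from replicate measurements at one concentration."""
            return 100.0 * statistics.stdev(replicate_values) / statistics.mean(replicate_values)

        def detection_limit(baseline_noise_sd, calibration_slope, sn_ratio=3.0):
            """Concentration whose signal equals sn_ratio times the baseline noise (S/N = 3 criterion)."""
            return sn_ratio * baseline_noise_sd / calibration_slope

        print(coefficient_of_variation([98.2, 101.5, 99.7, 102.0, 97.9]))     # CV in percent
        print(detection_limit(baseline_noise_sd=0.4, calibration_slope=1.2))  # hypothetical units, e.g. ng/ml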

  5. Transportation performance measures for outcome based system management and monitoring.

    2014-09-01

    The Oregon Department of Transportation (ODOT) is mature in its development and use of performance measures; however, there was not a standard approach for selecting measures nor for evaluating whether existing ones were used to inform decision-making. Thi...

  6. Performance Measures for Public Participation Methods : Final Report

    2018-01-01

    Public engagement is an important part of transportation project development, but measuring its effectiveness is typically piecemeal. Performance measurement, described by the Urban Institute as the measurement on a regular basis of the results (o...

  7. Measuring the marketing performances of state forest enterprises in ...

    Measuring the marketing performances of state forest enterprises in Turkey. This study covers a limited period of time (1999-2003), and 41 variables were developed in order to measure the marketing ...

  8. Utilities for high performance dispersion model PHYSIC

    Yamazawa, Hiromi

    1992-09-01

    The description and usage of the utilities for the dispersion calculation model PHYSIC are summarized. The model was developed in the study of developing a high performance SPEEDI, with the purpose of introducing a meteorological forecast function into the environmental emergency response system. The procedure of a PHYSIC calculation consists of three steps: preparation of relevant files, creation and submission of JCL, and graphic output of results. A user can carry out the above procedure with the help of the Geographical Data Processing Utility, the Model Control Utility, and the Graphic Output Utility. (author)

  9. An integrated high performance fastbus slave interface

    Christiansen, J.; Ljuslin, C.

    1992-01-01

    A high performance Fastbus slave interface ASIC is presented. The Fastbus slave integrated circuit (FASIC) is a programmable device, enabling its direct use in many different applications. The FASIC acts as an interface between Fastbus and a 'standard' processor/memory bus. It can work stand-alone or together with a microprocessor. A set of address mapping windows can map Fastbus addresses to convenient memory addresses and at the same time act as address decoding logic. Data rates of 100 MBytes/s to Fastbus can be obtained using an internal FIFO buffer in the FASIC. (orig.)

  10. Joint Integration Test Facility (JITF) Engineering II Performance Measurement Plans

    Boucher, Joanne

    2001-01-01

    ..., effectiveness, and accountability in federal programs and spending. The plan establishes six separate performance measurements, which correlate directly to customer satisfaction, Intelligence Mission Application (IMA...

  11. ‘New’ Performance Measures: Determinants of Their Use and Their Impact on Performance

    F.H.M. Verbeeten (Frank)

    2005-01-01

    This study investigates the extent to which Dutch organizations use ‘new’ performance measures to deal with the perceived inadequacies of traditional accounting performance measures. In addition, the determinants of the use of these ‘new’ performance measures are documented; finally, the

  12. High performance visual display for HENP detectors

    McGuigan, M; Spiletic, J; Fine, V; Nevski, P

    2001-01-01

    A high end visual display for High Energy Nuclear Physics (HENP) detectors is necessary because of the sheer size and complexity of the detector. For BNL this display will be of special interest because of STAR and ATLAS. To load, rotate, query, and debug simulation code with a modern detector simply takes too long even on a powerful workstation. To visualize the HENP detectors with maximal performance we have developed software with the following characteristics. We develop a visual display of HENP detectors on the BNL multiprocessor visualization server at multiple levels of detail. We work with a general and generic detector framework consistent with ROOT, GAUDI, etc., to avoid conflicting with the many graphic development groups associated with specific detectors like STAR and ATLAS. We develop advanced OpenGL features such as transparency and polarized stereoscopy. We enable collaborative viewing of detector and events by directly running the analysis in the BNL stereoscopic theatre. We construct enhanced interactiv...

  13. High-Performance Vertical Organic Electrochemical Transistors.

    Donahue, Mary J; Williamson, Adam; Strakosas, Xenofon; Friedlein, Jacob T; McLeod, Robert R; Gleskova, Helena; Malliaras, George G

    2018-02-01

    Organic electrochemical transistors (OECTs) are promising transducers for biointerfacing due to their high transconductance, biocompatibility, and availability in a variety of form factors. Most OECTs reported to date, however, utilize rather large channels, limiting the transistor performance and resulting in a low transistor density. This is typically a consequence of limitations associated with traditional fabrication methods and with 2D substrates. Here, the fabrication and characterization of OECTs with vertically stacked contacts, which overcome these limitations, is reported. The resulting vertical transistors exhibit a reduced footprint, increased intrinsic transconductance of up to 57 mS, and a geometry-normalized transconductance of 814 S m⁻¹. The fabrication process is straightforward and compatible with sensitive organic materials, and allows exceptional control over the transistor channel length. This novel 3D fabrication method is particularly suited for applications where high density is needed, such as in implantable devices. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. High Performance Data Distribution for Scientific Community

    Tirado, Juan M.; Higuero, Daniel; Carretero, Jesus

    2010-05-01

    Institutions such as NASA, ESA or JAXA need solutions to distribute data from their missions to the scientific community and to their long-term archives. This is a complex problem, as it involves a vast amount of data, several geographically distributed archives, heterogeneous architectures with heterogeneous networks, and users spread around the world. We propose a novel architecture (HIDDRA) that solves this problem, aiming to reduce user intervention in data acquisition and processing. HIDDRA is a modular system that provides a highly efficient parallel multiprotocol download engine, using a publish/subscribe policy which helps the final user to obtain data of interest transparently. Our system can deal simultaneously with multiple protocols (HTTP, HTTPS, FTP, and GridFTP, among others) to obtain the maximum bandwidth, reducing the workload on the data server and increasing flexibility. It can also provide high reliability and fault tolerance, as several sources of data can be used to perform one file download. The HIDDRA architecture can be arranged into a data distribution network deployed on several sites that can cooperate to provide the aforementioned features. HIDDRA has been addressed by the 2009 e-IRG Report on Data Management as a promising initiative for data interoperability. Our first prototype has been evaluated in collaboration with the ESAC centre in Villafranca del Castillo (Spain) and shows high scalability and performance, opening a wide spectrum of opportunities. Some preliminary results have been published in the Journal of Astrophysics and Space Science [1]. [1] D. Higuero, J.M. Tirado, J. Carretero, F. Félix, and A. de La Fuente. HIDDRA: a highly independent data distribution and retrieval architecture for space observation missions. Astrophysics and Space Science, 321(3):169-175, 2009

  15. High sensitivity optical measurement of skin gloss.

    Ezerskaia, Anna; Ras, Arno; Bloemen, Pascal; Pereira, Silvania F; Urbach, H Paul; Varghese, Babu

    2017-09-01

    We demonstrate a low-cost optical method for measuring gloss properties with improved sensitivity in the low-gloss regime relevant for skin. The gloss estimation is based, on the one hand, on the slope of the intensity gradient in the transition regime between specular and diffuse reflection and, on the other, on the sum of the intensities of pixels above a threshold, both derived from a camera image obtained using unpolarized white-light illumination. We demonstrate the improved sensitivity of the two proposed methods using Monte Carlo simulations and experiments performed on ISO gloss calibration standards with an optical prototype. The performance and linearity of the method were compared with different professional gloss measurement devices based on the ratio of specular to diffuse intensity. We demonstrate the feasibility of in-vivo skin gloss measurements by quantifying the temporal evolution of skin gloss after application of standard paraffin cream bases on skin. The presented method opens new possibilities in the fields of cosmetology and dermatopharmacology for measuring skin gloss, resorption kinetics and the pharmacodynamics of various external agents.
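
    The two estimators described above (the gradient slope in the specular-to-diffuse transition and the summed intensity of above-threshold pixels) can be illustrated with a rough sketch; this is not the authors' implementation, and the threshold and transition fractions below are arbitrary placeholders.

      # Rough illustration (not the authors' implementation) of the two gloss
      # estimators, computed from a grayscale camera image normalised to [0, 1].
      import numpy as np

      def gloss_metrics(image, threshold=0.8, transition=(0.001, 0.01)):
          flat = np.sort(image.ravel())[::-1]              # brightest pixels first
          n = flat.size
          lo, hi = int(transition[0] * n), int(transition[1] * n)
          # Mean intensity drop per pixel rank in the specular-to-diffuse transition.
          slope = (flat[lo] - flat[hi]) / max(hi - lo, 1)
          # Total intensity of pixels above the specular threshold.
          above = float(flat[flat > threshold].sum())
          return slope, above

      # Example on synthetic data with a small artificial specular highlight:
      rng = np.random.default_rng(0)
      img = np.clip(rng.normal(0.3, 0.05, (480, 640)), 0, 1)
      img[200:205, 300:305] = 1.0
      print(gloss_metrics(img))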

  16. High-performance laboratories and cleanrooms; TOPICAL

    Tschudi, William; Sartor, Dale; Mills, Evan; Xu, Tengfang

    2002-01-01

    The California Energy Commission sponsored this roadmap to guide energy efficiency research and deployment for high performance cleanrooms and laboratories. Industries and institutions utilizing these building types (termed high-tech buildings) have played an important part in the vitality of the California economy. This roadmap's key objective is to present a multi-year agenda to prioritize and coordinate research efforts. It also addresses delivery mechanisms to get the research products into the market. Because of the importance to the California economy, it is appropriate and important for California to take the lead in assessing the energy efficiency research needs, opportunities, and priorities for this market. In addition to the importance to California's economy, energy demand for this market segment is large and growing (estimated at 9400 GWh for 1996, Mills et al. 1996). With their 24-hour continuous operation, high-tech facilities are a major contributor to the peak electrical demand. Laboratories and cleanrooms constitute the high-tech building market, and although each building type has its unique features, they are similar in that they are extremely energy intensive, involve special environmental considerations, have very high ventilation requirements, and are subject to regulations, primarily safety driven, that tend to have adverse energy implications. High-tech buildings have largely been overlooked in past energy efficiency research. Many industries and institutions utilize laboratories and cleanrooms. As illustrated, there are many industries operating cleanrooms in California. These include semiconductor manufacturing, semiconductor suppliers, pharmaceutical, biotechnology, disk drive manufacturing, flat panel displays, automotive, aerospace, food, hospitals, medical devices, universities, and federal research facilities

  17. Long duration performance of high temperature irradiation resistant thermocouples

    Rempe, J.; Knudson, D.; Condie, K.; Cole, J.; Wilkins, S.C.

    2007-01-01

    Many advanced nuclear reactor designs require new fuel, cladding, and structural materials. Data are needed to characterize the performance of these new materials in high-temperature radiation conditions. However, traditional methods for measuring temperature in-pile degrade at temperatures above 1100 °C. To address this instrumentation need, the Idaho National Laboratory (INL) developed and evaluated the performance of a high temperature irradiation-resistant thermocouple that contains alloys of molybdenum and niobium. To verify the performance of INL's recommended thermocouple design, a series of high temperature (1200 to 1800 °C), long duration (up to six months) tests has been initiated. This paper summarizes results from the tests that have been completed. Data are presented from 4000 hour tests conducted at 1200 and 1400 °C that demonstrate the stability of this thermocouple (less than 2% drift). In addition, post-test metallographic examinations are discussed which confirm the compatibility of the thermocouple materials throughout these long duration, high temperature tests. (authors)

  18. High-performance computing for airborne applications

    Quinn, Heather M.; Manuzatto, Andrea; Fairbanks, Tom; Dallmann, Nicholas; Desgeorges, Rose

    2010-01-01

    Recently, there have been attempts to move common satellite tasks to unmanned aerial vehicles (UAVs). UAVs are significantly cheaper to buy than satellites and easier to deploy on an as-needed basis. The more benign radiation environment also allows for an aggressive adoption of state-of-the-art commercial computational devices, which increases the amount of data that can be collected. There are a number of commercial computing devices currently available that are well-suited to high-performance computing. These devices range from specialized computational devices, such as field-programmable gate arrays (FPGAs) and digital signal processors (DSPs), to traditional computing platforms, such as microprocessors. Even though the radiation environment is relatively benign, these devices could be susceptible to single-event effects. In this paper, we will present radiation data for high-performance computing devices in an accelerated neutron environment. These devices include a multi-core digital signal processor, two field-programmable gate arrays, and a microprocessor. From these results, we found that all of these devices are suitable for many airplane environments without reliability problems.

  19. High-performance vertical organic transistors.

    Kleemann, Hans; Günther, Alrun A; Leo, Karl; Lüssem, Björn

    2013-11-11

    Vertical organic thin-film transistors (VOTFTs) are promising devices to overcome the transconductance and cut-off frequency restrictions of horizontal organic thin-film transistors. The basic physical mechanisms of VOTFT operation, however, are not well understood and VOTFTs often require complex patterning techniques using self-assembly processes which impedes a future large-area production. In this contribution, high-performance vertical organic transistors comprising pentacene for p-type operation and C60 for n-type operation are presented. The static current-voltage behavior as well as the fundamental scaling laws of such transistors are studied, disclosing a remarkable transistor operation with a behavior limited by injection of charge carriers. The transistors are manufactured by photolithography, in contrast to other VOTFT concepts using self-assembled source electrodes. Fluorinated photoresist and solvent compounds allow for photolithographical patterning directly and strongly onto the organic materials, simplifying the fabrication protocol and making VOTFTs a prospective candidate for future high-performance applications of organic transistors. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Performance of the CMS High Level Trigger

    Perrotta, Andrea

    2015-01-01

    The CMS experiment has been designed with a 2-level trigger system. The first level is implemented using custom-designed electronics. The second level is the so-called High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. For Run II of the Large Hadron Collider, the increases in center-of-mass energy and luminosity will raise the event rate to a level challenging for the HLT algorithms. The increase in the number of interactions per bunch crossing, on average 25 in 2012, and expected to be around 40 in Run II, will be an additional complication. We present here the expected performance of the main triggers that will be used during the 2015 data taking campaign, paying particular attention to the new approaches that have been developed to cope with the challenges of the new run. This includes improvements in HLT electron and photon reconstruction as well as better performing muon triggers. We will also present the performance of the improved trac...

  1. Development of a High Performance Spacer Grid

    Song, Kee Nam; Song, K. N.; Yoon, K. H. (and others)

    2007-03-15

    A spacer grid in an LWR fuel assembly is a key structural component to support fuel rods and to enhance the heat transfer from the fuel rod to the coolant. In this research, the main research items are the development of inherent and high performance spacer grid shapes, the establishment of mechanical/structural analysis and test technology, and the set-up of basic test facilities for the spacer grid. The main research areas and results are as follows. 1. 18 different spacer grid candidates have been invented and applied for domestic and US patents. Among the candidates, 16 have been registered as patents. 2. Two kinds of spacer grids are finally selected for the advanced LWR fuel after detailed performance tests on the candidates and commercial spacer grids from a mechanical/structural point of view. According to the test results, the features of the selected spacer grids are better than those of the commercial spacer grids. 3. Four kinds of basic test facilities are set up and the relevant test technologies are established. 4. Mechanical/structural analysis models and technology for spacer grid performance are developed and the analysis results are compared with the test results to enhance the reliability of the models.

  2. Low cost high performance uncertainty quantification

    Bekas, C.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, which employ matrix factorizations, incur a cubic cost that quickly becomes intractable with the current explosion of data sizes. In this work we reduce this complexity to quadratic with the synergy of two algorithms that gracefully complement each other and lead to a radically different approach. First, we turned to stochastic estimation of the diagonal. This allowed us to cast the problem as a linear system with a relatively small number of multiple right hand sides. Second, for this linear system we developed a novel, mixed precision, iterative refinement scheme, which uses iterative solvers instead of matrix factorizations. We demonstrate that the new framework not only achieves the much needed quadratic cost but in addition offers excellent opportunities for scaling in massively parallel environments. We based our implementation on BLAS 3 kernels that ensure very high processor performance. We achieved a peak performance of 730 TFlops on 72 BG/P racks, with a sustained performance of 73% of theoretical peak. We stress that the techniques presented in this work are quite general and applicable to several other important applications. Copyright © 2009 ACM.
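
    As a minimal sketch of the general idea only (stochastic estimation of the diagonal of an inverse via probe vectors and iterative solves), the following illustrates the estimator diag(A⁻¹) ≈ E[v ⊙ A⁻¹v] with Rademacher probes; it does not reproduce the authors' mixed-precision iterative-refinement scheme or their BLAS 3 / BG/P implementation.

      # Minimal sketch: Hutchinson-style stochastic estimation of diag(A^{-1})
      # using Rademacher probe vectors and an iterative (CG) solver per probe.
      import numpy as np
      from scipy.sparse.linalg import cg

      def estimate_diag_inverse(A, num_probes=100, seed=0):
          n = A.shape[0]
          rng = np.random.default_rng(seed)
          acc = np.zeros(n)
          for _ in range(num_probes):
              v = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe vector
              x, _ = cg(A, v)                       # iterative solve of A x = v
              acc += v * x                          # elementwise v * (A^{-1} v)
          return acc / num_probes

      # Example on a small symmetric positive definite matrix:
      B = np.random.default_rng(1).standard_normal((200, 200))
      A = B @ B.T + 200.0 * np.eye(200)
      err = np.max(np.abs(estimate_diag_inverse(A) - np.diag(np.linalg.inv(A))))
      print(f"max abs error of the diagonal estimate: {err:.3e}")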

  3. Energy Efficient Graphene Based High Performance Capacitors.

    Bae, Joonwon; Kwon, Oh Seok; Lee, Chang-Soo

    2017-07-10

    Graphene (GRP) is an interesting class of nano-structured electronic materials for various cutting-edge applications. To date, extensive research activities have been performed on the investigation of diverse properties of GRP. The incorporation of this elegant material can be very lucrative in terms of practical applications in energy storage/conversion systems. Among those various systems, high performance electrochemical capacitors (ECs) have become popular due to the recent need for energy efficient and portable devices. Therefore, in this article, the application of GRP for capacitors is described succinctly. In particular, a concise summary of previous research activities regarding GRP-based capacitors is provided. It was revealed that many secondary materials such as polymers and metal oxides have been introduced to improve the performance. Also, diverse devices have been combined with capacitors for better use. More importantly, recent patents related to the preparation and application of GRP-based capacitors are also introduced briefly. This article can provide essential information for future study. Copyright© Bentham Science Publishers.

  4. SISYPHUS: A high performance seismic inversion factory

    Gokhberg, Alexey; Simutė, Saulė; Boehm, Christian; Fichtner, Andreas

    2016-04-01

    In the recent years the massively parallel high performance computers became the standard instruments for solving the forward and inverse problems in seismology. The respective software packages dedicated to forward and inverse waveform modelling specially designed for such computers (SPECFEM3D, SES3D) became mature and widely available. These packages achieve significant computational performance and provide researchers with an opportunity to solve problems of bigger size at higher resolution within a shorter time. However, a typical seismic inversion process contains various activities that are beyond the common solver functionality. They include management of information on seismic events and stations, 3D models, observed and synthetic seismograms, pre-processing of the observed signals, computation of misfits and adjoint sources, minimization of misfits, and process workflow management. These activities are time consuming, seldom sufficiently automated, and therefore represent a bottleneck that can substantially offset performance benefits provided by even the most powerful modern supercomputers. Furthermore, a typical system architecture of modern supercomputing platforms is oriented towards the maximum computational performance and provides limited standard facilities for automation of the supporting activities. We present a prototype solution that automates all aspects of the seismic inversion process and is tuned for the modern massively parallel high performance computing systems. We address several major aspects of the solution architecture, which include (1) design of an inversion state database for tracing all relevant aspects of the entire solution process, (2) design of an extensible workflow management framework, (3) integration with wave propagation solvers, (4) integration with optimization packages, (5) computation of misfits and adjoint sources, and (6) process monitoring. The inversion state database represents a hierarchical structure with

  5. Ultra high performance concrete dematerialization study

    NONE

    2004-03-01

    Concrete is the most widely used building material in the world and its use is expected to grow. It is well recognized that the production of portland cement results in the release of large amounts of carbon dioxide, a greenhouse gas (GHG). The main challenge facing the industry is to produce concrete in an environmentally sustainable manner. Reclaimed industrial by-products such as fly ash, silica fume and slag can reduce the amount of portland cement needed to make concrete, thereby reducing the amount of GHGs released to the atmosphere. The use of these supplementary cementing materials (SCM) can also enhance the long-term strength and durability of concrete. The intention of the EcoSmart™ Concrete Project is to develop sustainable concrete through innovation in supply, design and construction. In particular, the project focuses on finding a way to minimize the GHG signature of concrete by maximizing the replacement of portland cement in the concrete mix with SCM while improving the cost, performance and constructability. This paper describes the use of Ductal® Ultra High Performance Concrete (UHPC) for ramps in a condominium. It examined the relationship between the selection of UHPC and the overall environmental performance, cost, constructability, maintenance and operational efficiency as it relates to the EcoSmart Program. The advantages and challenges of using UHPC were outlined. In addition to its very high strength, UHPC has been shown to have very good potential for GHG emission reduction due to the reduced material requirements, reduced transport costs and increased SCM content. refs., tabs., figs.

  6. High Performance Clocks and Gravity Field Determination

    Müller, J.; Dirkx, D.; Kopeikin, S. M.; Lion, G.; Panet, I.; Petit, G.; Visser, P. N. A. M.

    2018-02-01

    Time measured by an ideal clock crucially depends on the gravitational potential and velocity of the clock according to general relativity. Technological advances in manufacturing high-precision atomic clocks have rapidly improved their accuracy and stability over the last decade, approaching the level of 10^{-18}. This notable achievement, along with the direct sensitivity of clocks to the strength of the gravitational field, makes them practically important for various geodetic applications that are addressed in the present paper. Based on a fully relativistic description of the background gravitational physics, we discuss the impact of these highly precise clocks on the realization of reference frames and time scales used in geodesy. We discuss the current definitions of basic geodetic concepts and come to the conclusion that the advances in clocks and other metrological technologies will soon require the re-definition of time scales or, at least, clarification to ensure their continuity and consistent use in practice. The relative frequency shift between two clocks is directly related to the difference in the values of the gravity potential at the clocks' locations. According to general relativity, a relative clock accuracy of 10^{-18} is equivalent to measuring the gravitational redshift between two clocks with a height difference of 1 cm. This makes clocks an indispensable tool in high-precision geodesy in addition to laser ranging and space geodetic techniques. We show how clock measurements can provide geopotential numbers for the realization of gravity-field-related height systems and can resolve discrepancies in classically-determined height systems as well as between national height systems. Another application of clocks is the direct use of observed potential differences for the improved recovery of regional gravity field solutions. Finally, clock measurements for space-borne gravimetry are analyzed along with
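
    The relation behind the 1 cm figure quoted above is the standard gravitational redshift formula (written here for reference, in generic notation not taken from the paper): the fractional frequency shift between two clocks equals the gravity-potential difference divided by c², which for a small height difference Δh near the Earth's surface gives

      \[
        \frac{\Delta \nu}{\nu} \;\approx\; \frac{\Delta W}{c^{2}} \;\approx\; \frac{g\,\Delta h}{c^{2}},
        \qquad
        \frac{(9.81\ \mathrm{m\,s^{-2}})\,(0.01\ \mathrm{m})}{(2.998\times 10^{8}\ \mathrm{m\,s^{-1}})^{2}}
        \;\approx\; 1.1\times 10^{-18},
      \]

    so a clock pair stable at the 10^{-18} level indeed resolves centimetre-level height differences, equivalently about 0.1 m² s⁻² in geopotential.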

  7. High-performance phase-field modeling

    Vignal, Philippe

    2015-04-27

    Many processes in engineering and the sciences involve the evolution of interfaces. Among the mathematical frameworks developed to model these types of problems, the phase-field method has emerged as a possible solution. Phase-fields nonetheless lead to complex, nonlinear, high-order partial differential equations, whose solution poses mathematical and computational challenges. Guaranteeing some of the physical properties of the equations has led to the development of efficient algorithms and discretizations capable of recovering said properties by construction [2, 5]. This work builds on these ideas, and proposes novel discretization strategies that guarantee numerical energy dissipation for both conserved and non-conserved phase-field models. The temporal discretization is based on a novel method which relies on Taylor series and ensures strong energy stability. It is second-order accurate, and can also be rendered linear to speed up the solution process [4]. The spatial discretization relies on Isogeometric Analysis, a finite element method that possesses the k-refinement technology and enables the generation of high-order, high-continuity basis functions. These basis functions are well suited to handle the high-order operators present in phase-field models. Two-dimensional and three-dimensional results for the Allen-Cahn, Cahn-Hilliard, Swift-Hohenberg and phase-field crystal equations will be presented, which corroborate the theoretical findings and illustrate the robustness of the method. Results related to more challenging examples, namely the Navier-Stokes Cahn-Hilliard and a diffusion-reaction Cahn-Hilliard system, will also be presented. The implementation was done in PetIGA and PetIGA-MF, high-performance Isogeometric Analysis frameworks [1, 3], designed to handle non-linear, time-dependent problems.
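
    For reference, the non-conserved and conserved models mentioned above can be written in their standard form (generic notation, not the thesis' specific discretization) as gradient flows of a Ginzburg-Landau free energy; the numerical energy dissipation targeted by the work mirrors the continuous property dF/dt ≤ 0:

      \[
        F[\phi] \;=\; \int_{\Omega} \Big( f(\phi) + \tfrac{\kappa}{2}\,\lvert\nabla\phi\rvert^{2} \Big)\,\mathrm{d}x,
      \]
      \[
        \text{Allen--Cahn (non-conserved):}\quad
        \frac{\partial \phi}{\partial t} \;=\; -\,L\,\frac{\delta F}{\delta \phi},
        \qquad
        \text{Cahn--Hilliard (conserved):}\quad
        \frac{\partial \phi}{\partial t} \;=\; \nabla\cdot\!\Big( M\,\nabla \frac{\delta F}{\delta \phi} \Big),
      \]
      \[
        \frac{\mathrm{d}F}{\mathrm{d}t} \;\le\; 0 \quad \text{for } L,\,M \ge 0 .
      \]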

  8. High-voltage test and measuring techniques

    Hauschild, Wolfgang

    2014-01-01

    It is the intent of this book to combine high-voltage (HV) engineering with HV testing and HV measuring techniques. Based on long-term experience gained by the authors as lecturers and researchers, as well as members of international organizations such as IEC and CIGRE, the book will reflect the state of the art as well as the future trends in testing and diagnostics of HV equipment to ensure a reliable generation, transmission and distribution of electrical energy. The book is intended not only for experts but also for students in electrical engineering and high-voltage engineering.

  9. HIGH PERFORMANCE PHOTOGRAMMETRIC PROCESSING ON COMPUTER CLUSTERS

    V. N. Adrov

    2012-07-01

    Most CPU-consuming tasks in photogrammetric processing can be done in parallel. The algorithms take independent bits as input and produce independent bits as output. The independence of bits comes from the nature of such algorithms, since images, stereopairs or small image blocks can be processed independently. Many photogrammetric algorithms are fully automatic and do not require human interference. Photogrammetric workstations can perform tie-point measurements, DTM calculations, orthophoto construction, mosaicking and many other service operations in parallel using distributed calculations. Distributed calculations save time, reducing several days of calculations to several hours. Modern trends in computer technology show an increase in the number of CPU cores in workstations and in the speed of local networks and, as a result, a drop in the price of supercomputers or computer clusters that can contain hundreds or even thousands of computing nodes. Common distributed processing in a DPW is usually targeted at interactive work with a limited number of CPU cores and is not optimized for centralized administration. The bottleneck of common distributed computing in photogrammetry can be the limited LAN throughput and storage performance, since the processing of huge amounts of large raster images is needed.
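
    The embarrassingly parallel pattern described above can be illustrated with a generic worker-pool sketch; the directory layout, file pattern and per-tile operation below are hypothetical placeholders, not part of any particular photogrammetric package.

      # Generic illustration: independent image tiles processed by a pool of workers,
      # one worker per CPU core by default.
      from multiprocessing import Pool
      from pathlib import Path

      def process_tile(tile_path):
          # Placeholder for a per-tile photogrammetric step
          # (tie-point measurement, DTM patch, orthophoto tile, ...).
          return f"processed {tile_path}"

      if __name__ == "__main__":
          tiles = [str(p) for p in Path("tiles").glob("*.tif")]  # hypothetical layout
          with Pool() as pool:
              for result in pool.imap_unordered(process_tile, tiles):
                  print(result)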

  10. High performance visual display for HENP detectors

    McGuigan, Michael; Smith, Gordon; Spiletic, John; Fine, Valeri; Nevski, Pavel

    2001-01-01

    A high end visual display for High Energy Nuclear Physics (HENP) detectors is necessary because of the sheer size and complexity of the detector. For BNL this display will be of special interest because of STAR and ATLAS. To load, rotate, query, and debug simulation code with a modern detector simply takes too long even on a powerful workstation. To visualize the HENP detectors with maximal performance we have developed software with the following characteristics. We develop a visual display of HENP detectors on the BNL multiprocessor visualization server at multiple levels of detail. We work with a general and generic detector framework consistent with ROOT, GAUDI, etc., to avoid conflicting with the many graphic development groups associated with specific detectors like STAR and ATLAS. We develop advanced OpenGL features such as transparency and polarized stereoscopy. We enable collaborative viewing of detector and events by directly running the analysis in the BNL stereoscopic theatre. We construct enhanced interactive control, including the ability to slice, search and mark areas of the detector. We incorporate the ability to make a high quality still image of a view of the detector and the ability to generate animations and a fly-through of the detector and output these to MPEG or VRML models. We develop data compression hardware and software so that remote interactive visualization will be possible among dispersed collaborators. We obtain real time visual display for events accumulated during simulations

  11. Development of high performance ODS alloys

    Shao, Lin [Texas A & M Univ., College Station, TX (United States); Gao, Fei [Univ. of Michigan, Ann Arbor, MI (United States); Garner, Frank [Texas A & M Univ., College Station, TX (United States)

    2018-01-29

    This project aims to capitalize on insights developed from recent high-dose self-ion irradiation experiments in order to develop and test the next generation of optimized ODS alloys needed to meet the nuclear community's need for high strength, radiation-tolerant cladding and core components, especially with enhanced resistance to void swelling. Two of these insights are that ferrite grains swell earlier than tempered martensite grains, and that oxide dispersions currently produced only in ferrite grains require a high level of uniformity and stability to be successful. An additional insight is that ODS particle stability is dependent on as-yet unidentified compositional combinations of dispersoid and alloy matrix, such that dispersoids are stable in MA957 to doses greater than 200 dpa but dissolve in MA956 at doses less than 200 dpa. These findings focus attention on candidate next-generation alloys which address these concerns. Collaboration with two Japanese groups provides this project with two sets of first-round candidate alloys that have already undergone extensive development and testing for unirradiated properties, but have not yet been evaluated for their irradiation performance. The first set of candidate alloys are dual phase (ferrite + martensite) ODS alloys with oxide particles uniformly distributed in both ferrite and martensite phases. The second set of candidate alloys are ODS alloys containing non-standard dispersoid compositions with controllable oxide particle sizes, phases and interfaces.

  12. Low-Cost High-Performance MRI

    Sarracanie, Mathieu; Lapierre, Cristen D.; Salameh, Najat; Waddington, David E. J.; Witzel, Thomas; Rosen, Matthew S.

    2015-10-01

    Magnetic Resonance Imaging (MRI) is unparalleled in its ability to visualize anatomical structure and function non-invasively with high spatial and temporal resolution. Yet to overcome the low sensitivity inherent in inductive detection of weakly polarized nuclear spins, the vast majority of clinical MRI scanners employ superconducting magnets producing very high magnetic fields. Commonly found at 1.5-3 tesla (T), these powerful magnets are massive and have very strict infrastructure demands that preclude operation in many environments. MRI scanners are costly to purchase, site, and maintain, with the purchase price approaching $1 M per tesla of magnetic field. We present here a remarkably simple, non-cryogenic approach to high-performance human MRI at ultra-low magnetic field, whereby modern under-sampling strategies are combined with fully-refocused dynamic spin control using steady-state free precession techniques. At 6.5 mT (more than 450 times lower than clinical MRI scanners) we demonstrate (2.5 × 3.5 × 8.5) mm³ imaging resolution in the living human brain using a simple, open-geometry electromagnet, with 3D image acquisition over the entire brain in 6 minutes. We contend that these practical ultra-low magnetic field implementations of MRI can set new standards for affordable (<$50,000) and robust portable devices.

  13. Apparatus for accurately measuring high temperatures

    Smith, D.D.

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube, and a chamber for containing a charge of high pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high pressure gas to purge the sight tube of air-borne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  14. Predicting Document Retrieval System Performance: An Expected Precision Measure.

    Losee, Robert M., Jr.

    1987-01-01

    Describes an expected precision (EP) measure designed to predict document retrieval performance. Highlights include decision theoretic models; precision and recall as measures of system performance; EP graphs; relevance feedback; and computing the retrieval status value of a document for two models, the Binary Independent Model and the Two Poisson…

  15. A customer perspective on performance measurement in humanitarian supply chains

    Schiffling, Sarah

    2013-01-01

    The increasing importance of services in SCM leads to a stronger focus on the customer perspective. Donors and beneficiaries are two distinct customer groups of humanitarian supply chains. This paper will analyse how this impacts performance measurement, for example in the commonly used balanced scorecard, which includes a customer perspective. Keywords: Performance measurement, Humanitarian logistics, Customer perspective

  16. Automating Performance Measures and Clinical Practice Guidelines: Differences and Complementarities.

    Tu, Samson W; Martins, Susana; Oshiro, Connie; Yuen, Kaeli; Wang, Dan; Robinson, Amy; Ashcraft, Michael; Heidenreich, Paul A; Goldstein, Mary K

    2016-01-01

    Through close analysis of two pairs of systems that implement the automated evaluation of performance measures (PMs) and guideline-based clinical decision support (CDS), we contrast differences in their knowledge encoding and necessary changes to a CDS system that provides management recommendations for patients failing performance measures. We trace the sources of differences to the implementation environments and goals of PMs and CDS.

  17. Supply chain oriented performance measurement for automotive spare parts

    de Leeuw, S.L.J.M.; Beekman, L.

    2008-01-01

    Literature provides a number of conceptual frameworks and discussions on performance measurement in supply chains. However, most of these frameworks focus on a single link of a supply chain. Furthermore, there is a lack of empirical analysis and case studies on performance metrics and measurements

  18. High current density ion beam measurement techniques

    Ko, W.C.; Sawatzky, E.

    1976-01-01

    High ion beam current measurements are difficult due to the presence of the secondary particles and beam neutralization. For long Faraday cages, true current can be obtained only by negative bias on the target and by summing the cage wall and target currents; otherwise, the beam will be greatly distorted. For short Faraday cages, a combination of small magnetic field and the negative target bias results in correct beam current. Either component alone does not give true current

  19. High Performance Computing in Science and Engineering '15 : Transactions of the High Performance Computing Center

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2015. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  20. High Performance Computing in Science and Engineering '17 : Transactions of the High Performance Computing Center

    Kröner, Dietmar; Resch, Michael; HLRS 2017

    2018-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2017. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  1. High-temperature ultrasonic measurements applied to directly heated samples

    Moore, R.I.; Taylor, R.E.

    1984-01-01

    High-temperature ultrasonic measurements of Young's modulus were made of graphite samples heated directly. The samples were cylindrical rods of the same geometry as that used in the multiproperty apparatus for simultaneous/consecutive measurements of a number of thermophysical properties to high temperatures. The samples were resonated in simple longitudinal vibration modes. Measurements were performed up to 2000 K. Incorporation of ultrasonic measurements of Young's modulus in the capabilities of the multiproperty apparatus is valuable because (i) ultrasonic measurements can be related to normal destructive measurements of this property; (ii) they can be used for screening materials or acceptance testing of specimens; (iii) they can be used to increase the understanding of thermophysical properties and property correlations. (author)

  2. High-accuracy measurements of the normal specular reflectance

    Voarino, Philippe; Piombini, Herve; Sabary, Frederic; Marteau, Daniel; Dubard, Jimmy; Hameury, Jacques; Filtz, Jean Remy

    2008-01-01

    The French Laser Megajoule (LMJ) is designed and constructed by the French Commissariat à l'Énergie Atomique (CEA). Its amplifying section needs highly reflective multilayer mirrors for the flash lamps. To monitor and improve the coating process, the reflectors have to be characterized to high accuracy. The described spectrophotometer is designed to measure normal specular reflectance with high repeatability by using a small spot size of 100 μm. Results are compared with ellipsometric measurements. The instrument can also perform spatial characterization to detect coating nonuniformity

  3. Suns-VOC characteristics of high performance kesterite solar cells

    Gunawan, Oki; Gokmen, Tayfun; Mitzi, David B.

    2014-08-01

    Low open circuit voltage (VOC) has been recognized as the number one problem in the current generation of Cu2ZnSn(Se,S)4 (CZTSSe) solar cells. We report high light intensity and low temperature Suns-VOC measurement in high performance CZTSSe devices. The Suns-VOC curves exhibit bending at high light intensity, which points to several prospective VOC limiting mechanisms that could impact the VOC, even at 1 sun for lower performing samples. These VOC limiting mechanisms include low bulk conductivity (because of low hole density or low mobility), bulk or interface defects, including tail states, and a non-ohmic back contact for low carrier density CZTSSe. The non-ohmic back contact problem can be detected by Suns-VOC measurements with different monochromatic illuminations. These limiting factors may also contribute to an artificially lower JSC-VOC diode ideality factor.

  4. Thermal interface pastes nanostructured for high performance

    Lin, Chuangang

    Thermal interface materials in the form of pastes are needed to improve thermal contacts, such as that between a microprocessor and a heat sink of a computer. High-performance and low-cost thermal pastes have been developed in this dissertation by using polyol esters as the vehicle and various nanoscale solid components. The proportion of a solid component needs to be optimized, as an excessive amount degrades the performance, due to the increase in the bond line thickness. The optimum solid volume fraction tends to be lower when the mating surfaces are smoother, and higher when the thermal conductivity is higher. Both a low bond line thickness and a high thermal conductivity help the performance. When the surfaces are smooth, a low bond line thickness can be even more important than a high thermal conductivity, as shown by the outstanding performance of the nanoclay paste of low thermal conductivity in the smooth case (0.009 μm), with the bond line thickness less than 1 μm, as enabled by low storage modulus G', low loss modulus G" and high tan delta. However, for rough surfaces, the thermal conductivity is important. The rheology affects the bond line thickness, but it does not correlate well with the performance. This study found that the structure of carbon black is an important parameter that governs the effectiveness of a carbon black for use in a thermal paste. By using a carbon black with a lower structure (i.e., a lower DBP value), a thermal paste that is more effective than the previously reported carbon black paste was obtained. Graphite nanoplatelet (GNP) was found to be comparable in effectiveness to carbon black (CB) pastes for rough surfaces, but it is less effective for smooth surfaces. At the same filler volume fraction, GNP gives higher thermal conductivity than carbon black paste. At the same pressure, GNP gives higher bond line thickness than CB (Tokai or Cabot). The effectiveness of GNP is limited, due to the high bond line thickness. A

  5. Combining high productivity with high performance on commodity hardware

    Skovhede, Kenneth

    ...-like compiler for translating CIL bytecode on the CELL-BE. I then introduce a bytecode converter that transforms simple loops in Java bytecode to GPGPU-capable code. I then introduce the numeric library for the Common Intermediate Language, NumCIL. I can then utilize the vector programming model from NumCIL and map it to the Bohrium framework. The result is a complete system that gives the user a choice of high-level languages with no explicit parallelism, yet seamlessly performs efficient execution on a number of hardware setups.

  6. The relationships between common measures of glucose meter performance.

    Wilmoth, Daniel R

    2012-09-01

    Glucose meter performance is commonly measured in several different ways, including the relative bias and coefficient of variation (CV), the total error, the mean absolute relative deviation (MARD), and the size of the interval around the reference value that would be necessary to contain a meter measurement at a specified probability. This fourth measure is commonly expressed as a proportion of the reference value and will be referred to as the necessary relative deviation. A deeper understanding of the relationships between these measures may aid health care providers, patients, and regulators in comparing meter performances when different measures are used. The relationships between common measures of glucose meter performance were derived mathematically. Equations are presented for calculating the total error, MARD, and necessary relative deviation using the reference value, relative bias, and CV when glucose meter measurements are normally distributed. When measurements are also unbiased, the CV, total error, MARD, and necessary relative deviation are linearly related and are therefore equivalent measures of meter performance. The relative bias and CV provide more information about meter performance than the other measures considered but may be difficult for some audiences to interpret. Reporting meter performance in multiple ways may facilitate the informed selection of blood glucose meters. © 2012 Diabetes Technology Society.
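
    As a purely illustrative sketch, the quantities discussed above can be computed from a relative bias and CV under the normality assumption mentioned in the abstract; the specific conventions below (treating the CV as the standard deviation of the relative deviation, and using a 1.96 coverage factor for total error) are assumptions of this sketch, not necessarily the paper's equations.

      # Illustrative only: total error, MARD and necessary relative deviation from
      # relative bias and CV, assuming D = (meter - reference)/reference ~ N(bias, cv**2).
      from math import erf, exp, pi, sqrt

      def norm_cdf(x):
          return 0.5 * (1.0 + erf(x / sqrt(2.0)))

      def meter_measures(bias, cv, coverage=0.95):
          total_error = abs(bias) + 1.96 * cv                 # one common 95% convention
          # MARD = E|D|, the mean of a folded normal distribution
          mard = cv * sqrt(2.0 / pi) * exp(-bias**2 / (2 * cv**2)) \
                 + bias * (2 * norm_cdf(bias / cv) - 1.0)
          # Necessary relative deviation: smallest d with P(|D| <= d) >= coverage
          lo, hi = 0.0, 10.0 * (abs(bias) + cv)
          for _ in range(100):                                # simple bisection
              mid = 0.5 * (lo + hi)
              p = norm_cdf((mid - bias) / cv) - norm_cdf((-mid - bias) / cv)
              lo, hi = (mid, hi) if p < coverage else (lo, mid)
          return total_error, mard, hi

      print(meter_measures(bias=0.02, cv=0.05))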

  7. MEASURING PERFORMANCE IN ORGANIZATIONS FROM MULTI-DIMENSIONAL PERSPECTIVE

    ȘTEFĂNESCU CRISTIAN

    2017-08-01

    In the present turbulent financial and economic conditions, a major challenge for the general management of organizations, and in particular for strategic human resources management, is to establish a clear, coherent and consistent framework for measuring organizational performance and economic efficiency. This paper aims to conduct an exploratory research of the literature concerning the measurement of organizational performance. Based on the results of this research, the paper proposes a multi-dimensional model for measuring organizational performance, providing a mechanism that will allow the quantification of performance based on selected criteria. The model will attempt to eliminate inconsistencies and incongruities between organizational effectiveness models developed by specialists in the organization theory area, performance measurement models developed by specialists in the accounting management area, and models of measuring efficiency and effectiveness developed by specialists in the strategic management and entrepreneurship areas.

  8. Integrating advanced facades into high performance buildings

    Selkowitz, Stephen E.

    2001-01-01

    Glass is a remarkable material but its functionality is significantly enhanced when it is processed or altered to provide added intrinsic capabilities. The overall performance of glass elements in a building can be further enhanced when they are designed to be part of a complete facade system. Finally, the facade system delivers the greatest performance to the building owner and occupants when it becomes an essential element of a fully integrated building design. This presentation examines the growing interest in incorporating advanced glazing elements into more comprehensive facade and building systems in a manner that increases comfort, productivity and amenity for occupants, reduces operating costs for building owners, and contributes to improving the health of the planet by reducing overall energy use and negative environmental impacts. We explore the role of glazing systems in dynamic and responsive facades that provide the following functionality: Enhanced sun protection and cooling load control while improving thermal comfort and providing most of the light needed with daylighting; Enhanced air quality and reduced cooling loads using natural ventilation schemes employing the facade as an active air control element; Reduced operating costs by minimizing lighting, cooling and heating energy use by optimizing the daylighting-thermal tradeoffs; Net positive contributions to the energy balance of the building using integrated photovoltaic systems; Improved indoor environments leading to enhanced occupant health, comfort and performance. In addressing these issues, facade system solutions must, of course, respect the constraints of latitude, location, solar orientation, acoustics, earthquake and fire safety, etc. Since climate and occupant needs are dynamic variables, in a high performance building the facade solution must have the capacity to respond and adapt to these variable exterior conditions and to changing occupant needs. This responsive performance capability

  9. Measures of Strategic Alliance Performance, Classified and Assessed

    Christoffersen, Jeppe; Plenborg, Thomas; Robson, Matthew J.

    2014-01-01

    Over the last three decades, strategic alliance performance has been an important research topic within the international business and management fields. Researchers have investigated a number of factors explaining performance but often find diverging results. Scholars have suggested that one reason may be that different performance measures are used as the dependent variable. But which differences exist and how can they matter? Against this backdrop, the present study makes three main contributions. First, we identify dimensions that illustrate differences and similarities between performance measures and provide a simple yet comprehensive classification of the different performance measures used in 167 empirical studies in the literature. Second, we suggest how differences in performance measures may influence construct validity under different circumstances. Third, we show ...

  10. The need for high performance breeder reactors

    Vaughan, R.D.; Chermanne, J.

    1977-01-01

    It can be easily demonstrated, on the basis of realistic estimates of continued high oil costs, that an increasing portion of the growth in energy demand must be supplied by nuclear power and that it might account for 20% of all energy production by the end of the century. Such assumptions lead very quickly to the conclusion that the discovery, extraction and processing of uranium will not be able to keep pace with demand; the bottleneck will essentially be related to the rate at which the ore can be discovered and extracted, and not to the existing quantities nor their grade. Figures as high as 150,000 T/annum and more would be quickly reached, and it is already necessary to ask whether enough capital can be attracted to meet these requirements. There is only one solution to this problem: improve the conversion ratio of the nuclear system and quickly reach breeding; this would reduce natural uranium consumption by a factor of about 50. However, this condition is not sufficient; the commercial breeder must have a breeding gain as high as possible, because the Pu out-of-pile time and the Pu losses in the cycle could lead to an unacceptable doubling time for the system if the breeding gain is too low. That is the reason why it is vital to develop high performance breeder reactors. The present paper indicates how the Gas-cooled Breeder Reactor [GBR] can address the problems mentioned above, on the basis of recent and realistic studies. It briefly describes the present status of GBR development, starting from the predecessors in the gas cooled reactor line, particularly the AGR. It shows how GBR fuel profits largely from the LMFBR fuel irradiation experience. It compares the GBR performance on a consistent basis with that of the LMFBR. The GBR capital and fuel cycle costs are compared with those of thermal and fast reactors respectively. The conclusion is, based on a cost-benefit study, that the GBR must be quickly developed in order

  11. High performance nano-composite technology development

    Kim, Whung Whoe; Rhee, C. K.; Kim, S. J.; Park, S. D. [KAERI, Taejon (Korea, Republic of); Kim, E. K.; Jung, S. Y.; Ryu, H. J. [KRICT, Taejon (Korea, Republic of); Hwang, S. S.; Kim, J. K.; Hong, S. M. [KIST, Taejon (Korea, Republic of); Chea, Y. B. [KIGAM, Taejon (Korea, Republic of); Choi, C. H.; Kim, S. D. [ATS, Taejon (Korea, Republic of); Cho, B. G.; Lee, S. H. [HGREC, Taejon (Korea, Republic of)

    1999-06-15

    The trend in new material development is toward not only high performance but also environmental friendliness. Nano-composite materials in particular, which enhance the functional properties of components and extend component life, thereby reducing waste and environmental contamination, have a great effect on various industrial areas. The applications of nano-composites, depending on the polymer matrix and filler materials, range from the semiconductor to the medical field. In spite of these merits, nano-composite studies are confined to a few special materials at the laboratory scale, because several technical difficulties remain unresolved. Therefore, the purpose of this study is to establish the systematic planning needed to carry out next-generation projects, in order to compete with other countries and overcome the protective policies of advanced countries, by grasping overseas development trends and our present status. (author).

  12. How to create high-performing teams.

    Lam, Samuel M

    2010-02-01

    This article is intended to discuss inspirational aspects on how to lead a high-performance team. Cogent topics discussed include how to hire staff through methods of "topgrading" with reference to Geoff Smart and "getting the right people on the bus" referencing Jim Collins' work. In addition, once the staff is hired, this article covers how to separate the "eagles from the ducks" and how to inspire one's staff by creating the right culture with suggestions for further reading by Don Miguel Ruiz (The four agreements) and John Maxwell (21 Irrefutable laws of leadership). In addition, Simon Sinek's concept of "Start with Why" is elaborated to help a leader know what the core element should be with any superior culture. Thieme Medical Publishers.

  13. High performance nano-composite technology development

    Kim, Whung Whoe; Rhee, C. K.; Kim, S. J.; Park, S. D. [KAERI, Taejon (Korea, Republic of); Kim, E. K.; Jung, S. Y.; Ryu, H. J. [KRICT, Taejon (Korea, Republic of); Hwang, S. S.; Kim, J. K.; Hong, S. M. [KIST, Taejon (Korea, Republic of); Chea, Y. B. [KIGAM, Taejon (Korea, Republic of); Choi, C. H.; Kim, S. D. [ATS, Taejon (Korea, Republic of); Cho, B. G.; Lee, S. H. [HGREC, Taejon (Korea, Republic of)

    1999-06-15

    The trend in new material development is toward not only high performance but also environmental friendliness. Nano-composite materials in particular, which enhance the functional properties of components and extend component life, thereby reducing waste and environmental contamination, have a great effect on various industrial areas. The applications of nano-composites, depending on the polymer matrix and filler materials, range from the semiconductor to the medical field. In spite of these merits, nano-composite studies are confined to a few special materials at the laboratory scale, because several technical difficulties remain unresolved. Therefore, the purpose of this study is to establish the systematic planning needed to carry out next-generation projects, in order to compete with other countries and overcome the protective policies of advanced countries, by grasping overseas development trends and our present status. (author).

  14. High performance nano-composite technology development

    Kim, Whung Whoe; Rhee, C. K.; Kim, S. J.; Park, S. D.; Kim, E. K.; Jung, S. Y.; Ryu, H. J.; Hwang, S. S.; Kim, J. K.; Hong, S. M.; Chea, Y. B.; Choi, C. H.; Kim, S. D.; Cho, B. G.; Lee, S. H.

    1999-06-01

    The trend in new material development is toward not only high performance but also environmental friendliness. Nano composite materials in particular, which enhance the functional properties of components and extend component life, thereby reducing waste and environmental contamination, have a great effect on various industrial areas. Depending on the polymer matrix and filler materials, nano composites find applications ranging from semiconductors to the medical field. In spite of these merits, nano composite research remains confined to a few special materials at laboratory scale because several technical difficulties are still unresolved. Therefore, the purpose of this study is to establish systematic planning for carrying out next-generation projects, in order to compete with other countries and overcome the protective policies of advanced countries, by grasping overseas development trends and our present status. (author).

  15. High Performance with Prescriptive Optimization and Debugging

    Jensen, Nicklas Bo

    parallelization and automatic vectorization is attractive as it transparently optimizes programs. The thesis contributes an improved dependence analysis for explicitly parallel programs. These improvements lead to more loops being vectorized, on average we achieve a speedup of 1.46 over the existing dependence...... analysis and vectorizer in GCC. Automatic optimizations often fail for theoretical and practical reasons. When they fail we argue that a hybrid approach can be effective. Using compiler feedback, we propose to use the programmer’s intuition and insight to achieve high performance. Compiler feedback...... enlightens the programmer why a given optimization was not applied, and suggest how to change the source code to make it more amenable to optimizations. We show how this can yield significant speedups and achieve 2.4 faster execution on a real industrial use case. To aid in parallel debugging we propose...

  16. Optimizing High Performance Self Compacting Concrete

    Raymond A Yonathan

    2017-01-01

    This paper's objectives are to study the effects of glass powder, silica fume, polycarboxylate ether, and gravel, and to optimize the proportion of each factor in making high-performance SCC. The Taguchi method is proposed as the best way to limit the number of specimens, which would otherwise exceed 80 variations. Taguchi data analysis is applied to determine the composition, the optimum levels, and the contribution of each material for the nine specimen variations. The concrete's workability was analyzed using the slump flow, V-funnel, and L-box tests, and compressive and porosity tests were performed in the hardened state. Cylindrical specimens of 100×200 mm were cast for compressive testing at ages of 3, 7, 14, 21, and 28 days; the porosity test was conducted at 28 days. The results reveal that silica fume contributes most strongly to slump flow and porosity, while coarse aggregate is the greatest contributing factor for the L-box and compressive tests. No factor shows a clear effect on the V-funnel test.
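
    The Taguchi analysis summarized above ranks each factor's contribution through signal-to-noise (S/N) ratios. As a minimal sketch only, the snippet below computes the "larger-is-better" S/N ratio that would typically be used for compressive strength; the trial names and strength values are hypothetical, not data from the paper.

```python
import math

def sn_larger_is_better(values):
    """Taguchi 'larger-is-better' signal-to-noise ratio in dB:
    S/N = -10 * log10( (1/n) * sum(1 / y_i^2) )."""
    n = len(values)
    return -10.0 * math.log10(sum(1.0 / (y ** 2) for y in values) / n)

# Hypothetical 28-day compressive strengths (MPa) for three of the nine mixes
trials = {
    "mix_1": [62.1, 60.8, 63.4],
    "mix_2": [71.5, 70.2, 72.0],
    "mix_3": [66.9, 68.1, 67.3],
}

for name, strengths in trials.items():
    print(f"{name}: S/N = {sn_larger_is_better(strengths):.2f} dB")
```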

  17. Performance Measurement without Benchmarks: An Examination of Mutual Fund Returns.

    Grinblatt, Mark; Titman, Sheridan

    1993-01-01

    This article introduces a new measure of portfolio performance and applies it to study the performance of a large sample of mutual funds. In contrast to previous studies of mutual fund performance, the measure used in this study employs portfolio holdings and does not require the use of a benchmark portfolio. It finds that the portfolio choices of mutual fund managers, particularly those that managed aggressive growth funds, earned significantly positive risk-adjusted returns in the 1976-85 p...

  18. An Examination of Organizational Performance Measurement System Utilization

    DeBusk, Gerald Kenneth

    2003-01-01

    This dissertation provides results of three studies, which examine the utilization of organizational performance measurement systems. Evidence gathered in the first study provides insight into the number of perspectives or components found in the evaluation of an organization's performance and the relative weight placed on those components. The evidence suggests that the number of performance measurement components and their relative composition is situational. Components depend heavily on th...

  19. A framework to improve performance measurement in engineering projects

    Zheng , Li; Baron , Claude; Esteban , Philippe; Xue , Rui; Zhang , Qiang

    2017-01-01

    International audience; A wide range of methods and good practices have been developed for the measurement of projects performance. They help project managers to effectively monitor the project progress and evaluate results. However, from a literature review, we noticed several remaining critical issues in measuring projects performance, such as an unbalanced development of Key Performance Indicators types between lagging and leading indicators. On the other hand, systems engineering measurem...

  20. Performance Measurement, Expectancy and Agency Theory: An Experimental Study

    Randolph Sloof; Mirjam van Praag

    2007-01-01

    Theoretical analyses of (optimal) performance measures are typically performed within the realm of the linear agency model. An important implication of this model is that, for a given compensation scheme, the agent's optimal effort choice is unrelated to the amount of noise in the performance measure. In contrast, expectancy theory as developed by psychologists predicts that effort levels are increasing in the signal-to-noise ratio. We conduct a real effort laboratory experiment to assess the...

  1. Thermal performance measurements on ATLAS-SCT KB forward modules

    Donegà, M; D'Onofrio, M; Ferrère, D; Hirt, C; Ikegami, Y; Kohriki, T; Kondo, T; Lindsay, S; Mangin-Brinet, M; Niinikoski, T O; Pernegger, H; Perrin, E; Taylor, G; Terada, S; Unno, Y; Wallny, R; Weber, M

    2003-01-01

    The thermal design of the KB module is presented. A Finite Elements Analysis (FEA) has been used to finalize the module design. The thermal performance of an outer irradiated KB module has been measured at different cooling conditions. The thermal runaway of the module has been measured. The FEA model has been compared with the measurements and has been used to predict the thermal performance in a realistic SCT scenario.
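
    Thermal runaway in a detector module of this kind is usually understood as the point where self-heating from leakage current, which rises steeply with temperature, outruns the cooling path. The fixed-point sketch below illustrates that idea only; the thermal resistance, power figures, and the rule of thumb that leakage power doubles every ~7 °C are generic assumptions, not the KB-module FEA values.

```python
def equilibrium_temperature(t_coolant_c, p_other_w, p_leak_ref_w, t_ref_c,
                            r_thermal_c_per_w, doubling_c=7.0, steps=200):
    """Iterate T = T_coolant + R_th * P(T) until it settles or runs away.
    Leakage power is assumed to double every `doubling_c` degrees, a common
    rule of thumb for irradiated silicon; all numbers here are illustrative."""
    t = t_coolant_c
    for _ in range(steps):
        p_leak = p_leak_ref_w * 2.0 ** ((t - t_ref_c) / doubling_c)
        t_new = t_coolant_c + r_thermal_c_per_w * (p_other_w + p_leak)
        if t_new - t > 50.0:          # temperature keeps climbing: runaway
            return None
        if abs(t_new - t) < 1e-3:     # converged to a stable operating point
            return t_new
        t = t_new
    return None

t_eq = equilibrium_temperature(t_coolant_c=-15.0, p_other_w=5.0,
                               p_leak_ref_w=0.5, t_ref_c=-10.0,
                               r_thermal_c_per_w=2.0)
print("thermal runaway" if t_eq is None else f"equilibrium at about {t_eq:.1f} C")
```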

  2. Balanced Scorecard Based Performance Measurement & Strategic Management System

    Permatasari, Paulina

    2006-01-01

    Developing strategy and measuring performance are integral parts of a management control system. Making strategic decisions about planning and control requires information about how different subunits in the organization perform. To be effective, performance measurement, both financial and non-financial, must motivate managers and employees at different levels to work toward goal accomplishment and the organization's strategy. An organization's measurement system strongly affects the behavior of people b...

  3. Performance measurement in the context of CREM and FM

    Riratanaphong, C.; van der Voordt, Theo; Sarasoja, AL; Jensen, PA; van der Voordt, DJM; Coenen, C

    2012-01-01

    Purpose: To discuss trends in organisational performance measurement, to identify and discuss widely used performance criteria and key performance indicators (KPIs) in general and in the fields of Facility Management (FM) and Corporate Real Estate Management (CREM), and to identify how a more

  4. Designing a performance measurement system: A case study

    Lohman, Clemens; Fortuin, Leonard; Wouters, Marc

    2004-01-01

    Performance measurement (PM) by means of local performance indicators (PIs) is developing into performance management at a company-wide scale. But how should PIs at various levels in the organization be incorporated into one system that can help managers, working at levels that range from

  5. Performance measurement, expectancy and agency theory: An experimental study

    Sloof, R.; van Praag, C.M.

    2005-01-01

    Theoretical analyses of (optimal) performance measures are typically performed within the realm of the linear agency model. An important implication of this model is that, for a given compensation scheme, the agent's optimal effort choice is unrelated to the amount of noise in the performance

  6. Designing a performance measurement system : a case study

    Lohman, C.T.M.; Fortuin, L.; Wouters, M.J.F.

    2004-01-01

    Performance measurement (PM) by means of local performance indicators (PIs) is developing into performance management at a company-wide scale. But how should PIs at various levels in the organization be incorporated into one system that can help managers, working at levels that range from

  7. Biometric measurements in highly myopic eyes.

    Shen, Peiyang; Zheng, Yingfeng; Ding, Xiaohu; Liu, Bin; Congdon, Nathan; Morgan, Ian; He, Mingguang

    2013-02-01

    To assess the repeatability and accuracy of optical biometry (Lenstar LS900 optical low-coherence reflectometry [OLCR] and IOLMaster partial coherence interferometry [PCI]) and applanation ultrasound biometry in highly myopic eyes. Division of Preventive Ophthalmology, Zhongshan Ophthalmic Center, Guangzhou, China. Comparative evaluation of diagnostic technology. Biometric measurements were taken in highly myopic subjects with a spherical equivalent (SE) of -6.00 diopters (D) or higher and an axial length (AL) longer than 25.0 mm. Measurements of AL and anterior chamber depth (ACD) obtained by OLCR were compared with those obtained by PCI and applanation A-scan ultrasound. Right eyes were analyzed. Repeatability was evaluated using the coefficient of variation (CoV) and agreement, using Bland-Altman analyses. The mean SE was -11.20 D ± 4.65 (SD). The CoVs for repeated AL measurements using OLCR, PCI, and applanation ultrasound were 0.06%, 0.07%, and 0.20%, respectively. The limits of agreement (LoA) for AL were 0.11 mm between OLCR and PCI, 1.01 mm between OLCR and applanation ultrasound, and 1.03 mm between PCI and ultrasound. The ACD values were 0.29 mm, 0.53 mm, and 0.51 mm, respectively. These repeatability and agreement results were comparable in eyes with extreme myopia (AL ≥ 27.0 mm) or posterior staphyloma. The mean radius of corneal curvature was similar between OLCR and PCI (7.66 ± 0.24 mm versus 7.64 ± 0.25 mm), with an LoA of 0.12 mm. Optical biometry provided more repeatable and precise measurements of biometric parameters, including AL and ACD, than applanation ultrasound biometry in highly myopic eyes. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2012 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
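
    The repeatability and agreement statistics quoted above (coefficient of variation and Bland-Altman limits of agreement) follow from standard formulas. The sketch below shows those two calculations with made-up axial-length readings, not the study's data.

```python
import statistics

def coefficient_of_variation(readings):
    """Repeatability as CoV (%) = standard deviation / mean * 100."""
    return statistics.stdev(readings) / statistics.mean(readings) * 100.0

def bland_altman_loa(method_a, method_b):
    """Mean difference (bias) and 95% limits of agreement (bias +/- 1.96 SD)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical repeated axial-length readings (mm) for one eye
repeats = [27.12, 27.10, 27.15]
print(f"CoV = {coefficient_of_variation(repeats):.3f} %")

# Hypothetical paired axial-length measurements (mm) from two devices
olcr = [26.90, 27.45, 28.10, 25.95, 29.30]
pci = [26.93, 27.41, 28.15, 25.98, 29.27]
bias, loa = bland_altman_loa(olcr, pci)
print(f"bias = {bias:.3f} mm, 95% LoA = ({loa[0]:.3f}, {loa[1]:.3f}) mm")
```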

  8. Model Engine Performance Measurement From Force Balance Instrumentation

    Jeracki, Robert J.

    1998-01-01

    A large scale model representative of a low-noise, high bypass ratio turbofan engine was tested for acoustics and performance in the NASA Lewis 9- by 15-Foot Low-Speed Wind Tunnel. This test was part of NASA's continuing Advanced Subsonic Technology Noise Reduction Program. The low tip speed fan, nacelle, and an un-powered core passage (with core inlet guide vanes) were simulated. The fan blades and hub are mounted on a rotating thrust and torque balance. The nacelle, bypass duct stators, and core passage are attached to a six component force balance. The two balance forces, when corrected for internal pressure tares, measure the total thrust-minus-drag of the engine simulator. Corrected for scaling and other effects, it is basically the same force that the engine supports would feel, operating at similar conditions. A control volume is shown and discussed, identifying the various force components of the engine simulator thrust and definitions of net thrust. Several wind tunnel runs with nearly the same hardware installed are compared, to identify the repeatability of the measured thrust-minus-drag. Other wind tunnel runs, with hardware changes that affected fan performance, are compared to the baseline configuration, and the thrust and torque effects are shown. Finally, a thrust comparison between the force balance and nozzle gross thrust methods is shown, and both yield very similar results.
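
    As a rough illustration of the bookkeeping described above, the sketch below corrects a raw balance reading for an internal pressure tare to obtain thrust-minus-drag. The force values and the single-term tare model are hypothetical simplifications, not the wind tunnel's actual correction procedure.

```python
def thrust_minus_drag(balance_force_n, internal_pressure_pa,
                      reference_pressure_pa, tare_area_m2):
    """Correct a measured balance force for an internal pressure tare.
    The single delta-p * area term is a simplified, assumed tare model."""
    pressure_tare_n = (internal_pressure_pa - reference_pressure_pa) * tare_area_m2
    return balance_force_n - pressure_tare_n

# Hypothetical numbers, for illustration only
print(thrust_minus_drag(balance_force_n=5200.0,
                        internal_pressure_pa=101800.0,
                        reference_pressure_pa=101325.0,
                        tare_area_m2=0.35))
```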

  9. Performance-Based Measurement: Action for Organizations and HPT Accountability

    Larbi-Apau, Josephine A.; Moseley, James L.

    2010-01-01

    Basic measurements and applications of six selected general but critical operational performance-based indicators--effectiveness, efficiency, productivity, profitability, return on investment, and benefit-cost ratio--are presented. With each measurement, goals and potential impact are explored. Errors, risks, limitations to measurements, and a…
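
    Two of the indicators listed above, return on investment and benefit-cost ratio, reduce to simple arithmetic. A minimal sketch with hypothetical programme figures:

```python
def return_on_investment(net_benefit, investment):
    """ROI expressed as a percentage of the investment."""
    return net_benefit / investment * 100.0

def benefit_cost_ratio(total_benefits, total_costs):
    """BCR greater than 1 means benefits exceed costs."""
    return total_benefits / total_costs

# Hypothetical figures for a performance-improvement programme
investment = 50_000.0
benefits = 80_000.0
print(f"ROI = {return_on_investment(benefits - investment, investment):.1f} %")
print(f"BCR = {benefit_cost_ratio(benefits, investment):.2f}")
```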

  10. High Performance Circularly Polarized Microstrip Antenna

    Bondyopadhyay, Probir K. (Inventor)

    1997-01-01

    A microstrip antenna for radiating circularly polarized electromagnetic waves comprising a cluster array of at least four microstrip radiator elements, each of which is provided with dual orthogonal coplanar feeds in phase quadrature relation achieved by connection to an asymmetric T-junction power divider impedance notched at resonance. The dual fed circularly polarized reference element is positioned with its axis at a 45 deg angle with respect to the unit cell axis. The other three dual fed elements in the unit cell are positioned and fed with a coplanar feed structure with sequential rotation and phasing to enhance the axial ratio and impedance matching performance over a wide bandwidth. The centers of the radiator elements are disposed at the corners of a square with each side of a length d in the range of 0.7 to 0.9 times the free space wavelength of the antenna radiation and the radiator elements reside in a square unit cell area of sides equal to 2d and thereby permit the array to be used as a phased array antenna for electronic scanning and is realizable in a high temperature superconducting thin film material for high efficiency.

  11. NCI's Transdisciplinary High Performance Scientific Data Platform

    Evans, Ben; Antony, Joseph; Bastrakova, Irina; Car, Nicholas; Cox, Simon; Druken, Kelsey; Evans, Bradley; Fraser, Ryan; Ip, Alex; Kemp, Carina; King, Edward; Minchin, Stuart; Larraondo, Pablo; Pugh, Tim; Richards, Clare; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2016-04-01

    The Australian National Computational Infrastructure (NCI) manages Earth Systems data collections sourced from several domains and organisations onto a single High Performance Data (HPD) Node to further Australia's national priority research and innovation agenda. The NCI HPD Node has rapidly established its value, currently managing over 10 PBytes of datasets from collections that span a wide range of disciplines including climate, weather, environment, geoscience, geophysics, water resources and social sciences. Importantly, in order to facilitate broad user uptake, maximise reuse and enable transdisciplinary access through software and standardised interfaces, the datasets, associated information systems and processes have been incorporated into the design and operation of a unified platform that NCI has called, the National Environmental Research Data Interoperability Platform (NERDIP). The key goal of the NERDIP is to regularise data access so that it is easily discoverable, interoperable for different domains and enabled for high performance methods. It adopts and implements international standards and data conventions, and promotes scientific integrity within a high performance computing and data analysis environment. NCI has established a rich and flexible computing environment to access to this data, through the NCI supercomputer; a private cloud that supports both domain focused virtual laboratories and in-common interactive analysis interfaces; as well as remotely through scalable data services. Data collections of this importance must be managed with careful consideration of both their current use and the needs of the end-communities, as well as its future potential use, such as transitioning to more advanced software and improved methods. It is therefore critical that the data platform is both well-managed and trusted for stable production use (including transparency and reproducibility), agile enough to incorporate new technological advances and

  12. High-performance silicon nanowire bipolar phototransistors

    Tan, Siew Li; Zhao, Xingyan; Chen, Kaixiang; Crozier, Kenneth B.; Dan, Yaping

    2016-07-01

    Silicon nanowires (SiNWs) have emerged as sensitive absorbing materials for photodetection at wavelengths ranging from ultraviolet (UV) to the near infrared. Most of the reports on SiNW photodetectors are based on photoconductor, photodiode, or field-effect transistor device structures. These SiNW devices each have their own advantages and trade-offs in optical gain, response time, operating voltage, and dark current noise. Here, we report on the experimental realization of single SiNW bipolar phototransistors on silicon-on-insulator substrates. Our SiNW devices are based on bipolar transistor structures with an optically injected base region and are fabricated using CMOS-compatible processes. The experimentally measured optoelectronic characteristics of the SiNW phototransistors are in good agreement with simulation results. The SiNW phototransistors exhibit significantly enhanced response to UV and visible light, compared with typical Si p-i-n photodiodes. The near infrared responsivities of the SiNW phototransistors are comparable to those of Si avalanche photodiodes but are achieved at much lower operating voltages. Compared with other reported SiNW photodetectors as well as conventional bulk Si photodiodes and phototransistors, the SiNW phototransistors in this work demonstrate the combined advantages of high gain, high photoresponse, low dark current, and low operating voltage.
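
    Responsivity comparisons like the one above follow from the standard relation R = G·η·qλ/(hc). The sketch below evaluates it with placeholder values for quantum efficiency, wavelength, and internal gain; none of these numbers are taken from the reported devices.

```python
H_PLANCK = 6.626e-34    # J*s
C_LIGHT = 2.998e8       # m/s
Q_ELECTRON = 1.602e-19  # C

def responsivity_a_per_w(wavelength_m, quantum_efficiency, gain=1.0):
    """Photodetector responsivity (A/W): R = G * eta * q * lambda / (h * c)."""
    return gain * quantum_efficiency * Q_ELECTRON * wavelength_m / (H_PLANCK * C_LIGHT)

# Placeholder values: 50% quantum efficiency at 850 nm with an internal gain of 100
print(f"{responsivity_a_per_w(850e-9, 0.5, gain=100.0):.1f} A/W")
```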

  13. Silicon Photomultiplier Performance in High ELectric Field

    Montoya, J.; Morad, J.

    2016-12-01

    Roughly 27% of the universe is thought to be composed of dark matter. The Large Underground Xenon (LUX) experiment relies on the emission of light from xenon atoms after a collision with a dark matter particle. After a particle interaction in the detector, the xenon emits both light and charge. The charge (electrons) in the liquid xenon needs to be pulled into the gas region so that it can interact with the gas and emit light; this allows LUX to convert a single electron into many photons. It is done by applying a high voltage across the liquid and gas regions, effectively ripping electrons out of the liquid xenon and into the gas. The current device used to detect photons is the photomultiplier tube (PMT). These devices are large and costly. In recent years a new technology capable of detecting single photons has emerged: the silicon photomultiplier (SiPM). These devices are cheaper and smaller than PMTs, but their performance in high electric fields, such as those found in LUX, is unknown. It is possible that a large electric field could introduce noise on the SiPM signal, drowning out the single-photon detection capability. My hypothesis is that SiPMs will not show a significant increase in noise at an electric field of roughly 10 kV/cm (an electric field within the range used in detectors like LUX). I plan to test this hypothesis by first rotating the SiPMs, with no applied electric field, between two metal plates roughly 2 cm apart to provide a control data set, and then measuring the dark counts at the same angles with the electric field applied. Possibly the most important aspect of LUX is the photon detector, because it is what records the signals. Dark matter is detected in the experiment by looking at the ratio of photons to electrons emitted for a given interaction in the detector; interactions with a low electron-to-photon ratio are more likely to be dark matter events than those with a high electron-to-photon ratio. The ability to

  14. Strategy Archetypes Adopted by Icelandic Companies, Their Fit with Performance Measures and Effects on Financial Performance

    Rikhardsson, Pall; Sigurjonsson, Olaf; Arnardottir, Audur Arna

    Past research seems to suggest that companies adopting certain strategies favor certain sets of performance measures. That is to say companies using entrepreneurial focused strategies favor non-financial measures or a balanced mix of financial and non-financial measures. Companies adopting reactive...... or operational strategies seem to favor financial measures and are less likely to use non-financial measures. We take this research further by focusing not only on the link between strategy types and performance measures but also on what specific performance measures are used in connection to which strategies....... Furthermore, we examine the link between the strategies adopted, the performance measures favored and the financial performance of the companies. The empirical data collection was carried out in winter 2013 with a population of the 300 largest companies in Iceland. The survey was sent to the CFO...

  15. An inkjet vision measurement technique for high-frequency jetting

    Kwon, Kye-Si; Jang, Min-Hyuck; Park, Ha Yeong; Ko, Hyun-Seok

    2014-01-01

    Inkjet technology has been used as a manufacturing tool for printed electronics. To increase productivity, the jetting frequency needs to be increased. With high-frequency jetting, however, the printed pattern quality can become non-uniform, since the jetting performance characteristics, including jetting speed and droplet volume, can vary significantly as the jetting frequency increases. High-frequency jetting behavior must therefore be evaluated properly for improvement, but it is difficult to measure with previous vision analysis methods because subsequent droplets are close together or even merged. In this paper, we present vision measurement techniques to evaluate the drop formation of high-frequency jetting. The proposed method is based on tracking target droplets, such that subsequent droplets can be excluded from the image analysis by focusing on the target droplet. Finally, a frequency sweeping method for jetting speed and droplet volume is presented to understand the overall effects of jetting frequency on jetting performance.

  16. An inkjet vision measurement technique for high-frequency jetting

    Kwon, Kye-Si, E-mail: kskwon@sch.ac.kr; Jang, Min-Hyuck; Park, Ha Yeong [Department of Mechanical Engineering, Soonchunhyang University 22, Soonchunhyang-Ro, Shinchang, Asan Chungnam 336-745 (Korea, Republic of); Ko, Hyun-Seok [Department of Electrical and Robot Engineering, Soonchunhyang University, 22, Soonchunhyang-Ro, Shinchang, Asan Chungnam 336-745 (Korea, Republic of)

    2014-06-15

    Inkjet technology has been used as a manufacturing tool for printed electronics. To increase productivity, the jetting frequency needs to be increased. With high-frequency jetting, however, the printed pattern quality can become non-uniform, since the jetting performance characteristics, including jetting speed and droplet volume, can vary significantly as the jetting frequency increases. High-frequency jetting behavior must therefore be evaluated properly for improvement, but it is difficult to measure with previous vision analysis methods because subsequent droplets are close together or even merged. In this paper, we present vision measurement techniques to evaluate the drop formation of high-frequency jetting. The proposed method is based on tracking target droplets, such that subsequent droplets can be excluded from the image analysis by focusing on the target droplet. Finally, a frequency sweeping method for jetting speed and droplet volume is presented to understand the overall effects of jetting frequency on jetting performance.
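
    The vision analysis described above ultimately reduces the tracked droplet images to a jetting speed and a droplet volume. A minimal sketch of those two conversions, assuming a spherical droplet and using made-up frame spacing and diameter values:

```python
import math

def jetting_speed_m_per_s(pos1_um, pos2_um, frame_interval_us):
    """Jetting speed from the displacement of the tracked droplet between
    two strobed frames (um per us is numerically equal to m per s)."""
    return (pos2_um - pos1_um) / frame_interval_us

def droplet_volume_pl(diameter_um):
    """Droplet volume in picoliters, assuming a spherical droplet
    (1 um^3 = 1e-3 pL)."""
    radius = diameter_um / 2.0
    return (4.0 / 3.0) * math.pi * radius ** 3 * 1e-3

# Hypothetical tracked positions 20 us apart and a 40 um droplet diameter
print(f"speed  = {jetting_speed_m_per_s(150.0, 270.0, 20.0):.1f} m/s")
print(f"volume = {droplet_volume_pl(40.0):.1f} pL")
```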

  17. The Role of Performance Management in the High Performance Organisation

    de Waal, André A.; van der Heijden, Beatrice I.J.M.

    2014-01-01

    The allegiance of partnering organisations and their employees to an Extended Enterprise performance is its proverbial sword of Damocles. Literature on Extended Enterprises focuses on collaboration, inter-organizational integration and learning to avoid diminishing or missing allegiance becoming an

  18. Performance Measure as Feedback Variable in Image Processing

    Ristić Danijela

    2006-01-01

    This paper extends the view of the image processing performance measure by presenting the use of this measure as an actual value in a feedback structure. The idea is that a control loop built in this way drives the actual feedback value to a given set point. Since the performance measure depends explicitly on the application, the inclusion of feedback structures and the choice of appropriate feedback variables are presented using the example of optical character recognition in an industrial application. Metrics for quantifying performance at different image processing levels are discussed, together with the issues those metrics should address from both the image processing and the control points of view. The performance measures of the individual processing algorithms that form a character recognition system are determined with respect to the overall system performance.
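
    The control-loop idea described above, treating a performance measure as the feedback variable driven toward a set point, can be sketched as a simple proportional controller that adjusts one processing parameter. The threshold parameter, gain, and stand-in recognition-rate function below are all assumptions for illustration, not the paper's actual OCR system.

```python
def closed_loop_tuning(measure_fn, set_point, param, gain=0.5, iterations=20):
    """Drive a performance measure toward a set point by adjusting one
    processing parameter with a proportional control law (illustrative only)."""
    for _ in range(iterations):
        actual = measure_fn(param)   # e.g. character recognition rate
        error = set_point - actual   # control error fed back into the loop
        param += gain * error        # proportional update of the parameter
    return param, measure_fn(param)

# Hypothetical stand-in for "recognition rate as a function of a binarization threshold"
def recognition_rate(threshold):
    return max(0.0, 1.0 - abs(threshold - 0.42) * 2.0)

best_threshold, achieved = closed_loop_tuning(recognition_rate, set_point=0.95, param=0.2)
print(f"threshold = {best_threshold:.3f}, recognition rate = {achieved:.3f}")
```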

  19. GRAPH THEORY APPROACH TO QUANTIFY UNCERTAINTY OF PERFORMANCE MEASURES

    Sérgio D. Sousa

    2015-03-01

    In this work, the performance measurement process is studied to quantify the uncertainty induced in the resulting performance measure (PM). To that end, the causes of uncertainty are identified by analysing the activities undertaken in the three stages of the performance measurement process: design and implementation, data collection and recording, and determination and analysis. A quantitative methodology based on graph theory and on the sources of uncertainty of the performance measurement process is used to calculate an uncertainty index that evaluates the level of uncertainty of a given PM (or key performance indicator). An application example is presented. Quantifying PM uncertainty could help to better represent the risk associated with a given decision and to improve the PM, increasing its precision and reliability.
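
    One common graph-theoretic construction, which may differ in detail from the paper's, represents the stages of the measurement process as nodes of a weighted digraph and condenses node weights and interdependencies into a single index via the permanent of the characteristic matrix. A small sketch under those assumptions, with invented weights:

```python
from itertools import permutations
from math import prod

def permanent(matrix):
    """Permanent of a square matrix (like a determinant without sign changes).
    Exponential-time, which is fine for a handful of uncertainty sources."""
    n = len(matrix)
    return sum(prod(matrix[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

# Invented weights: diagonal entries are the uncertainty contributions of the
# three stages (design/implementation, data collection/record, analysis);
# off-diagonal entries are assumed interdependency strengths.
stage_matrix = [
    [3.0, 0.5, 0.2],
    [0.4, 2.0, 0.6],
    [0.1, 0.3, 4.0],
]
print(f"uncertainty index = {permanent(stage_matrix):.2f}")
```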

  20. Video performance for high security applications

    Connell, Jack C.; Norman, Bradley C.

    2010-01-01

    The complexity of physical protection systems has increased to address modern threats to national security and emerging commercial technologies. A key element of modern physical protection systems is the data presented to the human operator and used for rapid determination of the cause of an alarm, whether false (e.g., caused by an animal, debris, etc.) or real (e.g., a human adversary). Alarm assessment, the human validation of a sensor alarm, relies primarily on imaging technologies and video systems. Developing measures of effectiveness (MOEs) that drive the design or evaluation of a video system or technology is a challenge, given the subjectivity of the application (e.g., alarm assessment). Sandia National Laboratories has conducted empirical analysis using field test data and mathematical models, such as the binomial distribution and Johnson target transfer functions, to develop MOEs for video system technologies. Depending on the technology, the task of the security operator, and the distance to the target, the probability of assessment (PA) can be determined as a function of a variety of conditions or assumptions. Using PA as an MOE allows the systems engineer to conduct trade studies, make informed design decisions, or evaluate new higher-risk technologies. This paper outlines general video system design trade-offs, discusses ways video can be used to increase system performance, and lists MOEs for video systems used in subjective applications such as alarm assessment.
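
    A heavily simplified sketch of the kind of calculation mentioned above: the Johnson criteria relate the number of resolvable line pairs across a target to the probability of performing a visual task, and a binomial model can aggregate repeated looks. The target transfer probability function and the N50 value below are textbook approximations, not Sandia's models or data.

```python
def ttpf(n_resolvable, n50):
    """Empirical target transfer probability function associated with the
    Johnson criteria (textbook form): probability of completing a visual
    task given the number of resolvable line pairs across the target."""
    ratio = n_resolvable / n50
    exponent = 2.7 + 0.7 * ratio
    return ratio ** exponent / (1.0 + ratio ** exponent)

def prob_at_least_one_success(p_single, looks):
    """Binomial aggregation: chance that at least one of several independent
    looks at the scene yields a correct assessment."""
    return 1.0 - (1.0 - p_single) ** looks

# Hypothetical case: 6 line pairs resolved across the target, N50 = 4 ("recognition")
p = ttpf(6.0, 4.0)
print(f"single look P = {p:.2f}, three looks P = {prob_at_least_one_success(p, 3):.2f}")
```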