WorldWideScience

Sample records for validated high performance

  1. Validated High Performance Liquid Chromatography Method for ...

    African Journals Online (AJOL)

    Purpose: To develop a simple, rapid and sensitive high performance liquid chromatography (HPLC) method for the determination of cefadroxil monohydrate in human plasma. Methods: A Shimadzu HPLC with LC solution software was used with a Waters Spherisorb C18 (5 μm, 150 mm × 4.5 mm) column. The mobile phase ...

  2. Validated high performance liquid chromatographic (HPLC) method ...

    African Journals Online (AJOL)

    STORAGESEVER

    2010-02-22

    Feb 22, 2010 ... specific and accurate high performance liquid chromatographic method for determination of ZER in micro-volumes ... tional medicine as a cure for swelling, sores, loss of appetite and ... Receptor Activator for Nuclear Factor κB Ligand ... The effect of ... be suitable for preclinical pharmacokinetic studies.

  3. Validated High Performance Liquid Chromatography Method for ...

    African Journals Online (AJOL)

    Purpose: To develop a simple, rapid and sensitive high performance liquid ... response, tailing factor and resolution of six replicate injections were < 3% ... Cefadroxil monohydrate, Human plasma, Pharmacokinetics, Bioequivalence ... Drug-free plasma was obtained from the local ... Influence of probenecid on the renal ...

  4. Development and validation of a reversed phase High Performance ...

    African Journals Online (AJOL)

    A simple, rapid, accurate and economical isocratic reversed-phase high performance liquid chromatography (RP-HPLC) method was developed, validated and used for the evaluation of the content of different brands of paracetamol tablets. The method was validated according to ICH guidelines and may be adopted for the ...

  5. Validation of SCALE code package on high performance neutron shields

    International Nuclear Information System (INIS)

    Bace, M.; Jecmenica, R.; Smuc, T.

    1999-01-01

    The shielding ability and other properties of new high performance neutron shielding materials from the KRAFTON series have been published recently. The published experimental and MCNP results for two materials of the KRAFTON series were compared with our own calculations. Two control modules of the SCALE-4.4 code system have been used, one based on one-dimensional radiation transport analysis (SAS1) and the other based on the three-dimensional Monte Carlo method (SAS3). The comparison of the calculated neutron dose equivalent rates shows good agreement between experimental and calculated results for the KRAFTON-N2 material. Our results indicate that the N2-M-N2 sandwich type is approximately 10% inferior as a neutron shield to the KRAFTON-N2 material. All values of neutron dose equivalent obtained by SAS1 are approximately 25% lower than the SAS3 results, which indicates the magnitude of the discrepancy introduced by the one-dimensional geometry approximation. (author)

  6. Development and Validation of Reverse Phase High Performance ...

    African Journals Online (AJOL)

    Performance Chromatography Method for Determination of Olanzapine in ... The lowest limit of quantification (LLOQ) was 1 ng/mL, while inter-day and intra-day precision were < 12.5% and 5.1% ... Chromatograms of blank plasma and drug-free ...

  7. Validation of the high performance leadership competencies as measured by an assessment centre in-basket

    Directory of Open Access Journals (Sweden)

    H. H. Spangenberg

    2003-10-01

    The purpose of this study was to validate Schroder's High Performance Leadership Competencies (HPLCs), measured by a specially designed in-basket, against multiple criteria. These consisted of six measures of managerial success, representing managerial advancement and salary progress criteria, and a newly developed comprehensive measure of work unit performance, the Performance Index. An environmental dynamism and complexity questionnaire served as a moderator variable. Results indicated disappointing predictive validity quotients for the HPLCs as measured by the in-basket, in contrast to the satisfactory predictive and construct validity obtained in previous studies by means of a full assessment centre. The implications of the findings are discussed and suggestions are made for improving the validity of the in-basket.
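
    As a purely illustrative aside (not drawn from the study itself), a predictive validity quotient of the kind reported above is usually computed as the correlation between predictor scores and a success criterion. The short Python sketch below uses invented in-basket and Performance Index values solely to show the calculation.

```python
# Illustrative sketch only: a predictive validity coefficient computed as the
# Pearson correlation between predictor scores (e.g. in-basket competency
# ratings) and a criterion (e.g. a work-unit performance index). Data invented.
import numpy as np

in_basket_scores = np.array([3.1, 2.4, 4.0, 3.6, 2.9, 3.8, 2.2, 3.3])
performance_index = np.array([0.62, 0.48, 0.71, 0.66, 0.55, 0.69, 0.44, 0.60])

validity_coefficient = np.corrcoef(in_basket_scores, performance_index)[0, 1]
print(f"predictive validity r = {validity_coefficient:.2f}")
```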

  8. A validated high performance thin layer chromatography method for determination of yohimbine hydrochloride in pharmaceutical preparations

    OpenAIRE

    Jihan M Badr

    2013-01-01

    Background: Yohimbine is an indole alkaloid used as a promising therapy for erectile dysfunction. A number of methods were reported for the analysis of yohimbine in the bark or in pharmaceutical preparations. Materials and Method: In the present work, a simple and sensitive high performance thin layer chromatographic method is developed for determination of yohimbine (occurring as yohimbine hydrochloride) in pharmaceutical preparations and validated according to International Conference of Ha...

  9. High-performance liquid chromatography method validation for determination of tetracycline residues in poultry meat

    Directory of Open Access Journals (Sweden)

    Vikas Gupta

    2014-01-01

    Background: In this study, a method for the determination of tetracycline (TC) residues in poultry by high-performance liquid chromatography was validated. Materials and Methods: The principal step involved ultrasonic-assisted extraction of TCs from poultry samples with 2 ml of 20% trichloroacetic acid and phosphate buffer (pH 4), which gave a clearer supernatant and high recovery, followed by centrifugation and purification using a 0.22 μm filter. Results: The validity study of the method revealed that all calibration curves showed good linearity (r² > 0.999) over the range of 40-4500 ng. Sensitivity was found to be 1.54 and 1.80 ng for oxytetracycline (OTC) and TC. Accuracy was in the range of 87.94-96.20% and 72.40-79.84% for meat. Precision was lower than 10% in all cases, indicating that the method can be used as a validated method. The limit of detection was found to be 4.8 and 5.10 ng for OTC and TC, respectively. The corresponding values of the limit of quantitation were 11 and 12 ng. Conclusion: The method reliably identifies and quantifies the selected TC and OTC in reconstituted poultry meat in the low and sub-nanogram range and can be applied in any laboratory.
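
    As an illustration of the kind of calculation behind the linearity and detection-limit figures quoted above (the abstract does not state exactly how the LOD/LOQ were derived), the Python sketch below fits a calibration line and applies the common ICH calibration-curve convention (LOD = 3.3σ/slope, LOQ = 10σ/slope). All numbers are hypothetical placeholders, not the study's data.

```python
# Illustrative sketch (not the authors' code): fitting a calibration curve and
# estimating LOD/LOQ from it using the common ICH 3.3*sigma/slope convention.
# Concentrations and peak areas below are made-up placeholder values.
import numpy as np

conc = np.array([40, 250, 1000, 2250, 4500], dtype=float)   # ng (hypothetical standards)
area = np.array([5.1, 31.8, 127.0, 286.5, 573.2])           # detector response (hypothetical)

# Ordinary least-squares fit: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept

# Coefficient of determination r^2 (linearity check, e.g. r^2 > 0.999)
ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

# Residual standard deviation of the regression, then ICH-style LOD/LOQ
sigma = np.sqrt(ss_res / (len(conc) - 2))
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope

print(f"slope={slope:.4f}, intercept={intercept:.4f}, r2={r2:.5f}")
print(f"LOD ~ {lod:.1f} ng, LOQ ~ {loq:.1f} ng")
```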

  10. Validation of a technique by high-performance liquid chromatography for the determination of total isoflavones

    Directory of Open Access Journals (Sweden)

    Pilar A. Soledispa Cañarte

    2017-04-01

    Context: Isoflavones may act as selective regulators in the prevention of various diseases. The most important source of isoflavones is soy, from which various phytotherapeutic products used by the Ecuadorian population are prepared. However, their concentration varies depending on several factors, so quality assessment needs to be carried out using suitable analytical methods. Aims: To validate an analytical method by high performance liquid chromatography (HPLC) to quantify total isoflavones in herbal medicine. Methods: Isoflavones were quantified with a liquid chromatograph fitted with a UV/VIS detector at 260 nm and a C-18 column, using an isocratic method. The mobile phase was composed of 2% acetic acid:acetonitrile (75:25). Quantification was performed against a reference standard. The validation parameters followed those established in USP 33. Results: The chromatogram presented six peaks eluting between 1.557 and 18.913 min. The linearity of the system and of the method gave r² values of 0.98 and 0.99, respectively. The coefficients of variation were 1.5% in the repeatability study and 2% in intermediate precision. The accuracy of the adjusted linear model exhibited r = 0.95 and an intercept confidence interval of (-0.921; 1.743). Conclusions: The validated method was specific, accurate, precise and linear. It can be used for quality control and stability studies of isoflavones present in herbal medicine.

  11. A validated high performance thin layer chromatography method for determination of yohimbine hydrochloride in pharmaceutical preparations.

    Science.gov (United States)

    Badr, Jihan M

    2013-01-01

    Yohimbine is an indole alkaloid used as a promising therapy for erectile dysfunction. A number of methods have been reported for the analysis of yohimbine in the bark or in pharmaceutical preparations. In the present work, a simple and sensitive high performance thin layer chromatographic method was developed for the determination of yohimbine (occurring as yohimbine hydrochloride) in pharmaceutical preparations and validated according to International Conference on Harmonisation (ICH) guidelines. The method employed thin layer chromatography aluminum sheets precoated with silica gel as the stationary phase, and the mobile phase consisted of chloroform:methanol:ammonia (97:3:0.2), which gave compact bands of yohimbine hydrochloride. Linear regression data for the calibration curves of standard yohimbine hydrochloride showed a good linear relationship over a concentration range of 80-1000 ng/spot with respect to peak area, with a correlation coefficient (R²) of 0.9965. The method was evaluated regarding accuracy, precision, selectivity, and robustness. Limits of detection and quantitation were recorded as 5 and 40 ng/spot, respectively. The proposed method efficiently separated yohimbine hydrochloride from other components even in complex mixtures containing powdered plants. The amount of yohimbine hydrochloride ranged from 2.3 to 5.2 mg/tablet or capsule in preparations containing the pure alkaloid, while it varied from zero (0) to 1.5-1.8 mg/capsule in dietary supplements containing powdered yohimbe bark. We concluded that this method employing high performance thin layer chromatography (HPTLC) for the quantitative determination of yohimbine hydrochloride in pharmaceutical preparations is efficient, simple, accurate, and validated.

  12. Validation of a novel automatic sleep spindle detector with high performance during sleep in middle aged subjects

    DEFF Research Database (Denmark)

    Wendt, Sabrina Lyngbye; Christensen, Julie A. E.; Kempfner, Jacob

    2012-01-01

    Many of the automatic sleep spindle detectors currently used to analyze sleep EEG are either validated on young subjects or not validated thoroughly. The purpose of this study is to develop and validate a fast and reliable sleep spindle detector with high performance in middle-aged subjects. An automatic sleep spindle detector using a bandpass filtering approach and a time-varying threshold was developed. The validation was done on sleep epochs from EEG recordings with manually scored sleep spindles from 13 healthy subjects with a mean age of 57.9 ± 9.7 years. The sleep spindle detector reached ...

  13. Development and validation of methodology for technetium-99m radiopharmaceuticals using high performance liquid chromatography (HPLC)

    International Nuclear Information System (INIS)

    Almeida, Erika Vieira de

    2009-01-01

    Radiopharmaceuticals are compounds, with no pharmacological action, which have a radioisotope in their composition and are used in Nuclear Medicine for the diagnosis and therapy of several diseases. In this work, the development and validation of an analytical method for the 99mTc-HSA, 99mTc-EC, 99mTc-ECD and 99mTc-Sestamibi radiopharmaceuticals and for some raw materials were carried out by high performance liquid chromatography (HPLC). The analyses were performed on a Shimadzu HPLC system, LC-20AT Prominence model. Some impurities were identified by the addition of a reference standard substance. Validation of the method was carried out according to the criteria defined in RE n. 899/2003 of the National Sanitary Agency (ANVISA). The results for robustness of the method showed that it is necessary to control the flow rate, sample volume, pH of the mobile phase and oven temperature. The analytical curves were linear in the concentration ranges, with linear correlation coefficients (r²) above 0.9995. The results for precision, accuracy and recovery showed values in the ranges of 0.07-4.78%, 95.38-106.50% and 94.40-100.95%, respectively. The detection limits and quantification limits varied from 0.27 to 5.77 μg mL⁻¹ and from 0.90 to 19.23 μg mL⁻¹, respectively. The values for HSA, EC, ECD and MIBI in the lyophilized reagents were 8.95, 0.485, 0.986 and 0.974 mg L⁻¹, respectively. The mean radiochemical purity for 99mTc-HSA, 99mTc-EC, 99mTc-ECD and 99mTc-Sestamibi was (97.28 ± 0.09)%, (98.96 ± 0.03)%, (98.96 ± 0.03)% and (98.07 ± 0.01)%, respectively. All the parameters recommended by ANVISA were evaluated and the results are below the established limits. (author)

  14. Development and validation of reverse phase high performance liquid chromatography for citral analysis from essential oils.

    Science.gov (United States)

    Gaonkar, Roopa; Yallappa, S; Dhananjaya, B L; Hegde, Gurumurthy

    2016-11-15

    Citral is a monoterpene aldehyde widely used in aromatherapy and in the food and pesticide industries. A new validated reversed-phase high performance liquid chromatography (RP-HPLC) procedure for the detection and quantification of the cis-trans isomers of citral was developed. The RP-HPLC analysis was carried out using an Enable C-18G column (250 × 4.6 mm, 5 μm), with an acetonitrile and water (70:30) mobile phase in isocratic mode at a flow rate of 1 mL/min. A photodiode array (PDA) detector was set at 233 nm for the detection of citral. The method showed linearity, selectivity and accuracy for citral in the range of 3-100 μg/mL. To compare the new RP-HPLC method with available methods, a commercially available essential oil from Cymbopogon flexuosus was analyzed using the new RP-HPLC method and also by GC-MS. The GC-MS analysis, using a mass selective detector (MSD), showed a citral content of 72.76%, whereas the new method gave 74.98%. To demonstrate the application of the new method, essential oils were extracted from lemongrass, lemon leaves and mosambi peels by steam distillation, and the citral content in the essential oils and in the condensate was analyzed. The method was found to be suitable for the analysis of citral in essential oils and water-based citral formulations, with very good resolution of its components geranial and neral. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Development and Validation of High Performance Liquid Chromatography Method for Determination Atorvastatin in Tablet

    Science.gov (United States)

    Yugatama, A.; Rohmani, S.; Dewangga, A.

    2018-03-01

    Atorvastatin is the primary choice for dyslipidemia treatment. Because the atorvastatin patent has expired, the pharmaceutical industry produces copies of the drug, so methods for tablet quality testing, including determination of atorvastatin content, need to be developed. The purpose of this research was to develop and validate a simple analytical method for atorvastatin tablets by HPLC. The HPLC system used in this experiment consisted of a Cosmosil C18 column (150 x 4.6 mm, 5 µm) as the reversed-phase stationary phase, a mixture of methanol-water at pH 3 (80:20 v/v) as the mobile phase, a flow rate of 1 mL/min, and UV detection at a wavelength of 245 nm. The validation covered selectivity, linearity, accuracy, precision, limit of detection (LOD), and limit of quantitation (LOQ). The results of this study indicate that the developed method performed well on all of these parameters for the analysis of atorvastatin tablet content. LOD and LOQ were 0.2 and 0.7 ng/mL, and the linearity range was 20 - 120 ng/mL.

  16. Validation of the solar heating and cooling high speed performance (HISPER) computer code

    Science.gov (United States)

    Wallace, D. B.

    1980-01-01

    Developed to give quick and accurate predictions, HISPER, a simplification of the TRNSYS program, achieves its computational speed by not simulating detailed system operations or performing detailed load computations. To validate the HISPER code for air systems, the simulation was compared to the actual performance of an operational test site. Solar insolation, ambient temperature, water usage rate, and water main temperatures from the data tapes for an office building in Huntsville, Alabama were used as input. The HISPER program was found to predict the heating loads and the solar fraction of the loads with errors of less than ten percent. Good correlation was found on both a seasonal basis and a monthly basis. Several parameters (such as the infiltration rate and the outside ambient temperature above which heating is not required) were found to require careful selection for accurate simulation.
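
    The validation above rests on comparing HISPER predictions with measured loads on a monthly and seasonal basis. The short Python sketch below shows one plausible way such percent errors and the monthly correlation could be computed; the load values are invented placeholders, not data from the Huntsville site.

```python
# Illustrative sketch (not part of HISPER): comparing simulated vs. measured
# monthly heating loads by percent error and correlation. Values are placeholders.
import numpy as np

measured = np.array([11.2, 9.8, 7.5, 4.1, 2.0, 0.9])   # monthly loads, arbitrary units
simulated = np.array([10.6, 9.3, 7.9, 4.4, 1.9, 1.0])  # code predictions for the same months

percent_error = 100.0 * (simulated - measured) / measured
seasonal_error = 100.0 * (simulated.sum() - measured.sum()) / measured.sum()
corr = np.corrcoef(measured, simulated)[0, 1]

print("monthly % error:", np.round(percent_error, 1))
print(f"seasonal % error: {seasonal_error:.1f}%, correlation r = {corr:.3f}")
```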

  17. Validation of the Performance of High-level Waste Disposal System

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Won Jin; Park, J. H.; Lee, J. O. (and others)

    2007-06-15

    Experimental research to validate the integrity and safety of a high-level waste disposal system was carried out. Studies on the construction of KURT and on the characteristics of the site rock were conducted. The thermo-hydro-mechanical behavior of the engineered barrier system was investigated using an engineering-scale test facility. The migration and retardation of radionuclides through rock fractures under anaerobic and reducing conditions were studied. The distribution coefficients of radionuclides onto granite, the rock matrix diffusion coefficients, and the gap and grain boundary inventories of spent fuel were measured.

  18. Validation of a high performance liquid chromatography method for the stabilization of epigallocatechin gallate.

    Science.gov (United States)

    Fangueiro, Joana F; Parra, Alexander; Silva, Amélia M; Egea, Maria A; Souto, Eliana B; Garcia, Maria L; Calpena, Ana C

    2014-11-20

    Epigallocatechin gallate (EGCG) is a green tea catechin with potential health benefits, such as anti-oxidant, anti-carcinogenic and anti-inflammatory effects. In general, EGCG is highly susceptible to degradation and therefore presents stability problems. The present paper focused on the stability of EGCG in HEPES (N-2-hydroxyethylpiperazine-N'-2-ethanesulfonic acid) medium with respect to pH, storage temperature and the presence of ascorbic acid as a reducing agent. The evaluation of EGCG in HEPES buffer demonstrated that this molecule is not able to maintain its physicochemical properties and potential beneficial effects, since it is partially or completely degraded, depending on the EGCG concentration. The storage temperatures most suitable for maintaining its structure were the lower values (4 or -20 °C). pH 3.5 provided greater stability than pH 7.4. However, the presence of a reducing agent (i.e., ascorbic acid) provided greater protection against degradation of EGCG. A method based on RP-HPLC with UV-vis detection was validated for two media: water and a biocompatible physiological medium composed of Transcutol®P, ethanol and ascorbic acid. Quantification of EGCG using pure EGCG requires a validated HPLC method that can be applied in pharmacokinetic and pharmacodynamic studies. Copyright © 2014. Published by Elsevier B.V.

  19. Validation of a high-performance liquid chromatography (HPLC) method for quantitative analysis of histamine in fish and fishery products

    Directory of Open Access Journals (Sweden)

    B.K.K.K. Jinadasa

    2016-12-01

    A high-performance liquid chromatography method is described for the quantitative determination and validation of histamine in fish and fishery product samples. Histamine is extracted from fish/fishery products by homogenizing with trichloroacetic acid and separated with Amberlite CG-50 resin and a C18-ODS Hypersil reversed phase column at ambient temperature (25°C). Linear standard curves with high correlation coefficients were obtained. An isocratic elution program was used; the total elution time was 10 min. The method was validated by assessing the following aspects: specificity, repeatability, reproducibility, linearity, recovery, limit of detection, limit of quantification and uncertainty. The validation parameters met the method requirements, and the method is a useful tool for determining histamine in fish and fishery products.

  20. Design and validation of the high performance and low noise CQU-DTU-LN1 airfoils

    DEFF Research Database (Denmark)

    Cheng, Jiangtao; Zhu, Wei Jun; Fischer, Andreas

    2014-01-01

    Using the blade element momentum theory, the viscous-inviscid XFOIL code and an airfoil self-noise prediction model, an optimization algorithm has been developed for designing the high performance and low noise CQU-DTU-LN1 series of airfoils, with targets of maximum power coefficient and low noise emission ... Comparisons of noise emission between the CQU-DTU-LN118 airfoil and the National Advisory Committee for Aeronautics (NACA) 64618 airfoil, which is used in modern wind turbine blades, are carried out. Copyright © 2013 John Wiley & Sons, Ltd.

  1. Development and validation of high-performance liquid chromatography and high-performance thin-layer chromatography methods for the quantification of khellin in Ammi visnaga seed

    Science.gov (United States)

    Kamal, Abid; Khan, Washim; Ahmad, Sayeed; Ahmad, F. J.; Saleem, Kishwar

    2015-01-01

    Objective: The present study was designed to develop simple, accurate and sensitive reversed-phase high-performance liquid chromatography (RP-HPLC) and high-performance thin-layer chromatography (HPTLC) methods for the quantification of khellin present in the seeds of Ammi visnaga. Materials and Methods: RP-HPLC analysis was performed on a C18 column with methanol:water (75:25, v/v) as the mobile phase. The HPTLC method involved densitometric evaluation of khellin after resolving it on a silica gel plate using ethyl acetate:toluene:formic acid (5.5:4.0:0.5, v/v/v) as the mobile phase. Results: The developed HPLC and HPTLC methods were validated for precision (interday, intraday and intersystem), robustness, accuracy, limit of detection and limit of quantification. The relationship between the concentration of standard solutions and the peak response was linear in both the HPLC and HPTLC methods, with concentration ranges of 10–80 μg/mL in HPLC and 25–1,000 ng/spot in HPTLC for khellin. The relative standard deviation values for method precision were found to be 0.63–1.97% and 0.62–2.05% in HPLC and HPTLC, respectively. The accuracy of the methods was checked by recovery studies conducted at three different concentration levels, and the average percentage recovery was found to be 100.53% in HPLC and 100.08% in HPTLC. Conclusions: The developed HPLC and HPTLC methods for the quantification of khellin were found to be simple, precise, specific, sensitive and accurate, and can be used for routine analysis and quality control of A. visnaga and of formulations containing it as an ingredient. PMID:26681890

  2. Determination of validity and reliability of performance assessments tasks developed for selected topics in high school chemistry

    Science.gov (United States)

    Zichittella, Gail Eberhardt

    The primary purpose of this study was to validate performance assessments that can be used as teaching and assessment instruments in high school science classrooms. The study evaluated the classroom usability of these performance instruments and established the interrater reliability of the scoring rubrics when used by classroom teachers. The assessment instruments were designed to represent two levels of scientific inquiry: the high-inquiry tasks are relatively unstructured in terms of student directions, while the low-inquiry tasks provide more structure for the student. The tasks cover two content topics studied in chemistry (scientific observation and density). Students from a variety of Western New York school districts who were enrolled in chemistry classes and other science courses completed the tasks at the two levels of inquiry. The chemistry students also completed the NYS Regents Examination in Chemistry. Their classroom teachers were interviewed and completed a questionnaire to help establish their epistemological views on the inclusion of inquiry-based learning in the science classroom. Data showed that the performance assessment tasks were reliable, valid and helpful for obtaining a more complete picture of the students' scientific understanding. The teacher participants reported no difficulty with the usability of the tasks in the high school chemistry setting. Collected data gave no evidence of gender bias with reference to the performance tasks or the NYS Regents Chemistry Examination. Additionally, it was shown that the instructors' classroom practices do have an effect upon the students' achievement on the performance tasks and the NYS Regents examination. Data also showed that achievement on the performance tasks was influenced by the number of years of science instruction students had received.

  3. Validation of histamine determination Method in yoghurt using High Performance Liquid Chromatography

    Directory of Open Access Journals (Sweden)

    M Jahedinia

    2014-02-01

    Biogenic amines are organic, basic nitrogenous compounds of low molecular weight that are mainly generated by the enzymatic decarboxylation of amino acids by microorganisms. Dairy products are among the foods with the highest amine content. A wide variety of methods and procedures for the determination of histamine and biogenic amines have been established; among these, HPLC is considered the reference method. The aim of this study was to validate a reversed-phase HPLC method for the determination of histamine in yoghurt. The mobile phase consisted of acetonitrile/water (18:88 v/v) and the flow rate was set at 0.5 ml/min using isocratic HPLC. Detection was carried out at 254 nm using a UV detector. The calibration curve constructed from the peak areas of standards was linear, with a correlation coefficient (r²) of 0.998. Good recoveries were observed for histamine at all spiking levels, with an average recovery of 84%. The RSD value from the repeatability test was found to be 4.4%. The limit of detection and limit of quantitation were 0.14 and 0.42 µg/ml, respectively. The results of the validation tests showed that the method is reliable and rapid for the quantification of histamine in yoghurt.
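
    To make the reported figures concrete (84% average recovery, 4.4% repeatability RSD), the following Python sketch shows how spike recovery and RSD are typically calculated; the replicate values are hypothetical and not taken from the study.

```python
# Illustrative sketch (not the authors' code): spike-recovery and repeatability
# RSD calculations of the kind reported above. Numbers are made-up placeholders.
import statistics

spiked_level = 10.0                      # histamine added, e.g. ug/mL (hypothetical)
measured_spikes = [8.6, 8.2, 8.5, 8.3]   # concentrations found in spiked yoghurt replicates

recoveries = [100.0 * m / spiked_level for m in measured_spikes]
mean_recovery = statistics.mean(recoveries)

# Repeatability expressed as relative standard deviation (RSD%)
replicates = [4.31, 4.18, 4.42, 4.25, 4.10, 4.37]   # repeated injections of one sample
rsd = 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

print(f"mean recovery = {mean_recovery:.1f}%")
print(f"repeatability RSD = {rsd:.1f}%")
```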

  4. Development and Validation of High Performance Liquid Chromatographic Method for Determination of Lamivudine from Pharmaceutical Preparation

    Directory of Open Access Journals (Sweden)

    S. K. Patro

    2010-01-01

    A new, simple, specific, accurate and precise RP-HPLC method was developed for the determination of lamivudine in pure form and in tablet formulations. A Thermo BDS C18 column was used in isocratic mode, with a mobile phase consisting of 0.01 M ammonium dihydrogen orthophosphate buffer (adjusted to pH 2.48 with formic acid) and methanol in the ratio of 50:50. The flow rate was set at 0.6 mL/min and UV detection was carried out at 264 nm. The retention times of lamivudine and nevirapine were 2.825 min and 4.958 min, respectively. The method was validated for linearity, precision, robustness and recovery. Linearity for lamivudine was found in the range of 50-175 μg/mL. Hence, it can be applied for routine quality control of lamivudine in bulk and in pharmaceutical formulations.

  5. Development and validation of ultra-high performance supercritical fluid chromatography method for determination of illegal dyes and comparison to ultra-high performance liquid chromatography method.

    Science.gov (United States)

    Khalikova, Maria A; Šatínský, Dalibor; Solich, Petr; Nováková, Lucie

    2015-05-18

    A novel simple, fast and efficient ultra-high performance supercritical fluid chromatography (UHPSFC) method was developed and validated for the separation and quantitative determination of eleven illegal dyes in chili-containing spices. The method involved a simple ultrasound-assisted liquid extraction of the illegal compounds with tetrahydrofuran. The separation was performed using a supercritical fluid chromatography system and a CSH Fluoro-Phenyl stationary phase at 70°C. The mobile phase was carbon dioxide and a mixture of methanol:acetonitrile (1:1, v/v) with 2.5% formic acid as an additive, at a flow rate of 2.0 mL min⁻¹. UV-vis detection was performed at 500 nm for seven compounds and at 420 nm for Sudan Orange G, Butter Yellow, Fast Garnet GBC and Methyl Red, due to their absorbance maxima. All eleven compounds were separated in less than 5 min. The method was successfully validated and applied to three commercial samples of chili-containing spices - Chili sauce (Indonesia), Feferony sauce (Slovakia) and Mojo sauce (Spain). The linearity range of the proposed method was 0.50-9.09 mg kg⁻¹ (r ≥ 0.995). The detection limits were determined at a signal-to-noise ratio of 3 and ranged from 0.15 mg kg⁻¹ to 0.60 mg kg⁻¹ (1.80 mg kg⁻¹ for Fast Garnet) for standard solutions and from 0.25 mg kg⁻¹ to 1.00 mg kg⁻¹ (2.50 mg kg⁻¹ for Fast Garnet, 1.50 mg kg⁻¹ for Sudan Red 7B) for chili-containing samples. The recovery values were in the range of 73.5-107.2%, with relative standard deviations ranging from 0.1% to 8.2% for within-day precision and from 0.5% to 8.8% for between-day precision. The method showed potential for monitoring forbidden dyes in food constituents. The developed UHPSFC method was compared to a UHPLC-UV method, and the orthogonality of the Sudan dye separation by these two methods was demonstrated. Benefits and drawbacks were discussed, showing the reliability of both methods for monitoring of the studied illegal dyes in real ...

  6. High Performance Electrical Modeling and Simulation Software Normal Environment Verification and Validation Plan, Version 1.0; TOPICAL

    International Nuclear Information System (INIS)

    WIX, STEVEN D.; BOGDAN, CAROLYN W.; MARCHIONDO JR., JULIO P.; DEVENEY, MICHAEL F.; NUNEZ, ALBERT V.

    2002-01-01

    The requirements in modeling and simulation are driven by two fundamental changes in the nuclear weapons landscape: (1) The Comprehensive Test Ban Treaty and (2) The Stockpile Life Extension Program which extends weapon lifetimes well beyond their originally anticipated field lifetimes. The move from confidence based on nuclear testing to confidence based on predictive simulation forces a profound change in the performance asked of codes. The scope of this document is to improve the confidence in the computational results by demonstration and documentation of the predictive capability of electrical circuit codes and the underlying conceptual, mathematical and numerical models as applied to a specific stockpile driver. This document describes the High Performance Electrical Modeling and Simulation software normal environment Verification and Validation Plan

  7. Optimization and validation of high-performance liquid chromatography method for analyzing 25-desacetyl rifampicin in human urine

    Science.gov (United States)

    Lily; Laila, L.; Prasetyo, B. E.

    2018-03-01

    A selective, reproducible, effective, sensitive, simple and fast high-performance liquid chromatography (HPLC) method was developed, optimized and validated to analyze 25-desacetyl rifampicin (25-DR) in urine from tuberculosis patients. The separation was performed on an Agilent Technologies HPLC with an Agilent Eclipse XDB-C18 column and a mobile phase of 65:35 v/v methanol:0.01 M sodium phosphate buffer pH 5.2, with detection at 254 nm and a flow rate of 0.8 ml/min. The mean retention time was 3.016 minutes. The method was linear from 2–10 μg/ml 25-DR with a correlation coefficient of 0.9978. The standard deviation, relative standard deviation and coefficient of variation for 2, 6 and 10 μg/ml 25-DR were 0-0.0829, 0-3.1752 and 0-0.0317%, respectively. The recovery of 5, 7 and 9 μg/ml 25-DR was 80.8661%, 91.3480% and 111.1457%, respectively. Limits of detection (LoD) and quantification (LoQ) were 0.51 and 1.7 μg/ml, respectively. The method fulfilled the validity guidelines of the International Conference on Harmonization (ICH) for bioanalytical methods, which include the parameters of specificity, linearity, precision, accuracy, LoD, and LoQ. The developed method is suitable for pharmacokinetic analysis of various concentrations of 25-DR in human urine.

  8. Benchmarks for GADRAS performance validation

    International Nuclear Information System (INIS)

    Mattingly, John K.; Mitchell, Dean James; Rhykerd, Charles L. Jr.

    2009-01-01

    The performance of the Gamma Detector Response and Analysis Software (GADRAS) was validated by comparing GADRAS model results to experimental measurements for a series of benchmark sources. Sources for the benchmark include a plutonium metal sphere, bare and shielded in polyethylene, plutonium oxide in cans, a highly enriched uranium sphere, bare and shielded in polyethylene, a depleted uranium shell and spheres, and a natural uranium sphere. The benchmark experimental data were previously acquired and consist of careful collection of background and calibration source spectra along with the source spectra. The calibration data were fit with GADRAS to determine response functions for the detector in each experiment. A one-dimensional model (pie chart) was constructed for each source based on the dimensions of the benchmark source. The GADRAS code made a forward calculation from each model to predict the radiation spectrum for the detector used in the benchmark experiment. The comparisons between the GADRAS calculation and the experimental measurements are excellent, validating that GADRAS can correctly predict the radiation spectra for these well-defined benchmark sources.

  9. Quest to validate and define performance for the high volume metallic stator PCP at 250 degrees Celsius

    Energy Technology Data Exchange (ETDEWEB)

    Noonan, S.G. [ConocoPhillips Co., Houston, TX (United States); Klaczek, W.; Piers, K. [C-FER Technologies, Edmonton, AB (Canada); Seince, L. [PCM USA Inc., Houston, TX (United States); Jahn, S. [Kudu Industries, Calgary, AB (Canada)

    2008-10-15

    ConocoPhillips has been searching for a high volume artificial lift system that will reliably operate in a 250 degree Celsius downhole environment to meet the needs of steam assisted gravity drainage (SAGD) operations. This paper described the complexity of building and operating a high temperature flow loop rated for 250 degrees Celsius. It also described the lessons learned while upgrading an existing flow loop, from the initial design, procurement and construction through to the final commissioning phases. The paper described the issues encountered with the first artificial lift system tested at 250 degrees Celsius. The system consisted of a metallic progressing cavity pump system rated for 6919 barrels per day at 500 rotations per minute. The final upgraded capabilities of the flow loop were also listed, and images of the upgraded flow loop were provided. It was concluded that the test program not only served to validate and define the pump's performance, but also provided valuable lessons on the completion configuration and operational procedures. Testing new artificial lift technology in a controlled flow loop, rather than in a field installation, provided the opportunity to test these pumping systems under a large variety of conditions to truly understand the performance and limitations of each pump. 3 refs., 1 tab., 5 figs.

  10. Analytical Method Validation of High-Performance Liquid Chromatography and Stability-Indicating Study of Medroxyprogesterone Acetate Intravaginal Sponges

    Directory of Open Access Journals (Sweden)

    Nidal Batrawi

    2017-02-01

    Medroxyprogesterone acetate is widely used in veterinary medicine as an intravaginal dosage form for the synchronization of the breeding cycle in ewes and goats. The main goal of this study was to develop a reverse-phase high-performance liquid chromatography method for the quantification of medroxyprogesterone acetate in veterinary vaginal sponges. A single high-performance liquid chromatography/UV isocratic run was used for the analytical assay of the active ingredient medroxyprogesterone. The chromatographic system consisted of a reverse-phase C18 column as the stationary phase and a mixture of 60% acetonitrile and 40% potassium dihydrogen phosphate buffer as the mobile phase; the pH was adjusted to 5.6. The method was validated according to the International Council for Harmonisation (ICH) guidelines. Forced degradation studies were also performed to evaluate the stability-indicating properties and specificity of the method. Medroxyprogesterone was eluted at 5.9 minutes. The linearity of the method was confirmed in the range of 0.0576 to 0.1134 mg/mL (R² > 0.999). The limit of quantification was shown to be 3.9 µg/mL. Precision and accuracy were acceptable, with %RSD < 0.2% and recovery of 98% to 102%, respectively. A medroxyprogesterone capacity factor of 2.1, a tailing factor of 1.03, and a resolution of 3.9 were obtained, in accordance with ICH guidelines. Based on the obtained results, a rapid, precise, accurate, sensitive, and cost-effective analysis procedure is proposed for the quantitative determination of medroxyprogesterone in vaginal sponges. This analytical method is the only available method to analyse medroxyprogesterone in veterinary intravaginal dosage forms.

  11. Optimized and validated high-performance liquid chromatography method for the determination of deoxynivalenol and aflatoxins in cereals.

    Science.gov (United States)

    Skendi, Adriana; Irakli, Maria N; Papageorgiou, Maria D

    2016-04-01

    A simple, sensitive and accurate analytical method was optimized and developed for the determination of deoxynivalenol and aflatoxins in cereals intended for human consumption, using high-performance liquid chromatography with diode array and fluorescence detection and a photochemical reactor for enhanced detection. A response surface methodology, using a fractional central composite design, was carried out for optimization of the water percentage at the beginning of the run (X1, 80-90%), the level of acetonitrile at the end of the gradient (X2, 10-20%) with the water percentage fixed at 60%, and the flow rate (X3, 0.8-1.2 mL/min). The studied responses were the chromatographic peak area, the resolution factor and the time of analysis. The optimal chromatographic conditions were: X1 = 80%, X2 = 10%, and X3 = 1 mL/min. Following a double sample extraction with water and a mixture of methanol/water, mycotoxins were rapidly purified by an optimized solid-phase extraction protocol. The optimized method was further validated with respect to linearity (R² > 0.9991), sensitivity, precision, and recovery (90-112%). Application to 23 commercial cereal samples from Greece showed contamination levels below the legally set limits, except for one maize sample. The main advantages of the developed method are its simplicity of operation and low cost. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
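
    The abstract describes a fractional central composite design over the three factors X1-X3 but does not list the individual runs. As a hedged illustration only, the Python sketch below enumerates a face-centred central composite design spanning the stated factor ranges; the actual design used in the study may differ.

```python
# Illustrative sketch only: enumerating a face-centred central composite design
# for the three factors described above (X1, X2, X3). The actual design points
# used in the study are not given in the abstract; ranges are taken from it.
from itertools import product

factors = {
    "X1_initial_water_pct": (80, 90),
    "X2_final_acetonitrile_pct": (10, 20),
    "X3_flow_mL_min": (0.8, 1.2),
}

def coded_to_real(code, lo, hi):
    """Map a coded level in {-1, 0, +1} to the real factor value."""
    centre, half_range = (lo + hi) / 2, (hi - lo) / 2
    return centre + code * half_range

# Factorial corners (+/-1), face-centred axial points, and a centre point
corners = list(product((-1, 1), repeat=3))
axials = [tuple(a if i == j else 0 for i in range(3)) for j in range(3) for a in (-1, 1)]
centre = [(0, 0, 0)]

design = []
for coded in corners + axials + centre:
    point = {name: coded_to_real(c, lo, hi)
             for c, (name, (lo, hi)) in zip(coded, factors.items())}
    design.append(point)

for run, point in enumerate(design, 1):
    print(run, point)
```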

  12. Development and validation of a web-based questionnaire for surveying the health and working conditions of high-performance marine craft populations.

    Science.gov (United States)

    de Alwis, Manudul Pahansen; Lo Martire, Riccardo; Äng, Björn O; Garme, Karl

    2016-06-20

    High-performance marine craft crews are susceptible to various adverse health conditions caused by multiple interactive factors. However, there are limited epidemiological data available for the assessment of working conditions at sea. Although questionnaire surveys are widely used for identifying exposures, outcomes and associated risks with high accuracy, until now no validated epidemiological tool has existed for surveying occupational health and performance in these populations. The aim was to develop and validate a web-based questionnaire for the epidemiological assessment of occupational and individual risk exposure pertinent to musculoskeletal health conditions and performance in high-performance marine craft populations. A questionnaire for investigating the association between work-related exposure, performance and health was initially developed by a consensus panel under four subdomains, viz. demography, lifestyle, work exposure and health, and was systematically validated by expert raters for content relevance and simplicity in three consecutive stages, each iteratively followed by a consensus panel revision. The item content validity index (I-CVI) was determined as the proportion of experts giving a rating of 3 or 4. The scale content validity index (S-CVI/Ave) was computed by averaging the I-CVIs for the assessment of the questionnaire as a tool. Finally, the questionnaire was pilot tested. The S-CVI/Ave increased from 0.89 to 0.96 for relevance and from 0.76 to 0.94 for simplicity, resulting in 36 items in the final questionnaire. The pilot test confirmed the feasibility of the questionnaire. The present study shows that the web-based questionnaire fulfils previously published validity acceptance criteria and is therefore considered valid and feasible for the empirical surveying of epidemiological aspects among high-performance marine craft crews and similar populations. Published by the BMJ Publishing Group Limited.
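
    The content validity indices defined above reduce to simple proportions and averages. The following Python sketch illustrates the calculation with hypothetical expert ratings; it is not the authors' code and the ratings are invented.

```python
# Illustrative sketch of the content validity indices defined above.
# Expert ratings are hypothetical; each item is rated 1-4 by each expert.
ratings = {
    "item_1": [4, 3, 4, 4, 3],
    "item_2": [3, 4, 2, 4, 4],
    "item_3": [4, 4, 4, 3, 4],
}

# I-CVI: proportion of experts giving an item a rating of 3 or 4
i_cvi = {item: sum(r >= 3 for r in scores) / len(scores)
         for item, scores in ratings.items()}

# S-CVI/Ave: average of the item-level I-CVIs across the questionnaire
s_cvi_ave = sum(i_cvi.values()) / len(i_cvi)

print(i_cvi)
print(f"S-CVI/Ave = {s_cvi_ave:.2f}")
```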

  13. The Validity of Subjective Performance Measures

    DEFF Research Database (Denmark)

    Meier, Kenneth J.; Winter, Søren C.; O'Toole, Laurence J.

    2015-01-01

    ... to provide, and are highly policy-specific, rendering generalization difficult. But are perceptual performance measures valid, and do they generate unbiased findings? We examine these questions in a comparative study of middle managers in schools in Texas and Denmark. The findings are remarkably similar ...

  14. Validation of a high performance liquid chromatography analysis for the determination of noradrenaline and adrenaline in human urine with an on-line sample purification

    DEFF Research Database (Denmark)

    Hansen, Åse Marie; Kristiansen, J; Nielsen, J L

    1999-01-01

    A high performance liquid chromatography (HPLC) method with fluorescence detection including an on-line purification was established for determination of catecholamines in human urine. The method was evaluated using samples of pooled urine spiked with catecholamines and validated for measurements...

  15. The Validity and Incremental Validity of Knowledge Tests, Low-Fidelity Simulations, and High-Fidelity Simulations for Predicting Job Performance in Advanced-Level High-Stakes Selection

    Science.gov (United States)

    Lievens, Filip; Patterson, Fiona

    2011-01-01

    In high-stakes selection among candidates with considerable domain-specific knowledge and experience, investigations of whether high-fidelity simulations (assessment centers; ACs) have incremental validity over low-fidelity simulations (situational judgment tests; SJTs) are lacking. Therefore, this article integrates research on the validity of…

  16. Method Development and Validation for the Determination of Caffeine: An Alkaloid from Coffea arabica by High-performance Liquid Chromatography Method.

    Science.gov (United States)

    Naveen, P; Lingaraju, H B; Deepak, M; Medhini, B; Prasad, K Shyam

    2018-01-01

    The present study was conducted to develop and validate a reversed-phase high performance liquid chromatography method for the determination of caffeine from bean material of Coffea arabica. The separation was achieved on a reversed-phase C18 column using a mobile phase composed of water:methanol (50:50) at a flow rate of 1.0 mL min⁻¹. Detection was carried out with a UV detector at 272 nm. The developed method was validated according to the requirements of the International Conference on Harmonisation (ICH) guidelines, which include specificity, linearity, precision, accuracy, limit of detection and limit of quantitation. The developed method showed good linearity with an excellent correlation coefficient (R² > 0.999). In repeatability and intermediate precision, the percentage relative standard deviation (% RSD) of the peak area was less than 1%, showing the high precision of the method. The recovery rate for caffeine was within 98.78% - 101.28%, indicating the high accuracy of the method. The low limit of detection and limit of quantitation of caffeine enable the detection and quantitation of caffeine from C. arabica at low concentrations. The developed HPLC method is simple, rapid, precise and accurate, and it is recommended for efficient assays in routine work. In summary, a simple, accurate, and sensitive HPLC method for caffeine from Coffea arabica has been developed and validated for linearity, specificity, precision, recovery, limit of detection, and limit of quantification following the ICH guidelines; the results revealed that the proposed method is highly reliable and could be successfully applied in routine quality control analysis. Abbreviations used: C. arabica: Coffea arabica, ICH: International Conference on Harmonisation, % RSD: Percentage Relative Standard Deviation, R2: Correlation Coefficient, ppm: Parts per million, LOD: Limits

  17. Determination of methyldibromoglutaronitrile in cosmetic products by high-performance liquid chromatography with electrochemical detection. Method validation

    DEFF Research Database (Denmark)

    Rastogi, Suresh Chandra; Zachariae, Claus; Johansen, Jeanne D

    2004-01-01

    An increased frequency of contact allergy to methyldibromoglutaronitrile (MDBGN), a commonly used preservative in cosmetics and other consumer products, has been reported in recent years. A high-performance liquid chromatography (HPLC) method for the determination of MDBGN in cosmetic products ha...

  19. Validation and application of a high-performance liquid chromatography--tandem mass spectrometry assay for mosapride in human plasma.

    Science.gov (United States)

    Ramakrishna, N V S; Vishwottam, K N; Manoj, S; Koteshwara, M; Chidambara, J; Varma, D P

    2005-09-01

    A simple, rapid, sensitive and specific liquid chromatography-tandem mass spectrometry method was developed and validated for the quantification of mosapride (I), a novel and potent gastroprokinetic agent that enhances upper gastrointestinal motility by stimulating the 5-HT4 receptor. The analyte and the internal standard, tamsulosin (II), were extracted by liquid-liquid extraction with diethyl ether-dichloromethane (70:30, v/v) using a Glas-Col Multi-Pulse Vortexer. The chromatographic separation was performed on a reversed-phase Waters Symmetry C18 column with a mobile phase of 0.03% formic acid-acetonitrile (10:90, v/v). The protonated analyte was quantitated in positive ionization mode by multiple reaction monitoring with a mass spectrometer. The mass transitions m/z 422.3 → 198.3 and m/z 409.1 → 228.1 were used to measure I and II, respectively. The assay exhibited a linear dynamic range of 0.5-100.0 ng/mL for mosapride in human plasma. The lower limit of quantitation was 500 pg/mL with a relative standard deviation of less than 15%. Acceptable precision and accuracy were obtained for concentrations over the standard curve ranges. A run time of 2.0 min for each sample made it possible to achieve a throughput of more than 400 human plasma samples per day. The validated method has been successfully used to analyze human plasma samples for application in pharmacokinetic, bioavailability or bioequivalence studies. Copyright (c) 2005 John Wiley & Sons, Ltd.

  20. Quantitative Analysis of Ingenol in Euphorbia species via Validated Isotope Dilution Ultra-high Performance Liquid Chromatography Tandem Mass Spectrometry

    Czech Academy of Sciences Publication Activity Database

    Béres, T.; Dragull, K.; Pospíšil, Jiří; Tarkowská, Danuše; Dančák, M.; Bíba, Ondřej; Tarkowski, P.; Doležal, K.; Strnad, Miroslav

    2018-01-01

    Vol. 29, No. 1 (2018), pp. 23-29. ISSN 0958-0344. R&D Projects: GA ČR GA17-14007S; GA MŠk(CZ) LO1204. Institutional support: RVO:61389030. Keywords: Euphorbia genus * ingenol * isotope-dilution method * mass spectrometry * ultra-high performance liquid chromatography. Subject RIV: FD - Oncology; Hematology. OECD field: Analytical chemistry. Impact factor: 2.292, year: 2016

  1. SENTINEL-2 GLOBAL REFERENCE IMAGE VALIDATION AND APPLICATION TO MULTITEMPORAL PERFORMANCES AND HIGH LATITUDE DIGITAL SURFACE MODEL

    Directory of Open Access Journals (Sweden)

    A. Gaudel

    2017-05-01

    In the frame of the Copernicus program of the European Commission, Sentinel-2 is a constellation of 2 satellites with a revisit time of 5 days in order to have temporal images stacks and a global coverage over terrestrial surfaces. Satellite 2A was launched in June 2015, and satellite 2B will be launched in March 2017. In cooperation with the European Space Agency (ESA), the French space agency (CNES) is in charge of the image quality of the project, and so ensures the CAL/VAL commissioning phase during the months following the launch. This cooperation is also extended to routine phase as CNES supports European Space Research Institute (ESRIN) and the Sentinel-2 Mission performance Centre (MPC) for validation in geometric and radiometric image quality aspects, and in Sentinel-2 GRI geolocation performance assessment whose results will be presented in this paper. The GRI is a set of S2A images at 10m resolution covering the whole world with a good and consistent geolocation. This ground reference enables accurate multi-temporal registration of refined Sentinel-2 products. While not primarily intended for the generation of DSM, Sentinel-2 swaths overlap between orbits would also allow for the generation of a complete DSM of land and ices over 60° of northern latitudes (expected accuracy: few S2 pixels in altimetry). This DSM would benefit from the very frequent revisit times of Sentinel-2, to monitor ice or snow level in area of frequent changes, or to increase measurement accuracy in areas of little changes.

  2. SENTINEL-2 Global Reference Image Validation and Application to Multitemporal Performances and High Latitude Digital Surface Model

    Science.gov (United States)

    Gaudel, A.; Languille, F.; Delvit, J. M.; Michel, J.; Cournet, M.; Poulain, V.; Youssefi, D.

    2017-05-01

    In the frame of the Copernicus program of the European Commission, Sentinel-2 is a constellation of 2 satellites with a revisit time of 5 days in order to have temporal images stacks and a global coverage over terrestrial surfaces. Satellite 2A was launched in June 2015, and satellite 2B will be launched in March 2017. In cooperation with the European Space Agency (ESA), the French space agency (CNES) is in charge of the image quality of the project, and so ensures the CAL/VAL commissioning phase during the months following the launch. This cooperation is also extended to routine phase as CNES supports European Space Research Institute (ESRIN) and the Sentinel-2 Mission performance Centre (MPC) for validation in geometric and radiometric image quality aspects, and in Sentinel-2 GRI geolocation performance assessment whose results will be presented in this paper. The GRI is a set of S2A images at 10m resolution covering the whole world with a good and consistent geolocation. This ground reference enables accurate multi-temporal registration of refined Sentinel-2 products. While not primarily intended for the generation of DSM, Sentinel-2 swaths overlap between orbits would also allow for the generation of a complete DSM of land and ices over 60° of northern latitudes (expected accuracy: few S2 pixels in altimetry). This DSM would benefit from the very frequent revisit times of Sentinel-2, to monitor ice or snow level in area of frequent changes, or to increase measurement accuracy in areas of little changes.

  3. Development and validation of reversed-phase high performance liquid chromatographic method for analysis of cephradine in human plasma samples

    International Nuclear Information System (INIS)

    Ahmad, M.; Usman, M.; Madni, A.; Akhtar, N.; Khalid, N.; Asghar, W.

    2010-01-01

    An HPLC method with high precision, accuracy and selectivity was developed and validated for the assessment of cephradine in human plasma samples. The extraction procedure was simple and accurate, with a single step followed by direct injection of the sample into the HPLC system. The extracted cephradine in spiked human plasma was separated and quantitated using a reversed phase C18 column and a UV detection wavelength of 254 nm. The optimized mobile phase, a new composition of 0.05 M potassium dihydrogen phosphate (pH 3.4)-acetonitrile (88:12), was pumped at an optimum flow rate of 1 mL min⁻¹. The method was linear in the concentration range 0.15-20 μg mL⁻¹. The limit of detection (LOD) and limit of quantification (LOQ) were 0.05 and 0.150 μg mL⁻¹, respectively. The accuracy of the method was 98.68%. This method can be applied for bioequivalence studies and therapeutic drug monitoring as well as for the routine analysis of cephradine. (author)

  4. Quantification of imatinib in human serum: validation of a high-performance liquid chromatography-mass spectrometry method for therapeutic drug monitoring and pharmacokinetic assays

    Directory of Open Access Journals (Sweden)

    Rezende VM

    2013-08-01

    Vinicius Marcondes Rezende, Ariane Rivellis, Mafalda Megumi Yoshinaga Novaes, Dalton de Alencar Fisher Chamone, Israel Bendit (Laboratory of Tumor Biology and Department of Hematology, School of Medicine, University of São Paulo, São Paulo, Brazil). Background: Imatinib mesylate has been a breakthrough treatment for chronic myeloid leukemia. It has become the ideal tyrosine kinase inhibitor and the standard treatment for chronic-phase leukemia. Striking results have recently been reported, but intolerance to imatinib and noncompliance with treatment remain to be solved. Molecular monitoring by quantitative real-time polymerase chain reaction is the gold standard for monitoring patients, and imatinib blood levels have also become an important tool for monitoring. Methods: A fast and cheap method was developed and validated using high-performance liquid chromatography-mass spectrometry for the quantification of imatinib in human serum, with tamsulosin as the internal standard. Remarkable advantages of the method include the use of serum instead of plasma, less time spent on processing and analysis, simpler procedures, and reduced amounts of biological material, solvents, and reagents. The stability of the analyte was also studied. This research also intended to drive the validation scheme in clinical centers. The method was validated according to the requirements of the US Food and Drug Administration and the Brazilian National Health Surveillance Agency within the range of 0.500–10.0 µg/mL, with a limit of detection of 0.155 µg/mL. Stability data for the analyte are also presented. Conclusion: Given that the validated method has proved to be linear, accurate, precise, and robust, it is suitable for pharmacokinetic assays, such as bioavailability and bioequivalence, and is being successfully applied in routine therapeutic drug monitoring in the hospital service. Keywords: imatinib, high-performance liquid chromatography-mass spectrometry, therapeutic ...

  5. Quantification of imatinib in human serum: validation of a high-performance liquid chromatography-mass spectrometry method for therapeutic drug monitoring and pharmacokinetic assays.

    Science.gov (United States)

    Rezende, Vinicius Marcondes; Rivellis, Ariane; Novaes, Mafalda Megumi Yoshinaga; de Alencar Fisher Chamone, Dalton; Bendit, Israel

    2013-01-01

    Imatinib mesylate has been a breakthrough treatment for chronic myeloid leukemia. It has become the ideal tyrosine kinase inhibitor and the standard treatment for chronic-phase leukemia. Striking results have recently been reported, but intolerance to imatinib and noncompliance with treatment remain to be solved. Molecular monitoring by quantitative real-time polymerase chain reaction is the gold standard for monitoring patients, and imatinib blood levels have also become an important tool for monitoring. A fast and cheap method was developed and validated using high-performance liquid chromatography-mass spectrometry for quantification of imatinib in human serum, with tamsulosin as the internal standard. Remarkable advantages of the method include the use of serum instead of plasma, less time spent on processing and analysis, simpler procedures, and reduced amounts of biological material, solvents, and reagents. Stability of the analyte was also studied. This research also intended to drive the validation scheme in clinical centers. The method was validated according to the requirements of the US Food and Drug Administration and Brazilian National Health Surveillance Agency within the range of 0.500-10.0 μg/mL with a limit of detection of 0.155 μg/mL. Stability data for the analyte are also presented. Given that the validated method has proved to be linear, accurate, precise, and robust, it is suitable for pharmacokinetic assays, such as bioavailability and bioequivalence, and is being successfully applied in routine therapeutic drug monitoring in the hospital service.
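
    Bioanalytical validations such as this one report intra-day/inter-day precision and accuracy from quality-control replicates. The sketch below shows one common way to compute those figures; the replicate concentrations and the nominal level are hypothetical, not the imatinib data.

```python
# Illustrative intra-day / inter-day precision (%CV) and accuracy calculation
# from QC replicates. All values are hypothetical placeholders.
import numpy as np

nominal = 2.0  # ug/mL, mid-level QC
# rows = analytical runs (days), columns = replicates within a run
qc = np.array([
    [1.96, 2.03, 1.99, 2.05, 2.01],
    [2.07, 1.98, 2.02, 2.04, 1.97],
    [1.94, 2.00, 2.06, 1.99, 2.03],
])

intra_cv = qc.std(axis=1, ddof=1) / qc.mean(axis=1) * 100   # %CV within each run
inter_cv = qc.mean(axis=1).std(ddof=1) / qc.mean() * 100    # simplified between-run %CV
accuracy = qc.mean() / nominal * 100                        # % of nominal

print("intra-day %CV per run:", np.round(intra_cv, 2))
print(f"inter-day %CV: {inter_cv:.2f}")
print(f"accuracy: {accuracy:.1f}% of nominal")
```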

  6. Ibuprofen analysis in blood samples by palladium particles-impregnated sodium montmorillonite electrodes: Validation using high performance liquid chromatography.

    Science.gov (United States)

    Loudiki, A; Boumya, W; Hammani, H; Nasrellah, H; El Bouabi, Y; Zeroual, M; Farahi, A; Lahrich, S; Hnini, K; Achak, M; Bakasse, M; El Mhammedi, M A

    2016-12-01

    The electrochemical detection of ibuprofen has been studied on Palladium-Montmorillonite (Mt) modified carbon paste electrode using differential pulse voltammetry. The optimization of the modifier preparation and the instrumental parameters was investigated. The results indicate that ibuprofen oxidation was favored in the presence of Pd-PdO particles. The quantitative determination of ibuprofen was statistically analyzed and validated using HPLC method. The detection and quantification limits, specificity and precision were found to be acceptable. Finally, the developed method was successfully applied for ibuprofen determination in human blood samples. Copyright © 2016 Elsevier B.V. All rights reserved.
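
    When a new sensor is cross-validated against HPLC, as in this record, agreement is often summarized with the mean bias and a paired significance test. The sketch below illustrates that idea with hypothetical paired concentrations; it is not necessarily the statistical workup the authors used.

```python
# Illustrative cross-validation of two methods on the same samples:
# mean bias and a paired t-test. Paired concentrations are hypothetical.
import numpy as np
from scipy import stats

electrode = np.array([10.2, 25.4, 49.1, 74.8, 101.3])  # ug/mL by the sensor
hplc      = np.array([10.0, 25.0, 50.2, 75.5, 100.0])  # ug/mL by HPLC

bias = electrode - hplc
t_stat, p_value = stats.ttest_rel(electrode, hplc)

print(f"mean bias = {bias.mean():+.2f} ug/mL (SD {bias.std(ddof=1):.2f})")
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
# p > 0.05 would indicate no statistically significant difference between methods.
```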

  7. Discovery and validation of a colorectal cancer classifier in a new blood test with improved performance for high-risk subjects

    DEFF Research Database (Denmark)

    Croner, Lisa J; Dillon, Roslyn; Kao, Athit

    2017-01-01

    BACKGROUND: The aim was to improve upon an existing blood-based colorectal cancer (CRC) test directed to high-risk symptomatic patients, by developing a new CRC classifier to be used with a new test embodiment. The new test uses a robust assay format-electrochemiluminescence immunoassays......, the indeterminate rate of the new panel was 23.2%, sensitivity/specificity was 0.80/0.83, PPV was 36.5%, and NPV was 97.1%. CONCLUSIONS: The validated classifier serves as the basis of a new blood-based CRC test for symptomatic patients. The improved performance, resulting from robust concentration measures across......-to quantify protein concentrations. The aim was achieved by building and validating a CRC classifier using concentration measures from a large sample set representing a true intent-to-test (ITT) symptomatic population. METHODS: 4435 patient samples were drawn from the Endoscopy II sample set. Samples were...
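
    The sensitivity, specificity, PPV, and NPV quoted above all follow from a confusion matrix, and the predictive values additionally depend on disease prevalence. The sketch below shows the arithmetic with hypothetical counts, not the Endoscopy II data.

```python
# Illustrative confusion-matrix metrics and prevalence-adjusted PPV/NPV (Bayes' rule).
tp, fn, tn, fp = 80, 20, 830, 170   # hypothetical counts among determinate calls

sens = tp / (tp + fn)
spec = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} PPV={ppv:.2%} NPV={npv:.2%}")

# Predictive values at an arbitrary prevalence (they are prevalence-dependent)
prev = 0.10
ppv_at = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
npv_at = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
print(f"at {prev:.0%} prevalence: PPV={ppv_at:.2%}, NPV={npv_at:.2%}")
```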

  8. The design, validation, and performance of Grace

    Directory of Open Access Journals (Sweden)

    Ru Zhu

    2016-05-01

    Full Text Available The design, validation and performance of Grace, a GPU-accelerated micromagnetic simulation software, are presented. The software adopts C++ Accelerated Massive Parallelism (C++ AMP) so that it runs on GPUs from various hardware vendors including NVidia, AMD and Intel. At large simulation scales, a speedup factor of up to two orders of magnitude is observed compared to the CPU-based micromagnetic simulation software OOMMF. The software can run on high-end professional GPUs as well as budget personal laptops, and is free to download.

  9. Validation of high-performance liquid chromatographic method for analysis of fluconazole in microemulsions and liquid crystals

    Directory of Open Access Journals (Sweden)

    Hilris Rocha e Silva

    2014-04-01

    Full Text Available In recent decades, there has been a significant increase in the incidence of fungal diseases. Certain fungal diseases cause cutaneous lesions, and with the usual treatment, generally administered orally, the drug reaches the site of action with difficulty and its concentration is too low. An approach much explored in recent years is the development of nanotechnology-based drug delivery systems, and microemulsions (ME) and liquid crystals (LC) are promising. ME and LC were developed with oleic acid or copaiba oil as the oil phase, propoxyl (5 OP) ethoxyl (20 OE) cetyl alcohol as surfactant, and water. An analytical method to assess the incorporation of fluconazole (FLU) in the systems under study was validated according to the International Conference on Harmonization (ICH) guidelines and the Brazilian Food, Drug and Sanitation Agency (ANVISA). The method was conducted on a C18-RP column (250 × 4.6 mm i.d.), maintained at room temperature. The mobile phase consisted of acetonitrile and water (50:50, v/v), run at a flow rate of 1.0 mL/min and using ultraviolet detection at 210 nm. Chromatographic separation was obtained with a retention time of 6.3 min, and the method was linear in the range of 20-400 µg/mL (r2 = 0.9999). The specificity showed no interference from the excipients. The accuracy was 100.76%. The limits of detection and quantitation were 0.057 and 0.172 µg.mL-1, respectively. Moreover, method validation demonstrated satisfactory results for precision and robustness. The proposed method was applied for the analysis of the incorporation of FLU in ME and LC, contributing to improved quality control and to assuring therapeutic efficacy.

  10. Densitometric Validation and Optimisation of Polyphenols in Ocimum sanctum Linn by High Performance Thin-layer Chromatography.

    Science.gov (United States)

    U K, Ilyas; Katare, Deepshikha P; Aeri, Vidhu

    2015-01-01

    Ocimum sanctum Linn (Sanskrit: Tulasi; family: Lamiaceae), popularly known as holy basil or Ocimum tenuiflorum, is found throughout the semitropical and tropical parts of India. In Ayurveda, Tulasi has been well known for its therapeutic potential. The objective was to optimise and develop a standard method to quantify seven polyphenols simultaneously by HPTLC. A three-level, three-factor Box-Behnken statistical design was used for optimisation, where extraction time (min), temperature (°C) and methanol:water ratio (% v/v) are the independent variables, with polyphenols as the dependent variable. The separation was achieved on a silica-gel 60 F254 HPTLC plate using toluene:ethyl acetate:formic acid:methanol (3:3:0.8:0.2 v/v) as the mobile phase. Densitometric analysis of polyphenols was carried out in the absorbance mode at 366 nm. The quantification of polyphenols was carried out based on peak area with linear calibration curves at concentration ranges of 60-240, 20-200, 100-1600, 40-200, 200-1400, 10-160, 200-1400 and 100-5000 ng/band for caffeic acid, ellagic acid, rutin, kaempferol, catechin, quercetin, eupalitin and epicatechin, respectively. The method was validated for peak purity, precision, accuracy, limit of detection (LOD) and limit of quantification (LOQ). Method specificity was confirmed using the retention factor value and visible spectra correlation of marker compounds. A validated HPTLC method was newly developed for simultaneous quantification of seven polyphenols in an Ayurvedic preparation of O. sanctum. The proposed method is simple, precise, specific, accurate, cost-effective, less time consuming and has the ability to separate the polyphenols from other constituents. Copyright © 2015 John Wiley & Sons, Ltd.
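
    A three-factor Box-Behnken design like the one described above (extraction time, temperature, methanol:water ratio) consists of the twelve edge midpoints plus centre replicates. The sketch below constructs such a design in coded units; the real factor ranges used for decoding are assumptions for illustration, not the paper's settings.

```python
# Illustrative construction of a three-factor Box-Behnken design (coded -1/0/+1).
import itertools
import numpy as np

def box_behnken_3():
    """12 edge midpoints (pairs of factors at +/-1, third at 0) + centre points."""
    runs = []
    for i, j in itertools.combinations(range(3), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            row = [0, 0, 0]
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0, 0, 0]] * 3           # centre replicates
    return np.array(runs)

design = box_behnken_3()
# Decode to hypothetical real units: time 10-30 min, temp 40-80 C, MeOH 50-90 %
lows  = np.array([10, 40, 50])
highs = np.array([30, 80, 90])
real = lows + (design + 1) / 2 * (highs - lows)
print(design.shape)   # (15, 3) -> 15 runs
print(real[:4])
```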

  11. Rapid Development and Validation of Improved Reversed-Phase High-performance Liquid Chromatography Method for the Quantification of Mangiferin, a Polyphenol Xanthone Glycoside in Mangifera indica.

    Science.gov (United States)

    Naveen, P; Lingaraju, H B; Prasad, K Shyam

    2017-01-01

    Mangiferin, a polyphenolic xanthone glycoside from Mangifera indica , is used as traditional medicine for the treatment of numerous diseases. The present study aimed to develop and validate a reversed-phase high-performance liquid chromatography (RP-HPLC) method for the quantification of mangiferin from the bark extract of M. indica . RP-HPLC analysis was performed by isocratic elution with a low-pressure gradient using 0.1% formic acid:acetonitrile (87:13) as the mobile phase with a flow rate of 1.5 ml/min. The separation was done at 26°C using a Kinetex XB-C18 column as the stationary phase and a detection wavelength of 256 nm. The proposed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification, and robustness according to the International Conference on Harmonisation guidelines. In the linearity study, a correlation coefficient greater than 0.999 indicated good fitting of the curve and good linearity. The intra- and inter-day precision showed high reliability and reproducibility of the method. The recovery values at three different levels (50%, 100%, and 150%) of spiked samples were found to be 100.47%, 100.89%, and 100.99%, respectively, and the low standard deviation values indicated high accuracy of the method. In robustness, the results remained unaffected by small variations in the analytical parameters, which shows the robustness of the method. Liquid chromatography-mass spectrometry analysis confirmed the presence of mangiferin with an M/Z value of 421. The developed HPLC assay is simple, rapid, and reliable for the determination of mangiferin from M. indica . The present study was intended to develop and validate an RP-HPLC method for the quantification of mangiferin from the bark extract of M. indica . The developed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification and robustness by International Conference on Harmonization guidelines. This study proved that the developed
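
    Recovery at 50%, 100%, and 150% spike levels, as reported above, is computed by subtracting the unspiked result from the spiked result and dividing by the amount added. The sketch below uses hypothetical concentrations, not the mangiferin data.

```python
# Illustrative spike-recovery (accuracy) calculation at 50%, 100% and 150% levels.
import numpy as np

base = 40.0                                     # ug/mL found in unspiked sample
added = np.array([20.0, 40.0, 60.0])            # 50%, 100%, 150% spikes
found = np.array([
    [60.1, 60.3, 59.9],                         # triplicates at each level
    [80.5, 80.2, 80.7],
    [100.4, 100.8, 100.2],
])

recovery = (found - base) / added[:, None] * 100
for level, rec in zip((50, 100, 150), recovery):
    print(f"{level:>3}% spike: mean recovery {rec.mean():.2f}% "
          f"(RSD {rec.std(ddof=1) / rec.mean() * 100:.2f}%)")
```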

  12. Development and validation of a high-performance liquid chromatographic method for the determination of cyproterone acetate in human skin.

    Science.gov (United States)

    Henry de Hassonville, Sandrine; Chiap, Patrice; Liégeois, Jean-François; Evrard, Brigitte; Delattre, Luc; Crommen, Jacques; Piel, Géraldine; Hubert, Philippe

    2004-09-21

    In the framework of a preliminary study on the transdermal penetration of cyproterone acetate (CPA), a simple and rapid procedure involving an extraction step coupled to HPLC-UV determination has been developed for the separation and quantification of CPA in the two main skin layers (epidermis and dermis) after local application. The separation of the epidermis and dermis layers was carefully carried out by means of a sharp spatula after skin immersion in water heated to 65 degrees C. The two skin layers were then treated separately according to the same process: (1) sample homogenization by vibration after freezing with liquid nitrogen in a Mikro-Dismembrator; (2) CPA extraction with methanol after addition of the internal standard (betamethasone dipropionate); (3) centrifugation; (4) evaporation of a supernatant aliquot; (5) dissolution of the dry residue in methanol and addition of water; (6) centrifugation; (7) injection of a supernatant aliquot into the HPLC system. The separation was achieved on an octadecylsilica stationary phase using a mobile phase consisting of a mixture of acetonitrile and water (40:60 (v/v)). The method was then validated using a new approach based on accuracy profiles over a CPA concentration range from 33 to 667 ng/ml for each skin layer. Finally, the method was successfully applied to the determination of CPA in several skin samples after topical application of different gel formulations containing CPA.

  13. Validated high-performance liquid chromatographic method for the standardisation of Ptychopetalum olacoides Benth., Olacaceae, commercial extracts

    Directory of Open Access Journals (Sweden)

    Renata Colombo

    2010-10-01

    Full Text Available Ptychopetalum olacoides Benth., Olacaceae, popularly known as marapuama, muirapuama or miriantã, is a species native to the Amazonian region of Brazil. Extracts of the bark of the plant have been used traditionally for their stimulating and aphrodisiac properties and are currently commercialised by the herbal industry as constituents of a wide range of phytomedicines. Fractionation by open column chromatography followed by preparative HPLC-UV/PAD of the stem bark and of three commercial extracts of P. olacoides allowed the isolation of three components that were common to all extracts analysed, and these were identified by NMR as vanillic acid, protocatechuic acid and theobromine. Vanillic acid, which has been proposed as a phytochemical marker for P. olacoides, was employed as an external standard in the development and validation of a rapid qualitative and quantitative HPLC assay for the analyte. The recovery value of the developed method was 99.02%, and the LOD and LOQ values were 0.033 and 0.11 mg.L-1, respectively. The described method may be applied to the standardisation of herbs, extracts or phytomedicines commercialised as marapuama.

  14. Validating High-Stakes Testing Programs.

    Science.gov (United States)

    Kane, Michael

    2002-01-01

    Makes the point that the interpretations and use of high-stakes test scores rely on policy assumptions about what should be taught and the content standards and performance standards that should be applied. The assumptions built into an assessment need to be subjected to scrutiny and criticism if a strong case is to be made for the validity of the…

  15. A simple, rapid and validated high-performance liquid chromatography method suitable for clinical measurements of human mercaptalbumin and non-mercaptalbumin.

    Science.gov (United States)

    Yasukawa, Keiko; Shimosawa, Tatsuo; Okubo, Shigeo; Yatomi, Yutaka

    2018-01-01

    Background Human mercaptalbumin and human non-mercaptalbumin have been reported as markers for various pathological conditions, such as kidney and liver diseases. These markers play important roles in redox regulations throughout the body. Despite the recognition of these markers in various pathophysiologic conditions, the measurements of human mercaptalbumin and non-mercaptalbumin have not been popular because of the technical complexity and long measurement time of conventional methods. Methods Based on previous reports, we explored the optimal analytical conditions for a high-performance liquid chromatography method using an anion-exchange column packed with a hydrophilic polyvinyl alcohol gel. The method was then validated using performance tests as well as measurements of various patients' serum samples. Results We successfully established a reliable high-performance liquid chromatography method with an analytical time of only 12 min per test. The repeatability (within-day variability) and reproducibility (day-to-day variability) were 0.30% and 0.27% (CV), respectively. A very good correlation was obtained with the results of the conventional method. Conclusions A practical method for the clinical measurement of human mercaptalbumin and non-mercaptalbumin was established. This high-performance liquid chromatography method is expected to be a powerful tool enabling the expansion of clinical usefulness and ensuring the elucidation of the roles of albumin in redox reactions throughout the human body.

  16. Bio-analytical method development and validation of Rasagiline by high performance liquid chromatography tandem mass spectrometry detection and its application to pharmacokinetic study

    Directory of Open Access Journals (Sweden)

    Ravi Kumar Konda

    2012-10-01

    Full Text Available A suitable bio-analytical method based on liquid–liquid extraction has been developed and validated for the quantification of Rasagiline in human plasma. Rasagiline-13C3 mesylate was used as an internal standard for Rasagiline. A Zorbax Eclipse Plus C18 (2.1 mm×50 mm, 3.5 μm) column provided chromatographic separation of the analyte, followed by detection with mass spectrometry. The method involved a simple isocratic chromatographic condition and mass spectrometric detection in the positive ionization mode using an API-4000 system. The total run time was 3.0 min. The proposed method has been validated over the linear range of 5–12000 pg/mL for Rasagiline. The intra-run and inter-run precision values were within 1.3%–2.9% and 1.6%–2.2%, respectively, for Rasagiline. The overall recovery for Rasagiline and the Rasagiline-13C3 mesylate analog was 96.9% and 96.7%, respectively. This validated method was successfully applied to a bioequivalence and pharmacokinetic study in human volunteers under fasting conditions. Keywords: High performance liquid chromatography, Mass spectrometry, Rasagiline, Liquid–liquid extraction

  17. Modification and validation of a high-performance liquid chromatography method for quantification of Huperzine A in Huperzia crispata.

    Science.gov (United States)

    Yu, Lijun; Shi, Yunfeng; Huang, Jianan; Gong, Yushun; Liu, Zhonghua; Hu, Weixin

    2010-01-01

    The present study describes a rapid and sensitive HPLC method for the quantification of huperzine A (HupA) in Huperzia crispata (Huperziaceae). The sample extraction and preparation involved a simple, time-saving, single-solvent extraction, with each sample being analyzed within 12 min. The mobile phase was ammonium acetate (0.1 M, pH 6.0)--methanol (64 + 36, v/v) at a flow rate of 1.0 mL/min. Detection was at 308 nm. The calibration curve was linear from 0.049 to 7.84 microg (R2 = 0.9997), with intraday and interday precision RSD of less than 2%. The extraction recovery rate was over 98.49%. Quantification of HupA was performed using this modified method, and the content of HupA was 1.86 times higher in the whole plant of H. crispata (218.17 +/- 1.55 microg/g) than in that of H. serrata (117.03 +/- 2.97 microg/g). In the whole plant of H. crispata, HupA mainly accumulated in the actively growing shoot tips, the apical bud, and the 10 youngest leaves, reaching 455.23 +/- 2.97 microg/g. The content of HupA in the samples from sunshine-sheltered sites was 3.45 times higher than in that from sunshine-abundant sites. The satisfactory results indicate that this modified method can be applied in the quality control of large-scale Huperziaceae plant extracts and that changes should be made in the cultivation of H. crispata so as to maximize the production of HupA.

  18. Development and validation of stability indicating method for the quantitative determination of venlafaxine hydrochloride in extended release formulation using high performance liquid chromatography

    Directory of Open Access Journals (Sweden)

    Jaspreet Kaur

    2010-01-01

    Full Text Available Objective: Venlafaxine hydrochloride is a structurally novel phenethyl bicyclic antidepressant, and is usually categorized as a serotonin-norepinephrine reuptake inhibitor (SNRI), but it has also been referred to as a serotonin-norepinephrine-dopamine reuptake inhibitor since it inhibits the reuptake of dopamine. Venlafaxine HCl is widely prescribed in the form of sustained release formulations. In the current article we report the development and validation of a fast and simple stability-indicating, isocratic high performance liquid chromatographic (HPLC) method for the determination of venlafaxine hydrochloride in sustained release formulations. Materials and Methods: The quantitative determination of venlafaxine hydrochloride was performed on a Kromasil C18 analytical column (250 x 4.6 mm i.d., 5 μm particle size) with 0.01 M phosphate buffer (pH 4.5):methanol (40:60) as the mobile phase, at a flow rate of 1.0 ml/min. UV detection was made at 225 nm. Results: During method validation, parameters such as precision, linearity, accuracy, stability, limit of quantification and detection, and specificity were evaluated, and all remained within acceptable limits. Conclusions: The method has been successfully applied for the quantification and dissolution profiling of venlafaxine HCl in a sustained release formulation. The method presents a simple and reliable solution for the routine quantitative analysis of venlafaxine HCl.

  19. Analytical Method Development and Validation for the Simultaneous Estimation of Abacavir and Lamivudine by Reversed-phase High-performance Liquid Chromatography in Bulk and Tablet Dosage Forms.

    Science.gov (United States)

    Raees Ahmad, Sufiyan Ahmad; Patil, Lalit; Mohammed Usman, Mohammed Rageeb; Imran, Mohammad; Akhtar, Rashid

    2018-01-01

    A simple, rapid, accurate, precise, and reproducible validated reverse phase high performance liquid chromatography (HPLC) method was developed for the determination of Abacavir (ABAC) and Lamivudine (LAMI) in bulk and tablet dosage forms. The quantification was carried out using a Symmetry Premsil C18 (250 mm × 4.6 mm, 5 μm) column run in isocratic mode using a mobile phase comprising methanol:water (0.05% orthophosphoric acid, pH 3) 83:17 v/v, with a detection wavelength of 245 nm, an injection volume of 20 μl, and a flow rate of 1 ml/min. In the developed method, the retention times of ABAC and LAMI were found to be 3.5 min and 7.4 min, respectively. The method was validated in terms of linearity, precision, accuracy, limits of detection, limits of quantitation, and robustness in accordance with the International Conference on Harmonization guidelines. The assay of the proposed method was found to be 99% - 101%. The recovery studies were also carried out and the mean % recovery was found to be 99% - 101%. The % relative standard deviation for reproducibility was within acceptable limits. Abbreviations: HPLC: High-performance liquid chromatography, UV: Ultraviolet, ICH: International Conference on Harmonization, ABAC: Abacavir, LAMI: Lamivudine, HIV: Human immunodeficiency virus, AIDS: Acquired immunodeficiency syndrome, NRTI: Nucleoside reverse transcriptase inhibitors, ARV: Antiretroviral, RSD: Relative standard deviation, RT: Retention time, SD: Standard deviation.

  20. Experimental validation of prototype high voltage bushing

    Science.gov (United States)

    Shah, Sejal; Tyagi, H.; Sharma, D.; Parmar, D.; M. N., Vishnudev; Joshi, K.; Patel, K.; Yadav, A.; Patel, R.; Bandyopadhyay, M.; Rotti, C.; Chakraborty, A.

    2017-08-01

    Prototype High Voltage Bushing (PHVB) is a scaled-down configuration of the DNB High Voltage Bushing (HVB) of ITER. It is designed for operation at 50 kV DC to ensure operational performance and thereby confirm the design configuration of the DNB HVB. Two concentric insulators, viz. ceramic and fiber reinforced polymer (FRP) rings, are used as a double-layered vacuum boundary for 50 kV isolation between the grounded and high voltage flanges. Stress shields are designed for smooth electric field distribution. During ceramic-to-Kovar brazing, spilling cannot be controlled, which may lead to high localized electrostatic stress. To understand the spilling phenomenon and allow precise stress calculation, a quantitative analysis was performed using Scanning Electron Microscopy (SEM) of a brazed sample, and a similar configuration was modeled in the Finite Element (FE) analysis. FE analysis of the PHVB was performed to find the electrical stresses in different areas of the PHVB, which are maintained similar to those of the DNB HV Bushing. With this configuration, the experiment was performed considering ITER-like vacuum and electrical parameters. The initial HV test was performed with temporary vacuum sealing arrangements using gaskets/O-rings at both ends in order to achieve the desired vacuum and keep the system maintainable. During the validation test, a 50 kV voltage withstand was performed for one hour. A voltage withstand test at 60 kV DC (20% higher than the rated voltage) has also been performed without any breakdown. Successful operation of the PHVB confirms the design of the DNB HV Bushing. In this paper, the configuration of the PHVB with experimental validation data is presented.
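
    For a bushing with concentric insulation between a high voltage and a grounded flange, a first-order check of the electrostatic stress can be made with the ideal coaxial-cylinder formula E(r) = V / (r ln(R_out/R_in)); the detailed stress distribution of course requires the FE analysis described above. The radii and voltage in the sketch below are hypothetical, not ITER values.

```python
# Idealised analytic cross-check of stress in a coaxial insulation gap.
import math

V = 50e3                     # applied voltage, volts (hypothetical)
r_in, r_out = 0.10, 0.18     # inner (HV) and outer (grounded) radii, metres (hypothetical)

def coaxial_field(r, v=V, a=r_in, b=r_out):
    """Radial field magnitude (V/m) at radius r for an ideal coaxial geometry."""
    return v / (r * math.log(b / a))

e_max = coaxial_field(r_in)      # stress peaks at the inner conductor
e_min = coaxial_field(r_out)
print(f"E_max = {e_max/1e6:.2f} MV/m at r = {r_in} m")
print(f"E_min = {e_min/1e6:.2f} MV/m at r = {r_out} m")
```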

  1. Development and validation of a high-performance liquid chromatography method for the quantification of talazoparib in rat plasma: Application to plasma protein binding studies.

    Science.gov (United States)

    Hidau, Mahendra Kumar; Kolluru, Srikanth; Palakurthi, Srinath

    2018-02-01

    A sensitive and selective RP-HPLC method has been developed and validated for the quantification of the highly potent poly(ADP-ribose) polymerase inhibitor talazoparib (TZP) in rat plasma. Chromatographic separation was performed with an isocratic elution method. Absorbance of TZP was measured with a UV detector (SPD-20A UV-vis) at a λ max of 227 nm. Protein precipitation was used to extract the drug from plasma samples using methanol-acetonitrile (65:35) as the precipitating solvent. The method proved to be sensitive and reproducible over a 100-2000 ng/mL linearity range with a lower limit of quantification (LLOQ) of 100 ng/mL. TZP recovery was found to be >85%. Following analytical method development and validation, the method was successfully employed to determine the plasma protein binding of TZP. TZP has a high level of protein binding in rat plasma (95.76 ± 0.38%) as determined by the dialysis method. Copyright © 2017 John Wiley & Sons, Ltd.
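
    A protein-binding figure like the 95.76 ± 0.38% above follows from the bound fraction in a dialysis experiment: %bound = (C_total − C_free)/C_total × 100. The sketch below uses hypothetical replicate concentrations, not the talazoparib measurements.

```python
# Illustrative plasma protein binding calculation from equilibrium dialysis.
import numpy as np

c_plasma = np.array([512.0, 498.0, 505.0])   # total drug, ng/mL (plasma side)
c_buffer = np.array([21.5, 22.0, 20.8])      # free drug, ng/mL (buffer side)

bound_pct = (c_plasma - c_buffer) / c_plasma * 100
print(f"protein binding = {bound_pct.mean():.2f} +/- {bound_pct.std(ddof=1):.2f} %")
```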

  2. Development and validation of a dissolution test with reversed-phase high performance liquid chromatographic analysis for Candesartan cilexetil in tablet dosage forms

    Directory of Open Access Journals (Sweden)

    Vairappan Kamalakkannan

    2016-09-01

    Full Text Available A simple, rapid, selective and reproducible reversed-phase high performance liquid chromatographic (RP-HPLC) method has been developed and validated for the estimation of the release of Candesartan cilexetil (CC) from tablets. Analysis was performed on an Agilent Zorbax C8 column (150 mm × 4.6 mm, 5 μm) with a mobile phase consisting of phosphate buffer (pH 2.5)-acetonitrile (15:85, v/v) at a flow rate of 1.0 mL/min. UV detection was performed at 215 nm and the retention time for CC was 2.2 min. The calibration curve was linear (correlation coefficient = 1.000) in the selected range of analyte. The optimized dissolution conditions include USP apparatus 2 at a paddle rotation rate of 50 rpm and 900 mL of phosphate buffer (pH 7.2) with 0.03% polysorbate 80 as the dissolution medium, at 37.0 ± 0.5°C. The method was validated for precision, linearity, specificity, accuracy, limit of quantitation and ruggedness. The system suitability parameters, such as theoretical plates, tailing factor and relative standard deviation (RSD) between six standard replicates, were well within the limits. The stability results show that the drug is stable in the prescribed dissolution medium. Three different batches (A, B and C) of the formulation containing 8 mg of Candesartan cilexetil were analysed with the developed method and the results showed no significant differences among the batches.
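
    Dissolution profiles of different batches, as compared above, are commonly summarized with the FDA/EMA similarity factor f2, where f2 ≥ 50 is usually read as "similar"; the record does not state that the authors used f2, so this is only an illustration, and the % released values below are hypothetical.

```python
# Illustrative f2 similarity factor between two dissolution profiles.
import numpy as np

time_points = [5, 10, 15, 30, 45]                   # minutes (hypothetical)
batch_a = np.array([35.0, 58.0, 74.0, 92.0, 98.0])  # % released, batch A
batch_b = np.array([33.0, 55.0, 71.0, 90.0, 97.0])  # % released, batch B

def f2(ref, test):
    """Similarity factor: 50*log10(100 / sqrt(1 + mean squared difference))."""
    n = len(ref)
    return 50.0 * np.log10(100.0 / np.sqrt(1.0 + np.sum((ref - test) ** 2) / n))

print(f"f2(A, B) = {f2(batch_a, batch_b):.1f}")   # >= 50 suggests similar profiles
```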

  3. Analysis of new psychoactive substances in human urine by ultra-high performance supercritical fluid and liquid chromatography: Validation and comparison.

    Science.gov (United States)

    Borovcová, Lucie; Pauk, Volodymyr; Lemr, Karel

    2018-05-01

    New psychoactive substances represent a serious social and health problem, as tens of new compounds are detected in Europe annually. They often show structural proximity or even isomerism, which complicates their analysis. Two methods based on ultra-high performance supercritical fluid chromatography and ultra-high performance liquid chromatography with mass spectrometric detection were validated and compared. A simple dilute-filter-and-shoot protocol utilizing propan-2-ol or methanol for supercritical fluid or liquid chromatography, respectively, was proposed to detect and quantify 15 cathinones and phenethylamines in human urine. Both methods offered fast separation. Limits of detection in urine ranged from 0.01 to 2.3 ng/mL, except for cathinone (5 ng/mL) in supercritical fluid chromatography. Nevertheless, this technique distinguished all analytes including four pairs of isomers, while liquid chromatography was unable to resolve fluoromethcathinone regioisomers. Concerning matrix effects and recoveries, supercritical fluid chromatography produced more uniform results for different compounds and at different concentration levels. This work demonstrates the performance and reliability of supercritical fluid chromatography and corroborates its applicability as an alternative tool for analysis of new psychoactive substances in biological matrixes. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
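
    Matrix effects and recoveries, as evaluated above, are often calculated from the post-extraction spike scheme of Matuszewski and co-workers (neat standard, blank matrix spiked after processing, blank matrix spiked before processing); this may not be exactly the authors' protocol, and the peak areas below are hypothetical.

```python
# Illustrative matrix effect (ME%), recovery (RE%) and process efficiency (PE%).
import numpy as np

area_neat = np.array([10500, 10620, 10480])   # A: standard in solvent
area_post = np.array([9800, 9910, 9750])      # B: blank urine spiked after processing
area_pre  = np.array([9300, 9420, 9240])      # C: blank urine spiked before processing

matrix_effect = area_post.mean() / area_neat.mean() * 100   # ME% = B/A
recovery      = area_pre.mean() / area_post.mean() * 100    # RE% = C/B
process_eff   = area_pre.mean() / area_neat.mean() * 100    # PE% = C/A = ME*RE/100

print(f"matrix effect {matrix_effect:.1f}%, recovery {recovery:.1f}%, "
      f"process efficiency {process_eff:.1f}%")
```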

  4. Validated high-performance anion-exchange chromatography with pulsed amperometric detection method for the determination of residual keratan sulfate and other glucosamine impurities in sodium chondroitin sulfate.

    Science.gov (United States)

    Bottelli, Susanna; Grillo, Gianluca; Barindelli, Edoardo; Nencioni, Alessandro; Di Maria, Alessandro; Fossati, Tiziano

    2017-07-07

    An efficient and sensitive analytical method based on high-performance anion exchange chromatography with pulsed amperometric detection (HPAEC-PAD) was devised for the determination of glucosamine (GlcN) in sodium chondroitin sulfate (CS). Glucosamine (GlcN) is intended as a marker of residual keratan sulfate (KS) and of other impurities that generate glucosamine upon acid hydrolysis, which breaks CS and KS down to their respective monomers. Since GlcN is present only in KS, we developed a method that separates GlcN from GalN, the principal hydrolytic product of CS, and then validated it in order to quantify GlcN. Method validation was performed by spiking CS raw material with known amounts of KS. The detection limit was 0.5% of KS in CS (corresponding to 0.1 μg/ml), and the linear range was 0.5-5% of KS in CS (corresponding to 0.1-1 μg/ml). The optimized analysis was carried out on an ICS-5000 system (Dionex, Sunnyvale, CA, USA) equipped with a Dionex Amino Trap guard column (3 mm×30 mm), a Dionex CarboPac-PA20 (3 mm×30 mm) and a Dionex CarboPac-PA20 analytical column (3 mm×150 mm) using gradient elution at a 0.5 ml/min flow rate. Regression equations revealed a good linear relationship (R 2 =0.99, n=5) within the test ranges. Quality parameters, including precision and accuracy, were fully validated and found to be satisfactory. The fully validated HPAEC-PAD method was readily applied to the quantification of residual KS in CS in several raw materials and in USP/EP reference substance. The results confirmed that the HPAEC-PAD method is more specific than the electrophoretic method for related substances reported in the EP and provides sensitive determination of KS in acid-hydrolyzed CS samples, enabling the quantitation of KS and other impurities (generating glucosamine) in CS. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Validated high-performance thin-layer chromatographic (HPTLC method for simultaneous determination of nadifloxacin, mometasone furoate, and miconazole nitrate cream using fractional factorial design

    Directory of Open Access Journals (Sweden)

    Kalpana G. Patel

    2016-07-01

    Full Text Available A high-performance thin-layer chromatographic method for the simultaneous determination of nadifloxacin, mometasone furoate, and miconazole nitrate was developed and validated as per International Conference on Harmonization guidelines. High-performance thin-layer chromatographic separation was performed on aluminum plates precoated with silica gel 60 F254 using methanol:ethyl acetate:toluene:acetonitrile:3 M ammonium formate in water (1:2.5:6.0:0.3:0.2, % v/v) as the optimized mobile phase at a detection wavelength of 224 nm. The retardation factor (Rf) values for nadifloxacin, mometasone furoate, and miconazole nitrate were 0.23, 0.70, and 0.59, respectively. Percent recoveries in terms of accuracy for the marketed formulation were found to be 98.35–99.76%, 99.36–99.65%, and 99.16–100.25% for nadifloxacin, mometasone furoate, and miconazole nitrate, respectively. The pooled percent relative standard deviation for repeatability and intermediate precision studies was found to be < 2% for the three target analytes. The effect of four independent variables, methanol content in the total mobile phase, wavelength, chamber saturation time, and solvent front, was evaluated by a fractional factorial design for robustness testing. Amongst all four factors, the volume of methanol in the mobile phase appeared to have a possibly significant effect on the retention factor of miconazole nitrate compared with the other two drugs, nadifloxacin and mometasone furoate, and therefore it was important to carefully control it. In summary, a novel, simple, accurate, reproducible, and robust high-performance thin-layer chromatographic method was developed, which would be of use in the quality control of these cream formulations.
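
    A four-factor robustness screen like the one above can be run as a 2^(4-1) fractional factorial design (8 runs instead of 16). The sketch below builds such a design with the common generator D = ABC and estimates the factor effects from hypothetical Rf responses; the exact design and responses used by the authors are not given in the record.

```python
# Illustrative 2^(4-1) fractional factorial design (resolution IV, D = ABC)
# and simple effect estimation. Responses are hypothetical placeholders.
import itertools
import numpy as np

base = np.array(list(itertools.product((-1, 1), repeat=3)))   # full 2^3 in A, B, C
design = np.column_stack([base, base.prod(axis=1)])           # D = A*B*C
response = np.array([0.58, 0.60, 0.57, 0.61, 0.59, 0.62, 0.58, 0.60])  # fake Rf values

# Effect of each factor = mean(response at +1) - mean(response at -1)
effects = [(response[design[:, k] == 1].mean()
            - response[design[:, k] == -1].mean()) for k in range(4)]
for name, eff in zip("ABCD", effects):
    print(f"factor {name}: effect = {eff:+.3f}")
```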

  6. Development and validation of a high-performance liquid chromatography method with post-column derivatization for the detection of aflatoxins in cereals and grains.

    Science.gov (United States)

    Asghar, Muhammad Asif; Iqbal, Javed; Ahmed, Aftab; Khan, Mobeen Ahmed; Shamsuddin, Zuzzer Ali; Jamil, Khalid

    2016-06-01

    A novel, reliable and rapid high-performance liquid chromatography (HPLC) method with post-column derivatization was developed and validated. The HPLC method was used for the simultaneous determination of aflatoxin B1 (AFB1), B2 (AFB2), G1 (AFG1) and G2 (AFG2) in various cereals and grains. Samples were extracted with 80:20 (v/v) methanol:water and purified using C18 (40-63 μm) solid-phase extraction cartridges. AFs were separated using a LiChroCART-RP-18 (5 μm, 250 × 4.0 mm(2)) column. The mobile phase consisted of methanol:acetonitrile:buffer (17.5:17.5:65 v/v) (pH 7.4) delivered at a flow rate of 1.0 mL min(-1). The fluorescence of each AF was detected at λex = 365 nm and λem = 435 nm. All four AFs were properly resolved within a total run time of 20 min. The established method was extensively validated as a final verification of the method development by the evaluation of selectivity (AFB1, AFB2, AFG1 and AFG2), linearity (R(2) ≥ 0.9994), precision (average SD ≤ 2.79), accuracy (relative mean error ≤ -5.51) and robustness. The developed HPLC method could be effectively applied for the routine analysis of the AFs in different cereals and grains. © The Author(s) 2014.

  7. Development, validation, and application of a method for selected avermectin determination in rural waters using high performance liquid chromatography and fluorescence detection.

    Science.gov (United States)

    Lemos, Maria Augusta Travassos; Matos, Camila Alves; de Resende, Michele Fabri; Prado, Rachel Bardy; Donagemma, Raquel Andrade; Netto, Annibal Duarte Pereira

    2016-11-01

    Avermectins (AVM) are macrocyclic lactones used in livestock and agriculture. A quantitative method of high performance liquid chromatography with fluorescence detection for the determination of eprinomectin, abamectin, doramectin and ivermectin in rural water samples was developed and validated. The method was employed to study samples collected in the Pito Aceso River microbasin, located in the Bom Jardim municipality, Rio de Janeiro State, Brazil. Samples were extracted by solid phase extraction using a polymeric stationary phase, the eluted fraction was re-concentrated under a gentle N2 flow and derivatized to allow AVM determination using liquid chromatography with fluorescence detection. The excitation and emission wavelengths of the derivatives were 365 and 470nm, respectively, and a total chromatographic run of 12min was achieved. Very low limits of quantification (22-58ngL(-1)) were found after re-concentration using N2. Recovery values varied from 85.7% to 119.2% with standard deviations between 1.2% and 10.2%. The validated method was applied in the determination of AVM in 15 water samples collected in the Pito Aceso River microbasin, but most of them were free of AVM or showed only trace levels of these compounds, except for a sample that contained doramectin (9.11µgL(-1)). The method is suitable for routine analysis with satisfactory recovery, sensitivity, and selectivity. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. A newly validated high-performance liquid chromatography method with diode array ultraviolet detection for analysis of the antimalarial drug primaquine in the blood plasma.

    Science.gov (United States)

    Carmo, Ana Paula Barbosa do; Borborema, Manoella; Ribeiro, Stephan; De-Oliveira, Ana Cecilia Xavier; Paumgartten, Francisco Jose Roma; Moreira, Davyson de Lima

    2017-01-01

    Primaquine (PQ) diphosphate is an 8-aminoquinoline antimalarial drug with unique therapeutic properties. It is the only drug that prevents relapses of Plasmodium vivax or Plasmodium ovale infections. In this study, a fast, sensitive, cost-effective, and robust method for the extraction and high-performance liquid chromatography with diode array ultraviolet detection (HPLC-DAD-UV) analysis of PQ in the blood plasma was developed and validated. After plasma protein precipitation, PQ was obtained by liquid-liquid extraction and analyzed by HPLC-DAD-UV with a modified-silica cyanopropyl column (250 mm × 4.6 mm i.d. × 5 μm) as the stationary phase and a mixture of acetonitrile and 10 mM ammonium acetate buffer (pH = 3.80) (45:55) as the mobile phase. The flow rate was 1.0 mL·min-1, the oven temperature was 50°C, and absorbance was measured at 264 nm. The method was validated for linearity, intra-day and inter-day precision, accuracy, recovery, and robustness. The detection (LOD) and quantification (LOQ) limits were 1.0 and 3.5 ng·mL-1, respectively. The method was used to analyze the plasma of female DBA-2 mice treated with 20 mg.kg-1 (oral) PQ diphosphate. By combining a simple, low-cost extraction procedure with a sensitive, precise, accurate, and robust method, it was possible to analyze PQ in small volumes of plasma. The new method presents lower LOD and LOQ limits and requires a shorter analysis time and smaller plasma volumes than those of previously reported HPLC methods with DAD-UV detection. The new validated method is suitable for kinetic studies of PQ in small rodents, including mouse models for the study of malaria.

  9. A newly validated high-performance liquid chromatography method with diode array ultraviolet detection for analysis of the antimalarial drug primaquine in the blood plasma

    Directory of Open Access Journals (Sweden)

    Ana Paula Barbosa do Carmo

    Full Text Available Abstract INTRODUCTION: Primaquine (PQ) diphosphate is an 8-aminoquinoline antimalarial drug with unique therapeutic properties. It is the only drug that prevents relapses of Plasmodium vivax or Plasmodium ovale infections. In this study, a fast, sensitive, cost-effective, and robust method for the extraction and high-performance liquid chromatography with diode array ultraviolet detection (HPLC-DAD-UV) analysis of PQ in the blood plasma was developed and validated. METHODS: After plasma protein precipitation, PQ was obtained by liquid-liquid extraction and analyzed by HPLC-DAD-UV with a modified-silica cyanopropyl column (250 mm × 4.6 mm i.d. × 5 μm) as the stationary phase and a mixture of acetonitrile and 10 mM ammonium acetate buffer (pH = 3.80) (45:55) as the mobile phase. The flow rate was 1.0 mL·min-1, the oven temperature was 50°C, and absorbance was measured at 264 nm. The method was validated for linearity, intra-day and inter-day precision, accuracy, recovery, and robustness. The detection (LOD) and quantification (LOQ) limits were 1.0 and 3.5 ng·mL-1, respectively. The method was used to analyze the plasma of female DBA-2 mice treated with 20 mg.kg-1 (oral) PQ diphosphate. RESULTS: By combining a simple, low-cost extraction procedure with a sensitive, precise, accurate, and robust method, it was possible to analyze PQ in small volumes of plasma. The new method presents lower LOD and LOQ limits and requires a shorter analysis time and smaller plasma volumes than those of previously reported HPLC methods with DAD-UV detection. CONCLUSIONS: The new validated method is suitable for kinetic studies of PQ in small rodents, including mouse models for the study of malaria.

  10. A validated stability indicating high-performance liquid chromatographic method for simultaneous estimation of cefuroxime sodium and sulbactam sodium in injection dosage form

    Directory of Open Access Journals (Sweden)

    Falguni M Patel

    2012-01-01

    Full Text Available Background: A fixed dose combination of cefuroxime sodium (a β-lactam antibiotic) and sulbactam sodium (a β-lactamase inhibitor) is used in a ratio of 2:1 as a powder for injection for the treatment of resistant lower respiratory tract and other infections. Aims: A simple, precise, and accurate ion-pair reverse-phase high-performance liquid chromatography (RP-HPLC) method was developed and validated for the determination of cefuroxime Na (CEF) and sulbactam Na (SUL) in injection. Materials and Methods: Isocratic RP-HPLC separation was achieved on an ACE C18 column (150×4.6 mm i.d., 5 μm particle size) using a mobile phase of 0.002 M tetrabutylammonium hydroxide sulfate (TBAH) in 10 mM potassium di-hydrogen phosphate buffer-acetonitrile (86:14 v/v, pH 3.7) at a flow rate of 1.0 ml/min. Results and Conclusion: The retention times of sulbactam Na and cefuroxime Na were 3.2 min and 10.2 min, respectively. The ion-pairing reagent improved the retention of the highly polar sulbactam Na on the reverse-phase column. Detection was performed at 210 nm. The method was validated for linearity, precision, accuracy, robustness, solution stability, and specificity. The method was linear in the concentration range of 10-100 μg/ml for cefuroxime Na and 5-50 μg/ml for sulbactam Na, with correlation coefficients of 0.9999 and 0.9998 for the respective drugs. The intraday precision was 0.13-0.21% and 0.48-0.65%, and the interday precision was 0.32-0.81% and 0.60-0.83% for cefuroxime Na and sulbactam Na, respectively. The accuracy (recovery) was found to be in the range of 98.76-100.61% and 98.99-100.30% for cefuroxime Na and sulbactam Na, respectively. The drugs were found to degrade under hydrolytic and oxidative conditions. The drugs could be effectively separated from the different degradation products, and hence the method can be used for stability analysis.

  11. Development and validation of an ultra-high performance liquid chromatography-tandem mass spectrometry method to measure creatinine in human urine.

    Science.gov (United States)

    Fraselle, S; De Cremer, K; Coucke, W; Glorieux, G; Vanmassenhove, J; Schepers, E; Neirynck, N; Van Overmeire, I; Van Loco, J; Van Biesen, W; Vanholder, R

    2015-04-15

    Despite decades of creatinine measurement in biological fluids using a large variety of analytical methods, an accurate determination of this compound remains challenging. Especially with the novel trend to assess biomarkers on large sample sets preserved in biobanks, a simple and fast method that could cope with both a high sample throughput and a low volume of sample is still of interest. In answer to these challenges, a fast and accurate ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method was developed to measure creatinine in small volumes of human urine. In this method, urine samples are simply diluted with a basic mobile phase and injected directly under positive electrospray ionization (ESI) conditions, without further purification steps. The combination of an important diluting factor (10(4) times) due to the use of a very sensitive triple quadrupole mass spectrometer (XEVO TQ) and the addition of creatinine-d3 as internal standard completely eliminates matrix effects coming from the urine. The method was validated in-house in 2012 according to the EMA guideline on bioanalytical method validation using Certified Reference samples from the German External Quality Assessment Scheme (G-Equas) proficiency test. All obtained results for accuracy and recovery are within the authorized tolerance ranges defined by G-Equas. The method is linear between 0 and 5 g/L, with LOD and LOQ of 5 × 10(-3) g/L and 10(-2) g/L, respectively. The repeatability (CV(r) = 1.03-2.07%) and intra-laboratory reproducibility (CV(RW) = 1.97-2.40%) satisfy the EMA 2012 guideline. The validated method was firstly applied to perform the German G-Equas proficiency test rounds 51 and 53, in 2013 and 2014, respectively. The obtained results were again all within the accepted tolerance ranges and very close to the reference values defined by the organizers of the proficiency test scheme, demonstrating an excellent accuracy of the developed method. The

  12. A validated high-performance liquid chromatography method with diode array detection for simultaneous determination of nine flavonoids in Senecio cannabifolius Less.

    Science.gov (United States)

    Niu, Tian-Zeng; Zhang, Yu-Wei; Bao, Yong-Li; Wu, Yin; Yu, Chun-Lei; Sun, Lu-Guo; Yi, Jing-Wen; Huang, Yan-Xin; Li, Yu-Xin

    2013-03-25

    A reversed phase high performance liquid chromatography method coupled with a diode array detector (HPLC-DAD) was developed for the first time for the simultaneous determination of 9 flavonoids in Senecio cannabifolius, a traditional Chinese medicinal herb. An Agilent Zorbax SB-C18 column was used at room temperature, and the mobile phase was a mixture of acetonitrile and 0.5% formic acid (v/v) in water in gradient elution mode at a flow rate of 1.0 ml min(-1), with detection at 360 nm. Validation of this method was performed to verify the linearity, precision, limits of detection and quantification, intra- and inter-day variabilities, reproducibility and recovery. The calibration curves showed good linearities (R(2)>0.9995) within the test ranges. The relative standard deviation (RSD) of the method was less than 3.0% for intra- and inter-day assays. The samples were stable for at least 96 h, and the average recoveries were between 90.6% and 102.5%. High sensitivity was demonstrated with detection limits of 0.028-0.085 μg/ml for the flavonoids. The newly established HPLC method represents a powerful technique for the quality assurance of S. cannabifolius. Copyright © 2012 Elsevier B.V. All rights reserved.

  13. A validated stability-indicating high performance liquid chromatographic method for moxifloxacin hydrochloride and ketorolac tromethamine eye drops and its application in pH dependent degradation kinetics

    Directory of Open Access Journals (Sweden)

    Jayant B Dave

    2013-01-01

    Full Text Available Background and Aim: A fixed dose combination of moxifloxacin hydrochloride and ketorolac tromethamine is used in a ratio of 1:1 as eye drops for the reduction of post-operative inflammatory conditions of the eye. A simple, precise, and accurate High Performance Liquid Chromatographic (HPLC) method was developed and validated for the determination of moxifloxacin hydrochloride and ketorolac tromethamine in eye drops. Materials and Methods: Isocratic HPLC separation was achieved on an ACE C18 column (5 μm, 150 mm×4.6 mm i.d.) using a mobile phase of 10 mM potassium di-hydrogen phosphate buffer (pH 4.6)-acetonitrile (75:25 v/v) at a flow rate of 1.0 mL/min. Detection was performed at 307 nm. The drugs were subjected to acid, alkali and neutral hydrolysis, oxidation and photodegradation. Moreover, the proposed HPLC method was utilized to investigate the pH-dependent degradation kinetics of moxifloxacin hydrochloride and ketorolac tromethamine in buffer solutions at different pH values such as 2.0, 6.8 and 9.0. Results and Conclusion: The retention times (tR) of moxifloxacin hydrochloride and ketorolac tromethamine were 3.81±0.01 and 8.82±0.02 min, respectively. The method was linear in the concentration range of 2-20 μg/mL for each of moxifloxacin hydrochloride and ketorolac tromethamine, with correlation coefficients of 0.9996 and 0.9999, respectively. The method was validated for linearity, precision, accuracy, robustness, specificity, limit of detection and limit of quantitation. The drugs could be effectively separated from the different degradation products and hence the method can be used for stability analysis. Different kinetic parameters such as the apparent first-order rate constant, half-life and t90 (time for 90% potency left) were calculated.
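
    The kinetic parameters mentioned above follow from an apparent first-order model: the rate constant k is the negative slope of ln(C) versus time, t1/2 = ln 2 / k, and t90 = ln(10/9) / k. The sketch below uses a hypothetical time course, not the moxifloxacin/ketorolac data.

```python
# Illustrative apparent first-order degradation kinetics: k, t1/2 and t90.
import numpy as np

t = np.array([0, 2, 4, 8, 12, 24])           # hours (hypothetical)
c = np.array([100, 93, 86, 75, 65, 43.0])    # % drug remaining (hypothetical)

k = -np.polyfit(t, np.log(c), 1)[0]          # apparent first-order rate constant, 1/h
t_half = np.log(2) / k                       # half-life
t90 = np.log(100 / 90) / k                   # time to 10% loss (90% potency left)

print(f"k = {k:.4f} 1/h, t1/2 = {t_half:.1f} h, t90 = {t90:.1f} h")
```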

  14. Partially hydrolyzed guar gum characterization and sensitive quantification in food matrices by high performance anion exchange chromatography with pulsed amperometric detection--validation using accuracy profile.

    Science.gov (United States)

    Mercier, G; Campargue, C

    2012-11-02

    Interest concerning functional ingredients, and especially dietary fibres, has been growing in recent years. At the same time, the variety of ingredients accepted as dietary fibres and their mixing at low levels in complex matrices have considerably complicated their quantitative analysis by approved AOAC methods. These reasons have led to the specific development of an innovative analytical method performed by high-performance anion-exchange chromatography (HPAEC) with pulsed amperometric detection (PAD) to detect and quantify partially hydrolyzed guar gum (PHGG) in fruit preparation and dairy matrices. The analytical methodology was divided into two steps, which can be deployed separately or in conjunction. The first consists of a complete characterization of PHGG by size exclusion chromatography (SEC) with multi-angle light scattering and refractive index detection and HPAEC-PAD to determine its physico-chemical properties and galactomannan content, and the second step is the development of a new HPAEC-PAD method for direct PHGG quantification in complex matrices (dairy products). Validation in terms of detection and quantification limits, linearity of the analytical range, average accuracy (recovery, trueness) and average uncertainty was statistically carried out with an accuracy profile. Overall, this new chromatographic method has considerably improved the possibility to quantify, without fractionation treatment, low levels of dietary fibres derived from specific galactomannans in complex matrices and many foodstuffs. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Validation of a high-performance size-exclusion chromatography method to determine and characterize β-glucans in beer wort using a triple-detector array.

    Science.gov (United States)

    Tomasi, Ivan; Marconi, Ombretta; Sileoni, Valeria; Perretti, Giuseppe

    2017-01-01

    Beer wort β-glucans are high-molecular-weight non-starch polysaccharides that are of great interest to the brewing industry. Because glucans can increase the viscosity of solutions and form gels, hazes, and precipitates, they are often related to poor lautering performance and beer filtration problems. In this work, a simple and suitable method was developed to determine and characterize β-glucans in beer wort using size exclusion chromatography coupled with a triple-detector array, which is composed of a light scatterer, a viscometer, and a refractive-index detector. The method's performance is comparable to that of the commercial reference method, as shown by the statistical validation, and it enables one to obtain interesting parameters of β-glucan in beer wort, such as the molecular weight averages, fraction description, hydrodynamic radius, intrinsic viscosity, polydispersity and Mark-Houwink parameters. This characterization can be useful in brewing science to understand filtration problems, which are not always explained through conventional analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.
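
    The Mark-Houwink parameters mentioned above come from the relation [η] = K·M^a, so a log-log plot of intrinsic viscosity against molecular weight is a straight line with slope a and intercept log K. The sketch below fits that line to hypothetical triple-detector slice data, not the wort measurements.

```python
# Illustrative estimation of Mark-Houwink parameters from SEC slice data.
import numpy as np

mw = np.array([5e4, 1e5, 2e5, 4e5, 8e5])                    # g/mol (hypothetical)
intrinsic_visc = np.array([0.45, 0.78, 1.35, 2.34, 4.05])   # dL/g (hypothetical)

a, log_k = np.polyfit(np.log10(mw), np.log10(intrinsic_visc), 1)
print(f"Mark-Houwink exponent a = {a:.2f}, K = {10**log_k:.2e} dL/g")
```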

  16. Validation of a high-performance liquid chromatographic method with UV detection for the determination of ethopabate residues in poultry liver.

    Science.gov (United States)

    Granja, Rodrigo H M M; Niño, Alfredo M Montes; Zucchetti, Roberto A M; Niño, Rosario E Montes; Salerno, Alessandro G

    2008-01-01

    Ethopabate is frequently used in the prophylaxis and treatment of coccidiosis in poultry. Residues of this drug in food present a potential risk to consumers. A simple, rapid, and sensitive column high-performance liquid chromatographic (HPLC) method with UV detection for determination of ethopabate in poultry liver is presented. The drug is extracted with acetonitrile. After evaporation, the residue is dissolved with an acetone-hexane mixture and cleaned up by solid-phase extraction using Florisil columns. The analyte is then eluted with methanol. LC analysis is carried out on a C18 5 microm Gemini column, 15 cm x 4.6 mm. Ethopabate is quantified by means of UV detection at 270 nm. Parameters such as decision limit, detection capability, precision, recovery, ruggedness, and measurement uncertainty were calculated according to method validation guidelines provided in 2002/657/EC and ISO/IEC 17025:2005. Decision limit and detection capability were determined to be 2 and 3 microg/kg, respectively. Average recoveries from poultry samples fortified with 10, 15, and 20 microg/kg levels of ethopabate were 100-105%. A complete statistical analysis was performed on the results obtained, including an estimation of the method uncertainty. The method is to be implemented into Brazil's residue monitoring and control program for ethopabate.

  17. Development of Validated High-performance Thin-layer Chromatography Method for Simultaneous Determination of Quercetin and Kaempferol in Thespesia populnea.

    Science.gov (United States)

    Panchal, Hiteksha; Amin, Aeshna; Shah, Mamta

    2017-01-01

    Thespesia populnea L. (Family: Malvaceae) is a well-known medicinal plant distributed in tropical regions of the world and cultivated in South Gujarat, and is indicated to be useful in cutaneous affections, psoriasis, ringworm, and eczema. The bark and fruits are indicated in diseases of the skin, urethritis, and gonorrhea. The juice of the fruits is employed in treating certain hepatic diseases. The plant is reported to contain the flavonoids quercetin, kaempferol, gossypetin, kaempferol-3-monoglucoside and kaempferol-7-glucoside, as well as β-sitosterol and gossypol. T. populnea is a common component of many herbal and Ayurvedic formulations such as Kamilari and Liv-52. The present study aimed at developing a validated and reliable high-performance thin layer chromatography (HPTLC) method for the simultaneous analysis of quercetin and kaempferol in T. populnea . The method employed thin-layer chromatography aluminum sheets precoated with silica gel as the stationary phase and toluene: ethyl acetate: formic acid (6:4:0.3 v/v/v) as the mobile phase, which gave compact bands of quercetin and kaempferol. Linear regression data for the calibration curves of standard quercetin and kaempferol showed a good linear relationship over concentration ranges of 100-600 ng/spot and 500-3000 ng/spot with respect to area, and the correlation coefficients (R2) were 0.9955 and 0.9967. The method was evaluated regarding accuracy, precision, selectivity, and robustness. Limits of detection and quantitation were recorded as 32.06 and 85.33 ng/spot and 74.055 and 243.72 ng/spot for quercetin and kaempferol, respectively. We concluded that this HPTLC method for the quantitative determination of quercetin and kaempferol is efficient, simple, accurate, and validated.

  18. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Plutonium Metals, Oxides, and Solutions on the High Performance Computing Platform Moonlight

    Energy Technology Data Exchange (ETDEWEB)

    Chapman, Bryan Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gough, Sean T. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-12-05

    This report documents a validation of the MCNP6 Version 1.0 computer code on the high performance computing platform Moonlight, for operations at Los Alamos National Laboratory (LANL) that involve plutonium metals, oxides, and solutions. The validation is conducted using the ENDF/B-VII.1 continuous energy group cross section library at room temperature. The results are for use by nuclear criticality safety personnel in performing analysis and evaluation of various facility activities involving plutonium materials.

  19. Development and validation of a rapid ultra-high performance liquid chromatography diode array detector method for Vitex agnus-castus.

    Science.gov (United States)

    Högner, C; Sturm, S; Seger, C; Stuppner, H

    2013-05-15

    A rapid ultra-high performance liquid chromatography diode array detector (UHPLC-DAD) method was developed and validated for the simultaneous determination of all classes of non-volatile phytochemicals (iridoids, flavonoids and diterpenes) in Vitex agnus-castus (Lamiaceae) fruits, a traditional medicinal plant used against premenstrual symptoms (PMS) and other disorders. Seven marker compounds, 3,4-dihydroxybenzoic acid, p-hydroxybenzoic acid, agnuside, 5-hydroxykaempferol-3,6,7,4'-tetramethylether, 1,2-dibenzoic acid glucose, methoxy-vitexilactone, and vitetrifolin D, were isolated from the methanol extract of V. agnus-castus to be used as reference substances. Chromatographic separation was performed on a Zorbax Eclipse XDB-C18 (50 mm × 2.1 mm) UHPLC column with 1.8 μm particle size, within 20 min. A solvent gradient from 0.5% acetic acid to acetonitrile at a flow rate of 0.6 mL/min was used as the mobile phase. Analyte detection and quantification were carried out at 210 nm and 260 nm. The UHPLC-DAD assay was validated for the quantitative analysis of agnuside, isovitexin, casticin, 5-hydroxykaempferol-3,6,7,4'-tetramethylether and vitetrifolin D. It was found to be specific, accurate, precise, and reproducible for the quantification of these compounds within a concentration range of 0.7-500.0 μg/mL for casticin and 5-hydroxykaempferol-3,6,7,4'-tetramethylether, 1.4-1000.0 μg/mL for isovitexin and agnuside, and 12.4-1000.0 μg/mL for vitetrifolin D. Intra- and inter-day variations showed relative standard deviations (RSD) of less than 3.9% and 6.4%, respectively. Tentative assignment of 62 chromatographic features found in the UHPLC-DAD assay was carried out by coupling the UHPLC instrument to a quadrupole time-of-flight mass spectrometer via an electrospray ionization interface (ESI-QTOF-MS) operated in positive and negative ion mode. By using the established quantitative UHPLC-DAD assay to assess agnuside, isovitexin, casticin, 5-hydroxykaempferol-3,6,7,4'-tetramethylether and

  20. Validation of high performance liquid chromatographic and spectrophotometric methods for the determination of the antiparkinson agent pramipexole dihydrochloride monohydrate in pharmaceutical products

    Directory of Open Access Journals (Sweden)

    Serpil Sevim

    2015-12-01

    Full Text Available The antiparkinson agent pramipexole dihydrochloride monohydrate was quantified in pharmaceutical products by high performance liquid chromatography (HPLC) and derivative spectrophotometry. The first method was based on HPLC using tamsulosin HCl as an internal standard. In this method, chromatographic separation was achieved using a LiChrospher 60 RP column at 25°C, with a flow rate of 1.0 mL/min and detection at 263 nm. The eluent comprised 0.01 mol/L ammonium acetate (pH 4.4) and acetonitrile (35:65, by volume). The linearity range was found to be 10.0-30.0 µg/mL with a mean recovery of 100.5 ± 1.10%. The limit of detection (8 ng/mL) and limit of quantification (50 ng/mL) were calculated. In the second method, the first derivative spectrophotometric technique for the determination of pramipexole dihydrochloride monohydrate was performed by measuring the amplitude at 249 and 280 nm. In the first derivative technique, the absorbance and concentration plot was rectilinear over the 5.0-35.0 µg/mL range with a lower detection limit of 1.5 ng/mL and a quantification limit of 4.5 ng/mL. The typical excipients included in the pharmaceutical product do not interfere with the selectivity of either method. The developed methods were validated for robustness, selectivity, specificity, linearity, precision, and accuracy as per the ICH and FDA guidelines (ICH Q2B, 1996; FDA, 2000). In conclusion, the developed methods were successful in determining the quantity of the antiparkinson agent pramipexole dihydrochloride monohydrate in pharmaceutical products. The RSD values for the pharmaceutical product used in this study were found to be 0.97% for the HPLC method and 0.00% for the first derivative spectrophotometric method.

  1. Development and Validation of a Simple High Performance Liquid Chromatography/UV Method for Simultaneous Determination of Urinary Uric Acid, Hypoxanthine, and Creatinine in Human Urine

    Directory of Open Access Journals (Sweden)

    Nimanthi Wijemanne

    2018-01-01

    Full Text Available Uric acid and hypoxanthine are produced in the catabolism of purines. Abnormal urinary levels of these products are associated with many diseases, and it is therefore necessary to have a simple and rapid method to detect them. Hence, we report a simple reverse phase high performance liquid chromatography (HPLC)/UV technique, developed and validated for the simultaneous analysis of uric acid, hypoxanthine, and creatinine in human urine. Urine was diluted appropriately and eluted on a C-18 column (100 mm × 4.6 mm) with a C-18 precolumn (25 mm × 4.6 mm) in series. Potassium phosphate buffer (20 mM, pH 7.25) at a flow rate of 0.40 mL/min was employed as the solvent, and peaks were detected at 235 nm. Tyrosine was used as the internal standard. The experimental conditions offered a good separation of the analytes without interference from endogenous substances. The calibration curves were linear for all test compounds with a regression coefficient r2 > 0.99. Uric acid, creatinine, tyrosine, and hypoxanthine were eluted at 5.2, 6.1, 7.2, and 8.3 min, respectively. Intraday and interday variability were less than 4.6% for all the analytes investigated, and the recovery ranged from 98 to 102%. The proposed HPLC procedure is a simple, rapid, and low-cost method with high accuracy and minimal use of organic solvents. This method was successfully applied for the determination of creatinine, hypoxanthine, and uric acid in human urine.
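
    Quantification against an internal standard, as done here with tyrosine, is normally based on the ratio of analyte peak area to internal-standard peak area rather than on the raw analyte area. The sketch below illustrates that ratio-based calibration in Python; all concentrations and peak areas are invented for illustration and are not taken from this study.

      import numpy as np

      # Hypothetical calibration standards: uric acid concentration (umol/L)
      # with analyte and internal-standard (tyrosine) peak areas.
      conc     = np.array([50, 100, 200, 400, 800], dtype=float)
      area_ua  = np.array([810, 1605, 3250, 6480, 12950], dtype=float)
      area_tyr = np.array([1000, 995, 1010, 1002, 998], dtype=float)  # IS spiked at a fixed level

      ratio = area_ua / area_tyr                     # response ratio used for calibration
      slope, intercept = np.polyfit(conc, ratio, 1)  # linear fit of ratio vs. concentration

      def quantify(sample_area_ua, sample_area_tyr):
          """Return the analyte concentration of a sample from its peak-area ratio."""
          return (sample_area_ua / sample_area_tyr - intercept) / slope

      print(f"sample ~ {quantify(2400.0, 1005.0):.1f} umol/L uric acid")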

  2. Development and validation of High-performance Thin-layer Chromatography Method for Simultaneous Determination of Polyphenolic Compounds in Medicinal Plants.

    Science.gov (United States)

    Jayachandran Nair, C V; Ahamad, Sayeed; Khan, Washim; Anjum, Varisha; Mathur, Rajani

    2017-12-01

    Quantitative standardization of plant-based products is challenging albeit essential to maintain their quality. This study aims to develop and validate a high-performance thin-layer chromatography (HPTLC) method for the simultaneous determination of rutin (Ru), quercetin (Qu), and gallic acid (Ga) from Psidium guajava Linn. (PG) and Aegle marmelos (L.) Correa. (AM) and to correlate the results with antioxidant activity. The stock solution (1 mg/mL) of standard Ru, Qu, and Ga in methanol: water (1:1) was serially diluted and spotted (5 μL) on silica gel 60 F254 thin-layer chromatography plates. Toluene: ethyl acetate: formic acid: methanol (3:4:0.8:0.7, v/v) was selected as the mobile phase for analysis at 254 nm. Hydroalcoholic (1:1) extracts of leaves of PG and AM were fractionated and similarly analyzed. Antioxidant activity was also determined using the 2,2-diphenyl-1-picrylhydrazyl assay. The developed method was robust and resolved Ru, Qu, and Ga at Rf 0.08 ± 0.02, 0.76 ± 0.01, and 0.63 ± 0.02, respectively. The intra-day, inter-day, and inter-analyst precision were within acceptable limits; the limits of detection and quantification for Ru, Qu, and Ga were 4.51, 4.2, and 5.27 ng/spot and 13.67, 12.73, and 15.98 ng/spot, respectively. Antioxidant activity (log 50% inhibition) of PG and AM was 4.947 ± 0.322 and 6.498 ± 0.295, respectively. The developed HPTLC method was rapid, accurate, precise, reproducible, and specific for the simultaneous estimation of Ru, Qu, and Ga. An HPTLC method for the simultaneous determination and quantification of rutin, quercetin, and gallic acid is reported for the quality control of herbal drugs. Abbreviations Used: A: Aqueous fraction; AM: Aegle marmelos L. Correa; B: Butanol fraction; C: Chloroform fraction; EA: Ethyl acetate fraction; Ga: Gallic acid; H: Hexane fraction; HA: Hydroalcoholic extract; HPTLC: High-performance thin-layer chromatography; PG: Psidium guajava; Qu: Quercetin; Ru: Rutin.

  3. Validation of a High-Performance Liquid Chromatography method for the determination of vitamin A, vitamin D3, vitamin E and benzyl alcohol in a veterinary oily injectable solution

    OpenAIRE

    Maria Neagu; Georgiana Soceanu; Ana Caterina Bucur

    2015-01-01

    A new, simple, rapid, accurate and precise high-performance liquid chromatography (HPLC) method for the determination of vitamin A, vitamin D3, vitamin E and benzyl alcohol in an oily injectable solution was developed and validated. The method can be used for the detection and quantification of known and unknown impurities and degradants in the drug substance during routine analysis and also for stability studies, in view of its capability to separate degradation products. The method was validated...

  4. Analytical method development and validation of simultaneous estimation of rabeprazole, pantoprazole, and itopride by reverse-phase high-performance liquid chromatography

    Directory of Open Access Journals (Sweden)

    Senthamil Selvan Perumal

    2014-12-01

    Full Text Available A simple, selective, rapid, and precise reverse-phase high-performance liquid chromatography (RP-HPLC) method for the simultaneous estimation of rabeprazole (RP), pantoprazole (PP), and itopride (IP) has been developed. The compounds were well separated on a Phenomenex C18 (Luna) column (250 mm × 4.6 mm, dp = 5 μm) with a C18 guard column (4 mm × 3 mm × 5 μm) and a mobile phase consisting of buffer containing 10 mM potassium dihydrogen orthophosphate (adjusted to pH 6.8): acetonitrile (70:30 v/v) at a flow rate of 1.0 mL/min and ultraviolet detection at 288 nm. The retention times of RP, PP, and IP were 5.35, 7.92, and 11.16 minutes, respectively. Validation of the proposed method was carried out according to International Conference on Harmonisation (ICH) guidelines. Linearity was obtained for RP, PP, and IP over the concentration ranges of 2.5–25, 1–30, and 3–35 μg/mL, and the r2 values were 0.994, 0.978, and 0.991, respectively. The calculated limit of detection (LOD) values were 1, 0.3, and 1 μg/mL and limit of quantitation (LOQ) values were 2.5, 1, and 3 μg/mL for RP, PP, and IP, respectively. Thus, the current study showed that the developed reverse-phase liquid chromatography method is sensitive and selective for the estimation of RP, PP, and IP in combined dosage form.

  5. Analytical method development and validation of simultaneous estimation of rabeprazole, pantoprazole, and itopride by reverse-phase high-performance liquid chromatography.

    Science.gov (United States)

    Perumal, Senthamil Selvan; Ekambaram, Sanmuga Priya; Raja, Samundeswari

    2014-12-01

    A simple, selective, rapid, and precise reverse-phase high-performance liquid chromatography (RP-HPLC) method for the simultaneous estimation of rabeprazole (RP), pantoprazole (PP), and itopride (IP) has been developed. The compounds were well separated on a Phenomenex C18 (Luna) column (250 mm × 4.6 mm, dp = 5 μm) with C18 guard column (4 mm × 3 mm × 5 μm) with a mobile phase consisting of buffer containing 10 mM potassium dihydrogen orthophosphate (adjusted to pH 6.8): acetonitrile (70:30 v/v) at a flow rate of 1.0 mL/min and ultraviolet detection at 288 nm. The retention time of RP, PP, and IP were 5.35, 7.92, and 11.16 minutes, respectively. Validation of the proposed method was carried out according to International Conference on Harmonisation (ICH) guidelines. Linearity range was obtained for RP, PP, and IP over the concentration range of 2.5-25, 1-30, and 3-35 μg/mL and the r2 values were 0.994, 0.978, and 0.991, respectively. The calculated limit of detection (LOD) values were 1, 0.3, and 1 μg/mL and limit of quantitation (LOQ) values were 2.5, 1, and 3 μg/mL for RP, PP, and IP correspondingly. Thus, the current study showed that the developed reverse-phase liquid chromatography method is sensitive and selective for the estimation of RP, PP, and IP in combined dosage form. Copyright © 2014. Published by Elsevier B.V.

  6. Development and validation of an ultra high performance liquid chromatography tandem mass spectrometry method for simultaneous determination of sulfonamides, quinolones and benzimidazoles in bovine milk.

    Science.gov (United States)

    Hou, Xiao-Lin; Chen, Guo; Zhu, Li; Yang, Ting; Zhao, Jian; Wang, Lei; Wu, Yin-Liang

    2014-07-01

    A simple, sensitive and reliable analytical method was developed for the simultaneous determination of 38 veterinary drugs (18 sulfonamides, 11 quinolones and 9 benzimidazoles) and 8 metabolites of benzimidazoles in bovine milk by ultra-high performance liquid chromatography-positive electrospray ionization tandem mass spectrometry (UHPLC-ESI-MS/MS). Samples were extracted with acidified acetonitrile, cleaned up with Oasis(®) MCX cartridges, and analyzed by LC-MS/MS on an Acquity UPLC(®) BEH C18 column with gradient elution. The method allows such multi-analyte measurements within a 13 min runtime, while specificity is ensured through the MRM acquisition mode. The method was validated according to European Commission Decision 2002/657/EC, determining specificity, decision limit (CCα), detection capability (CCβ), recovery, precision, linearity and stability. For compounds which have MRLs in bovine milk, the CCα values fall into a range from 11 to 115 μg/kg, and the CCβ values fall within a range of 12-125 μg/kg. For compounds which do not have MRLs in bovine milk, the CCα values fall into a range from 0.01 to 0.08 μg/kg, and the CCβ values fall within a range of 0.02-0.11 μg/kg. The mean recoveries of the 46 analytes were between 87 and 119%. The calculated RSD values of the repeatability and within-laboratory reproducibility experiments were below 11% and 15%, respectively, for the 46 compounds. The method was demonstrated to be suitable for the simultaneous determination of sulfonamides, quinolones and benzimidazoles in bovine milk. Copyright © 2014 Elsevier B.V. All rights reserved.
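
    For analytes that have an MRL, Commission Decision 2002/657/EC defines the decision limit CCα as the MRL plus 1.64 times the within-laboratory reproducibility standard deviation at the MRL, and the detection capability CCβ as CCα plus a further 1.64 standard deviations. A small sketch of that arithmetic is shown below; the MRL and reproducibility SD are assumed values for illustration, not figures from this paper.

      # Decision limit (CCalpha) and detection capability (CCbeta) for a compound
      # with an established MRL, following Commission Decision 2002/657/EC.
      # Both input values below are assumed for illustration only.
      mrl = 100.0      # ug/kg, hypothetical maximum residue limit
      sd_reprod = 8.0  # ug/kg, within-laboratory reproducibility SD measured at the MRL

      cc_alpha = mrl + 1.64 * sd_reprod       # decision limit (alpha = 5 % for MRL substances)
      cc_beta  = cc_alpha + 1.64 * sd_reprod  # detection capability (beta = 5 %)

      print(f"CCalpha = {cc_alpha:.1f} ug/kg, CCbeta = {cc_beta:.1f} ug/kg")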

  7. Validation of a high performance liquid chromatography method for quantitation of L-proline in 20 % tincture from Murraya paniculata L. Jack

    International Nuclear Information System (INIS)

    Varona Torres, Noel; Gutierrez Gaiten, Yamilet Irene; Casado Martin, Celia Magaly

    2014-01-01

    The search for analytical methods that can monitor the quality of drugs is an issue of great interest in the pharmaceutical field, even more so when they are directed at the chemical markers of medicinal plants, their extracts and phytomedicines. To validate a high-performance liquid chromatography (HPLC) method for the quantitative determination of the amino acid L-proline as a marker substance in Murraya paniculata L. Jack tincture

  8. Development and validation of a method for the determination of regulated fragrance allergens by High-Performance Liquid Chromatography and Parallel Factor Analysis 2.

    Science.gov (United States)

    Pérez-Outeiral, Jessica; Elcoroaristizabal, Saioa; Amigo, Jose Manuel; Vidal, Maider

    2017-12-01

    This work presents the development and validation of a multivariate method for the quantitation of 6 potentially allergenic substances (PAS) related to fragrances by ultrasound-assisted emulsification microextraction coupled with HPLC-DAD and PARAFAC2, in the presence of 18 other PAS. The objective is the extension of a previously proposed univariate method so that the 24 PAS currently considered as allergens can be determined. The suitability of the multivariate approach for the qualitative and quantitative analysis of the analytes is discussed through datasets of increasing complexity, comprising the assessment and validation of the method performance. PARAFAC2 was shown to adequately model the data in the face of different instrumental and chemical issues, such as co-elution profiles, overlapping spectra, unknown interfering compounds, retention time shifts and baseline drifts. Satisfactory quality parameters of the model performance were obtained (R2 ≥ 0.94), as well as meaningful chromatographic and spectral profiles (r ≥ 0.97). Moreover, low errors of prediction in external validation standards (below 15% in most cases) as well as acceptable quantification errors in real spiked samples (recoveries from 82 to 119%) confirmed the suitability of PARAFAC2 for the resolution and quantification of the PAS. The combination of the previously proposed univariate approach, for the well-resolved peaks, with the developed multivariate method allows the determination of the 24 regulated PAS. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Development and validation of a rapid high performance liquid chromatography - photodiode array detection method for estimation of a bioactive compound wedelolactone in extracts of Eclipta alba

    Directory of Open Access Journals (Sweden)

    Satyanshu Kumar

    2013-03-01

    Full Text Available Following optimization of the extraction, separation and analytical conditions, a rapid, sensitive and simple reverse-phase high performance liquid chromatography-photodiode array (HPLC-PDA) method has been developed for the identification and quantification of wedelolactone in different extracts of Eclipta alba. The separation of wedelolactone was achieved on a C18 column using a solvent system consisting of a mixture of methanol: water: acetic acid (95:5:0.04) as the mobile phase in isocratic elution mode, followed by photodiode array detection at 352 nm. The developed method was validated as per the guidelines of the International Conference on Harmonization (ICH). The calibration curve presented good linear regression (r² > 0.998) within the test range, and the maximum relative standard deviation (RSD, %) values for the intra-day assay were found to be 0.15, 1.30 and 1.1 for low (5 µg/mL), medium (20 µg/mL) and high (80 µg/mL) concentrations of wedelolactone. For the inter-day assay, the maximum RSD (%) values were found to be 2.83, 1.51 and 2.06 for low, medium and high concentrations, respectively. The limit of detection (LOD) and limit of quantification (LOQ) were calculated to be 2 and 5 µg/mL, respectively. The analytical recovery of wedelolactone was greater than 95%. Wedelolactone in different extracts of Eclipta alba was identified and quantified using the developed HPLC method. The validated HPLC method allowed precise quantitative analysis of wedelolactone in Eclipta alba extracts.

  10. High performance homes

    DEFF Research Database (Denmark)

    Beim, Anne; Vibæk, Kasper Sánchez

    2014-01-01

    Can prefabrication contribute to the development of high performance homes? To answer this question, this chapter defines high performance in more broadly inclusive terms, acknowledging the technical, architectural, social and economic conditions under which energy consumption and production occur....... Consideration of all these factors is a precondition for a truly integrated practice and as this chapter demonstrates, innovative project delivery methods founded on the manufacturing of prefabricated buildings contribute to the production of high performance homes that are cost effective to construct, energy...

  11. Optimization and validation of a reversed-phase high performance liquid chromatography method for the measurement of bovine liver methylmalonyl-coenzyme a mutase activity.

    Science.gov (United States)

    Ouattara, Bazoumana; Duplessis, Mélissa; Girard, Christiane L

    2013-10-16

    Methylmalonyl-CoA mutase (MCM) is an adenosylcobalamin-dependent enzyme that catalyses the interconversion of (2R)-methylmalonyl-CoA to succinyl-CoA. In humans, a deficit in the activity of MCM, due to an impairment of the intracellular formation of adenosylcobalamin and methylcobalamin, results in a wide spectrum of clinical manifestations ranging from moderate to fatal. Consequently, MCM is the subject of abundant literature. However, there is a lack of consensus on a reliable method to monitor its activity. This metabolic pathway is heavily used in ruminants because it is essential for the utilization of propionate formed during ruminal fermentation. In lactating dairy cows, propionate is the major substrate for glucose formation. In the present study, a reversed-phase high performance liquid chromatography (RP-HPLC) method was optimized and validated to evaluate MCM activity in bovine liver. The major aim of the study was to describe the conditions that optimize the reproducibility of the method and to determine the stability of the enzyme and its product during storage and processing of samples. The specificity of the method was good, as there was no interfering peak from the liver extract at the retention times corresponding to methylmalonyl-CoA or succinyl-CoA. The repeatability of the method was improved compared with previously published RP-HPLC data. Using 66 μg of protein, the intra-assay coefficient of variation (CV) of specific activities ranged from 0.90 to 8.05%, and the inter-day CV was 7.40%. Storage and processing conditions (frozen homogenate of fresh tissue vs. fresh homogenate of tissue snapped in liquid nitrogen) did not alter the enzyme activity. The analyte was also stable in liver crude extract for three freeze/thaw cycles when stored at -20°C and thawed to room temperature. The improved method provides a way to study the effects of stage of lactation, diet composition, and physiology in cattle on MCM activity over long periods of time, such as a complete lactation period

  12. High Performance Marine Vessels

    CERN Document Server

    Yun, Liang

    2012-01-01

    High Performance Marine Vessels (HPMVs) range from the Fast Ferries to the latest high speed Navy Craft, including competition power boats and hydroplanes, hydrofoils, hovercraft, catamarans and other multi-hull craft. High Performance Marine Vessels covers the main concepts of HPMVs and discusses historical background, design features, services that have been successful and not so successful, and some sample data of the range of HPMVs to date. Included is a comparison of all HPMVs craft and the differences between them and descriptions of performance (hydrodynamics and aerodynamics). Readers will find a comprehensive overview of the design, development and building of HPMVs. In summary, this book: Focuses on technology at the aero-marine interface Covers the full range of high performance marine vessel concepts Explains the historical development of various HPMVs Discusses ferries, racing and pleasure craft, as well as utility and military missions High Performance Marine Vessels is an ideal book for student...

  13. Development and validation of a chiral high-performance liquid chromatography assay for rogletimide and rogletimide-N-oxide isomers in plasma.

    Science.gov (United States)

    Etienne, M C; Oster, W; Milano, G

    1996-01-01

    The purpose of the present study was to develop and validate a stereo-specific high-performance liquid chromatography (HPLC) assay for rogletimide (Rog) and rogletimide-N-oxide (Nox) isomers in plasma. The assay was performed with a chiral cellulose-[4-methylbenzoate]ester column (Chiracel OJ). Optimal separation was achieved isocratically with a mobile phase consisting of n-hexane/anhydrous ethanol (65/35, v/v) at a flow rate of 0.9 ml/min, with the column being thermostated at +35 degrees C (UV detection at 257 nm). Under these conditions, retention times were approximately 17, 28, 31 and 76 min for R-Rog, S-Rog, R-Nox and S-Nox, respectively. S-aminoglutethimide (S-Ag) served as the internal standard (retention time 70 min). An extraction procedure from plasma samples was developed on Bond Elut RP8 500-mg cartridges; conditioning was performed with 5 ml methanol and 5 ml water, after which 1 ml plasma that had previously been spiked with 5 microM S-Ag was applied. Washing was done with 6 ml water and elution, with 4 ml methanol. After evaporation to dryness, residues were dissolved in 400 microliters anhydrous ethanol and 12-48 microliters was injected onto the HPLC system. Blank plasma from healthy donors showed the random presence of a small interference eluting at the retention time of R-Rog, precluding the accurate quantification of R-Rog concentrations below 2.5 microM. Reproducibility assays demonstrated the need to use an internal standard. Taking into account the internal standard, at 2.5 microM the intra- and inter-assay coefficients of variation were 10.5% and 21.0% for R-Rog, 5.5% and 8.7% for S-Rog, 7.6% and 20.8% for R-Nox and 11.7% and 6.4% for S-Nox, respectively. The detection limit was 2.5 microM for R-Rog, 0.5 microM for S-Rog, 0.25 microM for R-Nox and 0.5 microM for S-Nox. Linearity was satisfactory at concentrations ranging from 2.5 to 10 microM for R-Rog, from 0.5 to 10 microM for S-Rog, from 0.25 to 2.5 microM for R-Nox and from 0.50 to 2

  14. High performance systems

    Energy Technology Data Exchange (ETDEWEB)

    Vigil, M.B. [comp.

    1995-03-01

    This document provides a written compilation of the presentations and viewgraphs from the 1994 Conference on High Speed Computing, "High Performance Systems," held at Gleneden Beach, Oregon, on April 18 through 21, 1994.

  15. Responsive design high performance

    CERN Document Server

    Els, Dewald

    2015-01-01

    This book is ideal for developers who have experience in developing websites or possess minor knowledge of how responsive websites work. No experience of high-level website development or performance tweaking is required.

  16. High Performance Macromolecular Material

    National Research Council Canada - National Science Library

    Forest, M

    2002-01-01

    .... In essence, most commercial high-performance polymers are processed through fiber spinning, following Nature and spider silk, which is still pound-for-pound the toughest liquid crystalline polymer...

  17. Method for the determination of catechin and epicatechin enantiomers in cocoa-based ingredients and products by high-performance liquid chromatography: single-laboratory validation.

    Science.gov (United States)

    Machonis, Philip R; Jones, Matthew A; Schaneberg, Brian T; Kwik-Uribe, Catherine L

    2012-01-01

    A single-laboratory validation study was performed for an HPLC method to identify and quantify the flavanol enantiomers (+)- and (-)-epicatechin and (+)- and (-)-catechin in cocoa-based ingredients and products. These compounds were eluted isocratically with an ammonium acetate-methanol mobile phase applied to a modified beta-cyclodextrin chiral stationary phase and detected using fluorescence. Spike recovery experiments using appropriate matrix blanks, along with cocoa extract, cocoa powder, and dark chocolate, were used to evaluate accuracy, repeatability, specificity, LOD, LOQ, and linearity of the method as performed by a single analyst on multiple days. In all samples analyzed, (-)-epicatechin was the predominant flavanol and represented 68-91% of the total monomeric flavanols detected. For the cocoa-based products, within-day (intraday) precision for (-)-epicatechin was between 1.46-3.22%, for (+)-catechin between 3.66-6.90%, and for (-)-catechin between 1.69-6.89%; (+)-epicatechin was not detected in these samples. Recoveries for the three sample types investigated ranged from 82.2 to 102.1% at the 50% spiking level, 83.7 to 102.0% at the 100% spiking level, and 80.4 to 101.1% at the 200% spiking level. Based on performance results, this method may be suitable for routine laboratory use in analysis of cocoa-based ingredients and products.
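
    Spike recovery and within-day precision of the kind reported here reduce to simple ratios: recovery is the measured spiked result (minus the unspiked background) divided by the amount added, and precision is the relative standard deviation of replicate results. The short sketch below illustrates both calculations with made-up replicate values; none of the numbers are the study's data.

      import numpy as np

      # Hypothetical (-)-epicatechin results (mg/g) for a cocoa powder sample.
      unspiked = 2.00          # native level measured in the unspiked sample
      added    = 2.00          # amount added at the 100 % spiking level
      spiked   = np.array([3.92, 4.05, 3.88, 3.97, 4.01, 3.95])  # six replicate spiked results

      recovery = (spiked.mean() - unspiked) / added * 100.0   # % recovery at this spiking level
      rsd = spiked.std(ddof=1) / spiked.mean() * 100.0        # within-day precision (% RSD)

      print(f"mean recovery = {recovery:.1f} %, RSD = {rsd:.2f} %")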

  18. Constructing and Validating High-Performance MIEC-SVM Models in Virtual Screening for Kinases: A Better Way for Actives Discovery.

    Science.gov (United States)

    Sun, Huiyong; Pan, Peichen; Tian, Sheng; Xu, Lei; Kong, Xiaotian; Li, Youyong; Dan Li; Hou, Tingjun

    2016-04-22

    The MIEC-SVM approach, which combines molecular interaction energy components (MIEC) derived from free energy decomposition with a support vector machine (SVM), has been found effective in capturing the energetic patterns of protein-peptide recognition. However, the performance of this approach in identifying small molecule inhibitors of drug targets has not been well assessed and validated by experiments. Then, by combining different model construction protocols, the issues related to developing the best MIEC-SVM models were first examined for three kinase targets (ABL, ALK, and BRAF). For the investigated targets, the optimized MIEC-SVM models performed much better than the models based on the default SVM parameters and than Autodock for the tested datasets. The proposed strategy was then utilized to screen the Specs database for potential inhibitors of the ALK kinase. The experimental results showed that the optimized MIEC-SVM model identified 7 actives with IC50 < 10 μM among 50 purchased compounds (a hit rate of 14%, with 4 at the nM level), performing much better than Autodock (3 actives with IC50 < 10 μM among 50 purchased compounds, a hit rate of 6%, with 2 at the nM level), which suggests that the proposed strategy is a powerful tool in structure-based virtual screening.
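
    The general MIEC-SVM workflow turns each docked protein-ligand pair into a fixed-length vector of per-residue interaction energy terms and trains a binary classifier on those vectors. The sketch below shows that workflow with scikit-learn on randomly generated placeholder features; it is not the authors' protocol, and the feature construction, kernel choice, and parameter values are assumptions for illustration.

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)

      # Placeholder data: 200 compounds x 120 MIEC features (e.g. per-residue van der
      # Waals, electrostatic, and solvation terms from free energy decomposition).
      X = rng.normal(size=(200, 120))
      y = rng.integers(0, 2, size=200)   # 1 = active, 0 = inactive (random labels here)

      # RBF-kernel SVM with feature scaling; C and gamma would normally be tuned by grid search.
      model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))

      scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
      print(f"5-fold cross-validated AUC: {scores.mean():.2f} +/- {scores.std():.2f}")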

  19. Validation of a high-performance liquid chromatography/fluorescence detection method for the simultaneous quantification of fifteen polycyclic aromatic hydrocarbons

    DEFF Research Database (Denmark)

    Hansen, Åse Marie; Olsen, I L; Holst, E

    1991-01-01

    A high-performance liquid chromatography/fluorescence method using multiple wavelength shift for simultaneous quantification of different PAH compounds was developed. The new method was superior to the methods of DONG and GREENBERG [J. Liquid Chromatogr. 11, 1887-1905 (1988)] and WISE et al...... that no systematic errors, and only small unsystematic errors, could be demonstrated. Furthermore, the method had a good reproducibility and a high sensitivity....

  20. Validating a High Performance Liquid Chromatography-Ion Chromatography (HPLC-IC) Method with Conductivity Detection After Chemical Suppression for Water Fluoride Estimation.

    Science.gov (United States)

    Bondu, Joseph Dian; Selvakumar, R; Fleming, Jude Joseph

    2018-01-01

    A variety of methods, including the Ion Selective Electrode (ISE), have been used for the estimation of fluoride levels in drinking water. But as these methods suffer many drawbacks, the newer method of IC has replaced many of them. The study aimed at (1) validating IC for the estimation of fluoride levels in drinking water and (2) assessing drinking water fluoride levels of villages in and around Vellore district using IC. Forty-nine paired drinking water samples were measured using the ISE and IC methods (Metrohm). Water samples from 165 randomly selected villages in and around Vellore district were collected for fluoride estimation over 1 year. Standardization of the IC method showed good within-run precision, linearity, and coefficient of variance, with correlation coefficient R2 = 0.998. The limit of detection was 0.027 ppm and the limit of quantification was 0.083 ppm. Among the 165 villages, 46.1% recorded water fluoride levels >1.00 ppm, of which 19.4% had levels ranging from 1 to 1.5 ppm, 10.9% had levels of 1.5-2 ppm, and about 12.7% had levels of 2.0-3.0 ppm. Three percent of villages had more than 3.0 ppm fluoride in the water tested. Most (44.42%) of these villages belonged to Jolarpet taluk, with moderate to high (0.86-3.56 ppm) water fluoride levels. The Ion Chromatography method has been validated and is therefore a reliable method for the assessment of fluoride levels in drinking water. Residents of Jolarpet taluk (Vellore district) were found to be at high risk of developing dental and skeletal fluorosis.

  1. Simultaneous determination of some water-soluble vitamins and preservatives in multivitamin syrup by validated stability-indicating high-performance liquid chromatography method.

    Science.gov (United States)

    Vidović, Stojanka; Stojanović, Biljana; Veljković, Jelena; Prazić-Arsić, Ljiljana; Roglić, Goran; Manojlović, Dragan

    2008-08-22

    An HPLC stability-indicating method has been developed for the simultaneous determination of some water-soluble vitamins (ascorbic acid, thiamine hydrochloride, riboflavin-5'-phosphate sodium, pyridoxine hydrochloride, nicotinamide, D(+)-panthenol) and two preservatives (methylparaben and sodium benzoate) in a multivitamin syrup preparation. The water-soluble vitamins, preservatives and their degradants were separated on a Zorbax SB-Aq (C(18)) (250 mm x 4.6 mm, 5 microm) column at ambient temperature. Combined isocratic and gradient elution was performed with a mobile phase consisting of 0.0125 M hexane-1-sulfonic acid sodium salt in 0.1% (m/v) o-phosphoric acid, pH 2.4-2.5 (solvent A), and acetonitrile (solvent B) at a flow rate of 1 ml min(-1). Starting with solvent A, isocratic elution was performed for 15 min; the composition was then changed to 85% A and 15% B over the next 20 min and held constant for 5 min, then changed to 70% A and 30% B over the next 15 min and held constant for 5 min, and finally returned to 100% A as at the beginning of the elution. Detection was performed with a diode array detector at 210, 230 and 254 nm. The multivitamin syrup preparation was subjected to stress testing (forced degradation) in order to demonstrate that degradants from the vitamins, preservatives and/or product excipients do not interfere with the quantification of the vitamins and preservatives. Typical validation characteristics (selectivity, accuracy, precision, linearity, range, limit of quantification and limit of detection) were evaluated for the vitamins and preservatives.

  2. Clojure high performance programming

    CERN Document Server

    Kumar, Shantanu

    2013-01-01

    This is a short, practical guide that will teach you everything you need to know to start writing high performance Clojure code.This book is ideal for intermediate Clojure developers who are looking to get a good grip on how to achieve optimum performance. You should already have some experience with Clojure and it would help if you already know a little bit of Java. Knowledge of performance analysis and engineering is not required. For hands-on practice, you should have access to Clojure REPL with Leiningen.

  3. High Performance Concrete

    Directory of Open Access Journals (Sweden)

    Traian Oneţ

    2009-01-01

    Full Text Available The paper presents the latest studies and research carried out in Cluj-Napoca related to high performance concrete, high strength concrete and self-compacting concrete. The purpose of this paper is to review the advantages and drawbacks of using a particular concrete type. Two concrete recipes are presented, namely one for concrete used in rigid road pavements and another for self-compacting concrete.

  4. High performance polymeric foams

    International Nuclear Information System (INIS)

    Gargiulo, M.; Sorrentino, L.; Iannace, S.

    2008-01-01

    The aim of this work was to investigate the foamability of high-performance polymers (polyethersulfone, polyphenylsulfone, polyetherimide and polyethylene naphthalate). Two different methods have been used to prepare the foam samples: high temperature expansion and a two-stage batch process. The effects of processing parameters (saturation time and pressure, foaming temperature) on the densities and microcellular structures of these foams were analyzed by using scanning electron microscopy

  5. High performance conductometry

    International Nuclear Information System (INIS)

    Saha, B.

    2000-01-01

    Inexpensive but high performance systems have emerged progressively for basic and applied measurements in physical and analytical chemistry on one hand, and for on-line monitoring and leak detection in plants and facilities on the other. Salient features of the developments will be presented with specific examples

  6. Danish High Performance Concretes

    DEFF Research Database (Denmark)

    Nielsen, M. P.; Christoffersen, J.; Frederiksen, J.

    1994-01-01

    In this paper the main results obtained in the research program High Performance Concretes in the 90's are presented. This program was financed by the Danish government and was carried out in cooperation between The Technical University of Denmark, several private companies, and Aalborg University...... concretes, workability, ductility, and confinement problems....

  7. High performance homes

    DEFF Research Database (Denmark)

    Beim, Anne; Vibæk, Kasper Sánchez

    2014-01-01

    . Consideration of all these factors is a precondition for a truly integrated practice and as this chapter demonstrates, innovative project delivery methods founded on the manufacturing of prefabricated buildings contribute to the production of high performance homes that are cost effective to construct, energy...

  8. Validation of geotechnical software for repository performance assessment

    International Nuclear Information System (INIS)

    LeGore, T.; Hoover, J.D.; Khaleel, R.; Thornton, E.C.; Anantatmula, R.P.; Lanigan, D.C.

    1989-01-01

    An important step in the characterization of a high level nuclear waste repository is to demonstrate that the geotechnical software used in performance assessment correctly models the physical processes involved; this is commonly referred to as model validation. There is another type of validation, called software validation. It is based on meeting the requirements of specifications documents (e.g. IEEE specifications) and does not directly address the correctness of the specifications. The process of comparing physical experimental results with the predicted results should incorporate an objective measure of the level of confidence regarding correctness. This paper reports on a methodology that allows the experimental uncertainties to be explicitly included in the comparison process. The methodology also allows objective confidence levels to be associated with the software. In the event of a poor comparison, the method also lays the foundation for improving the software

  9. Validation of a method for simultaneous determination of nitroimidazoles, benzimidazoles and chloramphenicols in swine tissues by ultra-high performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Xia, Xi; Wang, Yuanyuan; Wang, Xia; Li, Yun; Zhong, Feng; Li, Xiaowei; Huang, Yaoling; Ding, Shuangyang; Shen, Jianzhong

    2013-05-31

    This paper presents a sensitive and confirmatory multi-residue method for the analysis of 23 veterinary drugs and metabolites belonging to three classes (nitroimidazoles, benzimidazoles, and chloramphenicols) in porcine muscle, liver, and kidney. After extraction with ethyl acetate and basic ethyl acetate sequentially, the crude extracts were defatted with hexane and further purified using Oasis MCX solid-phase extraction cartridges. Rapid determination was carried out by ultra-high performance liquid chromatography-electrospray ionization tandem mass spectrometry. Data acquisition was performed in positive and negative mode simultaneously. Recoveries based on matrix-matched calibrations for muscle, liver, and kidney ranged from 50.6 to 108.1%. The method quantification limits were in the range of 3-100 ng/kg. Copyright © 2012 Elsevier B.V. All rights reserved.
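
    Matrix-matched calibration, on which the recoveries above are based, means that the calibration standards are prepared in blank tissue extract so that matrix suppression or enhancement affects standards and samples alike. A minimal sketch of quantifying a fortified sample against such a curve is given below; all spike levels and peak areas are invented for illustration.

      import numpy as np

      # Hypothetical matrix-matched calibration: blank porcine muscle extract spiked
      # at several levels (ug/kg) and the corresponding peak areas.
      spike_levels = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])
      peak_areas   = np.array([120.0, 860.0, 1620.0, 3150.0, 7800.0, 15400.0])

      slope, intercept = np.polyfit(spike_levels, peak_areas, 1)

      def concentration(sample_area):
          """Concentration of a sample read off the matrix-matched calibration curve."""
          return (sample_area - intercept) / slope

      fortified_known = 2.0                    # ug/kg added to a recovery sample
      measured = concentration(3020.0)         # hypothetical peak area of that sample
      print(f"measured = {measured:.2f} ug/kg, recovery = {measured / fortified_known * 100:.1f} %")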

  10. A novel validated stability indicating high performance liquid chromatographic method for estimation of degradation behavior of ciprofloxacin and tinidazole in solid oral dosage

    Directory of Open Access Journals (Sweden)

    Bhupendrasinh K Vaghela

    2013-01-01

    Full Text Available Objective: The objective of the current investigation was to study the degradation behavior of Ciprofloxacin and Tinidazole. The study was performed under International Conference on Harmonization recommended stress conditions. A novel stability-indicating reverse phase HPLC method was developed for the determination of Ciprofloxacin and Tinidazole purity in the presence of their impurities and forced degradation products. The method is also capable of separating placebo peaks in pharmaceutical dosage forms. The solid oral dosage form was subjected to stress conditions such as oxidative, acid and base hydrolysis, heat, and photolytic degradation. Materials and Methods: The method was developed using a Waters Symmetry Shield Reverse Phase (RP) C18 column (250 mm x 4.6 mm, 5 μ) as the stationary phase. The mobile phase contained a gradient mixture of solvents A and B. A 10 mM phosphate buffer, adjusted to pH 3.0 with phosphoric acid, was used as the buffer. Buffer pH 3.0 was used as solvent A, and buffer pH 3.0: acetonitrile in the ratio of 20:80 v/v was used as solvent B. The eluted compounds were monitored at 278 nm (Ciprofloxacin) and 317 nm (Tinidazole). The run time was 50 minutes. Results: In the precision study, the % RSD for the results of Ciprofloxacin, Tinidazole and their impurities was below 10%. The method was linear with correlation coefficients greater than 0.997. The percentage recoveries were calculated and ranged from 93.0% to 106.7%. The peak purity of the Ciprofloxacin and Tinidazole peaks did not show any flag, thus proving the stability-indicating power of the method. Conclusion: The developed method was validated as per ICH guidelines with respect to specificity, linearity, limit of detection, limit of quantification, accuracy, precision and robustness.

  11. High-Performance Networking

    CERN Multimedia

    CERN. Geneva

    2003-01-01

    The series will start with a historical introduction about what people saw as high performance message communication in their time and how that developed into today's standard computer network communication. It will be followed by a far more technical part that uses the High Performance Computer Network standards of the 90's, with 1 Gbit/sec systems, as an introduction to an in-depth explanation of the three new 10 Gbit/s network and interconnect technology standards that already exist or are emerging. Where necessary for a good understanding, some sidesteps will be included to explain important protocols as well as relevant details of the Wide Area Network (WAN) standards concerned, including some basics of wavelength multiplexing (DWDM). Some remarks will be made concerning the rapidly expanding applications of networked storage.

  12. High performance data transfer

    Science.gov (United States)

    Cottrell, R.; Fang, C.; Hanushevsky, A.; Kreuger, W.; Yang, W.

    2017-10-01

    The exponentially increasing need for high speed data transfer is driven by big data and cloud computing, together with the needs of data-intensive science, High Performance Computing (HPC), defense, the oil and gas industry, etc. We report on the Zettar ZX software. This has been developed since 2013 to meet these growing needs by providing high performance data transfer and encryption in a scalable, balanced, easy to deploy and use way while minimizing power and space utilization. In collaboration with several commercial vendors, Proofs of Concept (PoC) consisting of clusters have been put together using off-the-shelf components to test the ZX scalability and its ability to balance services across multiple cores and links. The PoCs are based on SSD flash storage that is managed by a parallel file system. Each cluster occupies 4 rack units. Using the PoCs, we have achieved between clusters almost 200 Gbps memory to memory over two 100 Gbps links, and 70 Gbps parallel file to parallel file with encryption over a 5000 mile 100 Gbps link.

  13. Validation of interventional fiber optic spectroscopy with MR Spectroscopy, MAS-NMR spectroscopy, high-performance thin-layer chromatography, and histopathology for accurate hepatic fat quantification

    NARCIS (Netherlands)

    Nachabé, R.; Hoorn, J.W.A. van der; Molengraaf, R. van de; Lamerichs, R.; Pikkemaat, J.; Sio, C.F.; Hendriks, B.H.W.; Sterenborg, H.J.C.M.

    2012-01-01

    Objectives: To validate near-infrared (NIR)-based optical spectroscopy measurements of hepatic fat content using a minimally invasive needle-like probe with integrated optical fibers, enabling real-time feedback during percutaneous interventions. The results were compared with magnetic resonance

  14. Development and validation of an high-performance liquid chromatography-diode array detector method for the simultaneous determination of six phenolic compounds in abnormal savda munziq decoction

    Science.gov (United States)

    Tian, Shuge; Liu, Wenxian; Liu, Feng; Zhang, Xuejia; Upur, Halmuart

    2015-01-01

    Aims: Given the high effectiveness and low toxicity of abnormal savda munziq (ASMQ), its herbal formulation has long been used in traditional Uyghur medicine to treat complex diseases such as cancer, diabetes, and cardiovascular diseases. Settings and Design: A reversed-phase high-performance liquid chromatography method coupled with a diode array detector was successfully developed for the simultaneous quality assessment of gallic acid, protocatechuic acid, caffeic acid, rutin, rosmarinic acid, and luteolin in ASMQ decoction. The six phenolic compounds were separated on an Agilent TC-C18 reversed-phase analytical column (4.6 × 250 mm, 5 μm) by gradient elution using 0.3% aqueous formic acid (v/v) and 0.3% formic acid in methanol (v/v) at 1.0 mL/min. Materials and Methods: The plant materials were separately ground and mixed in the following proportions: Cordia dichotoma (10.6), Anchusa italica (10.6), Euphorbia humifusa (4.9), Adiantum capillus-veneris (4.9), Ziziphus jujube (4.9), Glycyrrhiza uralensis (7.1), Foeniculum vulgare (4.9), Lavandula angustifolia (4.9), Dracocephalum moldavica L. (4.9), and Alhagi pseudoalhagi (42.3). Statistical Analysis Used: The precision of all six compounds was within acceptable limits, and highly significant linear correlations were found between component concentrations and the corresponding chromatographic peak areas (R2 > 0.999). Results: The proposed method was successfully applied to determine the levels of the six active components in ASMQ. Conclusions: Given the simplicity, precision, specificity, and sensitivity of the method, it can be utilized as a quality control approach for simultaneously determining the six phenolic compounds in ASMQ. PMID:25709227

  15. Validation of the Performance of Engineered Barriers

    International Nuclear Information System (INIS)

    Choi, Jongwon; Cho, Wonjin; Kwon, Sangki

    2012-04-01

    To study the thermal-hydro-mechanical (THM) and thermal-hydro-mechanical-chemical (THMC) behavior of the engineered barrier system (EBS), the engineering-scale experiments KENTEX and KENTEX-C were conducted to investigate THM and THMC behavior in the buffer. Computer modelling and simulation programmes were developed to analyze the distributions of temperature, water content, and total pressure, together with the measured data on the migration behavior of anions and cations. In-situ heater tests were performed to investigate the effect of ventilation, the thermal characteristics of the EDZ, and the effect of the anisotropy of the rock mass and joints, in addition to the investigation of the thermo-mechanical behavior of the rock mass. Geophysical exploration and in-situ field tests were carried out to investigate the extent of the EDZ and its effects on the mechanical properties of the rock. Subsequently, crack propagation characteristics and dynamic material properties of the jointed rock mass in KURT were measured. Concurrently, in-situ experiments were performed in the KURT to investigate the change of hydraulic properties in the EDZ. Stainless steel molds were manufactured to fabricate buffer blocks of various shapes. Experiments were carried out to check the mechanical properties and the workability of installation of the fabricated blocks and to investigate the resaturation processes. The state of the technology on the application of cementitious materials to the HLW repository was analysed and an optimized low-pH cement recipe was obtained. The material properties of low-pH and high-pH cement grouts were evaluated based on the grout recipes of ONKALO in Finland. The KURT was operated, and various technical supports were provided to the in-situ experiments carried out at KURT

  16. A validated analytical method to study the long-term stability of natural and synthetic glucocorticoids in livestock urine using ultra-high performance liquid chromatography coupled to Orbitrap-high resolution mass spectrometry.

    Science.gov (United States)

    De Clercq, Nathalie; Julie, Vanden Bussche; Croubels, Siska; Delahaut, Philippe; Vanhaecke, Lynn

    2013-08-02

    Due to their growth-promoting effects, the use of synthetic glucocorticoids is strictly regulated in the European Union (Council Directive 2003/74/EC). Within the framework of the national control plans, which should ensure the absence of residues in food products of animal origin, a higher frequency of prednisolone-positive bovine urine samples has been observed in recent years. This has raised questions with respect to the stability of natural corticoids in the respective urine samples and their potential to be transformed into synthetic analogs. In this study, an ultra-high performance liquid chromatography-high resolution mass spectrometry (UHPLC-HRMS) methodology was developed to examine the stability of glucocorticoids in bovine urine under various storage conditions (up to 20 weeks) and to define suitable conditions for sample handling and storage, using an Orbitrap Exactive™. To this end, an extraction procedure was optimized using a Plackett-Burman experimental design to determine the key conditions for optimal extraction of glucocorticoids from urine. Next, the analytical method was successfully validated according to the guidelines of CD 2002/657/EC. Decision limits and detection capabilities for prednisolone, prednisone and methylprednisolone ranged, respectively, from 0.1 to 0.5 μg L(-1) and from 0.3 to 0.8 μg L(-1). For the natural glucocorticoids dihydrocortisone, cortisol and cortisone, the limits of detection and limits of quantification ranged, respectively, from 0.1 to 0.2 μg L(-1) and from 0.3 to 0.8 μg L(-1). The stability study demonstrated that filter-sterilization of urine, storage at -80°C, and acidic conditions (pH 3) were optimal for the preservation of glucocorticoids in urine and were able to significantly limit degradation for up to 20 weeks. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. High performance sapphire windows

    Science.gov (United States)

    Bates, Stephen C.; Liou, Larry

    1993-02-01

    High-quality, wide-aperture optical access is usually required for the advanced laser diagnostics that can now make a wide variety of non-intrusive measurements of combustion processes. Specially processed and mounted sapphire windows are proposed to provide this optical access in extreme environments. Through surface treatments and proper thermal stress design, single crystal sapphire can be a mechanically equivalent replacement for high strength steel. A prototype sapphire window and mounting system have been developed in a successful NASA SBIR Phase 1 project. A large and reliable increase in sapphire design strength (as much as 10x) has been achieved, and the initial specifications necessary for these gains have been defined. Failure testing of small windows has conclusively demonstrated the increased sapphire strength, indicating that a nearly flawless surface polish is the primary cause of strengthening, while an unusual mounting arrangement also contributes significantly to a larger effective strength. Phase 2 work will complete the specification and demonstration of these windows, and will fabricate a set for use at NASA. The enhanced capabilities of these high performance sapphire windows will lead to many diagnostic capabilities not previously possible, as well as new applications for sapphire.

  18. Validation of blood vitamin A concentrations in cattle: comparison of a new cow-side test (iCheck™ FLUORO) with high-performance liquid chromatography (HPLC).

    Science.gov (United States)

    Raila, Jens; Kawashima, Chiho; Sauerwein, Helga; Hülsmann, Nadine; Knorr, Christoph; Myamoto, Akio; Schweigert, Florian J

    2017-05-10

    Plasma concentration of retinol is an accepted indicator for assessing the vitamin A (retinol) status in cattle. However, the determination of vitamin A requires a time-consuming multi-step procedure, which needs specific equipment to perform extraction, centrifugation or saponification prior to high-performance liquid chromatography (HPLC). The concentrations of retinol in whole blood (n = 10), plasma (n = 132) and serum (n = 61) were measured by a new rapid cow-side test (iCheck™ FLUORO) and compared with those obtained by HPLC in two independent laboratories in Germany (DE) and Japan (JP). Retinol concentrations ranged from 0.033 to 0.532 mg/L in plasma and from 0.043 to 0.360 mg/L in serum (HPLC method). No significant differences in retinol levels were observed between the new rapid cow-side test and HPLC performed in the different laboratories (HPLC vs. iCheck™ FLUORO: 0.320 ± 0.047 mg/L vs. 0.333 ± 0.044 mg/L, and 0.240 ± 0.096 mg/L vs. 0.241 ± 0.069 mg/L, lab DE and lab JP, respectively). A similar comparability was observed when whole blood was used (HPLC vs. iCheck™ FLUORO: 0.353 ± 0.084 mg/L vs. 0.341 ± 0.064 mg/L). The results showed good agreement between both methods, with a correlation coefficient of r2 = 0.87 (P < 0.001), and Bland-Altman plots revealed no significant bias for any comparison. With the new rapid cow-side test (iCheck™ FLUORO), retinol concentrations in cattle can be reliably assessed within a few minutes, directly in the barn, using even whole blood without the need for prior centrifugation. The ease of application of the new rapid cow-side test and its portability can improve the diagnosis of vitamin A status and will help to control vitamin A supplementation in specific vitamin A feeding regimes, such as those used to optimize health status in calves or meat marbling in Japanese Black cattle.
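
    Bland-Altman analysis, used here to compare iCheck FLUORO with HPLC, plots the difference between paired measurements against their mean and reports the mean difference (bias) together with its 95 % limits of agreement (bias ± 1.96 SD). The short sketch below performs that calculation on simulated paired retinol values; the data are generated at random and are not from this study.

      import numpy as np

      rng = np.random.default_rng(1)

      # Simulated paired retinol results (mg/L) from HPLC and the cow-side test.
      hplc   = rng.uniform(0.05, 0.50, size=60)
      icheck = hplc + rng.normal(0.0, 0.02, size=60)

      diff = icheck - hplc
      mean = (icheck + hplc) / 2.0            # x-axis of the Bland-Altman plot (not drawn here)

      bias = diff.mean()
      loa_low  = bias - 1.96 * diff.std(ddof=1)
      loa_high = bias + 1.96 * diff.std(ddof=1)
      r2 = np.corrcoef(hplc, icheck)[0, 1] ** 2

      print(f"bias = {bias:.3f} mg/L, limits of agreement = [{loa_low:.3f}, {loa_high:.3f}], r2 = {r2:.2f}")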

  19. Determination of C-glucosidic ellagitannins in Lythri salicariae herba by ultra-high performance liquid chromatography coupled with charged aerosol detector: method development and validation.

    Science.gov (United States)

    Granica, Sebastian; Piwowarski, Jakub P; Kiss, Anna K

    2014-01-01

    Lythri salicariae herba is a pharmacopoeial plant material used by patients in the form of infusions in the treatment of acute diarrhoea. According to its pharmacopoeial monograph it is standardised for total tannin content, which should be not less than 5.0%, using pyrogallol as a standard. Previous studies have shown that aqueous extracts from Lythri herba contain mainly ellagitannins, among which vescalagin, castalagin and salicarinins A and B are the dominating constituents. To develop and validate an efficient ultra-high performance liquid chromatography (UHPLC) method coupled with a charged aerosol detector (CAD) for the quantification of four major ellagitannins in Lythri salicariae herba and in one commercial preparation. The extraction conditions for ellagitannins from the plant material were optimised. The relative response factors for vescalagin, castalagin and salicarinins A and B, using gallic acid as an external standard, were determined for the CAD. A UHPLC method for the quantification of ellagitannins was then developed and validated. The four major ellagitannins were quantified in four samples of Lythri herba and in one commercial preparation. The sum of ellagitannins was determined for each sample and varied from 30.66 to 48.80 mg/g of raw material, and was 16.57 mg per capsule for the preparation investigated. The first validated UHPLC-CAD method for the quantification of the four major ellagitannins was developed. The universality of the CAD response was evaluated, and it is shown that although all the compounds analysed have similar structures, their CAD responses differ significantly. Copyright © 2013 John Wiley & Sons, Ltd.

  20. Simultaneous determination of plasma creatinine, uric acid, kynurenine and tryptophan by high-performance liquid chromatography: method validation and its application to the assessment of renal function.

    Science.gov (United States)

    Zhao, Jianxing

    2015-03-01

    A high-performance liquid chromatography method with ultraviolet detection has been developed for the simultaneous determination of a set of reliable markers of renal function, including creatinine, uric acid, kynurenine and tryptophan, in plasma. Separation was achieved on an Agilent HC-C18(2) analytical column. Gradient elution and programmed wavelength detection allowed these compounds to be analyzed in a single injection. The total run time was 25 min, with all peaks of interest eluting within 13 min. Good linear responses were found, with correlation coefficients >0.999 for all analytes over the relevant concentration ranges. The recovery was: creatinine, 101 ± 1%; uric acid, 94.9 ± 3.7%; kynurenine, 100 ± 2%; and tryptophan, 92.6 ± 2.9%. Within-run and between-run coefficients of variation for all analytes were ≤2.4%. The limit of detection of the method was: creatinine, 0.1 µmol/L; uric acid, 0.05 µmol/L; kynurenine, 0.02 µmol/L; and tryptophan, 1 µmol/L. The developed method could be employed as a useful tool for the detection of chronic kidney disease, even at an early stage. Copyright © 2014 John Wiley & Sons, Ltd.
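
    For readers unfamiliar with how figures such as "correlation coefficient >0.999" arise, the following minimal sketch shows an external-standard calibration fit and the back-calculation of an unknown sample. The concentrations and peak areas are invented for illustration and are not taken from this work.

```python
# Hypothetical calibration: fit detector response vs. standard concentration,
# report the correlation coefficient, and back-calculate an unknown.
import numpy as np

conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])        # µmol/L standards (invented)
area = np.array([1.1e4, 2.2e4, 5.4e4, 1.08e5, 2.17e5])  # detector response (invented)

slope, intercept = np.polyfit(conc, area, 1)   # least-squares straight line
r = np.corrcoef(conc, area)[0, 1]              # correlation coefficient

unknown_area = 7.6e4
unknown_conc = (unknown_area - intercept) / slope
print(f"r = {r:.4f}, back-calculated concentration = {unknown_conc:.1f} µmol/L")
```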

  1. Determination and validation of a simple high-performance liquid chromatographic method for simultaneous assay of iprodione and vinclozolin in human urine.

    Science.gov (United States)

    Carlucci, Giuseppe; Pasquale, Dorina Di; Ruggieri, Fabrizio; Mazzeo, Pietro

    2005-12-15

    A method based on solid-phase extraction (SPE) and high-performance liquid chromatography (HPLC) was developed for the simultaneous determination of 3-(3,5-dichlorophenyl)-5-ethenyl-5-methyl-2,4-oxazolidinedione (vinclozolin) and 3-(3,5-dichlorophenyl)-N-(1-methylethyl)-2,4-dioxo-1-imidazolidinecarboxamide (iprodione) in human urine. Urine samples containing vinclozolin and iprodione were extracted by solid-phase extraction using C(18) cartridges. The chromatographic separation was achieved on a Spherisorb ODS2 (250 mm x 4.6 mm, 5 microm) column with an isocratic mobile phase of acetonitrile-water (60:40, v/v). Detection was by UV absorbance at 220 nm. The calibration graphs were linear from 30 to 1000 ng/mL for the two fungicides. Intra- and inter-day R.S.D. did not exceed 2.9%. The quantitation limits were 50 ng/mL for vinclozolin and 30 ng/mL for iprodione, respectively.

  2. Stress degradation studies of Telmisartan and Metoprolol extended release tablets by a validated stability indicating reverse phase-high performance liquid chromatography method

    Directory of Open Access Journals (Sweden)

    Kabeer Ahmed Shaikh

    2014-01-01

    Background and Aim: A sensitive reverse-phase high-performance liquid chromatographic method has been developed for the simultaneous determination of Telmisartan and Metoprolol in tablet dosage form. Materials and Method: The chromatographic separation was achieved on an Inertsil ODS 3V (150 x 4.6 mm, 5 μm) analytical column. The mobile phase consisted of mobile phase A (0.05 M sodium dihydrogen phosphate buffer, pH 3.0) and mobile phase B (acetonitrile), with the gradient program (time in min/% mobile phase B): 0/22, 4/45, 6/45, 18/22, 20/22. The detector was set at 222 nm. Results and Conclusion: The described method shows excellent linearity over a range of 2-80 μg mL−1 for Telmisartan and 4-100 μg mL−1 for Metoprolol. The correlation coefficient is 0.9998 for Telmisartan and 0.9999 for Metoprolol. The proposed method was found to be suitable for determination of Telmisartan and Metoprolol in tablet dosage form. Forced degradation of the drug product was conducted in accordance with the ICH guideline. Acidic, basic, hydrolytic, oxidative, thermal and photolytic degradation conditions were used to assess the stability-indicating power of the method. The drug product was found to be stable under acid, oxidation, thermal and photolytic stress conditions, while degradation was observed under base hydrolysis stress conditions.

  3. Development and validation of a rapid ultra-high performance liquid chromatography method for the assay of benzalkonium chloride using a quality-by-design approach.

    Science.gov (United States)

    Mallik, Rangan; Raman, Srividya; Liang, Xiaoli; Grobin, Adam W; Choudhury, Dilip

    2015-09-25

    A rapid, robust reversed-phase UHPLC method has been developed for the analysis of total benzalkonium chloride in a preserved drug formulation. A systematic Quality-by-Design (QbD) method development approach using commercial, off-the-shelf software (Fusion AE®) has been used to optimize the column, mobile phases, gradient time, and other HPLC conditions. Total benzalkonium chloride analysis involves simple sample preparation. The method uses gradient elution from an ACE Excel 2 C18-AR column (50 mm × 2.1 mm, 2.0 μm particle size), ammonium phosphate buffer (pH 3.3; 10 mM) as the aqueous mobile phase and methanol/acetonitrile (85/15, v/v) as the organic mobile phase, with UV detection at 214 nm. Using these conditions, the major homologs of benzalkonium chloride (C12 and C14) have been separated in less than 2.0 min. The validation results confirmed that the method is precise, accurate and linear at concentrations ranging from 0.025 mg/mL to 0.075 mg/mL for total benzalkonium chloride. The recoveries ranged from 99% to 103% at concentrations from 0.025 mg/mL to 0.075 mg/mL for total benzalkonium chloride. The validation results also confirmed the robustness of the method as predicted by Fusion AE®. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Policy and Validity Prospects for Performance-Based Assessment.

    Science.gov (United States)

    Baker, Eva L.; And Others

    1994-01-01

    This article describes performance-based assessment as expounded by its proponents, comments on these conceptions, reviews evidence regarding the technical quality of performance-based assessment, and considers its validity under various policy options. (JDD)

  5. Validating the BISON fuel performance code to integral LWR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, R.L., E-mail: Richard.Williamson@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gamble, K.A., E-mail: Kyle.Gamble@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Perez, D.M., E-mail: Danielle.Perez@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Novascone, S.R., E-mail: Stephen.Novascone@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Pastore, G., E-mail: Giovanni.Pastore@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gardner, R.J., E-mail: Russell.Gardner@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Hales, J.D., E-mail: Jason.Hales@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Liu, W., E-mail: Wenfeng.Liu@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States); Mai, A., E-mail: Anh.Mai@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States)

    2016-05-15

    Highlights: • The BISON multidimensional fuel performance code is being validated to integral LWR experiments. • Code and solution verification are necessary prerequisites to validation. • Fuel centerline temperature comparisons through all phases of fuel life are very reasonable. • Accuracy in predicting fission gas release is consistent with state-of-the-art modeling and the involved uncertainties. • Rod diameter comparisons are not satisfactory and further investigation is underway. - Abstract: BISON is a modern finite element-based nuclear fuel performance code that has been under development at Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. Code validation is underway and is the subject of this study. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described, followed by a summary of the experimental data used to date for validation of Light Water Reactor (LWR) fuel. Validation comparisons focus on fuel centerline temperature, fission gas release, and rod diameter both before and following fuel-clad mechanical contact. Comparisons for 35 LWR rods are consolidated to provide an overall view of how the code is predicting physical behavior, with a few select validation cases discussed in greater detail. Results demonstrate that (1) fuel centerline temperature comparisons through all phases of fuel life are very reasonable with deviations between predictions and experimental data within ±10% for early life through high burnup fuel and only slightly out of these bounds for power ramp experiments, (2) accuracy in predicting fission gas release appears to be consistent with state-of-the-art modeling and with the involved uncertainties and (3) comparison

  6. A validated high-performance liquid chromatographic method for the determination of glibenclamide in human plasma and its application to pharmacokinetic studies.

    Science.gov (United States)

    Niopas, Ioannis; Daftsios, Athanasios C

    2002-05-15

    Glibenclamide is a potent second-generation oral sulfonylurea antidiabetic agent widely used for the treatment of type II diabetes mellitus. A rapid, sensitive, precise, accurate and specific HPLC assay for the determination of glibenclamide in human plasma was developed and validated. After addition of flufenamic acid as internal standard, the analytes were isolated from human plasma by liquid-liquid extraction. The method was linear in the 10-400 ng/ml concentration range (r > 0.999). Recovery was greater than 91.5% for glibenclamide and 93.5% for the internal standard. Within-day and between-day precision, expressed as the relative standard deviation (RSD%), ranged from 1.4 to 5.9% and 5.8 to 6.6%, respectively. Assay accuracy was better than 93.4%. The assay was used to estimate the pharmacokinetics of glibenclamide after oral administration of a 5 mg tablet of glibenclamide to 18 healthy volunteers.

  7. Development and validation of a simple high-performance liquid chromatography analytical method for simultaneous determination of phytosterols, cholesterol and squalene in parenteral lipid emulsions.

    Science.gov (United States)

    Novak, Ana; Gutiérrez-Zamora, Mercè; Domenech, Lluís; Suñé-Negre, Josep M; Miñarro, Montserrat; García-Montoya, Encarna; Llop, Josep M; Ticó, Josep R; Pérez-Lozano, Pilar

    2018-02-01

    A simple analytical method for simultaneous determination of phytosterols, cholesterol and squalene in lipid emulsions was developed owing to increased interest in their clinical effects. Method development was based on commonly used stationary (C18, C8 and phenyl) and mobile phases (mixtures of acetonitrile, methanol and water) under isocratic conditions. Differences in stationary phases resulted in peak overlapping or coelution of different peaks. The best separation of all analyzed compounds was achieved on Zorbax Eclipse XDB C8 (150 × 4.6 mm, 5 μm; Agilent) and ACN-H2O-MeOH, 80:19.5:0.5 (v/v/v). In order to achieve a shorter time of analysis, the method was further optimized and gradient separation was established. The optimized analytical method was validated and tested for routine use in lipid emulsion analyses. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Development and validation of a high performance liquid chromatography quantification method of levo-tetrahydropalmatine and its metabolites in plasma and brain tissues: application to a pharmacokinetic study.

    Science.gov (United States)

    Abdallah, Inas A; Huang, Peng; Liu, Jing; Lee, David Y; Liu-Chen, Lee-Yuan; Hassan, Hazem E

    2017-04-01

    Levo-tetrahydropalmatine (l-THP) is an alkaloid isolated from Chinese medicinal herbs of the Corydalis and Stephania genera. It has been used in China for more than 40 years, mainly as an analgesic with sedative/hypnotic effects. Despite its extensive use, its metabolism has not been quantitatively studied, nor is there a sensitive, reliable bioanalytical method for its quantification simultaneously with its metabolites. As such, the objective of this study was to develop and validate a sensitive and selective HPLC method for simultaneous quantification of l-THP and its desmethyl metabolites l-corydalmine (l-CD) and l-corypalmine (l-CP) in rat plasma and brain tissues. Rat plasma and brain samples were processed by liquid-liquid extraction using ethyl acetate. Chromatographic separation was achieved on a reversed-phase Symmetry® C18 column (4.6 × 150 mm, 5 μm) at 25°C. The mobile phase consisted of acetonitrile-methanol-10 mM ammonium phosphate (pH 3) (10:30:60, v/v/v) and was used at a flow rate of 0.8 mL/min. The column eluent was monitored at excitation and emission wavelengths of 230 and 315 nm, respectively. The calibration curves were linear over the concentration range of 1-10,000 ng/mL. The intra- and interday reproducibility studies demonstrated accuracy and precision within the acceptance criteria of bioanalytical guidelines. The validated HPLC method was successfully applied to analyze samples from a pharmacokinetic study of l-THP in rats. Taken together, the developed method can be applied for bioanalysis of l-THP and its metabolites in rodents and can potentially be transferred for bioanalysis of human samples. Copyright © 2016 John Wiley & Sons, Ltd.

  9. R high performance programming

    CERN Document Server

    Lim, Aloysius

    2015-01-01

    This book is for programmers and developers who want to improve the performance of their R programs by making them run faster with large data sets or who are trying to solve a pesky performance problem.

  10. Some considerations for validation of repository performance assessment models

    International Nuclear Information System (INIS)

    Eisenberg, N.

    1991-01-01

    Validation is an important aspect of the regulatory uses of performance assessment. A substantial body of literature exists indicating the manner in which validation of models is usually pursued. Because performance models for a nuclear waste repository cannot be tested over the long time periods for which the model must make predictions, the usual avenue for model validation is precluded. Further impediments to model validation include a lack of fundamental scientific theory to describe important aspects of repository performance and an inability to easily deduce the complex, intricate structures characteristic of a natural system. A successful strategy for validation must attempt to resolve these difficulties in a direct fashion. Although some procedural aspects will be important, the main reliance of validation should be on scientific substance and logical rigor. The level of validation needed will be mandated, in part, by the uses to which these models are put, rather than by the ideal of validation of a scientific theory. Because of the importance of the validation of performance assessment models, the NRC staff has engaged in a program of research and international cooperation to seek progress in this important area. 2 figs., 16 refs

  11. High performance work practices, innovation and performance

    DEFF Research Database (Denmark)

    Jørgensen, Frances; Newton, Cameron; Johnston, Kim

    2013-01-01

    Research spanning nearly 20 years has provided considerable empirical evidence for relationships between High Performance Work Practices (HPWPs) and various measures of performance including increased productivity, improved customer service, and reduced turnover. What stands out from......, and Africa to examine these various questions relating to the HPWP-innovation-performance relationship. Each paper discusses a practice that has been identified in HPWP literature and potential variables that can facilitate or hinder the effects of these practices of innovation- and performance...

  12. Validation of an ultra-high-performance liquid chromatography-tandem mass spectrometry method to quantify illicit drug and pharmaceutical residues in wastewater using accuracy profile approach.

    Science.gov (United States)

    Hubert, Cécile; Roosen, Martin; Levi, Yves; Karolak, Sara

    2017-06-02

    The analysis of biomarkers in wastewater has become a common approach to assess community behavior. This method is an interesting way to estimate illicit drug consumption in a given population: by using a back-calculation method, it is possible to quantify the amount of a specific drug used in a community and to assess the variation in consumption at different times and locations. Such a method needs reliable analytical data, since the determination of a concentration in the ng L−1 range in a complex matrix is difficult and not easily reproducible. The best analytical method is liquid chromatography-mass spectrometry after solid-phase extraction or on-line pre-concentration. Quality criteria are not specifically defined for this kind of determination. In this context, it was decided to develop a UHPLC-MS/MS method to analyze 10 illicit drugs and pharmaceuticals in wastewater treatment plant influent or effluent using an on-line pre-concentration system. A validation process was then carried out using the accuracy profile concept as an innovative tool to estimate the probability of getting prospective results within specified acceptance limits. Influent and effluent samples were spiked with known amounts of the 10 compounds and analyzed three times a day for three days in order to estimate intra-day and inter-day variations. The matrix effect was estimated for each compound. The developed method can provide at least 80% of results within ±25% limits, except for compounds that are degraded in influent. Copyright © 2017 Elsevier B.V. All rights reserved.
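
    The "back-calculation" step mentioned above is, in the wastewater-based epidemiology literature, usually a simple mass-balance scaling of measured concentration by daily flow, an excretion/molar-mass correction factor and the served population. The sketch below illustrates the idea only; the correction factor, flow and population are hypothetical placeholders, and real correction factors are compound-specific.

```python
# Hedged illustration of a wastewater back-calculation (not this study's code).
def drug_load_per_1000(conc_ng_per_l: float,
                       flow_l_per_day: float,
                       correction_factor: float,
                       population: int) -> float:
    """Return an estimated consumption in mg/day per 1000 inhabitants."""
    load_mg_per_day = conc_ng_per_l * flow_l_per_day * 1e-6   # ng -> mg
    return load_mg_per_day * correction_factor / population * 1000

# Example with invented numbers: 250 ng/L residue, 20 million L/day influent,
# correction factor 2.3 (excretion + molar-mass ratio), 50,000 inhabitants.
print(drug_load_per_1000(250, 2.0e7, 2.3, 50_000))   # ~230 mg/day/1000 inh.
```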

  13. Construct validity of the Individual Work Performance Questionnaire.

    OpenAIRE

    Koopmans, L.; Bernaards, C.M.; Hildebrandt, V.H.; Vet, H.C.W. de; Beek, A.J. van der

    2014-01-01

    Objective: To examine the construct validity of the Individual Work Performance Questionnaire (IWPQ). Methods: A total of 1424 Dutch workers from three occupational sectors (blue, pink, and white collar) participated in the study. First, IWPQ scores were correlated with related constructs (convergent validity). Second, differences between known groups were tested (discriminative validity). Results: First, IWPQ scores correlated weakly to moderately with absolute and relative presenteeism, and...

  14. A Simulation Approach for Performance Validation during Embedded Systems Design

    Science.gov (United States)

    Wang, Zhonglei; Haberl, Wolfgang; Herkersdorf, Andreas; Wechs, Martin

    Due to the time-to-market pressure, it is highly desirable to design hardware and software of embedded systems in parallel. However, hardware and software are developed mostly using very different methods, so that performance evaluation and validation of the whole system is not an easy task. In this paper, we propose a simulation approach to bridge the gap between model-driven software development and simulation based hardware design, by merging hardware and software models into a SystemC based simulation environment. An automated procedure has been established to generate software simulation models from formal models, while the hardware design is originally modeled in SystemC. As the simulation models are annotated with timing information, performance issues are tackled in the same pass as system functionality, rather than in a dedicated approach.

  15. Collaborative trial validation study of two methods, one based on high performance liquid chromatography-tandem mass spectrometry and on gas chromatography-mass spectrometry for the determination of acrylamide in bakery and potato products.

    Science.gov (United States)

    Wenzl, Thomas; Karasek, Lubomir; Rosen, Johan; Hellenaes, Karl-Erik; Crews, Colin; Castle, Laurence; Anklam, Elke

    2006-11-03

    A European inter-laboratory study was conducted to validate two analytical procedures for the determination of acrylamide in bakery ware (crispbreads, biscuits) and potato products (chips), within a concentration range from about 20 microg/kg to about 9000 microg/kg. The methods are based on gas chromatography-mass spectrometry (GC-MS) of the derivatised analyte and on high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) of native acrylamide. Isotope dilution with isotopically labelled acrylamide was an integral part of both methods. The study was evaluated according to internationally accepted guidelines. The performance of the HPLC-MS/MS method was found to be superior to that of the GC-MS method and to be fit for purpose.
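
    Both procedures rely on isotope dilution with labelled acrylamide. As a hedged illustration of the underlying principle only (not the exact calculation prescribed by either method, which would typically use a calibration of area ratios), the native acrylamide mass fraction can be approximated from the measured peak-area ratio and the spiked amount of labelled standard, assuming for illustration equal molar response of the native and labelled forms:

```latex
\[
  w_{\mathrm{acrylamide}} \;\approx\;
  \frac{A_{\mathrm{native}}}{A_{\mathrm{labelled}}}\,
  \frac{m_{\mathrm{IS}}}{m_{\mathrm{sample}}}
\]
```

    Here A denotes peak areas, m_IS the mass of labelled standard added, and m_sample the test-portion mass.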

  16. Development and Validation of a High Performance Thin-Layer Chromatographic Method for Determination of α-Mangostin in Fruit Pericarp of Mangosteen Plant (Garcinia mangostana L.) using Ultraviolet-Visible Detection

    Directory of Open Access Journals (Sweden)

    Himanshu Misra

    2009-10-01

    A simple, fast and precise high performance thin-layer chromatographic method has been developed for quantitative estimation of α-mangostin in fruit pericarp of Garcinia mangostana L. (Hypericaceae). The best solvent for extraction of α-mangostin was optimized after screening five solvents under the same conditions using hot solid-liquid extraction in a Soxhlet apparatus. Methanol and chloroform gave the highest and second highest recovery of α-mangostin, respectively. Plates were developed in chloroform-methanol in the ratio of 27:3 (v/v). Post-chromatographic derivatization was performed using anisaldehyde-sulphuric acid reagent and plates were scanned at 382 nm in ultraviolet-visible mode. The developed method was found to be linear in the range 1.0 to 5.0 mg spot-1; the limits of detection and quantitation were 150 and 450 ng spot-1, respectively. The developed method was validated in terms of system suitability, specificity and robustness.

  17. Python high performance programming

    CERN Document Server

    Lanaro, Gabriele

    2013-01-01

    An exciting, easy-to-follow guide illustrating the techniques to boost the performance of Python code, and their applications with plenty of hands-on examples.If you are a programmer who likes the power and simplicity of Python and would like to use this language for performance-critical applications, this book is ideal for you. All that is required is a basic knowledge of the Python programming language. The book will cover basic and advanced topics so will be great for you whether you are a new or a seasoned Python developer.

  18. High performance germanium MOSFETs

    Energy Technology Data Exchange (ETDEWEB)

    Saraswat, Krishna [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States)]. E-mail: saraswat@stanford.edu; Chui, Chi On [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Krishnamohan, Tejas [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Kim, Donghyun [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Nayfeh, Ammar [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Pethe, Abhijit [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States)

    2006-12-15

    Ge is a very promising material as a future channel material for nanoscale MOSFETs due to its high mobility and thus a higher source injection velocity, which translates into higher drive current and smaller gate delay. However, for Ge to become main-stream, surface passivation and heterogeneous integration of crystalline Ge layers on Si must be achieved. We have demonstrated growth of fully relaxed smooth single crystal Ge layers on Si using a novel multi-step growth and hydrogen anneal process without any graded buffer SiGe layer. Surface passivation of Ge has been achieved with its native oxynitride (GeOxNy) and high-permittivity (high-k) metal oxides of Al, Zr and Hf. High mobility MOSFETs have been demonstrated in bulk Ge with high-k gate dielectrics and metal gates. However, due to their smaller bandgap and higher dielectric constant, most high mobility materials suffer from large band-to-band tunneling (BTBT) leakage currents and worse short channel effects. We present novel, Si and Ge based heterostructure MOSFETs, which can significantly reduce the BTBT leakage currents while retaining high channel mobility, making them suitable for scaling into the sub-15 nm regime. Through full band Monte-Carlo, Poisson-Schrodinger and detailed BTBT simulations we show a dramatic reduction in BTBT and excellent electrostatic control of the channel, while maintaining very high drive currents in these highly scaled heterostructure DGFETs. Heterostructure MOSFETs with varying strained-Ge or SiGe thickness, Si cap thickness and Ge percentage were fabricated on bulk Si and SOI substrates. The ultra-thin (~2 nm) strained-Ge channel heterostructure MOSFETs exhibited >4x mobility enhancements over bulk Si devices and >10x BTBT reduction over surface channel strained SiGe devices.

  19. High performance germanium MOSFETs

    International Nuclear Information System (INIS)

    Saraswat, Krishna; Chui, Chi On; Krishnamohan, Tejas; Kim, Donghyun; Nayfeh, Ammar; Pethe, Abhijit

    2006-01-01

    Ge is a very promising material as future channel materials for nanoscale MOSFETs due to its high mobility and thus a higher source injection velocity, which translates into higher drive current and smaller gate delay. However, for Ge to become main-stream, surface passivation and heterogeneous integration of crystalline Ge layers on Si must be achieved. We have demonstrated growth of fully relaxed smooth single crystal Ge layers on Si using a novel multi-step growth and hydrogen anneal process without any graded buffer SiGe layer. Surface passivation of Ge has been achieved with its native oxynitride (GeO x N y ) and high-permittivity (high-k) metal oxides of Al, Zr and Hf. High mobility MOSFETs have been demonstrated in bulk Ge with high-k gate dielectrics and metal gates. However, due to their smaller bandgap and higher dielectric constant, most high mobility materials suffer from large band-to-band tunneling (BTBT) leakage currents and worse short channel effects. We present novel, Si and Ge based heterostructure MOSFETs, which can significantly reduce the BTBT leakage currents while retaining high channel mobility, making them suitable for scaling into the sub-15 nm regime. Through full band Monte-Carlo, Poisson-Schrodinger and detailed BTBT simulations we show a dramatic reduction in BTBT and excellent electrostatic control of the channel, while maintaining very high drive currents in these highly scaled heterostructure DGFETs. Heterostructure MOSFETs with varying strained-Ge or SiGe thickness, Si cap thickness and Ge percentage were fabricated on bulk Si and SOI substrates. The ultra-thin (∼2 nm) strained-Ge channel heterostructure MOSFETs exhibited >4x mobility enhancements over bulk Si devices and >10x BTBT reduction over surface channel strained SiGe devices

  20. Performance Validation of the ATLAS Muon Spectrometer

    CERN Document Server

    Mair, Katharina

    ATLAS (A Toroidal LHC ApparatuS) is a general-purpose experiment for the future Large Hadron Collider (LHC) at CERN, which is scheduled to begin operation in the year 2007, providing experiments with proton-proton collisions. The center-of-mass energy of 14 TeV and the design luminosity of 10³⁴ cm⁻²s⁻¹ will allow many new aspects of fundamental physics to be explored. The ATLAS Muon Spectrometer aims at a momentum resolution better than 10% for transverse momentum values ranging from pT = 6 GeV to pT = 1 TeV. Precision tracking will be performed by Ar-CO2-gas-filled Monitored Drift Tube chambers (MDTs), with a single-wire resolution of < 100 μm. In total, about 1200 chambers, arranged in a large structure, will allow muon track measurements over distances up to 15 m in a magnetic field of 0.5 T. Given the large size of the spectrometer it is impossible to keep the shape of the muon chambers and their positions stable within the requested tracking accuracy of 50 μm. Therefore the concept of an optical alig...

  1. High Performance Computing Multicast

    Science.gov (United States)

    2012-02-01


  2. NGINX high performance

    CERN Document Server

    Sharma, Rahul

    2015-01-01

    System administrators, developers, and engineers looking for ways to achieve maximum performance from NGINX will find this book beneficial. If you are looking for solutions such as how to handle more users from the same system or load your website pages faster, then this is the book for you.

  3. Development of a high performance liquid chromatography method ...

    African Journals Online (AJOL)

    Development of a high performance liquid chromatography method for simultaneous ... Purpose: To develop and validate a new low-cost high performance liquid chromatography (HPLC) method for ..... Several papers have reported the use of ...

  4. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different model validation techniques. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model, using an example taken from a management study.
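
    As a generic illustration of the kind of quantitative performance measures such a validation procedure can rest on (not the specific procedure proposed in the paper), the sketch below fits a logistic regression model on synthetic data and evaluates discrimination and calibration on a held-out set.

```python
# Generic logistic-regression validation sketch on synthetic data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, brier_score_loss
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
p = model.predict_proba(X_te)[:, 1]          # predicted probabilities on held-out data

print("AUC  :", roc_auc_score(y_te, p))      # discrimination
print("Brier:", brier_score_loss(y_te, p))   # calibration / overall accuracy
```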

  5. High performance proton accelerators

    International Nuclear Information System (INIS)

    Favale, A.J.

    1989-01-01

    In concert with this theme, this paper briefly outlines how Grumman, over the past 4 years, has evolved from a company that designed and fabricated a Radio Frequency Quadrupole (RFQ) accelerator from Los Alamos National Laboratory (LANL) physics and specifications to a company that, as prime contractor, is designing, fabricating, assembling and commissioning the US Army Strategic Defense Command's (USA SDC) Continuous Wave Deuterium Demonstrator (CWDD) accelerator as a turn-key operation. In the case of the RFQ, LANL scientists performed the physics analysis, established the specifications, supported Grumman on the mechanical design, conducted the RFQ tuning and tested the RFQ at their laboratory. For the CWDD Program, Grumman has responsibility for the physics and engineering designs, assembly, testing and commissioning, albeit with the support of consultants from LANL, Lawrence Berkeley Laboratory (LBL) and Brookhaven National Laboratory. In addition, Culham Laboratory and LANL are team members on CWDD. LANL scientists, as well as a USA SDC review board, have reviewed the physics design. 9 figs

  6. Validation of NCSSHP for highly enriched uranium systems containing beryllium

    International Nuclear Information System (INIS)

    Krass, A.W.; Elliott, E.P.; Tollefson, D.A.

    1994-01-01

    This document describes the validation of KENO V.a using the 27-group ENDF/B-IV cross section library for highly enriched uranium and beryllium neutronic systems, and is in accordance with ANSI/ANS-8.1-1983(R1988) requirements for calculational methods. The validation has been performed on a Hewlett Packard 9000/Series 700 Workstation at the Oak Ridge Y-12 Plant Nuclear Criticality Safety Department using the Oak Ridge Y-12 Plant Nuclear Criticality Safety Software code package. Critical experiments from LA-2203, UCRL-4975, ORNL-2201, and ORNL/ENG-2 have been identified as having the constituents desired for this validation as well as sufficient experimental detail to allow accurate construction of KENO V.a calculational models. The results of these calculations establish the safety criteria to be employed in future calculational studies of these types of systems

  7. Optimization and validation of high performance liquid ...

    African Journals Online (AJOL)

    from jugular vein of the rabbits after drug administration and analysed by HPLC. ... Metoprolol quantification in plasma, urine and ... preparation of biological samples. .... centrifugation and stored at -70 oC in an ultra- .... The main problem.

  8. Validated method for the determination of perfluorinated compounds in placental tissue samples based on a simple extraction procedure followed by ultra-high performance liquid chromatography-tandem mass spectrometry analysis.

    Science.gov (United States)

    Martín, J; Rodríguez-Gómez, R; Zafra-Gómez, A; Alonso, E; Vílchez, J L; Navalón, A

    2016-04-01

    Xenobiotic exposure during pregnancy is inevitable. Determination of perfluorinated compounds (PFCs), chemicals described as environmental contaminants by public health authorities due to their persistence, bioaccumulation and toxicity, is a challenge. In the present work, a method based on a simplified sample treatment involving freeze-drying, solvent extraction and dispersive clean-up of the extracts using C18 sorbents, followed by ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) analysis, was developed and validated for the determination of five perfluorinated carboxylic acids (C4-C8) and perfluorooctane sulfonate (PFOS) in placental tissue samples. The most influential parameters affecting the extraction method and clean-up were optimized using Design of Experiments (DOE). The method was validated using matrix-matched calibration. The limits of detection (LODs) ranged from 0.03 to 2 ng g−1 and the limits of quantification (LOQs) from 0.08 to 6 ng g−1, while inter- and intra-day variability was under 14% in all cases. Recovery rates for spiked samples ranged from 94% to 113%. The method was satisfactorily applied to the determination of these compounds in human placental tissue samples collected at delivery from 25 randomly selected women. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Validation of a reversed-phase high-performance liquid chromatographic method for the determination of free amino acids in rice using l-theanine as the internal standard.

    Science.gov (United States)

    Liyanaarachchi, G V V; Mahanama, K R R; Somasiri, H P P S; Punyasiri, P A N

    2018-02-01

    The study presents the validation results of a method for the analysis of free amino acids (FAAs) in rice using l-theanine as the internal standard (IS) with o-phthalaldehyde (OPA) reagent and high-performance liquid chromatography-fluorescence detection. The detection and quantification limits of the method were in the ranges 2-16 μmol/kg and 3-19 μmol/kg, respectively. The method had a wide working range, from 25 to 600 μmol/kg for each individual amino acid, and good linearity, with regression coefficients greater than 0.999. Precision, measured in terms of repeatability and reproducibility and expressed as percentage relative standard deviation (% RSD), was below 9% for all the amino acids analyzed. The recoveries obtained after fortification at three concentration levels were in the range 75-105%. In comparison to l-norvaline, the findings revealed that l-theanine is suitable as an IS and the validated method can be used for FAA determination in rice. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Multilaboratory Validation of First Action Method 2016.04 for Determination of Four Arsenic Species in Fruit Juice by High-Performance Liquid Chromatography-Inductively Coupled Plasma-Mass Spectrometry.

    Science.gov (United States)

    Kubachka, Kevin; Heitkemper, Douglas T; Conklin, Sean

    2017-07-01

    Before being designated AOAC First Action Official MethodSM 2016.04, the U.S. Food and Drug Administration's method, EAM 4.10 High Performance Liquid Chromatography-Inductively Coupled Plasma-Mass Spectrometric Determination of Four Arsenic Species in Fruit Juice, underwent both a single-laboratory validation and a multilaboratory validation (MLV) study. Three federal and five state regulatory laboratories participated in the MLV study, which is the primary focus of this manuscript. The method was validated for inorganic arsenic (iAs), measured as the sum of the two iAs species arsenite [As(III)] and arsenate [As(V)], dimethylarsinic acid (DMA), and monomethylarsonic acid (MMA) by analyses of 13 juice samples, including three apple juice, three apple juice concentrate, four grape juice, and three pear juice samples. In addition, two water Standard Reference Materials (SRMs) were analyzed. The method LODs and LOQs obtained among the eight laboratories were approximately 0.3 and 2 ng/g, respectively, for each of the analytes and were adequate for the intended purpose of the method. Each laboratory analyzed method blanks, fortified method blanks, reference materials, triplicate portions of each juice sample, and duplicate fortified juice samples (one for each matrix type) at three fortification levels. In general, the repeatability and reproducibility of the method were ≤15% RSD for each species present at a concentration >LOQ. The average recovery of fortified analytes for all laboratories ranged from 98 to 104% for iAs, DMA, and MMA across all four juice sample matrixes. The average iAs results for SRMs 1640a and 1643e agreed within 96-98% of the certified values.
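
    The repeatability, reproducibility and recovery figures above come from straightforward summary statistics on replicate fortified samples. A minimal, hypothetical computation of that kind is sketched below; the spike level and replicate values are invented and do not reproduce the study data.

```python
# Illustrative spike-recovery and %RSD calculation (hypothetical numbers).
import statistics

spiked_level = 10.0                       # ng/g of inorganic arsenic added (invented)
measured = [9.8, 10.3, 10.1, 9.9, 10.4]   # replicate results, ng/g (invented)

recovery = statistics.mean(measured) / spiked_level * 100
rsd = statistics.stdev(measured) / statistics.mean(measured) * 100
print(f"mean recovery = {recovery:.1f}%, RSD = {rsd:.1f}%")
```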

  11. Development and Validation of the Basketball Offensive Game Performance Instrument

    Science.gov (United States)

    Chen, Weiyun; Hendricks, Kristin; Zhu, Weimo

    2013-01-01

    The purpose of this study was to design and validate the Basketball Offensive Game Performance Instrument (BOGPI) that assesses an individual player's offensive game performance competency in basketball. Twelve physical education teacher education (PETE) students playing two 10-minute, 3 vs. 3 basketball games were videotaped at end of a…

  12. Validation of Fuel Performance Uncertainty for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Nam-Gyu; Yoo, Jong-Sung; Jung, Yil-Sup [KEPCO Nuclear Fuel Co., Daejeon (Korea, Republic of)

    2016-10-15

    To achieve this, the computer code performance has to be validated against experimental results, and for the uncertainty quantification, important uncertainty parameters need to be selected and the combined uncertainty has to be evaluated with an acceptable statistical treatment. Important uncertainty parameters for rod performance, such as fuel enthalpy, fission gas release and cladding hoop strain, were chosen through rigorous sensitivity studies, and their validity has been assessed by utilizing the experimental results of tests performed in CABRI and NSRR. Analysis results revealed that several tested rods were not bounded within the combined fuel performance uncertainty. An assessment of fuel performance with an extended fuel power uncertainty for rods tested in NSRR and CABRI has been performed. Analysis results showed that several tested rods were not bounded within the calculated fuel performance uncertainty. This implies that the currently considered uncertainty range of the parameters is not sufficient to cover the fuel performance.
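
    The abstract refers to evaluating a combined uncertainty with "an acceptable statistical treatment" without giving the formula. One standard treatment, shown here only as a hedged sketch and not necessarily the one applied in this work, propagates independent parameter uncertainties u(x_i) to a response R (for example fuel enthalpy) in quadrature:

```latex
\[
  u_c(R) \;=\; \sqrt{\sum_i \left(\frac{\partial R}{\partial x_i}\right)^{2} u^{2}(x_i)}
\]
```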

  13. Effort, symptom validity testing, performance validity testing and traumatic brain injury.

    Science.gov (United States)

    Bigler, Erin D

    2014-01-01

    To understand the neurocognitive effects of brain injury, valid neuropsychological test findings are paramount. This review examines the research on what has been referred to as symptom validity testing (SVT). Performance above a designated cut-score signifies a 'passing' SVT result, which is likely the best indicator of valid neuropsychological test findings. Likewise, performance substantially below the cut-point, nearing or at chance, signifies invalid test performance. Performance significantly below chance is the sine qua non neuropsychological indicator of malingering. However, the interpretative problems with SVT performance below the cut-point yet far above chance are substantial, as pointed out in this review. This intermediate, border-zone performance on SVT measures is where substantial interpretative challenges exist. Case studies are used to highlight the many areas where additional research is needed. Historical perspectives are reviewed along with the neurobiology of effort. Reasons why performance validity testing (PVT) may be a better term than SVT are reviewed. Advances in neuroimaging techniques may be key to better understanding the meaning of border-zone SVT failure. The review demonstrates the problems with rigid interpretation of established cut-scores. A better understanding is needed of how certain types of neurological, neuropsychiatric and/or test conditions may affect SVT performance.

  14. Computer code validation by high temperature chemistry

    International Nuclear Information System (INIS)

    Alexander, C.A.; Ogden, J.S.

    1988-01-01

    At least five of the computer codes utilized in the analysis of severe fuel damage-type events are directly dependent upon, or can be verified by, high temperature chemistry. These codes are ORIGEN, CORSOR, CORCON, VICTORIA, and VANESA. With the exception of CORCON and VANESA, it is necessary that verification experiments be performed on real irradiated fuel. For ORIGEN, the familiar Knudsen effusion cell is the best choice: a small piece of known mass and known burn-up is selected and volatilized completely into the mass spectrometer. The mass spectrometer is used in the integral mode to integrate the entire signal from preselected radionuclides, and from this integrated signal the total mass of the respective nuclides can be determined. For CORSOR and VICTORIA, experiments in which flowing high-pressure hydrogen/steam passes over the irradiated fuel and then enters the mass spectrometer are required. For these experiments, a high pressure-high temperature molecular beam inlet must be employed. Finally, in support of VANESA-CORCON, the very highest temperature and molten fuels must be contained and analyzed. Results from all types of experiments will be discussed and their applicability to present and future code development will also be covered.

  15. Use of the color trails test as an embedded measure of performance validity.

    Science.gov (United States)

    Henry, George K; Algina, James

    2013-01-01

    One hundred personal injury litigants and disability claimants referred for a forensic neuropsychological evaluation were administered both portions of the Color Trails Test (CTT) as part of a more comprehensive battery of standardized tests. Subjects who failed two or more free-standing tests of cognitive performance validity formed the Failed Performance Validity (FPV) group, while subjects who passed all free-standing performance validity measures were assigned to the Passed Performance Validity (PPV) group. A cutscore of ≥45 seconds to complete Color Trails 1 (CT1) was associated with a classification accuracy of 78%, good sensitivity (66%) and high specificity (90%), while a cutscore of ≥84 seconds to complete Color Trails 2 (CT2) was associated with a classification accuracy of 82%, good sensitivity (74%) and high specificity (90%). A CT1 cutscore of ≥58 seconds, and a CT2 cutscore ≥100 seconds was associated with 100% positive predictive power at base rates from 20 to 50%.
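
    The reported link between base rate and positive predictive power follows directly from Bayes' rule. The sketch below reproduces that relationship using the CT1 sensitivity (66%) and specificity (90%) quoted above, over base rates spanning the 20-50% range mentioned for the higher cut-scores; it is an illustration of the arithmetic, not a reanalysis of the study data.

```python
# Positive predictive value (PPV) from sensitivity, specificity and base rate,
# showing how PPV rises with the base rate of invalid performance.
def ppv(sensitivity: float, specificity: float, base_rate: float) -> float:
    true_pos = sensitivity * base_rate
    false_pos = (1 - specificity) * (1 - base_rate)
    return true_pos / (true_pos + false_pos)

for base_rate in (0.2, 0.3, 0.5):
    print(f"base rate {base_rate:.0%}: PPV = {ppv(0.66, 0.90, base_rate):.2f}")
```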

  16. Single-laboratory validation of a high-performance liquid chromatographic-diode array detector-fluorescence detector/mass spectrometric method for simultaneous determination of water-soluble vitamins in multivitamin dietary tablets.

    Science.gov (United States)

    Chen, Pei; Atkinson, Renata; Wolf, Wayne R

    2009-01-01

    The purpose of this study was to develop a single-laboratory validated (SLV) method using high-performance liquid chromatography with different detectors [diode array detector (DAD); fluorescence detector (FLD); and mass spectrometry (MS)] for determination of 7 B-complex vitamins (B1-thiamin, B2-riboflavin, B3-nicotinamide, B6-pyridoxine, B9-folic acid, pantothenic acid, and biotin) and vitamin C in multivitamin/multimineral dietary supplements. The method involves the use of a reversed-phase octadecylsilyl column (4 microm, 250 x 2.0 mm id) and a gradient mobile phase profile. Gradient elution was performed at a flow rate of 0.25 mL/min. After a 5 min isocratic elution at 100% A (0.1% formic acid in water), a linear gradient to 50% A and 50% B (0.1% formic acid in acetonitrile) at 15 min was employed. Detection was performed with a DAD as well as either an FLD or a triple-quadrupole MS detector in the multiple reaction monitoring mode. SLV was performed using Standard Reference Material (SRM) 3280 Multivitamin/Multimineral Tablets, being developed by the National Institute of Standards and Technology, with support from the Office of Dietary Supplements of the National Institutes of Health. Phosphate buffer (10 mM, pH 2.0) extracts of the NIST SRM 3280 were analyzed by the LC-DAD-FLD/MS method. Following extraction, the method does not require any sample cleanup/preconcentration steps except centrifugation and filtration.

  17. The Development and Validation of a Rubric to Enhance Performer Feedback for Undergraduate Vocal Solo Performance

    Science.gov (United States)

    Herrell, Katherine A.

    2014-01-01

    This is a study of the development and validation of a rubric to enhance performer feedback for undergraduate vocal solo performance. In the literature, assessment of vocal performance is under-represented, and the value of feedback from the assessment of musical performances, from the point of view of the performer, is nonexistent. The research…

  18. Construct validity of the Individual Work Performance Questionnaire.

    NARCIS (Netherlands)

    Koopmans, L.; Bernaards, C.M.; Hildebrandt, V.H.; Vet, H.C.W. de; Beek, A.J. van der

    2014-01-01

    Objective: To examine the construct validity of the Individual Work Performance Questionnaire (IWPQ). Methods: A total of 1424 Dutch workers from three occupational sectors (blue, pink, and white collar) participated in the study. First, IWPQ scores were correlated with related constructs

  19. Construct validity of the individual work performance questionnaire

    NARCIS (Netherlands)

    Koopmans, L.; Bernaards, C.M.; Hildebrandt, V.H.; Vet, H.C.W. de; Beek, A.J. van der

    2014-01-01

    OBJECTIVE:: To examine the construct validity of the Individual Work Performance Questionnaire (IWPQ). METHODS:: A total of 1424 Dutch workers from three occupational sectors (blue, pink, and white collar) participated in the study. First, IWPQ scores were correlated with related constructs

  20. High Performance Networks for High Impact Science

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Mary A.; Bair, Raymond A.

    2003-02-13

    This workshop was the first major activity in developing a strategic plan for high-performance networking in the Office of Science. Held August 13 through 15, 2002, it brought together a selection of end users, especially representing the emerging, high-visibility initiatives, and network visionaries to identify opportunities and begin defining the path forward.

  1. Simultaneous quantification of acetaminophen and five acetaminophen metabolites in human plasma and urine by high-performance liquid chromatography-electrospray ionization-tandem mass spectrometry: Method validation and application to a neonatal pharmacokinetic study.

    Science.gov (United States)

    Cook, Sarah F; King, Amber D; van den Anker, John N; Wilkins, Diana G

    2015-12-15

    Drug metabolism plays a key role in acetaminophen (paracetamol)-induced hepatotoxicity, and quantification of acetaminophen metabolites provides critical information about factors influencing susceptibility to acetaminophen-induced hepatotoxicity in clinical and experimental settings. The aims of this study were to develop, validate, and apply high-performance liquid chromatography-electrospray ionization-tandem mass spectrometry (HPLC-ESI-MS/MS) methods for simultaneous quantification of acetaminophen, acetaminophen-glucuronide, acetaminophen-sulfate, acetaminophen-glutathione, acetaminophen-cysteine, and acetaminophen-N-acetylcysteine in small volumes of human plasma and urine. In the reported procedures, acetaminophen-d4 and acetaminophen-d3-sulfate were utilized as internal standards (IS). Analytes and IS were recovered from human plasma (10μL) by protein precipitation with acetonitrile. Human urine (10μL) was prepared by fortification with IS followed only by sample dilution. Calibration concentration ranges were tailored to literature values for each analyte in each biological matrix. Prepared samples from plasma and urine were analyzed under the same HPLC-ESI-MS/MS conditions, and chromatographic separation was achieved through use of an Agilent Poroshell 120 EC-C18 column with a 20-min run time per injected sample. The analytes could be accurately and precisely quantified over 2.0-3.5 orders of magnitude. Across both matrices, mean intra- and inter-assay accuracies ranged from 85% to 112%, and intra- and inter-assay imprecision did not exceed 15%. Validation experiments included tests for specificity, recovery and ionization efficiency, inter-individual variability in matrix effects, stock solution stability, and sample stability under a variety of storage and handling conditions (room temperature, freezer, freeze-thaw, and post-preparative). The utility and suitability of the reported procedures were illustrated by analysis of pharmacokinetic samples

  2. Validation of a High-Performance Liquid Chromatography method for the determination of vitamin A, vitamin D3, vitamin E and benzyl alcohol in a veterinary oily injectable solution

    Directory of Open Access Journals (Sweden)

    Maria Neagu

    2015-06-01

    A new simple, rapid, accurate and precise high-performance liquid chromatography (HPLC) method for determination of vitamin A, vitamin D3, vitamin E and benzyl alcohol in an oily injectable solution was developed and validated. The method can be used for the detection and quantification of known and unknown impurities and degradants in the drug substance during routine analysis and also for stability studies, in view of its capability to separate degradation products. The method was validated for accuracy, precision, specificity, robustness and quantification limits according to ICH guidelines. The estimation of vitamin A, vitamin D3, vitamin E and benzyl alcohol was performed on a Waters HPLC system using a gradient pump system. The chromatographic conditions comprised a reversed-phase C18 column (5 µm particle size, 250 mm × 4.6 mm i.d.) with a mobile phase consisting of tetrahydrofuran, acetonitrile and water in gradient elution. The flow rate was 0.8 ml/min and 2.0 ml/min. Standard curves were linear over the concentration ranges of 16.50 µg/ml to 11.00 mg/ml for vitamin A, 10.05 µg/ml to 6.70 mg/ml for vitamin E, 0.075 µg/ml to 0.050 mg/ml for vitamin D3 and 1.25 mg/ml to 5.00 mg/ml for benzyl alcohol. Statistical analyses showed that the method was precise, reproducible, selective, specific and accurate for the analysis of vitamin A, vitamin D3, vitamin E, benzyl alcohol and impurities.

  3. Validation of Sizewell ''B'' ultrasonic inspections -- Messages for performance demonstration

    International Nuclear Information System (INIS)

    Conroy, P.J.; Leyland, K.S.; Waites, C.

    1994-01-01

    At the time that the decisions leading to the construction of the Sizewell 'B' plant were being made, public concern over the potential hazards of nuclear power was increasing. This concern was heightened by the accident at the USA's Three Mile Island plant. The result of this, and of public pressure, was that an extensive public inquiry was held in addition to the UK's normal licensing process. Part of the evidence to the inquiry supporting the safety case relied upon the ability of ultrasonic inspections to demonstrate that the Reactor Pressure Vessel (RPV) and other key components were free from defects that could threaten structural integrity. Evidence from a variety of trials designed to investigate the performance capability of ultrasonic inspection revealed that, although ultrasonic inspection had the potential to satisfy this requirement, its performance in practice was heavily dependent upon the details of application. It was therefore generally recognized that some form of inspection validation was required to provide assurance that the equipment, procedures and operators to be employed were adequate for the purpose. The concept of inspection validation was therefore included in the safety case for the licensing of Sizewell 'B'. The UK validation trials covering the ultrasonic inspections of the Sizewell 'B' PWR Reactor Pressure Vessel are now nearing completion. This paper summarizes the results of the RPV validations and considers some of the implications for ASME XI Appendix VIII, the US code covering performance demonstration.

  4. ADVANCED HIGH PERFORMANCE SOLID WALL BLANKET CONCEPTS

    International Nuclear Information System (INIS)

    WONG, CPC; MALANG, S; NISHIO, S; RAFFRAY, R; SAGARA, S

    2002-01-01

    First wall and blanket (FW/blanket) design is a crucial element in the performance and acceptance of a fusion power plant. High-temperature structural and breeding materials are needed for high thermal performance. A suitable combination of structural design with the selected materials is necessary for D-T fuel sufficiency. Whenever possible, low-afterheat, low-chemical-reactivity and low-activation materials are desired to achieve passive safety and minimize the amount of high-level waste. Of course, the selected fusion FW/blanket design will have to match the operational scenarios of high performance plasma. The key characteristics of eight advanced high performance FW/blanket concepts are presented in this paper. Design configurations, performance characteristics, unique advantages and issues are summarized. All reviewed designs can satisfy most of the necessary design goals. For further development, in concert with the advancement in plasma control and scrape-off layer physics, additional emphasis will be needed in the areas of first wall coating material selection, design of plasma stabilization coils, and consideration of reactor startup and transient events. To validate the projected performance of the advanced FW/blanket concepts, the critical element is the need for 14 MeV neutron irradiation facilities for the generation of necessary engineering design data and the prediction of FW/blanket component lifetime and availability.

  5. RavenDB high performance

    CERN Document Server

    Ritchie, Brian

    2013-01-01

    RavenDB High Performance is a comprehensive yet concise tutorial. It is for developers and software architects who are designing systems in order to achieve high performance right from the start. A basic understanding of RavenDB is recommended, but not required. While the book focuses on advanced topics, it does not assume that the reader has a great deal of prior knowledge of working with RavenDB.

  6. Multiresidue screening method for detection of benzimidazoles and their metabolites in liver and muscle by high-performance liquid chromatography: method development and validation according to Commission Decision 2002/657/EC

    Directory of Open Access Journals (Sweden)

    Marilena Gili

    2014-02-01

    The use of veterinary drugs may cause the presence of residues in food of animal origin if appropriate withdrawal periods are not respected. A high-performance liquid chromatography (HPLC) method has been developed for the simultaneous detection of 11 benzimidazole residues, including metabolites – albendazole, albendazole sulphoxide, albendazole sulphone, fenbendazole, fenbendazole sulphoxide (oxfendazole), fenbendazole sulphone, flubendazole, mebendazole, oxibendazole, thiabendazole, 5-hydroxythiabendazole – in bovine, ovine, equine, swine, rabbit and poultry liver and in bovine, swine and fish muscle. After extraction with a dichloromethane/acetonitrile solution (35/65, v/v) containing 5% ammonium hydroxide, the solvent was evaporated to dryness, the residue was dissolved in 0.1 M HCl, defatted with hexane, purified on a strong cation exchange solid-phase extraction cartridge and analysed by HPLC with diode array and fluorescence detectors. The method was validated as a qualitative screening method by evaluating, according to Commission Decision 2002/657/EC criteria, specificity, CCβ and β error at a cut-off level of 25 mg/kg, and ruggedness.

  7. Validation and application of a high-performance liquid chromatography-tandem mass spectrometric method for simultaneous quantification of lopinavir and ritonavir in human plasma using semi-automated 96-well liquid-liquid extraction.

    Science.gov (United States)

    Wang, Perry G; Wei, Jack S; Kim, Grace; Chang, Min; El-Shourbagy, Tawakol

    2006-10-20

    Kaletra is an important antiretroviral drug developed by Abbott Laboratories. It is composed of lopinavir (low-pin-a-veer) and ritonavir (ri-toe-na-veer). Both are human immunodeficiency virus (HIV) protease inhibitors and have substantially reduced the morbidity and mortality associated with HIV-1 infection. We have developed and validated an assay, using liquid chromatography coupled with atmospheric pressure chemical ionization tandem mass spectrometry (LC/MS/MS), for the routine quantification of lopinavir and ritonavir in human plasma, in which lopinavir and ritonavir can be simultaneously analyzed with high throughput. The sample preparation consisted of liquid-liquid extraction with a mixture of hexane:ethyl acetate (1:1, v/v), using 100 µL of plasma. Chromatographic separation was performed on a Waters Symmetry C18 column (150 mm x 3.9 mm, particle size 5 µm) with reversed-phase isocratic elution, using a mobile phase of 70:30 (v/v) acetonitrile: 2 mM ammonium acetate aqueous solution containing 0.01% formic acid (v/v) at a flow rate of 1.0 mL/min. A Waters Symmetry C18 guard column (20 mm x 3.9 mm, particle size 5 µm) was connected prior to the analytical column, and a guard column back wash was performed to reduce analytical column contamination using a mixture of tetrahydrofuran (THF), methanol and water (45:45:10, v/v/v). The analytical run time was 4 min. The use of a 96-well plate autosampler allowed a batch size of up to 73 study samples. A triple-quadrupole mass spectrometer was operated in positive ion mode and multiple reaction monitoring (MRM) was used for drug quantification. The method was validated over the concentration ranges of 19-5,300 ng/mL for lopinavir and 11-3,100 ng/mL for ritonavir. A-86093 was used as an internal standard (I.S.). The relative standard deviation (RSD) was <6% for both lopinavir and ritonavir. Mean accuracies were within the designed limits (±15%). The robust and rapid LC

  8. Validation of an assay for quantification of free normetanephrine, metanephrine and methoxytyramine in plasma by high performance liquid chromatography with coulometric detection: Comparison of peak-area vs. peak-height measurements.

    Science.gov (United States)

    Nieć, Dawid; Kunicki, Paweł K

    2015-10-01

    Measurements of plasma concentrations of free normetanephrine (NMN), metanephrine (MN) and methoxytyramine (MTY) constitute the most diagnostically accurate screening test for pheochromocytomas and paragangliomas. The aim of this article is to present the results from a validation of an analytical method utilizing high performance liquid chromatography with coulometric detection (HPLC-CD) for quantifying plasma free NMN, MN and MTY. Additionally, peak integration by height and area and the use of one calibration curve for all batches or an individual calibration curve for each batch of samples were explored to determine the optimal approach with regard to accuracy and precision. The method was validated using charcoal-stripped plasma spiked with solutions of NMN, MN, MTY and internal standard (4-hydroxy-3-methoxybenzylamine), with the exception of selectivity, which was evaluated by analysis of real plasma samples. Calibration curve performance, accuracy, precision and recovery were determined following both peak-area and peak-height measurements and the obtained results were compared. The most accurate and precise method of calibration was evaluated by analyzing quality control samples at three concentration levels in 30 analytical runs. The detector response was linear over the entire tested concentration range from 10 to 2000 pg/mL with R² ≥ 0.9988. The LLOQ was 10 pg/mL for each analyte of interest. To improve accuracy for measurements at low concentrations, a weighted (1/amount) linear regression model was employed, which resulted in inaccuracies of -2.48 to 9.78% and 0.22 to 7.81% following peak-area and peak-height integration, respectively. The imprecisions ranged from 1.07 to 15.45% and from 0.70 to 11.65% for peak-area and peak-height measurements, respectively. The optimal approach to calibration was the one utilizing an individual calibration curve for each batch of samples and peak-height measurements. It was characterized by inaccuracies ranging from -3
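
    A minimal sketch of the weighted (1/amount) linear calibration and back-calculated inaccuracy described above; the calibrator levels, responses and QC values are hypothetical, not the authors' data.

```python
# Weighted (1/amount) linear calibration sketch; all numbers are hypothetical.
import numpy as np

def weighted_linear_fit(x, y, w):
    """Weighted least-squares fit y = a*x + b with weights w."""
    W = np.sum(w)
    xw = np.sum(w * x) / W
    yw = np.sum(w * y) / W
    a = np.sum(w * (x - xw) * (y - yw)) / np.sum(w * (x - xw) ** 2)
    b = yw - a * xw
    return a, b

# Hypothetical calibrators (pg/mL) and detector responses (peak height)
conc = np.array([10, 50, 100, 500, 1000, 2000], dtype=float)
resp = np.array([0.021, 0.104, 0.212, 1.03, 2.08, 4.12])

a, b = weighted_linear_fit(conc, resp, w=1.0 / conc)   # 1/amount weighting

# Back-calculate a QC sample and report its inaccuracy (%)
qc_conc, qc_resp = 100.0, 0.205
estimated = (qc_resp - b) / a
print(f"slope={a:.5f}, intercept={b:.5f}")
print(f"QC inaccuracy = {(estimated - qc_conc) / qc_conc * 100:+.2f} %")
```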

  9. Verification and validation guidelines for high integrity systems. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D. [SoHaR, Inc., Beverly Hills, CA (United States)

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  10. Verification and validation guidelines for high integrity systems. Volume 1

    International Nuclear Information System (INIS)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities

  11. High-Performance Operating Systems

    DEFF Research Database (Denmark)

    Sharp, Robin

    1999-01-01

    Notes prepared for the DTU course 49421 "High Performance Operating Systems". The notes deal with quantitative and qualitative techniques for use in the design and evaluation of operating systems in computer systems for which performance is an important parameter, such as real-time applications, communication systems and multimedia systems.

  12. Multiphysics modelling and experimental validation of high concentration photovoltaic modules

    International Nuclear Information System (INIS)

    Theristis, Marios; Fernández, Eduardo F.; Sumner, Mike; O'Donovan, Tadhg S.

    2017-01-01

    Highlights: • A multiphysics modelling approach for concentrating photovoltaics was developed. • An experimental campaign was conducted to validate the models. • The experimental results were in good agreement with the models. • The multiphysics modelling allows the concentrator’s optimisation. - Abstract: High concentration photovoltaics, equipped with high efficiency multijunction solar cells, have great potential in achieving cost-effective and clean electricity generation at utility scale. Such systems are more complex than conventional photovoltaics because of the coupled multiphysics effects involved. Modelling the power output of such systems is therefore crucial for their further market penetration. Following this line, a multiphysics modelling procedure for high concentration photovoltaics is presented in this work. It combines an open source spectral model, a single diode electrical model and a three-dimensional finite element thermal model. In order to validate the models and the multiphysics modelling procedure against actual data, an outdoor experimental campaign was conducted in Albuquerque, New Mexico using a high concentration photovoltaic monomodule that is thoroughly described in terms of its geometry and materials. The experimental results were in good agreement (within 2.7%) with the predicted maximum power point. This multiphysics approach is more complex than empirical models, but besides the overall performance prediction it can also provide a better understanding of the physics involved in the conversion of solar irradiance into electricity. It can therefore be used for the design and optimisation of high concentration photovoltaic modules.
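
    As a hedged illustration of the electrical part of such a multiphysics chain, the sketch below solves a single-diode cell model and sweeps voltage to locate the maximum power point. The parameter values are purely illustrative, not the authors' monomodule data.

```python
# Illustrative single-diode electrical model; parameters are hypothetical.
import numpy as np

def cell_current(v, iph, i0, rs, rsh, n, t_cell=298.15):
    """Solve the implicit single-diode equation for current at voltage v."""
    vt = 1.380649e-23 * t_cell / 1.602176634e-19   # thermal voltage kT/q
    def residual(i):
        vd = v + i * rs
        return iph - i0 * np.expm1(vd / (n * vt)) - vd / rsh - i
    lo, hi = -iph, 2 * iph          # bracket the root and bisect
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if residual(lo) * residual(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Sweep voltage to locate the maximum power point of a hypothetical CPV cell
volts = np.linspace(0.0, 4.0, 400)
currents = np.array([cell_current(v, iph=5.0, i0=1e-18, rs=0.02,
                                  rsh=500.0, n=3.5) for v in volts])
power = volts * currents
k = int(np.argmax(power))
print(f"Pmp ≈ {power[k]:.2f} W at Vmp ≈ {volts[k]:.2f} V, Imp ≈ {currents[k]:.2f} A")
```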

  13. A fuel performance code TRUST VIc and its validation

    Energy Technology Data Exchange (ETDEWEB)

    Ishida, M; Kogai, T [Nippon Nuclear Fuel Development Co. Ltd., Oarai, Ibaraki (Japan)

    1997-08-01

    This paper describes a fuel performance code, TRUST V1c, developed to analyze the thermal and mechanical behavior of an LWR fuel rod. Submodels in the code include FP gas models describing gaseous swelling, gas release from the pellet and axial gas mixing. The code has an FEM-based structure to handle the interaction between the thermal and mechanical submodels introduced by the gas models. The code is validated against irradiation data of fuel centerline temperature, FGR, pellet porosity and cladding deformation. (author). 9 refs, 8 figs.
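
    For context, the sketch below gives a much-simplified, steady-state analytic estimate of fuel centreline temperature, the kind of quantity a code such as TRUST is validated against. Constant conductivities and all parameter values are assumptions, not data from the paper.

```python
# Simplified steady-state estimate of fuel centreline temperature.
# Constant conductivities and the parameter values are assumptions.
import math

q_lin   = 25e3      # linear heat rate, W/m (assumed)
k_fuel  = 3.0       # UO2 thermal conductivity, W/m/K (assumed constant)
h_gap   = 5e3       # pellet-cladding gap conductance, W/m^2/K (assumed)
k_clad  = 16.0      # cladding conductivity, W/m/K (assumed)
h_cool  = 3e4       # coolant heat-transfer coefficient, W/m^2/K (assumed)
r_fuel, r_ci, r_co = 4.1e-3, 4.2e-3, 4.75e-3   # radii, m (assumed)
t_cool  = 560.0     # bulk coolant temperature, K (assumed)

dt_cool = q_lin / (2 * math.pi * r_co * h_cool)            # coolant film drop
dt_clad = q_lin * math.log(r_co / r_ci) / (2 * math.pi * k_clad)
dt_gap  = q_lin / (2 * math.pi * r_ci * h_gap)             # gap drop
dt_fuel = q_lin / (4 * math.pi * k_fuel)                   # pellet drop

t_centre = t_cool + dt_cool + dt_clad + dt_gap + dt_fuel
print(f"Estimated centreline temperature ≈ {t_centre:.0f} K")
```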

  14. A fuel performance code TRUST VIc and its validation

    International Nuclear Information System (INIS)

    Ishida, M.; Kogai, T.

    1997-01-01

    This paper describes a fuel performance code, TRUST V1c, developed to analyze the thermal and mechanical behavior of an LWR fuel rod. Submodels in the code include FP gas models describing gaseous swelling, gas release from the pellet and axial gas mixing. The code has an FEM-based structure to handle the interaction between the thermal and mechanical submodels introduced by the gas models. The code is validated against irradiation data of fuel centerline temperature, FGR, pellet porosity and cladding deformation. (author). 9 refs, 8 figs

  15. Advanced high performance solid wall blanket concepts

    International Nuclear Information System (INIS)

    Wong, C.P.C.; Malang, S.; Nishio, S.; Raffray, R.; Sagara, A.

    2002-01-01

    First wall and blanket (FW/blanket) design is a crucial element in the performance and acceptance of a fusion power plant. High temperature structural and breeding materials are needed for high thermal performance. A suitable combination of structural design with the selected materials is necessary for D-T fuel sufficiency. Whenever possible, low afterheat, low chemical reactivity and low activation materials are desired to achieve passive safety and minimize the amount of high-level waste. Of course, the selected fusion FW/blanket design will have to match the operational scenarios of high performance plasma. The key characteristics of eight advanced high performance FW/blanket concepts are presented in this paper. Design configurations, performance characteristics, unique advantages and issues are summarized. All reviewed designs can satisfy most of the necessary design goals. For further development, in concert with the advancement in plasma control and scrape-off layer physics, additional emphasis will be needed in the areas of first wall coating material selection, design of plasma stabilization coils, and consideration of reactor startup and transient events. To validate the projected performance of the advanced FW/blanket concepts, the critical element is the need for 14 MeV neutron irradiation facilities for the generation of necessary engineering design data and the prediction of FW/blanket component lifetime and availability.

  16. Development, validation and application of an ultra high performance liquid chromatographic-tandem mass spectrometric method for the simultaneous detection and quantification of five different classes of veterinary antibiotics in swine manure.

    Science.gov (United States)

    Van den Meersche, Tina; Van Pamel, Els; Van Poucke, Christof; Herman, Lieve; Heyndrickx, Marc; Rasschaert, Geertrui; Daeseleire, Els

    2016-01-15

    In this study, a fast, simple and selective ultra high performance liquid chromatographic-tandem mass spectrometric (UHPLC-MS/MS) method for the simultaneous detection and quantification of colistin, sulfadiazine, trimethoprim, doxycycline, oxytetracycline and ceftiofur and for the detection of tylosin A in swine manure was developed and validated. First, a simple extraction procedure with acetonitrile and 6% trichloroacetic acid was carried out. Second, the supernatant was evaporated and the pellet was reconstituted in 1 ml of water/acetonitrile (80/20) and 0.1% formic acid. Extracts were filtered and analyzed by UHPLC-MS/MS on a Kinetex C18 column using gradient elution. The method developed was validated according to the criteria of Commission Decision 2002/657/EC. Recovery percentages varied between 94% and 106%, repeatability percentages were within the range of 1.7-9.2% and the intralaboratory reproducibility varied between 2.8% and 9.3% for all compounds, except for tylosin A for which more variation was observed resulting in a higher measurement uncertainty. The limit of detection and limit of quantification varied between 1.1 and 20.2 and between 3.5 and 67.3 μg/kg, respectively. This method was used to determine the presence and concentration of the seven antibiotic residues in swine manure sampled from ten different manure pits on farms where the selected antibiotics were used. A link was found between the antibiotics used and detected, except for ceftiofur which is injected at low doses and degraded readily in swine manure and was therefore not recovered in any of the samples. To the best of our knowledge, this is the first method available for the simultaneous extraction and quantification of colistin with other antibiotic classes. Additionally, colistin was never extracted from swine manure before. Another innovative aspect of this method is the simultaneous detection and quantification of five different classes of antibiotic residues in swine manure
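
    A small sketch of the figures of merit reported above (recovery and repeatability); the spiked level and replicate values are hypothetical. The LOD/LOQ estimate shown is a generic calibration-based (ICH-style) approximation, not necessarily the Commission Decision 2002/657/EC procedure applied in the paper.

```python
# Recovery, repeatability and an ICH-style LOD/LOQ estimate; data are made up.
import numpy as np

spiked_level = 100.0                                             # µg/kg, hypothetical
replicates = np.array([96.2, 103.5, 99.1, 101.8, 94.7, 105.0])   # µg/kg, hypothetical

recovery = replicates.mean() / spiked_level * 100.0
repeatability_cv = replicates.std(ddof=1) / replicates.mean() * 100.0
print(f"recovery = {recovery:.1f} %, repeatability CV = {repeatability_cv:.1f} %")

# Generic calibration-based LOD/LOQ: 3.3*s/slope and 10*s/slope,
# with s the standard deviation of the regression residuals.
conc = np.array([5, 10, 25, 50, 100], dtype=float)               # µg/kg, hypothetical
area = np.array([410, 830, 2010, 4050, 8120], dtype=float)       # hypothetical
slope, intercept = np.polyfit(conc, area, 1)
resid_sd = np.sqrt(np.sum((area - (slope * conc + intercept)) ** 2) / (len(conc) - 2))
print(f"LOD ≈ {3.3 * resid_sd / slope:.1f} µg/kg, LOQ ≈ {10 * resid_sd / slope:.1f} µg/kg")
```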

  17. Image quality validation of Sentinel 2 Level-1 products: performance status at the beginning of the constellation routine phase

    Science.gov (United States)

    Francesconi, Benjamin; Neveu-VanMalle, Marion; Espesset, Aude; Alhammoud, Bahjat; Bouzinac, Catherine; Clerc, Sébastien; Gascon, Ferran

    2017-09-01

    Sentinel-2 is an Earth Observation mission developed by the European Space Agency (ESA) in the frame of the Copernicus program of the European Commission. The mission is based on a constellation of two satellites: Sentinel-2A, launched in June 2015, and Sentinel-2B, launched in March 2017. It offers an unprecedented combination of systematic global coverage of land and coastal areas, a high revisit of five days at the equator and two days at mid-latitudes under the same viewing conditions, high spatial resolution, and a wide field of view for multispectral observations from 13 bands in the visible, near infrared and short wave infrared range of the electromagnetic spectrum. The mission performance is routinely and closely monitored by the S2 Mission Performance Centre (MPC), including a consortium of Expert Support Laboratories (ESL). This publication focuses on the Sentinel-2 Level-1 product quality validation activities performed by the MPC. It presents an up-to-date status of the Level-1 mission performances at the beginning of the constellation routine phase. Level-1 performance validations routinely performed cover Level-1 Radiometric Validation (Equalisation Validation, Absolute Radiometry Vicarious Validation, Absolute Radiometry Cross-Mission Validation, Multi-temporal Relative Radiometry Vicarious Validation and SNR Validation), and Level-1 Geometric Validation (Geolocation Uncertainty Validation, Multi-spectral Registration Uncertainty Validation and Multi-temporal Registration Uncertainty Validation). Overall, the Sentinel-2 mission is proving very successful in terms of product quality, thereby fulfilling the promises of the Copernicus program.

  18. Identifying High Performance ERP Projects

    OpenAIRE

    Stensrud, Erik; Myrtveit, Ingunn

    2002-01-01

    Learning from high performance projects is crucial for software process improvement. Therefore, we need to identify outstanding projects that may serve as role models. It is common to measure productivity as an indicator of performance. It is vital that productivity measurements deal correctly with variable returns to scale and multivariate data. Software projects generally exhibit variable returns to scale, and the output from ERP projects is multivariate. We propose to use Data Envelopment ...
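
    As a hedged illustration of the kind of analysis the abstract proposes, the sketch below computes input-oriented DEA efficiencies under variable returns to scale (the BCC envelopment form). The project data (one input, two outputs) are made up and the formulation is a generic one, not the authors' exact model.

```python
# Minimal input-oriented BCC (variable-returns-to-scale) DEA sketch.
# Project data are hypothetical: one input (effort) and two outputs.
import numpy as np
from scipy.optimize import linprog

X = np.array([[1200, 800, 1500, 950, 2000]], dtype=float)   # inputs (effort, h)
Y = np.array([[310, 260, 330, 300, 400],                    # output 1 (function points)
              [120,  90, 150, 140, 180]], dtype=float)      # output 2 (users served)
n = X.shape[1]

def bcc_efficiency(k):
    # decision variables: [theta, lambda_1 .. lambda_n]; minimise theta
    c = np.r_[1.0, np.zeros(n)]
    A_ub, b_ub = [], []
    for i in range(X.shape[0]):            # sum_j lam_j * x_ij <= theta * x_ik
        A_ub.append(np.r_[-X[i, k], X[i, :]])
        b_ub.append(0.0)
    for r in range(Y.shape[0]):            # sum_j lam_j * y_rj >= y_rk
        A_ub.append(np.r_[0.0, -Y[r, :]])
        b_ub.append(-Y[r, k])
    A_eq = [np.r_[0.0, np.ones(n)]]        # VRS convexity: sum lam_j = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for k in range(n):
    print(f"project {k}: efficiency = {bcc_efficiency(k):.3f}")
```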

  19. INL High Performance Building Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Jennifer D. Morton

    2010-02-01

    High performance buildings, also known as sustainable buildings and green buildings, are resource efficient structures that minimize the impact on the environment by using less energy and water, reduce solid waste and pollutants, and limit the depletion of natural resources while also providing a thermally and visually comfortable working environment that increases productivity for building occupants. As Idaho National Laboratory (INL) becomes the nation’s premier nuclear energy research laboratory, the physical infrastructure will be established to help accomplish this mission. This infrastructure, particularly the buildings, should incorporate high performance sustainable design features in order to be environmentally responsible and reflect an image of progressiveness and innovation to the public and prospective employees. Additionally, INL is a large consumer of energy that contributes to both carbon emissions and resource inefficiency. In the current climate of rising energy prices and political pressure for carbon reduction, this guide will help new construction project teams to design facilities that are sustainable and reduce energy costs, thereby reducing carbon emissions. With these concerns in mind, the recommendations described in the INL High Performance Building Strategy (previously called the INL Green Building Strategy) are intended to form the INL foundation for high performance building standards. This revised strategy incorporates the latest federal and DOE orders (Executive Order [EO] 13514, “Federal Leadership in Environmental, Energy, and Economic Performance” [2009], EO 13423, “Strengthening Federal Environmental, Energy, and Transportation Management” [2007], and DOE Order 430.2B, “Departmental Energy, Renewable Energy, and Transportation Management” [2008]), the latest guidelines, trends, and observations in high performance building construction, and the latest changes to the Leadership in Energy and Environmental Design

  20. Treatment and Combination of Data Quality Monitoring Histograms to Perform Data vs. Monte Carlo Validation

    CERN Document Server

    Colin, Nolan

    2013-01-01

    In CMS's automated data quality validation infrastructure, it is not currently possible to assess how well Monte Carlo simulations describe data from collisions, if at all. In order to guarantee high quality data, a novel work flow was devised to perform `data vs. Monte Carlo' validation. Support for this comparison was added by allowing distributions from several Monte Carlo samples to be combined, matched to the data and then displayed in a histogram stack, overlaid with the experimental data.

  1. Solar power plant performance evaluation: simulation and experimental validation

    International Nuclear Information System (INIS)

    Natsheh, E M; Albarbar, A

    2012-01-01

    In this work the performance of a solar power plant is evaluated based on a developed model comprising a photovoltaic array, battery storage, controller and converters. The model is implemented using the MATLAB/SIMULINK software package. A perturb and observe (P and O) algorithm is used for maximizing the generated power based on a maximum power point tracker (MPPT) implementation. The outcomes of the developed model are validated and supported by a case study carried out using an operational 28.8 kW grid-connected solar power plant located in central Manchester. Measurements were taken over a 21-month period, using hourly average irradiance and cell temperature. It was found that system degradation could be clearly monitored by determining the residual (the difference) between the output power predicted by the model and the actual measured power parameters. It was found that the residual exceeded the healthy threshold, 1.7 kW, due to heavy snow in Manchester last winter. More importantly, the developed performance evaluation technique could be adopted to detect any other reasons that may degrade the performance of the PV panels, such as shading and dirt. Repeatability and reliability of the developed system performance were validated during this period. Good agreement was achieved between the theoretical simulation and the real-time measurements taken from the online grid-connected solar power plant.
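
    For readers unfamiliar with the P and O scheme mentioned above, the sketch below shows the classic perturb-and-observe update rule against a toy array model. The step size, starting voltage and array model are hypothetical, not the plant model used in the paper.

```python
# Minimal perturb-and-observe MPPT sketch; the array model is a toy.
def perturb_and_observe(measure_pv, v_ref=30.0, step=0.5, iterations=50):
    """measure_pv(v) -> (voltage, current) at operating voltage v."""
    v, i = measure_pv(v_ref)
    p_prev = v * i
    direction = +1
    for _ in range(iterations):
        v_ref += direction * step          # perturb the operating voltage
        v, i = measure_pv(v_ref)
        p = v * i
        if p < p_prev:                     # power dropped: reverse direction
            direction = -direction
        p_prev = p                         # observe and remember
    return v_ref, p_prev

def toy_array(v):
    """Toy I-V curve with a maximum power point near 32 V (illustrative only)."""
    isc, voc = 8.0, 42.0
    i = max(0.0, isc * (1.0 - (max(v, 0.0) / voc) ** 8))
    return v, i

v_mp, p_mp = perturb_and_observe(toy_array)
print(f"P and O settled near V = {v_mp:.1f} V, P = {p_mp:.1f} W")
```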

  2. Solar power plant performance evaluation: simulation and experimental validation

    Science.gov (United States)

    Natsheh, E. M.; Albarbar, A.

    2012-05-01

    In this work the performance of a solar power plant is evaluated based on a developed model comprising a photovoltaic array, battery storage, controller and converters. The model is implemented using the MATLAB/SIMULINK software package. A perturb and observe (P&O) algorithm is used for maximizing the generated power based on a maximum power point tracker (MPPT) implementation. The outcomes of the developed model are validated and supported by a case study carried out using an operational 28.8 kW grid-connected solar power plant located in central Manchester. Measurements were taken over a 21-month period, using hourly average irradiance and cell temperature. It was found that system degradation could be clearly monitored by determining the residual (the difference) between the output power predicted by the model and the actual measured power parameters. It was found that the residual exceeded the healthy threshold, 1.7 kW, due to heavy snow in Manchester last winter. More importantly, the developed performance evaluation technique could be adopted to detect any other reasons that may degrade the performance of the PV panels, such as shading and dirt. Repeatability and reliability of the developed system performance were validated during this period. Good agreement was achieved between the theoretical simulation and the real-time measurements taken from the online grid-connected solar power plant.

  3. High performance fuel technology development

    Energy Technology Data Exchange (ETDEWEB)

    Koon, Yang Hyun; Kim, Keon Sik; Park, Jeong Yong; Yang, Yong Sik; In, Wang Kee; Kim, Hyung Kyu [KAERI, Daejeon (Korea, Republic of)

    2012-01-15

    ○ Development of High Plasticity and Annular Pellet - Development of strong candidates of ultra high burn-up fuel pellets for a PCI remedy - Development of fabrication technology of annular fuel pellet ○ Development of High Performance Cladding Materials - Irradiation test of HANA claddings in Halden research reactor and the evaluation of the in-pile performance - Development of the final candidates for the next generation cladding materials. - Development of the manufacturing technology for the dual-cooled fuel cladding tubes. ○ Irradiated Fuel Performance Evaluation Technology Development - Development of performance analysis code system for the dual-cooled fuel - Development of fuel performance-proving technology ○ Feasibility Studies on Dual-Cooled Annular Fuel Core - Analysis on the property of a reactor core with dual-cooled fuel - Feasibility evaluation on the dual-cooled fuel core ○ Development of Design Technology for Dual-Cooled Fuel Structure - Definition of technical issues and invention of concept for dual-cooled fuel structure - Basic design and development of main structure components for dual-cooled fuel - Basic design of a dual-cooled fuel rod.

  4. High Performance Bulk Thermoelectric Materials

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Zhifeng [Boston College, Chestnut Hill, MA (United States)

    2013-03-31

    Over the past 13-plus years, we have carried out research on the electron pairing symmetry of superconductors, the growth and field emission properties of carbon nanotubes and semiconducting nanowires, high performance thermoelectric materials, and other interesting materials. As a result of this research, we have published 104 papers and have educated six undergraduate students, twenty graduate students, nine postdocs, nine visitors, and one technician.

  5. High performance in software development

    CERN Multimedia

    CERN. Geneva; Haapio, Petri; Liukkonen, Juha-Matti

    2015-01-01

    What are the ingredients of high-performing software? Software development, especially for large high-performance systems, is one of the most complex tasks mankind has ever attempted. Technological change leads to huge opportunities but challenges our old ways of working. Processing large data sets, possibly in real time or with other tight computational constraints, requires an efficient solution architecture. Efficiency requirements span from distributed storage and the large-scale organization of computation and data down to the lowest level of processor and data-bus behavior. Integrating performance behavior over these levels is especially important when the computation is resource-bounded, as it is in numerics: physical simulation, machine learning, estimation of statistical models, etc. For example, memory locality and utilization of vector processing are essential for harnessing the computing power of modern processor architectures due to the deep memory hierarchies of modern general-purpose computers. As a r...

  6. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Whitmore, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kaffine, Leah [National Renewable Energy Lab. (NREL), Golden, CO (United States); Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron P. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.
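
    One plausible way to compute an annualized prediction error of the kind quoted above is sketched below: the absolute difference between total modeled and total measured generation, relative to the measured total. The hourly series are placeholders, not NREL field data, and the report may define the metric slightly differently.

```python
# Sketch of an annualized prediction-error metric; series are placeholders.
import numpy as np

measured_kwh = np.random.default_rng(0).uniform(0, 500, size=8760)  # placeholder field data
modeled_kwh = measured_kwh * 1.02                                    # placeholder model output

annual_error_pct = abs(modeled_kwh.sum() - measured_kwh.sum()) / measured_kwh.sum() * 100
print(f"annualized prediction error = {annual_error_pct:.1f} %")
# compare against thresholds such as the <3 % (fixed tilt) / <8 % (one-axis) figures above
```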

  7. Assessing performance and validating finite element simulations using probabilistic knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, Ronald M.; Rodriguez, E. A. (Edward A.)

    2002-01-01

    Two probabilistic approaches for assessing performance are presented. The first approach assesses probability of failure by simultaneously modeling all likely events. The probability that each event causes failure, together with the event's likelihood of occurrence, contributes to the overall probability of failure. The second assessment method is based on stochastic sampling using an influence diagram. Latin-hypercube sampling is used to stochastically assess events. The overall probability of failure is taken as the maximum probability of failure of all the events. The Likelihood of Occurrence simulation suggests failure does not occur, while the Stochastic Sampling approach predicts failure. The Likelihood of Occurrence results are used to validate finite element predictions.
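
    A minimal Latin-hypercube sampling sketch of the kind mentioned above: each dimension is split into n equal-probability strata and exactly one point is drawn per stratum. The input bounds are hypothetical, not those of the finite element study.

```python
# Latin-hypercube sampling sketch; bounds are hypothetical.
import numpy as np

def latin_hypercube(n, bounds, rng=np.random.default_rng(42)):
    d = len(bounds)
    sample = np.empty((n, d))
    for j, (lo, hi) in enumerate(bounds):
        strata = rng.permutation(n)                 # shuffle stratum order
        u = (strata + rng.random(n)) / n            # one draw per stratum
        sample[:, j] = lo + u * (hi - lo)
    return sample

# e.g. sample two uncertain inputs (load factor, material strength in MPa)
points = latin_hypercube(10, bounds=[(0.8, 1.2), (250.0, 400.0)])
print(points)
```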

  8. Neo4j high performance

    CERN Document Server

    Raj, Sonal

    2015-01-01

    If you are a professional or enthusiast who has a basic understanding of graphs or has basic knowledge of Neo4j operations, this is the book for you. Although it is targeted at an advanced user base, this book can be used by beginners as it touches upon the basics. So, if you are passionate about taming complex data with the help of graphs and building high performance applications, you will be able to get valuable insights from this book.

  9. Integrated plasma control for high performance tokamaks

    International Nuclear Information System (INIS)

    Humphreys, D.A.; Deranian, R.D.; Ferron, J.R.; Johnson, R.D.; LaHaye, R.J.; Leuer, J.A.; Penaflor, B.G.; Walker, M.L.; Welander, A.S.; Jayakumar, R.J.; Makowski, M.A.; Khayrutdinov, R.R.

    2005-01-01

    Sustaining high performance in a tokamak requires controlling many equilibrium shape and profile characteristics simultaneously with high accuracy and reliability, while suppressing a variety of MHD instabilities. Integrated plasma control, the process of designing high-performance tokamak controllers based on validated system response models and confirming their performance in detailed simulations, provides a systematic method for achieving and ensuring good control performance. For present-day devices, this approach can greatly reduce the need for machine time traditionally dedicated to control optimization, and can allow determination of high-reliability controllers prior to ever producing the target equilibrium experimentally. A full set of tools needed for this approach has recently been completed and applied to present-day devices including DIII-D, NSTX and MAST. This approach has proven essential in the design of several next-generation devices including KSTAR, EAST, JT-60SC, and ITER. We describe the method, results of design and simulation tool development, and recent research producing novel approaches to equilibrium and MHD control in DIII-D. (author)

  10. Five-Kilometers Time Trial: Preliminary Validation of a Short Test for Cycling Performance Evaluation.

    Science.gov (United States)

    Dantas, Jose Luiz; Pereira, Gleber; Nakamura, Fabio Yuzo

    2015-09-01

    The five-kilometer time trial (TT5km) has been used to assess aerobic endurance performance without further investigation of its validity. This study aimed to perform a preliminary validation of the TT5km to rank well-trained cyclists based on aerobic endurance fitness and to assess changes in aerobic endurance performance. After the incremental test, 20 cyclists (age = 31.3 ± 7.9 years; body mass index = 22.7 ± 1.5 kg/m²; maximal aerobic power = 360.5 ± 49.5 W) performed the TT5km twice, collecting performance (time to complete, absolute and relative power output, average speed) and physiological responses (heart rate and electromyography activity). The validation criteria were pacing strategy, absolute and relative reliability, validity, and sensitivity. The sensitivity index was obtained from the ratio between the smallest worthwhile change and typical error. The TT5km showed high absolute (coefficient of variation 0.95) reliability of performance variables, whereas it presented low reliability of physiological responses. The TT5km performance variables were highly correlated with the aerobic endurance indices obtained from the incremental test (r > 0.70). These variables showed an adequate sensitivity index (> 1). The TT5km is a valid test to rank the aerobic endurance fitness of well-trained cyclists and to differentiate changes in aerobic endurance performance. Coaches can detect performance changes through either absolute (±17.7 W) or relative power output (±0.3 W·kg⁻¹), the time to complete the test (±13.4 s) and the average speed (±1.0 km·h⁻¹). Furthermore, TT5km performance can also be used to rank the athletes according to their aerobic endurance fitness.
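
    A small sketch of the reliability and sensitivity quantities named above: typical error from a test-retest pair, a smallest worthwhile change taken (as is common) as 0.2 times the between-subject SD, and their ratio as a sensitivity index. The trial times are hypothetical, not the study data.

```python
# Typical error, smallest worthwhile change and sensitivity index; data are made up.
import numpy as np

trial1 = np.array([458, 441, 476, 430, 465, 450], dtype=float)  # TT5km times, s
trial2 = np.array([455, 444, 473, 433, 462, 452], dtype=float)  # retest times, s

diff = trial2 - trial1
typical_error = diff.std(ddof=1) / np.sqrt(2)
cv_percent = typical_error / np.concatenate([trial1, trial2]).mean() * 100
swc = 0.2 * trial1.std(ddof=1)               # smallest worthwhile change (0.2 x SD)
sensitivity_index = swc / typical_error       # > 1 considered adequate

print(f"typical error = {typical_error:.1f} s (CV = {cv_percent:.1f} %)")
print(f"SWC = {swc:.1f} s, sensitivity index = {sensitivity_index:.2f}")
```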

  11. Validating YouTube Factors Affecting Learning Performance

    Science.gov (United States)

    Pratama, Yoga; Hartanto, Rudy; Suning Kusumawardani, Sri

    2018-03-01

    YouTube is often used as a companion medium or a learning supplement. One of the educational institutions that often uses it is Jogja Audio School (JAS), which focuses on music production education. Music production is difficult material to learn, especially audio mastering. With tutorial content from YouTube, students find it easier to learn and understand audio mastering and improve their learning performance. This study aims to validate the role of YouTube as a learning medium in improving students' learning performance by looking at the factors that affect student learning performance. The sample involves 100 respondents from JAS at the audio mastering level. The results showed that student learning performance increases, driven by factors that have a significant influence: motivation, instructional content, and YouTube usefulness. Overall, the findings suggest that YouTube plays an important role in student learning performance in music production education and serves as an innovative and efficient learning medium.

  12. High performance MEAs. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-15

    The aim of the present project is, through modeling and material and process development, to obtain significantly better MEA performance and to attain the technology necessary to fabricate stable catalyst materials, thereby providing a viable alternative to the current industry standard. This project primarily focused on the development and characterization of novel catalyst materials for use in high temperature (HT) and low temperature (LT) proton-exchange membrane fuel cells (PEMFC). New catalysts are needed in order to improve fuel cell performance and reduce the cost of fuel cell systems. Additional tasks were the development of new, durable sealing materials to be used in PEMFCs as well as the computational modeling of heat and mass transfer processes, predominantly in LT PEMFCs, in order to improve fundamental understanding of the multi-phase flow issues and liquid water management in fuel cells. An improved fundamental understanding of these processes will lead to improved fuel cell performance and hence will also result in a reduced catalyst loading to achieve the same performance. The consortium has obtained significant research results and progress on new catalyst materials and substrates with promising enhanced performance, and on fabrication of the materials using novel methods. However, the new materials and synthesis methods explored are still in the early research and development phase. The project has contributed to improved MEA performance using less precious metal, as demonstrated for LT-PEM, DMFC and HT-PEM applications. The novel approach and progress of the modelling activities have been extremely satisfactory, with numerous conference and journal publications along with two potential inventions concerning the catalyst layer. (LN)

  13. High Performance Proactive Digital Forensics

    International Nuclear Information System (INIS)

    Alharbi, Soltan; Traore, Issa; Moa, Belaid; Weber-Jahnke, Jens

    2012-01-01

    With the increase in the number of digital crimes and in their sophistication, High Performance Computing (HPC) is becoming a must in Digital Forensics (DF). According to the FBI annual report, the size of data processed during the 2010 fiscal year reached 3,086 TB (compared to 2,334 TB in 2009) and the number of agencies that requested Regional Computer Forensics Laboratory assistance increased from 689 in 2009 to 722 in 2010. Since most investigation tools are both I/O and CPU bound, the next-generation DF tools are required to be distributed and offer HPC capabilities. The need for HPC is even more evident in investigating crimes in the cloud or when proactive DF analysis and on-site investigation, requiring semi-real time processing, are performed. Although overcoming the performance challenge is a major goal in DF, as far as we know, there is almost no research on HPC-DF except for a few papers. As such, in this work, we extend our work on the need for a proactive system and present a high performance automated proactive digital forensic system. The most expensive phase of the system, namely proactive analysis and detection, uses a parallel extension of the iterative z algorithm. It also implements new parallel information-based outlier detection algorithms to proactively and forensically handle suspicious activities. To analyse a large number of targets and events and continuously do so (to capture the dynamics of the system), we rely on a multi-resolution approach to explore the digital forensic space. A data set from the Honeynet Forensic Challenge in 2001 is used to evaluate the system from DF and HPC perspectives.
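
    The abstract names an "iterative z algorithm" without giving its details; as a loose illustration only, the sketch below performs generic iterative z-score outlier pruning (flag points beyond a threshold, re-estimate the statistics, repeat). It is not the authors' parallel implementation, and the event sizes are hypothetical.

```python
# Generic iterative z-score outlier pruning sketch; not the authors' algorithm.
import numpy as np

def iterative_z_outliers(values, threshold=3.0, max_iter=10):
    values = np.asarray(values, dtype=float)
    flagged = np.zeros(len(values), dtype=bool)
    for _ in range(max_iter):
        kept = values[~flagged]
        z = np.abs(values - kept.mean()) / kept.std(ddof=1)
        new = (z > threshold) & ~flagged
        if not new.any():
            break
        flagged |= new          # drop outliers, then re-estimate mean/std
    return np.flatnonzero(flagged)

sizes = [12, 15, 13, 14, 11, 16, 13, 950, 12, 14, 15, 13]   # hypothetical event sizes
print("suspicious indices:", iterative_z_outliers(sizes))
```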

  14. The Multifactor Measure of Performance: Its Development, Norming, and Validation.

    Science.gov (United States)

    Bar-On, Reuven

    2018-01-01

    This article describes the development as well as the initial norming and validation of the Multifactor Measure of Performance™ (MMP™), which is a psychometric instrument designed to study, assess and enhance key predictors of human performance to help individuals perform at a higher level. It was created by the author to go beyond existing conceptual and psychometric models that often focus on relatively few factors purported to assess performance at school, in the workplace and elsewhere. The relative sparsity of multifactorial pre-employment assessment instruments exemplifies, for the author, one of the important reasons for developing the MMP™, which attempts to comprehensively evaluate a wider array of factors that are thought to contribute to performance. Because this situation creates a need in the area of test construction that should be addressed, the author sought to develop a multifactorial assessment and development instrument that could concomitantly evaluate a combination of physical, cognitive, intra-personal, inter-personal, and motivational factors that significantly contribute to performance. The specific aim of this article is to show why, how and if this could be done, as well as to present and discuss the potential importance of the results obtained to date. The findings presented here will hopefully add to what is known about human performance and thus contribute to the professional literature, in addition to contributing to the continued development of the MMP™. The impetus for developing the MMP™ is first explained below, followed by a detailed description of the process involved and the findings obtained; their potential application is then discussed, as well as the possible limitations of the present research and the need for future studies to address them.

  15. Application of the Performance Validation Tool for the Evaluation of NSSS Control System Performance

    International Nuclear Information System (INIS)

    Sohn, Suk-whun

    2011-01-01

    When a control system is supplied to a nuclear power plant (NPP) under construction, static tests and dynamic tests are typically performed for evaluating its performance. The dynamic test is not realistic for validating the performance of replaced hardware in operating NPPs because of potential risks and economic burden. We have, therefore, developed a performance validation tool which can evaluate the dynamic performance of the control system without undertaking real plant tests. The window-based nuclear plant performance analyzer (Win-NPA) is used as a virtual NPP in the developed tool and provides appropriate control loads to the target control system via hardwired cables in such a manner that the interfaces are identical to the field wiring. The outputs from the control system are used as the simulation inputs of the plant model within the Win-NPA. With this closed-loop configuration, major transient events were simulated to check the performance of the implemented control system by comparing it with that of the control system model of the Win-NPA and that of the old hardware. The developed tool and the methodology were successfully applied to the hardware replacement project for the Yonggwang (YGN) 3 and 4 feedwater control system (FWCS) in 2008. Several errors in the implemented control system were fixed through the performance validation tests and the operability tests. As a result, the control system of YGN 3 and 4 has demonstrated excellent control performance since then. On the basis of the YGN 3 and 4 project experience, we are performing a similar project in Ulchin (UCN) 3 and 4. This methodology can also be applied to other NPPs under construction as a tool for pre-operational dynamic tests. These performance tests, carried out before power ascension tests (PATs), help prevent unnecessary plant transients or unwanted reactor trips caused by hidden errors in control systems during PATs. (author)

  16. High performance anode for advanced Li batteries

    Energy Technology Data Exchange (ETDEWEB)

    Lake, Carla [Applied Sciences, Inc., Cedarville, OH (United States)

    2015-11-02

    The overall objective of this Phase I SBIR effort was to advance the manufacturing technology for ASI’s Si-CNF high-performance anode by creating a framework for large volume production and utilization of low-cost Si-coated carbon nanofibers (Si-CNF) for the battery industry. This project explores the use of nano-structured silicon deposited on a nano-scale carbon filament to achieve the benefits of high cycle life and high charge capacity without the fading of, or failure in, capacity that results from stress-induced fracturing of the Si particles and de-coupling from the electrode. ASI’s patented coating process distinguishes itself from others in that it is highly reproducible, readily scalable and results in a Si-CNF composite structure containing 25-30% silicon, with a compositionally graded Si-CNF interface that significantly improves cycling stability and enhances adhesion of the silicon to the carbon fiber support. In Phase I, the team demonstrated that the production of the Si-CNF anode material can successfully be transitioned from a static bench-scale reactor into a fluidized bed reactor. In addition, ASI made significant progress in the development of low cost, quick testing methods which can be performed on silicon-coated CNFs as a means of quality control. To date, weight change, density, and cycling performance were the key metrics used to validate the high performance anode material. Under this effort, ASI made strides to establish a quality control protocol for the large volume production of Si-CNFs and has identified several key technical thrusts for future work. Using the results of this Phase I effort as a foundation, ASI has defined a path forward to commercialize and deliver high volume and low-cost production of Si-CNF material for anodes in Li-ion batteries.

  17. High performance light water reactor

    International Nuclear Information System (INIS)

    Squarer, D.; Schulenberg, T.; Struwe, D.; Oka, Y.; Bittermann, D.; Aksan, N.; Maraczy, C.; Kyrki-Rajamaeki, R.; Souyri, A.; Dumaz, P.

    2003-01-01

    The objective of the high performance light water reactor (HPLWR) project is to assess the merit and economic feasibility of a high efficiency LWR operating in the thermodynamically supercritical regime. An efficiency of approximately 44% is expected. To accomplish this objective, a highly qualified team of European research institutes and industrial partners together with the University of Tokyo is assessing the major issues pertaining to a new reactor concept, under the co-sponsorship of the European Commission. The assessment has emphasized the recent advancement achieved in this area by Japan. Additionally, it accounts for advanced European reactor design requirements, recent improvements, practical design aspects, availability of plant components and the availability of high temperature materials. The final objective of this project is to reach a conclusion on the potential of the HPLWR to help sustain the nuclear option, by supplying competitively priced electricity, as well as to continue the nuclear competence in LWR technology. The following is a brief summary of the main project achievements: - A state-of-the-art review of supercritical water-cooled reactors has been performed for the HPLWR project. - Extensive studies have been performed in the last 10 years by the University of Tokyo. Therefore, a 'reference design', developed by the University of Tokyo, was selected in order to assess the available technological tools (i.e. computer codes, analyses, advanced materials, water chemistry, etc.). Design data and results of the analysis were supplied by the University of Tokyo. A benchmark problem, based on the 'reference design', was defined for neutronics calculations and several partners of the HPLWR project carried out independent analyses. The results of these analyses, which in addition help to 'calibrate' the codes, have guided the assessment of the core and the design of an improved HPLWR fuel assembly. Preliminary selection was made for the HPLWR scale

  18. Validation of High-resolution Climate Simulations over Northern Europe.

    Science.gov (United States)

    Muna, R. A.

    2005-12-01

    Two AMIP2-type (Gates 1992) experiments have been performed with climate versions of the ARPEGE/IFS model for the North Atlantic-northern Europe and Norwegian regions, and the effect of increasing resolution on the simulated biases has been analyzed. The ECMWF reanalysis (ERA-15) has been used to validate the simulations. Each simulation is an integration of the period 1979 to 1996. The global simulations used observed monthly mean sea surface temperatures (SST) as the lower boundary condition. All aspects but the horizontal resolution are similar in the two simulations. The first simulation has a uniform horizontal resolution of T63L. The second one has a variable resolution (T106Lc3) with the highest resolution in the Norwegian Sea. Both simulations have 31 vertical layers in the same locations. For each simulation the results were divided into two seasons: winter (DJF) and summer (JJA). The parameters investigated were mean sea level pressure, geopotential and temperature at 850 hPa and 500 hPa. To find out the causes of the temperature bias during summer, latent and sensible heat flux, total cloud cover and total precipitation were analyzed. The high-resolution simulation exhibits a more or less realistic climate over the Nordic, Arctic and European regions. The overall performance of the simulations shows improvements in generally all fields investigated with increasing resolution over the target area, both in winter (DJF) and summer (JJA).
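
    A minimal sketch of the kind of bias diagnostic such validations rely on: an area-weighted (cosine-latitude) seasonal-mean difference between a simulated field and a reanalysis reference on a common grid. The grid and field values below are placeholders, not ARPEGE/IFS or ERA-15 data.

```python
# Area-weighted seasonal bias and RMSE against a reanalysis; arrays are placeholders.
import numpy as np

lats = np.linspace(50.0, 80.0, 31)                    # target-area latitudes, deg
lons = np.linspace(-30.0, 40.0, 71)
rng = np.random.default_rng(1)
sim_djf = 275.0 + rng.normal(0.0, 2.0, (lats.size, lons.size))   # placeholder, K
era_djf = 274.0 + rng.normal(0.0, 2.0, (lats.size, lons.size))   # placeholder, K

weights = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(sim_djf)
bias = np.sum(weights * (sim_djf - era_djf)) / np.sum(weights)
rmse = np.sqrt(np.sum(weights * (sim_djf - era_djf) ** 2) / np.sum(weights))
print(f"DJF 850 hPa temperature bias = {bias:+.2f} K, RMSE = {rmse:.2f} K")
```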

  19. Reliable and Valid Assessment of Clinical Bronchoscopy Performance

    DEFF Research Database (Denmark)

    Konge, Lars; Larsen, Klaus Richter; Clementsen, Paul

    2012-01-01

    The interrater reliability was high, with Cronbach's α = 0.86. Assessment of 3 bronchoscopies by a single rater had a generalizability coefficient of 0.84. The correlation between experience and performance was good (Pearson correlation = 0.76). There were significant differences between the groups for all...
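
    A small sketch of the inter-rater statistics quoted above: Cronbach's α computed across raters and the Pearson correlation between experience and mean score. The ratings and experience values are hypothetical.

```python
# Cronbach's alpha across raters and an experience-score correlation; data are made up.
import numpy as np

# rows = assessed bronchoscopies, columns = raters (hypothetical scores)
ratings = np.array([[3.0, 3.5, 3.2],
                    [2.0, 2.2, 1.8],
                    [4.5, 4.8, 4.6],
                    [3.8, 3.5, 4.0],
                    [1.5, 1.8, 1.6]])

k = ratings.shape[1]
item_vars = ratings.var(axis=0, ddof=1).sum()      # sum of per-rater variances
total_var = ratings.sum(axis=1).var(ddof=1)        # variance of summed scores
alpha = k / (k - 1) * (1 - item_vars / total_var)

experience = np.array([40, 10, 120, 60, 5], dtype=float)   # procedures performed
score = ratings.mean(axis=1)
r = np.corrcoef(experience, score)[0, 1]

print(f"Cronbach's alpha = {alpha:.2f}, Pearson r = {r:.2f}")
```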

  20. Improved Hypersonic Inlet Performance Using Validated Strut Compression Designs

    Science.gov (United States)

    Bulman, M. J.; Stout, P. W.; Fernandez, R.

    1997-01-01

    Aerojet is currently executing two Strutjet propulsion contracts: one a Rocket Based Combined Cycle (RBCC) engine for a NASA-Marshall Space Flight Center (MSFC) Advanced Reusable Transportation Technology (ARTT) program, the second a Dual Mode Ram/Scramjet engine for a USAF Wright Laboratories Storable Fuel Scramjet Flow Path Concepts program. The engines employed in both programs operate at supersonic and low hypersonic speeds and use inlets employing forebody external and sidewall compression. Aerojet has developed and validated a successful design methodology applicable to these inlet types. Design features include an integrated vehicle forebody, external side compression struts, strut sidewall and throat bleed, a throat shock trap, and variable geometry internal contraction. Computational Fluid Dynamics (CFD) predictions and test data show these inlets allow substantially increased flow turning angles over other designs. These increased flow turning angles allow shorter and lighter engines than current designs, which in turn enable higher performing vehicles with broad operating characteristics. This paper describes the designs of two different inlets evaluated by the NASA-MSFC and USAF programs, discusses the results of wind tunnel tests performed by NASA-Lewis Research Center, and provides correlations of test data with CFD predictions. Parameters of interest include low Mach number starting capability, start sensitivity as a function of back pressure at various contraction ratios, flow turning angles, strut and throat bleed effects, and pressure recovery at various Mach numbers.

  1. English Second Language, General, Special Education, and Speech/Language Personal Teacher Efficacy, English Language Arts Scientifically-Validated Intervention Practice, and Working Memory Development of English Language Learners in High and Low Performing Elementary Schools

    Science.gov (United States)

    Brown, Barbara J.

    2013-01-01

    The researcher investigated teacher factors contributing to English language arts (ELA) achievement of English language learners (ELLs) over 2 consecutive years, in high and low performing elementary schools with a Hispanic/Latino student population greater than or equal to 30 percent. These factors included personal teacher efficacy, teacher…

  2. Validated assay for the simultaneous quantification of total vincristine and actinomycin-D concentrations in human EDTA plasma and of vincristine concentrations in human plasma ultrafiltrate by high-performance liquid chromatography coupled with tandem mass spectrometry

    NARCIS (Netherlands)

    Damen, Carola W. N.; Israëls, Trijn; Caron, Huib N.; Schellens, Jan H. M.; Rosing, Hilde; Beijnen, Jos H.

    2009-01-01

    A sensitive, specific and efficient high-performance liquid chromatography/tandem mass spectrometry (HPLC/MS/MS) assay for the simultaneous determination of total vincristine and actinomycin-D concentrations in human plasma and an assay for the determination of unbound vincristine are presented.

  3. EEG-neurofeedback for optimising performance. II: creativity, the performing arts and ecological validity.

    Science.gov (United States)

    Gruzelier, John H

    2014-07-01

    As a continuation of a review of evidence of the validity of cognitive/affective gains following neurofeedback in healthy participants, including correlations in support of the gains being mediated by feedback learning (Gruzelier, 2014a), the focus here is on the impact on creativity, especially in the performing arts including music, dance and acting. The majority of research involves alpha/theta (A/T), sensory-motor rhythm (SMR) and heart rate variability (HRV) protocols. There is evidence of reliable benefits from A/T training with advanced musicians especially for creative performance, and reliable benefits from both A/T and SMR training for novice music performance in adults and in a school study with children with impact on creativity, communication/presentation and technique. Making the SMR ratio training context ecologically relevant for actors enhanced creativity in stage performance, with added benefits from the more immersive training context. A/T and HRV training have benefitted dancers. The neurofeedback evidence adds to the rapidly accumulating validation of neurofeedback, while performing arts studies offer an opportunity for ecological validity in creativity research for both creative process and product. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Development of high performance cladding

    International Nuclear Information System (INIS)

    Kiuchi, Kiyoshi

    2003-01-01

    The development of superior next-generation light water reactors is required, from general viewpoints such as improved safety, economics, reduced radioactive waste and effective utilization of plutonium, by around 2030, when conventional reactor plants should be renovated. Improvement of stainless steel cladding for conventional high burn-up reactors to more than 100 GWd/t, development of manufacturing technology for the reduced-moderation light water reactor (RMWR) with a breeding ratio beyond 1.0, and research on water-material interactions in the supercritical-pressure water-cooled reactor are carried out at the Japan Atomic Energy Research Institute. A stable austenitic stainless steel has been selected for the fuel element cladding of the advanced boiling water reactor (ABWR). The austenitic stainless steel is superior in irradiation resistance, corrosion resistance and mechanical strength. A hard neutron spectrum, with energies above 0.1 MeV, occurs in the core of the reduced-moderation light water reactor, as in the liquid metal fast breeder reactor (LMFBR). High performance cladding for the RMWR fuel elements is likewise required to provide irradiation resistance, corrosion resistance and mechanical strength. Slow strain rate tests (SSRT) of SUS 304 and SUS 316 are carried out to study stress corrosion cracking (SCC). Irradiation tests in an LMFBR are intended to obtain irradiation damage data for the cladding materials. (M. Suetake)

  5. A practical approach to perform graded verification and validation

    International Nuclear Information System (INIS)

    Terrado, Carlos; Woolley, J.

    2000-01-01

Modernization of instrumentation and control (I and C) systems in nuclear power plants often implies going from analog to digital systems. One condition for the upgrade to be successful is that the new systems achieve at least the same quality level as the analog systems they replace. The most important part of digital systems quality assurance (QA) is verification and validation (V and V). V and V is concerned with the process as much as the product; it is a systematic program of review and testing activities performed throughout the system development life cycle. Briefly, we can say that verification is building the product correctly, and validation is building the correct product. Since V and V is necessary but costly, it is helpful to tailor the effort to the quality goal of each particular case. To do this, an accepted practice is to establish different V and V levels, each one with a proper degree of stringency or rigor. This paper shows a practical approach to estimating the appropriate level of V and V and the resulting V and V techniques recommended for each specific system. The first step proposed is to determine 'What to do', that is, the selection of the V and V class. The main factors considered here are: required integrity, functional complexity, defense in depth and development environment. A guideline for classifying the particular system using these factors, showing how they lead to the selection of the V and V class, is presented. The second step is to determine 'How to do it', that is, to choose an appropriate set of V and V methods according to the attributes of the system and the V and V class already selected. A list of possible V and V methods recommended for each V and V level during different stages of the development life cycle is included. As a result of the application of this procedure, solutions are found for generalists interested in 'What to do', as well as for specialists interested in 'How to do it'. Finally

  6. High-performance computing in seismology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-09-01

    The scientific, technical, and economic importance of the issues discussed here presents a clear agenda for future research in computational seismology. In this way these problems will drive advances in high-performance computing in the field of seismology. There is a broad community that will benefit from this work, including the petroleum industry, research geophysicists, engineers concerned with seismic hazard mitigation, and governments charged with enforcing a comprehensive test ban treaty. These advances may also lead to new applications for seismological research. The recent application of high-resolution seismic imaging of the shallow subsurface for the environmental remediation industry is an example of this activity. This report makes the following recommendations: (1) focused efforts to develop validated documented software for seismological computations should be supported, with special emphasis on scalable algorithms for parallel processors; (2) the education of seismologists in high-performance computing technologies and methodologies should be improved; (3) collaborations between seismologists and computational scientists and engineers should be increased; (4) the infrastructure for archiving, disseminating, and processing large volumes of seismological data should be improved.

  7. Review and evaluation of performance measures for survival prediction models in external validation settings

    Directory of Open Access Journals (Sweden)

    M. Shafiqur Rahman

    2017-04-01

    Full Text Available Abstract Background When developing a prediction model for survival data it is essential to validate its performance in external validation settings using appropriate performance measures. Although a number of such measures have been proposed, there is only limited guidance regarding their use in the context of model validation. This paper reviewed and evaluated a wide range of performance measures to provide some guidelines for their use in practice. Methods An extensive simulation study based on two clinical datasets was conducted to investigate the performance of the measures in external validation settings. Measures were selected from categories that assess the overall performance, discrimination and calibration of a survival prediction model. Some of these have been modified to allow their use with validation data, and a case study is provided to describe how these measures can be estimated in practice. The measures were evaluated with respect to their robustness to censoring and ease of interpretation. All measures are implemented, or are straightforward to implement, in statistical software. Results Most of the performance measures were reasonably robust to moderate levels of censoring. One exception was Harrell’s concordance measure which tended to increase as censoring increased. Conclusions We recommend that Uno’s concordance measure is used to quantify concordance when there are moderate levels of censoring. Alternatively, Gönen and Heller’s measure could be considered, especially if censoring is very high, but we suggest that the prediction model is re-calibrated first. We also recommend that Royston’s D is routinely reported to assess discrimination since it has an appealing interpretation. The calibration slope is useful for both internal and external validation settings and recommended to report routinely. Our recommendation would be to use any of the predictive accuracy measures and provide the corresponding predictive
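
    The concordance measures discussed above can be made concrete with a small, self-contained sketch. The function below is an illustrative implementation of Harrell's concordance index (not code from the paper; the data and variable names are invented): it counts pairs as comparable only where the earlier observation is an uncensored event, which is precisely the property that makes the measure drift upward as censoring grows.

    ```python
    # Illustrative sketch (not from the paper): Harrell's concordance index for a
    # survival prediction model evaluated on an external validation set.
    # Convention: a higher risk score implies a shorter expected survival time.
    import numpy as np

    def harrell_c(time, event, risk):
        """time: observed times; event: 1=event, 0=censored; risk: predicted risk scores."""
        time, event, risk = map(np.asarray, (time, event, risk))
        concordant, tied, comparable = 0.0, 0.0, 0
        n = len(time)
        for i in range(n):
            if event[i] != 1:          # a pair is usable only if the earlier time is an event
                continue
            for j in range(n):
                if time[j] > time[i]:  # subject j outlived subject i
                    comparable += 1
                    if risk[i] > risk[j]:
                        concordant += 1
                    elif risk[i] == risk[j]:
                        tied += 1
        return (concordant + 0.5 * tied) / comparable

    # toy validation data
    rng = np.random.default_rng(0)
    t = rng.exponential(10, 200)
    e = rng.integers(0, 2, 200)              # ~50% censoring
    r = -t + rng.normal(0, 3, 200)           # noisy risk score
    print(round(harrell_c(t, e, r), 3))
    ```

    Uno's measure, which the authors recommend under moderate censoring, additionally reweights pairs by the inverse probability of censoring; that weighting is omitted here for brevity.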

  8. Model Validation Using Coordinate Distance with Performance Sensitivity

    Directory of Open Access Journals (Sweden)

    Jiann-Shiun Lew

    2008-01-01

    Full Text Available This paper presents an innovative approach to model validation for a structure with significant parameter variations. Model uncertainty of the structural dynamics is quantified with the use of a singular value decomposition technique to extract the principal components of parameter change, and an interval model is generated to represent the system with parameter uncertainty. The coordinate vector, corresponding to the identified principal directions, of the validation system is computed. The coordinate distance between the validation system and the identified interval model is used as a metric for model validation. A beam structure with an attached subsystem, which has significant parameter uncertainty, is used to demonstrate the proposed approach.
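
    As a rough illustration of the approach described in this abstract, the following numpy sketch (synthetic data; the matrices, dimensions and thresholds are assumptions, not the authors' model) extracts the principal directions of parameter variation with an SVD, builds an interval model in those coordinates, and computes the coordinate distance of a validation system to that interval.

    ```python
    # Illustrative numpy sketch (not the authors' code): principal directions of
    # parameter variation via SVD, an interval model in those coordinates, and the
    # coordinate distance of a validation system to the interval.
    import numpy as np

    # rows = identified systems, columns = model parameters (synthetic data)
    rng = np.random.default_rng(1)
    P = rng.normal(size=(20, 6)) * np.array([1.0, 0.5, 0.2, 0.1, 0.05, 0.01])
    mean = P.mean(axis=0)
    U, s, Vt = np.linalg.svd(P - mean, full_matrices=False)

    k = 2                                    # keep the dominant principal directions
    coords = (P - mean) @ Vt[:k].T           # coordinates of the identified systems
    lo, hi = coords.min(axis=0), coords.max(axis=0)   # interval model in principal coordinates

    p_val = rng.normal(size=6)               # parameters of the validation system
    c_val = (p_val - mean) @ Vt[:k].T

    # coordinate distance: zero inside the interval, positive outside
    dist = np.linalg.norm(np.maximum(0, np.maximum(lo - c_val, c_val - hi)))
    print(c_val, dist)
    ```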

  9. Design validation and performance of closed loop gas recirculation system

    International Nuclear Information System (INIS)

    Kalmani, S.D.; Majumder, G.; Mondal, N.K.; Shinde, R.R.; Joshi, A.V.

    2016-01-01

A pilot experimental setup of the India-based Neutrino Observatory's ICAL detector has been operational for the last 4 years at TIFR, Mumbai. Twelve glass RPC detectors of size 2 × 2 m², with a gas gap of 2 mm, are under test in a closed loop gas recirculation system. These RPCs are continuously purged individually with a gas mixture of R134a (C2H2F4), isobutane (iC4H10) and sulphur hexafluoride (SF6) at a steady rate of 360 ml/h to maintain about one volume change a day. To economize gas mixture consumption and to reduce the effluents released into the atmosphere, a closed loop system has been designed, fabricated and installed at TIFR. The pressure and flow rate in the loop are controlled by mass flow controllers and pressure transmitters. The performance and integrity of the RPCs in the pilot experimental setup are being monitored to assess the effects of periodic fluctuations and transients in atmospheric pressure and temperature, room pressure variation, flow pulsations, uniformity of gas distribution and power failures. The capability of the closed loop gas recirculation system to respond to these changes is also studied. The conclusions from this experiment are presented. The validation of the initial design considerations and subsequent modifications has provided improved guidelines for the future design of the engineering module gas system.

  10. DEVELOPING OF INDIVIDUAL INSTRUMENT PERFORMANCE ANXIETY SCALE: VALIDITY - RELIABILITY STUDY

    Directory of Open Access Journals (Sweden)

    Esra DALKIRAN

    2014-07-01

Full Text Available The purpose of this study is to develop a scale, specific to our culture, concerning the individual instrument performance anxiety of students receiving instrument training in Departments of Music Education. The study uses the descriptive research model and qualitative research techniques. The study population consists of the students attending the 23 universities that have a Department of Music Education. The sample consists of 438 girls and 312 boys, 750 students in total, studying in the Departments of Music Education of 10 randomly selected universities. As a result of the exploratory and confirmatory factor analyses that were performed, a one-dimensional structure consisting of 14 items was obtained. In addition, t-scores, the corrected item-total correlation coefficients concerning the discriminating power of the items, and the difference in scores between the lower and upper 27% groups were calculated; both analyses showed the items to be discriminating. The Cronbach's alpha coefficient of internal consistency of the scale was calculated as .94, and the test-retest reliability coefficient as .93. As a result, a valid and reliable assessment and evaluation instrument that measures the exam performance anxiety of students studying in Departments of Music Education has been developed. Extended Abstract. Introduction. Anxiety is a universal phenomenon which people experience once or a few times during their lives. It has been described as concern for the future or as an unpleasant emotional experience regarding the probable hitches of events (Di Tomasso & Gosch, 2002). In general, occasions on which negative feelings are experienced cause anxiety to arise (Baltaş and Baltaş, 2000). People also feel anxious in dangerous situations. Anxiety may lead a person to be creative, while it may also have hindering characteristics. Anxiety is that an individual considers him

  11. Assessing Discriminative Performance at External Validation of Clinical Prediction Models.

    Directory of Open Access Journals (Sweden)

    Daan Nieboer

Full Text Available External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures to judge any changes in c-statistic from development to external validation setting. We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in the validation set. We concentrated on two scenarios: (1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and (2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. The permutation test indicated that the validation and development set were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.
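
    A simplified sketch of the quantities involved (fitted on synthetic data; the generating model, coefficients and set sizes are assumptions, not the study's simulation design) shows how the c-statistic and the spread of the linear predictor can be compared between a development set and a more heterogeneous validation set:

    ```python
    # Simplified sketch: fit a model on a development set, then compare the
    # c-statistic and the spread of the linear predictor between development and
    # external validation data to separate case-mix effects from miscalibrated
    # coefficients.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(2)

    def make_set(n, sd):                       # sd controls case-mix heterogeneity
        X = rng.normal(0, sd, size=(n, 3))
        p = 1 / (1 + np.exp(-(X @ np.array([1.0, 0.8, 0.5]))))
        return X, rng.binomial(1, p)

    X_dev, y_dev = make_set(2000, 1.0)
    X_val, y_val = make_set(2000, 1.5)         # more heterogeneous validation case-mix

    model = LogisticRegression().fit(X_dev, y_dev)
    lp_dev = model.decision_function(X_dev)    # linear predictor
    lp_val = model.decision_function(X_val)

    print("c dev:", round(roc_auc_score(y_dev, lp_dev), 3), " sd(lp):", round(lp_dev.std(), 2))
    print("c val:", round(roc_auc_score(y_val, lp_val), 3), " sd(lp):", round(lp_val.std(), 2))
    ```

    A larger standard deviation of the linear predictor in the validation set, together with a higher c-statistic, points at case-mix heterogeneity rather than at better (or worse) regression coefficients.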

  12. Reference Proteome Extracts for Mass Spec Instrument Performance Validation and Method Development

    Science.gov (United States)

    Rosenblatt, Mike; Urh, Marjeta; Saveliev, Sergei

    2014-01-01

Biological samples of high complexity are required to test protein mass spec sample preparation procedures and validate mass spec instrument performance. Total cell protein extracts provide the needed sample complexity. However, to be compatible with mass spec applications, such extracts should meet a number of design requirements: (1) compatibility with LC/MS (free of detergents, etc.); (2) high protein integrity (minimal level of protein degradation and non-biological PTMs); (3) compatibility with common sample preparation methods such as proteolysis, PTM enrichment and mass-tag labeling; and (4) lot-to-lot reproducibility. Here we describe total protein extracts from yeast and human cells that meet the above criteria. Two extract formats have been developed: intact protein extracts, with primary use for sample preparation method development and optimization, and pre-digested extracts (peptides), with primary use for instrument validation and performance monitoring.

  13. A new high-performance liquid chromatography-tandem mass spectrometry method for the determination of paclitaxel and 6α-hydroxy-paclitaxel in human plasma: Development, validation and application in a clinical pharmacokinetic study.

    Directory of Open Access Journals (Sweden)

    Bianca Posocco

Full Text Available Paclitaxel belongs to the taxane family and is used, alone or in multidrug regimens, for the therapy of several solid tumours, such as breast, lung, head and neck, and ovarian cancer. Standard dosing of chemotherapy does not take into account the many inter-patient differences that make drug exposure highly variable, thus leading to the onset of severe toxicity. This is particularly true for paclitaxel, considering that a relationship between haematological toxicity and plasma exposure has been found. Therefore, in order to treat patients with the correct dose of paclitaxel, improving the overall benefit-risk ratio, Therapeutic Drug Monitoring is necessary. In order to quantify paclitaxel and its main metabolite, 6α-hydroxy-paclitaxel, in patients' plasma, we developed a new, sensitive and specific HPLC-MS/MS method applicable to all paclitaxel dosages used in clinical routine. The developed method uses a small volume of plasma sample and is based on quick protein precipitation. The chromatographic separation of the analytes was achieved with a SunFire™ C18 column (3.5 μm, 92 Å, 2.1 x 150 mm); the mobile phases were 0.1% formic acid/bidistilled water and 0.1% formic acid/acetonitrile. The electrospray ionization source worked in positive ion mode and the mass spectrometer operated in selected reaction monitoring mode. Our bioanalytical method was successfully validated according to the FDA-EMA guidelines on bioanalytical method validation. The calibration curves were linear (R2 ≥ 0.9948) over the concentration ranges (1-10000 ng/mL for paclitaxel and 1-1000 ng/mL for 6α-hydroxy-paclitaxel) and were characterized by good accuracy and precision. The intra- and inter-day precision and accuracy were determined at three quality control concentrations for paclitaxel and 6α-hydroxy-paclitaxel and were, respectively, <9.9% and within 91.1-114.8%. In addition, to further verify the assay reproducibility, we tested this method by re

  14. Towards a realistic approach to validation of reactive transport models for performance assessment

    International Nuclear Information System (INIS)

    Siegel, M.D.

    1993-01-01

    Performance assessment calculations are based on geochemical models that assume that interactions among radionuclides, rocks and groundwaters under natural conditions, can be estimated or bound by data obtained from laboratory-scale studies. The data include radionuclide distribution coefficients, measured in saturated batch systems of powdered rocks, and retardation factors measured in short-term column experiments. Traditional approaches to model validation cannot be applied in a straightforward manner to the simple reactive transport models that use these data. An approach to model validation in support of performance assessment is described in this paper. It is based on a recognition of different levels of model validity and is compatible with the requirements of current regulations for high-level waste disposal. Activities that are being carried out in support of this approach include (1) laboratory and numerical experiments to test the validity of important assumptions inherent in current performance assessment methodologies,(2) integrated transport experiments, and (3) development of a robust coupled reaction/transport code for sensitivity analyses using massively parallel computers

  15. Development and validation of a high-performance liquid chromatography method for determination of lisinopril in human plasma by magnetic solid-phase extraction and pre-column derivatization.

    Science.gov (United States)

    Rastkari, Noushin; Ahmadkhaniha, Reza

    2018-03-01

A sensitive, reliable and simple HPLC method was developed for the determination of lisinopril in human plasma. The method consists of extraction and clean-up steps based on magnetic solid-phase extraction and pre-column derivatization with a fluorescent reagent. The mobile phase consisted of a mixture of methanol-sodium dihydrogen phosphate (pH 3.0; 0.005 M; 75:25, v/v). The flow rate was set at 0.7 mL/min. Fluorescence detection was performed at 470 nm excitation and 530 nm emission wavelengths. Total chromatography run time was 5 min. The average extraction recovery of lisinopril and fluvoxamine (internal standard) was ≥82.8%. The limits of detection and quantification were determined as 1 and 3 ng/mL, respectively. The method exhibited a linear calibration line over the concentration range of 3-1000 ng/mL with a coefficient of determination (r²) of ≥0.98. The within-run and between-run precisions were satisfactory, with CV values of 1.8-12.8% (accuracy from 99.2 to 94.7%) and 2.4-13.7% (accuracy from 99.5 to 92.2%), respectively. These developments led to considerable improvement in method sensitivity and reliability. The method was validated according to the US Food and Drug Administration guidelines. Therefore, it can be considered a suitable method for the determination of lisinopril in plasma samples. Copyright © 2017 John Wiley & Sons, Ltd.
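
    The calibration and precision figures quoted in such validations come from straightforward arithmetic. The sketch below uses invented standard and QC numbers (not the published data) to show how the calibration line, r², back-calculated accuracy and within-run CV are typically obtained:

    ```python
    # Generic sketch of the calibration/precision arithmetic used in bioanalytical
    # method validation (synthetic numbers): fit the calibration line, check r^2,
    # then express QC results as accuracy (%) and precision (CV %).
    import numpy as np
    from scipy.stats import linregress

    conc = np.array([3, 10, 50, 100, 500, 1000], dtype=float)   # ng/mL standards
    resp = np.array([0.9, 3.1, 15.2, 30.5, 151.0, 305.0])       # peak-area ratios
    fit = linregress(conc, resp)
    print("slope", round(fit.slope, 4), "r^2", round(fit.rvalue**2, 4))

    qc_nominal = 100.0                                          # ng/mL
    qc_resp = np.array([30.1, 29.4, 31.0, 30.7, 29.9])          # replicate injections
    qc_found = (qc_resp - fit.intercept) / fit.slope            # back-calculated conc.
    accuracy = 100 * qc_found.mean() / qc_nominal
    cv = 100 * qc_found.std(ddof=1) / qc_found.mean()           # within-run precision
    print(f"accuracy {accuracy:.1f}%  CV {cv:.1f}%")
    ```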

  16. Identifying and Validating a Model of Interpersonal Performance Dimensions

    National Research Council Canada - National Science Library

    Carpenter, Tara

    2004-01-01

    .... Two studies were then completed to validate the proposed taxonomy. In the first study empirical evidence for the taxonomy was gathered using a content analysis of critical incidents taken from a job analysis...

  17. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Uranium Metal, Oxide, and Solution Systems on the High Performance Computing Platform Moonlight

    Energy Technology Data Exchange (ETDEWEB)

    Chapman, Bryan Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); MacQuigg, Michael Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wysong, Andrew Russell [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-04-21

In this document, the code MCNP is validated with ENDF/B-VII.1 cross section data under the purview of ANSI/ANS-8.24-2007, for use with uranium systems. MCNP is a computer code based on Monte Carlo transport methods. While MCNP has wide-ranging capability in nuclear transport simulation, this validation is limited to the functionality related to neutron transport and calculation of criticality parameters such as keff.

  18. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Uranium Metal, Oxide, and Solution Systems on the High Performance Computing Platform Moonlight

    International Nuclear Information System (INIS)

    Chapman, Bryan Scott; MacQuigg, Michael Robert; Wysong, Andrew Russell

    2016-01-01

In this document, the code MCNP is validated with ENDF/B-VII.1 cross section data under the purview of ANSI/ANS-8.24-2007, for use with uranium systems. MCNP is a computer code based on Monte Carlo transport methods. While MCNP has wide-ranging capability in nuclear transport simulation, this validation is limited to the functionality related to neutron transport and calculation of criticality parameters such as keff.
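
    As a hedged illustration of how such a validation is usually summarized (the keff values, administrative margin and tolerance factor below are invented, not taken from the report), the bias of calculated keff against critical benchmarks and a simple upper subcritical limit can be computed as follows:

    ```python
    # Hedged illustration (synthetic numbers): typical summary statistics for a
    # criticality-code validation: mean bias of calculated keff against critical
    # benchmarks and a simple upper subcritical limit with an administrative margin.
    import numpy as np

    k_calc = np.array([0.9981, 1.0012, 0.9995, 1.0003, 0.9978, 1.0021])  # code results
    k_exp = np.ones_like(k_calc)                                         # critical benchmarks
    bias = (k_calc - k_exp).mean()
    bias_sd = (k_calc - k_exp).std(ddof=1)

    margin = 0.05                 # administrative margin (assumed value)
    coverage = 2.0                # stand-in for a one-sided tolerance factor (assumed)
    usl = 1.0 + min(bias, 0.0) - coverage * bias_sd - margin
    print(f"bias {bias:+.4f}  sd {bias_sd:.4f}  upper subcritical limit {usl:.4f}")
    ```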

  19. Learning Apache Solr high performance

    CERN Document Server

    Mohan, Surendra

    2014-01-01

This book is an easy-to-follow guide, full of hands-on, real-world examples. Each topic is explained and demonstrated in a specific and user-friendly flow, from search optimization using Solr to the deployment of ZooKeeper applications. This book is ideal for Apache Solr developers who want to learn different techniques to optimize Solr performance with utmost efficiency, along with effectively troubleshooting the problems that usually occur while trying to boost performance. Familiarity with search servers and database querying is expected.

  20. High-performance composite chocolate

    Science.gov (United States)

    Dean, Julian; Thomson, Katrin; Hollands, Lisa; Bates, Joanna; Carter, Melvyn; Freeman, Colin; Kapranos, Plato; Goodall, Russell

    2013-07-01

    The performance of any engineering component depends on and is limited by the properties of the material from which it is fabricated. It is crucial for engineering students to understand these material properties, interpret them and select the right material for the right application. In this paper we present a new method to engage students with the material selection process. In a competition-based practical, first-year undergraduate students design, cost and cast composite chocolate samples to maximize a particular performance criterion. The same activity could be adapted for any level of education to introduce the subject of materials properties and their effects on the material chosen for specific applications.

  1. High-Performance Composite Chocolate

    Science.gov (United States)

    Dean, Julian; Thomson, Katrin; Hollands, Lisa; Bates, Joanna; Carter, Melvyn; Freeman, Colin; Kapranos, Plato; Goodall, Russell

    2013-01-01

    The performance of any engineering component depends on and is limited by the properties of the material from which it is fabricated. It is crucial for engineering students to understand these material properties, interpret them and select the right material for the right application. In this paper we present a new method to engage students with…

  2. Toward High-Performance Organizations.

    Science.gov (United States)

    Lawler, Edward E., III

    2002-01-01

    Reviews management changes that companies have made over time in adopting or adapting four approaches to organizational performance: employee involvement, total quality management, re-engineering, and knowledge management. Considers future possibilities and defines a new view of what constitutes effective organizational design in management.…

  3. Validating the Assessment for Measuring Indonesian Secondary School Students Performance in Ecology

    Science.gov (United States)

    Rachmatullah, A.; Roshayanti, F.; Ha, M.

    2017-09-01

The aims of this study are to validate the American Association for the Advancement of Science (AAAS) Ecology assessment and to examine the performance of Indonesian secondary school students on the assessment. A total of 611 Indonesian secondary school students (218 middle school students and 393 high school students) participated in the study. Forty-five items of the AAAS assessment on the topic of Interdependence in Ecosystems were divided into two versions, each containing 21 shared items. The linking-item method was used to combine the two versions of the assessment, and Rasch analyses were then used to validate the instrument. An independent-samples t-test was also run to compare the performance of Indonesian students and American students based on mean item difficulty. We found that, of the 45 items, three were identified as misfitting. We also found that both Indonesian middle and high school students performed significantly lower than American students, with very large and medium effect sizes, respectively. We discuss our findings with regard to validation issues and their connection to Indonesian students' science literacy.
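
    The group comparison described above can be sketched with an independent-samples t-test and an effect size; the numbers below are synthetic stand-ins for item-difficulty-based scores, not the study's data:

    ```python
    # Illustrative sketch (made-up numbers): independent-samples t-test and Cohen's d
    # for comparing two groups of students on a score derived from item difficulty.
    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(3)
    group_a = rng.normal(0.45, 0.12, 393)    # e.g., proportion-correct style scores
    group_b = rng.normal(0.60, 0.12, 500)

    t, p = ttest_ind(group_a, group_b, equal_var=False)   # Welch's t-test
    pooled_sd = np.sqrt((group_a.var(ddof=1) + group_b.var(ddof=1)) / 2)
    d = (group_b.mean() - group_a.mean()) / pooled_sd     # Cohen's d effect size
    print(f"t={t:.2f}  p={p:.2g}  d={d:.2f}")
    ```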

  4. Validity of the German Version of the Continuous-Scale Physical Functional Performance 10 Test

    Directory of Open Access Journals (Sweden)

    Irene Härdi

    2017-01-01

Full Text Available Background. The Continuous-Scale Physical Functional Performance 10 Test (CS-PFP 10) quantitatively assesses physical functional performance in older adults who have a broad range of physical functional ability. This study assessed the validity and reliability of the CS-PFP 10 German version. Methods. Forward-translations and backtranslations as well as cultural adaptations of the test were conducted. Participants were German-speaking Swiss community-dwelling adults aged 64 and older. Concurrent validity was assessed using Pearson correlation coefficients between the CS-PFP 10 and gait velocity, the Timed Up and Go Test, hand grip strength, the SF-36 physical function domain, and the Freiburger Physical Activity Questionnaire. Internal consistency was calculated by Cronbach's alpha. Results. The backtranslation and cultural adaptations were accepted by the CS-PFP 10 developer. The CS-PFP 10 total score and subscores (upper body strength, upper body flexibility, lower body strength, balance and coordination, and endurance) correlated significantly with all measures of physical function tested. Internal consistency was high (Cronbach's alpha 0.95–0.98). Conclusion. The CS-PFP 10 German version is valid and reliable for measuring physical functional performance in German-speaking Swiss community-dwelling older adults. Quantifying physical function is essential for clinical practice and research and provides meaningful insight into the physical functional performance of older adults. This trial is registered with ClinicalTrials.gov NCT01539200.
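
    Both statistics reported here, Cronbach's alpha for internal consistency and Pearson correlations for concurrent validity, are easy to compute from an item-score matrix. The sketch below uses simulated data (all values are assumptions, not the study's measurements):

    ```python
    # Minimal sketch with synthetic data: Cronbach's alpha for internal consistency
    # and a Pearson correlation for concurrent validity.
    import numpy as np
    from scipy.stats import pearsonr

    def cronbach_alpha(items):
        """items: array of shape (n_subjects, n_items)."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    rng = np.random.default_rng(4)
    latent = rng.normal(size=150)                                  # true ability/function
    scores = latent[:, None] + rng.normal(0, 0.5, size=(150, 10))  # 10 correlated items
    print("alpha", round(cronbach_alpha(scores), 2))

    other_measure = latent + rng.normal(0, 0.8, 150)               # e.g., gait velocity
    r, p = pearsonr(scores.sum(axis=1), other_measure)
    print("Pearson r", round(r, 2))
    ```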

  5. Functional High Performance Financial IT

    DEFF Research Database (Denmark)

    Berthold, Jost; Filinski, Andrzej; Henglein, Fritz

    2011-01-01

The world of finance faces the computational performance challenge of massively expanding data volumes, extreme response time requirements, and compute-intensive complex (risk) analyses. Simultaneously, new international regulatory rules require considerably more transparency and external auditability of financial institutions, including their software systems. To top it off, increased product variety and customisation necessitates shorter software development cycles and higher development productivity. In this paper, we report about HIPERFIT, a recently established strategic research center at the University of Copenhagen that attacks this triple challenge of increased performance, transparency and productivity in the financial sector by a novel integration of financial mathematics, domain-specific language technology, parallel functional programming, and emerging massively parallel hardware.

  6. High performance Mo adsorbent PZC

    Energy Technology Data Exchange (ETDEWEB)

    Anon,

    1998-10-01

We have developed Mo adsorbents for a natural Mo(n,γ)99Mo-99mTc generator. Among them, the highest-performance adsorbent, which we call PZC, can adsorb about 250 mg Mo/g. In this report, we show the structure, the Mo adsorption mechanism, and other properties of PZC that are useful when carrying out examinations of Mo adsorption and elution of 99mTc. (author)

  7. Reliable and valid assessment of performance in thoracoscopy

    DEFF Research Database (Denmark)

    Konge, Lars; Lehnert, Per; Hansen, Henrik Jessen

    2012-01-01

    BACKGROUND: As we move toward competency-based education in medicine, we have lagged in developing competency-based evaluation methods. In the era of minimally invasive surgery, there is a need for a reliable and valid tool dedicated to measure competence in video-assisted thoracoscopic surgery....... The purpose of this study is to create such an assessment tool, and to explore its reliability and validity. METHODS: An expert group of physicians created an assessment tool consisting of 10 items rated on a five-point rating scale. The following factors were included: economy and confidence of movement...

  8. Validation and computing and performance studies for the ATLAS simulation

    CERN Document Server

    Marshall, Z; The ATLAS collaboration

    2009-01-01

We present the validation of the ATLAS simulation software project. Software development is controlled by nightly builds and several levels of automatic tests to ensure stability. Computing validation, including CPU time, memory, and disk space required per event, is benchmarked for all software releases. Several different physics processes and event types are checked to thoroughly test all aspects of the detector simulation. The robustness of the simulation software is demonstrated by the production of 500 million events on the World-wide LHC Computing Grid in the last year.

  9. Indoor Air Quality in High Performance Schools

    Science.gov (United States)

    High performance schools are facilities that improve the learning environment while saving energy, resources, and money. The key is understanding the lifetime value of high performance schools and effectively managing priorities, time, and budget.

  10. High performance inertial fusion targets

    International Nuclear Information System (INIS)

    Nuckolls, J.H.; Bangerter, R.O.; Lindl, J.D.; Mead, W.C.; Pan, Y.L.

    1977-01-01

    Inertial confinement fusion (ICF) designs are considered which may have very high gains (approximately 1000) and low power requirements (<100 TW) for input energies of approximately one megajoule. These include targets having very low density shells, ultra thin shells, central ignitors, magnetic insulation, and non-ablative acceleration

  11. High performance inertial fusion targets

    International Nuclear Information System (INIS)

    Nuckolls, J.H.; Bangerter, R.O.; Lindl, J.D.; Mead, W.C.; Pan, Y.L.

    1978-01-01

    Inertial confinement fusion (ICF) target designs are considered which may have very high gains (approximately 1000) and low power requirements (< 100 TW) for input energies of approximately one megajoule. These include targets having very low density shells, ultra thin shells, central ignitors, magnetic insulation, and non-ablative acceleration

  12. High performance nuclear fuel element

    International Nuclear Information System (INIS)

    Mordarski, W.J.; Zegler, S.T.

    1980-01-01

A fuel-pellet composition is disclosed for use in fast breeder reactors. Uranium carbide particles are mixed with a powder of uranium-plutonium carbides having a stable microstructure. The resulting mixture is formed into fuel pellets. The pellets thus produced exhibit a relatively low propensity to swell while maintaining a high density.

  13. Buffer-Free High Performance Liquid Chromatography Method for ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a simple, economical and reproducible high performance liquid chromatographic (HPLC) method for the determination of theophylline in pharmaceutical dosage forms. Method: Caffeine was used as the internal standard and reversed phase C-18 column was used to elute the drug and ...

  14. High Performance JavaScript

    CERN Document Server

    Zakas, Nicholas

    2010-01-01

    If you're like most developers, you rely heavily on JavaScript to build interactive and quick-responding web applications. The problem is that all of those lines of JavaScript code can slow down your apps. This book reveals techniques and strategies to help you eliminate performance bottlenecks during development. You'll learn how to improve execution time, downloading, interaction with the DOM, page life cycle, and more. Yahoo! frontend engineer Nicholas C. Zakas and five other JavaScript experts -- Ross Harmes, Julien Lecomte, Steven Levithan, Stoyan Stefanov, and Matt Sweeney -- demonstra

  15. Validation of OCM-2 sensor performance in retrieving chlorophyll ...

    Indian Academy of Sciences (India)

    Ocean colour; chlorophyll a; total suspended matter; validation; Bay of Bengal; OCM-2. J. Earth Syst. Sci. 122 ... two basins, the Arabian Sea and Bay of Bengal. (BoB). Arabian ... The capability of visible bands of multi-spectral satellite data has ...

  16. Comparison of the performances and validation of three methods for ...

    African Journals Online (AJOL)

    SARAH

    2014-02-28

    Feb 28, 2014 ... bacteria in Norwegian slaughter pigs. Int J. Food Microbiol 1, 301–309. [NCFA] Nordic Committee of Food Analysis (1996). Yersinia enterocolitica Detection in foods 117,. 3rd,edn,1-12. Nowak, B., Mueffling, T.V., Caspari, K. and Hartung, J. 2006 Validation of a method for the detection of virulent Yersinia ...

  17. Carpet Aids Learning in High Performance Schools

    Science.gov (United States)

    Hurd, Frank

    2009-01-01

    The Healthy and High Performance Schools Act of 2002 has set specific federal guidelines for school design, and developed a federal/state partnership program to assist local districts in their school planning. According to the Collaborative for High Performance Schools (CHPS), high-performance schools are, among other things, healthy, comfortable,…

  18. A Valid and Reliable Tool to Assess Nursing Students` Clinical Performance

    OpenAIRE

    Mehrnoosh Pazargadi; Tahereh Ashktorab; Sharareh Khosravi; Hamid Alavi majd

    2013-01-01

Background: The necessity of a valid and reliable assessment tool is one of the most frequently raised issues in nursing students' clinical evaluation. However, it is believed that the present tools are mostly not valid and cannot assess students' performance properly. Objectives: This study was conducted to design a valid and reliable assessment tool for evaluating nursing students' performance in clinical education. Methods: In this methodological study, considering the definition of nursing students' performance; th...

  19. Validating Future Force Performance Measures (Army Class): End of Training Longitudinal Validation

    Science.gov (United States)

    2009-09-01

Caramagno, John Fisher, Patricia Keenan, Julisara Mathew, Alicia Sawyer, Jim Takitch, Shonna Waters, and Elise Weaver Drasgow Consulting Group...promise for enhancing the classification of entry-level Soldiers (Ingerick, Diaz, & Putka, 2009). In Year 2 (2007), the emphasis of the Army...Social Sciences. Ingerick, M., Diaz, T., & Putka, D. (2009). Investigations into Army enlisted classification systems: Concurrent validation report

  20. High performance electromagnetic simulation tools

    Science.gov (United States)

    Gedney, Stephen D.; Whites, Keith W.

    1994-10-01

Army Research Office Grant #DAAH04-93-G-0453 has supported the purchase of 24 additional compute nodes that were installed in the Intel iPSC/860 hypercube at the University of Kentucky (UK), rendering a 32-node multiprocessor. This facility has allowed the investigators to explore and extend the boundaries of electromagnetic simulation for important areas of defense concern, including microwave monolithic integrated circuit (MMIC) design/analysis and electromagnetic materials research and development. The iPSC/860 has also provided an ideal platform for MMIC circuit simulations. A number of parallel methods based on direct time-domain solutions of Maxwell's equations have been developed on the iPSC/860, including a parallel finite-difference time-domain (FDTD) algorithm and a parallel planar generalized Yee-algorithm (PGY). The iPSC/860 has also provided an ideal platform on which to develop a 'virtual laboratory' to numerically analyze, scientifically study and develop new types of materials with beneficial electromagnetic properties. These materials simulations are capable of assembling hundreds of microscopic inclusions from which an electromagnetic full-wave solution will be obtained in toto. This powerful simulation tool has enabled research of the full-wave analysis of complex multicomponent MMIC devices and the electromagnetic properties of many types of materials to be performed numerically rather than strictly in the laboratory.
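
    For readers unfamiliar with the FDTD method mentioned above, the toy one-dimensional sketch below shows the leap-frog Yee update that such parallel codes distribute across processors; it is purely illustrative (normalized units, Courant factor 0.5, simple reflecting boundaries) and is not related to the grant's actual software:

    ```python
    # Toy 1-D FDTD sketch: staggered E/H leap-frog update with a soft Gaussian source.
    # Boundaries are left untreated (effectively reflecting); no ABCs for brevity.
    import numpy as np

    nx, nt = 400, 500
    ez = np.zeros(nx)              # electric field
    hy = np.zeros(nx)              # magnetic field, staggered half a cell

    for n in range(nt):
        ez[1:] += 0.5 * (hy[:-1] - hy[1:])              # E update (normalized units)
        ez[50] += np.exp(-0.5 * ((n - 40) / 12) ** 2)   # soft Gaussian source at cell 50
        hy[:-1] += 0.5 * (ez[:-1] - ez[1:])             # H update, half a step later

    print("peak |Ez| after propagation:", round(np.abs(ez).max(), 3))
    ```

    A parallel implementation partitions the grid among processors and exchanges one layer of boundary cells per time step, which is the communication pattern the abstract's hypercube codes exploit.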

  1. High-Performance Data Converters

    DEFF Research Database (Denmark)

    Steensgaard-Madsen, Jesper

    -resolution internal D/A converters are required. Unit-element mismatch-shaping D/A converters are analyzed, and the concept of mismatch-shaping is generalized to include scaled-element D/A converters. Several types of scaled-element mismatch-shaping D/A converters are proposed. Simulations show that, when implemented...... in a standard CMOS technology, they can be designed to yield 100 dB performance at 10 times oversampling. The proposed scaled-element mismatch-shaping D/A converters are well suited for use as the feedback stage in oversampled delta-sigma quantizers. It is, however, not easy to make full use of their potential......-order difference of the output signal from the loop filter's first integrator stage. This technique avoids the need for accurate matching of analog and digital filters that characterizes the MASH topology, and it preserves the signal-band suppression of quantization errors. Simulations show that quantizers...

  2. High performance soft magnetic materials

    CERN Document Server

    2017-01-01

    This book provides comprehensive coverage of the current state-of-the-art in soft magnetic materials and related applications, with particular focus on amorphous and nanocrystalline magnetic wires and ribbons and sensor applications. Expert chapters cover preparation, processing, tuning of magnetic properties, modeling, and applications. Cost-effective soft magnetic materials are required in a range of industrial sectors, such as magnetic sensors and actuators, microelectronics, cell phones, security, automobiles, medicine, health monitoring, aerospace, informatics, and electrical engineering. This book presents both fundamentals and applications to enable academic and industry researchers to pursue further developments of these key materials. This highly interdisciplinary volume represents essential reading for researchers in materials science, magnetism, electrodynamics, and modeling who are interested in working with soft magnets. Covers magnetic microwires, sensor applications, amorphous and nanocrystalli...

  3. High performance polyethylene nanocomposite fibers

    Directory of Open Access Journals (Sweden)

    A. Dorigato

    2012-12-01

Full Text Available A high density polyethylene (HDPE) matrix was melt compounded with 2 vol% of dimethyldichlorosilane-treated fumed silica nanoparticles. Nanocomposite fibers were prepared by melt spinning through a co-rotating twin screw extruder and drawing at 125°C in air. The thermo-mechanical and morphological properties of the resulting fibers were then investigated. The introduction of nanosilica improved the drawability of the fibers, allowing the achievement of higher draw ratios with respect to the neat matrix. The elastic modulus and creep stability of the fibers were remarkably improved upon nanofiller addition, with retention of the pristine tensile properties at break. Transmission electron microscope (TEM) images showed that the original morphology of the silica aggregates was disrupted by the applied drawing.

  4. Development and validation status of the IFMIF High Flux Test Module

    International Nuclear Information System (INIS)

    Arbeiter, Frederik; Abou-Sena, Ali; Chen Yuming; Dolensky, Bernhard; Heupel, Tobias; Klein, Christine; Scheel, Nicola; Schlindwein, Georg

    2011-01-01

    The development of the IFMIF (International Fusion Material Irradiation Facility) High Flux Test Module in the EVEDA (Engineering Validation and Engineering Design Activities) phase up to 2013 includes conceptual design, engineering analyses, as well as design and engineering validation by building of prototypes and their testing. The High Flux Test Module is the device to facilitate the irradiation of SSTT samples of RAFM steels at temperatures 250-550 deg. C and up to an accumulated irradiation damage of 150 dpa. The requirements, the current design and the performance of the module are discussed, and the development process is outlined.

  5. Development and validation status of the IFMIF High Flux Test Module

    Energy Technology Data Exchange (ETDEWEB)

    Arbeiter, Frederik, E-mail: frederik.arbeiter@kit.edu [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology (KIT-INR), Karlsruhe (Germany); Abou-Sena, Ali; Chen Yuming; Dolensky, Bernhard; Heupel, Tobias; Klein, Christine; Scheel, Nicola; Schlindwein, Georg [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology (KIT-INR), Karlsruhe (Germany)

    2011-10-15

    The development of the IFMIF (International Fusion Material Irradiation Facility) High Flux Test Module in the EVEDA (Engineering Validation and Engineering Design Activities) phase up to 2013 includes conceptual design, engineering analyses, as well as design and engineering validation by building of prototypes and their testing. The High Flux Test Module is the device to facilitate the irradiation of SSTT samples of RAFM steels at temperatures 250-550 deg. C and up to an accumulated irradiation damage of 150 dpa. The requirements, the current design and the performance of the module are discussed, and the development process is outlined.

  6. Development and application of a validated stability-indicating high-performance liquid chromatographic method using photodiode array detection for simultaneous determination of granisetron, methylparaben, propylparaben, sodium benzoate, and their main degradation products in oral pharmaceutical preparations.

    Science.gov (United States)

    Hewala, Ismail; El-Fatatry, Hamed; Emam, Ehab; Mabrouk, Mokhtar

    2011-01-01

    A simple, rapid, and sensitive RP-HPLC method using photodiode array detection was developed and validated for the simultaneous determination of granisetron hydrochloride, 1-methyl-1H-indazole-3-carboxylic acid (the main degradation product of granisetron), sodium benzoate, methylparaben, propylparaben, and 4-hydroxybenzoic acid (the main degradation product of parabens) in granisetron oral drops and solutions. The separation of the compounds was achieved within 8 min on a SymmetryShield RP18 column (100 x 4.6 mm id, 3.5 microm particle size) using the mobile phase acetonitrile--0.05 M KH2PO4 buffered to pH 3 using H3PO4 (3+7, v/v). The photodiode array detector was used to test the purity of the peaks, and the chromatograms were extracted at 240 nm. The method was validated, and validation acceptance criteria were met in all cases. The robust method was successfully applied to the determination of granisetron and preservatives, as well as their degradation products in different batches of granisetron oral drops and solutions. The method proved to be sensitive for determination down to 0.04% (w/w) of granisetron degradation product relative to granisetron and 0.03% (w/w) 4-hydroxybenzoic acid relative to total parabens.

  7. Development and validation of a high performance liquid chromatographic method for the determination of oxcarbazepine and its main metabolites in human plasma and cerebrospinal fluid and its application to pharmacokinetic study.

    Science.gov (United States)

    Kimiskidis, Vasilios; Spanakis, Marios; Niopas, Ioannis; Kazis, Dimitrios; Gabrieli, Chrysi; Kanaze, Feras Imad; Divanoglou, Daniil

    2007-01-17

An isocratic reversed-phase HPLC-UV procedure for the determination of oxcarbazepine and its main metabolites 10-hydroxy-10,11-dihydrocarbamazepine and 10,11-dihydroxy-trans-10,11-dihydrocarbamazepine in human plasma and cerebrospinal fluid has been developed and validated. After addition of bromazepam as internal standard, the analytes were isolated from plasma and cerebrospinal fluid by liquid-liquid extraction. Separation was achieved on an X-TERRA C18 column using a mobile phase composed of 20 mM KH2PO4, acetonitrile, and n-octylamine (76:24:0.05, v/v/v) at 40 °C, with detection at 237 nm. The described assay was validated in terms of linearity, accuracy, precision, recovery and lower limit of quantification according to the FDA validation guidelines. Calibration curves were linear with a correlation coefficient (r) greater than 0.998. Accuracy ranged from 92.3% to 106.0% and precision was between 2.3% and 8.2%. The method has been applied to plasma and cerebrospinal fluid samples obtained from patients treated with oxcarbazepine, both in monotherapy and adjunctive therapy.

  8. HIGH-PERFORMANCE COATING MATERIALS

    Energy Technology Data Exchange (ETDEWEB)

    SUGAMA,T.

    2007-01-01

    Corrosion, erosion, oxidation, and fouling by scale deposits impose critical issues in selecting the metal components used at geothermal power plants operating at brine temperatures up to 300 C. Replacing these components is very costly and time consuming. Currently, components made of titanium alloy and stainless steel commonly are employed for dealing with these problems. However, another major consideration in using these metals is not only that they are considerably more expensive than carbon steel, but also the susceptibility of corrosion-preventing passive oxide layers that develop on their outermost surface sites to reactions with brine-induced scales, such as silicate, silica, and calcite. Such reactions lead to the formation of strong interfacial bonds between the scales and oxide layers, causing the accumulation of multiple layers of scales, and the impairment of the plant component's function and efficacy; furthermore, a substantial amount of time is entailed in removing them. This cleaning operation essential for reusing the components is one of the factors causing the increase in the plant's maintenance costs. If inexpensive carbon steel components could be coated and lined with cost-effective high-hydrothermal temperature stable, anti-corrosion, -oxidation, and -fouling materials, this would improve the power plant's economic factors by engendering a considerable reduction in capital investment, and a decrease in the costs of operations and maintenance through optimized maintenance schedules.

  9. Development and validation of a simple high performance thin layer chromatography method combined with direct 1,1-diphenyl-2-picrylhydrazyl assay to quantify free radical scavenging activity in wine.

    Science.gov (United States)

    Agatonovic-Kustrin, Snezana; Morton, David W; Yusof, Ahmad P

    2016-04-15

The aim of this study was to: (a) develop a simple high performance thin layer chromatographic (HPTLC) method combined with a direct 1,1-diphenyl-2-picrylhydrazyl (DPPH) assay to rapidly assess and compare the free radical scavenging (antioxidant) activity of the major classes of polyphenolics present in wines; and (b) investigate the relationship of free radical scavenging activity to the total polyphenolic content (TPC) and total antioxidant capacity (TAC) of the wine samples. The most potent free radical scavengers that we tested for in the wine samples were found to be resveratrol (a polyphenolic non-flavonoid) and rutin (a flavonoid), while the polyphenolic acids (caffeic acid and gallic acid), although present in all wine samples, were found to be less potent free radical scavengers. Therefore, the total antioxidant capacity was mostly affected by the presence of resveratrol and rutin, while the total polyphenolic content was mostly influenced by the presence of the less potent free radical scavengers gallic and caffeic acids. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Precise Quantitative Assessment of the Clinical Performances of Two High-Flux Polysulfone Hemodialyzers in Hemodialysis: Validation of a Blood-Based Simple Kinetic Model Versus Direct Dialysis Quantification.

    Science.gov (United States)

    Lim, Paik-Seong; Lin, Yuyu; Chen, Minfeng; Xu, Xiaoqi; Shi, Yun; Bowry, Sudhir; Canaud, Bernard

    2018-05-01

Highly permeable dialysis membranes with better-designed filters have contributed to improved solute removal and dialysis efficacy. However, solute membrane permeability needs to be well controlled to avoid increased loss of albumin, which is considered detrimental for dialysis patients. A novel high-flux dialyzer type (FX CorDiax; Fresenius Medical Care), incorporating an advanced polysulfone membrane modified with nano-controlled spinning technology to enhance the elimination of a broader spectrum of uremic toxins, has been released. The aim of this study was to compare in the clinical setting two dialyzer types having the same surface area, the current (FX dialyzer) and the new dialyzer generation (FX CorDiax), with respect to solute removal capacity over a broad spectrum of markers, including assessment of albumin loss based on a direct dialysis quantification method. We performed a crossover study following an A1-B-A2 design involving 10 patients. Phase A1 was 1 week of thrice-weekly bicarbonate hemodialysis with the FX dialyzer, 4 h per treatment; phase B was performed with a similar treatment regimen but with the new FX CorDiax dialyzer; and finally phase A2 was repeated with the FX dialyzer as in the first phase. Solute removal markers of interest were assessed from blood samples taken before and after treatment and from total spent dialysate collection (direct dialysis quantification), permitting a mass transfer calculation (mg/session into total spent dialysate/ultrafiltrate). On the blood side, there were no significant differences in solute percent reduction between FX CorDiax 80 and FX 80. On the dialysate side, no difference was observed regarding the eliminated mass of different solutes, including β2-microglobulin (143.1 ± 33.6 vs. 138.3 ± 41.9 mg, P = 0.8), while the solute mass removal of total protein (1.65 ± 0.51 vs. 2.14 ± 0.75 g, P = 0.04), and albumin (0.41 ± 0.21 vs. 1.22 ± 0.51 g, P < 0.001) were
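
    The two quantification routes named in this abstract, blood-side percent reduction and dialysate-side direct dialysis quantification, reduce to simple arithmetic. The sketch below uses invented concentrations and volumes, not the trial's data:

    ```python
    # Simple sketch (invented numbers): blood-side percent reduction from pre/post
    # concentrations, and dialysate-side mass removal by direct dialysis
    # quantification (mean dialysate concentration x total spent dialysate volume).
    def percent_reduction(c_pre, c_post):
        return 100.0 * (c_pre - c_post) / c_pre

    def mass_removed(c_dialysate_mg_per_l, dialysate_volume_l):
        return c_dialysate_mg_per_l * dialysate_volume_l   # mg per session

    b2m_pre, b2m_post = 28.0, 7.5        # beta-2-microglobulin, mg/L (illustrative)
    print(f"percent reduction {percent_reduction(b2m_pre, b2m_post):.1f}%")
    print(f"mass removed {mass_removed(1.1, 130):.0f} mg")   # 1.1 mg/L in 130 L dialysate
    ```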

  11. Combined Heat Transfer in High-Porosity High-Temperature Fibrous Insulations: Theory and Experimental Validation

    Science.gov (United States)

    Daryabeigi, Kamran; Cunnington, George R.; Miller, Steve D.; Knutson, Jeffry R.

    2010-01-01

    Combined radiation and conduction heat transfer through various high-temperature, high-porosity, unbonded (loose) fibrous insulations was modeled based on first principles. The diffusion approximation was used for modeling the radiation component of heat transfer in the optically thick insulations. The relevant parameters needed for the heat transfer model were derived from experimental data. Semi-empirical formulations were used to model the solid conduction contribution of heat transfer in fibrous insulations with the relevant parameters inferred from thermal conductivity measurements at cryogenic temperatures in a vacuum. The specific extinction coefficient for radiation heat transfer was obtained from high-temperature steady-state thermal measurements with large temperature gradients maintained across the sample thickness in a vacuum. Standard gas conduction modeling was used in the heat transfer formulation. This heat transfer modeling methodology was applied to silica, two types of alumina, and a zirconia-based fibrous insulation, and to a variation of opacified fibrous insulation (OFI). OFI is a class of insulations manufactured by embedding efficient ceramic opacifiers in various unbonded fibrous insulations to significantly attenuate the radiation component of heat transfer. The heat transfer modeling methodology was validated by comparison with more rigorous analytical solutions and with standard thermal conductivity measurements. The validated heat transfer model is applicable to various densities of these high-porosity insulations as long as the fiber properties are the same (index of refraction, size distribution, orientation, and length). Furthermore, the heat transfer data for these insulations can be obtained at any static pressure in any working gas environment without the need to perform tests in various gases at various pressures.
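
    A minimal sketch of the combined-mode model described above, assuming illustrative parameter values rather than the measured ones, treats radiation with the optically thick diffusion approximation and adds semi-empirical solid- and gas-conduction terms:

    ```python
    # Illustrative sketch (assumed parameter values, not the paper's data): effective
    # conductivity of a fibrous insulation as the sum of solid conduction, gas
    # conduction and a diffusion-approximation radiative term,
    # k_rad = 16*sigma*n^2*T^3 / (3*e*rho), with e the specific extinction
    # coefficient (m^2/kg) and rho the bulk density (kg/m^3).
    SIGMA = 5.670374419e-8        # Stefan-Boltzmann constant, W m^-2 K^-4

    def k_effective(T, rho=48.0, e=25.0, n=1.0, k_solid0=5e-3, k_gas=0.0):
        """T in K; all parameter values are placeholders for illustration."""
        k_rad = 16.0 * SIGMA * n**2 * T**3 / (3.0 * e * rho)
        k_solid = k_solid0 * (T / 300.0) ** 0.5     # assumed semi-empirical form
        return k_solid + k_gas + k_rad              # gas term set to zero in vacuum

    for T in (300.0, 800.0, 1300.0):
        print(T, "K ->", round(k_effective(T), 4), "W/m-K")
    ```

    Setting the gas term to zero mirrors the paper's strategy of isolating the solid-conduction parameters from measurements in a vacuum, while the T³ radiative term dominates at high temperature.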

  12. Delivering high performance BWR fuel reliably

    International Nuclear Information System (INIS)

    Schardt, J.F.

    1998-01-01

    Utilities are under intense pressure to reduce their production costs in order to compete in the increasingly deregulated marketplace. They need fuel, which can deliver high performance to meet demanding operating strategies. GE's latest BWR fuel design, GE14, provides that high performance capability. GE's product introduction process assures that this performance will be delivered reliably, with little risk to the utility. (author)

  13. Performing Verification and Validation in Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  14. Thermomechanical simulations and experimental validation for high speed incremental forming

    Science.gov (United States)

    Ambrogio, Giuseppina; Gagliardi, Francesco; Filice, Luigino; Romero, Natalia

    2016-10-01

Incremental sheet forming (ISF) consists in deforming only a small region of the workpiece through a punch driven by a NC machine. The drawback of this process is its slowness. In this study, a high speed variant has been investigated from both numerical and experimental points of view. The aim has been the design of an FEM model able to reproduce the material behavior during the high speed process by defining a thermomechanical model. An experimental campaign has been performed on a CNC lathe at high speed to test process feasibility. The first results have shown that the material exhibits the same performance as in conventional speed ISF and, in some cases, better material behavior due to the temperature increment. An accurate numerical simulation has been performed to investigate the material behavior during the high speed process, substantially confirming the experimental evidence.

  15. Comparison of the Effects of Cross-validation Methods on Determining Performances of Classifiers Used in Diagnosing Congestive Heart Failure

    Directory of Open Access Journals (Sweden)

    Isler Yalcin

    2015-08-01

Full Text Available Congestive heart failure (CHF) occurs when the heart is unable to provide sufficient pump action to maintain blood flow to meet the needs of the body. Early diagnosis is important since the mortality rate of patients with CHF is very high. There are different validation methods for measuring the performance of classifier algorithms designed for this purpose. In this study, k-fold and leave-one-out cross-validation methods were tested as performance measures for five distinct classifiers in the diagnosis of patients with CHF. Each algorithm was run 100 times, and the average and standard deviation of classifier performance were recorded. As a result, it was observed that average performance was enhanced and the variability of performance decreased when the number of folds (data sections) used in the cross-validation method was increased.
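
    The comparison described in this abstract can be reproduced in outline with scikit-learn on generic data (the classifier and dataset below are placeholders, not the CHF data or the five classifiers of the study):

    ```python
    # Sketch: the same classifier scored with k-fold and leave-one-out cross-validation.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import KFold, LeaveOneOut, cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    X, y = make_classification(n_samples=120, n_features=10, random_state=0)
    clf = KNeighborsClassifier(n_neighbors=5)

    for name, cv in [("5-fold", KFold(5, shuffle=True, random_state=0)),
                     ("10-fold", KFold(10, shuffle=True, random_state=0)),
                     ("LOO", LeaveOneOut())]:
        scores = cross_val_score(clf, X, y, cv=cv)        # default accuracy scoring
        print(f"{name:>7}: mean {scores.mean():.3f}  sd {scores.std():.3f}")
    ```

    Leave-one-out is the limiting case of k-fold with k equal to the sample size; because its splits are deterministic, repeated runs also show less run-to-run variability than shuffled k-fold, consistent with the trend reported above.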

  16. High Power Flex-Propellant Arcjet Performance

    Science.gov (United States)

    Litchford, Ron J.

    2011-01-01

    A MW-class electrothermal arcjet based on a water-cooled, wall-stabilized, constricted arc discharge configuration was subjected to extensive performance testing using hydrogen and simulated ammonia propellants with the deliberate aim of advancing technology readiness level for potential space propulsion applications. The breadboard design incorporates alternating conductor/insulator wafers to form a discharge barrel enclosure with a 2.5-cm internal bore diameter and an overall length of approximately 1 meter. Swirling propellant flow is introduced into the barrel, and a DC arc discharge mode is established between a backplate tungsten cathode button and a downstream ring-anode/spin-coil assembly. The arc-heated propellant then enters a short mixing plenum and is accelerated through a converging-diverging graphite nozzle. This innovative design configuration differs substantially from conventional arcjet thrusters, in which the throat functions as constrictor and the expansion nozzle serves as the anode, and permits the attainment of an equilibrium sonic throat (EST) condition. During the test program, applied electrical input power was varied between 0.5-1 MW with hydrogen and simulated ammonia flow rates in the range of 4-12 g/s and 15-35 g/s, respectively. The ranges of investigated specific input energy therefore fell between 50-250 MJ/kg for hydrogen and 10-60 MJ/kg for ammonia. In both cases, observed arc efficiencies were between 40-60 percent as determined via a simple heat balance method based on electrical input power and coolant water calorimeter measurements. These experimental results were found to be in excellent agreement with theoretical chemical equilibrium predictions, thereby validating the EST assumption and enabling the utilization of standard TDK nozzle expansion analyses to reliably infer baseline thruster performance characteristics. Inferred specific impulse performance accounting for recombination kinetics during the expansion process
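
    The heat-balance bookkeeping mentioned above is simple enough to sketch; the numbers below are illustrative placeholders, not measured test data, and the sketch assumes that all electrical power not removed by the cooling-water calorimeter is absorbed by the propellant.

    ```python
    # Hedged sketch of the simple heat-balance method described above.
    # Values are illustrative, not test data.
    def arc_efficiency(p_electrical_w, q_coolant_w):
        """Fraction of electrical input power retained by the propellant."""
        return (p_electrical_w - q_coolant_w) / p_electrical_w

    def specific_input_energy(p_electrical_w, mdot_kg_s):
        """Electrical input energy per unit propellant mass, J/kg."""
        return p_electrical_w / mdot_kg_s

    p_in = 0.8e6      # W, applied electrical power (example)
    q_cool = 0.36e6   # W, heat rejected to the cooling water (example)
    mdot = 8.0e-3     # kg/s, hydrogen flow rate (example)

    print(f"arc efficiency        ~ {arc_efficiency(p_in, q_cool):.2f}")
    print(f"specific input energy ~ {specific_input_energy(p_in, mdot)/1e6:.0f} MJ/kg")
    ```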

  17. Performance support system in higher engineering education - introduction and empirical validation

    NARCIS (Netherlands)

    Stoyanov, S.; Stoyanov, Slavi; Kommers, Petrus A.M.; Bastiaens, T.J.; Martinez Mediano, Catalina

    2008-01-01

    The paper defines and empirically validates the concept of performance support system in higher engineering education. The validation of the concept is based upon two studies: a pilot and an experiment, on the effect of performance support system on achievements and attitudes of students. The

  18. A Cross-Validation Study of Police Recruit Performance as Predicted by the IPI and MMPI.

    Science.gov (United States)

    Shusman, Elizabeth J.; And Others

    Validation and cross-validation studies were conducted using the Minnesota Multiphasic Personality Inventory (MMPI) and Inwald Personality Inventory (IPI) to predict job performance for 698 urban male police officers who completed a six-month training academy. Job performance criteria evaluated included absence, lateness, derelictions, negative…

  19. Standards Performance Continuum: Development and Validation of a Measure of Effective Pedagogy.

    Science.gov (United States)

    Doherty, R. William; Hilberg, R. Soleste; Epaloose, Georgia; Tharp, Roland G.

    2002-01-01

    Describes the development and validation of the Standards Performance Continuum (SPC) for assessing teacher performance of the Standards for Effective Pedagogy. Three studies involving Florida, California, and New Mexico public school teachers provided evidence of inter-rater reliability, concurrent validity, and criterion-related validity…

  20. Validity of the Symbol Digit Modalities Test as a cognition performance outcome measure for multiple sclerosis.

    Science.gov (United States)

    Benedict, Ralph Hb; DeLuca, John; Phillips, Glenn; LaRocca, Nicholas; Hudson, Lynn D; Rudick, Richard

    2017-04-01

    Cognitive and motor performance measures are commonly employed in multiple sclerosis (MS) research, particularly when the purpose is to determine the efficacy of treatment. The increasing focus of new therapies on slowing progression or reversing neurological disability makes the utilization of sensitive, reproducible, and valid measures essential. Processing speed is a basic elemental cognitive function that likely influences downstream processes such as memory. The Multiple Sclerosis Outcome Assessments Consortium (MSOAC) includes representatives from advocacy organizations, Food and Drug Administration (FDA), European Medicines Agency (EMA), National Institute of Neurological Disorders and Stroke (NINDS), academic institutions, and industry partners along with persons living with MS. Among the MSOAC goals is acceptance and qualification by regulators of performance outcomes that are highly reliable and valid, practical, cost-effective, and meaningful to persons with MS. A critical step for these neuroperformance metrics is elucidation of clinically relevant benchmarks, well-defined degrees of disability, and gradients of change that are deemed clinically meaningful. This topical review provides an overview of research on one particular cognitive measure, the Symbol Digit Modalities Test (SDMT), recognized as being particularly sensitive to slowed processing of information that is commonly seen in MS. The research in MS clearly supports the reliability and validity of this test and recently has supported a responder definition of SDMT change approximating 4 points or 10% in magnitude.

  1. The predictive validity of the BioMedical Admissions Test for pre-clinical examination performance.

    Science.gov (United States)

    Emery, Joanne L; Bell, John F

    2009-06-01

    Some medical courses in the UK have many more applicants than places and almost all applicants have the highest possible previous and predicted examination grades. The BioMedical Admissions Test (BMAT) was designed to assist in the student selection process specifically for a number of 'traditional' medical courses with clear pre-clinical and clinical phases and a strong focus on science teaching in the early years. It is intended to supplement the information provided by examination results, interviews and personal statements. This paper reports on the predictive validity of the BMAT and its predecessor, the Medical and Veterinary Admissions Test. Results from the earliest 4 years of the test (2000-2003) were matched to the pre-clinical examination results of those accepted onto the medical course at the University of Cambridge. Correlation and logistic regression analyses were performed for each cohort. Section 2 of the test ('Scientific Knowledge') correlated more strongly with examination marks than did Section 1 ('Aptitude and Skills'). It also had a stronger relationship with the probability of achieving the highest examination class. The BMAT and its predecessor demonstrate predictive validity for the pre-clinical years of the medical course at the University of Cambridge. The test identifies important differences in skills and knowledge between candidates, not shown by their previous attainment, which predict their examination performance. It is thus a valid source of additional admissions information for medical courses with a strong scientific emphasis when previous attainment is very high.

  2. Development and Validation of a Clarinet Performance Adjudication Scale

    Science.gov (United States)

    Abeles, Harold F.

    1973-01-01

    A basic assumption of this study is that there are generally agreed upon performance standards as evidenced by the use of adjudicators for evaluations at contests and festivals. An evaluation instrument was developed to enable raters to measure effectively those aspects of performance that have common standards of proficiency. (Author/RK)

  3. Evaluation of physicians' professional performance: An iterative development and validation study of multisource feedback instruments

    Directory of Open Access Journals (Sweden)

    Overeem Karlijn

    2012-03-01

    Full Text Available Abstract Background There is a global need to assess physicians' professional performance in actual clinical practice. Valid and reliable instruments are necessary to support these efforts. This study focuses on the reliability and validity, the influences of some sociodemographic biasing factors, associations between self and other evaluations, and the number of evaluations needed for reliable assessment of a physician based on the three instruments used for the multisource assessment of physicians' professional performance in the Netherlands. Methods This observational validation study of three instruments underlying multisource feedback (MSF) was set in 26 non-academic hospitals in the Netherlands. In total, 146 hospital-based physicians took part in the study. Each physician's professional performance was assessed by peers (physician colleagues), co-workers (including nurses, secretary assistants and other healthcare professionals) and patients. Physicians also completed a self-evaluation. Ratings of 864 peers, 894 co-workers and 1960 patients on MSF were available. We used principal components analysis and methods of classical test theory to evaluate the factor structure, reliability and validity of instruments. We used Pearson's correlation coefficient and linear mixed models to address other objectives. Results The peer, co-worker and patient instruments respectively had six factors, three factors and one factor with high internal consistencies (Cronbach's alpha 0.95 - 0.96). It appeared that only 2 percent of variance in the mean ratings could be attributed to biasing factors. Self-ratings were not correlated with peer, co-worker or patient ratings. However, ratings of peers, co-workers and patients were correlated. Five peer evaluations, five co-worker evaluations and 11 patient evaluations are required to achieve reliable results (reliability coefficient ≥ 0.70). Conclusions The study demonstrated that the three MSF instruments produced
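
    For readers unfamiliar with the statistics cited above, the sketch below shows a generic Cronbach's alpha computation and a Spearman-Brown style projection of how many evaluations reach a target reliability of 0.70. The ratings are synthetic, and the paper's own reliability analysis may have used a different (for example, generalizability-theory) formulation.

    ```python
    # Hedged sketch: internal consistency and evaluations-needed projection.
    # Data and the single-rater reliability below are invented examples.
    import numpy as np

    def cronbach_alpha(item_scores):              # rows = respondents, cols = items
        item_scores = np.asarray(item_scores, dtype=float)
        k = item_scores.shape[1]
        item_var = item_scores.var(axis=0, ddof=1).sum()
        total_var = item_scores.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    def evaluations_needed(single_rater_reliability, target=0.70):
        # Spearman-Brown prophecy solved for the number of raters.
        r = single_rater_reliability
        return int(np.ceil(target * (1 - r) / (r * (1 - target))))

    rng = np.random.default_rng(1)
    true_score = rng.normal(size=(50, 1))
    ratings = true_score + rng.normal(scale=0.4, size=(50, 8))   # 8 items per form
    print(f"alpha ~ {cronbach_alpha(ratings):.2f}")
    print("evaluations needed:", evaluations_needed(0.32))       # illustrative r
    ```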

  4. Determination of methylmercury and estimation of total mercury in seafood using high performance liquid chromatography (HPLC) and inductively coupled plasma-mass spectrometry (ICP-MS): Method development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Hight, Susan C. [Elemental Research Branch, Center for Food Safety and Applied Nutrition, US Food and Drug Administration, 5100 Paint Branch Parkway, College Park, MD 20740-3835 (United States)]. E-mail: susan.hight@fda.gov; Cheng, John [Elemental Research Branch, Center for Food Safety and Applied Nutrition, US Food and Drug Administration, 5100 Paint Branch Parkway, College Park, MD 20740-3835 (United States)]. E-mail: john.cheng@fda.gov

    2006-05-17

    A method was developed for determination of methylmercury and estimation of total mercury in seafood. Mercury (Hg) compounds were extracted from 0.5 g edible seafood or 0.2 g lyophilized reference material by adding 50 ml aqueous 1% w/v L-cysteine·HCl·H2O and heating 120 min at 60 °C in glass vials. Hg compounds in 50 μl of filtered extract were separated by reversed-phase high performance liquid chromatography using a C-18 column and aqueous 0.1% w/v L-cysteine·HCl·H2O + 0.1% w/v L-cysteine mobile phase at room temperature and were detected by inductively coupled plasma-mass spectrometry at mass-to-charge ratio 202. Total Hg was calculated as the mathematical sum of methyl and inorganic Hg determined in extracts. For seafoods containing 0.055-2.78 mg kg⁻¹ methylmercury and 0.014-0.137 mg kg⁻¹ inorganic Hg, precision of analyses was ≤5% relative standard deviation (R.S.D.) for methylmercury and ≤9% R.S.D. for inorganic Hg. Recovery of added analyte was 94% for methylmercury and 98% for inorganic Hg. Methyl and total Hg results for reference materials agreed with certified values. Limits of quantitation were 0.007 mg kg⁻¹ methylmercury and 0.005 mg kg⁻¹ inorganic Hg in edible seafood and 0.017 mg kg⁻¹ methylmercury and 0.012 mg kg⁻¹ inorganic Hg in lyophilized reference materials. Evaluation of analyte stability demonstrated that L-cysteine both stabilized and de-alkylated methylmercury, depending on holding time and cysteine concentration. Polypropylene adversely affected methylmercury stability. Total Hg results determined by this method were equivalent to results determined independently by cold vapour-atomic absorption spectrometry. Methylmercury was the predominant form of Hg in finfish. Ratios of methylmercury/total Hg determined by this method were 93-98% for finfish and 38-48% for mollusks.

  5. Predictive validity of pre-admission assessments on medical student performance.

    Science.gov (United States)

    Dabaliz, Al-Awwab; Kaadan, Samy; Dabbagh, M Marwan; Barakat, Abdulaziz; Shareef, Mohammad Abrar; Al-Tannir, Mohamad; Obeidat, Akef; Mohamed, Ayman

    2017-11-24

    To examine the predictive validity of pre-admission variables on students' performance in a medical school in Saudi Arabia. In this retrospective study, we collected admission and college performance data for 737 students in preclinical and clinical years. Data included high school scores and other standardized test scores, such as those of the National Achievement Test and the General Aptitude Test. Additionally, we included the scores of the Test of English as a Foreign Language (TOEFL) and the International English Language Testing System (IELTS) exams. Those datasets were then compared with college performance indicators, namely the cumulative Grade Point Average (cGPA) and progress test, using multivariate linear regression analysis. In preclinical years, both the National Achievement Test (p=0.04, B=0.08) and TOEFL (p=0.017, B=0.01) scores were positive predictors of cGPA, whereas the General Aptitude Test (p=0.048, B=-0.05) negatively predicted cGPA. Moreover, none of the pre-admission variables were predictive of progress test performance in the same group. On the other hand, none of the pre-admission variables were predictive of cGPA in clinical years. Overall, cGPA strongly predicted students' progress test performance (p<0.001 and B=19.02). Only the National Achievement Test and TOEFL significantly predicted performance in preclinical years. However, these variables do not predict progress test performance, meaning that they do not predict the functional knowledge reflected in the progress test. We report various strengths and deficiencies in the current medical college admission criteria, and call for employing more sensitive and valid ones that predict student performance and functional knowledge, especially in the clinical years.

  6. High performance carbon nanocomposites for ultracapacitors

    Science.gov (United States)

    Lu, Wen

    2012-10-02

    The present invention relates to composite electrodes for electrochemical devices, particularly to carbon nanotube composite electrodes for high performance electrochemical devices, such as ultracapacitors.

  7. High Turbidity Solis Clear Sky Model: Development and Validation

    Directory of Open Access Journals (Sweden)

    Pierre Ineichen

    2018-03-01

    Full Text Available The Solis clear sky model is a spectral scheme based on radiative transfer calculations and the Lambert–Beer relation. Its broadband version is a simplified fast analytical version; it is limited to broadband aerosol optical depths lower than 0.45, which is a weakness when applied in countries with very high turbidity such as China or India. In order to extend the use of the original simplified version of the model for high turbidity values, we developed a new version of the broadband Solis model based on radiative transfer calculations, valid for turbidity values up to 7, for the three components, global, beam, and diffuse, and for the four aerosol types defined by Shettle and Fenn. A validation of low turbidity data acquired in Geneva shows slightly better results than the previous version. On data acquired at sites presenting higher turbidity data, the bias stays within ±4% for the beam and the global irradiances, and the standard deviation around 5% for clean and stable condition data and around 12% for questionable data and variable sky conditions.
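
    The Lambert-Beer backbone of the scheme can be illustrated with a generic attenuation function; the exponent and the enhanced extraterrestrial irradiance below are placeholders, not the fitted Solis coefficients, which depend on aerosol optical depth, water vapour, and aerosol type.

    ```python
    # Hedged sketch of a Lambert-Beer style attenuation of the form used by the
    # broadband Solis scheme: I = I0 * exp(-tau / sin(h)**b). Parameter values
    # are illustrative placeholders, not the model's fitted coefficients.
    import math

    def beam_normal(i0, tau_beam, sun_elevation_deg, b=1.0):
        h = math.radians(sun_elevation_deg)
        return i0 * math.exp(-tau_beam / math.sin(h) ** b)

    for elev in (10, 30, 60, 90):
        print(elev, round(beam_normal(1100.0, 0.35, elev), 1))
    ```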

  8. Strategies and Experiences Using High Performance Fortran

    National Research Council Canada - National Science Library

    Shires, Dale

    2001-01-01

    High Performance Fortran (HPF) is a relatively new addition to the Fortran dialect. It is an attempt to provide an efficient high-level Fortran parallel programming language for the latest generation of parallel machines, although its success to date has been debatable...

  9. A procedure validation for high conversion reactors fuel elements calculation

    International Nuclear Information System (INIS)

    Ishida, V.N.; Patino, N.E.; Abbate, M.J.; Sbaffoni, M.M.

    1990-01-01

    The present work validates the procedure for generating cross sections from nuclear data, together with the calculation system currently used at the Bariloche Atomic Center Reactor and Neutrons Division, for application to fuel element calculations of a high conversion reactor (HCR). For this purpose, the fuel element calculation of a High Conversion Boiling Water Reactor (HCBWR) was chosen as the reference problem, employing the Monte Carlo method. Various cases were considered: with and without control rods, cold or hot, and at different void fractions. Multiplication factors, reaction rates, power maps and peak factors were compared. A sensitivity analysis was carried out on the typical cells used, the approximations employed to solve the transport equation (Sn or diffusion), the 1-D or 2-D representation, and the refinement of the spatial mesh, with the aim of evaluating their influence on the parameters studied and arriving at an optimum combination to be used in future design calculations. (Author) [es]

  10. High Performance Grinding and Advanced Cutting Tools

    CERN Document Server

    Jackson, Mark J

    2013-01-01

    High Performance Grinding and Advanced Cutting Tools discusses the fundamentals and advances in high performance grinding processes, and provides a complete overview of newly-developing areas in the field. Topics covered are grinding tool formulation and structure, grinding wheel design and conditioning and applications using high performance grinding wheels. Also included are heat treatment strategies for grinding tools, using grinding tools for high speed applications, laser-based and diamond dressing techniques, high-efficiency deep grinding, VIPER grinding, and new grinding wheels.

  11. Strategy Guideline: High Performance Residential Lighting

    Energy Technology Data Exchange (ETDEWEB)

    Holton, J.

    2012-02-01

    The Strategy Guideline: High Performance Residential Lighting has been developed to provide a tool for the understanding and application of high performance lighting in the home. The high performance lighting strategies featured in this guide are drawn from recent advances in commercial lighting for application to typical spaces found in residential buildings. This guide offers strategies to greatly reduce lighting energy use through the application of high quality fluorescent and light emitting diode (LED) technologies. It is important to note that these strategies not only save energy in the home but also serve to satisfy the homeowner's expectations for high quality lighting.

  12. Carbon nanomaterials for high-performance supercapacitors

    OpenAIRE

    Tao Chen; Liming Dai

    2013-01-01

    Owing to their high energy density and power density, supercapacitors exhibit great potential as high-performance energy sources for advanced technologies. Recently, carbon nanomaterials (especially, carbon nanotubes and graphene) have been widely investigated as effective electrodes in supercapacitors due to their high specific surface area, excellent electrical and mechanical properties. This article summarizes the recent progresses on the development of high-performance supercapacitors bas...

  13. Evidence for validity within workplace assessment: the Longitudinal Evaluation of Performance (LEP).

    Science.gov (United States)

    Prescott-Clements, Linda; van der Vleuten, Cees P M; Schuwirth, Lambert W T; Hurst, Yvonne; Rennie, James S

    2008-05-01

    The drive towards valid and reliable assessment methods for health professions' training is becoming increasingly focused towards authentic models of workplace performance assessment. This study investigates the validity of such a method, longitudinal evaluation of performance (LEP), which has been implemented in the assessment of postgraduate dental trainees in Scotland. Although it is similar in format to the mini-CEX (mini clinical evaluation exercise) and other tools that use global ratings for assessing performance in the workplace, a number of differences exist in the way in which the LEP has been implemented. These include the use of a reference point for evaluators' judgement that represents the standard expected upon completion of the training, flexibility, a greater range of cases assessed and the use of frequency scores within feedback to identify trainees' progress over time. A range of qualitative and quantitative data were collected and analysed from 2 consecutive cohorts of trainees in Scotland (2002-03 and 2003-04). There is rich evidence supporting the validity, educational impact and feasibility of the LEP. In particular, a great deal of support was given by trainers for the use of a fixed reference point for judgements, despite initial concerns that this might be demotivating to trainees. Trainers were highly positive about this approach and considered it useful in identifying trainees' progress and helping to drive learning. The LEP has been successful in combining a strong formative approach to continuous assessment with the collection of evidence on performance within the workplace that (alongside other tools within an assessment system) can contribute towards a summative decision regarding competence.

  14. Latent class analysis of reading, decoding, and writing performance using the Academic Performance Test: concurrent and discriminating validity

    Directory of Open Access Journals (Sweden)

    Cogo-Moreira H

    2013-08-01

    Full Text Available Hugo Cogo-Moreira, Carolina Alves Ferreira Carvalho, Adriana de Souza Batista Kida, Clara Regina Brandão de Avila, Giovanni Abrahão Salum, Tais Silveira Moriyama, Ary Gadelha, Luis Augusto Rohde, Luciana Monteiro de Moura, Andrea Parolin Jackowski, Jair de Jesus Mari (Departments of Psychiatry and of Hearing and Speech Pathology, Federal University of São Paulo; Department of Psychiatry, Federal University of Rio Grande do Sul; Department of Psychiatry, University of São Paulo; National Institute for Developmental Psychiatry for Children and Adolescents, Brazil). Aim: To explore and validate the best returned latent class solution for the reading and writing subtests from the Academic Performance Test (TDE). Sample: A total of 1,945 children (6–14 years of age), who answered the TDE and the Development and Well-Being Assessment (DAWBA) and had an estimated intelligence quotient (IQ) higher than 70, came from public schools in São Paulo (35 schools) and Porto Alegre (22 schools) that participated in the 'High Risk Cohort Study for Childhood Psychiatric Disorders' project. They were on average 9.52 years old (standard deviation = 1.856), from the 1st to 9th grades, and 53.3% male. The mean estimated IQ was 102.70 (standard deviation = 16.44). Methods: Via Item Response Theory (IRT), the highest discriminating items ('a' > 1.7) were selected from the TDE subtests of reading and writing. A latent class analysis was run based on these subtests. The statistically and empirically best latent class solutions were validated through concurrent (IQ and combined attention deficit hyperactivity disorder [ADHD] diagnoses) and discriminant (major depression diagnoses) measures. Results: A three-class solution was found to be the best model solution, revealing classes of children with good, not
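
    A minimal sketch of the item-screening step described above (keeping only items whose two-parameter-logistic discrimination exceeds 1.7) is given below; the item parameters are invented placeholders, not the TDE estimates, and the subsequent latent class analysis itself is not reproduced.

    ```python
    # Hedged sketch of 2PL item screening by discrimination ('a' > 1.7).
    # (a, b) values are invented placeholders, not the TDE item parameters.
    import math

    def two_pl(theta, a, b):
        """Probability of a correct response under the 2PL IRT model."""
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    item_params = {"read_03": (2.1, -0.4), "read_07": (1.2, 0.3),
                   "write_05": (1.9, 0.8), "write_09": (0.9, -1.1)}

    selected = {k: v for k, v in item_params.items() if v[0] > 1.7}
    print("items retained for the latent class analysis:", sorted(selected))
    print("P(correct | theta = 0) for retained items:",
          {k: round(two_pl(0.0, *v), 2) for k, v in selected.items()})
    ```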

  15. Validating Future Force Performance Measures (Army Class): Concluding Analyses

    Science.gov (United States)

    2016-06-01

    [Report front matter; abstract not included in this record. The list of tables references descriptive statistics and intercorrelations for latent variable (LV) final predictor factor scores and for the analysis criteria covering Soldier attrition and performance: Dependability (Non-Delinquency), Adjustment, Physical Conditioning, Leadership, Work Orientation, and Agreeableness.]

  16. Personality and Job Performance: Evidence of Incremental Validity.

    Science.gov (United States)

    Day, David V.; Silverman, Stanley B.

    1989-01-01

    Investigated relationship between personality variables and job performance in 43 accountants. Results indicated that, even with effects of cognitive ability taken into account, 3 personality scales (orientation toward work, degree of ascendancy, and degree and quality of interpersonal orientation) were significantly related to important aspects…

  17. The reliability and validity of a soccer-specific nonmotorised treadmill simulation (intermittent soccer performance test).

    Science.gov (United States)

    Aldous, Jeffrey W F; Akubat, Ibrahim; Chrismas, Bryna C R; Watkins, Samuel L; Mauger, Alexis R; Midgley, Adrian W; Abt, Grant; Taylor, Lee

    2014-07-01

    This study investigated the reliability and validity of a novel nonmotorised treadmill (NMT)-based soccer simulation using a novel activity category called a "variable run" to quantify fatigue during high-speed running. Twelve male University soccer players completed 3 familiarization sessions and 1 peak speed assessment before completing the intermittent soccer performance test (iSPT) twice. The 2 iSPTs were separated by 6-10 days. The total distance, sprint distance, and high-speed running distance (HSD) were 8,968 ± 430 m, 980 ± 75 m and 2,122 ± 140 m, respectively. No significant difference (p > 0.05) was found between repeated trials of the iSPT for all physiological and performance variables. Reliability measures between iSPT1 and iSPT2 showed good agreement (coefficient of variation: 0.80). Furthermore, the variable run phase showed HSD significantly decreased (p ≤ 0.05) in the last 15 minutes (85 ± 7 m) compared with the first 15 minutes (89 ± 6 m), quantifying decrements in high-speed exercise compared with the previous literature. This study validates the iSPT as a NMT-based soccer simulation compared with the previous match-play data and is a reliable tool for assessing and monitoring physiological and performance variables in soccer players. The iSPT could be used in a number of ways including player rehabilitation, understanding the efficacy of nutritional interventions, and also the quantification of environmentally mediated decrements on soccer-specific performance.
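
    The agreement statistics referred to above can be sketched generically as follows; the repeat measurements are invented, and the exact reliability formulation used in the study (which ICC form, and how the coefficient of variation was expressed) may differ.

    ```python
    # Hedged sketch of test-retest agreement: within-subject CV and ICC(2,1).
    # Data are invented repeat measurements, not the iSPT trials.
    import numpy as np

    def icc_2_1(x):                       # x: n_subjects x k_trials
        x = np.asarray(x, float)
        n, k = x.shape
        grand = x.mean()
        msr = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
        msc = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)
        sse = ((x - grand) ** 2).sum() - (n - 1) * msr - (k - 1) * msc
        mse = sse / ((n - 1) * (k - 1))
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    def within_subject_cv(x):
        x = np.asarray(x, float)
        typical_error = np.std(x[:, 1] - x[:, 0], ddof=1) / np.sqrt(2)
        return 100 * typical_error / x.mean()

    rng = np.random.default_rng(2)
    trial1 = rng.normal(2100, 140, size=12)            # e.g. high-speed distance, m
    trial2 = trial1 + rng.normal(0, 40, size=12)
    data = np.column_stack([trial1, trial2])
    print(f"ICC(2,1) ~ {icc_2_1(data):.2f},  CV ~ {within_subject_cv(data):.1f}%")
    ```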

  18. Team Development for High Performance Management.

    Science.gov (United States)

    Schermerhorn, John R., Jr.

    1986-01-01

    The author examines a team development approach to management that creates shared commitments to performance improvement by focusing the attention of managers on individual workers and their task accomplishments. It uses the "high-performance equation" to help managers confront shared beliefs and concerns about performance and develop realistic…

  19. Delivering high performance BWR fuel reliably

    Energy Technology Data Exchange (ETDEWEB)

    Schardt, J.F. [GE Nuclear Energy, Wilmington, NC (United States)

    1998-07-01

    Utilities are under intense pressure to reduce their production costs in order to compete in the increasingly deregulated marketplace. They need fuel, which can deliver high performance to meet demanding operating strategies. GE's latest BWR fuel design, GE14, provides that high performance capability. GE's product introduction process assures that this performance will be delivered reliably, with little risk to the utility. (author)

  20. HPTA: High-Performance Text Analytics

    OpenAIRE

    Vandierendonck, Hans; Murphy, Karen; Arif, Mahwish; Nikolopoulos, Dimitrios S.

    2017-01-01

    One of the main targets of data analytics is unstructured data, which primarily involves textual data. High-performance processing of textual data is non-trivial. We present the HPTA library for high-performance text analytics. The library helps programmers to map textual data to a dense numeric representation, which can be handled more efficiently. HPTA encapsulates three performance optimizations: (i) efficient memory management for textual data, (ii) parallel computation on associative dat...

  1. Quantitative Analysis of Aloins and Aloe-Emodin in Aloe Vera Raw Materials and Finished Products Using High-Performance Liquid Chromatography: Single-Laboratory Validation, First Action 2016.09.

    Science.gov (United States)

    Kline, David; Ritruthai, Vicha; Babajanian, Silva; Gao, Quanyin; Ingle, Prashant; Chang, Peter; Swanson, Gary

    2017-05-01

    A single-laboratory validation study is described for a method of quantitative analysis of aloins (aloins A and B) and aloe-emodin in aloe vera raw materials and finished products. This method used HPLC coupled with UV detection at 380 nm for the aloins and 430 nm for aloe-emodin. The advantage of this test method is that the target analytes are concentrated from the sample matrix (either liquid or solid form) using stepwise liquid-liquid extraction (water-ethyl acetate-methanol), followed by solvent evaporation and reconstitution. This sample preparation process is suitable for different forms of products. The concentrating step for aloins and aloe-emodin has enhanced the method quantitation level to 20 parts per billion (ppb). Reversed-phase chromatography using a 250 × 4.6 mm column under gradient elution conditions was used. Mobile phase A is 0.1% acetic acid in water and mobile phase B is 0.1% acetic acid in acetonitrile. The HPLC run starts with a 20% mobile phase B that reaches 35% at 13 min. From 13 to 30 min, mobile phase B is increased from 35 to 100%. From 30 to 40 min, mobile phase B is changed from 100% back to the initial condition of 20% for re-equilibration. The flow rate is 1 mL/min, with a 100 μL injection volume. Baseline separation (Rs > 2.0) for aloins A and B and aloe-emodin was observed under this chromatographic condition. This test method was validated with raw materials of aloe vera 5× (liquid) and aloe vera 200× (powder) and finished products of aloe concentrate (liquid) and aloe (powder). The linearity of the method was studied from 10 to 500 ppb for aloins A and B and aloe-emodin, with correlation coefficients of 0.999964, 0.999957, and 0.999980, respectively. The test method was proven to be specific, precise, accurate, rugged, and suitable for the intended quantitative analysis of aloins and aloe-emodin in raw materials and finished products. The S/N for aloins A and B and aloe-emodin at 10 ppb level were 12, 10, and 8
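
    The linearity and signal-to-noise checks behind figures like the correlation coefficients quoted above can be sketched generically; the concentrations and peak areas below are placeholders, the S/N values are taken from the abstract, and the S/N ≥ 10 rule is the common convention rather than necessarily the method's own acceptance criterion.

    ```python
    # Hedged sketch of a calibration linearity check and an S/N-based LOQ check.
    # Peak areas are simulated; only the S/N values come from the abstract.
    import numpy as np

    conc = np.array([10, 25, 50, 100, 250, 500], dtype=float)          # ppb
    area = 52.0 * conc + np.random.default_rng(3).normal(0, 30, conc.size)

    slope, intercept = np.polyfit(conc, area, 1)
    r = np.corrcoef(conc, area)[0, 1]
    print(f"slope={slope:.1f}  intercept={intercept:.1f}  r={r:.6f}")

    # Conventional check: quantifiable when S/N at the lowest level is >= 10.
    sn_at_10ppb = {"aloin A": 12, "aloin B": 10, "aloe-emodin": 8}
    for analyte, sn in sn_at_10ppb.items():
        print(analyte, "meets S/N >= 10:", sn >= 10)
    ```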

  2. Performance Validity Testing in Neuropsychology: Scientific Basis and Clinical Application-A Brief Review.

    Science.gov (United States)

    Greher, Michael R; Wodushek, Thomas R

    2017-03-01

    Performance validity testing refers to neuropsychologists' methodology for determining whether neuropsychological test performances completed in the course of an evaluation are valid (ie, the results of true neurocognitive function) or invalid (ie, overly impacted by the patient's effort/engagement in testing). This determination relies upon the use of either standalone tests designed for this sole purpose, or specific scores/indicators embedded within traditional neuropsychological measures that have demonstrated this utility. In response to a greater appreciation for the critical role that performance validity issues play in neuropsychological testing and the need to measure this variable to the best of our ability, the scientific base for performance validity testing has expanded greatly over the last 20 to 30 years. As such, the majority of current day neuropsychologists in the United States use a variety of measures for the purpose of performance validity testing as part of everyday forensic and clinical practice and address this issue directly in their evaluations. The following is the first article of a 2-part series that will address the evolution of performance validity testing in the field of neuropsychology, both in terms of the science as well as the clinical application of this measurement technique. The second article of this series will review performance validity tests in terms of methods for development of these measures, and maximizing of diagnostic accuracy.

  3. Issues in developing valid assessments of speech pathology students' performance in the workplace.

    Science.gov (United States)

    McAllister, Sue; Lincoln, Michelle; Ferguson, Alison; McAllister, Lindy

    2010-01-01

    Workplace-based learning is a critical component of professional preparation in speech pathology. A validated assessment of this learning is seen to be 'the gold standard', but it is difficult to develop because of design and validation issues. These issues include the role and nature of judgement in assessment, challenges in measuring quality, and the relationship between assessment and learning. Valid assessment of workplace-based performance needs to capture the development of competence over time and account for both occupation-specific and generic competencies. This paper reviews important conceptual issues in the design of valid and reliable workplace-based assessments of competence including assessment content, process, impact on learning, measurement issues, and validation strategies. It then goes on to share what has been learned about quality assessment and validation of a workplace-based performance assessment using competency-based ratings. The outcomes of a four-year national development and validation of an assessment tool are described. A literature review of issues in conceptualizing, designing, and validating workplace-based assessments was conducted. Key factors to consider in the design of a new tool were identified and built into the cycle of design, trialling, and data analysis in the validation stages of the development process. This paper provides an accessible overview of factors to consider in the design and validation of workplace-based assessment tools. It presents strategies used in the development and national validation of a tool, COMPASS, used in every speech pathology programme in Australia, New Zealand, and Singapore. The paper also describes Rasch analysis, a model-based statistical approach which is useful for establishing validity and reliability of assessment tools. Through careful attention to conceptual and design issues in the development and trialling of workplace-based assessments, it has been possible to develop the
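
    For context, the Rasch model mentioned above reduces to a simple logistic relation between a person's ability and an item's difficulty; the sketch below uses arbitrary parameter values and is not the COMPASS calibration.

    ```python
    # Hedged illustration of the dichotomous Rasch model: probability that a
    # trainee with ability theta is rated as competent on an item of
    # difficulty b. Parameter values are arbitrary examples.
    import math

    def rasch_probability(theta, b):
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    for theta in (-1.0, 0.0, 1.5):
        print(theta, [round(rasch_probability(theta, b), 2) for b in (-1.0, 0.0, 1.0)])
    ```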

  4. Validation of ascorbic acid tablets of national production by high-performance liquid chromatography method

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Izquierdo Castro, Idalberto

    2009-01-01

    We validated a high-performance liquid chromatography analytical method to determine the ascorbic acid content of vitamin C tablets, designed as an alternative method for quality control and for following the chemical stability of the active principle, since the official techniques for quality control of ascorbic acid in tablets are not selective with respect to degradation products. The method was adapted from that reported in USP 28 (2005) for the analysis of the injectable product. We used an RP-18 column (250 x 4.6 mm, 5 μm) with UV detection at 245 nm. Validation was necessary for both objectives, considering the parameters required for category I and II methods. The method was sufficiently linear, accurate, and precise in the range of 100-300 μg/mL. It was also selective with respect to the remaining components of the matrix and the possible degradation products generated under stress conditions. Detection and quantification limits were estimated. Once validated, the method was applied to ascorbic acid quantification in two batches of expired tablets, and a marked influence of the container on the degradation of the active principle was detected after 12 months at room temperature. (Author)

  5. Validation and Evaluation of Army Aviation Collective Performance Measures

    Science.gov (United States)

    2014-01-01


  6. Development and validation of a fuel performance analysis code

    International Nuclear Information System (INIS)

    Majalee, Aaditya V.; Chaturvedi, S.

    2015-01-01

    CAD has been developing a computer code, 'FRAVIZ', for the calculation of the steady-state thermomechanical behaviour of nuclear reactor fuel rods. It contains four major modules, viz. a Thermal module, a Fission Gas Release module, a Material Properties module and a Mechanical module. These four modules are coupled, with the output of each fed back to the others to obtain a self-consistent evolution in time. The code has been checked against two FUMEX benchmarks. Modelling fuel performance in the Advanced Heavy Water Reactor would require additional fuel-related inputs and some modification of the code. (author)

  7. Performance Prediction and Validation: Data, Frameworks, and Considerations

    Energy Technology Data Exchange (ETDEWEB)

    Tinnesand, Heidi

    2017-05-19

    Improving the predictability and reliability of wind power generation and operations will reduce costs and potentially establish a framework to attract new capital into the distributed wind sector, a key cost reduction requirement highlighted in results from the distributed wind future market assessment conducted with dWind. Quantifying and refining the accuracy of project performance estimates will also directly address several of the key challenges identified by industry stakeholders in 2015 as part of the distributed wind resource assessment workshop and be cross-cutting for several other facets of the distributed wind portfolio. This presentation covers the efforts undertaken in 2016 to address these topics.

  8. Development and validation of a high-performance liquid chromatography-tandem mass spectrometry method for the simultaneous determination of irinotecan and its main metabolites in human plasma and its application in a clinical pharmacokinetic study.

    Directory of Open Access Journals (Sweden)

    Elena Marangon

    Full Text Available Irinotecan is currently used in several cancer regimens, mainly in colorectal cancer (CRC). This drug has a narrow therapeutic range and treatment can lead to side effects, mainly neutropenia and diarrhea, frequently requiring discontinuing or lowering the drug dose. A wide inter-individual variability in irinotecan pharmacokinetic parameters and pharmacodynamics has been reported and associated with patients' genetic background. In particular, a polymorphism in the UGT1A1 gene (UGT1A1*28) has been linked to an impaired detoxification of SN-38 (irinotecan's active metabolite) to SN-38 glucuronide (SN-38G), leading to increased toxicities. Therefore, therapeutic drug monitoring of irinotecan, SN-38 and SN-38G is recommended to personalize therapy. In order to quantify simultaneously irinotecan and its main metabolites in patients' plasma, we developed and validated a new, sensitive and specific HPLC-MS/MS method applicable to all irinotecan dosages used in clinic. This method required a small plasma volume, addition of camptothecin as internal standard and simple protein precipitation. Chromatographic separation was done on a Gemini C18 column (3 μm, 100 mm x 2.0 mm) using 0.1% acetic acid/bidistilled water and 0.1% acetic acid/acetonitrile as mobile phases. The mass spectrometer worked with electrospray ionization in positive ion mode and selected reaction monitoring. The standard curves were linear (R2 ≥ 0.9962) over the concentration ranges (10-10000 ng/mL for irinotecan, 1-500 ng/mL for SN-38 and SN-38G and 1-5000 ng/mL for APC) and had good back-calculated accuracy and precision. The intra- and inter-day precision and accuracy, determined on three quality control levels for all the analytes, were always <12.3% and between 89.4% and 113.0%, respectively. Moreover, we evaluated this bioanalytical method by re-analysis of incurred samples as an additional measure of assay reproducibility. This method was successfully applied to a pharmacokinetic study in

  9. PASTIS: Bayesian extrasolar planet validation - I. General framework, models, and performance

    Science.gov (United States)

    Díaz, R. F.; Almenara, J. M.; Santerne, A.; Moutou, C.; Lethuillier, A.; Deleuil, M.

    2014-06-01

    A large fraction of the smallest transiting planet candidates discovered by the Kepler and CoRoT space missions cannot be confirmed by a dynamical measurement of the mass using currently available observing facilities. To establish their planetary nature, the concept of planet validation has been advanced. This technique compares the probability of the planetary hypothesis against that of all reasonably conceivable alternative false positive (FP) hypotheses. The candidate is considered as validated if the posterior probability of the planetary hypothesis is sufficiently larger than the sum of the probabilities of all FP scenarios. In this paper, we present PASTIS, the Planet Analysis and Small Transit Investigation Software, a tool designed to perform a rigorous model comparison of the hypotheses involved in the problem of planet validation, and to fully exploit the information available in the candidate light curves. PASTIS self-consistently models the transit light curves and follow-up observations. Its object-oriented structure offers a large flexibility for defining the scenarios to be compared. The performance is explored using artificial transit light curves of planets and FPs with a realistic error distribution obtained from a Kepler light curve. We find that data support the correct hypothesis strongly only when the signal is high enough (transit signal-to-noise ratio above 50 for the planet case) and remain inconclusive otherwise. PLAnetary Transits and Oscillations of stars (PLATO) shall provide transits with high enough signal-to-noise ratio, but to establish the true nature of the vast majority of Kepler and CoRoT transit candidates additional data or strong reliance on hypotheses priors is needed.
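
    The model-comparison bookkeeping at the heart of planet validation can be sketched in a few lines; the hypothesis evidences and priors below are invented placeholders and bear no relation to PASTIS's actual numerical machinery.

    ```python
    # Hedged sketch: posterior probability of the planet hypothesis versus the
    # summed posterior of false-positive scenarios. All numbers are invented.
    def posterior_probabilities(evidences, priors):
        """evidences: hypothesis -> marginal likelihood; priors: hypothesis -> prior."""
        unnorm = {h: evidences[h] * priors[h] for h in evidences}
        total = sum(unnorm.values())
        return {h: v / total for h, v in unnorm.items()}

    evidences = {"planet": 3.2e-4, "background EB": 1.1e-6, "hierarchical triple": 4.0e-7}
    priors = {"planet": 0.4, "background EB": 0.5, "hierarchical triple": 0.1}

    post = posterior_probabilities(evidences, priors)
    fp_total = sum(v for h, v in post.items() if h != "planet")
    print(f"P(planet | data) ~ {post['planet']:.3f}  vs  P(false positive) ~ {fp_total:.3f}")
    ```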

  10. An integrated approach to validation of safeguards and security program performance

    International Nuclear Information System (INIS)

    Altman, W.D.; Hunt, J.S.; Hockert, J.W.

    1988-01-01

    Department of Energy (DOE) requirements for safeguards and security programs are becoming increasingly performance oriented. Master Safeguards and Security Agreements specify performance levels for systems protecting DOE security interests. In order to measure and validate security system performance, Lawrence Livermore National Laboratory (LLNL) has developed cost-effective validation tools and a comprehensive validation approach that synthesizes information gained from different activities, such as force-on-force exercises, limited-scope performance tests, equipment testing, vulnerability analyses, and computer modeling, into an overall assessment of the performance of the protection system. The analytic approach employs logic diagrams adapted from the fault and event trees used in probabilistic risk assessment. The synthesis of the results from the various validation activities is accomplished using a method developed by LLNL, based upon Bayes' theorem.

  11. Strategy Guideline. Partnering for High Performance Homes

    Energy Technology Data Exchange (ETDEWEB)

    Prahl, Duncan [IBACOS, Inc., Pittsburgh, PA (United States)

    2013-01-01

    High performance houses require a high degree of coordination and have significant interdependencies between various systems in order to perform properly, meet customer expectations, and minimize risks for the builder. Responsibility for the key performance attributes is shared across the project team and can be well coordinated through advanced partnering strategies. For high performance homes, traditional partnerships need to be matured to the next level and be expanded to all members of the project team including trades, suppliers, manufacturers, HERS raters, designers, architects, and building officials as appropriate. This guide is intended for use by all parties associated in the design and construction of high performance homes. It serves as a starting point and features initial tools and resources for teams to collaborate to continually improve the energy efficiency and durability of new houses.

  12. Reliability and validity of the test of incremental respiratory endurance measures of inspiratory muscle performance in COPD.

    Science.gov (United States)

    Formiga, Magno F; Roach, Kathryn E; Vital, Isabel; Urdaneta, Gisel; Balestrini, Kira; Calderon-Candelario, Rafael A; Campos, Michael A; Cahalin, Lawrence P

    2018-01-01

    The Test of Incremental Respiratory Endurance (TIRE) provides a comprehensive assessment of inspiratory muscle performance by measuring maximal inspiratory pressure (MIP) over time. The integration of MIP over inspiratory duration (ID) provides the sustained maximal inspiratory pressure (SMIP). Evidence on the reliability and validity of these measurements in COPD is not currently available. Therefore, we assessed the reliability, responsiveness and construct validity of the TIRE measures of inspiratory muscle performance in subjects with COPD. Test-retest reliability, known-groups and convergent validity assessments were implemented simultaneously in 81 male subjects with mild to very severe COPD. TIRE measures were obtained using the portable PrO2 device, following standard guidelines. All TIRE measures were found to be highly reliable, with SMIP demonstrating the strongest test-retest reliability with a nearly perfect intraclass correlation coefficient (ICC) of 0.99, while MIP and ID clustered closely together behind SMIP with ICC values of about 0.97. Our findings also demonstrated known-groups validity of all TIRE measures, with SMIP and ID yielding larger effect sizes when compared to MIP in distinguishing between subjects of different COPD status. Finally, our analyses confirmed convergent validity for both SMIP and ID, but not MIP. The TIRE measures of MIP, SMIP and ID have excellent test-retest reliability and demonstrated known-groups validity in subjects with COPD. SMIP and ID also demonstrated evidence of moderate convergent validity and appear to be more stable measures in this patient population than the traditional MIP.
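
    The relation between MIP, ID, and SMIP described above amounts to integrating the inspiratory pressure trace over the inspiratory duration; the sketch below uses a synthetic pressure curve, not a PrO2 recording.

    ```python
    # Hedged sketch: SMIP as the pressure-time integral of a (synthetic)
    # maximal inspiratory effort curve. Not a PrO2 recording.
    import numpy as np

    t = np.linspace(0.0, 8.0, 161)                        # s, inspiratory duration (ID)
    pressure = 90.0 * np.exp(-((t - 1.0) / 3.5) ** 2)     # cmH2O, made-up effort curve

    mip = pressure.max()                                  # maximal inspiratory pressure
    smip = np.sum((pressure[1:] + pressure[:-1]) / 2 * np.diff(t))   # trapezoidal integral

    print(f"MIP ~ {mip:.0f} cmH2O, ID ~ {t[-1]:.0f} s, SMIP ~ {smip:.0f} cmH2O*s")
    ```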

  13. Effects of Task Performance and Task Complexity on the Validity of Computational Models of Attention

    NARCIS (Netherlands)

    Koning, L. de; Maanen, P.P. van; Dongen, K. van

    2008-01-01

    Computational models of attention can be used as a component of decision support systems. For accurate support, a computational model of attention has to be valid and robust. The effects of task performance and task complexity on the validity of three different computational models of attention were

  14. Validity and reliability of tests determining performance-related components of wheelchair basketball

    NARCIS (Netherlands)

    de Groot, Sonja; Balvers, Inge J.M.; Kouwenhoven, Sanne M.; Janssen, Thomas W.J.

    The purpose of this study was to investigate the reliability and validity of wheelchair basketball field tests. Nineteen wheelchair basketball players performed 10 test items twice to determine the reliability. The validity of the tests was assessed by relating the scores to the players'

  15. Validity and reliability of tests determining performance-related components of wheelchair basketball

    NARCIS (Netherlands)

    De Groot, Sonja; Balvers, Inge J. M.; Kouwenhoven, Sanne M.; Janssen, Thomas W. J.

    2012-01-01

    The purpose of this study was to investigate the reliability and validity of wheelchair basketball field tests. Nineteen wheelchair basketball players performed 10 test items twice to determine the reliability. The validity of the tests was assessed by relating the scores to the players'

  16. Geographic and temporal validity of prediction models: Different approaches were useful to examine model performance

    NARCIS (Netherlands)

    P.C. Austin (Peter); D. van Klaveren (David); Y. Vergouwe (Yvonne); D. Nieboer (Daan); D.S. Lee (Douglas); E.W. Steyerberg (Ewout)

    2016-01-01

    Objective: Validation of clinical prediction models traditionally refers to the assessment of model performance in new patients. We studied different approaches to geographic and temporal validation in the setting of multicenter data from two time periods. Study Design and Setting: We

  17. The Role of Performance Management in Creating and Maintaining a High-Performance Organization

    Directory of Open Access Journals (Sweden)

    André A. de Waal

    2015-04-01

    Full Text Available There is still a good deal of confusion in the literature about how the use of a performance management system affects overall organizational performance. Some researchers find that performance management enhances both the financial and non-financial results of an organization, while others do not find any positive effects or, at most, ambiguous effects. An important step toward getting more clarity in this relationship is to investigate the role performance management plays in creating and maintaining a high-performance organization (HPO. The purpose of this study is to integrate performance management analysis (PMA and high-performance organization (HPO. A questionnaire combining questions on PMA dimensions and HPO factors was administered to two European-based multinational firms. Based on 468 valid questionnaires, a correlation analysis was performed on the PMA dimensions and the HPO factors in order to test the impact of performance management on the factors of high organizational performance. The results show strong and significant correlations between all the PMA dimensions and all the HPO factors, indicating that a performance management system that fosters performance-driven behavior in the organization is of critical importance to strengthen overall financial and non-financial performance.

  18. High Performance Numerical Computing for High Energy Physics: A New Challenge for Big Data Science

    International Nuclear Information System (INIS)

    Pop, Florin

    2014-01-01

    Modern physics is based on both theoretical analysis and experimental validation. Complex scenarios like subatomic dimensions, high energy, and low absolute temperature are frontiers for many theoretical models. Simulation with stable numerical methods represents an excellent instrument for high-accuracy analysis, experimental validation, and visualization. High performance computing support offers the possibility of running simulations at large scale, in parallel, but the volume of data generated by these experiments creates a new challenge for Big Data Science. This paper presents existing computational methods for high energy physics (HEP) analyzed from two perspectives: numerical methods and high performance computing. The computational methods presented are Monte Carlo methods and simulations of HEP processes, Markovian Monte Carlo, unfolding methods in particle physics, kernel estimation in HEP, and Random Matrix Theory used in the analysis of particle spectra. All of these methods produce data-intensive applications, which introduce new challenges and requirements for ICT systems architecture, programming paradigms, and storage capabilities.
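
    As a minimal, generic example of the Monte Carlo style of computation the review surveys, the sketch below estimates a toy integral by random sampling and shows the 1/sqrt(N) shrinkage of the statistical error; it is not taken from any of the HEP codes discussed.

    ```python
    # Hedged toy example of Monte Carlo integration with its statistical error.
    import numpy as np

    rng = np.random.default_rng(0)

    def mc_integral(f, a, b, n):
        x = rng.uniform(a, b, n)
        fx = f(x)
        est = (b - a) * fx.mean()
        err = (b - a) * fx.std(ddof=1) / np.sqrt(n)   # error shrinks as 1/sqrt(N)
        return est, err

    def integrand(e):
        return np.exp(-e) * e ** 2        # toy spectrum-weighted integrand

    for n in (1_000, 100_000):
        est, err = mc_integral(integrand, 0.0, 10.0, n)
        print(f"N={n:>7d}  integral ~ {est:.4f} +/- {err:.4f}")
    ```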

  19. Validity and Practicality of Acid-Base Module Based on Guided Discovery Learning for Senior High School

    Science.gov (United States)

    Yerimadesi; Bayharti; Jannah, S. M.; Lufri; Festiyed; Kiram, Y.

    2018-04-01

    This Research and Development (R&D) study aims to produce a guided-discovery-learning-based module on the topic of acids and bases and to determine its validity and practicality in learning. Module development followed the Four-D (4-D) model (define, design, develop and disseminate); this research was carried out up to the development stage. The research instruments were validity and practicality questionnaires. The module was validated by five experts (three chemistry lecturers of Universitas Negeri Padang and two chemistry teachers of SMAN 9 Padang). The practicality test was done by two chemistry teachers and 30 students of SMAN 9 Padang. Cohen's kappa was used to analyze validity and practicality. The average kappa moment was 0.86 for validity, and those for practicality were 0.85 from teachers and 0.76 from students, all falling in the high category. It can be concluded that the validity and practicality of the module were established for high school chemistry learning.
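
    A generic Cohen's kappa computation is sketched below for orientation; the "kappa moment" scoring used in the study may be defined slightly differently, and the ratings shown are invented.

    ```python
    # Minimal Cohen's kappa sketch: chance-corrected agreement between two raters.
    # Rating labels are invented examples, not the questionnaire data.
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
        return (observed - expected) / (1 - expected)

    a = ["high", "high", "medium", "high", "medium", "high"]
    b = ["high", "medium", "medium", "high", "medium", "high"]
    print(round(cohens_kappa(a, b), 2))
    ```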

  20. High-performance ceramics. Fabrication, structure, properties

    International Nuclear Information System (INIS)

    Petzow, G.; Tobolski, J.; Telle, R.

    1996-01-01

    The program ''Ceramic High-performance Materials'' pursued the objective to understand the chaining of cause and effect in the development of high-performance ceramics. This chain of problems begins with the chemical reactions for the production of powders, comprises the characterization, processing, shaping and compacting of powders, structural optimization, heat treatment, production and finishing, and leads to issues of materials testing and of a design appropriate to the material. The program ''Ceramic High-performance Materials'' has resulted in contributions to the understanding of fundamental interrelationships in terms of materials science, which are summarized in the present volume - broken down into eight special aspects. (orig./RHM)

  1. High Burnup Fuel Performance and Safety Research

    Energy Technology Data Exchange (ETDEWEB)

    Bang, Je Keun; Lee, Chan Bok; Kim, Dae Ho (and others)

    2007-03-15

    The worldwide trend in nuclear fuel development is toward high-burnup, high-performance fuel with improved economics and safety. Because the fuel performance evaluation code INFRA is patented, and its superiority for predicting fuel performance was proven through the IAEA CRP FUMEX-II program, the INFRA code can be used commercially in industry. The INFRA code has been provided to, and usefully employed by, universities and relevant institutes domestically, and it has been used as a reference code in industry for the development of the intrinsic fuel rod design code.

  2. High performance liquid chromatographic determination of ...

    African Journals Online (AJOL)

    STORAGESEVER

    2010-02-08

    ... high performance liquid chromatography (HPLC) grade ... applications. These are important requirements if the reagent is to be applicable to on-line pre- or post-column derivatisation in a possible automation of the analytical ...

  3. Analog circuit design designing high performance amplifiers

    CERN Document Server

    Feucht, Dennis

    2010-01-01

    The third volume Designing High Performance Amplifiers applies the concepts from the first two volumes. It is an advanced treatment of amplifier design/analysis emphasizing both wideband and precision amplification.

  4. High-performance computing using FPGAs

    CERN Document Server

    Benkrid, Khaled

    2013-01-01

    This book is concerned with the emerging field of High Performance Reconfigurable Computing (HPRC), which aims to harness the high performance and relative low power of reconfigurable hardware–in the form Field Programmable Gate Arrays (FPGAs)–in High Performance Computing (HPC) applications. It presents the latest developments in this field from applications, architecture, and tools and methodologies points of view. We hope that this work will form a reference for existing researchers in the field, and entice new researchers and developers to join the HPRC community.  The book includes:  Thirteen application chapters which present the most important application areas tackled by high performance reconfigurable computers, namely: financial computing, bioinformatics and computational biology, data search and processing, stencil computation e.g. computational fluid dynamics and seismic modeling, cryptanalysis, astronomical N-body simulation, and circuit simulation.     Seven architecture chapters which...

  5. Embedded High Performance Scalable Computing Systems

    National Research Council Canada - National Science Library

    Ngo, David

    2003-01-01

    The Embedded High Performance Scalable Computing Systems (EHPSCS) program is a cooperative agreement between Sanders, A Lockheed Martin Company and DARPA that ran for three years, from Apr 1995 - Apr 1998...

  6. Gradient High Performance Liquid Chromatography Method ...

    African Journals Online (AJOL)

    Purpose: To develop a gradient high performance liquid chromatography (HPLC) method for the simultaneous determination of phenylephrine (PHE) and ibuprofen (IBU) in solid ..... nimesulide, phenylephrine. Hydrochloride, chlorpheniramine maleate and caffeine anhydrous in pharmaceutical dosage form. Acta Pol.

  7. Highlighting High Performance: Whitman Hanson Regional High School; Whitman, Massachusetts

    Energy Technology Data Exchange (ETDEWEB)

    2006-06-01

    This brochure describes the key high-performance building features of the Whitman-Hanson Regional High School. The brochure was paid for by the Massachusetts Technology Collaborative as part of their Green Schools Initiative. High-performance features described are daylighting and energy-efficient lighting, indoor air quality, solar and wind energy, building envelope, heating and cooling systems, water conservation, and acoustics. Energy cost savings are also discussed.

  8. High performance computing in Windows Azure cloud

    OpenAIRE

    Ambruš, Dejan

    2013-01-01

    High performance, security, availability, scalability, flexibility and lower costs of maintenance have essentially contributed to the growing popularity of cloud computing in all spheres of life, especially in business. In fact cloud computing offers even more than this. With usage of virtual computing clusters a runtime environment for high performance computing can be efficiently implemented also in a cloud. There are many advantages but also some disadvantages of cloud computing, some ...

  9. High-performance computing — an overview

    Science.gov (United States)

    Marksteiner, Peter

    1996-08-01

    An overview of high-performance computing (HPC) is given. Different types of computer architectures used in HPC are discussed: vector supercomputers, high-performance RISC processors, various parallel computers like symmetric multiprocessors, workstation clusters, massively parallel processors. Software tools and programming techniques used in HPC are reviewed: vectorizing compilers, optimization and vector tuning, optimization for RISC processors; parallel programming techniques like shared-memory parallelism, message passing and data parallelism; and numerical libraries.

  10. Governance among Malaysian high performing companies

    Directory of Open Access Journals (Sweden)

    Asri Marsidi

    2016-07-01

    Full Text Available Well-performing companies have long been linked with effective governance, which is generally reflected in an effective board of directors. However, many issues concerning the attributes of an effective board of directors remain unresolved. Nowadays, diversity is perceived as able to influence corporate performance, because a diverse board is more likely to meet the variety of needs and demands of diverse customers and clients. The study therefore aims to provide a fundamental understanding of governance among high-performing companies in Malaysia.

  11. High-performance OPCPA laser system

    International Nuclear Information System (INIS)

    Zuegel, J.D.; Bagnoud, V.; Bromage, J.; Begishev, I.A.; Puth, J.

    2006-01-01

    Optical parametric chirped-pulse amplification (OPCPA) is ideally suited for amplifying ultra-fast laser pulses since it provides broadband gain across a wide range of wavelengths without many of the disadvantages of regenerative amplification. A high-performance OPCPA system has been demonstrated as a prototype for the front end of the OMEGA Extended Performance (EP) Laser System. (authors)

  12. High-performance OPCPA laser system

    Energy Technology Data Exchange (ETDEWEB)

    Zuegel, J.D.; Bagnoud, V.; Bromage, J.; Begishev, I.A.; Puth, J. [Rochester Univ., Lab. for Laser Energetics, NY (United States)

    2006-06-15

    Optical parametric chirped-pulse amplification (OPCPA) is ideally suited for amplifying ultra-fast laser pulses since it provides broadband gain across a wide range of wavelengths without many of the disadvantages of regenerative amplification. A high-performance OPCPA system has been demonstrated as a prototype for the front end of the OMEGA Extended Performance (EP) Laser System. (authors)

  13. Comparing Dutch and British high performing managers

    NARCIS (Netherlands)

    Waal, A.A. de; Heijden, B.I.J.M. van der; Selvarajah, C.; Meyer, D.

    2016-01-01

    National cultures have a strong influence on the performance of organizations and should be taken into account when studying the traits of high performing managers. At the same time, many studies that focus upon the attributes of successful managers show that there are attributes that are similar

  14. Development of the fast, simple and fully validated high performance liquid chromatographic method with diode array detector for quantification of testosterone esters in an oil-based injectable dosage form.

    Science.gov (United States)

    Kozlik, Petr; Tircova, Barbora

    2016-11-01

    Counterfeit steroids are available on the black market, ultimately reaching consumers who believe they are buying a legitimate pharmaceutical item from the labeled company. In many cases, counterfeit steroids contain lower doses, while other products are overdosed; this can unwittingly expose users to significant health risks. A mixture of testosterone propionate, phenylpropionate, isocaproate and decanoate in an oil-based injectable dosage form is among the drugs most commonly misused illicitly by a variety of athletes. This study developed a new, fast, simple and reliable HPLC method, combined with a simple sample preparation step, to determine testosterone propionate, phenylpropionate, isocaproate and decanoate in an oil-based injectable dosage form without the use of sophisticated and expensive instrumentation. The developed analytical procedure provides high sample throughput: the LC analysis takes only 6 min and the one-step preparation of the oil matrix takes approximately 10 min, with precision ranging from 1.03 to 3.38% (RSD) and accuracy (relative error) within ±2.01%. The method was found to be precise, linear, accurate, sensitive, selective and robust, and is suitable for routine screening of commercial pharmaceutical products for the content of these testosterone esters in their oil-based injectable dosage form, and hence for counterfeit drugs. It was successfully applied to the analysis of nine samples of commercial testosterone mixtures purchased from various sources and will be further used as an effective screening method for these testosterone esters in samples confiscated by the Institute of Forensic Science (Slovakia) during investigations of illegal trade. Copyright © 2016 Elsevier Inc. All rights reserved.
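
    As a rough illustration of the precision and accuracy figures quoted above (%RSD and relative error), the sketch below shows how such replicate-based metrics are commonly computed in Python; the replicate values and the nominal concentration are invented for illustration and are not data from the study.

        import statistics

        def relative_std_dev(replicates):
            """Precision as %RSD: sample standard deviation over the mean, times 100."""
            return statistics.stdev(replicates) / statistics.mean(replicates) * 100.0

        def relative_error(measured_mean, nominal):
            """Accuracy as relative error (%): deviation of the mean from the nominal value."""
            return (measured_mean - nominal) / nominal * 100.0

        # Hypothetical concentrations (mg/mL) recovered from six replicate injections.
        replicates = [24.8, 25.1, 24.9, 25.3, 25.0, 24.7]
        nominal = 25.0  # nominal concentration of the spiked standard

        print(f"RSD = {relative_std_dev(replicates):.2f} %")
        print(f"relative error = {relative_error(statistics.mean(replicates), nominal):.2f} %")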

  15. Noncredible cognitive performance at clinical evaluation of adult ADHD: An embedded validity indicator in a visuospatial working memory test.

    Science.gov (United States)

    Fuermaier, Anselm B M; Tucha, Oliver; Koerts, Janneke; Lange, Klaus W; Weisbrod, Matthias; Aschenbrenner, Steffen; Tucha, Lara

    2017-12-01

    The assessment of performance validity is an essential part of the neuropsychological evaluation of adults with attention-deficit/hyperactivity disorder (ADHD). Most available tools, however, are inaccurate regarding the identification of noncredible performance. This study describes the development of a visuospatial working memory test, including a validity indicator for noncredible cognitive performance of adults with ADHD. Visuospatial working memory of adults with ADHD (n = 48) was first compared to the test performance of healthy individuals (n = 48). Furthermore, a simulation design was performed including 252 individuals who were randomly assigned to either a control group (n = 48) or to 1 of 3 simulation groups who were requested to feign ADHD (n = 204). Additional samples of 27 adults with ADHD and 69 instructed simulators were included to cross-validate findings from the first samples. Adults with ADHD showed impaired visuospatial working memory performance of medium size as compared to healthy individuals. Simulation groups committed significantly more errors and had shorter response times as compared to patients with ADHD. Moreover, binary logistic regression analysis was carried out to derive a validity index that optimally differentiates between true and feigned ADHD. ROC analysis demonstrated high classification rates of the validity index, as shown in excellent specificity (95.8%) and adequate sensitivity (60.3%). The visuospatial working memory test as presented in this study therefore appears sensitive in indicating cognitive impairment of adults with ADHD. Furthermore, the embedded validity index revealed promising results concerning the detection of noncredible cognitive performance of adults with ADHD. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
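
    The study's validity index itself is not reproduced here, but the general recipe it describes (combine error counts and response times with binary logistic regression, then read specificity and sensitivity off an ROC curve) can be sketched as follows on simulated data; the group sizes, feature values and the 95% specificity target are assumptions for illustration.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_curve

        rng = np.random.default_rng(0)

        # Simulated features per participant: [error count, mean response time (s)].
        # Genuine ADHD patients (label 0) vs. instructed simulators (label 1); values invented.
        patients   = np.column_stack([rng.normal(8, 3, 75),  rng.normal(1.4, 0.3, 75)])
        simulators = np.column_stack([rng.normal(15, 4, 75), rng.normal(1.0, 0.3, 75)])
        X = np.vstack([patients, simulators])
        y = np.concatenate([np.zeros(75), np.ones(75)])

        # Logistic regression combines the predictors into a single validity index:
        # the predicted probability of belonging to the simulation group.
        index = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]

        # Pick the cut-off that keeps specificity (1 - false-positive rate) at or above 95%.
        fpr, tpr, thresholds = roc_curve(y, index)
        ok = fpr <= 0.05
        best = np.argmax(tpr[ok])
        print(f"cut-off = {thresholds[ok][best]:.2f}, "
              f"specificity = {1 - fpr[ok][best]:.3f}, sensitivity = {tpr[ok][best]:.3f}")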

  16. High Performance Work Systems for Online Education

    Science.gov (United States)

    Contacos-Sawyer, Jonna; Revels, Mark; Ciampa, Mark

    2010-01-01

    The purpose of this paper is to identify the key elements of a High Performance Work System (HPWS) and explore the possibility of implementation in an online institution of higher learning. With the projected rapid growth of the demand for online education and its importance in post-secondary education, providing high quality curriculum, excellent…

  17. Teacher Accountability at High Performing Charter Schools

    Science.gov (United States)

    Aguirre, Moises G.

    2016-01-01

    This study will examine the teacher accountability and evaluation policies and practices at three high performing charter schools located in San Diego County, California. Charter schools are exempted from many laws, rules, and regulations that apply to traditional school systems. By examining the teacher accountability systems at high performing…

  18. Validation of a Dry Model for Assessing the Performance of Arthroscopic Hip Labral Repair.

    Science.gov (United States)

    Phillips, Lisa; Cheung, Jeffrey J H; Whelan, Daniel B; Murnaghan, Michael Lucas; Chahal, Jas; Theodoropoulos, John; Ogilvie-Harris, Darrell; Macniven, Ian; Dwyer, Tim

    2017-07-01

    Arthroscopic hip labral repair is a technically challenging and demanding surgical technique with a steep learning curve. Arthroscopic simulation allows trainees to develop these skills in a safe environment. The purpose of this study was to evaluate the use of a combination of assessment ratings for the performance of arthroscopic hip labral repair on a dry model. Cross-sectional study; Level of evidence, 3. A total of 47 participants including orthopaedic surgery residents (n = 37), sports medicine fellows (n = 5), and staff surgeons (n = 5) performed arthroscopic hip labral repair on a dry model. Prior arthroscopic experience was noted. Participants were evaluated by 2 orthopaedic surgeons using a task-specific checklist, the Arthroscopic Surgical Skill Evaluation Tool (ASSET), task completion time, and a final global rating scale. All procedures were video-recorded and scored by an orthopaedic fellow blinded to the level of training of each participant. The internal consistency/reliability (Cronbach alpha) using the total ASSET score for the procedure was high (intraclass correlation coefficient > 0.9). One-way analysis of variance for the total ASSET score demonstrated a difference between participants based on the level of training ( F 3,43 = 27.8, P 0.9). The results of this study demonstrate that the use of dry models to assess the performance of arthroscopic hip labral repair by trainees is both valid and reliable. Further research will be required to demonstrate a correlation with performance on cadaveric specimens or in the operating room.

  19. Design, Monitoring, and Validation of a High Performance Sustainable Building

    Science.gov (United States)

    2013-08-01

  20. Uncovering and Validating Toughening Mechanisms in High Performance Composites

    Science.gov (United States)

    2015-09-17

  1. Development and Validation of Reversed Phase High Performance ...

    African Journals Online (AJOL)

    Methods: Standards and samples were prepared by dissolving amlodipine besylate standard or ...

  2. Validação de metodologia analítica para o doseamento simultâneo de mebendazol e tiabendazol por cromatografia líquida de alta eficiência Validation of analytical methodology for simultaneous evaluation of mebendazole and thiabendazole in tablets by high performance liquid chromatography

    Directory of Open Access Journals (Sweden)

    Núbia K. de Paula

    2007-10-01

    Full Text Available The aim of this work was to develop and validate an analytical methodology for the simultaneous determination of mebendazole and thiabendazole, two benzimidazoles used as anthelmintics. The method was based on high performance liquid chromatography, using a C18 column, a mobile phase composed of 0.05 mol L-1 KH2PO4 and methanol 40:60 (v/v), and UV detection at 312 nm. The results showed that the method was linear from 60.0 to 140.0 µg mL-1 for mebendazole and from 99.6 to 232.4 µg mL-1 for thiabendazole, and it was considered selective, accurate, precise and robust according to the specific resolution of ANVISA, the Brazilian regulatory agency.
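
    Linearity over a validated range, as reported above, is usually demonstrated with a least-squares calibration line and its coefficient of determination; the sketch below illustrates the computation on invented calibration points spanning the mebendazole range, not on the authors' data.

        import numpy as np

        # Hypothetical calibration points (concentration in ug/mL, peak area in arbitrary units).
        conc = np.array([60.0, 80.0, 100.0, 120.0, 140.0])
        area = np.array([1210.0, 1605.0, 2015.0, 2390.0, 2810.0])

        slope, intercept = np.polyfit(conc, area, 1)
        predicted = slope * conc + intercept
        ss_res = np.sum((area - predicted) ** 2)
        ss_tot = np.sum((area - area.mean()) ** 2)
        r_squared = 1.0 - ss_res / ss_tot  # linearity is usually judged by r^2 close to 1

        print(f"area = {slope:.2f} * conc + {intercept:.2f},  r^2 = {r_squared:.4f}")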

  3. Disentangling the Predictive Validity of High School Grades for Academic Success in University

    Science.gov (United States)

    Vulperhorst, Jonne; Lutz, Christel; de Kleijn, Renske; van Tartwijk, Jan

    2018-01-01

    To refine selective admission models, we investigate which measure of prior achievement has the best predictive validity for academic success in university. We compare the predictive validity of three core high school subjects to the predictive validity of high school grade point average (GPA) for academic achievement in a liberal arts university…

  4. High temperature performance of polymer composites

    CERN Document Server

    Keller, Thomas

    2014-01-01

    The authors explain the changes in the thermophysical and thermomechanical properties of polymer composites under elevated temperatures and fire conditions. Using microscale physical and chemical concepts they allow researchers to find reliable solutions to their engineering needs on the macroscale. In a unique combination of experimental results and quantitative models, a framework is developed to realistically predict the behavior of a variety of polymer composite materials over a wide range of thermal and mechanical loads. In addition, the authors treat extreme fire scenarios up to more than 1000°C for two hours, presenting heat-protection methods to improve the fire resistance of composite materials and full-scale structural members, and discuss their performance after fire exposure. Thanks to the microscopic approach, the developed models are valid for a variety of polymer composites and structural members, making this work applicable to a wide audience, including materials scientists, polymer chemist...

  5. Reliability and validity of match performance analysis in soccer : a multidimensional qualitative evaluation of opponent interaction

    OpenAIRE

    Tenga, Albin

    2010-01-01

    Doctoral dissertation – Norges idrettshøgskole (Norwegian School of Sport Sciences), 2010. Match performance analysis is widely used as a method for studying technical, tactical and physical aspects of player and team performance in a soccer match. Ensuring the validity and reliability of the collected data is therefore important if match performance analysis is to meet its intents and purposes effectively. However, most studies on soccer match performance use unidimensional frequency data based on analyses done ...

  6. High performance bio-integrated devices

    Science.gov (United States)

    Kim, Dae-Hyeong; Lee, Jongha; Park, Minjoon

    2014-06-01

    In recent years, personalized electronics for medical applications, in particular, have attracted much attention with the rise of smartphones, because coupling such devices with smartphones enables continuous health monitoring in patients' daily lives. It is expected that high performance biomedical electronics integrated with the human body can open new opportunities in ubiquitous healthcare. However, the mechanical and geometrical constraints inherent in all standard forms of high performance rigid wafer-based electronics raise unique integration challenges with biotic entities. Here, we describe materials and design constructs for high performance skin-mountable bio-integrated electronic devices, which incorporate arrays of single crystalline inorganic nanomembranes. The resulting electronic devices include flexible and stretchable electrophysiology electrodes and sensors coupled with active electronic components. These advances in bio-integrated systems create new directions in personalized health monitoring and/or human-machine interfaces.

  7. Designing a High Performance Parallel Personal Cluster

    OpenAIRE

    Kapanova, K. G.; Sellier, J. M.

    2016-01-01

    Today, many scientific and engineering areas require high performance computing to perform computationally intensive experiments. For example, many advances in transport phenomena, thermodynamics, material properties, computational chemistry and physics are possible only because of the availability of such large scale computing infrastructures. Yet many challenges are still open. The cost of energy consumption, cooling, competition for resources have been some of the reasons why the scientifi...

  8. vSphere high performance cookbook

    CERN Document Server

    Sarkar, Prasenjit

    2013-01-01

    vSphere High Performance Cookbook is written in a practical, helpful style with numerous recipes focusing on answering and providing solutions to common, and not-so common, performance issues and problems.The book is primarily written for technical professionals with system administration skills and some VMware experience who wish to learn about advanced optimization and the configuration features and functions for vSphere 5.1.

  9. High performance parallel I/O

    CERN Document Server

    Prabhat

    2014-01-01

    Gain Critical Insight into the Parallel I/O EcosystemParallel I/O is an integral component of modern high performance computing (HPC), especially in storing and processing very large datasets to facilitate scientific discovery. Revealing the state of the art in this field, High Performance Parallel I/O draws on insights from leading practitioners, researchers, software architects, developers, and scientists who shed light on the parallel I/O ecosystem.The first part of the book explains how large-scale HPC facilities scope, configure, and operate systems, with an emphasis on choices of I/O har

  10. Performance Validity Testing in Neuropsychology: Methods for Measurement Development and Maximizing Diagnostic Accuracy.

    Science.gov (United States)

    Wodushek, Thomas R; Greher, Michael R

    2017-05-01

    In the first column in this 2-part series, Performance Validity Testing in Neuropsychology: Scientific Basis and Clinical Application-A Brief Review, the authors introduced performance validity tests (PVTs) and their function, provided a justification for why they are necessary, traced their ongoing endorsement by neuropsychological organizations, and described how they are used and interpreted by ever increasing numbers of clinical neuropsychologists. To enhance readers' understanding of these measures, this second column briefly describes common detection strategies used in PVTs as well as the typical methods used to validate new PVTs and determine cut scores for valid/invalid determinations. We provide a discussion of the latest research demonstrating how neuropsychologists can combine multiple PVTs in a single battery to improve sensitivity/specificity to invalid responding. Finally, we discuss future directions for the research and application of PVTs.
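
    To make the "combine multiple PVTs" point concrete, the sketch below estimates how the aggregate false-positive rate changes when an examinee is flagged only after failing several tests, under the simplifying (and debatable) assumption that the PVTs are independent and share one per-test false-positive rate; the numbers are illustrative only.

        from math import comb

        def false_positive_rate(n_tests, per_test_fpr, k_required):
            """P(a credible examinee fails at least k_required of n_tests PVTs),
            assuming independent tests with a common per-test false-positive rate."""
            p = per_test_fpr
            return sum(comb(n_tests, k) * p**k * (1 - p)**(n_tests - k)
                       for k in range(k_required, n_tests + 1))

        # With five PVTs each set at 10% false positives, flagging on two or more failures
        # keeps the aggregate false-positive rate far below flagging on any single failure.
        print(f"any single failure:   {false_positive_rate(5, 0.10, 1):.3f}")
        print(f"two or more failures: {false_positive_rate(5, 0.10, 2):.3f}")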

  11. Assessment of performance validity in the Stroop Color and Word Test in mild traumatic brain injury patients: a criterion-groups validation design.

    Science.gov (United States)

    Guise, Brian J; Thompson, Matthew D; Greve, Kevin W; Bianchini, Kevin J; West, Laura

    2014-03-01

    The current study assessed performance validity on the Stroop Color and Word Test (Stroop) in mild traumatic brain injury (TBI) using criterion-groups validation. The sample consisted of 77 patients with a reported history of mild TBI. Data from 42 moderate-severe TBI and 75 non-head-injured patients with other clinical diagnoses were also examined. TBI patients were categorized on the basis of Slick, Sherman, and Iverson (1999) criteria for malingered neurocognitive dysfunction (MND). Classification accuracy is reported for three indicators (Word, Color, and Color-Word residual raw scores) from the Stroop across a range of injury severities. With false-positive rates set at approximately 5%, sensitivity was as high as 29%. The clinical implications of these findings are discussed. © 2012 The British Psychological Society.
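
    A minimal sketch of the cut-score logic behind such criterion-groups figures: hold the false-positive rate in the credible group at roughly 5% and then read off sensitivity in the malingering group. The score distributions below are simulated and are not the study's data.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical Stroop raw scores: credible TBI patients vs. patients meeting
        # criteria for malingered neurocognitive dysfunction (values are invented).
        credible = rng.normal(85, 12, 120)
        mnd      = rng.normal(68, 14, 40)

        # Cut score = 5th percentile of the credible group, so the false-positive rate is
        # held near 5%; sensitivity is the fraction of the MND group at or below that score.
        cut = np.percentile(credible, 5)
        fp_rate = np.mean(credible <= cut)
        sensitivity = np.mean(mnd <= cut)
        print(f"cut score = {cut:.1f}, FPR = {fp_rate:.2%}, sensitivity = {sensitivity:.2%}")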

  12. Predictive validity of a three-dimensional model of performance anxiety in the context of tae-kwon-do.

    Science.gov (United States)

    Cheng, Wen-Nuan Kara; Hardy, Lew; Woodman, Tim

    2011-02-01

    We tested the predictive validity of the recently validated three-dimensional model of performance anxiety (Chang, Hardy, & Markland, 2009) with elite tae-kwon-do competitors (N = 99). This conceptual framework emphasized the adaptive potential of anxiety by including a regulatory dimension (reflected by perceived control) along with the intensity-oriented dimensions of cognitive and physiological anxiety. Anxiety was assessed 30 min before a competitive contest using the Three-Factor Anxiety Inventory. Competitors rated their performance on a tae-kwon-do-specific performance scale within 30 min after completion of their contest. Moderated hierarchical regression analyses revealed initial support for the predictive validity of the three-dimensional performance anxiety model. The regulatory dimension of anxiety (perceived control) revealed significant main and interactive effects on performance. This dimension appeared to be adaptive, as performance was better under high than low perceived control, and best vs. worst performance was associated with highest vs. lowest perceived control, respectively. Results are discussed in terms of the importance of the regulatory dimension of anxiety.

  13. Reliability, validity and description of timed performance of the Jebsen-Taylor Test in patients with muscular dystrophies.

    Science.gov (United States)

    Artilheiro, Mariana Cunha; Fávero, Francis Meire; Caromano, Fátima Aparecida; Oliveira, Acary de Souza Bulle; Carvas, Nelson; Voos, Mariana Callil; Sá, Cristina Dos Santos Cardoso de

    2017-12-08

    The Jebsen-Taylor Test evaluates upper limb function by measuring timed performance on everyday activities. The test is used to assess and monitor the progression of patients with Parkinson disease, cerebral palsy, stroke and brain injury. The aims were to analyze the reliability, internal consistency and validity of the Jebsen-Taylor Test in people with Muscular Dystrophy and to describe and classify their upper limb timed performance. Fifty patients with Muscular Dystrophy were assessed. Non-dominant and dominant upper limb performances on the Jebsen-Taylor Test were filmed. Two raters evaluated timed performance for inter-rater reliability analysis. Test-retest reliability was investigated using intraclass correlation coefficients. Internal consistency was assessed using Cronbach's alpha. Construct validity was assessed by comparing the Jebsen-Taylor Test with the Performance of Upper Limb module. The internal consistency of the Jebsen-Taylor Test was good (Cronbach's α=0.98). Inter-rater reliability was very high (intraclass correlation coefficients 0.903-0.999), except for the writing task (0.772-1.000). Strong correlations between the Jebsen-Taylor Test and the Performance of Upper Limb module were found (rho=-0.712). The Jebsen-Taylor Test is a reliable and valid measure of timed performance for people with Muscular Dystrophy. Copyright © 2017 Associação Brasileira de Pesquisa e Pós-Graduação em Fisioterapia. Published by Elsevier Editora Ltda. All rights reserved.
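
    For reference, Cronbach's alpha of the kind reported above can be computed directly from a subjects-by-items score matrix; the sketch below uses invented timed scores, not the study's data.

        import numpy as np

        def cronbach_alpha(scores):
            """Cronbach's alpha for a (subjects x items) matrix:
            k/(k-1) * (1 - sum of item variances / variance of total scores)."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_var = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_var / total_var)

        # Hypothetical timed scores (seconds) for 6 subjects on 4 subtasks.
        scores = [[12.1, 10.4, 15.2,  9.8],
                  [14.0, 11.2, 16.8, 10.9],
                  [ 9.5,  8.1, 12.0,  7.9],
                  [18.2, 15.0, 21.5, 14.6],
                  [11.0,  9.7, 14.1,  9.2],
                  [16.4, 13.3, 19.0, 12.8]]
        print(f"alpha = {cronbach_alpha(scores):.2f}")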

  14. Strategy Guideline: Partnering for High Performance Homes

    Energy Technology Data Exchange (ETDEWEB)

    Prahl, D.

    2013-01-01

    High performance houses require a high degree of coordination and have significant interdependencies between various systems in order to perform properly, meet customer expectations, and minimize risks for the builder. Responsibility for the key performance attributes is shared across the project team and can be well coordinated through advanced partnering strategies. For high performance homes, traditional partnerships need to be matured to the next level and be expanded to all members of the project team including trades, suppliers, manufacturers, HERS raters, designers, architects, and building officials as appropriate. In an environment where the builder is the only source of communication between trades and consultants and where relationships are, in general, adversarial as opposed to cooperative, the chances that any one building system will fail are greater. Furthermore, it is much harder for the builder to identify and capitalize on synergistic opportunities. Partnering can help bridge the cross-functional aspects of the systems approach and achieve performance-based criteria. Critical success factors for partnering include support from top management, mutual trust, effective and open communication, effective coordination around common goals, team building, appropriate use of an outside facilitator, a partnering charter, progress toward common goals, an effective problem-solving process, long-term commitment, continuous improvement, and a positive experience for all involved.

  15. Long-term bridge performance high priority bridge performance issues.

    Science.gov (United States)

    2014-10-01

    Bridge performance is a multifaceted issue involving performance of materials and protective systems, : performance of individual components of the bridge, and performance of the structural system as a whole. The : Long-Term Bridge Performance (LTBP)...

  16. An Introduction to High Performance Fortran

    Directory of Open Access Journals (Sweden)

    John Merlin

    1995-01-01

    Full Text Available High Performance Fortran (HPF is an informal standard for extensions to Fortran 90 to assist its implementation on parallel architectures, particularly for data-parallel computation. Among other things, it includes directives for specifying data distribution across multiple memories, and concurrent execution features. This article provides a tutorial introduction to the main features of HPF.

  17. High performance computing on vector systems

    CERN Document Server

    Roller, Sabine

    2008-01-01

    Presents the developments in high-performance computing and simulation on modern supercomputer architectures. This book covers trends in hardware and software development in general and specifically the vector-based systems and heterogeneous architectures. It presents innovative fields like coupled multi-physics or multi-scale simulations.

  18. High Performance Electronics on Flexible Silicon

    KAUST Repository

    Sevilla, Galo T.

    2016-09-01

    Over the last few years, flexible electronic systems have gained increased attention from researchers around the world because of their potential to create new applications such as flexible displays, flexible energy harvesters, artificial skin, and health monitoring systems that cannot be integrated with conventional wafer based complementary metal oxide semiconductor processes. Most of the current efforts to create flexible high performance devices are based on the use of organic semiconductors. However, inherent material limitations make them unsuitable for big data processing and high speed communications. The objective of my doctoral dissertation is to develop integration processes that allow the transformation of rigid high performance electronics into flexible ones while maintaining their performance and cost. In this work, two different techniques to transform inorganic complementary metal-oxide-semiconductor electronics into flexible ones have been developed using industry compatible processes. Furthermore, these techniques were used to realize flexible discrete devices and circuits which include metal-oxide-semiconductor field-effect-transistors, the first demonstration of flexible Fin-field-effect-transistors, and metal-oxide-semiconductor-based circuits. Finally, this thesis presents a new technique to package, integrate, and interconnect flexible high performance electronics using low cost additive manufacturing techniques such as 3D printing and inkjet printing. This thesis contains in-depth studies on the electrical, mechanical, and thermal properties of the fabricated devices.

  19. Debugging a high performance computing program

    Science.gov (United States)

    Gooding, Thomas M.

    2013-08-20

    Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
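
    A minimal sketch of the idea described in this abstract, grouping threads by the address of their calling instruction so that outlier threads stand out, is shown below; it is an illustration on an invented snapshot, not the patented implementation.

        from collections import defaultdict

        def group_threads_by_call_site(call_sites):
            """Group thread ids by the address of the instruction each thread is calling from.
            Threads parked at an unusual address show up as small groups."""
            groups = defaultdict(list)
            for thread_id, address in call_sites.items():
                groups[address].append(thread_id)
            return dict(groups)

        # Hypothetical snapshot: thread id -> calling-instruction address.
        snapshot = {0: 0x4005A0, 1: 0x4005A0, 2: 0x4005A0, 3: 0x4009F4,
                    4: 0x4005A0, 5: 0x4005A0, 6: 0x4005A0, 7: 0x4005A0}

        for address, threads in sorted(group_threads_by_call_site(snapshot).items(),
                                       key=lambda kv: len(kv[1])):
            print(f"0x{address:x}: {len(threads)} thread(s) -> {threads}")
        # The lone thread at 0x4009f4 is the first candidate for a defective thread.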

  20. Technology Leadership in Malaysia's High Performance School

    Science.gov (United States)

    Yieng, Wong Ai; Daud, Khadijah Binti

    2017-01-01

    Headmaster as leader of the school also plays a role as a technology leader. This applies to the high performance schools (HPS) headmaster as well. The HPS excel in all aspects of education. In this study, researcher is interested in examining the role of the headmaster as a technology leader through interviews with three headmasters of high…

  1. Toward High Performance in Industrial Refrigeration Systems

    DEFF Research Database (Denmark)

    Thybo, C.; Izadi-Zamanabadi, Roozbeh; Niemann, H.

    2002-01-01

    Achieving high performance in complex industrial systems requires information manipulation at different system levels. The paper shows how different models of the same subsystems, based on different qualities of information/data, are used for fault diagnosis as well as robust control design...

  2. Towards high performance in industrial refrigeration systems

    DEFF Research Database (Denmark)

    Thybo, C.; Izadi-Zamanabadi, R.; Niemann, Hans Henrik

    2002-01-01

    Achieving high performance in complex industrial systems requires information manipulation at different system levels. The paper shows how different models of the same subsystems, based on different qualities of information/data, are used for fault diagnosis as well as robust control design...

  3. Project materials [Commercial High Performance Buildings Project

    Energy Technology Data Exchange (ETDEWEB)

    None

    2001-01-01

    The Consortium for High Performance Buildings (ChiPB) is an outgrowth of DOE's Commercial Whole Buildings Roadmapping initiatives. It is a team-driven public/private partnership that seeks to enable and demonstrate the benefit of buildings that are designed, built and operated to be energy efficient, environmentally sustainable, of superior quality, and cost effective.

  4. High performance structural ceramics for nuclear industry

    International Nuclear Information System (INIS)

    Pujari, Vimal K.; Faker, Paul

    2006-01-01

    A family of Saint-Gobain structural ceramic materials and products produced by its High Performance Refractory Division is described. Over the last fifty years or so, Saint-Gobain has been a leader in developing novel non-oxide-ceramic-based materials, processes and products for application in the nuclear, chemical, automotive, defense and mining industries.

  5. A new high performance current transducer

    International Nuclear Information System (INIS)

    Tang Lijun; Lu Songlin; Li Deming

    2003-01-01

    A DC-100 kHz current transducer has been developed using a new technique based on the zero-flux detection principle. It is shown that the new current transducer offers high performance, that its magnetic core need not be selected very stringently, and that it is easy to manufacture.

  6. Reliability and Validity of the Turkish Version of the Job Performance Scale Instrument.

    Science.gov (United States)

    Harmanci Seren, Arzu Kader; Tuna, Rujnan; Eskin Bacaksiz, Feride

    2018-02-01

    Objective measurement of the job performance of nursing staff using valid and reliable instruments is important in the evaluation of healthcare quality. A current, valid, and reliable instrument that specifically measures the performance of nurses is required for this purpose. The aim of this study was to determine the validity and reliability of the Turkish version of the Job Performance Instrument. This study used a methodological design and a sample of 240 nurses working at different units in four hospitals in Istanbul, Turkey. A descriptive data form, the Job Performance Scale, and the Employee Performance Scale were used to collect data. Data were analyzed using IBM SPSS Statistics Version 21.0 and LISREL Version 8.51. On the basis of the data analysis, the instrument was revised. Some items were deleted, and subscales were combined. The Turkish version of the Job Performance Instrument was determined to be valid and reliable to measure the performance of nurses. The instrument is suitable for evaluating current nursing roles.

  7. Investigation of high-alpha lateral-directional control power requirements for high-performance aircraft

    Science.gov (United States)

    Foster, John V.; Ross, Holly M.; Ashley, Patrick A.

    1993-01-01

    Designers of the next-generation fighter and attack airplanes are faced with the requirements of good high angle-of-attack maneuverability as well as efficient high speed cruise capability with low radar cross section (RCS) characteristics. As a result, they are challenged with the task of making critical design trades to achieve the desired levels of maneuverability and performance. This task has highlighted the need for comprehensive, flight-validated lateral-directional control power design guidelines for high angles of attack. A joint NASA/U.S. Navy study has been initiated to address this need and to investigate the complex flight dynamics characteristics and control requirements for high angle-of-attack lateral-directional maneuvering. A multi-year research program is underway which includes ground-based piloted simulation and flight validation. This paper will give a status update of this program that will include a program overview, description of test methodology and preliminary results.

  8. Strategy Guideline. High Performance Residential Lighting

    Energy Technology Data Exchange (ETDEWEB)

    Holton, J. [IBACOS, Inc., Pittsburgh, PA (United States)

    2012-02-01

    This report has been developed to provide a tool for the understanding and application of high performance lighting in the home. The strategies featured in this guide are drawn from recent advances in commercial lighting for application to typical spaces found in residential buildings. This guide offers strategies to greatly reduce lighting energy use through the application of high quality fluorescent and light emitting diode (LED) technologies. It is important to note that these strategies not only save energy in the home but also serve to satisfy the homeowner’s expectations for high quality lighting.

  9. Architecting Web Sites for High Performance

    Directory of Open Access Journals (Sweden)

    Arun Iyengar

    2002-01-01

    Full Text Available Web site applications are some of the most challenging high-performance applications currently being developed and deployed. The challenges emerge from the specific combination of high variability in workload characteristics and of high performance demands regarding the service level, scalability, availability, and costs. In recent years, a large body of research has addressed the Web site application domain, and a host of innovative software and hardware solutions have been proposed and deployed. This paper is an overview of recent solutions concerning the architectures and the software infrastructures used in building Web site applications. The presentation emphasizes three of the main functions in a complex Web site: the processing of client requests, the control of service levels, and the interaction with remote network caches.

  10. Validation of a checklist to assess ward round performance in internal medicine

    DEFF Research Database (Denmark)

    Nørgaard, Kirsten; Ringsted, Charlotte; Dolmans, Diana

    2004-01-01

    BACKGROUND: Ward rounds are an essential responsibility for doctors in hospital settings. Tools for guiding and assessing trainees' performance of ward rounds are needed. A checklist was developed for that purpose for use with trainees in internal medicine. OBJECTIVE: To assess the content and construct validity of the task-specific checklist. METHODS: To determine content validity, a questionnaire was mailed to 295 internists. They were requested to give their opinion on the relevance of each item included on the checklist and to indicate the comprehensiveness of the checklist. To determine construct validity, an observer assessed 4 groups of doctors during performance of a complete ward round (n = 32). The nurse who accompanied the doctor on rounds made a global assessment of the performance. RESULTS: The response rate to the questionnaire was 80.7%. The respondents found that all 10 items...

  11. Performance Validation and Scaling of a Capillary Membrane Solid-Liquid Separation System

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, S; Cook, J; Juratovac, J; Goodwillie, J; Burke, T

    2011-10-25

    Algaeventure Systems (AVS) has previously demonstrated an innovative technology for dewatering algae slurries that dramatically reduces energy consumption by utilizing surface physics and capillary action. Funded by a $6M ARPA-E award, transforming the original Harvesting, Dewatering and Drying (HDD) prototype machine into a commercially viable technology has required significant attention to material performance, integration of sensors and control systems, and especially addressing scaling issues that would allow processing extreme volumes of algal cultivation media/slurry. Decoupling the harvesting, dewatering and drying processes, and addressing the rate limiting steps for each of the individual steps, has allowed for the development of individual technologies that may be tailored to the specific needs of various cultivation systems. The primary performance metric used by AVS to assess the economic viability of its Solid-Liquid Separation (SLS) dewatering technology is algae mass production rate as a function of power consumption (cost), cake solids/moisture content, and solids capture efficiency. An associated secondary performance metric is algae mass loading rate, which is dependent on hydraulic loading rate, area-specific hydraulic processing capacity (gpm/in2), filter:capillary belt contact area, and influent algae concentration. The system is capable of dewatering 4 g/L (0.4%) algae streams to solids concentrations up to 30% with capture efficiencies of 80+%; however, mass production is highly dependent on average cell size (which determines filter mesh size and percent open area). This paper will present data detailing the scaling efforts to date. Characterization and performance data for novel membranes, as well as optimization of off-the-shelf filter materials, will be examined. Third party validation from Ohio University on performance and operating cost, as well as design modification suggestions, will be discussed. Extrapolation of current productivities
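
    As a back-of-the-envelope link between the metrics listed above (hydraulic loading rate, influent concentration and solids capture efficiency) and algae mass production, the sketch below converts an assumed operating point into a dry-mass rate; the flow, concentration and efficiency values are illustrative, not AVS figures.

        LITERS_PER_GALLON = 3.785

        def dry_mass_rate_kg_per_h(flow_gpm, algae_g_per_l, capture_efficiency):
            """Dry algae mass captured per hour from a slurry stream:
            volumetric flow * influent concentration * solids capture efficiency."""
            liters_per_hour = flow_gpm * LITERS_PER_GALLON * 60.0
            return liters_per_hour * algae_g_per_l * capture_efficiency / 1000.0

        # Illustrative operating point: 50 gpm of a 4 g/L (0.4%) slurry at 80% capture.
        print(f"{dry_mass_rate_kg_per_h(50.0, 4.0, 0.80):.1f} kg dry algae per hour")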

  12. NINJA: Java for High Performance Numerical Computing

    Directory of Open Access Journals (Sweden)

    José E. Moreira

    2002-01-01

    Full Text Available When Java was first introduced, there was a perception that its many benefits came at a significant performance cost. In the particularly performance-sensitive field of numerical computing, initial measurements indicated a hundred-fold performance disadvantage between Java and more established languages such as Fortran and C. Although much progress has been made, and Java now can be competitive with C/C++ in many important situations, significant performance challenges remain. Existing Java virtual machines are not yet capable of performing the advanced loop transformations and automatic parallelization that are now common in state-of-the-art Fortran compilers. Java also has difficulties in implementing complex arithmetic efficiently. These performance deficiencies can be attacked with a combination of class libraries (packages, in Java) that implement truly multidimensional arrays and complex numbers, and new compiler techniques that exploit the properties of these class libraries to enable other, more conventional, optimizations. Two compiler techniques, versioning and semantic expansion, can be leveraged to allow fully automatic optimization and parallelization of Java code. Our measurements with the NINJA prototype Java environment show that Java can be competitive in performance with highly optimized and tuned Fortran code.

  13. Validation of risk-based performance indicators: Safety system function trends

    International Nuclear Information System (INIS)

    Boccio, J.L.; Vesely, W.E.; Azarm, M.A.; Carbonaro, J.F.; Usher, J.L.; Oden, N.

    1989-10-01

    This report describes and applies a process for validating a model for a risk-based performance indicator. The purpose of the risk-based indicator evaluated, Safety System Function Trend (SSFT), is to monitor the unavailability of selected safety systems. Interim validation of this indicator is based on three aspects: a theoretical basis, an empirical basis relying on statistical correlations, and case studies employing 25 plant years of historical data collected from five plants for a number of safety systems. Results using the SSFT model are encouraging. Application of the model through case studies dealing with the performance of important safety systems shows that statistically significant trends in, and levels of, system performance can be discerned which thereby can provide leading indications of degrading and/or improving performances. Methods for developing system performance tolerance bounds are discussed and applied to aid in the interpretation of the trends in this risk-based indicator. Some additional characteristics of the SSFT indicator, learned through the data-collection efforts and subsequent data analyses performed, are also discussed. The usefulness and practicality of other data sources for validation purposes are explored. Further validation of this indicator is noted. Also, additional research is underway in developing a more detailed estimator of system unavailability. 9 refs., 18 figs., 5 tabs
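
    The SSFT estimator itself is not reproduced here, but the general pattern of trending a quarterly unavailability estimate and checking it against simple tolerance bounds can be sketched as follows; the quarterly values and the mean-plus-two-sigma band are assumptions for illustration.

        import numpy as np

        # Hypothetical quarterly unavailability estimates for one safety system
        # (fraction of time the system would have failed to perform on demand).
        quarters = np.arange(1, 13)
        unavailability = np.array([0.008, 0.007, 0.009, 0.010, 0.008, 0.011,
                                   0.012, 0.013, 0.012, 0.015, 0.014, 0.016])

        # Linear trend: a positive slope suggests degrading performance.
        slope, intercept = np.polyfit(quarters, unavailability, 1)

        # Simple tolerance band around the baseline (first eight quarters): mean + 2 sigma.
        baseline = unavailability[:8]
        upper = baseline.mean() + 2 * baseline.std(ddof=1)
        flagged = quarters[unavailability > upper]

        print(f"trend slope = {slope:.4f} per quarter")
        print(f"quarters above tolerance bound ({upper:.4f}): {flagged.tolist()}")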

  14. Development of high performance cladding materials

    International Nuclear Information System (INIS)

    Park, Jeong Yong; Jeong, Y. H.; Park, S. Y.

    2010-04-01

    Irradiation tests of HANA claddings were conducted, and a series of evaluations of next-generation HANA claddings, including in-pile and out-of-pile performance tests, was also carried out at the Halden research reactor. The 6th irradiation test has been completed successfully in the Halden research reactor. As a result, HANA claddings showed high performance, with corrosion resistance improved by 40% compared to Zircaloy-4. The high performance of HANA claddings in the Halden test has enabled a lead test rod program as the first step toward the commercialization of HANA claddings. A database has been established for thermal and LOCA-related properties. It was confirmed from the thermal shock test that the integrity of HANA claddings was maintained well beyond the region required by the criteria regulated by the NRC. The manufacturing process for strips was established in order to apply HANA alloys, which were originally developed for claddings, to spacer grids. 250 kinds of model alloys for the next-generation claddings were designed and manufactured over four rounds and used to select the preliminary candidate alloys for the next-generation claddings. The selected candidate alloys showed 50% better corrosion resistance and 20% improved high-temperature oxidation resistance compared to foreign advanced claddings. The manufacturing conditions controlling the performance of the dual-cooled claddings were established by changing the reduction rate in the cold working steps.

  15. A Linux Workstation for High Performance Graphics

    Science.gov (United States)

    Geist, Robert; Westall, James

    2000-01-01

    The primary goal of this effort was to provide a low-cost method of obtaining high-performance 3-D graphics using an industry standard library (OpenGL) on PC class computers. Previously, users interested in doing substantial visualization or graphical manipulation were constrained to using specialized, custom hardware most often found in computers from Silicon Graphics (SGI). We provided an alternative to expensive SGI hardware by taking advantage of third-party, 3-D graphics accelerators that have now become available at very affordable prices. To make use of this hardware, our goal was to provide a free, redistributable, and fully compatible OpenGL work-alike library so that existing bodies of code could simply be recompiled for PC class machines running a free version of Unix. This should allow substantial cost savings while greatly expanding the population of people with access to a serious graphics development and viewing environment. This should offer a means for NASA to provide a spectrum of graphics performance to its scientists, supplying high-end specialized SGI hardware for high-performance visualization while fulfilling the requirements of medium and lower performance applications with generic, off-the-shelf components and still maintaining compatibility between the two.

  16. The path toward HEP High Performance Computing

    CERN Document Server

    Apostolakis, John; Carminati, Federico; Gheata, Andrei; Wenzel, Sandro

    2014-01-01

    High Energy Physics code has been known for making poor use of high performance computing architectures. Efforts in optimising HEP code on vector and RISC architectures have yielded limited results, and recent studies have shown that, on modern architectures, it achieves between 10% and 50% of peak performance. Although several successful attempts have been made to port selected codes to GPUs, no major HEP code suite has a 'High Performance' implementation. With the LHC undergoing a major upgrade and a number of challenging experiments on the drawing board, HEP can no longer neglect the less-than-optimal performance of its code and has to try to make the best use of the hardware. This activity is one of the foci of the SFT group at CERN, which hosts, among others, the Root and Geant4 projects. The activity of the experiments is shared and coordinated via a Concurrency Forum, where the experience in optimising HEP code is presented and discussed. Another activity is the Geant-V project, centred on th...

  17. Validation of the Cognition Test Battery for Spaceflight in a Sample of Highly Educated Adults.

    Science.gov (United States)

    Moore, Tyler M; Basner, Mathias; Nasrini, Jad; Hermosillo, Emanuel; Kabadi, Sushila; Roalf, David R; McGuire, Sarah; Ecker, Adrian J; Ruparel, Kosha; Port, Allison M; Jackson, Chad T; Dinges, David F; Gur, Ruben C

    2017-10-01

    Neuropsychological changes that may occur due to the environmental and psychological stressors of prolonged spaceflight motivated the development of the Cognition Test Battery. The battery was designed to assess multiple domains of neurocognitive functions linked to specific brain systems. Tests included in Cognition have been validated, but not in high-performing samples comparable to astronauts, which is an essential step toward ensuring their usefulness in long-duration space missions. We administered Cognition (on laptop and iPad) and the WinSCAT, counterbalanced for order and version, in a sample of 96 subjects (50% women; ages 25-56 yr) with at least a Master's degree in science, technology, engineering, or mathematics (STEM). We assessed the associations of age, sex, and administration device with neurocognitive performance, and compared the scores on the Cognition battery with those of WinSCAT. Confirmatory factor analysis compared the structure of the iPad and laptop administration methods using Wald tests. Age was associated with longer response times (mean β = 0.12) and less accurate (mean β = -0.12) performance, women had longer response times on psychomotor (β = 0.62), emotion recognition (β = 0.30), and visuo-spatial (β = 0.48) tasks, men outperformed women on matrix reasoning (β = -0.34), and performance on an iPad was generally faster (mean β = -0.55). The WinSCAT appeared heavily loaded with tasks requiring executive control, whereas Cognition assessed a larger variety of neurocognitive domains. Overall results supported the interpretation of Cognition scores as measuring their intended constructs in high performing astronaut analog samples.Moore TM, Basner M, Nasrini J, Hermosillo E, Kabadi S, Roalf DR, McGuire S, Ecker AJ, Ruparel K, Port AM, Jackson CT, Dinges DF, Gur RC. Validation of the Cognition Test Battery for spaceflight in a sample of highly educated adults. Aerosp Med Hum Perform. 2017; 88(10):937-946.

  18. Site characterization and validation - Tracer migration experiment in the validation drift, report 2, part 1: performed experiments, results and evaluation

    International Nuclear Information System (INIS)

    Birgersson, L.; Widen, H.; Aagren, T.; Neretnieks, I.; Moreno, L.

    1992-01-01

    This report is the second of the two reports describing the tracer migration experiment where water and tracer flow have been monitored in a drift at the 385 m level in the Stripa experimental mine. The tracer migration experiment is one of a large number of experiments performed within the Site Characterization and Validation (SCV) project. The upper part of the 50 m long validation drift was covered with approximately 150 plastic sheets, in which the emerging water was collected. The water emerging into the lower part of the drift was collected in short boreholes (sumpholes). Six different tracer mixtures were injected at distances between 10 and 25 m from the drift. The flowrate and tracer monitoring continued for ten months. Tracer breakthrough curves and flowrate distributions were used to study flow paths, velocities, hydraulic conductivities, dispersivities, interaction with the rock matrix and channelling effects within the rock. The present report describes the structure of the observations, the flowrate measurements and estimated hydraulic conductivities. The main part of this report addresses the interpretation of the tracer movement in fractured rock. The tracer movement as measured by the more than 150 individual tracer curves has been analysed with the traditional advection-dispersion model and a subset of the curves with the advection-dispersion-diffusion model. The tracer experiments have permitted the flow porosity, dispersion and interaction with the rock matrix to be studied. (57 refs.)
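
    For orientation, the classical one-dimensional advection-dispersion solution for a continuous (step) tracer injection, of the general kind fitted to individual breakthrough curves in such analyses, is sketched below; the distance, velocity and dispersion values are invented, and the expression is the standard textbook leading term, not the report's calibrated model.

        import math

        def breakthrough(x, t, velocity, dispersion):
            """Relative tracer concentration C/C0 at distance x and time t for a step
            injection, using the leading term of the classical 1-D advection-dispersion
            solution: C/C0 = 0.5 * erfc((x - v*t) / (2 * sqrt(D*t)))."""
            if t <= 0.0:
                return 0.0
            return 0.5 * math.erfc((x - velocity * t) / (2.0 * math.sqrt(dispersion * t)))

        # Illustrative parameters: 20 m travel distance, 0.05 m/h water velocity,
        # 0.02 m^2/h longitudinal dispersion coefficient.
        for t in (100.0, 200.0, 400.0, 800.0, 1600.0):  # hours
            print(f"t = {t:6.0f} h  C/C0 = {breakthrough(20.0, t, 0.05, 0.02):.3f}")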

  19. High Performance Commercial Fenestration Framing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Mike Manteghi; Sneh Kumar; Joshua Early; Bhaskar Adusumalli

    2010-01-31

    A major objective of the U.S. Department of Energy is to have a zero energy commercial building by the year 2025. Windows have a major influence on the energy performance of the building envelope as they control over 55% of building energy load, and represent one important area where technologies can be developed to save energy. Aluminum framing systems are used in over 80% of commercial fenestration products (i.e. windows, curtain walls, store fronts, etc.). Aluminum framing systems are often required in commercial buildings because of their inherent good structural properties and long service life, which is required from commercial and architectural frames. At the same time, they are lightweight and durable, requiring very little maintenance, and offer design flexibility. An additional benefit of aluminum framing systems is their relatively low cost and easy manufacturability. Aluminum, being an easily recyclable material, also offers sustainable features. However, from an energy efficiency point of view, aluminum frames have lower thermal performance due to the very high thermal conductivity of aluminum. Fenestration systems constructed of aluminum alloys therefore have lower performance in terms of being an effective barrier to energy transfer (heat loss or gain). Despite the lower energy performance, aluminum is the material of choice for commercial framing systems and dominates the commercial/architectural fenestration market for the reasons mentioned above. In addition, there is no other cost-effective and energy-efficient replacement material available to take the place of aluminum in the commercial/architectural market. Hence it is imperative to improve the performance of aluminum framing systems to improve the energy performance of commercial fenestration systems and in turn reduce the energy consumption of commercial buildings and achieve zero energy buildings by 2025. The objective of this project was to develop high performance, energy efficient commercial

  20. Protective design of critical infrastructure with high performance concretes

    International Nuclear Information System (INIS)

    Riedel, W.; Nöldgen, M.; Stolz, A.; Roller, C.

    2012-01-01

    Conclusions: High performance concrete constructions will allow innovative design solutions for critical infrastructures. Validation of engineering methods can rely on large- and model-scale experiments conducted on conventional concrete structures. New, consistent impact experiments show extreme protection potential for UHPC. Modern FEM with concrete models and explicit rebar can model HPC and UHPC penetration resistance. SDOF and TDOF approaches are valuable design tools at the local and global levels. A combination of at least two of the three design methods (FEM, XDOF, EXP) allows reliable prediction and efficient, innovative designs.
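
    As an illustration of the SDOF design-tool idea mentioned above, the sketch below integrates an undamped elastic single-degree-of-freedom system under a short triangular impulse with the central-difference scheme; the mass, stiffness and load values are invented and the model omits damping and plasticity, so it is not the authors' validated tool.

        import numpy as np

        def sdof_response(mass, stiffness, load, dt):
            """Central-difference integration of m*u'' + k*u = F(t) for an undamped
            elastic single-degree-of-freedom system; returns the displacement history."""
            n = len(load)
            u = np.zeros(n)
            # Start-up step assuming zero initial displacement and velocity.
            u[1] = 0.5 * load[0] / mass * dt**2
            for i in range(1, n - 1):
                u[i + 1] = (dt**2 / mass) * (load[i] - stiffness * u[i]) + 2 * u[i] - u[i - 1]
            return u

        # Illustrative panel idealization: 500 kg effective mass, 2e6 N/m effective stiffness,
        # triangular impulse peaking at 1e5 N and decaying to zero over 20 ms.
        dt, duration = 1e-4, 0.1
        t = np.arange(0.0, duration, dt)
        load = np.where(t < 0.02, 1e5 * (1.0 - t / 0.02), 0.0)
        u = sdof_response(500.0, 2e6, load, dt)
        print(f"peak displacement = {u.max() * 1000:.1f} mm")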

  1. Fracture toughness of ultra high performance concrete by flexural performance

    Directory of Open Access Journals (Sweden)

    Manolova Emanuela

    2016-01-01

    Full Text Available This paper describes the fracture toughness of the innovative structural material Ultra High Performance Concrete (UHPC), evaluated by flexural performance. To determine the material behaviour under static loading, adapted standard test methods for the flexural performance of fiber-reinforced concrete (ASTM C 1609 and ASTM C 1018) are used. Fracture toughness is estimated by various deformation parameters derived from the load-deflection curve, obtained by testing a simply supported beam under third-point loading, using a servo-controlled testing system. This method is used to estimate the contribution of the embedded fiber reinforcement to the improvement of the fracture behaviour of UHPC, through changes in crack-resistance capacity, fracture toughness and energy absorption capacity by various mechanisms. The position of the first crack has been determined based on the P-δ (load-deflection) response and the P-ε (load-longitudinal deformation in the tensile zone) response, which are used for calculation of the two toughness indices I5 and I10. The combination of steel fibres with different dimensions leads to a composite having, at the same time, increased crack resistance, first crack formation, ductility and post-peak residual strength.
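
    The toughness indices I5 and I10 mentioned above are, in the ASTM C 1018 sense, ratios of areas under the load-deflection curve up to multiples of the first-crack deflection (3 times for I5, 5.5 times for I10); the sketch below evaluates them on an invented curve, not on the paper's test data.

        import numpy as np

        def toughness_index(deflection, load, first_crack_deflection, multiple):
            """ASTM C 1018-style toughness index: area under the load-deflection curve up
            to `multiple` times the first-crack deflection, divided by the area up to the
            first-crack deflection itself."""
            d = np.asarray(deflection)
            p = np.asarray(load)

            def area_up_to(limit):
                mask = d <= limit
                return np.trapz(p[mask], d[mask])

            return area_up_to(multiple * first_crack_deflection) / area_up_to(first_crack_deflection)

        # Hypothetical load-deflection curve (deflection in mm, load in kN).
        deflection = np.linspace(0.0, 3.0, 301)
        first_crack = 0.5
        load = np.where(deflection <= first_crack,
                        40.0 * deflection / first_crack,             # linear up to first crack
                        40.0 * np.exp(-(deflection - first_crack)))  # softening afterwards

        print(f"I5  = {toughness_index(deflection, load, first_crack, 3.0):.2f}")
        print(f"I10 = {toughness_index(deflection, load, first_crack, 5.5):.2f}")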

  2. The Validity of Conscientiousness Is Overestimated in the Prediction of Job Performance.

    Science.gov (United States)

    Kepes, Sven; McDaniel, Michael A

    2015-01-01

    Sensitivity analyses refer to investigations of the degree to which the results of a meta-analysis remain stable when conditions of the data or the analysis change. To the extent that results remain stable, one can refer to them as robust. Sensitivity analyses are rarely conducted in the organizational science literature. Despite conscientiousness being a valued predictor in employment selection, sensitivity analyses have not been conducted with respect to meta-analytic estimates of the correlation (i.e., validity) between conscientiousness and job performance. To address this deficiency, we reanalyzed the largest collection of conscientiousness validity data in the personnel selection literature and conducted a variety of sensitivity analyses. Publication bias analyses demonstrated that the validity of conscientiousness is moderately overestimated (by around 30%; a correlation difference of about .06). The misestimation of the validity appears to be due primarily to suppression of small effect sizes in the journal literature. These inflated validity estimates result in an overestimate of the dollar utility of personnel selection by millions of dollars and should be of considerable concern for organizations. The fields of management and applied psychology seldom conduct sensitivity analyses. Through the use of sensitivity analyses, this paper documents that the existing literature overestimates the validity of conscientiousness in the prediction of job performance. Our data show that effect sizes from journal articles are largely responsible for this overestimation.
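    A minimal numerical sketch in the spirit of the sensitivity analyses described above: comparing a sample-size-weighted mean validity computed from all studies with the value obtained when one source of studies (here, journal articles) is set aside. The study records are invented, and real publication-bias analyses are considerably more elaborate than this simple contrast.

```python
# Invented study records: (observed validity r, sample size N, source). A crude
# sensitivity check: compare the sample-size-weighted mean validity from all studies
# with the value when journal-published studies are set aside.
import numpy as np

studies = [
    (0.28, 120, "journal"),
    (0.24, 210, "journal"),
    (0.31,  95, "journal"),
    (0.18, 400, "dissertation"),
    (0.15, 350, "technical report"),
    (0.21, 180, "dissertation"),
]

def weighted_mean_r(records):
    r = np.array([rec[0] for rec in records])
    n = np.array([rec[1] for rec in records])
    return float(np.sum(n * r) / np.sum(n))

overall = weighted_mean_r(studies)
without_journals = weighted_mean_r([s for s in studies if s[2] != "journal"])
print(f"All sources:       mean r = {overall:.3f}")
print(f"Journals excluded: mean r = {without_journals:.3f}")
print(f"Difference:        {overall - without_journals:+.3f}")
```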

  3. CryoSat-2: Post launch performance of SIRAL-2 and its calibration/validation

    Science.gov (United States)

    Cullen, Robert; Francis, Richard; Davidson, Malcolm; Wingham, Duncan

    2010-05-01

    1. INTRODUCTION The main payload of CryoSat-2 [1], SIRAL (Synthetic interferometric radar altimeter), is a Ku-band pulse-width limited radar altimeter which transmits pulses at a high pulse repetition frequency, thus making received echoes phase coherent and suitable for azimuth processing [2]. The azimuth processing, in conjunction with correction for slant range, improves along-track resolution to about 250 meters, which is a significant improvement over traditional pulse-width limited systems such as Envisat RA-2 [3]. CryoSat-2 will be launched on 25th February 2010 and this paper describes the pre- and post-launch measures of CryoSat/SIRAL performance and the status of mission validation planning. 2. SIRAL PERFORMANCE: INTERNAL AND EXTERNAL CALIBRATION Phase coherent pulse-width limited radar altimeters such as SIRAL-2 pose a new challenge when considering a strategy for calibration. Along with the need to generate the well understood corrections for transfer function amplitude with respect to frequency, gain and instrument path delay, there is also a need to provide corrections for transfer function phase with respect to frequency and AGC setting, and for phase variation across bursts of pulses. Furthermore, since some components of these radars are temperature sensitive, one needs to be careful when deciding how often calibrations are performed whilst not impacting mission performance. Several internal calibration ground processors have been developed to model imperfections within the CryoSat-2 radar altimeter (SIRAL-2) hardware and remove their effect from the science data stream via the use of calibration correction auxiliary products within the ground segment. We present the methods and results used to model and remove imperfections and describe the baseline for usage of SIRAL-2 calibration modes during the commissioning phase and the operational exploitation phases of the mission. Additionally we present early results derived from external calibration of SIRAL via

  4. Validation of CryoSat-2 Performance over Arctic Sea Ice

    DEFF Research Database (Denmark)

    Di Bella, Alessandro; Skourup, Henriette; Bouffard, J.

    The main objective of this work is to validate CryoSat-2 (CS2) SARIn performance over sea ice by use of airborne laser altimetry data obtained during the CryoVEx 2012 campaign. A study by [1] has shown that the extra information from the CS2 SARIn mode increases the number of valid sea surface...... to validate the sea ice freeboard obtained by processing CS2 SARIn level 1b waveforms. The possible reduction in the random freeboard uncertainty is investigated comparing two scenarios, i.e. a SAR-like and a SARIn acquisition. It is observed that using the extra phase information, CS2 is able to detect leads...... height estimates which are usually discarded in the SAR mode due to snagging of the radar signal. As the number of valid detected leads increases, the uncertainty of the freeboard heights decreases. In this study, the snow freeboard heights estimated using data from the airborne laser scanner are used...

  5. Empirical Performance of Cross-Validation With Oracle Methods in a Genomics Context.

    Science.gov (United States)

    Martinez, Josue G; Carroll, Raymond J; Müller, Samuel; Sampson, Joshua N; Chatterjee, Nilanjan

    2011-11-01

    When employing model selection methods with oracle properties such as the smoothly clipped absolute deviation (SCAD) and the Adaptive Lasso, it is typical to estimate the smoothing parameter by m-fold cross-validation, for example, m = 10. In problems where the true regression function is sparse and the signals large, such cross-validation typically works well. However, in regression modeling of genomic studies involving Single Nucleotide Polymorphisms (SNP), the true regression functions, while thought to be sparse, do not have large signals. We demonstrate empirically that in such problems, the number of selected variables using SCAD and the Adaptive Lasso, with 10-fold cross-validation, is a random variable that has considerable and surprising variation. Similar remarks apply to non-oracle methods such as the Lasso. Our study strongly questions the suitability of performing only a single run of m-fold cross-validation with any oracle method, and not just the SCAD and Adaptive Lasso.
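    A small sketch reproducing the kind of variability the abstract reports: repeating 10-fold cross-validated model selection with different fold assignments on weak-signal data and counting how many variables are selected each time. SCAD and the Adaptive Lasso are not available in scikit-learn, so the ordinary Lasso stands in, and the data are simulated.

```python
# Sketch: variability of the number of variables selected by 10-fold cross-validated
# Lasso on weak-signal data, echoing the abstract's observation. SCAD and the
# Adaptive Lasso are not in scikit-learn, so the ordinary Lasso stands in here.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
n, p, n_true = 200, 100, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:n_true] = 0.15                      # weak signals
y = X @ beta + rng.standard_normal(n)

selected_counts = []
for seed in range(20):                    # repeat CV with different fold assignments
    cv = KFold(n_splits=10, shuffle=True, random_state=seed)
    model = LassoCV(cv=cv).fit(X, y)
    selected_counts.append(int(np.sum(model.coef_ != 0)))

print("Selected-variable counts over 20 CV runs:", selected_counts)
print("Range:", min(selected_counts), "to", max(selected_counts))
```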

  6. Validity of High School Physic Module With Character Values Using Process Skill Approach In STKIP PGRI West Sumatera

    Science.gov (United States)

    Anaperta, M.; Helendra, H.; Zulva, R.

    2018-04-01

    This study aims to describe the validity of a physics module with character-oriented values using a process skills approach for dynamic electricity material in high school physics (SMA/MA) and vocational school (SMK). The research is development research. Module development follows the model proposed by Plomp, which consists of (1) a preliminary research phase, (2) a prototyping phase, and (3) an assessment phase. This paper covers the initial investigation and design phases. Data on validity were collected through observation and questionnaires. In the initial investigation phase, curriculum analysis, student analysis, and concept analysis were conducted. In the design phase, the module was designed and realized for SMA/MA and SMK subjects on dynamic electricity material. This was followed by formative evaluation, which included self-evaluation and prototyping (expert reviews, one-to-one, and small group evaluation); validation was performed at this stage. The research data were obtained through the module validation sheet, and the process resulted in a valid module.

  7. HIGH PERFORMANCE CERIA BASED OXYGEN MEMBRANE

    DEFF Research Database (Denmark)

    2014-01-01

    The invention describes a new class of highly stable mixed conducting materials based on acceptor doped cerium oxide (CeO2-δ) in which the limiting electronic conductivity is significantly enhanced by co-doping with a second element or co-dopant, such as Nb, W and Zn, so that cerium and the co-dopant have an ionic size ratio between 0.5 and 1. These materials can thereby improve the performance and extend the range of operating conditions of oxygen permeation membranes (OPM) for different high temperature membrane reactor applications. The invention also relates to the manufacturing of supported...

  8. Playa: High-Performance Programmable Linear Algebra

    Directory of Open Access Journals (Sweden)

    Victoria E. Howle

    2012-01-01

    This paper introduces Playa, a high-level user interface layer for composing algorithms for complex multiphysics problems out of objects from other Trilinos packages. Among other features, Playa provides very high-performance overloaded operators implemented through an expression template mechanism. In this paper, we give an overview of the central Playa objects from a user's perspective, show application to a sequence of increasingly complex solver algorithms, provide timing results for Playa's overloaded operators and other functions, and briefly survey some of the implementation issues involved.
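    Playa's overloaded operators are implemented in C++ through expression templates; the following Python sketch only illustrates the general idea of operators that build an expression tree and evaluate it lazily in a single pass. It is a conceptual analogue, not Playa's API.

```python
# Conceptual Python analogue (not Playa's C++ API): overloaded operators that build
# an expression tree and evaluate it lazily in one pass, the idea behind expression
# templates. Only scalar * vector and vector + vector are supported in this sketch.
import numpy as np

class Expr:
    def __add__(self, other):
        return Add(self, other)
    def __rmul__(self, scalar):
        return Scale(scalar, self)

class Vec(Expr):
    def __init__(self, data):
        self.data = np.asarray(data, dtype=float)
    def evaluate(self):
        return self.data

class Add(Expr):
    def __init__(self, a, b):
        self.a, self.b = a, b
    def evaluate(self):
        return self.a.evaluate() + self.b.evaluate()

class Scale(Expr):
    def __init__(self, s, a):
        self.s, self.a = s, a
    def evaluate(self):
        return self.s * self.a.evaluate()

x = Vec([1.0, 2.0, 3.0])
y = Vec([4.0, 5.0, 6.0])
expr = 2.0 * x + y          # no arithmetic yet: an expression tree is built
print(expr.evaluate())      # single evaluation pass -> [ 6.  9. 12.]
```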

  9. Optimizing the design of very high power, high performance converters

    International Nuclear Information System (INIS)

    Edwards, R.J.; Tiagha, E.A.; Ganetis, G.; Nawrocky, R.J.

    1980-01-01

    This paper describes how various technologies are used to achieve the desired performance in a high current magnet power converter system. It is hoped that the discussion of the design approaches taken will be applicable to other power supply systems where stringent requirements in stability, accuracy and reliability must be met.

  10. Predictive validity of the comprehensive basic science examination mean score for assessment of medical students' performance

    Directory of Open Access Journals (Sweden)

    Firouz Behboudi

    2002-04-01

    Background: Medical education curriculum improvements can be achieved by evaluating students' performance. Medical students in Iran have to pass two undergraduate comprehensive examinations, basic science and pre-internship. Purpose: To measure the validity of students' mean scores in the comprehensive basic science exam (CBSE) for predicting their performance in later curriculum phases. Methods: This descriptive cross-sectional study was conducted on 95 (38 women and 55 men) Guilan medical university students. Their admission to the university was 81% by regional quota and 12% by shaheed and other organizations' share. They first enrolled in 1994 and passed the CBSE on the first try. Data on gender, regional quota, and average grades of the CBSE, PC, and CPIE were collected by a questionnaire. The calculations were done with the SPSS package. Results: The correlation coefficient between CBSE and CPIE mean scores (0.65) was higher than the correlation coefficient between CBSE and PC mean scores (0.49). The predictive validity of the CBSE average grade was significant for students' performance in the CPIE; however, the predictive validity of CBSE mean scores for students' performance in the PC was lower. Conclusion: The students' mean score in the CBSE can be a good indicator for their further progress. We recommend further research to assess the predictive validity for each one of the basic courses. Keywords: predictive validity, comprehensive basic exam
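    A minimal sketch of the correlation computation behind predictive-validity figures like those quoted above (CBSE vs. CPIE and CBSE vs. PC mean scores), using Pearson correlations on invented score vectors tuned to roughly match the reported coefficients.

```python
# Sketch of the correlation computation behind the reported predictive-validity
# figures. The score vectors are invented and tuned to roughly reproduce the
# reported coefficients; they are not the study's data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_students = 95
cbse = rng.normal(130.0, 15.0, n_students)        # comprehensive basic science exam scores

def correlated_with(x, target_r, mean, sd):
    """Generate a score vector with approximately the target correlation to x."""
    z = (x - x.mean()) / x.std()
    noise = rng.standard_normal(x.size)
    return mean + sd * (target_r * z + np.sqrt(1.0 - target_r**2) * noise)

cpie = correlated_with(cbse, 0.65, mean=125.0, sd=12.0)   # pre-internship exam scores
pc   = correlated_with(cbse, 0.49, mean=14.5, sd=2.0)     # PC mean scores (acronym not defined in the abstract)

for name, later in [("CPIE", cpie), ("PC", pc)]:
    r, p = pearsonr(cbse, later)
    print(f"CBSE vs {name}: r = {r:.2f} (p = {p:.3g})")
```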

  11. Victoria Symptom Validity Test performance in children and adolescents with neurological disorders.

    Science.gov (United States)

    Brooks, Brian L

    2012-12-01

    It is becoming increasingly important to study, use, and promote the utility of measures designed to detect non-compliance with testing (i.e., poor effort, symptom non-validity, response bias) as part of neuropsychological assessments with children and adolescents. Several measures have evidence for use in pediatrics, but there is a paucity of published support for the Victoria Symptom Validity Test (VSVT) in this population. The purpose of this study was to examine performance on the VSVT in a sample of pediatric patients with known neurological disorders. The sample consisted of 100 consecutively referred children and adolescents between the ages of 6 and 19 years (mean = 14.0, SD = 3.1) with various neurological diagnoses. On the VSVT total items, 95% of the sample had performance in the "valid" range, with 5% being deemed "questionable" and 0% deemed "invalid". On easy items, 97% were "valid", 2% were "questionable", and 1% were "invalid". For difficult items, 84% were "valid", 16% were "questionable", and 0% were "invalid". For those patients given two effort measures (i.e., VSVT and Test of Memory Malingering; n = 65), none was identified as having poor test-taking compliance on both measures. VSVT scores were significantly correlated with age, intelligence, processing speed, and functional ratings of daily abilities (attention, executive functioning, and adaptive functioning), but not with objective performance on measures of sustained attention, verbal memory, or visual memory. The VSVT has potential to be used in neuropsychological assessments with pediatric patients.

  12. Robust High Performance Aquaporin based Biomimetic Membranes

    DEFF Research Database (Denmark)

    Helix Nielsen, Claus; Zhao, Yichun; Qiu, C.

    2013-01-01

    on top of a support membrane. Control membranes, either without aquaporins or with the inactive AqpZ R189A mutant aquaporin served as controls. The separation performance of the membranes was evaluated by cross-flow forward osmosis (FO) and reverse osmosis (RO) tests. In RO the ABM achieved a water......Aquaporins are water channel proteins with high water permeability and solute rejection, which makes them promising for preparing high-performance biomimetic membranes. Despite the growing interest in aquaporin-based biomimetic membranes (ABMs), it is challenging to produce robust and defect...... permeability of ~ 4 L/(m2 h bar) with a NaCl rejection > 97% at an applied hydraulic pressure of 5 bar. The water permeability was ~40% higher compared to a commercial brackish water RO membrane (BW30) and an order of magnitude higher compared to a seawater RO membrane (SW30HR). In FO, the ABMs had > 90...

  13. Evaluation of high-performance computing software

    Energy Technology Data Exchange (ETDEWEB)

    Browne, S.; Dongarra, J. [Univ. of Tennessee, Knoxville, TN (United States); Rowan, T. [Oak Ridge National Lab., TN (United States)

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations difficult to find elsewhere.

  14. High performance cloud auditing and applications

    CERN Document Server

    Choi, Baek-Young; Song, Sejun

    2014-01-01

    This book mainly focuses on cloud security and high performance computing for cloud auditing. The book discusses emerging challenges and techniques developed for high performance semantic cloud auditing, and presents the state of the art in cloud auditing, computing and security techniques with focus on technical aspects and feasibility of auditing issues in federated cloud computing environments.   In summer 2011, the United States Air Force Research Laboratory (AFRL) CyberBAT Cloud Security and Auditing Team initiated the exploration of the cloud security challenges and future cloud auditing research directions that are covered in this book. This work was supported by the United States government funds from the Air Force Office of Scientific Research (AFOSR), the AFOSR Summer Faculty Fellowship Program (SFFP), the Air Force Research Laboratory (AFRL) Visiting Faculty Research Program (VFRP), the National Science Foundation (NSF) and the National Institute of Health (NIH). All chapters were partially suppor...

  15. Monitoring SLAC High Performance UNIX Computing Systems

    International Nuclear Information System (INIS)

    Lettsome, Annette K.

    2005-01-01

    Knowledge of the effectiveness and efficiency of computers is important when working with high performance systems. Monitoring such systems is advantageous in order to foresee possible problems or system failures. Ganglia is a software system designed for high performance computing systems to retrieve specific monitoring information. An alternative storage facility for Ganglia's collected data is needed since its default storage system, the round-robin database (RRD), struggles with data integrity. The creation of a script-driven MySQL database solves this dilemma. This paper describes the process followed in creating and implementing the MySQL database for use by Ganglia. Comparisons between data storage by both databases are made using gnuplot and Ganglia's real-time graphical user interface.
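    A small sketch of the general idea described above: a script that stores time-stamped monitoring samples in a relational database. Python's built-in sqlite3 stands in for the MySQL backend used in the actual work, and the table layout and metric names are hypothetical.

```python
# Sketch of a script-driven metrics store. sqlite3 (standard library) stands in for
# the MySQL backend described in the abstract; table layout and metric names are
# hypothetical.
import sqlite3
import time

conn = sqlite3.connect("ganglia_metrics.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS metrics (
        host      TEXT    NOT NULL,
        metric    TEXT    NOT NULL,
        value     REAL    NOT NULL,
        timestamp INTEGER NOT NULL
    )
""")

def store_metric(host, metric, value, ts=None):
    """Insert one sample; in the real system these would come from the Ganglia collector."""
    conn.execute(
        "INSERT INTO metrics (host, metric, value, timestamp) VALUES (?, ?, ?, ?)",
        (host, metric, value, ts if ts is not None else int(time.time())),
    )
    conn.commit()

store_metric("node001", "cpu_user", 42.5)
store_metric("node001", "load_one", 3.1)

for row in conn.execute("SELECT host, metric, value, timestamp FROM metrics"):
    print(row)
conn.close()
```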

  16. High performance parallel computers for science

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Cook, A.; Deppe, J.; Edel, M.; Fischler, M.; Gaines, I.; Hance, R.

    1989-01-01

    This paper reports that Fermilab's Advanced Computer Program (ACP) has been developing cost effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor, has been developed for theoretical physics. Each $4000 node is a FORTRAN or C programmable pipelined 20 Mflops (peak), 10 MByte single board computer. These are plugged into a 16 port crossbar switch crate which handles both inter and intra crate communication. The crates are connected in a hypercube. Site oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256 node, 5 GFlop, system is under construction

  17. Toward a theory of high performance.

    Science.gov (United States)

    Kirby, Julia

    2005-01-01

    What does it mean to be a high-performance company? The process of measuring relative performance across industries and eras, declaring top performers, and finding the common drivers of their success is such a difficult one that it might seem a fool's errand to attempt. In fact, no one did for the first thousand or so years of business history. The question didn't even occur to many scholars until Tom Peters and Bob Waterman released In Search of Excellence in 1982. Twenty-three years later, we've witnessed several more attempts--and, just maybe, we're getting closer to answers. In this reported piece, HBR senior editor Julia Kirby explores why it's so difficult to study high performance and how various research efforts--including those from John Kotter and Jim Heskett; Jim Collins and Jerry Porras; Bill Joyce, Nitin Nohria, and Bruce Roberson; and several others outlined in a summary chart--have attacked the problem. The challenge starts with deciding which companies to study closely. Are the stars the ones with the highest market caps, the ones with the greatest sales growth, or simply the ones that remain standing at the end of the game? (And when's the end of the game?) Each major study differs in how it defines success, which companies it therefore declares to be worthy of emulation, and the patterns of activity and attitude it finds in common among them. Yet, Kirby concludes, as each study's method incrementally solves problems others have faced, we are progressing toward a consensus theory of high performance.

  18. CFD modelling of hydrogen stratification in enclosures: Model validation and application to PAR performance

    Energy Technology Data Exchange (ETDEWEB)

    Hoyes, J.R., E-mail: james.hoyes@hsl.gsi.gov.uk; Ivings, M.J.

    2016-12-15

    Highlights: • The ability of CFD to predict hydrogen stratification phenomena is investigated. • Contrary to expectation, simulations on tetrahedral meshes under-predict mixing. • Simulations on structured meshes give good agreement with experimental data. • CFD model used to investigate the effects of stratification on PAR performance. • Results show stratification can have a significant effect on PAR performance. - Abstract: Computational Fluid Dynamics (CFD) models are maturing into useful tools for supporting safety analyses. This paper investigates the capabilities of CFD models for predicting hydrogen stratification in a containment vessel using data from the NEA/OECD SETH2 MISTRA experiments. Further simulations are then carried out to illustrate the qualitative effects of hydrogen stratification on the performance of Passive Autocatalytic Recombiner (PAR) units. The MISTRA experiments have well-defined initial and boundary conditions which makes them well suited for use in a validation study. Results are presented for the sensitivity to mesh resolution and mesh type. Whilst the predictions are shown to be largely insensitive to the mesh resolution they are surprisingly sensitive to the mesh type. In particular, tetrahedral meshes are found to induce small unphysical convection currents that result in molecular diffusion and turbulent mixing being under-predicted. This behaviour is not unique to the CFD model used here (ANSYS CFX) and furthermore, it may affect simulations run on other non-aligned meshes (meshes that are not aligned perpendicular to gravity), including non-aligned structured meshes. Following existing best practice guidelines can help to identify potential unphysical predictions, but as an additional precaution consideration should be given to using gravity-aligned meshes for modelling stratified flows. CFD simulations of hydrogen recombination in the Becker Technologies THAI facility are presented with high and low PAR positions

  19. High-performance phase-field modeling

    KAUST Repository

    Vignal, Philippe; Sarmiento, Adel; Cortes, Adriano Mauricio; Dalcin, L.; Collier, N.; Calo, Victor M.

    2015-01-01

    and phase-field crystal equation will be presented, which corroborate the theoretical findings, and illustrate the robustness of the method. Results related to more challenging examples, namely the Navier-Stokes Cahn-Hilliard and a diffusion-reaction Cahn-Hilliard system, will also be presented. The implementation was done in PetIGA and PetIGA-MF, high-performance Isogeometric Analysis frameworks [1, 3], designed to handle non-linear, time-dependent problems.

  20. AHPCRC - Army High Performance Computing Research Center

    Science.gov (United States)

    2010-01-01

    Of particular interest is the ability of a distributed jamming network (DJN) to jam signals in all or part of a sensor or communications net... (Army High Performance Computing Research Center, www.ahpcrc.org)

  1. Performance concerns for high duty fuel cycle

    International Nuclear Information System (INIS)

    Esposito, V.J.; Gutierrez, J.E.

    1999-01-01

    One of the goals of the nuclear industry is to achieve economic performance such that nuclear power plants are competitive in a de-regulated market. The manner in which nuclear fuel is designed and operated lies at the heart of economic viability. In this sense, reliability, operating flexibility and low costs are the three major requirements of the NPP today, and translating these three requirements into the design is part of our work. The challenge today is to produce a fuel design which will operate with long operating cycles, high discharge burnup and power up-rating while still maintaining all design and safety margins. European Fuel Group (EFG) understands that achieving the required performance calls for high duty/energy fuel designs. The concerns for high duty design include, among other items, core design methods, advanced safety analysis methodologies, performance models, advanced materials and operational strategies. The operational aspects require the trade-off and evaluation of various parameters including coolant chemistry control, material corrosion, boiling duty, boron level impacts, etc. In this environment MAEF is the design that EFG is now offering, based on ZIRLO alloy and a robust skeleton. This new design is able to achieve 70 GWd/tU, and Lead Test Programs are being executed to demonstrate this capability. A number of performance issues which have been a concern with current designs, such as cladding corrosion and incomplete RCCA insertion (IRI), have been resolved. As the core duty becomes more aggressive, other new issues such as Axial Offset Anomaly need to be addressed. These new issues are being addressed by combining the new design with advanced methodologies to meet the demanding needs of NPPs. This paper discusses the ability and strategy to meet high duty core requirements and flexibility of operation while maintaining an acceptable balance of all technical issues. (authors)

  2. DURIP: High Performance Computing in Biomathematics Applications

    Science.gov (United States)

    2017-05-10

    The goal of this award was to enhance the capabilities of the Department of Applied Mathematics and Statistics (AMS) at the University of California, Santa Cruz (UCSC) to conduct research and research-related education in areas of high performance computing in biomathematics applications.

  3. High Performance Computing Operations Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Cupps, Kimberly C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-19

    The High Performance Computing Operations Review (HPCOR) meeting—requested by the ASC and ASCR program headquarters at DOE—was held November 5 and 6, 2013, at the Marriott Hotel in San Francisco, CA. The purpose of the review was to discuss the processes and practices for HPC integration and its related software and facilities. Experiences and lessons learned from the most recent systems deployed were covered in order to benefit the deployment of new systems.

  4. Planning for high performance project teams

    International Nuclear Information System (INIS)

    Reed, W.; Keeney, J.; Westney, R.

    1997-01-01

    Both industry-wide research and corporate benchmarking studies confirm the significant savings in cost and time that result from early planning of a project. Amoco's Team Planning Workshop combines long-term strategic project planning and short-term tactical planning with team building to provide the basis for high performing project teams, better project planning, and effective implementation of the Amoco Common Process for managing projects

  5. Video performance for high security applications

    International Nuclear Information System (INIS)

    Connell, Jack C.; Norman, Bradley C.

    2010-01-01

    The complexity of physical protection systems has increased to address modern threats to national security and emerging commercial technologies. A key element of modern physical protection systems is the data presented to the human operator used for rapid determination of the cause of an alarm, whether false (e.g., caused by an animal, debris, etc.) or real (e.g., a human adversary). Alarm assessment, the human validation of a sensor alarm, primarily relies on imaging technologies and video systems. Developing measures of effectiveness (MOE) that drive the design or evaluation of a video system or technology becomes a challenge, given the subjectivity of the application (e.g., alarm assessment). Sandia National Laboratories has conducted empirical analysis using field test data and mathematical models such as binomial distribution and Johnson target transfer functions to develop MOEs for video system technologies. Depending on the technology, the task of the security operator and the distance to the target, the Probability of Assessment (PAs) can be determined as a function of a variety of conditions or assumptions. PAs used as an MOE allows the systems engineer to conduct trade studies, make informed design decisions, or evaluate new higher-risk technologies. This paper outlines general video system design trade-offs, discusses ways video can be used to increase system performance and lists MOEs for video systems used in subjective applications such as alarm assessment.

  6. Development and validation of a music performance anxiety inventory for gifted adolescent musicians.

    Science.gov (United States)

    Osborne, Margaret S; Kenny, Dianna T

    2005-01-01

    Music performance anxiety (MPA) is a distressing experience for musicians of all ages, yet the empirical investigation of MPA in adolescents has received little attention to date. No measures specifically targeting MPA in adolescents have been empirically validated. This article presents findings of an initial study into the psychometric properties and validation of the Music Performance Anxiety Inventory for Adolescents (MPAI-A), a new self-report measure of MPA for this group. Data from 381 elite young musicians aged 12-19 years was used to investigate the factor structure, internal reliability, construct and divergent validity of the MPAI-A. Cronbach's alpha for the full measure was .91. Factor analysis identified three factors, which together accounted for 53% of the variance. Construct validity was demonstrated by significant positive relationships with social phobia (measured using the Social Phobia Anxiety Inventory [Beidel, D. C., Turner, S. M., & Morris, T. L. (1995). A new inventory to assess childhood social anxiety and phobia: The Social Phobia and Anxiety Inventory for Children. Psychological Assessment, 7(1), 73-79; Beidel, D. C., Turner, S. M., & Morris, T. L. (1998). Social Phobia and Anxiety Inventory for Children (SPAI-C). North Tonawanda, NY: Multi-Health Systems Inc.]) and trait anxiety (measured using the State Trait Anxiety Inventory [Spielberger, C. D. (1983). State-Trait Anxiety Inventory STAI (Form Y). Palo Alto, CA: Consulting Psychologists Press, Inc.]). The MPAI-A demonstrated convergent validity by a moderate to strong positive correlation with an adult measure of MPA. Discriminant validity was established by a weaker positive relationship with depression, and no relationship with externalizing behavior problems. It is hoped that the MPAI-A, as the first empirically validated measure of adolescent musicians' performance anxiety, will enhance and promote phenomenological and treatment research in this area.

  7. Computational Biology and High Performance Computing 2000

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  8. High performance APCS conceptual design and evaluation scoping study

    International Nuclear Information System (INIS)

    Soelberg, N.; Liekhus, K.; Chambers, A.; Anderson, G.

    1998-02-01

    This Air Pollution Control System (APCS) Conceptual Design and Evaluation study was conducted to evaluate a high-performance air pollution control (APC) system for minimizing air emissions from mixed waste thermal treatment systems. Seven variations of high-performance APCS designs were conceptualized using several design objectives. One of the system designs was selected for detailed process simulation using ASPEN PLUS to determine material and energy balances and evaluate performance. Installed system capital costs were also estimated. Sensitivity studies were conducted to evaluate the incremental cost and benefit of added carbon adsorber beds for mercury control, selective catalytic reduction for NOx control, and offgas retention tanks for holding the offgas until sample analysis is conducted to verify that the offgas meets emission limits. Results show that the high-performance dry-wet APCS can easily meet all expected emission limits except possibly for mercury. The capability to achieve high levels of mercury control (potentially necessary for thermally treating some DOE mixed streams) could not be validated using current performance data for mercury control technologies. The engineering approach and ASPEN PLUS modeling tool developed and used in this study identified APC equipment and system performance, size, cost, and other issues that are not yet resolved. These issues need to be addressed in feasibility studies and conceptual designs for new facilities or for determining how to modify existing facilities to meet expected emission limits. The ASPEN PLUS process simulation with current and refined input assumptions and calculations can be used to provide system performance information for decision-making, identifying best options, estimating costs, reducing the potential for emission violations, providing information needed for waste flow analysis, incorporating new APCS technologies in existing designs, or performing facility design and permitting activities.

  9. High performance separation of lanthanides and actinides

    International Nuclear Information System (INIS)

    Sivaraman, N.; Vasudeva Rao, P.R.

    2011-01-01

    The major advantage of High Performance Liquid Chromatography (HPLC) is its ability to provide rapid, high-performance separations. It is evident from Van Deemter plots for different particle sizes that packing materials with particle sizes below 2 μm provide better resolution for high-speed separations and for resolving complex mixtures than 5 μm based supports. In the recent past, chromatographic support materials based on monoliths have been studied extensively at our laboratory. A monolith column consists of a single piece of porous, rigid material containing mesopores and micropores, which provide fast analyte mass transfer. A monolith support provides significantly higher separation efficiency than particle-packed columns. A clear advantage of the monolith is that it can be operated at higher flow rates with lower back pressure: its higher column permeability allows higher operating flow rates, which drastically reduces analysis time while maintaining high separation efficiency. The fast separation methods developed above were applied to assay the lanthanides and actinides in the dissolver solutions of nuclear reactor fuels.
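    The case for sub-2 μm and monolithic supports rests on the van Deemter relation H(u) = A + B/u + C·u. The short sketch below evaluates it for two particle sizes; the coefficients are illustrative only, with the A term taken roughly proportional to particle diameter and the C term to its square.

```python
# Illustrative van Deemter curves H(u) = A + B/u + C*u for two particle sizes.
# The coefficients are made up for illustration: A is taken roughly proportional to
# the particle diameter and C to its square, as in the usual rate-theory picture.
import numpy as np

def plate_height(u, dp):
    """H and dp in micrometres, u (mobile-phase linear velocity) in mm/s."""
    A = 2.0 * dp            # eddy-diffusion term
    B = 6.0                 # longitudinal-diffusion term (illustrative constant)
    C = 0.05 * dp**2        # resistance-to-mass-transfer term
    return A + B / u + C * u

u = np.linspace(0.5, 8.0, 151)
for dp in (5.0, 1.8):
    H = plate_height(u, dp)
    print(f"dp = {dp} um: minimum H ~ {H.min():.1f} um near u ~ {u[np.argmin(H)]:.1f} mm/s")
```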

  10. High Performance OLED Panel and Luminaire

    Energy Technology Data Exchange (ETDEWEB)

    Spindler, Jeffrey [OLEDWorks LLC, Rochester, NY (United States)

    2017-02-20

    In this project, OLEDWorks developed and demonstrated the technology required to produce OLED lighting panels with high energy efficiency and excellent light quality. OLED panels developed in this program produce high quality warm white light with CRI greater than 85 and efficacy up to 80 lumens per watt (LPW). An OLED luminaire employing 24 of the high performance panels produces practical levels of illumination for general lighting, with a flux of over 2200 lumens at 60 LPW. This is a significant advance in the state of the art for OLED solid-state lighting (SSL), which is expected to be a complementary light source to the more advanced LED SSL technology that is rapidly replacing all other traditional forms of lighting.

  11. The path toward HEP High Performance Computing

    International Nuclear Information System (INIS)

    Apostolakis, John; Brun, René; Gheata, Andrei; Wenzel, Sandro; Carminati, Federico

    2014-01-01

    High Energy Physics code has been known for making poor use of high performance computing architectures. Efforts in optimising HEP code on vector and RISC architectures have yielded limited results, and recent studies have shown that, on modern architectures, it achieves a performance between 10% and 50% of the peak. Although several successful attempts have been made to port selected codes to GPUs, no major HEP code suite has a 'High Performance' implementation. With the LHC undergoing a major upgrade and a number of challenging experiments on the drawing board, HEP can no longer neglect the less-than-optimal performance of its code and has to try to make the best use of the hardware. This activity is one of the foci of the SFT group at CERN, which hosts, among others, the ROOT and Geant4 projects. The activity of the experiments is shared and coordinated via a Concurrency Forum, where the experience in optimising HEP code is presented and discussed. Another activity is the Geant-V project, centred on the development of a high-performance prototype for particle transport. Achieving a good concurrency level on the emerging parallel architectures without a complete redesign of the framework can only be done by parallelizing at event level, or with a much larger effort at track level. Apart from the shareable data structures, this typically implies a multiplication factor in terms of memory consumption compared to the single-threaded version, together with sub-optimal handling of event processing tails. Besides this, the low-level instruction pipelining of modern processors cannot be used efficiently to speed up the program. We have implemented a framework that allows scheduling vectors of particles to an arbitrary number of computing resources in a fine-grain parallel approach. The talk will review the current optimisation activities within the SFT group with a particular emphasis on the development perspectives towards a simulation framework able to profit

  12. DMFC performance and methanol cross-over: Experimental analysis and model validation

    Energy Technology Data Exchange (ETDEWEB)

    Casalegno, A.; Marchesi, R. [Dipartimento di Energia, Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (Italy)

    2008-10-15

    A combined experimental and modelling approach is proposed to analyze methanol cross-over and its effect on DMFC performance. The experimental analysis is performed in order to allow an accurate investigation of methanol cross-over influence on DMFC performance, hence measurements were characterized in terms of uncertainty and reproducibility. The findings suggest that methanol cross-over is mainly determined by diffusion transport and affects cell performance partly via methanol electro-oxidation at the cathode. The modelling analysis is carried out to further investigate methanol cross-over phenomenon. A simple model evaluates the effectiveness of two proposed interpretations regarding methanol cross-over and its effects. The model is validated using the experimental data gathered. Both the experimental analysis and the proposed and validated model allow a substantial step forward in the understanding of the main phenomena associated with methanol cross-over. The findings confirm the possibility to reduce methanol cross-over by optimizing anode feeding. (author)
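    A back-of-envelope sketch of the diffusion-dominated crossover picture described above: a Fickian methanol flux through the membrane and its six-electron equivalent parasitic current density. All parameter values are illustrative assumptions, not data from the paper.

```python
# Back-of-envelope view of diffusion-driven methanol crossover: Fickian flux
# N = D*C/L through the membrane and its 6-electron equivalent parasitic current
# density. All parameter values are illustrative assumptions.
F = 96485.0   # Faraday constant, C/mol

def crossover_current_density(D_cm2_s, conc_mol_cm3, thickness_cm):
    """Return the equivalent crossover current density in A/cm^2."""
    flux = D_cm2_s * conc_mol_cm3 / thickness_cm   # mol cm^-2 s^-1
    return 6.0 * F * flux

# Assumed values: 1 M methanol feed, ~180 um membrane, D ~ 5e-6 cm^2/s
i_cross = crossover_current_density(D_cm2_s=5e-6, conc_mol_cm3=1.0e-3, thickness_cm=0.018)
print(f"Equivalent crossover current density ~ {1000.0 * i_cross:.0f} mA/cm^2")
```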

  13. Validity of the Optometry Admission Test in Predicting Performance in Schools and Colleges of Optometry.

    Science.gov (United States)

    Kramer, Gene A.; Johnston, JoElle

    1997-01-01

    A study examined the relationship between Optometry Admission Test scores and pre-optometry or undergraduate grade point average (GPA) with first and second year performance in optometry schools. The test's predictive validity was limited but significant, and comparable to those reported for other admission tests. In addition, the scores…

  14. Ecological Development and Validation of a Music Performance Rating Scale for Five Instrument Families

    Science.gov (United States)

    Wrigley, William J.; Emmerson, Stephen B.

    2013-01-01

    This study investigated ways to improve the quality of music performance evaluation in an effort to address the accountability imperative in tertiary music education. An enhanced scientific methodology was employed incorporating ecological validity and using recognized qualitative methods involving grounded theory and quantitative methods…

  15. Development and content validation of performance assessments for endoscopic third ventriculostomy

    NARCIS (Netherlands)

    Breimer, Gerben E.; Haji, Faizal A.; Hoving, Eelco W; Drake, James M.

    This study aims to develop and establish the content validity of multiple expert rating instruments to assess performance in endoscopic third ventriculostomy (ETV), collectively called the Neuro-Endoscopic Ventriculostomy Assessment Tool (NEVAT). The important aspects of ETV were identified through

  16. Development and Validation of a Rating Scale for Wind Jazz Improvisation Performance

    Science.gov (United States)

    Smith, Derek T.

    2009-01-01

    The purpose of this study was to construct and validate a rating scale for collegiate wind jazz improvisation performance. The 14-item Wind Jazz Improvisation Evaluation Scale (WJIES) was constructed and refined through a facet-rational approach to scale development. Five wind jazz students and one professional jazz educator were asked to record…

  17. Validating Performance Level Descriptors (PLDs) for the AP® Environmental Science Exam

    Science.gov (United States)

    Reshetar, Rosemary; Kaliski, Pamela; Chajewski, Michael; Lionberger, Karen

    2012-01-01

    This presentation summarizes a pilot study conducted after the May 2011 administration of the AP Environmental Science Exam. The study used analytical methods based on scaled anchoring as input to a Performance Level Descriptor validation process that solicited systematic input from subject matter experts.

  18. Assessing Cognitive Performance in Badminton Players : A Reproducibility and Validity Study

    NARCIS (Netherlands)

    van de Water, Tanja; Huijgen, Barbara; Faber, Irene R.; Elferink-Gemser, Marije

    2017-01-01

    Fast reaction and good inhibitory control are associated with elite sports performance. To evaluate the reproducibility and validity of a newly developed Badminton Reaction Inhibition Test (BRIT), fifteen elite (25 +/- 4 years) and nine non-elite (24 +/- 4 years) Dutch male badminton players

  19. VALIDITY OF THE DIMENSIONAL CONFIGURATION OF THE REDUCED POTENTIAL PERFORMANCE MODEL IN SKI JUMPING

    OpenAIRE

    Ulaga, Maja; Čoh, Milan; Jošt, Bojan

    2007-01-01

    The aim of the study was to establish the validity of the dimensional configuration of the reduced potential performance model in ski jumping. Two performance models were prepared (models A and B), differing only in terms of their method of determining the weights (dimensional configuration). Model A involves the dependent determination of weights while model B includes the independent determination of weights. The sample consisted of 104 Slovenian ski jumpers from the senior-men’s categor...

  20. Assessing Cognitive Performance in Badminton Players: A Reproducibility and Validity Study

    Directory of Open Access Journals (Sweden)

    van de Water Tanja

    2017-01-01

    Fast reaction and good inhibitory control are associated with elite sports performance. To evaluate the reproducibility and validity of a newly developed Badminton Reaction Inhibition Test (BRIT), fifteen elite (25 ± 4 years) and nine non-elite (24 ± 4 years) Dutch male badminton players participated in the study. The BRIT measured four components: domain-general reaction time, badminton-specific reaction time, domain-general inhibitory control and badminton-specific inhibitory control. Five participants were retested within three weeks on the badminton-specific components. Reproducibility was acceptable for badminton-specific reaction time (ICC = 0.626, CV = 6%) and for badminton-specific inhibitory control (ICC = 0.317, CV = 13%). Good construct validity was shown for badminton-specific reaction time discriminating between elite and non-elite players (F = 6.650, p < 0.05). Concurrent validity for domain-general reaction time was good, as it was associated with a national ranking for elite players (p = 0.70, p < 0.05). In conclusion, reproducibility and validity of inhibitory control assessment was not confirmed; however, the BRIT appears a reproducible and valid measure of reaction time in badminton players. Reaction time measured with the BRIT may provide input for training programs aiming to improve badminton players’ performance.

  1. Assessing Cognitive Performance in Badminton Players: A Reproducibility and Validity Study.

    Science.gov (United States)

    van de Water, Tanja; Huijgen, Barbara; Faber, Irene; Elferink-Gemser, Marije

    2017-01-01

    Fast reaction and good inhibitory control are associated with elite sports performance. To evaluate the reproducibility and validity of a newly developed Badminton Reaction Inhibition Test (BRIT), fifteen elite (25 ± 4 years) and nine non-elite (24 ± 4 years) Dutch male badminton players participated in the study. The BRIT measured four components: domain-general reaction time, badminton-specific reaction time, domain-general inhibitory control and badminton-specific inhibitory control. Five participants were retested within three weeks on the badminton-specific components. Reproducibility was acceptable for badminton-specific reaction time (ICC = 0.626, CV = 6%) and for badminton-specific inhibitory control (ICC = 0.317, CV = 13%). Good construct validity was shown for badminton-specific reaction time discriminating between elite and non-elite players (F = 6.650, p < 0.05). Concurrent validity for domain-general reaction time was good, as it was associated with a national ranking for elite players (p = 0.70, p < 0.05), but not for badminton-specific reaction time, nor for both components of inhibitory control (p > 0.05). In conclusion, reproducibility and validity of inhibitory control assessment was not confirmed; however, the BRIT appears a reproducible and valid measure of reaction time in badminton players. Reaction time measured with the BRIT may provide input for training programs aiming to improve badminton players' performance.
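    A sketch of one common way to obtain test-retest statistics like those quoted above: an intraclass correlation from a two-way ANOVA decomposition (here ICC(2,1), absolute agreement) and a within-subject coefficient of variation. The exact ICC form used in the study is not stated, and the retest data below are invented.

```python
# Test-retest reproducibility sketch: ICC(2,1) from a two-way ANOVA decomposition
# and a within-subject coefficient of variation. The exact ICC form used in the
# study is not stated; the data are invented.
import numpy as np

def icc_2_1(x):
    """x: (n_subjects, k_sessions) matrix of scores."""
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)
    col_means = x.mean(axis=0)
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)
    mse = np.sum((x - row_means[:, None] - col_means[None, :] + grand) ** 2) / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

def within_subject_cv(x):
    """Root-mean-square within-subject SD, as a percentage of the grand mean."""
    return 100.0 * np.sqrt(np.mean(x.std(axis=1, ddof=1) ** 2)) / x.mean()

# Invented test-retest reaction times (ms) for 5 retested players, 2 sessions
scores = np.array([[412, 398], [455, 470], [390, 405], [430, 418], [465, 441]], dtype=float)
print(f"ICC(2,1) = {icc_2_1(scores):.3f}, CV = {within_subject_cv(scores):.1f}%")
```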

  2. A High Performance COTS Based Computer Architecture

    Science.gov (United States)

    Patte, Mathieu; Grimoldi, Raoul; Trautner, Roland

    2014-08-01

    Using Commercial Off The Shelf (COTS) electronic components for space applications is a long-standing idea. Indeed, the difference in processing performance and energy efficiency between radiation hardened components and COTS components is so important that COTS components are very attractive for use in mass and power constrained systems. However, using COTS components in space is not straightforward, as one must account for the effects of the space environment on the COTS components' behavior. In the frame of the ESA funded activity called High Performance COTS Based Computer, Airbus Defense and Space and its subcontractor OHB CGS have developed and prototyped a versatile COTS based architecture for high performance processing. The rest of the paper is organized as follows: in a first section we will start by recapitulating the interests and constraints of using COTS components for space applications; then we will briefly describe existing fault mitigation architectures and present our solution for fault mitigation based on a component called the SmartIO; in the last part of the paper we will describe the prototyping activities executed during the HiP CBC project.

  3. Management issues for high performance storage systems

    Energy Technology Data Exchange (ETDEWEB)

    Louis, S. [Lawrence Livermore National Lab., CA (United States); Burris, R. [Oak Ridge National Lab., TN (United States)

    1995-03-01

    Managing distributed high-performance storage systems is complex and, although sharing common ground with traditional network and systems management, presents unique storage-related issues. Integration technologies and frameworks exist to help manage distributed network and system environments. Industry-driven consortia provide open forums where vendors and users cooperate to leverage solutions. But these new approaches to open management fall short of addressing the needs of scalable, distributed storage. We discuss the motivation and requirements for storage system management (SSM) capabilities and describe how SSM manages distributed servers and storage resource objects in the High Performance Storage System (HPSS), a new storage facility for data-intensive applications and large-scale computing. Modern storage systems, such as HPSS, require many SSM capabilities, including server and resource configuration control, performance monitoring, quality of service, flexible policies, file migration, file repacking, accounting, and quotas. We present results of initial HPSS SSM development including design decisions and implementation trade-offs. We conclude with plans for follow-on work and provide storage-related recommendations for vendors and standards groups seeking enterprise-wide management solutions.

  4. Automatic Energy Schemes for High Performance Applications

    Energy Technology Data Exchange (ETDEWEB)

    Sundriyal, Vaibhav [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Although high-performance computing traditionally focuses on the efficient execution of large-scale applications, both energy and power have become critical concerns when approaching exascale. Drastic increases in the power consumption of supercomputers affect significantly their operating costs and failure rates. In modern microprocessor architectures, equipped with dynamic voltage and frequency scaling (DVFS) and CPU clock modulation (throttling), the power consumption may be controlled in software. Additionally, network interconnect, such as Infiniband, may be exploited to maximize energy savings while the application performance loss and frequency switching overheads must be carefully balanced. This work first studies two important collective communication operations, all-to-all and allgather and proposes energy saving strategies on the per-call basis. Next, it targets point-to-point communications to group them into phases and apply frequency scaling to them to save energy by exploiting the architectural and communication stalls. Finally, it proposes an automatic runtime system which combines both collective and point-to-point communications into phases, and applies throttling to them apart from DVFS to maximize energy savings. The experimental results are presented for NAS parallel benchmark problems as well as for the realistic parallel electronic structure calculations performed by the widely used quantum chemistry package GAMESS. Close to the maximum energy savings were obtained with a substantially low performance loss on the given platform.
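    A hedged sketch of the mechanism such schemes rely on: lowering a core's frequency around a communication-bound phase through the Linux cpufreq sysfs interface and restoring it afterwards. It requires root privileges and a driver exposing the 'userspace' governor; the frequency value and the placeholder phase are assumptions, and the runtime system described above operates inside the MPI library rather than through a wrapper like this.

```python
# Hedged sketch of DVFS around a communication-bound phase via the Linux cpufreq
# sysfs interface. Requires root and a driver exposing the 'userspace' governor;
# the frequency value and the placeholder phase below are assumptions.
import time

CPU = 0
BASE = f"/sys/devices/system/cpu/cpu{CPU}/cpufreq"

def write_sysfs(name, value):
    with open(f"{BASE}/{name}", "w") as f:
        f.write(str(value))

def read_sysfs(name):
    with open(f"{BASE}/{name}") as f:
        return f.read().strip()

def run_phase_at_low_frequency(phase_fn, low_khz):
    """Scale the core down for a communication-bound phase, then restore the governor."""
    original_governor = read_sysfs("scaling_governor")
    try:
        write_sysfs("scaling_governor", "userspace")
        write_sysfs("scaling_setspeed", low_khz)
        phase_fn()
    finally:
        write_sysfs("scaling_governor", original_governor)

if __name__ == "__main__":
    # Placeholder for an MPI collective or point-to-point communication phase
    run_phase_at_low_frequency(lambda: time.sleep(1.0), low_khz=1200000)
```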

  5. A high performance architecture for accelerator controls

    International Nuclear Information System (INIS)

    Allen, M.; Hunt, S.M; Lue, H.; Saltmarsh, C.G.; Parker, C.R.C.B.

    1991-01-01

    The demands placed on the Superconducting Super Collider (SSC) control system due to large distances, high bandwidth and fast response time required for operation will require a fresh approach to the data communications architecture of the accelerator. The prototype design effort aims at providing deterministic communication across the accelerator complex with a response time of < 100 ms and total bandwidth of 2 Gbits/sec. It will offer a consistent interface for a large number of equipment types, from vacuum pumps to beam position monitors, providing appropriate communications performance for each equipment type. It will consist of highly parallel links to all equipment: those with computing resources, non-intelligent direct control interfaces, and data concentrators. This system will give each piece of equipment a dedicated link of fixed bandwidth to the control system. Application programs will have access to all accelerator devices which will be memory mapped into a global virtual addressing scheme. Links to devices in the same geographical area will be multiplexed using commercial Time Division Multiplexing equipment. Low-level access will use reflective memory techniques, eliminating processing overhead and complexity of traditional data communication protocols. The use of commercial standards and equipment will enable a high performance system to be built at low cost

  6. A high performance architecture for accelerator controls

    International Nuclear Information System (INIS)

    Allen, M.; Hunt, S.M.; Lue, H.; Saltmarsh, C.G.; Parker, C.R.C.B.

    1991-03-01

    The demands placed on the Superconducting Super Collider (SSC) control system due to large distances, high bandwidth and fast response time required for operation will require a fresh approach to the data communications architecture of the accelerator. The prototype design effort aims at providing deterministic communication across the accelerator complex with a response time of <100 ms and total bandwidth of 2 Gbits/sec. It will offer a consistent interface for a large number of equipment types, from vacuum pumps to beam position monitors, providing appropriate communications performance for each equipment type. It will consist of highly parallel links to all equipment: those with computing resources, non-intelligent direct control interfaces, and data concentrators. This system will give each piece of equipment a dedicated link of fixed bandwidth to the control system. Application programs will have access to all accelerator devices which will be memory mapped into a global virtual addressing scheme. Links to devices in the same geographical area will be multiplexed using commercial Time Division Multiplexing equipment. Low-level access will use reflective memory techniques, eliminating processing overhead and complexity of traditional data communication protocols. The use of commercial standards and equipment will enable a high performance system to be built at low cost. 1 fig

  7. High performance computing in linear control

    International Nuclear Information System (INIS)

    Datta, B.N.

    1993-01-01

    Remarkable progress has been made in both theory and applications of all important areas of control. The theory is rich and very sophisticated. Some beautiful applications of control theory are presently being made in aerospace, biomedical engineering, industrial engineering, robotics, economics, power systems, etc. Unfortunately, the same assessment of progress does not hold in general for computations in control theory. Control Theory is lagging behind other areas of science and engineering in this respect. Nowadays there is a revolution going on in the world of high performance scientific computing. Many powerful computers with vector and parallel processing have been built and have been available in recent years. These supercomputers offer very high speed in computations. Highly efficient software, based on powerful algorithms, has been developed to use on these advanced computers, and has also contributed to increased performance. While workers in many areas of science and engineering have taken great advantage of these hardware and software developments, control scientists and engineers, unfortunately, have not been able to take much advantage of these developments

  8. Building Trust in High-Performing Teams

    Directory of Open Access Journals (Sweden)

    Aki Soudunsaari

    2012-06-01

    Full Text Available Facilitation of growth is more about good, trustworthy contacts than capital. Trust is a driving force for business creation, and to create a global business you need to build a team that is capable of meeting the challenge. Trust is a key factor in team building and a needed enabler for cooperation. In general, trust building is a slow process, but it can be accelerated with open interaction and good communication skills. The fast-growing and ever-changing nature of global business sets demands for cooperation and team building, especially for startup companies. Trust building needs personal knowledge and regular face-to-face interaction, but it also requires empathy, respect, and genuine listening. Trust increases communication, and rich and open communication is essential for the building of high-performing teams. Other building materials are a shared vision, clear roles and responsibilities, willingness for cooperation, and supporting and encouraging leadership. This study focuses on trust in high-performing teams. It asks whether it is possible to manage trust and which tools and operation models should be used to speed up the building of trust. In this article, preliminary results from the authors’ research are presented to highlight the importance of sharing critical information and having a high level of communication through constant interaction.

  9. The concurrent validity of the technical test battery as an indicator of work performance in a telecommunications company

    Directory of Open Access Journals (Sweden)

    Marelize Barnard

    2005-10-01

    Full Text Available The purpose of this study was to assess the concurrent validity of the Technical Test Battery (TTB) in a South African telecommunications institution. The Technical Test Battery (TTB) was administered to a sample of 107 technical officers. Their test scores were compared to the scores obtained from a job performance rating scale specifically designed for this position on the basis of a thorough job analysis. The TTB demonstrated high concurrent validity as an indicator of work performance for technical posts in the telecommunications environment. These results suggest that the TTB may have a high predictive validity for performance in technical positions. The findings and implications of the study are discussed. Opsomming (Summary): The aim of this study was to determine the concurrent validity of the Technical Test Battery (TTB) in a South African telecommunications institution. The TTB was administered to a sample of 107 technical staff. The test scores were related to the scores of a job performance measure developed specifically for the position on the basis of a thorough job analysis. It was found that the TTB shows high concurrent validity as an indicator of work performance for technical posts in the telecommunications industry. These results point to a strong possibility that the TTB may be a good predictor of work performance for technical occupations. The findings and implications of the study are discussed.

  10. Improving UV Resistance of High Performance Fibers

    Science.gov (United States)

    Hassanin, Ahmed

    High performance fibers are characterized by their superior properties compared to traditional textile fibers. High strength fibers have high modulus, high strength to weight ratio, high chemical resistance, and usually high temperature resistance. They are used in applications where superior properties are needed, such as bulletproof vests, ropes and cables, cut resistant products, load tendons for giant scientific balloons, fishing rods, tennis racket strings, parachute cords, adhesives and sealants, protective apparel and tire cords. Unfortunately, ultraviolet (UV) radiation causes serious degradation to most high performance fibers. UV light, either natural or artificial, causes organic compounds to decompose and degrade, because the energy of the photons of UV light is high enough to break chemical bonds, causing chain scission. This work aims at achieving maximum protection of high performance fibers using sheathing approaches. The proposed sheaths are lightweight to maintain the advantage of the high performance fiber, that is, the high strength to weight ratio. This study involves developing three different types of sheathing. The product of interest that needs to be protected from UV is a braid of PBO. The first approach is extruding a sheath of Low Density Polyethylene (LDPE) loaded with different percentages of rutile TiO2 nanoparticles around the PBO braid. The results of this approach showed that the LDPE sheath loaded with 10% TiO2 by weight achieved the highest protection compared to 0% and 5% TiO2. The protection here is judged by the strength loss of PBO. This trend was noticed in different weathering environments, where the sheathed samples were exposed to UV-VIS radiation in different weatherometer equipment as well as to a high altitude environment using a NASA BRDL balloon. The second approach focuses on developing a protective porous membrane from polyurethane loaded with rutile TiO2 nanoparticles. Membrane from polyurethane loaded with 4

  11. Intel Xeon Phi coprocessor high performance programming

    CERN Document Server

    Jeffers, James

    2013-01-01

    Authors Jim Jeffers and James Reinders spent two years helping educate customers about the prototype and pre-production hardware before Intel introduced the first Intel Xeon Phi coprocessor. They have distilled their own experiences coupled with insights from many expert customers, Intel Field Engineers, Application Engineers and Technical Consulting Engineers, to create this authoritative first book on the essentials of programming for this new architecture and these new products. This book is useful even before you ever touch a system with an Intel Xeon Phi coprocessor. To ensure that your applications run at maximum efficiency, the authors emphasize key techniques for programming any modern parallel computing system whether based on Intel Xeon processors, Intel Xeon Phi coprocessors, or other high performance microprocessors. Applying these techniques will generally increase your program performance on any system, and better prepare you for Intel Xeon Phi coprocessors and the Intel MIC architecture. It off...

  12. Development of high-performance blended cements

    Science.gov (United States)

    Wu, Zichao

    2000-10-01

    This thesis presents the development of high-performance blended cements from industrial by-products. To overcome the low early strength of blended cements, several chemicals were studied as activators for cement hydration. Sodium sulfate was found to be the best activator. The blending proportions were optimized by Taguchi experimental design. The optimized blended cements containing up to 80% fly ash performed better than Type I cement in strength development and durability. Maintaining a constant cement content, concrete produced from the optimized blended cements had equal or higher strength and higher durability than that produced from Type I cement alone. The key to the activation mechanism was the reaction between the added SO4^2- and the Ca^2+ dissolved from cement hydration products.

  13. Is Learner Self-Assessment Reliable and Valid in a Web-Based Portfolio Environment for High School Students?

    Science.gov (United States)

    Chang, Chi-Cheng; Liang, Chaoyun; Chen, Yi-Hui

    2013-01-01

    This study explored the reliability and validity of Web-based portfolio self-assessment. Participants were 72 senior high school students enrolled in a computer application course. The students created learning portfolios, viewed peers' work, and performed self-assessment on the Web-based portfolio assessment system. The results indicated: 1)…

  14. Utilities for high performance dispersion model PHYSIC

    International Nuclear Information System (INIS)

    Yamazawa, Hiromi

    1992-09-01

    The description and usage of the utilities for the dispersion calculation model PHYSIC were summarized. The model was developed in the study of developing high performance SPEEDI with the purpose of introducing meteorological forecast function into the environmental emergency response system. The procedure of PHYSIC calculation consists of three steps; preparation of relevant files, creation and submission of JCL, and graphic output of results. A user can carry out the above procedure with the help of the Geographical Data Processing Utility, the Model Control Utility, and the Graphic Output Utility. (author)

  15. An integrated high performance fastbus slave interface

    International Nuclear Information System (INIS)

    Christiansen, J.; Ljuslin, C.

    1992-01-01

    A high performance Fastbus slave interface ASIC is presented. The Fastbus slave integrated circuit (FASIC) is a programmable device, enabling its direct use in many different applications. The FASIC acts as an interface between Fastbus and a 'standard' processor/memory bus. It can work stand-alone or together with a microprocessor. A set of address mapping windows can map Fastbus addresses to convenient memory addresses and at the same time act as address decoding logic. Data rates of 100 MBytes/s to Fastbus can be obtained using an internal FIFO buffer in the FASIC. (orig.)
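
    As an illustration of the address-window idea described above, here is a minimal sketch (hypothetical window layout, written in Python rather than as register-level logic) of what a mapping window does: it decides whether a bus address falls inside the window and, if it does, translates it to a local processor/memory address while simultaneously acting as the address decoder.

        class AddressWindow:
            def __init__(self, bus_base, size, local_base):
                self.bus_base, self.size, self.local_base = bus_base, size, local_base

            def translate(self, bus_address):
                offset = bus_address - self.bus_base
                if 0 <= offset < self.size:
                    return self.local_base + offset   # window hit: mapped local address
                return None                           # window miss: not decoded here

        # hypothetical layout: two windows onto the local processor/memory bus
        windows = [
            AddressWindow(bus_base=0x8000_0000, size=0x1000, local_base=0x0000_4000),
            AddressWindow(bus_base=0x9000_0000, size=0x0400, local_base=0x0001_0000),
        ]

        def decode(bus_address):
            for w in windows:
                local = w.translate(bus_address)
                if local is not None:
                    return local
            raise ValueError(f"address {bus_address:#x} not claimed by any window")

        print(hex(decode(0x8000_0040)))   # -> 0x4040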

  16. High performance visual display for HENP detectors

    CERN Document Server

    McGuigan, M; Spiletic, J; Fine, V; Nevski, P

    2001-01-01

    A high end visual display for High Energy Nuclear Physics (HENP) detectors is necessary because of the sheer size and complexity of the detector. For BNL this display will be of special interest because of STAR and ATLAS. To load, rotate, query, and debug simulation code with a modern detector simply takes too long even on a powerful workstation. To visualize the HENP detectors with maximal performance we have developed software with the following characteristics. We develop a visual display of HENP detectors on a BNL multiprocessor visualization server at multiple levels of detail. We work with a general and generic detector framework consistent with ROOT, GAUDI etc., to avoid conflicting with the many graphic development groups associated with specific detectors like STAR and ATLAS. We develop advanced OpenGL features such as transparency and polarized stereoscopy. We enable collaborative viewing of detector and events by directly running the analysis in the BNL stereoscopic theatre. We construct enhanced interactiv...

  17. High-Performance Vertical Organic Electrochemical Transistors.

    Science.gov (United States)

    Donahue, Mary J; Williamson, Adam; Strakosas, Xenofon; Friedlein, Jacob T; McLeod, Robert R; Gleskova, Helena; Malliaras, George G

    2018-02-01

    Organic electrochemical transistors (OECTs) are promising transducers for biointerfacing due to their high transconductance, biocompatibility, and availability in a variety of form factors. Most OECTs reported to date, however, utilize rather large channels, limiting the transistor performance and resulting in a low transistor density. This is typically a consequence of limitations associated with traditional fabrication methods and with 2D substrates. Here, the fabrication and characterization of OECTs with vertically stacked contacts, which overcome these limitations, is reported. The resulting vertical transistors exhibit a reduced footprint, increased intrinsic transconductance of up to 57 mS, and a geometry-normalized transconductance of 814 S m^-1. The fabrication process is straightforward and compatible with sensitive organic materials, and allows exceptional control over the transistor channel length. This novel 3D fabrication method is particularly suited for applications where high density is needed, such as in implantable devices. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Validity and Reliability of the Academic Resilience Scale in Turkish High School

    Science.gov (United States)

    Kapikiran, Sahin

    2012-01-01

    The present study aims to determine the validity and reliability of the academic resilience scale in Turkish high schools. The participants of the study include 378 high school students in total (192 female and 186 male). A set of analyses was conducted in order to determine the validity and reliability of the scale. Firstly, both exploratory…

  19. High Performance Data Distribution for Scientific Community

    Science.gov (United States)

    Tirado, Juan M.; Higuero, Daniel; Carretero, Jesus

    2010-05-01

    Institutions such as NASA, ESA or JAXA need solutions to distribute data from their missions to the scientific community and to their long term archives. This is a complex problem, as it includes a vast amount of data, several geographically distributed archives, heterogeneous architectures with heterogeneous networks, and users spread around the world. We propose a novel architecture (HIDDRA) that solves this problem aiming to reduce user intervention in data acquisition and processing. HIDDRA is a modular system that provides a highly efficient parallel multiprotocol download engine, using a publish/subscribe policy which helps the final user to obtain data of interest transparently. Our system can deal simultaneously with multiple protocols (HTTP, HTTPS, FTP, GridFTP among others) to obtain the maximum bandwidth, reducing the workload on the data server and increasing flexibility. It can also provide high reliability and fault tolerance, as several sources of data can be used to perform one file download. The HIDDRA architecture can be arranged into a data distribution network deployed on several sites that can cooperate to provide the features described above. HIDDRA has been addressed by the 2009 e-IRG Report on Data Management as a promising initiative for data interoperability. Our first prototype has been evaluated in collaboration with the ESAC centre in Villafranca del Castillo (Spain), showing high scalability and performance and opening a wide spectrum of opportunities. Some preliminary results have been published in the journal Astrophysics and Space Science [1]. [1] D. Higuero, J.M. Tirado, J. Carretero, F. Félix, and A. de La Fuente. HIDDRA: a highly independent data distribution and retrieval architecture for space observation missions. Astrophysics and Space Science, 321(3):169-175, 2009
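
    The following sketch (not the HIDDRA code; all names are illustrative) shows the two ideas highlighted above: a publish/subscribe interface, so users state which dataset they want rather than where to fetch it, and multiple interchangeable sources per dataset, so a download can fall back to another mirror or protocol. Real handlers would wrap HTTP, FTP, GridFTP and so on.

        from concurrent.futures import ThreadPoolExecutor

        class DataDistributor:
            """Toy publish/subscribe data distributor with per-file source failover."""
            def __init__(self, workers=4):
                self.pool = ThreadPoolExecutor(max_workers=workers)
                self.subscribers = {}   # dataset name -> list of callbacks
                self.sources = {}       # dataset name -> fetch callables (mirrors/protocols)

            def subscribe(self, dataset, callback):
                self.subscribers.setdefault(dataset, []).append(callback)

            def register_source(self, dataset, fetch):
                self.sources.setdefault(dataset, []).append(fetch)

            def publish(self, dataset):
                return self.pool.submit(self._deliver, dataset)

            def _deliver(self, dataset):
                for fetch in self.sources.get(dataset, []):
                    try:
                        payload = fetch()          # e.g. an HTTP, FTP or GridFTP handler
                    except IOError:
                        continue                   # fall back to the next source
                    for callback in self.subscribers.get(dataset, []):
                        callback(dataset, payload)
                    return payload
                raise IOError(f"no working source for {dataset}")

        def broken_mirror():
            raise IOError("mirror down")

        dist = DataDistributor()
        dist.register_source("obs-2010-001", broken_mirror)                     # failing mirror
        dist.register_source("obs-2010-001", lambda: b"...observation data...")  # working mirror
        dist.subscribe("obs-2010-001", lambda name, data: print(name, len(data), "bytes"))
        dist.publish("obs-2010-001").result()      # wait for delivery in this demo

    The real engine is of course far more elaborate; the sketch only illustrates how publish/subscribe decouples what the user wants from where and how it is fetched.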

  20. High-performance laboratories and cleanrooms; TOPICAL

    International Nuclear Information System (INIS)

    Tschudi, William; Sartor, Dale; Mills, Evan; Xu, Tengfang

    2002-01-01

    The California Energy Commission sponsored this roadmap to guide energy efficiency research and deployment for high performance cleanrooms and laboratories. Industries and institutions utilizing these building types (termed high-tech buildings) have played an important part in the vitality of the California economy. This roadmap's key objective is to present a multi-year agenda to prioritize and coordinate research efforts. It also addresses delivery mechanisms to get the research products into the market. Because of the importance to the California economy, it is appropriate and important for California to take the lead in assessing the energy efficiency research needs, opportunities, and priorities for this market. In addition to the importance to California's economy, energy demand for this market segment is large and growing (estimated at 9400 GWh for 1996, Mills et al. 1996). With their 24-hour continuous operation, high tech facilities are a major contributor to the peak electrical demand. Laboratories and cleanrooms constitute the high tech building market, and although each building type has its unique features, they are similar in that they are extremely energy intensive, involve special environmental considerations, have very high ventilation requirements, and are subject to regulations (primarily safety driven) that tend to have adverse energy implications. High-tech buildings have largely been overlooked in past energy efficiency research. Many industries and institutions utilize laboratories and cleanrooms. As illustrated, there are many industries operating cleanrooms in California. These include semiconductor manufacturing, semiconductor suppliers, pharmaceutical, biotechnology, disk drive manufacturing, flat panel displays, automotive, aerospace, food, hospitals, medical devices, universities, and federal research facilities

  1. Simulation and experimental validation of the performance of an absorption refrigerator

    International Nuclear Information System (INIS)

    Olbricht, Michael; Luke, Andrea

    2015-01-01

    The two biggest obstacles to a stronger market penetration of absorption refrigerators are their high cost and the size of the apparatus, both of which are due to inaccurate methods for plant design. In order to contribute to an improved design, a thermodynamic model is presented to describe the performance of an absorption refrigerator with the working fluid pair water/lithium bromide. In this model, the processes in the individual apparatuses are represented and coupled to each other in the context of the overall system. In this way the interactions between the apparatuses can be specifically investigated and the process-limiting component can be identified under the respective conditions. The simulation model and the boundary conditions used are validated against experimental data obtained by operating a self-developed absorption refrigerator. In the simulation, the heat transfer surfaces can be specified in accordance with the real system. The heat transport is taken into account based on typical values for the heat transfer in the individual apparatuses. Simulation results show good agreement with the experimental data. The physical relationships and the influences of externally defined operating parameters are correctly reproduced. Due to the low heat transfer coefficients chosen, the cooling capacities calculated by the model are below the experimentally measured values. Finally, the possibilities and limitations of using the model are discussed and further improvements are suggested. [de]

  2. Validation of High Displacement Piezoelectric Actuator Finite Element Models

    Science.gov (United States)

    Taleghani, B. K.

    2000-01-01

    The paper presents the results obtained by using NASTRAN(Registered Trademark) and ANSYS(Registered Trademark) finite element codes to predict doming of the THUNDER piezoelectric actuators during the manufacturing process and subsequent straining due to an applied input voltage. To effectively use such devices in engineering applications, modeling and characterization are essential. Length, width, dome height, and thickness are important parameters for users of such devices. Therefore, finite element models were used to assess the effects of these parameters. NASTRAN(Registered Trademark) and ANSYS(Registered Trademark) used different methods for modeling piezoelectric effects. In NASTRAN(Registered Trademark), a thermal analogy was used to represent voltage at nodes as equivalent temperatures, while ANSYS(Registered Trademark) processed the voltage directly using piezoelectric finite elements. The results of finite element models were validated by using the experimental results.

  3. Transport in JET high performance plasmas

    International Nuclear Information System (INIS)

    2001-01-01

    Two types of high performance scenarios have been produced in JET during the DTE1 campaign. One of them is the well-known and previously extensively used ELM-free hot ion H-mode scenario, which has two distinct regions: the plasma core and the edge transport barrier. The results obtained during the DTE1 campaign with D, DT and pure T plasmas confirm our previous conclusion that the core transport scales as gyroBohm in the inner half of the plasma volume, recovers its Bohm nature closer to the separatrix and behaves as ion neoclassical in the transport barrier. Measurements at the top of the barrier suggest that the width of the barrier depends upon the isotope and moreover suggest that fast ions play a key role. The other high performance scenario is the relatively recently developed Optimised Shear Scenario with small or slightly negative magnetic shear in the plasma core. Different mechanisms of Internal Transport Barrier (ITB) formation have been tested by predictive modelling and the results are compared with experimentally observed phenomena. The experimentally observed non-penetration of heavy impurities through the strong ITB, which contradicts a prediction of the conventional neo-classical theory, is discussed. (author)

  4. High-performance computing for airborne applications

    International Nuclear Information System (INIS)

    Quinn, Heather M.; Manuzatto, Andrea; Fairbanks, Tom; Dallmann, Nicholas; Desgeorges, Rose

    2010-01-01

    Recently, there have been attempts to move common satellite tasks to unmanned aerial vehicles (UAVs). UAVs are significantly cheaper to buy than satellites and easier to deploy on an as-needed basis. The more benign radiation environment also allows for an aggressive adoption of state-of-the-art commercial computational devices, which increases the amount of data that can be collected. There are a number of commercial computing devices currently available that are well-suited to high-performance computing. These devices range from specialized computational devices, such as field-programmable gate arrays (FPGAs) and digital signal processors (DSPs), to traditional computing platforms, such as microprocessors. Even though the radiation environment is relatively benign, these devices could be susceptible to single-event effects. In this paper, we will present radiation data for high-performance computing devices in an accelerated neutron environment. These devices include a multi-core digital signal processor, two field-programmable gate arrays, and a microprocessor. From these results, we found that all of these devices are suitable for many airplane environments without reliability problems.

  5. Transport in JET high performance plasmas

    International Nuclear Information System (INIS)

    1999-01-01

    Two types of high performance scenarios have been produced in JET during the DTE1 campaign. One of them is the well-known and previously extensively used ELM-free hot ion H-mode scenario, which has two distinct regions: the plasma core and the edge transport barrier. The results obtained during the DTE1 campaign with D, DT and pure T plasmas confirm our previous conclusion that the core transport scales as gyroBohm in the inner half of the plasma volume, recovers its Bohm nature closer to the separatrix and behaves as ion neoclassical in the transport barrier. Measurements at the top of the barrier suggest that the width of the barrier depends upon the isotope and moreover suggest that fast ions play a key role. The other high performance scenario is the relatively recently developed Optimised Shear Scenario with small or slightly negative magnetic shear in the plasma core. Different mechanisms of Internal Transport Barrier (ITB) formation have been tested by predictive modelling and the results are compared with experimentally observed phenomena. The experimentally observed non-penetration of heavy impurities through the strong ITB, which contradicts a prediction of the conventional neo-classical theory, is discussed. (author)

  6. High-performance vertical organic transistors.

    Science.gov (United States)

    Kleemann, Hans; Günther, Alrun A; Leo, Karl; Lüssem, Björn

    2013-11-11

    Vertical organic thin-film transistors (VOTFTs) are promising devices to overcome the transconductance and cut-off frequency restrictions of horizontal organic thin-film transistors. The basic physical mechanisms of VOTFT operation, however, are not well understood and VOTFTs often require complex patterning techniques using self-assembly processes which impedes a future large-area production. In this contribution, high-performance vertical organic transistors comprising pentacene for p-type operation and C60 for n-type operation are presented. The static current-voltage behavior as well as the fundamental scaling laws of such transistors are studied, disclosing a remarkable transistor operation with a behavior limited by injection of charge carriers. The transistors are manufactured by photolithography, in contrast to other VOTFT concepts using self-assembled source electrodes. Fluorinated photoresist and solvent compounds allow for photolithographical patterning directly and strongly onto the organic materials, simplifying the fabrication protocol and making VOTFTs a prospective candidate for future high-performance applications of organic transistors. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Development and validation of trauma surgical skills metrics: Preliminary assessment of performance after training.

    Science.gov (United States)

    Shackelford, Stacy; Garofalo, Evan; Shalin, Valerie; Pugh, Kristy; Chen, Hegang; Pasley, Jason; Sarani, Babak; Henry, Sharon; Bowyer, Mark; Mackenzie, Colin F

    2015-07-01

    Maintaining trauma-specific surgical skills is an ongoing challenge for surgical training programs. An objective assessment of surgical skills is needed. We hypothesized that a validated surgical performance assessment tool could detect differences following a training intervention. We developed surgical performance assessment metrics based on discussion with expert trauma surgeons, video review of 10 experts and 10 novice surgeons performing three vascular exposure procedures and lower extremity fasciotomy on cadavers, and validated the metrics with interrater reliability testing by five reviewers blinded to level of expertise and a consensus conference. We tested these performance metrics in 12 surgical residents (Year 3-7) before and 2 weeks after vascular exposure skills training in the Advanced Surgical Skills for Exposure in Trauma (ASSET) course. Performance was assessed in three areas as follows: knowledge (anatomic, management), procedure steps, and technical skills. Time to completion of procedures was recorded, and these metrics were combined into a single performance score, the Trauma Readiness Index (TRI). Wilcoxon matched-pairs signed-ranks test compared pretraining/posttraining effects. Mean time to complete procedures decreased by 4.3 minutes (from 13.4 minutes to 9.1 minutes). The performance component most improved by the 1-day skills training was procedure steps, completion of which increased by 21%. Technical skill scores improved by 12%. Overall knowledge improved by 3%, with 18% improvement in anatomic knowledge. TRI increased significantly from 50% to 64% with ASSET training. Interrater reliability of the surgical performance assessment metrics was validated with single intraclass correlation coefficient of 0.7 to 0.98. A trauma-relevant surgical performance assessment detected improvements in specific procedure steps and anatomic knowledge taught during a 1-day course, quantified by the TRI. ASSET training reduced time to complete vascular
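
    As a sketch of the statistical comparison named above, the snippet below (hypothetical TRI scores, not the study data) runs a Wilcoxon matched-pairs signed-ranks test on paired pre- and post-training scores for twelve residents.

        import numpy as np
        from scipy.stats import wilcoxon

        # hypothetical Trauma Readiness Index scores (fraction of maximum) for 12 residents
        tri_pre = np.array([0.42, 0.48, 0.55, 0.51, 0.47, 0.60,
                            0.44, 0.53, 0.49, 0.58, 0.46, 0.50])
        tri_post = np.array([0.58, 0.63, 0.66, 0.61, 0.60, 0.72,
                             0.57, 0.69, 0.62, 0.71, 0.59, 0.65])

        stat, p_value = wilcoxon(tri_pre, tri_post)   # paired, non-parametric test
        print(f"median improvement = {np.median(tri_post - tri_pre):.2f}")
        print(f"Wilcoxon W = {stat:.1f}, p = {p_value:.4f}")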

  8. Performance of the CMS High Level Trigger

    CERN Document Server

    Perrotta, Andrea

    2015-01-01

    The CMS experiment has been designed with a 2-level trigger system. The first level is implemented using custom-designed electronics. The second level is the so-called High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. For Run II of the Large Hadron Collider, the increases in center-of-mass energy and luminosity will raise the event rate to a level challenging for the HLT algorithms. The increase in the number of interactions per bunch crossing, on average 25 in 2012, and expected to be around 40 in Run II, will be an additional complication. We present here the expected performance of the main triggers that will be used during the 2015 data taking campaign, paying particular attention to the new approaches that have been developed to cope with the challenges of the new run. This includes improvements in HLT electron and photon reconstruction as well as better performing muon triggers. We will also present the performance of the improved trac...

  9. Development of a High Performance Spacer Grid

    Energy Technology Data Exchange (ETDEWEB)

    Song, Kee Nam; Song, K. N.; Yoon, K. H. (and others)

    2007-03-15

    A spacer grid in a LWR fuel assembly is a key structural component to support fuel rods and to enhance the heat transfer from the fuel rod to the coolant. In this research, the main research items are the development of inherent and high performance spacer grid shapes, the establishment of mechanical/structural analysis and test technology, and the set-up of basic test facilities for the spacer grid. The main research areas and results are as follows. 1. 18 different spacer grid candidates have been invented and applied for domestic and US patents. Among the candidates 16 are chosen from the patent. 2. Two kinds of spacer grids are finally selected for the advanced LWR fuel after detailed performance tests on the candidates and commercial spacer grids from a mechanical/structural point of view. According to the test results the features of the selected spacer grids are better than those of the commercial spacer grids. 3. Four kinds of basic test facilities are set up and the relevant test technologies are established. 4. Mechanical/structural analysis models and technology for spacer grid performance are developed and the analysis results are compared with the test results to enhance the reliability of the models.

  10. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques that employ matrix factorizations incur a cubic cost which quickly becomes intractable with the current explosion of data sizes. In this work we reduce this complexity to quadratic with the synergy of two algorithms that gracefully complement each other and lead to a radically different approach. First, we turned to stochastic estimation of the diagonal. This allowed us to cast the problem as a linear system with a relatively small number of multiple right hand sides. Second, for this linear system we developed a novel, mixed precision, iterative refinement scheme, which uses iterative solvers instead of matrix factorizations. We demonstrate that the new framework not only achieves the much needed quadratic cost but in addition offers excellent opportunities for scaling in massively parallel environments. We based our implementation on BLAS 3 kernels that ensure very high processor performance. We achieved a peak performance of 730 TFlops on 72 BG/P racks, with a sustained performance of 73% of the theoretical peak. We stress that the techniques presented in this work are quite general and applicable to several other important applications. Copyright © 2009 ACM.
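
    A minimal sketch (not the authors' implementation) of the core idea: estimate diag(A^{-1}) stochastically by solving a modest number of probe systems with an iterative solver instead of factorizing A. The mixed-precision iterative refinement and BLAS-3 blocking described above are omitted, and the test matrix is an arbitrary stand-in for a covariance matrix.

        import numpy as np
        from scipy.sparse.linalg import cg

        def estimate_inverse_diagonal(A, num_probes=100, seed=0):
            rng = np.random.default_rng(seed)
            n = A.shape[0]
            num = np.zeros(n)
            den = np.zeros(n)
            for _ in range(num_probes):
                v = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe vector
                x, _info = cg(A, v)                   # iterative solve, no factorization
                num += v * x                          # accumulates v .* (A^{-1} v)
                den += v * v                          # accumulates v .* v
            return num / den                          # elementwise ratio -> diagonal estimate

        # small symmetric positive definite stand-in for a covariance matrix
        n = 200
        B = np.random.default_rng(1).standard_normal((n, n))
        A = B @ B.T + n * np.eye(n)

        approx = estimate_inverse_diagonal(A)
        exact = np.diag(np.linalg.inv(A))
        print("max relative error:", float(np.max(np.abs(approx - exact) / exact)))

    The estimate converges roughly like one over the square root of the number of probes, which is why casting the problem as a linear system with a small number of right hand sides pays off.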

  11. Energy Efficient Graphene Based High Performance Capacitors.

    Science.gov (United States)

    Bae, Joonwon; Kwon, Oh Seok; Lee, Chang-Soo

    2017-07-10

    Graphene (GRP) is an interesting class of nano-structured electronic materials for various cutting-edge applications. To date, extensive research activities have been performed on the investigation of diverse properties of GRP. The incorporation of this elegant material can be very lucrative in terms of practical applications in energy storage/conversion systems. Among those various systems, high performance electrochemical capacitors (ECs) have become popular due to the recent need for energy efficient and portable devices. Therefore, in this article, the application of GRP for capacitors is described succinctly. In particular, a concise summary of previous research activities regarding GRP-based capacitors is also provided. It was revealed that many secondary materials such as polymers and metal oxides have been introduced to improve the performance. Also, diverse devices have been combined with capacitors for better use. More importantly, recent patents related to the preparation and application of GRP-based capacitors are also introduced briefly. This article can provide essential information for future study. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  12. SISYPHUS: A high performance seismic inversion factory

    Science.gov (United States)

    Gokhberg, Alexey; Simutė, Saulė; Boehm, Christian; Fichtner, Andreas

    2016-04-01

    In the recent years the massively parallel high performance computers became the standard instruments for solving the forward and inverse problems in seismology. The respective software packages dedicated to forward and inverse waveform modelling specially designed for such computers (SPECFEM3D, SES3D) became mature and widely available. These packages achieve significant computational performance and provide researchers with an opportunity to solve problems of bigger size at higher resolution within a shorter time. However, a typical seismic inversion process contains various activities that are beyond the common solver functionality. They include management of information on seismic events and stations, 3D models, observed and synthetic seismograms, pre-processing of the observed signals, computation of misfits and adjoint sources, minimization of misfits, and process workflow management. These activities are time consuming, seldom sufficiently automated, and therefore represent a bottleneck that can substantially offset performance benefits provided by even the most powerful modern supercomputers. Furthermore, a typical system architecture of modern supercomputing platforms is oriented towards the maximum computational performance and provides limited standard facilities for automation of the supporting activities. We present a prototype solution that automates all aspects of the seismic inversion process and is tuned for the modern massively parallel high performance computing systems. We address several major aspects of the solution architecture, which include (1) design of an inversion state database for tracing all relevant aspects of the entire solution process, (2) design of an extensible workflow management framework, (3) integration with wave propagation solvers, (4) integration with optimization packages, (5) computation of misfits and adjoint sources, and (6) process monitoring. The inversion state database represents a hierarchical structure with

  13. Ultra high performance concrete dematerialization study

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-03-01

    Concrete is the most widely used building material in the world and its use is expected to grow. It is well recognized that the production of portland cement results in the release of large amounts of carbon dioxide, a greenhouse gas (GHG). The main challenge facing the industry is to produce concrete in an environmentally sustainable manner. Reclaimed industrial by-products such as fly ash, silica fume and slag can reduce the amount of portland cement needed to make concrete, thereby reducing the amount of GHGs released to the atmosphere. The use of these supplementary cementing materials (SCM) can also enhance the long-term strength and durability of concrete. The intention of the EcoSmart(TM) Concrete Project is to develop sustainable concrete through innovation in supply, design and construction. In particular, the project focuses on finding a way to minimize the GHG signature of concrete by maximizing the replacement of portland cement in the concrete mix with SCM while improving the cost, performance and constructability. This paper describes the use of Ductal(R) Ultra High Performance Concrete (UHPC) for ramps in a condominium. It examines the relationship between the selection of UHPC and the overall environmental performance, cost, constructability, maintenance and operational efficiency as it relates to the EcoSmart Program. The advantages and challenges of using UHPC are outlined. In addition to its very high strength, UHPC has been shown to have very good potential for GHG emission reduction due to the reduced material requirements, reduced transport costs and increased SCM content. refs., tabs., figs.

  14. Validation of the Monte Carlo Criticality Program KENO V. a for highly-enriched uranium systems

    Energy Technology Data Exchange (ETDEWEB)

    Knight, J.R.

    1984-11-01

    A series of calculations based on critical experiments have been performed using the KENO V.a Monte Carlo Criticality Program for the purpose of validating KENO V.a for use in evaluating Y-12 Plant criticality problems. The experiments were reflected and unreflected systems of single units and arrays containing highly enriched uranium metal or uranium compounds. Various geometrical shapes were used in the experiments. The SCALE control module CSAS25 with the 27-group ENDF/B-4 cross-section library was used to perform the calculations. Some of the experiments were also calculated using the 16-group Hansen-Roach Library. Results are presented in a series of tables and discussed. Results show that the criteria established for the safe application of the KENO IV program may also be used for KENO V.a results.
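
    The summary statistics behind this kind of validation can be sketched as follows (hypothetical k-eff values, not the report's data): each benchmark models a critical experiment, so the calculated k-eff should be close to 1.0, and the mean offset is the code bias for that class of systems.

        import numpy as np

        k_calc = np.array([0.9966, 1.0012, 0.9981, 0.9958, 1.0024,
                           0.9990, 0.9973, 1.0005, 0.9969, 0.9987])  # hypothetical benchmark results
        sigma_mc = np.full_like(k_calc, 0.0015)                      # Monte Carlo std. dev. per case

        bias = k_calc.mean() - 1.0          # negative bias: the code underpredicts k-eff
        spread = k_calc.std(ddof=1)         # scatter of the benchmark results
        print(f"mean k-eff = {k_calc.mean():.4f}")
        print(f"bias       = {bias:+.4f}")
        print(f"std. dev.  = {spread:.4f} (benchmarks), {sigma_mc.mean():.4f} (Monte Carlo)")

        # purely illustrative acceptance criterion: credit only a non-positive bias,
        # subtract the benchmark scatter and an administrative margin
        margin = 0.05
        k_limit = 1.0 + min(bias, 0.0) - 2 * spread - margin
        print(f"illustrative upper limit on calculated k-eff: {k_limit:.4f}")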

  15. Numerical modeling and validation of helium jet impingement cooling of high heat flux divertor components

    International Nuclear Information System (INIS)

    Koncar, Bostjan; Simonovski, Igor; Norajitra, Prachai

    2009-01-01

    Numerical analyses of jet impingement cooling presented in this paper were performed as a part of helium-cooled divertor studies for the post-ITER generation of fusion reactors. The cooling ability of a divertor cooled by multiple helium jets was analysed. Thermal-hydraulic characteristics and temperature distributions in the solid structures were predicted for the reference geometry of one cooling finger. To assess numerical errors, different meshes (hexahedral, tetrahedral, tetra-prism) and discretisation schemes were used. The temperatures in the solid structures decrease with finer mesh and higher order discretisation and converge towards finite values. Numerical simulations were validated against high heat flux experiments performed at the Efremov Institute, St. Petersburg. The predicted design parameters show reasonable agreement with the measured data. The calculated maximum thimble temperature was below the tile-thimble brazing temperature, indicating good heat removal capability of the reference divertor design. (author)

  16. JT-60U high performance regimes

    International Nuclear Information System (INIS)

    Ishida, S.

    1999-01-01

    High performance regimes of JT-60U plasmas are presented with an emphasis upon the results from the use of a semi-closed pumped divertor with W-shaped geometry. Plasma performance in transient and quasi steady states has been significantly improved in reversed shear and high-βp regimes. The reversed shear regime elevated an equivalent Q_DT^eq transiently up to 1.25 (n_D(0)·τ_E·T_i(0) = 8.6x10^20 m^-3·s·keV) in a reactor-relevant thermonuclear dominant regime. Long sustainment of enhanced confinement with internal transport barriers (ITBs) with a fully non-inductive current drive in a reversed shear discharge was successfully demonstrated with LH wave injection. Performance sustainment has been extended in the high-βp regime with a high triangularity, achieving a long sustainment of plasma conditions equivalent to Q_DT^eq ∼0.16 (n_D(0)·τ_E·T_i(0) ∼1.4x10^20 m^-3·s·keV) for ∼4.5 s with a large non-inductive current drive fraction of 60-70% of the plasma current. Thermal and particle transport analyses show significant reduction of thermal and particle diffusivities around the ITB, resulting in a strong E_r shear in the ITB region. The W-shaped divertor is effective for He ash exhaust, demonstrating steady exhaust capability of τ_He*/τ_E ∼3-10 in support of ITER. Suppression of neutral back flow and chemical sputtering effect have been observed while MARFE onset density is rather decreased. Negative-ion based neutral beam injection (N-NBI) experiments have created a clear H-mode transition. Enhanced ionization cross-section due to multi-step ionization processes was confirmed as theoretically predicted. A current density profile driven by N-NBI is measured in good agreement with theoretical prediction. N-NBI induced TAE modes characterized as persistent and bursting oscillations have been observed from a low hot beta of β_h >∼0.1-0.2% without a significant loss of fast ions. (author)

  17. Performance validation of commercially available mobile waste-assay systems: Preliminary report

    Energy Technology Data Exchange (ETDEWEB)

    Schanfein, M.; Bonner, C.; Maez, R. [Los Alamos National Lab., NM (United States)] [and others]

    1997-11-01

    Prior to disposal, nuclear waste must be accurately characterized to identify and quantify the radioactive content to reduce the radioactive hazard to the public. Validation of the waste-assay systems' performance is critical for establishing the credibility of the assay results for storage and disposal purposes. Canberra Nuclear has evaluated regulations worldwide and identified standard, modular, neutron- and gamma-waste-assay systems that can be used to characterize a large portion of existing and newly generated transuranic (TRU) and low-level waste. Before making claims of guaranteeing any system's performance for specific waste types, the standardized systems' performance must be evaluated. 7 figs., 11 tabs.

  18. Performance validation of commercially available mobile waste-assay systems: Preliminary report

    International Nuclear Information System (INIS)

    Schanfein, M.; Bonner, C.; Maez, R.

    1997-01-01

    Prior to disposal, nuclear waste must be accurately characterized to identify and quantify the radioactive content to reduce the radioactive hazard to the public. Validation of the waste-assay systems' performance is critical for establishing the credibility of the assay results for storage and disposal purposes. Canberra Nuclear has evaluated regulations worldwide and identified standard, modular, neutron- and gamma-waste-assay systems that can be used to characterize a large portion of existing and newly generated transuranic (TRU) and low-level waste. Before making claims of guaranteeing any system's performance for specific waste types, the standardized systems' performance must be evaluated. 7 figs., 11 tabs

  19. High-performance phase-field modeling

    KAUST Repository

    Vignal, Philippe

    2015-04-27

    Many processes in engineering and sciences involve the evolution of interfaces. Among the mathematical frameworks developed to model these types of problems, the phase-field method has emerged as a possible solution. Phase-fields nonetheless lead to complex nonlinear, high-order partial differential equations, whose solution poses mathematical and computational challenges. Guaranteeing some of the physical properties of the equations has led to the development of efficient algorithms and discretizations capable of recovering said properties by construction [2, 5]. This work builds on these ideas, and proposes novel discretization strategies that guarantee numerical energy dissipation for both conserved and non-conserved phase-field models. The temporal discretization is based on a novel method which relies on Taylor series and ensures strong energy stability. It is second-order accurate, and can also be rendered linear to speed-up the solution process [4]. The spatial discretization relies on Isogeometric Analysis, a finite element method that possesses the k-refinement technology and enables the generation of high-order, high-continuity basis functions. These basis functions are well suited to handle the high-order operators present in phase-field models. Two-dimensional and three-dimensional results of the Allen-Cahn, Cahn-Hilliard, Swift-Hohenberg and phase-field crystal equations will be presented, which corroborate the theoretical findings, and illustrate the robustness of the method. Results related to more challenging examples, namely the Navier-Stokes Cahn-Hilliard and a diffusion-reaction Cahn-Hilliard system, will also be presented. The implementation was done in PetIGA and PetIGA-MF, high-performance Isogeometric Analysis frameworks [1, 3], designed to handle non-linear, time-dependent problems.
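
    As a concrete (and deliberately simple) illustration of phase-field time stepping, the sketch below advances the Allen-Cahn equation with a standard semi-implicit pseudo-spectral scheme; it is not the energy-stable Taylor-series integrator or the isogeometric spatial discretization used in the work above, and all parameter values are arbitrary.

        import numpy as np

        # Allen-Cahn: u_t = eps^2 * Lap(u) - (u^3 - u) on a periodic square
        n, length, eps, dt, steps = 128, 2.0 * np.pi, 0.1, 0.1, 200
        dx = length / n
        k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)          # angular wavenumbers
        k2 = k[:, None] ** 2 + k[None, :] ** 2             # |k|^2 on the 2D grid

        def free_energy(u):
            gx, gy = np.gradient(u, dx)
            return float(np.sum(0.5 * eps**2 * (gx**2 + gy**2) + 0.25 * (u**2 - 1)**2) * dx**2)

        rng = np.random.default_rng(0)
        u = 0.1 * rng.standard_normal((n, n))              # small random initial condition
        print("initial energy:", free_energy(u))

        for _ in range(steps):
            rhs = np.fft.fft2(u) - dt * np.fft.fft2(u**3 - u)            # explicit bulk term
            u = np.real(np.fft.ifft2(rhs / (1.0 + dt * eps**2 * k2)))    # implicit diffusion solve

        print("final energy:  ", free_energy(u))

    In practice the discrete free energy printed at the end is far smaller than the initial value; the strongly energy-stable schemes discussed above guarantee that monotone decrease by construction rather than merely observing it.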

  20. High performance visual display for HENP detectors

    International Nuclear Information System (INIS)

    McGuigan, Michael; Smith, Gordon; Spiletic, John; Fine, Valeri; Nevski, Pavel

    2001-01-01

    A high end visual display for High Energy Nuclear Physics (HENP) detectors is necessary because of the sheer size and complexity of the detector. For BNL this display will be of special interest because of STAR and ATLAS. To load, rotate, query, and debug simulation code with a modern detector simply takes too long even on a powerful workstation. To visualize the HENP detectors with maximal performance we have developed software with the following characteristics. We develop a visual display of HENP detectors on a BNL multiprocessor visualization server at multiple levels of detail. We work with a general and generic detector framework consistent with ROOT, GAUDI etc., to avoid conflicting with the many graphic development groups associated with specific detectors like STAR and ATLAS. We develop advanced OpenGL features such as transparency and polarized stereoscopy. We enable collaborative viewing of detector and events by directly running the analysis in the BNL stereoscopic theatre. We construct enhanced interactive control, including the ability to slice, search and mark areas of the detector. We incorporate the ability to make a high quality still image of a view of the detector and the ability to generate animations and a fly through of the detector and output these to MPEG or VRML models. We develop data compression hardware and software so that remote interactive visualization will be possible among dispersed collaborators. We obtain real time visual display for events accumulated during simulations

  1. Development of high performance ODS alloys

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Lin [Texas A & M Univ., College Station, TX (United States); Gao, Fei [Univ. of Michigan, Ann Arbor, MI (United States); Garner, Frank [Texas A & M Univ., College Station, TX (United States)

    2018-01-29

    This project aims to capitalize on insights developed from recent high-dose self-ion irradiation experiments in order to develop and test the next generation of optimized ODS alloys needed to meet the nuclear community's need for high strength, radiation-tolerant cladding and core components, especially with enhanced resistance to void swelling. Two of these insights are that ferrite grains swell earlier than tempered martensite grains, and that oxide dispersions currently produced only in ferrite grains require a high level of uniformity and stability to be successful. An additional insight is that ODS particle stability is dependent on as-yet unidentified compositional combinations of dispersoid and alloy matrix, such that dispersoids are stable in MA957 to doses greater than 200 dpa but dissolve in MA956 at doses less than 200 dpa. These findings focus attention on candidate next-generation alloys which address these concerns. Collaboration with two Japanese groups provides this project with two sets of first-round candidate alloys that have already undergone extensive development and testing for unirradiated properties, but have not yet been evaluated for their irradiation performance. The first set of candidate alloys are dual phase (ferrite + martensite) ODS alloys with oxide particles uniformly distributed in both ferrite and martensite phases. The second set of candidate alloys are ODS alloys containing non-standard dispersoid compositions with controllable oxide particle sizes, phases and interfaces.

  2. Low-Cost High-Performance MRI

    Science.gov (United States)

    Sarracanie, Mathieu; Lapierre, Cristen D.; Salameh, Najat; Waddington, David E. J.; Witzel, Thomas; Rosen, Matthew S.

    2015-10-01

    Magnetic Resonance Imaging (MRI) is unparalleled in its ability to visualize anatomical structure and function non-invasively with high spatial and temporal resolution. Yet to overcome the low sensitivity inherent in inductive detection of weakly polarized nuclear spins, the vast majority of clinical MRI scanners employ superconducting magnets producing very high magnetic fields. Commonly found at 1.5-3 tesla (T), these powerful magnets are massive and have very strict infrastructure demands that preclude operation in many environments. MRI scanners are costly to purchase, site, and maintain, with the purchase price approaching $1 M per tesla of magnetic field. We present here a remarkably simple, non-cryogenic approach to high-performance human MRI at ultra-low magnetic field, whereby modern under-sampling strategies are combined with fully-refocused dynamic spin control using steady-state free precession techniques. At 6.5 mT (more than 450 times lower than clinical MRI scanners) we demonstrate (2.5 × 3.5 × 8.5) mm^3 imaging resolution in the living human brain using a simple, open-geometry electromagnet, with 3D image acquisition over the entire brain in 6 minutes. We contend that these practical ultra-low magnetic field implementations of MRI set new standards for affordable (<$50,000) and robust portable devices.

  3. The prone bridge test: Performance, validity, and reliability among older and younger adults.

    Science.gov (United States)

    Bohannon, Richard W; Steffl, Michal; Glenney, Susan S; Green, Michelle; Cashwell, Leah; Prajerova, Kveta; Bunn, Jennifer

    2018-04-01

    The prone bridge maneuver, or plank, has been viewed as a potential alternative to curl-ups for assessing trunk muscle performance. The purpose of this study was to assess prone bridge test performance, validity, and reliability among younger and older adults. Sixty younger (20-35 years old) and 60 older (60-79 years old) participants completed this study. Groups were evenly divided by sex. Participants completed surveys regarding physical activity and abdominal exercise participation. Height, weight, body mass index (BMI), and waist circumference were measured. On two occasions, 5-9 days apart, participants held a prone bridge until volitional exhaustion or until repeated technique failure. Validity was examined using data from the first session: convergent validity by calculating correlations between survey responses, anthropometrics, and prone bridge time, known groups validity by using an ANOVA comparing bridge times of younger and older adults and of men and women. Test-retest reliability was examined by using a paired t-test to compare prone bridge times for Session1 and Session 2. Furthermore, an intraclass correlation coefficient (ICC) was used to characterize relative reliability and minimal detectable change (MDC 95% ) was used to describe absolute reliability. The mean prone bridge time was 145.3 ± 71.5 s, and was positively correlated with physical activity participation (p ≤ 0.001) and negatively correlated with BMI and waist circumference (p ≤ 0.003). Younger participants had significantly longer plank times than older participants (p = 0.003). The ICC between testing sessions was 0.915. The prone bridge test is a valid and reliable measure for evaluating abdominal performance in both younger and older adults. Copyright © 2017 Elsevier Ltd. All rights reserved.
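
    The reliability statistics quoted above can be reproduced on any two-session data set with a few lines of code. The sketch below (hypothetical hold times, not the study data) computes a two-way ICC and the minimal detectable change MDC95 = 1.96 * sqrt(2) * SEM, with SEM = SD * sqrt(1 - ICC).

        import numpy as np

        def icc_2_1(scores):
            """ICC(2,1): two-way random effects, absolute agreement, single measure.
            scores has shape (n_subjects, k_sessions)."""
            n, k = scores.shape
            grand = scores.mean()
            row_means = scores.mean(axis=1)
            col_means = scores.mean(axis=0)
            ss_rows = k * np.sum((row_means - grand) ** 2)
            ss_cols = n * np.sum((col_means - grand) ** 2)
            ss_err = np.sum((scores - grand) ** 2) - ss_rows - ss_cols
            msr = ss_rows / (n - 1)                  # between-subject mean square
            msc = ss_cols / (k - 1)                  # between-session mean square
            mse = ss_err / ((n - 1) * (k - 1))       # residual mean square
            return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

        # hypothetical prone-bridge hold times (seconds), two sessions per participant
        rng = np.random.default_rng(3)
        session1 = rng.normal(145, 70, size=40).clip(20)
        session2 = session1 + rng.normal(0, 20, size=40)
        scores = np.column_stack([session1, session2])

        icc = icc_2_1(scores)
        sem = scores.std(ddof=1) * np.sqrt(1 - icc)   # standard error of measurement
        mdc95 = 1.96 * np.sqrt(2) * sem
        print(f"ICC(2,1) = {icc:.3f}, MDC95 = {mdc95:.1f} s")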

  4. Validation of a RANS transition model using a high-order weighted compact nonlinear scheme

    Science.gov (United States)

    Tu, GuoHua; Deng, XiaoGang; Mao, MeiLiang

    2013-04-01

    A modified transition model is given based on the shear stress transport (SST) turbulence model and an intermittency transport equation. The energy gradient term in the original model is replaced by the flow strain rate to save computational costs. The model employs local variables only, so it can be conveniently implemented in modern computational fluid dynamics codes. The fifth-order weighted compact nonlinear scheme and the fourth-order staggered scheme are applied to discretize the governing equations for the purpose of minimizing discretization errors, so as to mitigate the confusion between numerical errors and transition model errors. The high-order package is compared with a second-order TVD method on simulating the transitional flow over a flat plate. Numerical results indicate that the high-order package gives better grid convergence properties than the second-order method. Validation of the transition model is performed for transitional flows ranging from low speed to hypersonic speed.
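
    A standard way to quantify the grid-convergence comparison mentioned above is the observed order of accuracy from Richardson extrapolation on three systematically refined grids; the sketch below uses hypothetical values with a refinement ratio of 2, not results from the paper.

        import math

        def observed_order(f_coarse, f_medium, f_fine, r):
            """p = ln((f_coarse - f_medium) / (f_medium - f_fine)) / ln(r)."""
            return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

        # hypothetical scalar outputs (e.g. a transition-onset location) on three grids
        second_order = observed_order(0.5320, 0.5080, 0.5020, r=2.0)
        high_order = observed_order(0.5320, 0.5010, 0.50003, r=2.0)
        print(f"observed order, 2nd-order scheme : {second_order:.2f}")
        print(f"observed order, high-order scheme: {high_order:.2f}")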

  5. High Performance Computing in Science and Engineering '15 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2015. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  6. High Performance Computing in Science and Engineering '17 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael; HLRS 2017

    2018-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2017. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  7. aMAP is a validated pipeline for registration and segmentation of high-resolution mouse brain data

    Science.gov (United States)

    Niedworok, Christian J.; Brown, Alexander P. Y.; Jorge Cardoso, M.; Osten, Pavel; Ourselin, Sebastien; Modat, Marc; Margrie, Troy W.

    2016-01-01

    The validation of automated image registration and segmentation is crucial for accurate and reliable mapping of brain connectivity and function in three-dimensional (3D) data sets. While validation standards are necessarily high and routinely met in the clinical arena, they have to date been lacking for high-resolution microscopy data sets obtained from the rodent brain. Here we present a tool for optimized automated mouse atlas propagation (aMAP) based on clinical registration software (NiftyReg) for anatomical segmentation of high-resolution 3D fluorescence images of the adult mouse brain. We empirically evaluate aMAP as a method for registration and subsequent segmentation by validating it against the performance of expert human raters. This study therefore establishes a benchmark standard for mapping the molecular function and cellular connectivity of the rodent brain. PMID:27384127

  8. Thermal interface pastes nanostructured for high performance

    Science.gov (United States)

    Lin, Chuangang

    Thermal interface materials in the form of pastes are needed to improve thermal contacts, such as that between a microprocessor and a heat sink of a computer. High-performance and low-cost thermal pastes have been developed in this dissertation by using polyol esters as the vehicle and various nanoscale solid components. The proportion of a solid component needs to be optimized, as an excessive amount degrades the performance, due to the increase in the bond line thickness. The optimum solid volume fraction tends to be lower when the mating surfaces are smoother, and higher when the thermal conductivity is higher. Both a low bond line thickness and a high thermal conductivity help the performance. When the surfaces are smooth, a low bond line thickness can be even more important than a high thermal conductivity, as shown by the outstanding performance of the nanoclay paste of low thermal conductivity in the smooth case (0.009 mum), with the bond line thickness less than 1 mum, as enabled by low storage modulus G', low loss modulus G" and high tan delta. However, for rough surfaces, the thermal conductivity is important. The rheology affects the bond line thickness, but it does not correlate well with the performance. This study found that the structure of carbon black is an important parameter that governs the effectiveness of a carbon black for use in a thermal paste. By using a carbon black with a lower structure (i.e., a lower DBP value), a thermal paste that is more effective than the previously reported carbon black paste was obtained. Graphite nanoplatelet (GNP) was found to be comparable in effectiveness to carbon black (CB) pastes for rough surfaces, but it is less effective for smooth surfaces. At the same filler volume fraction, GNP gives higher thermal conductivity than carbon black paste. At the same pressure, GNP gives higher bond line thickness than CB (Tokai or Cabot). The effectiveness of GNP is limited, due to the high bond line thickness. A
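
    The bond-line-thickness versus conductivity trade-off discussed above can be illustrated with a simple series-resistance model (hypothetical numbers, not measurements from the dissertation): the interface resistance per unit area is the bond line conduction term plus two contact terms, so a very thin low-conductivity paste can still beat a thicker high-conductivity one on smooth surfaces.

        def interface_resistance(blt_m, k_paste, r_contact):
            """Total thermal resistance per unit area, in m^2*K/W (series model)."""
            return blt_m / k_paste + 2.0 * r_contact

        # thin bond line with a low-conductivity paste (smooth mating surfaces)
        thin_low_k = interface_resistance(blt_m=1e-6, k_paste=0.5, r_contact=2e-6)
        # thicker bond line with a higher-conductivity paste
        thick_high_k = interface_resistance(blt_m=25e-6, k_paste=4.0, r_contact=2e-6)

        print(f"thin low-k paste  : {thin_low_k * 1e6:.1f} mm^2*K/W")
        print(f"thick high-k paste: {thick_high_k * 1e6:.1f} mm^2*K/W")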

  9. High performance liquid chromatography in pharmaceutical analyses

    Directory of Open Access Journals (Sweden)

    Branko Nikolin

    2004-05-01

    Full Text Available In the pre-sale testing, marketing and control of drugs over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a complementary method to gas chromatography; however, today it has nearly completely replaced gas chromatography in pharmaceutical analysis. The use of a liquid mobile phase, with the possibility of varying its polarity during chromatography and of all other modifications of the mobile phase depending upon the characteristics of the substance being tested, is a great advantage in the separation process in comparison to other methods. The wide choice of stationary phases is the next factor that enables good separation. The separation column, coupled to specific and sensitive detector systems (spectrofluorimeter, diode-array detector, electrochemical detector) and to hyphenated systems such as HPLC-MS and HPLC-NMR, is the basic element on which such wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm the identity of the drug and provide quantitative results, and also to monitor the progress of the therapy of a disease [1]. Figure 1 shows a chromatogram obtained from the plasma of depressed patients 12 h before oral administration of dexamethasone. HPLC may also be used to further our understanding of normal and disease processes in the human body through biomedical and therapeutic research during investigations prior to drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or

  10. Combining high productivity with high performance on commodity hardware

    DEFF Research Database (Denmark)

    Skovhede, Kenneth

    -like compiler for translating CIL bytecode on the CELL-BE. I then introduce a bytecode converter that transforms simple loops in Java bytecode to GPGPU-capable code. I then introduce the numeric library for the Common Intermediate Language, NumCIL. I can then utilize the vector programming model from NumCIL and map it to the Bohrium framework. The result is a complete system that gives the user a choice of high-level languages with no explicit parallelism, yet seamlessly delivers efficient execution on a number of hardware setups.

  11. Integrating advanced facades into high performance buildings

    International Nuclear Information System (INIS)

    Selkowitz, Stephen E.

    2001-01-01

    Glass is a remarkable material but its functionality is significantly enhanced when it is processed or altered to provide added intrinsic capabilities. The overall performance of glass elements in a building can be further enhanced when they are designed to be part of a complete facade system. Finally, the facade system delivers the greatest performance to the building owner and occupants when it becomes an essential element of a fully integrated building design. This presentation examines the growing interest in incorporating advanced glazing elements into more comprehensive facade and building systems in a manner that increases comfort, productivity and amenity for occupants, reduces operating costs for building owners, and contributes to improving the health of the planet by reducing overall energy use and negative environmental impacts. We explore the role of glazing systems in dynamic and responsive facades that provide the following functionality: Enhanced sun protection and cooling load control while improving thermal comfort and providing most of the light needed with daylighting; Enhanced air quality and reduced cooling loads using natural ventilation schemes employing the facade as an active air control element; Reduced operating costs by minimizing lighting, cooling and heating energy use by optimizing the daylighting-thermal tradeoffs; Net positive contributions to the energy balance of the building using integrated photovoltaic systems; Improved indoor environments leading to enhanced occupant health, comfort and performance. In addressing these issues facade system solutions must, of course, respect the constraints of latitude, location, solar orientation, acoustics, earthquake and fire safety, etc. Since climate and occupant needs are dynamic variables, in a high performance building the facade solution must have the capacity to respond and adapt to these variable exterior conditions and to changing occupant needs. This responsive performance capability

  12. The need for high performance breeder reactors

    International Nuclear Information System (INIS)

    Vaughan, R.D.; Chermanne, J.

    1977-01-01

    It can be easily demonstrated, on the basis of realistic estimates of continued high oil costs, that an increasing portion of the growth in energy demand must be supplied by nuclear power and that it might account for 20% of all energy production by the end of the century. Such assumptions lead very quickly to the conclusion that the discovery, extraction and processing of uranium will not be able to follow the demand; the bottleneck will essentially be related to the rate at which the ore can be discovered and extracted, and not to the existing quantities or their grade. Figures as high as 150.000 T/annum and more would be quickly reached, and it must already be asked whether enough capital can be attracted to meet these requirements. There is only one solution to this problem: improve the conversion ratio of the nuclear system and quickly reach breeding; this would reduce natural uranium consumption by a factor of about 50. However, this condition is not sufficient; the commercial breeder must have a breeding gain as high as possible, because the Pu out-of-pile time and the Pu losses in the cycle could lead to an unacceptable doubling time for the system if the breeding gain is too low. That is the reason why it is vital to develop high performance breeder reactors. The present paper indicates how the Gas-cooled Breeder Reactor [GBR] can meet the problems mentioned above, on the basis of recent and realistic studies. It briefly describes the present status of GBR development, from the predecessors in the gas cooled reactor line, particularly the AGR. It shows how the GBR fuel benefits greatly from the LMFBR fuel irradiation experience. It compares the GBR performance on a consistent basis with that of the LMFBR. The GBR capital and fuel cycle costs are compared with those of thermal and fast reactors respectively. The conclusion, based on a cost-benefit study, is that the GBR must be quickly developed in order

  13. High performance nano-composite technology development

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Whung Whoe; Rhee, C. K.; Kim, S. J.; Park, S. D. [KAERI, Taejon (Korea, Republic of); Kim, E. K.; Jung, S. Y.; Ryu, H. J. [KRICT, Taejon (Korea, Republic of); Hwang, S. S.; Kim, J. K.; Hong, S. M. [KIST, Taejon (Korea, Republic of); Chea, Y. B. [KIGAM, Taejon (Korea, Republic of); Choi, C. H.; Kim, S. D. [ATS, Taejon (Korea, Republic of); Cho, B. G.; Lee, S. H. [HGREC, Taejon (Korea, Republic of)

    1999-06-15

    The trend in new material development is toward not only high performance but also environmental friendliness. Nano-composite materials in particular, which enhance the functional properties of components and extend component life, thereby reducing waste and environmental contamination, have a great effect on various industrial areas. Depending on the polymer matrix and filler materials, nano-composites have applications ranging from semiconductors to the medical field. In spite of these merits, nano-composite studies are confined to a few special materials at laboratory scale, because several technical difficulties remain unresolved. Therefore, the purpose of this study is to establish systematic planning for carrying out next-generation projects, in order to compete with other countries and overcome the protective policies of advanced countries, by grasping overseas development trends and our present status. (author).

  14. How to create high-performing teams.

    Science.gov (United States)

    Lam, Samuel M

    2010-02-01

    This article is intended to discuss inspirational aspects of how to lead a high-performance team. Cogent topics discussed include how to hire staff through methods of "topgrading" with reference to Geoff Smart and "getting the right people on the bus" referencing Jim Collins' work. In addition, once the staff is hired, this article covers how to separate the "eagles from the ducks" and how to inspire one's staff by creating the right culture, with suggestions for further reading by Don Miguel Ruiz (The four agreements) and John Maxwell (21 Irrefutable laws of leadership). In addition, Simon Sinek's concept of "Start with Why" is elaborated to help a leader know what the core element of any superior culture should be. Thieme Medical Publishers.

  16. High performance nano-composite technology development

    International Nuclear Information System (INIS)

    Kim, Whung Whoe; Rhee, C. K.; Kim, S. J.; Park, S. D.; Kim, E. K.; Jung, S. Y.; Ryu, H. J.; Hwang, S. S.; Kim, J. K.; Hong, S. M.; Chea, Y. B.; Choi, C. H.; Kim, S. D.; Cho, B. G.; Lee, S. H.

    1999-06-01

    The trend in new material development is toward not only high performance but also environmental friendliness. Nano-composite materials in particular, which enhance the functional properties of components and extend component life, thereby reducing waste and environmental contamination, have a great effect on various industrial areas. Depending on the polymer matrix and filler materials, nano-composites have applications ranging from semiconductors to the medical field. In spite of these merits, nano-composite studies are confined to a few special materials at laboratory scale, because several technical difficulties remain unresolved. Therefore, the purpose of this study is to establish systematic planning for carrying out next-generation projects, in order to compete with other countries and overcome the protective policies of advanced countries, by grasping overseas development trends and our present status. (author).

  17. High Performance with Prescriptive Optimization and Debugging

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo

    parallelization and automatic vectorization is attractive as it transparently optimizes programs. The thesis contributes an improved dependence analysis for explicitly parallel programs. These improvements lead to more loops being vectorized; on average we achieve a speedup of 1.46 over the existing dependence analysis and vectorizer in GCC. Automatic optimizations often fail for theoretical and practical reasons. When they fail, we argue that a hybrid approach can be effective. Using compiler feedback, we propose to use the programmer's intuition and insight to achieve high performance. Compiler feedback enlightens the programmer as to why a given optimization was not applied, and suggests how to change the source code to make it more amenable to optimization. We show how this can yield significant speedups and achieve 2.4× faster execution on a real industrial use case. To aid in parallel debugging we propose...

  18. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  19. Optimizing High Performance Self Compacting Concrete

    Directory of Open Access Journals (Sweden)

    Raymond A Yonathan

    2017-01-01

    Full Text Available This paper's objectives are to determine the effect of glass powder, silica fume, polycarboxylate ether, and gravel, and to optimize the proportion of each factor, in making high performance SCC. The Taguchi method is proposed as the best way to reduce the number of specimens, which would otherwise exceed 80 variations. Taguchi data analysis is applied to provide the composition, the optimization, and the contribution of each material for nine specimen variations. The workability of the concrete was analyzed using the slump flow, V-funnel, and L-box tests. Compressive strength and porosity tests were performed in the hardened state. Cylindrical specimens with dimensions of 100×200 mm were cast for compressive testing at ages of 3, 7, 14, 21 and 28 days. The porosity test was conducted at 28 days. The results reveal that silica fume contributes most strongly to slump flow and porosity, while coarse aggregate is the greatest contributing factor for the L-box and compressive tests. However, all factors show unclear results for the V-funnel test.
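
    In a Taguchi analysis of this kind, factor levels are typically ranked by their signal-to-noise (S/N) ratios; for a response such as compressive strength, the "larger-is-better" form is the usual choice. The strength values below are made up for illustration and are not taken from the paper (a minimal Python sketch):

        import numpy as np

        def sn_larger_is_better(y):
            """Taguchi 'larger-is-better' signal-to-noise ratio, in dB."""
            y = np.asarray(y, dtype=float)
            return -10.0 * np.log10(np.mean(1.0 / y**2))

        # hypothetical 28-day compressive strengths (MPa) for replicate specimens
        # at two levels of a single factor (e.g. silica fume content)
        levels = {"level 1": [62.0, 64.5, 61.0], "level 2": [70.5, 72.0, 69.0]}
        for name, strengths in levels.items():
            print(name, round(sn_larger_is_better(strengths), 2), "dB")
        # the level with the higher S/N ratio is preferred; the gap between levels
        # indicates how strongly that factor contributes to the response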

  20. Performance Evaluation of Spectral Clustering Algorithm using Various Clustering Validity Indices

    OpenAIRE

    M. T. Somashekara; D. Manjunatha

    2014-01-01

    In spite of the popularity of the spectral clustering algorithm, its evaluation procedures are still at a developmental stage. In this article, we use the benchmark Iris dataset to perform a comparative study of twelve indices for evaluating the spectral clustering algorithm. The results of the spectral clustering technique were also compared with the k-means algorithm. The validity of the indices was also verified with accuracy and the NMI (Normalized Mutual Information) score. Spectral clustering algo...
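
    The comparison described above (spectral clustering versus k-means on the Iris benchmark, scored with NMI against the known labels) can be reproduced in outline with scikit-learn; the parameter choices below are assumptions rather than the article's settings, and the twelve validity indices themselves are not reproduced here:

        from sklearn.cluster import KMeans, SpectralClustering
        from sklearn.datasets import load_iris
        from sklearn.metrics import normalized_mutual_info_score
        from sklearn.preprocessing import StandardScaler

        X, y_true = load_iris(return_X_y=True)
        X = StandardScaler().fit_transform(X)

        spectral = SpectralClustering(n_clusters=3, affinity="nearest_neighbors",
                                      n_neighbors=10, random_state=0)
        kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)

        # external validity: agreement of each partition with the known species labels
        nmi_spectral = normalized_mutual_info_score(y_true, spectral.fit_predict(X))
        nmi_kmeans = normalized_mutual_info_score(y_true, kmeans.fit_predict(X))
        print("NMI, spectral clustering:", round(nmi_spectral, 3))
        print("NMI, k-means            :", round(nmi_kmeans, 3))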

  1. The Assessment of Military Multitasking Performance: Validation of a Dual-Task and Multitask Protocol

    Science.gov (United States)

    2015-11-01

  2. High Performance Circularly Polarized Microstrip Antenna

    Science.gov (United States)

    Bondyopadhyay, Probir K. (Inventor)

    1997-01-01

    A microstrip antenna for radiating circularly polarized electromagnetic waves comprising a cluster array of at least four microstrip radiator elements, each of which is provided with dual orthogonal coplanar feeds in phase quadrature relation achieved by connection to an asymmetric T-junction power divider impedance notched at resonance. The dual fed circularly polarized reference element is positioned with its axis at a 45 deg angle with respect to the unit cell axis. The other three dual fed elements in the unit cell are positioned and fed with a coplanar feed structure with sequential rotation and phasing to enhance the axial ratio and impedance matching performance over a wide bandwidth. The centers of the radiator elements are disposed at the corners of a square with each side of a length d in the range of 0.7 to 0.9 times the free space wavelength of the antenna radiation and the radiator elements reside in a square unit cell area of sides equal to 2d and thereby permit the array to be used as a phased array antenna for electronic scanning and is realizable in a high temperature superconducting thin film material for high efficiency.

  3. NCI's Transdisciplinary High Performance Scientific Data Platform

    Science.gov (United States)

    Evans, Ben; Antony, Joseph; Bastrakova, Irina; Car, Nicholas; Cox, Simon; Druken, Kelsey; Evans, Bradley; Fraser, Ryan; Ip, Alex; Kemp, Carina; King, Edward; Minchin, Stuart; Larraondo, Pablo; Pugh, Tim; Richards, Clare; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2016-04-01

    The Australian National Computational Infrastructure (NCI) manages Earth Systems data collections sourced from several domains and organisations onto a single High Performance Data (HPD) Node to further Australia's national priority research and innovation agenda. The NCI HPD Node has rapidly established its value, currently managing over 10 PBytes of datasets from collections that span a wide range of disciplines including climate, weather, environment, geoscience, geophysics, water resources and social sciences. Importantly, in order to facilitate broad user uptake, maximise reuse and enable transdisciplinary access through software and standardised interfaces, the datasets, associated information systems and processes have been incorporated into the design and operation of a unified platform that NCI has called the National Environmental Research Data Interoperability Platform (NERDIP). The key goal of the NERDIP is to regularise data access so that it is easily discoverable, interoperable for different domains and enabled for high performance methods. It adopts and implements international standards and data conventions, and promotes scientific integrity within a high performance computing and data analysis environment. NCI has established a rich and flexible computing environment to access this data: through the NCI supercomputer; through a private cloud that supports both domain-focused virtual laboratories and in-common interactive analysis interfaces; and remotely through scalable data services. Data collections of this importance must be managed with careful consideration of both their current use and the needs of the end-communities, as well as their future potential use, such as transitioning to more advanced software and improved methods. It is therefore critical that the data platform is both well-managed and trusted for stable production use (including transparency and reproducibility), agile enough to incorporate new technological advances and

  4. AULA virtual reality test as an attention measure: convergent validity with Conners' Continuous Performance Test.

    Science.gov (United States)

    Díaz-Orueta, Unai; Garcia-López, Cristina; Crespo-Eguílaz, Nerea; Sánchez-Carpintero, Rocío; Climent, Gema; Narbona, Juan

    2014-01-01

    The majority of neuropsychological tests used to evaluate attention processes in children lack ecological validity. The AULA Nesplora (AULA) is a continuous performance test, developed in a virtual setting very similar to a school classroom. The aim of the present study is to analyze the convergent validity between the AULA and the Continuous Performance Test (CPT) of Conners. The AULA and CPT were administered correlatively to 57 children, aged 6-16 years (26.3% female) with average cognitive ability (IQ mean = 100.56, SD = 10.38) who had a diagnosis of attention deficit/hyperactivity disorder (ADHD) according to DSM-IV-TR criteria. Spearman correlation analyses were conducted among the different variables. Significant correlations were observed between both tests in all the analyzed variables (omissions, commissions, reaction time, and variability of reaction time), including for those measures of the AULA based on different sensory modalities, presentation of distractors, and task paradigms. Hence, convergent validity between both tests was confirmed. Moreover, the AULA showed differences by gender and correlation to the Perceptual Reasoning and Working Memory indexes of the WISC-IV, supporting the relevance of IQ measures in the understanding of cognitive performance in ADHD. In addition, the AULA (but not Conners' CPT) was able to differentiate between ADHD children with and without pharmacological treatment for a wide range of measures related to inattention, impulsivity, processing speed, motor activity, and quality of attention focus. Additional measures and advantages of the AULA versus Conners' CPT are discussed.
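
    The convergent-validity analysis above rests on Spearman correlations between matched AULA and CPT variables; the snippet below sketches that computation with SciPy on simulated paired scores (the values are synthetic, not the study's data):

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(0)

        # hypothetical paired omission counts for the same 57 children on the two tests,
        # correlated by construction purely for illustration
        aula_omissions = rng.poisson(8, size=57)
        cpt_omissions = aula_omissions + rng.poisson(3, size=57)

        rho, p_value = spearmanr(aula_omissions, cpt_omissions)
        print(f"Spearman rho = {rho:.2f}, p = {p_value:.4g}")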

  5. Incremental Validity of Personality Measures in Predicting Underwater Performance and Adaptation.

    Science.gov (United States)

    Colodro, Joaquín; Garcés-de-Los-Fayos, Enrique J; López-García, Juan J; Colodro-Conde, Lucía

    2015-03-17

    Intelligence and personality traits are currently considered effective predictors of human behavior and job performance. However, there are few studies about their relevance in the underwater environment. Data from a sample of military personnel performing scuba diving courses were analyzed with regression techniques, testing the contribution of individual differences and ascertaining the incremental validity of personality in an environment with extreme psychophysical demands. The results confirmed the incremental validity of personality traits (ΔR² = .20, f² = .25) over the predictive contribution of general mental ability (ΔR² = .07, f² = .08) in divers' performance. Moreover, personality (R²(L) = .34) also showed a higher validity for predicting underwater adaptation than general mental ability (R²(L) = .09). The ROC curve indicated 86% of the maximum possible discrimination power for the prediction of underwater adaptation (AUC = .86). These results support personality traits as predictors of an effective response to the changing circumstances of military scuba diving. They may also improve the understanding of the behavioral effects and psychophysiological complications of diving and can also provide guidance for psychological intervention and prevention of risk in this extreme environment.
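
    Incremental validity here is the gain in explained variance when the personality block is added on top of general mental ability, with the effect size f² = ΔR² / (1 − R²_full). The sketch below illustrates that two-step hierarchical regression on simulated data (coefficients and sample size are invented, not the study's):

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(1)
        n = 300
        gma = rng.normal(size=n)                      # general mental ability (standardised)
        personality = rng.normal(size=(n, 3))         # e.g. three trait scores
        performance = 0.3 * gma + personality @ np.array([0.4, 0.3, 0.2]) + rng.normal(size=n)

        X1 = gma.reshape(-1, 1)
        X2 = np.column_stack([gma, personality])
        r2_step1 = LinearRegression().fit(X1, performance).score(X1, performance)
        r2_step2 = LinearRegression().fit(X2, performance).score(X2, performance)

        delta_r2 = r2_step2 - r2_step1
        f2 = delta_r2 / (1.0 - r2_step2)              # Cohen's f^2 for the added block
        print(f"R2 step 1 = {r2_step1:.2f}, R2 step 2 = {r2_step2:.2f}, "
              f"delta R2 = {delta_r2:.2f}, f2 = {f2:.2f}")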

  6. Experimental validation of the buildings energy performance (PEC assessment methods with reference to occupied spaces heating

    Directory of Open Access Journals (Sweden)

    Cristian PETCU

    2010-01-01

    Full Text Available This paper is part of a series of pre-standardization research aimed at analyzing the existing methods of calculating the Buildings Energy Performance (PEC) with a view to correcting or completing them. The overall research activity aims to experimentally validate the PEC calculation algorithm, as well as to comparatively apply, on the basis of several case studies focused on buildings representative of the Romanian building stock, the PEC calculation methodology for buildings equipped with occupied-space heating systems. The targets of the report are the experimental testing of the calculation models known so far (NP 048-2000, Mc 001-2006, SR EN 13790:2009), supported by the CE INCERC Bucharest experimental building, together with the complex calculation algorithms specific to dynamic modeling, for the evaluation of the occupied-space heat demand in the cold season, both for traditional buildings and for modern buildings equipped with passive solar systems of the ventilated solar space type. The schedule of the measurements performed in the 2008-2009 cold season is presented, as well as the primary processing of the measured data and the experimental validation of the monthly heat demand calculation methods, based on the CE INCERC Bucharest building. The calculation error per heating season (153 days of measurements) between the measured and the calculated heat demand was 0.61%, an exceptional value confirming the phenomenological nature of the INCERC method, NP 048-2006. The mathematical model specific to the hourly thermal balance is recurrent-decisional with alternating steps. The experimental validation of the theoretical model is based on the measurements performed on the CE INCERC Bucharest building over a period of 57 days (06.01-04.03.2009). The measurements performed on the CE INCERC Bucharest building confirm the accuracy of the hourly calculation model by comparison to the values
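
    One of the calculation models cited, SR EN 13790:2009, is the quasi-steady-state monthly method; its heating-need balance has the generic form below (standard notation, not the paper's own symbols, and the details of the gain utilisation factor are omitted):

        Q_{H,nd} = Q_{H,ht} - \eta_{H,gn} \, Q_{H,gn}, \qquad
        Q_{H,ht} = Q_{tr} + Q_{ve}, \qquad
        Q_{H,gn} = Q_{int} + Q_{sol}

    where Q_{H,nd} is the monthly heating need, Q_{tr} and Q_{ve} are the transmission and ventilation heat transfer, Q_{int} and Q_{sol} are the internal and solar gains, and \eta_{H,gn} is the dimensionless gain utilisation factor.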

  7. Silicon Photomultiplier Performance in High Electric Field

    Science.gov (United States)

    Montoya, J.; Morad, J.

    2016-12-01

    Roughly 27% of the universe is thought to be composed of dark matter. The Large Underground Xenon (LUX) experiment relies on the emission of light from xenon atoms after a collision with a dark matter particle. After a particle interaction in the detector, the xenon emits both light and charge. The charge (electrons) in the liquid xenon needs to be pulled into the gas region so that it can interact with the gas and emit light, which allows LUX to convert a single electron into many photons. This is done by applying a high voltage across the liquid and gas regions, effectively ripping electrons out of the liquid xenon and into the gas. The current device used to detect photons is the photomultiplier tube (PMT). These devices are large and costly. In recent years, a new technology that is capable of detecting single photons has emerged: the silicon photomultiplier (SiPM). These devices are cheaper and smaller than PMTs, but their performance in high electric fields, such as those found in LUX, is unknown. It is possible that a large electric field could introduce noise on the SiPM signal, drowning out the single-photon detection capability. My hypothesis is that SiPMs will not show a significant increase in noise at an electric field of roughly 10 kV/cm (within the range used in detectors like LUX). I plan to test this hypothesis by first rotating the SiPMs, with no applied electric field, between two metal plates roughly 2 cm apart, providing a control data set, and then testing the dark counts at the same angles with a constant electric field applied. Possibly the most important aspect of LUX is the photon detector, because it is what detects the signals. Dark matter is detected in the experiment by looking at the ratio of photons to electrons emitted for a given interaction in the detector. Interactions with a low electron-to-photon ratio are more likely to be dark matter events than those with a high electron-to-photon ratio. The ability to

  8. Validation of a Residual Stress Measurement Method by Swept High-Frequency Eddy Currents

    International Nuclear Information System (INIS)

    Lee, C.; Shen, Y.; Lo, C. C. H.; Nakagawa, N.

    2007-01-01

    This paper reports on a swept high-frequency eddy current (SHFEC) measurement method developed for electromagnetic nondestructive characterization of residual stresses in shot peened aerospace materials. In this approach, we regard shot-peened surfaces as modified surface layers of varying conductivity, and determine the conductivity deviation profile by inversion of the SHFEC data. The SHFEC measurement system consists of a pair of closely matched printed-circuit-board coils driven by laboratory instrument under software control. This provides improved sensitivity and high frequency performance compared to conventional coils, so that swept frequency EC measurements up to 50 MHz can be made to achieve the smallest skin depth of 80 μm for nickel-based superalloys. We devised a conductivity profile inversion procedure based on the laterally uniform multi-layer theory of Cheng, Dodd and Deeds. The main contribution of this paper is the methodology validation. Namely, the forward and inverse models were validated against measurements on artificial layer specimens consisting of metal films with different conductivities placed on a metallic substrate. The inversion determined the film conductivities which were found to agree with those measured using the direct current potential drop (DCPD) method
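
    The 80 μm figure quoted for nickel-based superalloys at 50 MHz follows from the standard skin-depth relation δ = 1/√(π f μ σ); the conductivity used below (about 0.8 MS/m for a non-magnetic Ni-based alloy) is an assumed value chosen only to reproduce that order of magnitude:

        import math

        MU_0 = 4e-7 * math.pi  # vacuum permeability, H/m

        def skin_depth(freq_hz, conductivity_s_per_m, mu_r=1.0):
            """Electromagnetic skin depth, delta = 1 / sqrt(pi * f * mu * sigma), in metres."""
            return 1.0 / math.sqrt(math.pi * freq_hz * mu_r * MU_0 * conductivity_s_per_m)

        # assumed conductivity for a non-magnetic nickel-based superalloy (~0.8 MS/m)
        delta = skin_depth(50e6, 0.8e6)
        print(f"skin depth at 50 MHz: {delta * 1e6:.0f} um")  # roughly 80 um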

  9. Validation of a Residual Stress Measurement Method by Swept High-Frequency Eddy Currents

    Science.gov (United States)

    Lee, C.; Shen, Y.; Lo, C. C. H.; Nakagawa, N.

    2007-03-01

    This paper reports on a swept high-frequency eddy current (SHFEC) measurement method developed for electromagnetic nondestructive characterization of residual stresses in shot peened aerospace materials. In this approach, we regard shot-peened surfaces as modified surface layers of varying conductivity, and determine the conductivity deviation profile by inversion of the SHFEC data. The SHFEC measurement system consists of a pair of closely matched printed-circuit-board coils driven by laboratory instrument under software control. This provides improved sensitivity and high frequency performance compared to conventional coils, so that swept frequency EC measurements up to 50 MHz can be made to achieve the smallest skin depth of 80 μm for nickel-based superalloys. We devised a conductivity profile inversion procedure based on the laterally uniform multi-layer theory of Cheng, Dodd and Deeds. The main contribution of this paper is the methodology validation. Namely, the forward and inverse models were validated against measurements on artificial layer specimens consisting of metal films with different conductivities placed on a metallic substrate. The inversion determined the film conductivities which were found to agree with those measured using the direct current potential drop (DCPD) method.

  10. The Role of Performance Management in the High Performance Organisation

    NARCIS (Netherlands)

    de Waal, André A.; van der Heijden, Beatrice I.J.M.

    2014-01-01

    The allegiance of partnering organisations and their employees to an Extended Enterprise performance is its proverbial sword of Damocles. Literature on Extended Enterprises focuses on collaboration, inter-organizational integration and learning to avoid diminishing or missing allegiance becoming an

  11. Strategies of high-performing paramedic educational programs.

    Science.gov (United States)

    Margolis, Gregg S; Romero, Gabriel A; Fernandez, Antonio R; Studnek, Jonathan R

    2009-01-01

    To identify the specific educational strategies used by paramedic educational programs that have attained consistently high success rates on the National Registry of Emergency Medical Technicians (NREMT) examination. NREMT data from 2003-2007 were analyzed to identify consistently high-performing paramedic educational programs. Representatives from 12 programs that have maintained a 75% first-attempt pass rate for at least four of five years and had more than 20 graduates per year were invited to participate in a focus group. Using the nominal group technique (NGT), participants were asked to answer the following question: "What are specific strategies that lead to a successful paramedic educational program?" All 12 emergency medical services (EMS) educational programs meeting the eligibility requirements participated. After completing the seven-step NGT process, 12 strategies were identified as leading to a successful paramedic educational program: 1) achieve and maintain national accreditation; 2) maintain high-level entry requirements and prerequisites; 3) provide students with a clear idea of expectations for student success; 4) establish a philosophy and foster a culture that values continuous review and improvement; 5) create your own examinations, lesson plans, presentations, and course materials using multiple current references; 6) emphasize emergency medical technician (EMT)-Basic concepts throughout the class; 7) use frequent case-based classroom scenarios; 8) expose students to as many prehospital advanced life support (ALS) patient contacts as possible, preferably where they are in charge; 9) create and administer valid examinations that have been through a review process (such as qualitative analysis); 10) provide students with frequent detailed feedback regarding their performance (such as formal examination reviews); 11) incorporate critical thinking and problem solving into all testing; and 12) deploy predictive testing with analysis prior to

  12. User's Manual for Data for Validating Models for PV Module Performance

    Energy Technology Data Exchange (ETDEWEB)

    Marion, W.; Anderberg, A.; Deline, C.; Glick, S.; Muller, M.; Perrin, G.; Rodriguez, J.; Rummel, S.; Terwilliger, K.; Silverman, T. J.

    2014-04-01

    This user's manual describes performance data measured for flat-plate photovoltaic (PV) modules installed in Cocoa, Florida, Eugene, Oregon, and Golden, Colorado. The data include PV module current-voltage curves and associated meteorological data for approximately one-year periods. These publicly available data are intended to facilitate the validation of existing models for predicting the performance of PV modules, and for the development of new and improved models. For comparing different modeling approaches, using these public data will provide transparency and more meaningful comparisons of the relative benefits.
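
    Data sets of this kind are typically used to compare modelled and measured module output with summary statistics such as the mean bias error (MBE) and root-mean-square error (RMSE); the manual itself does not prescribe these metrics here, and the power values below are hypothetical:

        import numpy as np

        def mbe_rmse(measured, modeled):
            """Mean bias error and RMSE, both expressed as % of the mean measured value."""
            measured = np.asarray(measured, dtype=float)
            modeled = np.asarray(modeled, dtype=float)
            diff = modeled - measured
            scale = 100.0 / measured.mean()
            return diff.mean() * scale, np.sqrt(np.mean(diff**2)) * scale

        # hypothetical maximum-power readings (W) and model predictions for one module
        p_measured = np.array([212.0, 198.5, 205.3, 220.1, 190.7])
        p_modeled = np.array([215.2, 196.1, 208.0, 217.9, 193.4])
        mbe, rmse = mbe_rmse(p_measured, p_modeled)
        print(f"MBE = {mbe:+.1f}%, RMSE = {rmse:.1f}%")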

  13. Operational Street Pollution Model (OSPM) - a review of performed validation studies, and future prospects

    DEFF Research Database (Denmark)

    Kakosimos K.E., Konstantinos E.; Hertel, Ole; Ketzel, Matthias

    2010-01-01

    in this context is the fast and easy to apply Operational Street Pollution Model (OSPM). For almost 20 years, OSPM has been routinely used in many countries for studying traffic pollution, performing analyses of field campaign measurements, studying the efficiency of pollution abatement strategies, carrying out exposure assessments and as a reference in comparisons to other models. OSPM is generally considered state-of-the-art in applied street pollution modelling. This paper outlines the most important findings of OSPM validation and application studies in the literature. At the end of the paper, future research needs are outlined for traffic air pollution modelling in general, but with the outset in the research performed with OSPM.

  14. Validation of the Short Form of the Career Development Inventory with an Iranian High School Sample

    Science.gov (United States)

    Sadeghi, Ahmad; Baghban, Iran; Bahrami, Fatemeh; Ahmadi, Ahmad; Creed, Peter

    2011-01-01

    A short 33-item form of the Career Development Inventory was validated on a sample of 310 Iranian high school students. Factor analysis indicated that attitude and cognitive subscale items loaded on their respective factors, and that internal reliability coefficients at all levels were satisfactory to good. Support for validity was demonstrated by…

  15. Predictive Validity of National Basketball Association Draft Combine on Future Performance.

    Science.gov (United States)

    Teramoto, Masaru; Cross, Chad L; Rieger, Randall H; Maak, Travis G; Willick, Stuart E

    2018-02-01

    Teramoto, M, Cross, CL, Rieger, RH, Maak, TG, and Willick, SE. Predictive validity of national basketball association draft combine on future performance. J Strength Cond Res 32(2): 396-408, 2018-The National Basketball Association (NBA) Draft Combine is an annual event where prospective players are evaluated in terms of their athletic abilities and basketball skills. Data collected at the Combine should help NBA teams select the right players for the upcoming NBA draft; however, their value for predicting future performance of players has not been examined. This study investigated the predictive validity of the NBA Draft Combine on future performance of basketball players. We performed a principal component analysis (PCA) on the 2010-2015 Combine data to reduce correlated variables (N = 234), a correlation analysis on the Combine data and future on-court performance to examine relationships (maximum pairwise N = 217), and a robust principal component regression (PCR) analysis to predict first-year and 3-year on-court performance from the Combine measures (N = 148 and 127, respectively). Three components were identified within the Combine data through PCA (= Combine subscales): length-size, power-quickness, and upper-body strength. As per the correlation analysis, the individual Combine items for anthropometrics, including height without shoes, standing reach, weight, wingspan, and hand length, as well as the Combine subscale of length-size, had positive, medium-to-large-sized correlations (r = 0.313-0.545) with defensive performance quantified by Defensive Box Plus/Minus. The robust PCR analysis showed that the Combine subscale of length-size was the predictor most significantly associated with future on-court performance (p ≤ 0.05), including Win Shares, Box Plus/Minus, and Value Over Replacement Player, followed by upper-body strength. In conclusion, the NBA Draft Combine has value for predicting future performance of players.
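
    The analysis pipeline described above reduces the correlated Combine measures with PCA and then regresses on-court performance on the component scores; the sketch below shows that generic PCA-plus-regression pipeline on simulated data (the study used a robust regression variant, and the variable names and numbers here are invented):

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(2)
        n = 148
        combine = rng.normal(size=(n, 9))     # e.g. anthropometric and athletic measures
        win_shares = combine[:, :3].mean(axis=1) + 0.5 * rng.normal(size=n)

        pcr = make_pipeline(StandardScaler(), PCA(n_components=3), LinearRegression())
        pcr.fit(combine, win_shares)
        print("explained variance ratios:",
              np.round(pcr.named_steps["pca"].explained_variance_ratio_, 2))
        print("R^2 of the PCR model:", round(pcr.score(combine, win_shares), 2))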

  16. Evaluating performance of high efficiency mist eliminators

    Energy Technology Data Exchange (ETDEWEB)

    Waggoner, Charles A.; Parsons, Michael S.; Giffin, Paxton K. [Mississippi State University, Institute for Clean Energy Technology, 205 Research Blvd, Starkville, MS (United States)

    2013-07-01

    Processing liquid wastes frequently generates off gas streams with high humidity and liquid aerosols. Droplet-laden air streams can be produced from tank mixing or sparging and processes such as reforming or evaporative volume reduction. Unfortunately these wet air streams represent a genuine threat to HEPA filters. High efficiency mist eliminators (HEME) are one option for removal of liquid aerosols with high dissolved or suspended solids content. HEMEs have been used extensively in industrial applications; however, they have not seen widespread use in the nuclear industry. Filtering efficiency data along with loading curves are not readily available for these units, and the data that exist are not easily translated to operational parameters in liquid waste treatment plants. A specialized test stand has been developed to evaluate the performance of HEME elements under use conditions of a US DOE facility. HEME elements were tested at three volumetric flow rates using aerosols produced from an iron-rich waste surrogate. The challenge aerosol included submicron particles produced from Laskin nozzles and supermicron particles produced from a hollow cone spray nozzle. Test conditions included ambient temperature and relative humidities greater than 95%. Data collected during testing HEME elements from three different manufacturers included volumetric flow rate, differential temperature across the filter housing, downstream relative humidity, and differential pressure (dP) across the filter element. Filter challenge was discontinued at three intermediate dPs to allow determination of filter efficiency using dioctyl phthalate and then dry surrogate aerosols. Filtering efficiencies of the clean HEME, the clean HEME loaded with water, and the HEME at maximum dP were also collected using the two test aerosols. Results of the testing included differential pressure vs. time loading curves for the nine elements tested along with the mass of moisture and solid

  17. Validation of CryoSat-2 SARIn Performance over Arctic Sea Ice

    Science.gov (United States)

    Di Bella, A.; Skourup, H.; Bouffard, J.; Parrinello, T.

    2016-08-01

    The main objective of this work is to validate CryoSat-2 (CS2) SARIn performance over sea ice by use of airborne laser altimetry data obtained during the CryoVEx 2012 campaign. A study by [1] has shown that the extra information from the CS2 SARIn mode increases the number of valid sea surface height estimates which are usually discarded in the SAR mode due to snagging of the radar signal. As the number of valid detected leads increases, the uncertainty of the freeboard heights decreases. In this study, the snow freeboard heights estimated using data from the airborne laser scanner are used to validate the sea ice freeboard obtained by processing CS2 SARIn level 1b waveforms. The possible reduction in the random freeboard uncertainty is investigated by comparing two scenarios, i.e. a SAR-like and a SARIn acquisition. It is observed that using the extra phase information, CS2 is able to detect leads up to 2370 m off-nadir. A reduction in the total random freeboard uncertainty of ~40% is observed by taking advantage of the CS2 interferometric capabilities, which make it possible to include ~35% of the waveforms discarded in the SAR-like scenario.

  18. Validity of linear encoder measurement of sit-to-stand performance power in older people.

    Science.gov (United States)

    Lindemann, U; Farahmand, P; Klenk, J; Blatzonis, K; Becker, C

    2015-09-01

    To investigate construct validity of linear encoder measurement of sit-to-stand performance power in older people by showing associations with relevant functional performance and physiological parameters. Cross-sectional study. Movement laboratory of a geriatric rehabilitation clinic. Eighty-eight community-dwelling, cognitively unimpaired older women (mean age 78 years). Sit-to-stand performance power and leg power were assessed using a linear encoder and the Nottingham Power Rig, respectively. Gait speed was measured on an instrumented walkway. Maximum quadriceps and hand grip strength were assessed using dynamometers. Mid-thigh muscle cross-sectional area of both legs was measured using magnetic resonance imaging. Associations of sit-to-stand performance power with power assessed by the Nottingham Power Rig, maximum gait speed and muscle cross-sectional area were r=0.646, r=0.536 and r=0.514, respectively. A linear regression model explained 50% of the variance in sit-to-stand performance power including muscle cross-sectional area (p=0.001), maximum gait speed (p=0.002), and power assessed by the Nottingham Power Rig (p=0.006). Construct validity of linear encoder measurement of sit-to-stand power was shown at functional level and morphological level for older women. This measure could be used in routine clinical practice as well as in large-scale studies. DRKS00003622. Copyright © 2015 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.

  19. Performance and Symptom Validity Testing as a Function of Medical Board Evaluation in U.S. Military Service Members with a History of Mild Traumatic Brain Injury.

    Science.gov (United States)

    Armistead-Jehle, Patrick; Cole, Wesley R; Stegman, Robert L

    2018-02-01

    The study was designed to replicate and extend previous findings demonstrating the high rates of invalid neuropsychological testing in military service members (SMs) with a history of mild traumatic brain injury (mTBI) assessed in the context of a medical evaluation board (MEB). Two hundred thirty-one active duty SMs (61 of whom were undergoing an MEB) underwent neuropsychological assessment. Performance validity (Word Memory Test) and symptom validity (MMPI-2-RF) test data were compared across those evaluated within disability (MEB) and clinical contexts. As with previous studies, there were significantly more individuals in an MEB context who failed performance (MEB = 57%, non-MEB = 31%) and symptom validity testing (MEB = 57%, non-MEB = 22%), and performance validity testing had a notable effect on cognitive test scores. Performance and symptom validity test failure rates did not vary as a function of the reason for disability evaluation when divided into behavioral versus physical health conditions. These data are consistent with past studies, and extend those studies by including symptom validity testing and investigating the effect of the reason for MEB. This and previous studies demonstrate that more than 50% of SMs seen in the context of an MEB will fail performance validity tests and over-report on symptom validity measures. These results emphasize the importance of using both performance and symptom validity testing when evaluating SMs with a history of mTBI, especially if they are being seen for disability evaluations, in order to ensure the accuracy of cognitive and psychological test data. Published by Oxford University Press 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  20. The role of performance assessment in validation, regulation and public acceptance

    International Nuclear Information System (INIS)

    Pigford, T.H.

    1992-01-01

    This paper reports that regulation of public health and safety for a geologic repository for radioactive waste requires that performance assessment show that radioactive releases will not violate a safety limit. Accurate predictions of actual performance are not required. Because of the long times in the future when radioactivity can be released, performance predictions must be based on sound hypotheses of the mechanisms that control and mitigate releases. Such hypotheses are useful only if they lead to clear mathematical formulations, specify clearly the parameters that are expected to control the releases, and specify means of accelerated testing or other means for validating the hypotheses. Useful hypotheses usually lead to conservative and bounding analyses that can be more reliable for this purpose than efforts to predict actual repository performance

  1. Analytical Performance Modeling and Validation of Intel’s Xeon Phi Architecture

    Energy Technology Data Exchange (ETDEWEB)

    Chunduri, Sudheer; Balaprakash, Prasanna; Morozov, Vitali; Vishwanath, Venkatram; Kumaran, Kalyan

    2017-01-01

    Modeling the performance of scientific applications on emerging hardware plays a central role in achieving extreme-scale computing goals. Analytical models that capture the interaction between applications and hardware characteristics are attractive because even a reasonably accurate model can be useful for performance tuning before the hardware is made available. In this paper, we develop a hardware model for Intel’s second-generation Xeon Phi architecture code-named Knights Landing (KNL) for the SKOPE framework. We validate the KNL hardware model by projecting the performance of mini-benchmarks and application kernels. The results show that our KNL model can project the performance with prediction errors of 10% to 20%. The hardware model also provides informative recommendations for code transformations and tuning.

  2. Performance prediction and validation of equilibrium modeling for gasification of cashew nut shell char

    Directory of Open Access Journals (Sweden)

    M. Venkata Ramanan

    2008-09-01

    Full Text Available Cashew nut shell, a waste product obtained during deshelling of cashew kernels, had in the past been deemed unfit as a fuel for gasification owing to its high occluded oil content. The oil, a source of natural phenol, oozes upon gasification, thereby clogging the gasifier throat, downstream equipment and associated utilities with oil, resulting in ineffective gasification and premature failure of utilities due to its corrosive characteristics. To overcome this drawback, the cashew shells were de-oiled by charring in closed chambers and were subsequently gasified in an autothermal downdraft gasifier. Equilibrium modeling was carried out to predict the producer gas composition under varying performance-influencing parameters, viz., equivalence ratio (ER), reaction temperature (RT) and moisture content (MC). The results were compared with the experimental output and are presented in this paper. The model agrees quite satisfactorily with the experimental outcome at the ER range applicable to gasification systems, i.e., 0.15 to 0.30. The results show that (i) the mole fractions of H2, CO and CH4 decrease while those of (N2 + H2O) and CO2 increase with ER; (ii) H2 and CO increase while CH4, (N2 + H2O) and CO2 decrease with reaction temperature; (iii) H2, CH4, CO2 and (N2 + H2O) increase while CO decreases with moisture content. However, at an equivalence ratio below 0.15, the model predicts an unrealistic composition and is observed to be invalid below this ER.
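
    The equivalence ratio used as the main performance parameter is conventionally defined, for gasification, as the air actually supplied relative to the stoichiometric requirement (the usual convention; the paper's own definition is not reproduced in this record):

        ER = \frac{(A/F)_{actual}}{(A/F)_{stoichiometric}}

    so that ER = 1 corresponds to complete combustion, and the 0.15-0.30 range discussed above is the typical sub-stoichiometric gasification window.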

  3. Noncredible cognitive performance at clinical evaluation of adult ADHD : An embedded validity indicator in a visuospatial working memory test

    NARCIS (Netherlands)

    Fuermaier, Anselm B M; Tucha, Oliver; Koerts, Janneke; Lange, Klaus W; Weisbrod, Matthias; Aschenbrenner, Steffen; Tucha, Lara

    2017-01-01

    The assessment of performance validity is an essential part of the neuropsychological evaluation of adults with attention-deficit/hyperactivity disorder (ADHD). Most available tools, however, are inaccurate regarding the identification of noncredible performance. This study describes the development

  4. An integrated high performance Fastbus slave interface

    International Nuclear Information System (INIS)

    Christiansen, J.; Ljuslin, C.

    1993-01-01

    A high performance CMOS Fastbus slave interface ASIC (Application Specific Integrated Circuit) supporting all addressing and data transfer modes defined in the IEEE 960 - 1986 standard is presented. The FAstbus Slave Integrated Circuit (FASIC) is an interface between the asynchronous Fastbus and a clock synchronous processor/memory bus. It can work stand-alone or together with a 32 bit microprocessor. The FASIC is a programmable device enabling its direct use in many different applications. A set of programmable address mapping windows can map Fastbus addresses to convenient memory addresses and at the same time act as address decoding logic. Data rates of 100 MBytes/sec to Fastbus can be obtained using an internal FIFO in the FASIC to buffer data between the two buses during block transfers. Message passing from Fastbus to a microprocessor on the slave module is supported. A compact (70 mm x 170 mm) Fastbus slave piggy back sub-card interface including level conversion between ECL and TTL signal levels has been implemented using surface mount components and the 208 pin FASIC chip

  5. High Performance Graphene Oxide Based Rubber Composites

    Science.gov (United States)

    Mao, Yingyan; Wen, Shipeng; Chen, Yulong; Zhang, Fazhong; Panine, Pierre; Chan, Tung W.; Zhang, Liqun; Liang, Yongri; Liu, Li

    2013-01-01

    In this paper, graphene oxide/styrene-butadiene rubber (GO/SBR) composites with complete exfoliation of GO sheets were prepared by aqueous-phase mixing of GO colloid with SBR latex and a small loading of butadiene-styrene-vinyl-pyridine rubber (VPR) latex, followed by their co-coagulation. During co-coagulation, VPR not only plays a key role in the prevention of aggregation of GO sheets but also acts as an interface-bridge between GO and SBR. The results demonstrated that the mechanical properties of the GO/SBR composite with 2.0 vol.% GO is comparable with those of the SBR composite reinforced with 13.1 vol.% of carbon black (CB), with a low mass density and a good gas barrier ability to boot. The present work also showed that GO-silica/SBR composite exhibited outstanding wear resistance and low-rolling resistance which make GO-silica/SBR very competitive for the green tire application, opening up enormous opportunities to prepare high performance rubber composites for future engineering applications. PMID:23974435

  6. Initial rheological description of high performance concretes

    Directory of Open Access Journals (Sweden)

    Alessandra Lorenzetti de Castro

    2006-12-01

    Full Text Available Concrete is defined as a composite material and, in rheological terms, it can be understood as a concentrated suspension of solid particles (aggregates) in a viscous liquid (cement paste). On a macroscopic scale, concrete flows as a liquid. It is known that the rheological behavior of concrete is close to that of a Bingham fluid, and two rheological parameters are needed for its description: yield stress and plastic viscosity. The aim of this paper is to present the initial rheological description of high performance concretes using the modified slump test. According to the results, an increase of yield stress was observed over time, while a slight variation in plastic viscosity was noticed. The incorporation of silica fume led to changes in the rheological properties of fresh concrete. The behavior of these materials also varied with the mixing procedure employed in their production. The addition of superplasticizer meant that there was a large reduction in the mixture's yield stress, while plastic viscosity remained practically constant.
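
    The two rheological parameters mentioned are those of the Bingham model, which in its usual form (shear stress \tau, shear rate \dot{\gamma}) reads:

        \tau = \tau_0 + \mu_p \, \dot{\gamma} \quad \text{for } \tau > \tau_0, \qquad \dot{\gamma} = 0 \quad \text{for } \tau \le \tau_0

    where \tau_0 is the yield stress and \mu_p is the plastic viscosity, the two quantities estimated here from the modified slump test.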

  7. High thermoelectric performance of graphite nanofibers.

    Science.gov (United States)

    Tran, Van-Truong; Saint-Martin, Jérôme; Dollfus, Philippe; Volz, Sebastian

    2018-02-22

    Graphite nanofibers (GNFs) have been demonstrated to be a promising material for hydrogen storage and heat management in electronic devices. Here, by means of first-principles and transport simulations, we show that GNFs can also be an excellent material for thermoelectric applications thanks to the weak interlayer van der Waals interaction, which induces low thermal conductance and a step-like shape in the electronic transmission with mini-gaps, both necessary ingredients to achieve high thermoelectric performance. This study unveils that the platelet form of GNFs, in which graphite layers are perpendicular to the fiber axis, can exhibit outstanding thermoelectric properties, with a figure of merit ZT reaching 3.55 in a 0.5 nm diameter fiber and 1.1 in a 1.1 nm diameter one. Interestingly, by introducing 14C isotope doping, ZT can even be enhanced to more than 5, and to more than 8 if we include the effect of finite phonon mean free path, which demonstrates the remarkable thermoelectric potential of GNFs.
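
    The dimensionless figure of merit quoted here is the standard thermoelectric definition, written below with conductances rather than bulk conductivities since the systems are nanoscale conductors (conventional symbols, not taken from the paper):

        ZT = \frac{S^2 \, G \, T}{\kappa_e + \kappa_{ph}}

    where S is the Seebeck coefficient, G the electrical conductance, T the absolute temperature, and \kappa_e and \kappa_{ph} the electronic and phonon contributions to the thermal conductance; the weak interlayer van der Waals coupling suppresses \kappa_{ph}, which is what drives ZT up.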

  8. Durability of high performance concrete in seawater

    International Nuclear Information System (INIS)

    Amjad Hussain Memon; Salihuddin Radin Sumadi; Rabitah Handan

    2000-01-01

    This paper presents a report on the effects of blended cements on the durability of high performance concrete (HPC) in seawater. In this research the effect of seawater was investigated. The specimens were initially subjected to water curing for seven days inside the laboratory at room temperature, followed by seawater curing exposed to the tidal zone until testing. In this study three levels of cement replacement (0%, 30% and 70%) were used. The combined use of chemical and mineral admixtures has resulted in a new generation of concrete called HPC. HPC has been identified as one of the most important advanced materials necessary in the effort to build a nation's infrastructure. HPC opens new opportunities for the utilization of industrial by-products (mineral admixtures) in the construction industry. As a matter of fact, permeability is considered one of the fundamental properties governing the durability of concrete in the marine environment. Results of this investigation indicated that the oxygen permeability values for the blended cement concretes at the age of one year are reduced by a factor of about 2 as compared to the OPC control mix concrete. Therefore both blended cement concretes are expected to withstand seawater exposure in the tidal zone without serious deterioration. (Author)

  9. Reliability and validity of the test of incremental respiratory endurance measures of inspiratory muscle performance in COPD

    Directory of Open Access Journals (Sweden)

    Formiga MF

    2018-05-01

    Full Text Available Magno F Formiga,1,2 Kathryn E Roach,1 Isabel Vital,3 Gisel Urdaneta,3 Kira Balestrini,3 Rafael A Calderon-Candelario,3,4 Michael A Campos,3,4,* Lawrence P Cahalin1,* 1Department of Physical Therapy, University of Miami Miller School of Medicine, Coral Gables, FL, USA; 2CAPES Foundation, Ministry of Education of Brazil, Brasilia, Brazil; 3Pulmonary Section, Miami Veterans Administration Medical Center, Miami, FL, USA; 4Division of Pulmonary, Allergy, Critical Care and Sleep Medicine, University of Miami Miller School of Medicine, Miami, FL, USA *These authors contributed equally to this work Purpose: The Test of Incremental Respiratory Endurance (TIRE) provides a comprehensive assessment of inspiratory muscle performance by measuring maximal inspiratory pressure (MIP) over time. The integration of MIP over inspiratory duration (ID) provides the sustained maximal inspiratory pressure (SMIP). Evidence on the reliability and validity of these measurements in COPD is not currently available. Therefore, we assessed the reliability, responsiveness and construct validity of the TIRE measures of inspiratory muscle performance in subjects with COPD. Patients and methods: Test–retest reliability, known-groups and convergent validity assessments were implemented simultaneously in 81 male subjects with mild to very severe COPD. TIRE measures were obtained using the portable PrO2 device, following standard guidelines. Results: All TIRE measures were found to be highly reliable, with SMIP demonstrating the strongest test–retest reliability with a nearly perfect intraclass correlation coefficient (ICC) of 0.99, while MIP and ID clustered closely together behind SMIP with ICC values of about 0.97. Our findings also demonstrated known-groups validity of all TIRE measures, with SMIP and ID yielding larger effect sizes when compared to MIP in distinguishing between subjects of different COPD status. Finally, our analyses confirmed convergent validity for both SMIP
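
    As the abstract explains, SMIP is the integral of the inspiratory pressure trace over the inspiratory duration. The sketch below shows that calculation with simple trapezoidal integration; the pressure trace is synthetic and the function name is ours, not part of the PrO2 software.

```python
import numpy as np


def sustained_maximal_inspiratory_pressure(time_s, pressure):
    """Trapezoidal integral of an inspiratory pressure trace over time.

    This is the pressure-time product described in the abstract: the area
    under the MIP curve across the inspiratory duration.
    """
    time_s = np.asarray(time_s, dtype=float)
    pressure = np.asarray(pressure, dtype=float)
    return float(np.sum(0.5 * (pressure[1:] + pressure[:-1]) * np.diff(time_s)))


# Synthetic pressure trace (cmH2O) over a ~4 s inspiration; invented data
t = np.linspace(0.0, 4.0, 81)
p = 100.0 * np.exp(-((t - 1.0) / 1.2) ** 2)

print(f"MIP  = {p.max():6.1f} cmH2O")
print(f"SMIP = {sustained_maximal_inspiratory_pressure(t, p):6.1f} cmH2O*s")
```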

  10. Validation of a high-performance liquid chromatography method for ascorbic acid in tablets of national production

    Directory of Open Access Journals (Sweden)

    Yaslenis Rodríguez Hernández

    2009-12-01

    Full Text Available An analytical method based on high-performance liquid chromatography was validated for the determination of ascorbic acid in vitamin C tablets. The method was designed as an alternative for quality control and for monitoring the chemical stability of the active ingredient, since the official quality-control techniques for ascorbic acid in tablets are not selective against degradation products. The method was modified with respect to that reported in USP 28 (2005) for the analysis of the injectable product. An RP-18 column (250 x 4.6 mm, 5 μm) was used with UV detection at 245 nm. Validation was required for both purposes, taking into account the parameters demanded for methods of categories I and II. The method was sufficiently linear, accurate and precise in the range of 100-300 mg/mL. It was also selective against the remaining matrix components and against the possible degradation products obtained under stress conditions. The limits of detection and quantification were calculated. Once validated, the method was applied to the quantification of ascorbic acid in two batches of aged tablets, and a marked influence of the packaging on the degradation of the active ingredient was detected after 12 months at room temperature.
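
    The record reports a linearity range and calculated detection and quantification limits. One common way to obtain such limits, shown here only as a generic sketch (the ICH 3.3*sigma/slope and 10*sigma/slope formulas, which may differ from the exact procedure used in the paper), is a least-squares calibration line whose residual standard deviation and slope yield LOD and LOQ:

```python
import numpy as np

# Hypothetical calibration data (concentration vs. peak area); the numbers
# are illustrative only and are not taken from the validation study.
conc = np.array([100.0, 150.0, 200.0, 250.0, 300.0])       # concentration units
area = np.array([1020.0, 1530.0, 2050.0, 2540.0, 3060.0])  # detector response

# Least-squares calibration line: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)   # residual standard deviation, n - 2 dof

# ICH-style estimates of the detection and quantification limits
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"slope = {slope:.3f}  LOD = {lod:.2f}  LOQ = {loq:.2f} (concentration units)")
```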

  11. Validation of the updated ArthroS simulator: face and construct validity of a passive haptic virtual reality simulator with novel performance metrics.

    Science.gov (United States)

    Garfjeld Roberts, Patrick; Guyver, Paul; Baldwin, Mathew; Akhtar, Kash; Alvand, Abtin; Price, Andrew J; Rees, Jonathan L

    2017-02-01

    The aim of this study was to assess the construct and face validity of ArthroS, a passive haptic VR simulator; a secondary aim was to evaluate the novel performance metrics produced by this simulator. Two groups of 30 participants, each divided into novice, intermediate or expert based on arthroscopic experience, completed three separate tasks on either the knee or shoulder module of the simulator. Performance was recorded using 12 automatically generated performance metrics and video footage of the arthroscopic procedures. The videos were blindly assessed using a validated global rating scale (GRS). Participants completed a survey about the simulator's realism and training utility. The new simulator demonstrated construct validity of its tasks when evaluated against the GRS (p ≤ 0.003 in all cases). Regarding its automatically generated performance metrics, established outputs such as time taken (p ≤ 0.001) and instrument path length (p ≤ 0.007) also demonstrated good construct validity. However, two-thirds of the proposed 'novel metrics' the simulator reports could not distinguish participants based on arthroscopic experience. Face validity assessment rated the simulator as a realistic and useful tool for trainees, but the passive haptic feedback (a key feature of this simulator) was rated as less realistic. The ArthroS simulator has good task construct validity based on established objective outputs, but some of the novel performance metrics could not distinguish between levels of surgical experience. The passive haptic feedback of the simulator also needs improvement. If simulators could offer automated and validated performance feedback, this would facilitate improvements in the delivery of training by allowing trainees to practise and self-assess.
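
    Construct validity of the kind reported above is usually demonstrated by testing whether a metric separates novice, intermediate and expert groups. The abstract does not state which statistical test was used, so the sketch below simply illustrates one common choice, a Kruskal-Wallis test on an automatically generated metric such as time taken, using invented data.

```python
from scipy import stats

# Invented 'time taken' scores (seconds) for three experience groups;
# illustrative only, not data from the ArthroS study.
novice       = [412, 398, 455, 430, 471, 440, 405, 462, 418, 449]
intermediate = [350, 332, 361, 345, 372, 340, 355, 328, 366, 348]
expert       = [281, 295, 270, 302, 288, 276, 299, 284, 291, 278]

# Non-parametric omnibus test of whether the metric separates the groups
h_stat, p_value = stats.kruskal(novice, intermediate, expert)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.4f}")
```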

  12. Alternative High-Performance Ceramic Waste Forms

    Energy Technology Data Exchange (ETDEWEB)

    Sundaram, S. K. [Alfred Univ., NY (United States)

    2017-02-01

    This final report (M5NU-12-NY-AU # 0202-0410) summarizes the results of the project titled “Alternative High-Performance Ceramic Waste Forms,” funded in FY12 by the Nuclear Energy University Program (NEUP Project # 12-3809) and led by Alfred University in collaboration with Savannah River National Laboratory (SRNL). The overall focus of the project is to advance fundamental understanding of crystalline ceramic waste forms and to demonstrate their viability as alternative waste forms to borosilicate glasses. We processed single- and multiphase hollandite waste forms based on simulated waste stream compositions provided by SRNL, based on the advanced fuel cycle initiative (AFCI) aqueous separation process developed in the Fuel Cycle Research and Development (FCR&D) program. For multiphase simulated waste forms, oxide and carbonate precursors were mixed together via ball milling with deionized water using zirconia media in a polyethylene jar for 2 h. The slurry was dried overnight and then separated from the media. The blended powders were then subjected to melting or spark plasma sintering (SPS) processes. Microstructural evolution and phase assemblages of these samples were studied using x-ray diffraction (XRD), scanning electron microscopy (SEM), energy dispersive analysis of x-rays (EDAX), wavelength dispersive spectrometry (WDS), transmission electron microscopy (TEM), selective area x-ray diffraction (SAXD), and electron backscatter diffraction (EBSD). These results showed that the processing methods have a significant effect on the microstructure and thus the performance of these waste forms. The Ce substitution into zirconolite and pyrochlore materials was investigated using a combination of experimental (in situ XRD and x-ray absorption near edge structure (XANES)) and modeling techniques to study these single phases independently. In zirconolite materials, a transition from the 2M to the 4M polymorph was observed with increasing Ce content. The resulting

  13. Evaluation of a performance assessment methodology for low-level radioactive waste disposal facilities: Validation needs. Volume 2

    International Nuclear Information System (INIS)

    Kozak, M.W.; Olague, N.E.

    1995-02-01

    In this report, concepts on how validation fits into the scheme of developing confidence in performance assessments are introduced. A general framework for validation and confidence building in regulatory decision making is provided. It is found that traditional validation studies have a very limited role in developing site-specific confidence in performance assessments. Indeed, validation studies are shown to have a role only in the context that their results can narrow the scope of initial investigations that should be considered in a performance assessment. In addition, validation needs for performance assessment of low-level waste disposal facilities are discussed, and potential approaches to address those needs are suggested. These areas of topical research are ranked in order of importance based on relevance to a performance assessment and likelihood of success

  14. Evaluation of a performance assessment methodology for low-level radioactive waste disposal facilities: Validation needs. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kozak, M.W.; Olague, N.E. [Sandia National Labs., Albuquerque, NM (United States)

    1995-02-01

    In this report, concepts on how validation fits into the scheme of developing confidence in performance assessments are introduced. A general framework for validation and confidence building in regulatory decision making is provided. It is found that traditional validation studies have a very limited role in developing site-specific confidence in performance assessments. Indeed, validation studies are shown to have a role only in the context that their results can narrow the scope of initial investigations that should be considered in a performance assessment. In addition, validation needs for performance assessment of low-level waste disposal facilities are discussed, and potential approaches to address those needs are suggested. These areas of topical research are ranked in order of importance based on relevance to a performance assessment and likelihood of success.

  15. Case Study of Using High Performance Commercial Processors in Space

    Science.gov (United States)

    Ferguson, Roscoe C.; Olivas, Zulema

    2009-01-01

    The purpose of the Space Shuttle Cockpit Avionics Upgrade project (1999-2004) was to reduce crew workload and improve situational awareness. The upgrade was to augment the Shuttle avionics system with new hardware and software. A major success of this project was the validation of the hardware architecture and software design. This was significant because the project incorporated new technology and approaches for the development of human-rated space software. An early version of this system was tested at the Johnson Space Center for one month by teams of astronauts. The results were positive, but NASA eventually cancelled the project towards the end of the development cycle. The goal of reducing crew workload and improving situational awareness created the need for high performance Central Processing Units (CPUs). The CPU selected was from the PowerPC family, a reduced instruction set computer (RISC) line known for its high performance. However, the requirement for radiation tolerance forced a re-evaluation of the selected member of the PowerPC line. Radiation testing revealed that the originally selected processor (PowerPC 7400) was too soft to meet mission objectives, and an effort was established to perform trade studies and performance testing to determine a feasible candidate. At that time, the PowerPC RAD750s were radiation tolerant but did not meet the performance needs of the project. Thus, the final solution was to select the PowerPC 7455. This processor did not have a radiation tolerant version but had some ability to detect failures. However, its cache tags did not provide parity, so the project incorporated a software strategy to detect radiation-induced failures: dual paths for the software generating commands to the legacy Space Shuttle avionics, preventing failures due to the softness of the upgraded avionics.
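
    The dual-path strategy described above is, in essence, a software redundancy check: the same command is computed along two independently coded paths and issued only if the results agree, so that a radiation-induced upset in one path is caught before it reaches the legacy avionics. The sketch below is a deliberately simplified illustration of that idea, not the actual flight software; all names and structure are assumptions.

```python
def command_path_a(state):
    # First, independently coded computation of the command (illustrative logic)
    return round(state["target"] - state["current"], 3)


def command_path_b(state):
    # Second path: the same result reached through a separately written route
    delta = state["target"]
    delta -= state["current"]
    return round(delta, 3)


def issue_command(state):
    """Issue a command only if both independent paths agree.

    A mismatch is treated as a possible radiation-induced upset and the
    command is withheld; a real system would retry or drop to a safe mode.
    """
    a, b = command_path_a(state), command_path_b(state)
    if a != b:
        raise RuntimeError(f"dual-path mismatch ({a} != {b}); command withheld")
    return a


print(issue_command({"current": 10.0, "target": 12.5}))  # -> 2.5
```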

  16. Evidence of Reliability and Validity for a Children’s Auditory Continuous Performance Test

    Directory of Open Access Journals (Sweden)

    Michael J. Lasee

    2013-11-01

    Full Text Available Continuous Performance Tests (CPTs) are commonly utilized clinical measures of attention and response inhibition. While there have been many studies of CPTs that utilize a visual format, there is considerably less research employing auditory CPTs. The current study provides initial reliability and validity evidence for the Auditory Vigilance Screening Measure (AVSM), a newly developed CPT. Participants included 105 five- to nine-year-old children selected from two rural Midwestern school districts. Reliability data for the AVSM were collected through retesting of 42 participants. Validity was evaluated through correlation of AVSM scales with subscales from the ADHD Rating Scale–IV. Test–retest reliability coefficients ranged from .62 to .74 for AVSM subscales. A significant correlation (r = .31) was obtained between the AVSM Impulsivity Scale and teacher ratings of inattention. Limitations and implications for future study are discussed.
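
    Test-retest reliability figures such as those above are commonly reported as a correlation between the first and second administrations of the same scale. The abstract does not specify the coefficient used, so the snippet below just shows the generic calculation with a Pearson correlation on invented scores.

```python
from scipy import stats

# Invented first- and second-administration scores for one AVSM subscale
# (illustrative only; not data from the study).
time_1 = [12, 15, 9, 18, 14, 11, 16, 13, 10, 17]
time_2 = [13, 14, 10, 17, 15, 12, 15, 12, 11, 16]

r, p = stats.pearsonr(time_1, time_2)
print(f"test-retest r = {r:.2f} (p = {p:.4f})")
```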

  17. An ecologically valid performance-based social functioning assessment battery for schizophrenia.

    Science.gov (United States)

    Shi, Chuan; He, Yi; Cheung, Eric F C; Yu, Xin; Chan, Raymond C K

    2013-12-30

    Psychiatrists now pay greater attention to the social functioning outcomes of schizophrenia. Evaluating real-world functioning in schizophrenia is challenging, and because of cultural differences no suitable instrument existed for the Chinese setting. This study reports the validation of an ecologically valid, performance-based everyday functioning assessment for schizophrenia, the Beijing Performance-based Functional Ecological Test (BJ-PERFECT). Fifty community-dwelling adults with schizophrenia and 37 healthy controls were recruited. Fifteen of the healthy controls were re-tested one week later. All participants were administered the University of California, San Diego, Performance-based Skills Assessment-Brief version (UPSA-B) and the MATRICS Consensus Cognitive Battery (MCCB). The finalized assessment included three subdomains: transportation, financial management and work ability. The test-retest and inter-rater reliabilities were good. The total score correlated significantly with the UPSA-B. The performance of individuals with schizophrenia was significantly more impaired than that of healthy controls, especially in the domain of work ability. Among individuals with schizophrenia, functional outcome was influenced by premorbid functioning, negative symptoms and neurocognition, such as processing speed, visual learning and attention/vigilance. © 2013 Elsevier Ireland Ltd. All rights reserved.

  18. Turbulent Scalar Transport Model Validation for High Speed Propulsive Flows, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort entails the validation of a RANS turbulent scalar transport model (SFM) for high speed propulsive flows, using new experimental data sets and...

  19. Symptom and performance validity with veterans assessed for attention-deficit/hyperactivity disorder (ADHD).

    Science.gov (United States)

    Shura, Robert D; Denning, John H; Miskey, Holly M; Rowland, Jared A

    2017-12-01

    Little is known about attention-deficit/hyperactivity disorder (ADHD) in veterans. Practice standards recommend the use of both symptom and performance validity measures in any assessment, and there are salient external incentives associated with ADHD evaluation (stimulant medication access and academic accommodations). The purpose of this study was to evaluate symptom and performance validity measures in a clinical sample of veterans presenting for specialty ADHD evaluation. Patients without a history of a neurocognitive disorder and for whom data were available on all measures (n = 114) completed a clinical interview structured on DSM-5 ADHD symptoms, the Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF), and the Test of Memory Malingering Trial 1 (TOMM1) as part of a standardized ADHD diagnostic evaluation. Veterans meeting criteria for ADHD were no more likely to overreport symptoms on the MMPI-2-RF or to fail TOMM1 (score ≤ 41) than those who did not meet criteria. Those who overreported symptoms did not endorse significantly more ADHD symptoms; however, those who failed TOMM1 did report significantly more ADHD symptoms (g = 0.90). In the total sample, 19.3% failed TOMM1, 44.7% overreported on the MMPI-2-RF, and 8.8% produced both an overreported MMPI-2-RF and an invalid TOMM1. F-r had the highest correlation with TOMM1 scores (r = -.30). These results underscore the importance of assessing both symptom and performance validity in a clinical ADHD evaluation with veterans. In contrast to certain other conditions (e.g., mild traumatic brain injury), ADHD as a diagnosis is not related to higher rates of invalid report/performance in veterans. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. Validation of two conceptualizations of fragile self-esteem: Contingent high self-esteem and incongruent high self-esteem

    OpenAIRE

    Bodroža Bojana

    2014-01-01

    The aim of this research was to validate two aspects of fragile high self-esteem: a combination of contingent and high (explicit) self-esteem and a combination of high explicit and low implicit self-esteem (i.e. incongruent high self-esteem), as well as to examine the relationship between these aspects of fragile self-esteem and narcissism. No convergence was found between contingent high and incongruent high self-esteem. The result was consistent regardless...