WorldWideScience

Sample records for validate analytical results

  1. Development and validation of analytical methods for dietary supplements

    International Nuclear Information System (INIS)

    Sullivan, Darryl; Crowley, Richard

    2006-01-01

    The expanding use of innovative botanical ingredients in dietary supplements and foods has resulted in a flurry of research aimed at the development and validation of analytical methods for accurate measurement of active ingredients. The pressing need for these methods is being met through an expansive collaborative initiative involving industry, government, and analytical organizations. This effort has resulted in the validation of several important assays as well as important advances in the method engineering procedures which have improved the efficiency of the process. The initiative has also allowed researchers to overcome many of the barriers that have hindered accurate analysis, such as the lack of reference standards and comparative data. As the availability of nutraceutical products continues to increase, these methods will provide consumers and regulators with the scientific information needed to assure safety and dependable labeling.

  2. Analytical method for the identification and assay of 12 phthalates in cosmetic products: application of the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques".

    Science.gov (United States)

    Gimeno, Pascal; Maggio, Annie-Françoise; Bousquet, Claudine; Quoirez, Audrey; Civade, Corinne; Bonnet, Pierre-Antoine

    2012-08-31

    Esters of phthalic acid, more commonly named phthalates, may be present in cosmetic products as ingredients or contaminants. Their presence as contaminants can be due to the manufacturing process, to the raw materials used, or to the migration of phthalates from packaging when plastic (polyvinyl chloride, PVC) is used. Eight phthalates (DBP, DEHP, BBP, DMEP, DnPP, DiPP, DPP, and DiBP), classified H360 or H361, are forbidden in cosmetics according to the European regulation on cosmetics 1223/2009. A GC/MS method was developed for the assay of 12 phthalates in cosmetics, including the 8 regulated phthalates. Analyses are carried out on a GC/MS system in electron impact ionization mode (EI). The separation of phthalates is obtained on a cross-linked 5%-phenyl/95%-dimethylpolysiloxane capillary column, 30 m × 0.25 mm (i.d.) × 0.25 μm film thickness, using a temperature gradient. Phthalate quantification is performed by external calibration using an internal standard. Validation elements obtained on standard solutions highlight satisfactory system conformity (resolution > 1.5), a common quantification limit of 0.25 ng injected, acceptable linearity between 0.5 μg mL⁻¹ and 5.0 μg mL⁻¹, and precision and accuracy in agreement with in-house specifications. Cosmetic samples ready for analytical injection are analyzed after dilution in ethanol, whereas more complex cosmetic matrices, like milks and creams, are assayed after a liquid/liquid extraction using tert-butyl methyl ether (TBME). Depending on the type of cosmetics analyzed, the common limits of quantification for the 12 phthalates were set at 0.5 or 2.5 μg g⁻¹. All samples were assayed using the analytical approach described in the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques". This analytical protocol is particularly well adapted when it is not possible to make reconstituted sample matrices. Copyright © 2012
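
    For readers who want to see what "external calibration using an internal standard" amounts to in practice, here is a minimal sketch of the arithmetic: a hypothetical linear calibration on analyte-to-internal-standard peak-area ratios. All numbers are invented for illustration and are not taken from the paper.

      import numpy as np

      # Calibration standards: concentration (ug/mL) vs. analyte/IS peak-area ratio
      conc  = np.array([0.5, 1.0, 2.0, 3.5, 5.0])
      ratio = np.array([0.21, 0.43, 0.85, 1.52, 2.10])

      slope, intercept = np.polyfit(conc, ratio, 1)  # least-squares calibration line

      # Unknown sample: measured area ratio -> vial concentration -> original sample
      sample_ratio, dilution_factor = 0.67, 10
      c_vial = (sample_ratio - intercept) / slope
      print(f"{c_vial * dilution_factor:.2f} ug/mL in the original sample")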

  3. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    Science.gov (United States)

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics such as accuracy and precision are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternate "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow for assessing method validity, taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority of the methods are not designed to protect against the risk of accepting unsuitable methods, and thus have the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on a β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, selection of either the generalized pivotal quantity or β-content (0.9) method for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current
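
    To make the β-content acceptance idea concrete, here is a minimal sketch of a β-content (0.9) style check on total error, assuming normally distributed results and using the standard Howe approximation for the two-sided tolerance factor. The replicate errors and the ±15% acceptance limits are invented for illustration, not taken from the paper.

      import numpy as np
      from scipy import stats

      def tolerance_interval(x, beta=0.90, confidence=0.90):
          """Two-sided (beta, confidence) tolerance interval for normal data,
          using the Howe approximation for the tolerance factor k."""
          n, nu = len(x), len(x) - 1
          z = stats.norm.ppf((1 + beta) / 2)
          k = z * np.sqrt(nu * (1 + 1 / n) / stats.chi2.ppf(1 - confidence, nu))
          m, s = np.mean(x), np.std(x, ddof=1)
          return m - k * s, m + k * s

      # Relative errors (%) of validation replicates against the true value
      errors = np.array([1.2, -0.8, 2.1, 0.5, -1.4, 0.9, 1.8, -0.3])
      lo, hi = tolerance_interval(errors)
      # Accept only if the whole interval sits inside the total-error limits
      print("fit for purpose" if -15 < lo and hi < 15 else "not demonstrated")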

  4. Analytical validation of Gentian NGAL particle-enhanced turbidimetric immunoassay (PETIA)

    Directory of Open Access Journals (Sweden)

    Gian Luca Salvagno

    2017-08-01

    Objectives: This study was designed to validate the analytical performance of the new Gentian particle-enhanced turbidimetric immunoassay (PETIA) for measuring neutrophil gelatinase-associated lipocalin (NGAL) in serum samples. Design and methods: Analytical validation of the Gentian NGAL assay was carried out on a Roche Cobas c501 and was based on assessment of limit of blank (LOB), limit of detection (LOD), functional sensitivity, imprecision, linearity, and concordance with the BioPorto NGAL test. Results: The LOB and LOD of Gentian NGAL were found to be 3.8 ng/mL and 6.3 ng/mL, respectively. An analytical coefficient of variation (CV) of 20% corresponded to an NGAL value of 10 ng/mL. The intra-assay and inter-assay imprecision (CV) was between 0.4 and 5.2% and 0.6 and 7.1%, respectively, and the total imprecision (CV) was 3.7%. The linearity was optimal at NGAL concentrations between 37 and 1420 ng/mL (r = 1.00; p < 0.001). An excellent correlation was observed between values measured with Gentian NGAL and BioPorto NGAL in 74 routine serum samples (r = 0.993). The mean percentage bias of the Gentian assay versus the BioPorto assay was +3.1% (95% CI, +1.6% to +4.5%). Conclusions: These results show that Gentian NGAL may be a viable alternative to other commercial immunoassays for both routine and urgent assessment of serum NGAL. Keywords: Neutrophil gelatinase-associated lipocalin, NGAL, Analytical validation, Acute kidney injury
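
    The LOB, LOD, and CV figures quoted above follow standard definitions (CLSI EP17 style). A minimal sketch of the computations, with invented replicate values rather than the study's data:

      import numpy as np

      blanks      = np.array([2.1, 3.0, 2.7, 3.5, 2.4, 3.1])  # ng/mL, blank replicates
      low_samples = np.array([5.9, 7.0, 6.4, 6.8, 5.6, 6.2])  # ng/mL, low-level replicates

      # Limit of blank: 95th percentile of the blank distribution (Gaussian form)
      lob = blanks.mean() + 1.645 * blanks.std(ddof=1)
      # Limit of detection: lowest level reliably distinguishable from blank
      lod = lob + 1.645 * low_samples.std(ddof=1)
      # Imprecision reported as a coefficient of variation
      cv = 100 * low_samples.std(ddof=1) / low_samples.mean()
      print(f"LOB = {lob:.1f} ng/mL, LOD = {lod:.1f} ng/mL, CV = {cv:.1f}%")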

  5. Validation of Analytical Damping Ratio by Fatigue Stress Limit

    Science.gov (United States)

    Foong, Faruq Muhammad; Chung Ket, Thein; Beng Lee, Ooi; Aziz, Abdul Rashid Abdul

    2018-03-01

    The optimisation process of a vibration energy harvester is usually restricted to experimental approaches due to the lack of an analytical equation to describe the damping of a system. This study derives an analytical equation which describes the first-mode damping ratio of a clamp-free cantilever beam under harmonic base excitation, by combining the transverse equation of motion of the beam with the damping-stress equation. This equation, as opposed to other common damping determination methods, is independent of experimental inputs or finite element simulations and can be solved using a simple iterative convergence method. The derived equation was determined to be correct for cases where the maximum bending stress in the beam is below the fatigue limit stress of the beam. However, an increasing trend in the error between the experimental and analytical results was observed at high stress levels. Hence, the fatigue limit stress was used as a parameter to define the validity of the analytical equation.
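
    The "simple iterative convergence method" can be pictured as a fixed-point loop: the damping ratio sets the resonant response amplitude, the amplitude sets the maximum bending stress, and the damping-stress relation closes the loop. A schematic sketch follows; the three relations below are placeholder physics invented for illustration, not the equations derived in the paper.

      def solve_damping_ratio(zeta0=0.01, tol=1e-8, max_iter=200):
          """Fixed-point iteration: zeta -> amplitude -> stress -> zeta."""
          zeta = zeta0
          for _ in range(max_iter):
              amplitude = g(zeta)                      # first-mode amplitude at resonance
              stress = h(amplitude)                    # maximum bending stress
              zeta_new = damping_from_stress(stress)   # damping-stress relation
              if abs(zeta_new - zeta) < tol:
                  return zeta_new
              zeta = zeta_new
          raise RuntimeError("iteration did not converge")

      # Placeholder relations, invented purely for illustration:
      g = lambda zeta: 1.0 / (2.0 * zeta)              # SDOF resonant amplification
      h = lambda amp: 5.0e6 * amp                      # stress proportional to amplitude (Pa)
      damping_from_stress = lambda s: 1e-4 * (s / 1e6) ** 0.3  # Lazan-type power law

      print(solve_damping_ratio())                     # converges to ~1e-3 here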

  6. Pre-analytical and analytical aspects affecting clinical reliability of plasma glucose results.

    Science.gov (United States)

    Pasqualetti, Sara; Braga, Federica; Panteghini, Mauro

    2017-07-01

    The measurement of plasma glucose (PG) plays a central role in recognizing disturbances in carbohydrate metabolism, with established decision limits that are globally accepted. This requires that PG results are reliable and unequivocally valid no matter where they are obtained. To control the pre-analytical variability of PG and prevent in vitro glycolysis, the use of citrate as a rapidly effective glycolysis inhibitor has been proposed. However, the commercial availability of several tubes, with studies showing different performance, has created confusion among users. Moreover, and more importantly, studies have shown that tubes promptly inhibiting glycolysis give PG results that are significantly higher than tubes containing sodium fluoride only (used in the majority of studies that generated the current PG cut-points), leading to a different clinical classification of subjects. From the analytical point of view, to be equivalent among different measuring systems, PG results should be traceable to a recognized higher-order reference via the implementation of an unbroken metrological hierarchy. In doing this, it is important that manufacturers of measuring systems consider the uncertainty accumulated through the different steps of the selected traceability chain. In particular, PG results should fulfil analytical performance specifications defined to fit the intended clinical application. Since PG is under tight homeostatic control, its biological variability may be used to define these limits. Alternatively, given the central diagnostic role of the analyte, an outcome model showing the impact of the analytical performance of the test on clinical classification of subjects can be used. Using these specifications, performance assessment studies employing commutable control materials with values assigned by a reference procedure have shown that the quality of PG measurements is often far from desirable and that problems are exacerbated using point-of-care devices. Copyright © 2017 The Canadian

  7. Validation of an advanced analytical procedure applied to the measurement of environmental radioactivity.

    Science.gov (United States)

    Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van

    2018-04-01

    In this work, an advanced analytical procedure was applied to calculate radioactivity in spiked water samples using close-geometry gamma spectroscopy. It included the MCNP-CP code to calculate the coincidence summing correction factor (CSF). The CSF results were validated against a deterministic method using the ETNA code for both p-type HPGe detectors, showing good agreement between the two codes. Finally, the validity of the developed procedure was confirmed by a proficiency test to calculate the activities of various radionuclides. The radioactivity measurements with both detectors using the advanced analytical procedure received "Accepted" status in the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Valid, legally defensible data from your analytical laboratories

    International Nuclear Information System (INIS)

    Gay, D.D.; Allen, V.C.

    1989-01-01

    This paper discusses the definition of valid, legally defensible data. The authors describe the expectations of project managers and what should be gleaned from the laboratory with regard to analytical data.

  9. Development and Validation of Analytical Method for Losartan ...

    African Journals Online (AJOL)

    Development and Validation of Analytical Method for Losartan-Copper Complex Using UV-Vis Spectrophotometry. ... Tropical Journal of Pharmaceutical Research ... Purpose: To develop a new spectrophotometric method for the analysis of losartan potassium in pharmaceutical formulations by making its complex with ...

  10. Measuring Students' Writing Ability on a Computer-Analytic Developmental Scale: An Exploratory Validity Study

    Science.gov (United States)

    Burdick, Hal; Swartz, Carl W.; Stenner, A. Jackson; Fitzgerald, Jill; Burdick, Don; Hanlon, Sean T.

    2013-01-01

    The purpose of the study was to explore the validity of a novel computer-analytic developmental scale, the Writing Ability Developmental Scale. On the whole, collective results supported the validity of the scale. It was sensitive to writing ability differences across grades and sensitive to within-grade variability as compared to human-rated…

  11. Consistency of FMEA used in the validation of analytical procedures

    DEFF Research Database (Denmark)

    Oldenhof, M.T.; van Leeuwen, J.F.; Nauta, Maarten

    2011-01-01

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection... To improve consistency, we recommend that FMEA is always carried out under the supervision of an experienced FMEA-facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating...

  12. Analytical difficulties facing today's regulatory laboratories: issues in method validation.

    Science.gov (United States)

    MacNeil, James D

    2012-08-01

    The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as development and maintenance of expertise, maintenance and updating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context which includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation of the method, or on the design of a validation scheme for a complex multi-residue method, require a well-considered strategy based on current knowledge of international guidance documents and regulatory requirements, as well as the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of change or modification of a method, whether that change or modification may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameters will require re-validation. Some typical situations involving change in methods are discussed and a decision process proposed for selection of appropriate validation measures. © 2012 John Wiley & Sons, Ltd.

  13. Risk analysis by FMEA as an element of analytical validation.

    Science.gov (United States)

    van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M

    2009-12-05

    We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs for authenticity to a Failure Mode and Effects Analysis (FMEA), including technical risks as well as risks related to human failure. An FMEA team broke down the NIR analytical method into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D), and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were calculated as Risk Priority Numbers (RPNs) = O × D × S. Failure modes with the highest RPN scores were subjected to corrective actions and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks.
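
    The RPN arithmetic described here is straightforward; a minimal illustration, with failure modes and scores invented for the example:

      # Each failure mode scored 1-10 on occurrence (O), detection (D), severity (S)
      failure_modes = {
          "sample mislabelled":       (4, 7, 8),
          "wrong reference spectrum": (2, 5, 9),
          "instrument drift":         (5, 3, 4),
      }

      rpn = {name: o * d * s for name, (o, d, s) in failure_modes.items()}
      for name, score in sorted(rpn.items(), key=lambda kv: -kv[1]):
          print(f"{name:26s} RPN = {score}")
      # Modes with the highest RPN are targeted for corrective action first.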

  14. Risk analysis of analytical validations by probabilistic modification of FMEA.

    Science.gov (United States)

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling the detection not only of technical risks but also of risks related to human failure. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeited tablets are re-interpreted by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure mode(s) can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
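
    Under the proposed modification, the categorical O and D scores are replaced by estimated relative frequencies, so the probability of an undetected failure can be computed directly. A sketch under the assumption of independent failure modes, with all frequencies hypothetical:

      # (occurrence frequency per analysis, probability that detection fails)
      modes = {
          "sample mislabelled":       (0.002,  0.30),
          "wrong reference spectrum": (0.0005, 0.10),
          "instrument drift":         (0.010,  0.05),
      }

      per_mode = {k: p_occ * p_miss for k, (p_occ, p_miss) in modes.items()}

      # Frequency of at least one undetected failure over the full procedure,
      # assuming independent failure modes:
      p_none = 1.0
      for f in per_mode.values():
          p_none *= 1.0 - f
      print(per_mode, f"full procedure: {1.0 - p_none:.4%}")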

  15. Risk analysis of analytical validations by probabilistic modification of FMEA

    DEFF Research Database (Denmark)

    Barends, D.M.; Oldenhof, M.T.; Vredenbregt, M.J.

    2012-01-01

    Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling not only detecting technical risks, but also risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring of severity. In an example, the results of traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeited tablets are re-interpreted by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence...

  16. Consistency of FMEA used in the validation of analytical procedures.

    Science.gov (United States)

    Oldenhof, M T; van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Vredenbregt, M J; Weda, M; Barends, D M

    2011-02-20

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection-Mass Spectrometry (HPLC-DAD-MS) analytical procedure used in the quality control of medicines. Each team was free to define its own ranking scales for the probability of severity (S), occurrence (O), and detection (D) of failure modes. We calculated Risk Priority Numbers (RPNs) and identified the failure modes above the 90th percentile of RPN values as needing urgent corrective action, and failure modes falling between the 75th and 90th percentiles as needing necessary corrective action. Team 1 and Team 2 identified five and six failure modes needing urgent corrective action, respectively, with two being commonly identified. Of the failure modes needing necessary corrective action, about a third were commonly identified by both teams. These results show inconsistency in the outcome of the FMEA. To improve consistency, we recommend that FMEA is always carried out under the supervision of an experienced FMEA-facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating that this inconsistency is not always a drawback. Copyright © 2010 Elsevier B.V. All rights reserved.
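
    The percentile cut-offs described above translate directly into a few lines of code; a small sketch with invented RPN values:

      import numpy as np

      rpns = np.array([12, 640, 90, 245, 30, 504, 168, 75, 320, 410])
      p75, p90 = np.percentile(rpns, [75, 90])
      urgent = rpns >= p90                       # above the 90th percentile
      necessary = (rpns >= p75) & ~urgent        # between the 75th and 90th
      print("urgent:", rpns[urgent], "necessary:", rpns[necessary])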

  17. Analytical validation of a new point-of-care assay for serum amyloid A in horses.

    Science.gov (United States)

    Schwartz, D; Pusterla, N; Jacobsen, S; Christopher, M M

    2018-01-17

    Serum amyloid A (SAA) is a major acute phase protein in horses. A new point-of-care (POC) test for SAA (Stablelab) is available, but studies evaluating its analytical accuracy are lacking. To evaluate the analytical performance of the SAA POC test by 1) determining linearity and precision, 2) comparing results in whole blood with those in serum or plasma, and 3) comparing POC results with those obtained using a previously validated turbidimetric immunoassay (TIA). Assay validation. Analytical validation of the POC test was done in accordance with American Society for Veterinary Clinical Pathology guidelines using residual equine serum/plasma and whole blood samples from the Clinical Pathology Laboratory at the University of California-Davis. A TIA was used as the reference method. We also evaluated the effect of haematocrit (HCT). The POC test was linear for SAA concentrations of up to at least 1000 μg/mL (r = 0.991). Intra-assay CVs were 13, 18 and 15% at high (782 μg/mL), intermediate (116 μg/mL) and low (64 μg/mL) concentrations. Inter-assay (inter-batch) CVs were 45, 14 and 15% at high (1372 μg/mL), intermediate (140 μg/mL) and low (56 μg/mL) concentrations. SAA results in whole blood were significantly lower than those in serum/plasma (P = 0.0002) but were positively correlated (r = 0.908) and not affected by HCT (P = 0.261); a proportional negative bias was observed in samples with SAA > 500 μg/mL. The difference between methods exceeded the 95% confidence interval of the combined imprecision of both methods (15%). Analytical validation could not be performed in whole blood, the sample most likely to be used stall-side. The POC test has acceptable accuracy and precision in equine serum/plasma with SAA concentrations of up to at least 1000 μg/mL. Low inter-batch precision at high concentrations may affect serial measurements, and the use of the same test batch and sample type (serum/plasma or whole blood) is recommended. Comparison of results between the

  18. Quantifying the measurement uncertainty of results from environmental analytical methods.

    Science.gov (United States)

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem/CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so, due regard was given to the provisions of ISO 17025, and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.

  19. SU-E-J-145: Validation of An Analytical Model for in Vivo Range Verification Using GATE Monte Carlo Simulation in Proton Therapy

    International Nuclear Information System (INIS)

    Lee, C; Lin, H; Chao, T; Hsiao, I; Chuang, K

    2015-01-01

    Purpose: PET image prediction based on an analytical filtering approach for proton range verification has been successfully developed and validated using FLUKA Monte Carlo (MC) codes and phantom measurements. The purpose of this study is to validate the effectiveness of the analytical filtering model for proton range verification against GATE/GEANT4 Monte Carlo simulation codes. Methods: In this study, we performed two experiments to validate the β+-yields predicted by the analytical model against GATE/GEANT4 simulations. The first experiment evaluated the accuracy of predicting β+-yields as a function of irradiated proton energy. In the second experiment, we simulated homogeneous phantoms of different materials irradiated by a mono-energetic pencil-like proton beam. The filtered β+-yield distributions from the analytical model are compared with the MC-simulated β+-yields in the proximal and distal fall-off ranges. Results: The comparison examined filtered and MC-simulated β+-yield distributions under different conditions. First, we found that the analytical filtering can be applied over the whole range of therapeutic energies. Second, the range difference between filtered and MC-simulated β+-yields at the distal fall-off region is within 1.5 mm for all materials used. The findings validate the usefulness of the analytical filtering model for range verification of proton therapy with GATE Monte Carlo simulations. In addition, there is a larger discrepancy between the filtered prediction and MC-simulated β+-yields using the GATE code, especially in the proximal region. This discrepancy might result from the absence of well-established theoretical models for predicting the nuclear interactions. Conclusion: Despite the fact that large discrepancies of the distributions between MC-simulated and predicted β+-yields were observed, the study proves the effectiveness of the analytical filtering model for proton range verification using

  20. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    Science.gov (United States)

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    2017-01-01

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  1. Validated analytical modeling of diesel engine regulated exhaust CO emission rate

    Directory of Open Access Journals (Sweden)

    Waleed F Faris

    2016-06-01

    Although vehicle analytical models are often favored for their explainable mathematical trends, no analytical model has yet been developed for the regulated diesel exhaust CO emission rate of trucks. This research develops and validates, for the first time, an analytical model of the steady-speed regulated diesel exhaust CO emission rate for trucks. It has been found that the steady-speed CO exhaust emission rate is based on (1) CO2 dissociation, (2) the water-gas shift reaction, and (3) the incomplete combustion of hydrocarbon. It has been found as well that the steady-speed CO exhaust emission rate based on CO2 dissociation is considerably less than the rate based on the water-gas shift reaction. It has also been found that the steady-speed CO exhaust emission rate based on the water-gas shift reaction is the dominant source of CO exhaust emission. The study shows that the average percentage deviation of the steady-speed simulated results from the corresponding field data is 1.7% for all freeway cycles, with a 99% coefficient of determination at the confidence level of 95%. This deviation of the simulated results from field data outperforms its counterparts in widely recognized models such as the comprehensive modal emissions model and VT-Micro for all freeway cycles.
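
    The two goodness-of-fit figures quoted (average percentage deviation and coefficient of determination) are easy to reproduce; a minimal sketch with invented measured/simulated arrays, not the study's data:

      import numpy as np

      field = np.array([2.1, 3.4, 4.0, 5.2, 6.6])  # measured CO rate (hypothetical units)
      model = np.array([2.0, 3.5, 4.1, 5.1, 6.5])  # simulated CO rate

      mean_pct_dev = 100 * np.mean(np.abs(model - field) / field)
      r2 = 1 - np.sum((field - model) ** 2) / np.sum((field - field.mean()) ** 2)
      print(f"average % deviation = {mean_pct_dev:.1f}%, R^2 = {r2:.3f}")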

  2. Evaluation and analytical validation of a handheld digital refractometer for urine specific gravity measurement

    Directory of Open Access Journals (Sweden)

    Sara P. Wyness

    2016-08-01

    Objectives: Refractometers are commonly used to determine urine specific gravity (SG) in the assessment of hydration status and urine specimen validity testing. Few comprehensive performance evaluations are available demonstrating refractometer capability from a clinical laboratory perspective. The objective of this study was therefore to conduct an analytical validation of a handheld digital refractometer used for human urine SG testing. Design and methods: A MISCO Palm Abbe™ refractometer was used for all experiments, including device familiarization, carryover, precision, accuracy, linearity, analytical sensitivity, evaluation of potential substances which contribute to SG (i.e., "interference"), and reference interval evaluation. A manual refractometer, a urine osmometer, and a solute score (sum of urine chloride, creatinine, glucose, potassium, sodium, total protein, and urea nitrogen, all in mg/dL) were used as comparative methods for accuracy assessment. Results: Significant carryover was not observed; a wash step was still included as good laboratory practice. Low imprecision (%CV < 0.01) was demonstrated using low and high QC material. Accuracy studies showed strong correlation with manual refractometry. Linear correlation was also demonstrated between SG, osmolality, and solute score. Linearity of Palm Abbe performance was verified with observed error of ≤0.1%. Increases in SG were observed with increasing concentrations of albumin, creatinine, glucose, hemoglobin, sodium chloride, and urea. Transference of a previously published urine SG reference interval of 1.0020-1.0300 was validated. Conclusions: The Palm Abbe digital refractometer was a fast, simple, and accurate way to measure urine SG. Analytical validity was confirmed by the present experiments. Keywords: Specific gravity, Osmolality, Digital refractometry, Hydration, Sports medicine, Urine drug testing, Urine adulteration

  3. Path integral analysis of Jarzynski's equality: Analytical results

    Science.gov (United States)

    Minh, David D. L.; Adib, Artur B.

    2009-02-01

    We apply path integrals to study nonequilibrium work theorems in the context of Brownian dynamics, deriving in particular the equations of motion governing the most typical and most dominant trajectories. For the analytically soluble cases of a moving harmonic potential and a harmonic oscillator with a time-dependent natural frequency, we find such trajectories, evaluate the work-weighted propagators, and validate Jarzynski’s equality.
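
    For context, Jarzynski's equality states that an exponential average of the nonequilibrium work W over realizations of a driving protocol yields the equilibrium free-energy difference ΔF:

      \langle e^{-\beta W} \rangle = e^{-\beta \Delta F}, \qquad \beta = 1/(k_B T).

    The path-integral analysis above identifies the trajectories that dominate this exponential average for the two analytically soluble models.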

  4. Results from the First Validation Phase of CAP code

    International Nuclear Information System (INIS)

    Choo, Yeon Joon; Hong, Soon Joon; Hwang, Su Hyun; Kim, Min Ki; Lee, Byung Chul; Ha, Sang Jun; Choi, Hoon

    2010-01-01

    The second stage of the Safety Analysis Code Development for Nuclear Power Plants project was launched in April 2010 and is scheduled to run through 2012, with a scope of work covering code validation through licensing preparation. As a part of this project, CAP (Containment Analysis Package) will follow the same procedures. CAP's validation work is organized hierarchically into four validation steps using: 1) fundamental phenomena; 2) principal phenomena (mixing and transport) and components in containment; 3) demonstration tests in small, middle, and large facilities and International Standard Problems; 4) comparison with other containment codes such as GOTHIC or CONTEMPT. In addition, collecting experimental data related to containment phenomena and constructing the corresponding database is one of the major tasks of the second stage of this project. The validation of fundamental phenomena is expected to reveal both the current capability of the CAP code and its future improvements. For this purpose, simple but significant problems, which have exact analytical solutions, were selected and calculated for validation of fundamental phenomena. In this paper, some results of validation problems for the selected fundamental phenomena are summarized and briefly discussed.

  5. Analytical thermal model validation for Cassini radioisotope thermoelectric generator

    International Nuclear Information System (INIS)

    Lin, E.I.

    1997-01-01

    The Saturn-bound Cassini spacecraft is designed to rely, without precedent, on the waste heat from its three radioisotope thermoelectric generators (RTGs) to warm the propulsion module subsystem, and the RTG end dome temperature is a key determining factor in the amount of waste heat delivered. A previously validated SINDA thermal model of the RTG was the sole guide to understanding its complex thermal behavior, but displayed large discrepancies against some initial thermal development test data. A careful revalidation effort led to significant modifications and adjustments of the model, which resulted in a doubling of the radiative heat transfer from the heat source support assemblies to the end domes and brought the end dome and flange temperature predictions to within 2 °C of the pertinent test data. The increased inboard end dome temperature has a considerable impact on thermal control of the spacecraft central body. The validation process offers an example of physically-driven analytical model calibration with test data from not only an electrical simulator but also a nuclear-fueled flight unit, and has established the end dome temperatures of a flight RTG where no in-flight or ground-test data existed before

  6. Ethical leadership: meta-analytic evidence of criterion-related and incremental validity.

    Science.gov (United States)

    Ng, Thomas W H; Feldman, Daniel C

    2015-05-01

    This study examines the criterion-related and incremental validity of ethical leadership (EL) with meta-analytic data. Across 101 samples published over the last 15 years (N = 29,620), we observed that EL demonstrated acceptable criterion-related validity with variables that tap followers' job attitudes, job performance, and evaluations of their leaders. Further, followers' trust in the leader mediated the relationships of EL with job attitudes and performance. In terms of incremental validity, we found that EL significantly, albeit weakly in some cases, predicted task performance, citizenship behavior, and counterproductive work behavior-even after controlling for the effects of such variables as transformational leadership, use of contingent rewards, management by exception, interactional fairness, and destructive leadership. The article concludes with a discussion of ways to strengthen the incremental validity of EL. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  7. Use of reference materials for validating analytical methods. Applied to the determination of As, Co, Na, Hg, Se and Fe using neutron activation analysis

    International Nuclear Information System (INIS)

    Munoz, L; Andonie, O; Kohnenkamp, I

    2000-01-01

    The main purpose of an analytical laboratory is to provide reliable information on the nature and composition of the materials submitted for analysis. This purpose can only be attained if analytical methodologies that have the attributes of accuracy, precision, specificity, and sensitivity, among others, are used. The process by which these attributes are evaluated is called validation of the analytical method. The Chilean Nuclear Energy Commission's Neutron Activation Analysis Laboratory is applying a quality assurance program to ensure the quality of its analytical results, which aims, as well, to attain accreditation for some of its measurements. Validation of the analytical methodologies used is an essential part of applying this program. There are many forms of validation, from comparison with reference techniques to participation in inter-comparison rounds. Certified reference materials were used in this work in order to validate the application of neutron activation analysis in determining As, Co, Na, Hg, Se, and Fe in shellfish samples. The use of reference materials was chosen because it is a simple option that easily detects sources of systematic error. Neutron activation analysis is an instrumental analytical method that does not need chemical treatment and that is based on processes which take place in the nuclei of atoms, making matrix effects unimportant, so that different biological reference materials can be used. The following certified reference materials were used for validating the method: BCR human hair 397, NRCC dogfish muscle DORM-2, NRCC dogfish liver DOLT-2, NIST oyster tissue 1566, NIES mussel 6, and BCR tuna fish 464. The reference materials were analyzed using the procedure developed for the shellfish samples and the above-mentioned elements were determined. With the results obtained, the parameters of accuracy, precision, detection limit, quantification limit, and uncertainty associated with the method were determined for each

  8. Analytic results for the one loop NMHV H anti qqgg amplitude

    International Nuclear Information System (INIS)

    Badger, Simon; Campbell, John M.; Williams, Ciaran

    2009-01-01

    We compute the one-loop amplitude for a Higgs boson, a quark-antiquark pair and a pair of gluons of negative helicity, i.e. for the next-to-maximally helicity violating (NMHV) case, $A(H, 1^-_{\bar q}, 2^+_q, 3^-_g, 4^-_g)$. The calculation is performed using an effective Lagrangian which is valid in the limit of very large top quark mass. As a result of this paper all amplitudes for the transition of a Higgs boson into 4 partons are now known analytically at one-loop order. (orig.)
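
    The large-top-mass effective Lagrangian referred to here couples the Higgs field directly to gluons; at leading order it takes the standard form

      \mathcal{L}_{\text{eff}} = \frac{\alpha_s}{12\pi v}\, H \, G^{a}_{\mu\nu} G^{a\,\mu\nu},

    where H is the Higgs field, v its vacuum expectation value, and G^a_{\mu\nu} the gluon field strength tensor.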

  9. LATUX: An Iterative Workflow for Designing, Validating, and Deploying Learning Analytics Visualizations

    Science.gov (United States)

    Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew

    2015-01-01

    Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…

  10. Semi-physiologic model validation and bioequivalence trials simulation to select the best analyte for acetylsalicylic acid.

    Science.gov (United States)

    Cuesta-Gragera, Ana; Navarro-Fontestad, Carmen; Mangas-Sanjuan, Victor; González-Álvarez, Isabel; García-Arieta, Alfredo; Trocóniz, Iñaki F; Casabó, Vicente G; Bermejo, Marival

    2015-07-10

    The objective of this paper is to apply a previously developed semi-physiologic pharmacokinetic model implemented in NONMEM to simulate bioequivalence (BE) trials of acetylsalicylic acid (ASA), in order to validate the model performance against ASA human experimental data. ASA is a drug with first-pass hepatic and intestinal metabolism following Michaelis-Menten kinetics that leads to the formation of two main metabolites in two generations (first and second generation metabolites). The first aim was to adapt the semi-physiological model for ASA in NONMEM, using ASA pharmacokinetic parameters from the literature and reflecting its sequential metabolism. The second aim was to validate this model by comparing the results obtained in NONMEM simulations with published experimental data at a dose of 1000 mg. The validated model was used to simulate bioequivalence trials at 3 dose schemes (100, 1000 and 3000 mg) and with 6 test formulations with decreasing in vivo dissolution rate constants versus the reference formulation (kD 8-0.25 h⁻¹). Finally, the third aim was to determine which analyte (parent drug, first generation or second generation metabolite) was more sensitive to changes in formulation performance. The validation results showed that the concentration-time curves obtained with the simulations closely reproduced the published experimental data, confirming model performance. The parent drug (ASA) proved to be the analyte most sensitive to the decrease in pharmaceutical quality, with the highest decrease in the Cmax and AUC ratios between test and reference formulations. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Principles of Single-Laboratory Validation of Analytical Methods for Testing the Chemical Composition of Pesticides

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Underlying theoretical and practical approaches towards pesticide formulation analysis are discussed, i.e. general principles, performance characteristics, applicability of validation data, verification of method performance, and adaptation of validated methods by other laboratories. The principles of single laboratory validation of analytical methods for testing the chemical composition of pesticides are outlined. Also the theoretical background is described for performing pesticide formulation analysis as outlined in ISO, CIPAC/AOAC and IUPAC guidelines, including methodological characteristics such as specificity, selectivity, linearity, accuracy, trueness, precision and bias. Appendices I–III hereof give practical and elaborated examples on how to use the Horwitz approach and formulae for estimating the target standard deviation towards acceptable analytical repeatability. The estimation of trueness and the establishment of typical within-laboratory reproducibility are treated in greater detail by means of worked-out examples. (author)
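
    The Horwitz approach mentioned above predicts a target among-laboratory relative standard deviation from the analyte mass fraction C alone:

      \mathrm{RSD}_R(\%) = 2\, C^{-0.1505},

    so, for example, an active ingredient present at C = 0.5 (50% w/w) gives RSD_R ≈ 2 × 0.5^{-0.1505} ≈ 2.2%; within-laboratory repeatability targets are then commonly taken as a fraction (roughly one-half to two-thirds) of this reproducibility value.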

  12. Analytic results for the one loop NMHV H anti qqgg amplitude

    Energy Technology Data Exchange (ETDEWEB)

    Badger, Simon [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Campbell, John M. [Glasgow Univ. (United Kingdom). Dept. of Physics and Astronomy; Ellis, R. Keith [Fermilab, Batavia, IL (United States); Williams, Ciaran [Durham Univ. (United Kingdom). Dept. of Physics

    2009-10-23

    We compute the one-loop amplitude for a Higgs boson, a quark-antiquark pair and a pair of gluons of negative helicity, i.e. for the next-to-maximally helicity violating (NMHV) case, $A(H, 1^-_{\bar q}, 2^+_q, 3^-_g, 4^-_g)$. The calculation is performed using an effective Lagrangian which is valid in the limit of very large top quark mass. As a result of this paper all amplitudes for the transition of a Higgs boson into 4 partons are now known analytically at one-loop order. (orig.)

  13. Cryptography based on neural networks - analytical results

    International Nuclear Information System (INIS)

    Rosen-Zvi, Michal; Kanter, Ido; Kinzel, Wolfgang

    2002-01-01

    The mutual learning process between two parity feed-forward networks with discrete and continuous weights is studied analytically, and we find that the number of steps required to achieve full synchronization between the two networks in the case of discrete weights is finite. The synchronization process is shown to be non-self-averaging and the analytical solution is based on random auxiliary variables. The learning time of an attacker that is trying to imitate one of the networks is examined analytically and is found to be much longer than the synchronization time. Analytical results are found to be in agreement with simulations. (letter to the editor)

  14. Analytical Performance Modeling and Validation of Intel’s Xeon Phi Architecture

    Energy Technology Data Exchange (ETDEWEB)

    Chunduri, Sudheer; Balaprakash, Prasanna; Morozov, Vitali; Vishwanath, Venkatram; Kumaran, Kalyan

    2017-01-01

    Modeling the performance of scientific applications on emerging hardware plays a central role in achieving extreme-scale computing goals. Analytical models that capture the interaction between applications and hardware characteristics are attractive because even a reasonably accurate model can be useful for performance tuning before the hardware is made available. In this paper, we develop a hardware model for Intel’s second-generation Xeon Phi architecture code-named Knights Landing (KNL) for the SKOPE framework. We validate the KNL hardware model by projecting the performance of mini-benchmarks and application kernels. The results show that our KNL model can project the performance with prediction errors of 10% to 20%. The hardware model also provides informative recommendations for code transformations and tuning.

  15. Analytical validation of a novel multiplex test for detection of advanced adenoma and colorectal cancer in symptomatic patients.

    Science.gov (United States)

    Dillon, Roslyn; Croner, Lisa J; Bucci, John; Kairs, Stefanie N; You, Jia; Beasley, Sharon; Blimline, Mark; Carino, Rochele B; Chan, Vicky C; Cuevas, Danissa; Diggs, Jeff; Jennings, Megan; Levy, Jacob; Mina, Ginger; Yee, Alvin; Wilcox, Bruce

    2018-05-30

    Early detection of colorectal cancer (CRC) is key to reducing associated mortality. Despite the importance of early detection, approximately 40% of individuals in the United States between the ages of 50 and 75 have never been screened for CRC. The low compliance with colonoscopy and fecal-based screening may be addressed with a non-invasive alternative such as a blood-based test. We describe here the analytical validation of a multiplexed blood-based assay that measures the plasma concentrations of 15 proteins to assess advanced adenoma (AA) and CRC risk in symptomatic patients. The test was developed on an electrochemiluminescent immunoassay platform employing four multi-marker panels, to be implemented in the clinic as a laboratory developed test (LDT). Under the Clinical Laboratory Improvement Amendments (CLIA) and College of American Pathologists (CAP) regulations, a United States-based clinical laboratory utilizing an LDT must establish performance characteristics relating to analytical validity prior to releasing patient test results. This report describes a series of studies demonstrating the precision, accuracy, analytical sensitivity, and analytical specificity for each of the 15 assays, as required by CLIA/CAP. In addition, the report describes studies characterizing each of the assays' dynamic range, parallelism, tolerance to common interfering substances, spike recovery, and stability to sample freeze-thaw cycles. Upon completion of the analytical characterization, a clinical accuracy study was performed to evaluate concordance of AA and CRC classifier model calls using the analytical method intended for use in the clinic. Of 434 symptomatic patient samples tested, the percent agreement with original CRC and AA calls was 87% and 92% respectively. All studies followed CLSI guidelines and met the regulatory requirements for implementation of a new LDT. The results provide the analytical evidence to support the implementation of the novel multi-marker test as

  16. Short-Term Predictive Validity of Cluster Analytic and Dimensional Classification of Child Behavioral Adjustment in School

    Science.gov (United States)

    Kim, Sangwon; Kamphaus, Randy W.; Baker, Jean A.

    2006-01-01

    A constructive debate over the classification of child psychopathology can be stimulated by investigating the validity of different classification approaches. We examined and compared the short-term predictive validity of cluster analytic and dimensional classifications of child behavioral adjustment in school using the Behavior Assessment System…

  17. Wetting boundary condition for the color-gradient lattice Boltzmann method: Validation with analytical and experimental data

    Science.gov (United States)

    Akai, Takashi; Bijeljic, Branko; Blunt, Martin J.

    2018-06-01

    In the color gradient lattice Boltzmann model (CG-LBM), a fictitious-density wetting boundary condition has been widely used because of its ease of implementation. However, as we show, this may lead to inaccurate results in some cases. In this paper, a new scheme for the wetting boundary condition is proposed which can handle complicated 3D geometries. The validity of our method for static problems is demonstrated by comparing the simulated results to analytical solutions in 2D and 3D geometries with curved boundaries. Then, capillary rise simulations are performed to study dynamic problems where the three-phase contact line moves. The results are compared to experimental results in the literature (Heshmati and Piri, 2014). If a constant contact angle is assumed, the simulations agree with the analytical solution based on the Lucas-Washburn equation. However, to match the experiments, we need to implement a dynamic contact angle that varies with the flow rate.
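
    The Lucas-Washburn benchmark used for the capillary rise comparison gives the imbibition length l(t) in a tube of radius r as

      l(t) = \sqrt{ \frac{\gamma\, r \cos\theta}{2\mu}\, t },

    with γ the interfacial tension, θ the contact angle, and μ the dynamic viscosity (gravity and inertia neglected).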

  18. Analytical validation of a melanoma diagnostic gene signature using formalin-fixed paraffin-embedded melanocytic lesions.

    Science.gov (United States)

    Warf, M Bryan; Flake, Darl D; Adams, Doug; Gutin, Alexander; Kolquist, Kathryn A; Wenstrup, Richard J; Roa, Benjamin B

    2015-01-01

    These studies were conducted to validate the analytical performance of a gene expression signature that differentiates melanoma from nevi, using RNA expression from 14 signature genes and nine normalization genes to generate a melanoma diagnostic score (MDS). Formalin-fixed paraffin-embedded melanocytic lesions were evaluated in these studies. The overall SD of the assay was determined to be 0.69 MDS units. Individual amplicons within the signature had an average amplification efficiency of 92% and a SD less than 0.5 CT. The MDS was reproducible across a 2000-fold dilution range of input RNA. Melanin, an inhibitor of PCR, does not interfere with the signature. These studies indicate this signature is robust and reproducible and is analytically validated on formalin-fixed paraffin-embedded melanocytic lesions.

  19. An analytic solution for numerical modeling validation in electromagnetics: the resistive sphere

    Science.gov (United States)

    Swidinsky, Andrei; Liu, Lifei

    2017-11-01

    We derive the electromagnetic response of a resistive sphere to an electric dipole source buried in a conductive whole space. The solution consists of an infinite series of spherical Bessel functions and associated Legendre polynomials, and follows the well-studied problem of a conductive sphere buried in a resistive whole space in the presence of a magnetic dipole. Our result is particularly useful for controlled-source electromagnetic problems using a grounded electric dipole transmitter and can be used to check numerical methods of calculating the response of resistive targets (such as finite difference, finite volume, finite element and integral equation). While we elect to focus on the resistive sphere in our examples, the expressions in this paper are completely general and allow for arbitrary source frequency, sphere radius, transmitter position, receiver position and sphere/host conductivity contrast so that conductive target responses can also be checked. Commonly used mesh validation techniques consist of comparisons against other numerical codes, but such solutions may not always be reliable or readily available. Alternatively, the response of simple 1-D models can be tested against well-known whole space, half-space and layered earth solutions, but such an approach is inadequate for validating models with curved surfaces. We demonstrate that our theoretical results can be used as a complementary validation tool by comparing analytic electric fields to those calculated through a finite-element analysis; the software implementation of this infinite series solution is made available for direct and immediate application.

  20. Analytical techniques and method validation for the measurement of selected semivolatile and nonvolatile organofluorochemicals in air.

    Science.gov (United States)

    Reagen, William K; Lindstrom, Kent R; Thompson, Kathy L; Flaherty, John M

    2004-09-01

    The widespread use of semi- and nonvolatile organofluorochemicals in industrial facilities, concern about their persistence, and relatively recent advancements in liquid chromatography/mass spectrometry (LC/MS) technology have led to the development of new analytical methods to assess potential worker exposure to airborne organofluorochemicals. Techniques were evaluated for the determination of 19 organofluorochemicals and for total fluorine in ambient air samples. Due to the potential biphasic nature of most of these fluorochemicals when airborne, Occupational Safety and Health Administration (OSHA) versatile sampler (OVS) tubes were used to simultaneously trap fluorochemical particulates and vapors from workplace air. Analytical methods were developed for OVS air samples to quantitatively analyze for total fluorine using oxygen bomb combustion/ion selective electrode and for 17 organofluorochemicals using LC/MS and gas chromatography/mass spectrometry (GC/MS). The experimental design for this validation was based on the National Institute for Occupational Safety and Health (NIOSH) Guidelines for Air Sampling and Analytical Method Development and Evaluation, with some revisions of the experimental design. The study design incorporated experiments to determine analytical recovery and stability, sampler capacity, the effect of some environmental parameters on recoveries, storage stability, limits of detection, precision, and accuracy. Fluorochemical mixtures were spiked onto each OVS tube over a range of 0.06-6 μg for each of 12 compounds analyzed by LC/MS and 0.3-30 μg for 5 compounds analyzed by GC/MS. These ranges allowed reliable quantitation at 0.001-0.1 mg/m³ in general for LC/MS analytes and 0.005-0.5 mg/m³ for GC/MS analytes when 60 L of air are sampled. The organofluorochemical exposure guideline (EG) is currently 0.1 mg/m³ for many analytes, with one exception being ammonium perfluorooctanoate (EG is 0.01 mg/m³). Total fluorine results may be used
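
    The link between the spiking range and the quoted air concentrations is a unit conversion over the sampled air volume. For example, at the low end of the LC/MS range:

      C = m / V = 0.06\ \mu\mathrm{g} / 60\ \mathrm{L} = 0.001\ \mu\mathrm{g/L} = 0.001\ \mathrm{mg/m^3},

    since 1 μg/L = 1 mg/m³; the top of the range, 6 μg over 60 L, likewise corresponds to 0.1 mg/m³.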

  1. Analytic nuclear scattering theories

    International Nuclear Information System (INIS)

    Di Marzio, F.; University of Melbourne, Parkville, VIC

    1999-01-01

    A wide range of nuclear reactions is examined in an analytical version of the usual distorted wave Born approximation. This new approach provides either semi-analytic or fully analytic descriptions of the nuclear scattering processes. The resulting computational simplifications, when used within the limits of validity, allow very detailed tests of both nuclear interaction models and large-basis models of nuclear structure to be performed

  2. Fission product release from nuclear fuel II. Validation of ASTEC/ELSA on analytical and large scale experiments

    International Nuclear Information System (INIS)

    Brillant, G.; Marchetto, C.; Plumecocq, W.

    2013-01-01

    Highlights: • A wide range of experiments is presented for the ASTEC/ELSA code validation. • Analytical tests such as AECL, ORNL and VERCORS are considered. • A large-scale experiment, PHEBUS FPT1, is considered. • The good agreement with measurements shows the efficiency of the ASTEC modelling. • Improvements concern the FP release modelling from MOX and high burn-up UO2 fuels. - Abstract: This article is the second of two articles dedicated to the mechanisms of fission product release from a degraded core. The models of fission product release from nuclear fuel in the ASTEC code have been described in detail in the first part of this work (Brillant et al., this issue). In this contribution, the validation of ELSA, the module of ASTEC that deals with fission product and structural material release from a degraded core, is presented. A large range of experimental tests, with various temperatures and conditions for the fuel-surrounding atmosphere (oxidising and reducing), is thus simulated with the ASTEC code. The validation database includes several analytical experiments with both bare fuel (e.g. MCE1 experiments) and cladded fuel (e.g. HCE3, VERCORS). Furthermore, the PHEBUS large-scale experiments are used for the validation of ASTEC. The rather satisfactory comparison between ELSA calculations and experimental measurements demonstrates the efficiency of the analytical models to describe fission product release in severe accident conditions

  3. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A.

    2008-12-17

    This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  4. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A

    2009-05-27

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  5. Computer-aided test selection and result validation-opportunities and pitfalls

    DEFF Research Database (Denmark)

    McNair, P; Brender, J; Talmon, J

    1998-01-01

    Dynamic test scheduling is concerned with pre-analytical preprocessing of the individual samples within a clinical laboratory production by means of decision algorithms. The purpose of such scheduling is to provide maximal information with minimal data production (to avoid data pollution and/or to increase cost-efficiency). Our experience shows that there is a practical limit to the extent of exploitation of the principle of dynamic test scheduling, unless it is automated in one way or the other. This paper analyses some issues of concern related to the profession of clinical biochemistry, when implementing such dynamic test scheduling within a Laboratory Information System (and/or an advanced analytical workstation). The challenge is related to 1) generation of appropriately validated decision models, and 2) mastering consequences of analytical imprecision and bias.

  6. Interacting steps with finite-range interactions: Analytical approximation and numerical results

    Science.gov (United States)

    Jaramillo, Diego Felipe; Téllez, Gabriel; González, Diego Luis; Einstein, T. L.

    2013-05-01

    We calculate an analytical expression for the terrace-width distribution P(s) for an interacting step system with nearest- and next-nearest-neighbor interactions. Our model is derived by mapping the step system onto a statistically equivalent one-dimensional system of classical particles. The validity of the model is tested with several numerical simulations and experimental results. We explore the effect of the range of interactions q on the functional form of the terrace-width distribution and pair correlation functions. For physically plausible interactions, we find modest changes when next-nearest neighbor interactions are included and generally negligible changes when more distant interactions are allowed. We discuss methods for extracting from simulated experimental data the characteristic scale-setting terms in assumed potential forms.
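
    For orientation, a common baseline form in this literature (not the specific finite-range model of the paper) is the generalized Wigner surmise, P(s) = a·s^ϱ·exp(-b·s²), with b fixed so that ⟨s⟩ = 1 and a fixed by normalization; a short sketch:

```python
# Generalized Wigner surmise often used as a baseline for terrace-width
# distributions: P(s) = a * s**rho * exp(-b * s**2), with b chosen so that
# <s> = 1 and a chosen so that P integrates to 1.
import numpy as np
from scipy.special import gamma

def generalized_wigner(s, rho):
    b = (gamma((rho + 2) / 2) / gamma((rho + 1) / 2)) ** 2  # enforces <s> = 1
    a = 2 * b ** ((rho + 1) / 2) / gamma((rho + 1) / 2)     # normalization
    return a * s ** rho * np.exp(-b * s ** 2)

s = np.linspace(1e-6, 4.0, 4000)
ds = s[1] - s[0]
for rho in (2.0, 4.0):  # rho grows with the strength of step-step repulsion
    P = generalized_wigner(s, rho)
    print(f"rho={rho}: integral={P.sum() * ds:.4f}, <s>={(s * P).sum() * ds:.4f}")
```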

  7. Development, validation and evaluation of an analytical method for the determination of monomeric and oligomeric procyanidins in apple extracts.

    Science.gov (United States)

    Hollands, Wendy J; Voorspoels, Stefan; Jacobs, Griet; Aaby, Kjersti; Meisland, Ane; Garcia-Villalba, Rocio; Tomas-Barberan, Francisco; Piskula, Mariusz K; Mawson, Deborah; Vovk, Irena; Needs, Paul W; Kroon, Paul A

    2017-04-28

    There is a lack of data for individual oligomeric procyanidins in apples and apple extracts. Our aim was to develop, validate and evaluate an analytical method for the separation, identification and quantification of monomeric and oligomeric flavanols in apple extracts. To achieve this, we prepared two types of flavanol extracts from freeze-dried apples; one was an epicatechin-rich extract containing ∼30% (w/w) monomeric (-)-epicatechin which also contained oligomeric procyanidins (Extract A), the second was an oligomeric procyanidin-rich extract depleted of epicatechin (Extract B). The parameters considered for method optimisation were HPLC columns and conditions, sample heating, mass of extract and dilution volumes. The performance characteristics considered for method validation included standard linearity, method sensitivity, precision and trueness. Eight laboratories participated in the method evaluation. Chromatographic separation of the analytes was best achieved utilizing a HILIC column with a binary mobile phase consisting of acidic acetonitrile and acidic aqueous methanol. The final method showed linearity for epicatechin in the range 5-100 μg/mL with a correlation coefficient >0.999. Intra-day and inter-day precision of the analytes ranged from 2 to 6% and 2 to 13%, respectively. Up to a degree of polymerization (dp) of 3, trueness of the method was >95% but decreased with increasing dp. Within-laboratory precision showed RSD values <5 and 10% for monomers and oligomers, respectively. Between-laboratory precision was 4 and 15% (Extract A) and 7 and 30% (Extract B) for monomers and oligomers, respectively. An analytical method for the separation, identification and quantification of procyanidins in an apple extract was developed, validated and assessed. The results of the inter-laboratory evaluation indicate that the method is reliable and reproducible. Copyright © 2017. Published by Elsevier B.V.

  8. Validation of analytical methods for the stability studies of naproxen suppositories for infant and adult use

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar

    2011-01-01

    Analytical and validation studies were performed in this paper, with a view to using them in the stability studies of future formulations of naproxen suppositories for children and adults. The most influential factors in naproxen stability were determined: the major degradation occurred in acid medium, in oxidative medium and by the action of light. A high-performance liquid chromatography-based method was evaluated, which proved to be adequate to quantify naproxen in suppositories and was selective against degradation products. The quantification limit was 3.480 μg, so it was valid for these studies. Additionally, the parameters specificity for stability, detection limit and quantification limit were evaluated for the direct semi-aqueous acid-base method, which was formerly validated for quality control and showed satisfactory results. Nevertheless, the volumetric methods were not regarded as stability indicators; therefore, this method will be used along with the chromatographic methods of choice, that is, thin-layer chromatography and high-performance liquid chromatography, to determine the degradation products

  9. Validation of an analytical methodology for the quantitative analysis of petroleum hydrocarbons in marine sediment samples

    Directory of Open Access Journals (Sweden)

    Eloy Yordad Companioni Damas

    2009-01-01

    This work describes the validation of an analytical procedure for the analysis of petroleum hydrocarbons in marine sediment samples. The proposed protocol is able to measure n-alkanes and polycyclic aromatic hydrocarbons (PAH) in samples at concentrations as low as 30 ng/g, with a precision better than 15% for most analytes. The extraction efficiency of fortified sediments varied from 65.1 to 105.6% and 59.7 to 97.8% for n-alkanes and PAH in the ranges C16-C32 and fluoranthene-benzo(a)pyrene, respectively. The analytical protocol was applied to determine petroleum hydrocarbons in sediments collected from a marine coastal zone.

  10. Optimization of instrumental neutron activation analysis method by means of 2^k experimental design technique aiming at the validation of analytical procedures

    International Nuclear Information System (INIS)

    Petroni, Robson; Moreira, Edson G.

    2013-01-01

    In this study optimization of procedures and standardization of Instrumental Neutron Activation Analysis (INAA) methods were carried out for the determination of the elements arsenic, chromium, cobalt, iron, rubidium, scandium, selenium and zinc in biological materials. The aim is to validate the analytical methods for future accreditation at the National Institute of Metrology, Quality and Technology (INMETRO). The 2^k experimental design was applied for evaluation of the individual contribution of selected variables of the analytical procedure to the final mass fraction result. Samples of Mussel Tissue Certified Reference Material and multi-element standards were analyzed considering the following variables: sample decay time, counting time and sample distance to detector. The standard multi-element concentration (comparator standard), mass of the sample and irradiation time were maintained constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted for the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN - CNEN/SP). Optimized conditions were estimated based on the results of z-score tests, main effects and interaction effects. The results obtained with the different experimental configurations were evaluated for accuracy (precision and trueness) for each measurement. (author)
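
    As a concrete illustration of the 2^k approach described here, the sketch below builds the 2^3 = 8-run design for the three screened variables and estimates main effects; the response values are simulated stand-ins, not measured mass fractions:

```python
# Minimal 2^k full factorial sketch for the three variables named above.
# The responses are simulated stand-ins, not measured mass fractions.
from itertools import product
import numpy as np

factors = ["decay_time", "counting_time", "detector_distance"]
design = np.array(list(product((-1, +1), repeat=len(factors))))  # 2^3 = 8 runs

rng = np.random.default_rng(1)
# Toy response: decay time helps, detector distance hurts, plus noise.
response = 100 + 2.0 * design[:, 0] - 0.5 * design[:, 2] + rng.normal(0, 0.3, 8)

for j, name in enumerate(factors):
    effect = (response[design[:, j] == +1].mean()
              - response[design[:, j] == -1].mean())
    print(f"main effect of {name}: {effect:+.2f}")
```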

  11. Adequacy and validation of an analytical method for the quantification of lead in chamomile tisanes produced in Costa Rica

    International Nuclear Information System (INIS)

    Blanco Barrantes, Jeimy

    2014-01-01

    An analytical methodology was developed and validated to quantify lead in chamomile tisanes. Lead was quantified by flame atomic absorption spectroscopy in three brands of chamomile tisanes sold in Costa Rica to determine their safety and quality against international standards. A sample preparation method was established through a comparison of different extraction procedures. Acid digestion was the extraction method adopted, reaching an average recovery of 97.1% with a standard deviation of 2.3%. The chosen analytical procedure was then optimized and fully validated. The validation results showed that the best calibration curve, in terms of the correlation coefficient and an intercept statistically equal to zero, was obtained in the range 0.2-3.2 μg/mL (r² = 0.9996), corresponding to 20% to 320% of the maximum allowed limit. In addition, the procedure was adequate in terms of accuracy (average recovery 101.1%), precision under repeatability and intermediate precision conditions (RSD max. 9.3%), and limit of quantification (0.2551 μg/mL). The World Health Organization (WHO) safety criterion for lead concentration was applied to the analyzed products. None of the 9 analyzed samples of products for preparing chamomile tisanes showed lead concentrations above the limit of 10 μg/g suggested for medicinal herbs by the WHO

  12. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals.

    Science.gov (United States)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György

    2018-01-01

    Background Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimation of the analytical performance specifications for the quality that would be required to allow sharing common reference intervals is needed. The International Federation of Clinical Chemistry (IFCC) recommended a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which could then be used for defining analytical performance specifications as the maximum combination of analytical bias and imprecision required for sharing common reference intervals, the aim of this investigation. Methods Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision and Method 2 is based on the Microsoft Excel formula NORMINV including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results Method 2 gives the correct results with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion The Microsoft Excel formula NORMINV is useful for the estimation of analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
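
    A minimal sketch of the kind of calculation Method 2 automates with NORMINV: for a standard Gaussian reference distribution, compute the fraction of results falling outside the original 2.5th/97.5th percentile limits once a normalized analytical bias b and added imprecision s_a are introduced. The combined-spread formula is one plausible reading of such a combination model, not a reproduction of the paper's exact criterion:

```python
# Fraction of reference results outside the original limits for a given
# combination of normalized analytical bias b and added imprecision s_a.
# This mirrors the NORMINV-style calculation described above; the combined
# spread sqrt(1 + s_a**2) is an assumption of this sketch.
from scipy.stats import norm

def fraction_outside(b: float, s_a: float) -> float:
    lo, hi = norm.ppf(0.025), norm.ppf(0.975)  # original reference limits
    spread = (1 + s_a ** 2) ** 0.5             # biological + analytical SD
    inside = norm.cdf(hi, loc=b, scale=spread) - norm.cdf(lo, loc=b, scale=spread)
    return 1 - inside

for b, s_a in [(0.0, 0.0), (0.25, 0.0), (0.0, 0.25), (0.2, 0.2)]:
    print(f"bias={b:.2f}, imprecision={s_a:.2f} -> "
          f"{100 * fraction_outside(b, s_a):.2f}% outside")
```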

  13. Validation of the analytical method for sodium dichloroisocyanurate aimed at drinking water disinfection

    International Nuclear Information System (INIS)

    Martinez Alvarez, Luis Octavio; Alejo Cisneros, Pedro; Garcia Pereira, Reynaldo; Campos Valdez, Doraily

    2014-01-01

    Cuba has developed the first effervescent 3.5 mg sodium dichloroisocyanurate tablets as a non-therapeutic active principle. This ingredient releases a certain amount of chlorine when dissolved in a litre of water, providing adequate disinfection of drinking water, which is ready to drink after 30 min. The aim was to develop and validate an analytical iodometric method applicable to the quality control of effervescent 3.5 mg sodium dichloroisocyanurate tablets

  14. Analytical validation of an ultra low-cost mobile phone microplate reader for infectious disease testing.

    Science.gov (United States)

    Wang, Li-Ju; Naudé, Nicole; Demissie, Misganaw; Crivaro, Anne; Kamoun, Malek; Wang, Ping; Li, Lei

    2018-07-01

    Most mobile health (mHealth) diagnostic devices for laboratory tests only analyze one sample at a time, which is not suitable for large-volume serology testing, especially in low-resource settings with a shortage of health professionals. In this study, we developed an ultra-low-cost, clinically-accurate mobile phone microplate reader (mReader), and clinically validated this optical device for 12 infectious disease tests. The mReader optically reads 96 samples on a microplate at one time. 771 de-identified patient samples were tested for 12 serology assays for bacterial/viral infections. The mReader and the clinical instrument blindly read and analyzed all tests in parallel. The analytical accuracy and the diagnostic performance of the mReader were evaluated across the clinical reportable categories by comparison with clinical laboratory testing results. The mReader exhibited 97.59-99.90% analytical accuracy. We envision that the mReader can benefit underserved areas/populations and low-resource settings in rural clinics/hospitals at a low cost (~$50 USD) with clinical-level analytical quality. It has the potential to improve health access, speed up healthcare delivery, and reduce health disparities and education disparities by providing access to a low-cost spectrophotometer. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Validation of multivariate classification methods using analytical fingerprints – concept and case study on organic feed for laying hens

    NARCIS (Netherlands)

    Alewijn, Martin; Voet, van der Hilko; Ruth, van Saskia

    2016-01-01

    Multivariate classification methods based on analytical fingerprints have found many applications in the food and feed area, but practical applications are still scarce due to a lack of a generally accepted validation procedure. This paper proposes a new approach for validation of this type of method.

  16. Optimization of instrumental neutron activation analysis method by means of 2^k experimental design technique aiming at the validation of analytical procedures

    Energy Technology Data Exchange (ETDEWEB)

    Petroni, Robson; Moreira, Edson G., E-mail: rpetroni@ipen.br, E-mail: emoreira@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

    In this study optimization of procedures and standardization of Instrumental Neutron Activation Analysis (INAA) methods were carried out for the determination of the elements arsenic, chromium, cobalt, iron, rubidium, scandium, selenium and zinc in biological materials. The aim is to validate the analytical methods for future accreditation at the National Institute of Metrology, Quality and Technology (INMETRO). The 2^k experimental design was applied for evaluation of the individual contribution of selected variables of the analytical procedure to the final mass fraction result. Samples of Mussel Tissue Certified Reference Material and multi-element standards were analyzed considering the following variables: sample decay time, counting time and sample distance to detector. The standard multi-element concentration (comparator standard), mass of the sample and irradiation time were maintained constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted for the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN - CNEN/SP). Optimized conditions were estimated based on the results of z-score tests, main effects and interaction effects. The results obtained with the different experimental configurations were evaluated for accuracy (precision and trueness) for each measurement. (author)

  17. Analytical validation of operator actions in case of primary to secondary leakage for VVER-1000/V320

    Energy Technology Data Exchange (ETDEWEB)

    Andreeva, M., E-mail: m_andreeva@inrne.bas.bg; Groudev, P., E-mail: pavlinpg@inrne.bas.bg; Pavlova, M., E-mail: pavlova@inrne.bas.bg

    2015-12-15

    Highlights: • We validate operator actions in case of primary to secondary leakage. • We perform four scenarios related to the SGTR accident for VVER-1000/V320. • The reference power plant for the analyses is Unit 6 at Kozloduy NPP. • The RELAP5/MOD 3.2 computer code is used in performing the analyses. • The analyses confirm the effectiveness of operator actions during PRISE. - Abstract: This paper presents the results of analytical validation of operator actions in case of “Steam Generator Tube Rupture” (SGTR) for VVER-1000/V320 units at Kozloduy Nuclear Power Plant (KNPP), carried out during the development of Symptom Based Emergency Operating Procedures (SB EOPs) for this plant. The purpose of the analyses is to demonstrate the ability to terminate primary to secondary leakage and to indicate an effective strategy for preventing secondary leakage to the environment, thereby preventing radiological release to the environment. Following depressurization and cooldown of the reactor coolant system (RCS) with isolation of the affected steam generator (SG), these analyses validate options for post-SGTR cooldown by: • backup filling of the ruptured SG; • using the letdown system in the affected SG; and • opening the Fast Acting Isolation Valve (FAIV) and using the Steam Dump Facility to the Condenser (BRU-K). The results of the thermal-hydraulic analyses have been used to assist KNPP specialists in the analytical validation of EOPs. The RELAP5/MOD3.2 computer code has been used for the analyses with a VVER-1000 Nuclear Power Plant (NPP) model. A model of VVER-1000 based on Unit 6 of Kozloduy NPP has been developed for the thermal-hydraulics code RELAP5/MOD3.2 at the Institute for Nuclear Research and Nuclear Energy - Bulgarian Academy of Sciences (INRNE-BAS). This paper was made possible through the participation of leading specialists from KNPP.

  18. Analytical validation of the PAM50-based Prosigna Breast Cancer Prognostic Gene Signature Assay and nCounter Analysis System using formalin-fixed paraffin-embedded breast tumor specimens

    International Nuclear Information System (INIS)

    Nielsen, Torsten; Storhoff, James; Wallden, Brett; Schaper, Carl; Ferree, Sean; Liu, Shuzhen; Gao, Dongxia; Barry, Garrett; Dowidar, Naeem; Maysuria, Malini

    2014-01-01

    NanoString’s Prosigna™ Breast Cancer Prognostic Gene Signature Assay is based on the PAM50 gene expression signature. The test outputs a risk of recurrence (ROR) score, risk category, and intrinsic subtype (Luminal A/B, HER2-enriched, Basal-like). The studies described here were designed to validate the analytical performance of the test on the nCounter Analysis System across multiple laboratories. Analytical precision was measured by testing five breast tumor RNA samples across 3 sites. Reproducibility was measured by testing replicate tissue sections from 43 FFPE breast tumor blocks across 3 sites following independent pathology review at each site. The RNA input range was validated by comparing assay results at the extremes of the specified range to the nominal RNA input level. Interference was evaluated by including non-tumor tissue into the test. The measured standard deviation (SD) was less than 1 ROR unit within the analytical precision study and the measured total SD was 2.9 ROR units within the reproducibility study. The ROR scores for RNA inputs at the extremes of the range were the same as those at the nominal input level. Assay results were stable in the presence of moderate amounts of surrounding non-tumor tissue (<70% by area). The analytical performance of NanoString’s Prosigna assay has been validated using FFPE breast tumor specimens across multiple clinical testing laboratories

  19. Analytical Validation of a New Enzymatic and Automatable Method for d-Xylose Measurement in Human Urine Samples

    Directory of Open Access Journals (Sweden)

    Israel Sánchez-Moreno

    2017-01-01

    Hypolactasia, or intestinal lactase deficiency, affects more than half of the world population. Currently, xylose quantification in urine after gaxilose oral administration for the noninvasive diagnosis of hypolactasia is performed with the hand-operated, nonautomatable phloroglucinol reaction. This work demonstrates that a new enzymatic xylose quantification method, based on the activity of xylose dehydrogenase from Caulobacter crescentus, represents an excellent alternative to the manual phloroglucinol reaction. The new method is automatable and facilitates the use of the gaxilose test for hypolactasia diagnosis in clinical practice. The analytical validation of the new technique was performed in three different autoanalyzers, using buffer or urine samples spiked with different xylose concentrations. For the comparison between the phloroglucinol and the enzymatic assays, 224 urine samples of patients to whom the gaxilose test had been prescribed were assayed by both methods. A mean bias of −16.08 mg of xylose was observed when comparing the results obtained by both techniques. After adjusting the cut-off of the enzymatic method to 19.18 mg of xylose, the Kappa coefficient was found to be 0.9531, indicating an excellent level of agreement between both analytical procedures. This new assay represents the first automatable enzymatic technique validated for xylose quantification in urine.

  20. Compact tokamak reactors. Part 1 (analytic results)

    International Nuclear Information System (INIS)

    Wootton, A.J.; Wiley, J.C.; Edmonds, P.H.; Ross, D.W.

    1996-01-01

    We discuss the possible use of tokamaks for thermonuclear power plants, in particular tokamaks with low aspect ratio and copper toroidal field coils. Three approaches are presented. First, we review and summarize the existing literature. Second, using simple analytic estimates, the size of the smallest tokamak to produce an ignited plasma is derived. This steady state energy balance analysis is then extended to determine the smallest tokamak power plant, by including the power required to drive the toroidal field, and considering two extremes of plasma current drive efficiency. The analytic results will be augmented by a numerical calculation permitting arbitrary plasma current drive efficiency, the results of which will be presented in Part II. Third, a scaling from any given reference reactor design to a copper toroidal field coil device is discussed. Throughout the paper the importance of various restrictions is emphasized, in particular plasma current drive efficiency, plasma confinement, plasma safety factor, plasma elongation, plasma beta, neutron wall loading, blanket availability and recirculating electric power. We conclude that the latest published reactor studies, which show little advantage in using low aspect ratio unless remarkably high efficiency plasma current drive and low safety factor are combined, can be reproduced with the analytic model

  1. Median of patient results as a tool for assessment of analytical stability

    DEFF Research Database (Denmark)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft

    2015-01-01

    BACKGROUND: In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. METHOD: Patient data from daily internal control schemes was used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. RESULTS: Seventy-five percent of the twenty analytes achieved on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. DISCUSSION: Patient results applied in analytical quality performance control procedures are the most reliable sources of material as they represent the genuine substance of the measurements.

  2. Analytical method validation for quality control and the study of the 50 mg Propylthiouracil stability

    International Nuclear Information System (INIS)

    Valdes Bendoyro, Maria Olga; Garcia Penna, Caridad Margarita; Fernandez, Juan Lugones; Garcia Borges, Lisandra; Martinez Espinosa, Vivian

    2010-01-01

    A high-performance liquid chromatography analytical method was developed and validated for the quality control and stability studies of 50 mg propylthiouracil tablets. The method is based on separation of the active principle on a LiChrospher 100 RP-18 (5 μm, 250 x 4 mm) column with UV detection at 272 nm, using a mobile phase composed of a degassed mixture of 0.025 M monobasic potassium phosphate buffer solution at pH 4.6 and acetonitrile in an 80:20 ratio at a flow rate of 0.5 mL/min. The analytical method was linear, precise, specific and exact over the studied concentration interval

  3. Analytical results for Abelian projection

    International Nuclear Information System (INIS)

    Ogilivie, Michael C.

    1999-01-01

    Analytic methods for Abelian projection are developed, and a number of results related to string tension measurements are obtained. It is proven that even without gauge fixing, Abelian projection yields string tensions of the underlying non-Abelian theory. Strong arguments are given for similar results in the case where gauge fixing is employed. The subgroup used for projection need only contain the center of the gauge group, and need not be Abelian. While gauge fixing is shown to be in principle unnecessary for the success of Abelian projection, it is computationally advantageous for the same reasons that improved operators, e.g., the use of fat links, are advantageous in Wilson loop measurements

  4. Numerical simulation and experimental validation of the three-dimensional flow field and relative analyte concentration distribution in an atmospheric pressure ion source.

    Science.gov (United States)

    Poehler, Thorsten; Kunte, Robert; Hoenen, Herwart; Jeschke, Peter; Wissdorf, Walter; Brockmann, Klaus J; Benter, Thorsten

    2011-11-01

    In this study, the validation and analysis of steady state numerical simulations of the gas flows within a multi-purpose ion source (MPIS) are presented. The experimental results were obtained with particle image velocimetry (PIV) measurements in a non-scaled MPIS. Two-dimensional time-averaged velocity and turbulent kinetic energy distributions are presented for two dry gas volume flow rates. The numerical results of the validation simulations are in very good agreement with the experimental data. All significant flow features have been correctly predicted within the accuracy of the experiments. For technical reasons, the experiments were conducted at room temperature. Thus, numerical simulations of ionization conditions at two operating points of the MPIS are also presented. It is clearly shown that the dry gas volume flow rate has the most significant impact on the overall flow pattern within the APLI source; far less critical is the (larger) nebulization gas flow. In addition to the approximate solution of the Reynolds-averaged Navier-Stokes equations, a transport equation for the relative analyte concentration has been solved. The results yield information on the three-dimensional analyte distribution within the source. It becomes evident that for ion transport into the MS ion transfer capillary, electromagnetic forces are at least as important as fluid dynamic forces. However, only the fluid dynamics determines the three-dimensional distribution of analyte gas. Thus, local flow phenomena in close proximity to the spray shield strongly impact the ionization efficiency.

  5. SU-E-T-479: Development and Validation of Analytical Models Predicting Secondary Neutron Radiation in Proton Therapy Applications

    International Nuclear Information System (INIS)

    Farah, J; Bonfrate, A; Donadille, L; Martinetti, F; Trompier, F; Clairand, I; De Olivera, A; Delacroix, S; Herault, J; Piau, S; Vabre, I

    2014-01-01

    Purpose: Test and validation of analytical models predicting leakage neutron exposure in passively scattered proton therapy. Methods: Taking inspiration from the literature, this work attempts to build an analytical model predicting neutron ambient dose equivalents, H*(10), within the local 75 MeV ocular proton therapy facility. MC simulations were first used to model H*(10) in the beam axis plane while considering a closed final collimator and pristine Bragg peak delivery. Next, the MC-based analytical model was tested against simulation results and experimental measurements. The model was also expanded in the vertical direction to enable a full 3D mapping of H*(10) inside the treatment room. Finally, the work focused on upgrading the literature model to clinically relevant configurations considering modulated beams, open collimators, patient-induced neutron fluctuations, etc. Results: The MC-based analytical model efficiently reproduced simulated H*(10) values with a maximum difference below 10%. In addition, it succeeded in predicting measured H*(10) values with differences <40%. The highest differences were registered at the closest and farthest positions from isocenter, where the analytical model failed to faithfully reproduce the high neutron fluence and energy variations. The differences remain acceptable, however, taking into account the high measurement/simulation uncertainties and the end use of this model, i.e., radiation protection. Moreover, the model was successfully (differences <20% on simulations and <45% on measurements) extended to predict neutrons in the vertical direction with respect to the beam line, as patients are in the upright seated position during ocular treatments. Accounting for the impact of beam modulation, collimation and the presence of a patient in the beam path is far more challenging, and conversion coefficients are currently being defined to predict stray neutrons in clinically representative treatment configurations. Conclusion

  6. Application of analytical quality by design principles for the determination of alkyl p-toluenesulfonates impurities in Aprepitant by HPLC. Validation using total-error concept.

    Science.gov (United States)

    Zacharis, Constantinos K; Vastardi, Elli

    2018-02-20

    In the research presented we report the development of a simple and robust liquid chromatographic method for the quantification of two genotoxic alkyl sulphonate impurities (namely methyl p-toluenesulfonate and isopropyl p-toluenesulfonate) in Aprepitant API substances using the Analytical Quality by Design (AQbD) approach. Following the steps of the AQbD protocol, the selected critical method attributes (CMAs) were the separation criteria between the critical peak pairs, the analysis time and the peak efficiencies of the analytes. The critical method parameters (CMPs) included the flow rate, the gradient slope and the acetonitrile content at the first step of the gradient elution program. Multivariate experimental designs, namely Plackett-Burman and Box-Behnken designs, were conducted sequentially for factor screening and optimization of the method parameters. The optimal separation conditions were estimated using the desirability function. The method was fully validated in the range of 10-200% of the target concentration limit of the analytes using the "total error" approach. Accuracy profiles - a graphical decision-making tool - were constructed using the results of the validation procedures. The β-expectation tolerance intervals did not exceed the acceptance criteria of ±10%, meaning that 95% of future results will be included in the defined bias limits. The relative bias ranged between -1.3% and 3.8% for both analytes, while the RSD values for repeatability and intermediate precision were less than 1.9% in all cases. The achieved limit of detection (LOD) and limit of quantification (LOQ) were adequate for the specific purpose and found to be 0.02% (corresponding to 48 μg g⁻¹ in sample) for both methyl and isopropyl p-toluenesulfonate. As proof-of-concept, the validated method was successfully applied in the analysis of several Aprepitant batches, indicating that this methodology could be used for routine quality control analyses. Copyright © 2017 Elsevier B.V.
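
    The desirability-function step mentioned above can be illustrated with a Derringer-style sketch: each critical method attribute is mapped onto [0, 1] and the overall desirability is the geometric mean. The attribute names, targets and ranges below are illustrative assumptions, not the published settings:

```python
# Derringer-type desirability scoring: map each response onto [0, 1] and
# combine with a geometric mean. Targets and ranges here are illustrative.
import numpy as np

def d_larger_is_better(y, low, high):
    """0 below `low`, 1 above `high`, linear in between."""
    return np.clip((y - low) / (high - low), 0.0, 1.0)

def d_smaller_is_better(y, low, high):
    return np.clip((high - y) / (high - low), 0.0, 1.0)

# Example candidate condition: resolutions of two critical pairs + run time.
resolution_1, resolution_2, run_time_min = 2.1, 1.8, 14.0
d = np.array([
    d_larger_is_better(resolution_1, 1.5, 2.5),
    d_larger_is_better(resolution_2, 1.5, 2.5),
    d_smaller_is_better(run_time_min, 10.0, 20.0),
])
D = d.prod() ** (1 / len(d))  # overall desirability = geometric mean
print(f"individual d = {d.round(2)}, overall D = {D:.2f}")
```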

  7. Validation of analytical method for quality control of B12 Vitamin-10 000 injection

    International Nuclear Information System (INIS)

    Botet Garcia, Martha; Garcia Penna, Caridad Margarita; Troche Concepcion, Yenilen; Cannizares Arencibia, Yanara; Moreno Correoso, Barbara

    2009-01-01

    The analytical method reported by the USA Pharmacopeia was validated for quality control of injectable vitamin B12 (10 000 U) by UV spectrophotometry, because this is a simpler and lower-cost method allowing quality control of the finished product. The calibration curve was graphed over the 60 to 140% interval, where it was linear with a correlation coefficient of 0.9999; statistical tests for intercept and slope were non-significant. Recovery was 99.7% over the studied concentration interval, where the Cochran (G) and Student (t) tests were also non-significant. The coefficient of variation in the repeatability study was 0.59% for the 6 assayed replicates, whereas in the intermediate precision analysis the Fisher and Student tests were not significant. The analytical method was linear, precise, specific and exact over the studied concentration interval

  8. A Validated Analytical Model for Availability Prediction of IPTV Services in VANETs

    Directory of Open Access Journals (Sweden)

    Bernd E. Wolfinger

    2014-12-01

    In vehicular ad hoc networks (VANETs), besides the original applications typically related to traffic safety, we can nowadays observe an increasing trend toward infotainment applications, such as IPTV services. Quality of experience (QoE), as observed by the end users of IPTV, is highly important to guarantee adequate user acceptance for the service. In IPTV, QoE is mainly determined by the availability of TV channels for the users. This paper presents an efficient and rather generally applicable analytical model that allows one to predict the blocking probability of TV channels, both for channel-switching-induced and for handover-induced blocking events. We present the successful validation of the model by means of simulation, and we introduce a new measure for QoE. Numerous case studies illustrate how the analytical model and our new QoE measure can be applied successfully for the dimensioning of IPTV systems, taking into account the QoE requirements of the IPTV service users in strongly diverse traffic scenarios.
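
    The paper derives its own VANET-specific availability model; as a generic, hedged illustration of the style of blocking-probability analytics involved, the classical Erlang-B recursion for C channels offered `a` erlangs of traffic is sketched below:

```python
# Classical Erlang-B recursion: B_0 = 1, B_k = a*B_{k-1} / (k + a*B_{k-1}).
# A generic illustration only -- not the paper's VANET-specific model.
def erlang_b(a: float, C: int) -> float:
    B = 1.0
    for k in range(1, C + 1):
        B = a * B / (k + a * B)
    return B

# e.g. capacity for 40 simultaneous streams, 30 erlangs of offered demand
print(f"blocking probability: {erlang_b(30.0, 40):.4f}")
```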

  9. Oxcarbazepine: validation and application of an analytical method

    Directory of Open Access Journals (Sweden)

    Paula Cristina Rezende Enéas

    2010-06-01

    Oxcarbazepine (OXC) is an important anticonvulsant and mood stabilizing drug. A pharmacopoeial monograph for OXC is not yet available and therefore the development and validation of a new analytical method for quantification of this drug is essential. In the present study, a UV spectrophotometric method for the determination of OXC was developed. The various parameters, such as linearity, precision, accuracy and specificity, were studied according to International Conference on Harmonization guidelines. Batches of 150 mg OXC capsules were prepared and analyzed using the validated UV method. The formulations were also evaluated for parameters including drug-excipient compatibility, flowability, uniformity of weight, disintegration time, assay, uniformity of content and the amount of drug dissolved during the first hour.

  10. Dynamically triangulated surfaces - some analytical results

    International Nuclear Information System (INIS)

    Kostov, I.K.

    1987-01-01

    We give a brief review of the analytical results concerning the model of dynamically triangulated surfaces. We will discuss the possible types of critical behaviour (depending on the dimension D of the embedding space) and the exact solutions obtained for D=0 and D=-2. The latter are important as a check of the Monte Carlo simulations applied to study the model in more physical dimensions. They also give some general insight into its critical properties

  11. Analytical quality control in environmental analysis - Recent results and future trends of the IAEA's analytical quality control programme

    Energy Technology Data Exchange (ETDEWEB)

    Suschny, O; Heinonen, J

    1973-12-01

    The significance of analytical results depends critically on the degree of their reliability; an assessment of this reliability is indispensable if the results are to have any meaning at all. Environmental radionuclide analysis is a relatively new analytical field in which new methods are continuously being developed and into which many new laboratories have entered during the last ten to fifteen years. The scarcity of routine methods and the lack of experience of the new laboratories have made the need for the assessment of the reliability of results particularly urgent in this field. The IAEA, since 1962, has provided assistance to its member states by making analytical quality control services available to their laboratories in the form of standard samples, reference materials and the organization of analytical intercomparisons. The scope of this programme has increased over the years and now includes, in addition to environmental radionuclides, non-radioactive environmental contaminants which may be analysed by nuclear methods, materials for forensic neutron activation analysis, bioassay materials and nuclear fuel. The results obtained in recent intercomparisons demonstrate the continued need for these services. (author)

  12. Median of patient results as a tool for assessment of analytical stability.

    Science.gov (United States)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes was used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes achieved on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of the analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
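
    A minimal sketch of the patient-median monitoring idea, assuming daily results for one analyte in a pandas DataFrame; the allowable-bias limit would in practice come from biological-variation data, and the 3% figure here is a placeholder:

```python
# Monthly medians of (simulated) patient results, flagged against an
# allowable-bias limit. Data and the 3% limit are placeholders.
import pandas as pd
import numpy as np

rng = np.random.default_rng(7)
df = pd.DataFrame({
    "date": pd.date_range("2024-01-01", periods=360, freq="D"),
    "result": rng.normal(5.0, 0.6, 360),  # simulated patient results
})

target = df["result"].median()            # long-term reference median
allowable_bias = 0.03 * target            # placeholder: 3% of target

monthly = df.set_index("date")["result"].resample("MS").median()
flagged = (monthly - target).abs() > allowable_bias
print(pd.DataFrame({"median": monthly.round(3), "out_of_spec": flagged}))
```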

  13. Dosimetric validation of the anisotropic analytical algorithm for photon dose calculation: fundamental characterization in water

    International Nuclear Information System (INIS)

    Fogliata, Antonella; Nicolini, Giorgia; Vanetti, Eugenio; Clivio, Alessandro; Cozzi, Luca

    2006-01-01

    In July 2005 a new algorithm was released by Varian Medical Systems for the Eclipse planning system and installed in our institute. It is the anisotropic analytical algorithm (AAA) for photon dose calculations, a convolution/superposition model implemented for the first time in a Varian planning system. It was therefore necessary to perform validation studies at different levels with a wide investigation approach. To validate the basic performance of the AAA, a detailed analysis of data computed by the AAA configuration algorithm was carried out and data were compared against measurements. To better appraise the performance of the AAA and the capability of its configuration to tailor machine-specific characteristics, data obtained from the pencil beam convolution (PBC) algorithm implemented in Eclipse were also added to the comparison. Since the purpose of the paper is to address the basic performance of the AAA and of its configuration procedures, only data relative to measurements in water will be reported. Validation was carried out for three beams: 6 MV and 15 MV from a Clinac 2100C/D and 6 MV from a Clinac 6EX. Generally, AAA calculations reproduced measured data very well, and small deviations were observed, on average, for all the quantities investigated for open and wedged fields. In particular, percentage depth-dose curves showed on average differences between calculation and measurement smaller than 1% or 1 mm, and computed profiles in the flattened region matched measurements with deviations smaller than 1% for all beams, field sizes, depths and wedges. Percentage differences in output factors were observed as small as 1% on average (with a range smaller than ±2%) for all conditions. Additional tests were carried out for enhanced dynamic wedges, with results comparable to the previous ones. The basic dosimetric validation of the AAA was therefore considered satisfactory

  14. NCI-FDA Interagency Oncology Task Force Workshop Provides Guidance for Analytical Validation of Protein-based Multiplex Assays | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    An NCI-FDA Interagency Oncology Task Force (IOTF) Molecular Diagnostics Workshop was held on October 30, 2008 in Cambridge, MA, to discuss requirements for analytical validation of protein-based multiplex technologies in the context of its intended use. This workshop, developed through NCI's Clinical Proteomic Technologies for Cancer initiative and the FDA, focused on technology-specific analytical validation processes to be addressed prior to use in clinical settings. In making this workshop unique, a case study approach was used to discuss issues related to

  15. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; this would estimate the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples taken from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern: to estimate whether an approximation over- or under-fits the original model; to invalidate an approximation; and to rank possible approximations by their quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer scale models. Copyright © 2010 Elsevier Inc. All rights reserved.
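
    The validation idea can be sketched generically: sweep a parameter range and score the approximation's error in units of the original stochastic model's own standard deviation. The toy binomial "original model" and its deterministic-mean "approximation" below are stand-ins, not the epidemic models of the paper:

```python
# Score an approximation's error against the stochastic variability of the
# original model, across a parameter range. Toy models, not the paper's.
import numpy as np

rng = np.random.default_rng(0)

def stochastic_model(p, n=200, reps=500):
    return rng.binomial(n, p, size=reps)  # individual-level process

def approximation(p, n=200):
    return n * p                          # coarse deterministic limit

for p in (0.05, 0.2, 0.5):
    runs = stochastic_model(p)
    z = (approximation(p) - runs.mean()) / runs.std(ddof=1)
    print(f"p={p:.2f}: error = {z:+.2f} stochastic SDs")
```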

  16. Experimental validation of an analytical kinetic model for edge-localized modes in JET-ITER-like wall

    Science.gov (United States)

    Guillemaut, C.; Metzger, C.; Moulton, D.; Heinola, K.; O’Mullane, M.; Balboa, I.; Boom, J.; Matthews, G. F.; Silburn, S.; Solano, E. R.; contributors, JET

    2018-06-01

    The design and operation of future fusion devices relying on H-mode plasmas requires reliable modelling of edge-localized modes (ELMs) for precise prediction of divertor target conditions. An extensive experimental validation of simple analytical predictions of the time evolution of target plasma loads during ELMs has been carried out here in more than 70 JET-ITER-like wall H-mode experiments with a wide range of conditions. Comparisons of these analytical predictions with diagnostic measurements of target ion flux density, power density, impact energy and electron temperature during ELMs are presented in this paper and show excellent agreement. The analytical predictions tested here are made with the ‘free-streaming’ kinetic model (FSM) which describes ELMs as a quasi-neutral plasma bunch expanding along the magnetic field lines into the Scrape-Off Layer without collisions. Consequences of the FSM on energy reflection and deposition on divertor targets during ELMs are also discussed.
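
    A minimal kinematic sketch in the spirit of the free-streaming picture (not the paper's full expressions): particles released at t = 0 with a Maxwellian velocity spread travel collisionlessly over a connection length L, so arrivals at time t carry velocity v = L/t and the target flux scales as (L²/t³)·f(L/t):

```python
# Free-streaming arrival flux sketch: Gamma(t) = n0 * L**2 / t**3 * f(L/t)
# for a 1D Maxwellian f. Illustrative only; units and parameters arbitrary.
import math
import numpy as np

def fsm_flux(t, L=30.0, v_t=1.0, n0=1.0):
    v = L / t
    f = np.exp(-v**2 / (2 * v_t**2)) / (math.sqrt(2 * math.pi) * v_t)
    return n0 * L**2 / t**3 * f

t = np.linspace(5, 200, 400)
flux = fsm_flux(t)
print(f"peak flux at t = {t[np.argmax(flux)]:.1f} (analytic: L/(sqrt(3)*v_t))")
# the pulse rises fast and decays with a long tail, as ELM target loads do
```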

  17. Life cycle management of analytical methods.

    Science.gov (United States)

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept is gaining more and more importance. It focuses on the total costs of the process, from investment to operation and finally retirement. In recent years, interest in this concept has also grown for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification and method transfer) and finally retirement of the method. Regulatory bodies, too, appear to have increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, are discussing the enrollment of new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Furthermore, a growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design based method development results in increased method robustness, thereby decreasing the effort needed for method performance verification and post-approval changes, and minimizing the risk of method-related out-of-specification results. This strongly contributes to reduced costs of the method over its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Simultaneous determination of renal function biomarkers in urine using a validated paper-based microfluidic analytical device.

    Science.gov (United States)

    Rossini, Eduardo Luiz; Milani, Maria Izabel; Carrilho, Emanuel; Pezza, Leonardo; Pezza, Helena Redigolo

    2018-01-02

    In this paper, we describe a validated paper-based microfluidic analytical device for the simultaneous quantification of two important biomarkers of renal function in urine. This paper platform provides an inexpensive, simple, and easy to use colorimetric method for the quantification of creatinine (CRN) and uric acid (UA) in urine samples. The microfluidic paper-based analytical device (μPAD) consists of a main channel with three identical arms, each containing a circular testing zone and a circular uptake zone. Creatinine detection is based on the Jaffé reaction, in which CRN reacts with picrate to form an orange-red product. Uric acid quantification is based on the reduction of Fe 3+ to Fe 2+ by UA, which is detected in a colorimetric reaction using 1,10-phenanthroline. Under optimum conditions, obtained through chemometrics, the concentrations of the analytes showed good linear correlations with the effective intensities, and the method presented satisfactory repeatability. The limits of detection and the linear ranges, respectively, were 15.7 mg L -1 and 50-600 mg L -1 for CRN and 16.5 mg L -1 and 50-500 mg L -1 for UA. There were no statistically significant differences between the results obtained using the μPAD and a chromatographic comparative method (Student's t-test at 95% confidence level). Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Analytical validation of a reference laboratory ELISA for the detection of feline leukemia virus p27 antigen.

    Science.gov (United States)

    Buch, Jesse S; Clark, Genevieve H; Cahill, Roberta; Thatcher, Brendon; Smith, Peter; Chandrashekar, Ramaswamy; Leutenegger, Christian M; O'Connor, Thomas P; Beall, Melissa J

    2017-09-01

    Feline leukemia virus (FeLV) is an oncogenic retrovirus of cats. Immunoassays for the p27 core protein of FeLV aid in the detection of FeLV infections. Commercial microtiter-plate ELISAs have rapid protocols and visual result interpretation, limiting their usefulness in high-throughput situations. The purpose of our study was to validate the PetChek FeLV 15 ELISA, which is designed for the reference laboratory, and incorporates sequential, orthogonal screening and confirmatory protocols. A cutoff for the screening assay was established with 100% accuracy using 309 feline samples (244 negative, 65 positive) defined by the combined results of FeLV PCR and an independent reference p27 antigen ELISA. Precision of the screening assay was measured using a panel of 3 samples (negative, low-positive, and high-positive). The intra-assay coefficient of variation (CV) was 3.9-7.9%; the inter-assay CV was 6.0-8.6%. For the confirmatory assay, the intra-assay CV was 3.0-4.7%, and the inter-assay CV was 7.4-9.7%. The analytical sensitivity for p27 antigen was 3.7 ng/mL for inactivated whole FeLV and 1.2 ng/mL for purified recombinant FeLV p27. Analytical specificity was demonstrated based on the absence of cross-reactivity to related retroviruses. No interference was observed for samples containing added bilirubin, hemoglobin, or lipids. Based on these results, the new high-throughput design of the PetChek FeLV 15 ELISA makes it suitable for use in reference laboratory settings and maintains overall analytical performance.

  20. Analytic expressions for mode conversion in a plasma with a parabolic density profile: Generalized results

    International Nuclear Information System (INIS)

    Hinkel-Lipsker, D.E.; Fried, B.D.; Morales, G.J.

    1993-01-01

    This study provides an analytic solution to the general problem of mode conversion in an unmagnetized plasma. Specifically, an electromagnetic wave of frequency ω propagating through a plasma with a parabolic density profile of scale length L_p is examined. The mode conversion points are located a distance Δ₀ from the peak of the profile, where the electron plasma frequency ω_p(z) matches the wave frequency ω. The corresponding reflection, transmission, and mode conversion coefficients are expressed analytically in terms of parabolic cylinder functions for all values of Δ₀. The method of solution is based on a source approximation technique that is valid when the electromagnetic and electrostatic scale lengths are well separated. For large Δ₀, i.e., (cL_p/ω)^(1/2) ≪ Δ₀ ≤ L_p, the appropriately scaled result [D. E. Hinkel-Lipsker et al., Phys. Fluids B 4, 559 (1992)] for a linear density profile is recovered as the parabolic cylinder functions asymptotically become Airy functions. When Δ₀ → 0, the special case of conversion at the peak of the profile [D. E. Hinkel-Lipsker et al., Phys. Fluids B 4, 1772 (1992)] is obtained.

  1. Analytical method validation of GC-FID for the simultaneous measurement of hydrocarbons (C2-C4) in their gas mixture

    OpenAIRE

    Oman Zuas; Harry budiman; Muhammad Rizky Mulyana

    2016-01-01

    An accurate gas chromatography-flame ionization detection (GC-FID) method was validated for the simultaneous analysis of light hydrocarbons (C2-C4) in their gas mixture. The validation parameters were evaluated based on the ISO/IEC 17025 definitions, including method selectivity, repeatability, accuracy, linearity, limit of detection (LOD), limit of quantitation (LOQ), and ruggedness. Under the optimum analytical conditions, the analysis of the gas mixture revealed that each target comp...

  2. Teachable, high-content analytics for live-cell, phase contrast movies.

    Science.gov (United States)

    Alworth, Samuel V; Watanabe, Hirotada; Lee, James S J

    2010-09-01

    CL-Quant is a new solution platform for broad, high-content, live-cell image analysis. Powered by novel machine learning technologies and teach-by-example interfaces, CL-Quant provides a platform for the rapid development and application of scalable, high-performance, and fully automated analytics for a broad range of live-cell microscopy imaging applications, including label-free phase contrast imaging. The authors used CL-Quant to teach off-the-shelf universal analytics, called standard recipes, for cell proliferation, wound healing, cell counting, and cell motility assays using phase contrast movies collected on the BioStation CT and BioStation IM platforms. Similar to application modules, standard recipes are intended to work robustly across a wide range of imaging conditions without requiring customization by the end user. The authors validated the performance of the standard recipes by comparing their performance with truth created manually, or by custom analytics optimized for each individual movie (and therefore yielding the best possible result for the image), and validated by independent review. The validation data show that the standard recipes' performance is comparable with the validated truth with low variation. The data validate that the CL-Quant standard recipes can provide robust results without customization for live-cell assays in broad cell types and laboratory settings.

  3. Preliminary results of testing bioassay analytical performance standards

    International Nuclear Information System (INIS)

    Fisher, D.R.; Robinson, A.V.; Hadley, R.T.

    1983-08-01

    The analytical performance of both in vivo and in vitro bioassay laboratories is being studied to determine the capability of these laboratories to meet the minimum criteria for accuracy and precision specified in the draft ANSI Standard N13.30, Performance Criteria for Radiobioassay. This paper presents preliminary results of the first round of testing

  4. Development and Validation Dissolution Analytical Method of Nimesulide beta-Cyclodextrin 400 mg Tablet

    Directory of Open Access Journals (Sweden)

    Carlos Eduardo Carvalho Pereira

    2016-10-01

    Nimesulide (N-(4-nitro-2-phenoxyphenyl)methanesulfonamide) belongs to the class of non-steroidal anti-inflammatory drugs (NSAIDs) and to category II of the biopharmaceutical classification system. Complexation of nimesulide with beta-cyclodextrin is a pharmacological strategy to increase the solubility of the drug. The objective of this study was to develop and validate an analytical methodology for the dissolution of the nimesulide beta-cyclodextrin 400 mg tablet that meets the ANVISA guidelines for drug registration purposes. Once developed, the dissolution methodology was validated according to the parameters of RE no. 899/2003. During method development it was found that a test duration of 60 minutes was adequate, and that the most suitable dissolution medium and volume were 900 mL of a 1% (w/v) aqueous solution of sodium lauryl sulfate. A paddle apparatus rotating at 100 rpm was the most appropriate to evaluate the dissolution of the drug. A spectrophotometric methodology was used to quantify the percentage of dissolved drug, with quantification at a wavelength of 390 nm. In the validation of the methodology, the parameters system suitability, specificity/selectivity, linearity, precision, accuracy and robustness were satisfactory, demonstrating that the developed dissolution methodology was properly executed. DOI: http://dx.doi.org/10.17807/orbital.v8i5.827

  5. Development and validation of analytical methodology for determination of polycyclic aromatic hydrocarbons (PAHs) in sediments. Assessment of Pedroso Park dam, Santo Andre, SP

    International Nuclear Information System (INIS)

    Brito, Carlos Fernando de

    2009-01-01

    The polycyclic aromatic hydrocarbons (PAHs), being considered persistent contaminants, ubiquitous in the environment, and recognized as genotoxic, have stimulated research activities aimed at determining and evaluating their sources, transport, processing, biological effects and accumulation in compartments of aquatic and terrestrial ecosystems. In this work, the matrix studied was sediment collected at the Pedroso Park dam in Santo Andre, SP. The analytical technique employed was reversed-phase liquid chromatography with a UV/Vis detector. Statistical treatment of the data was established during the development of the methodology to ensure reliable results. The steps involved were evaluated using the concept of Validation of Chemical Testing. The parameters selected for the analytical validation were selectivity, linearity, working range, sensitivity, accuracy, precision, limit of detection, limit of quantification and robustness. These parameters showed satisfactory results, allowing the application of the methodology, which is a simple method that minimizes contamination and loss of compounds through over-handling. For the PAHs tested, no positive results above the limit of detection were found in any of the samples collected in the first phase. In the second collection, however, small amounts were found, mainly of acenaphthylene, fluorene and benzo[a]anthracene. Although the area is preserved, slight signs of contamination can already be discerned. (author)

  6. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty contribute to the total uncertainty. Particularly in segmental hair analysis, pre-analytical variations associated with the sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-high-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS), with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV ≤ 20%) across a wide linear concentration range from 0.025 to 25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from genuine duplicate measurements of two bundles of hair collected from each subject, after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3- to 7-fold larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrates the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CV_T). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
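
    Combining the pre-analytical and analytical components into a total CV assumes independent sources of variation, so the variances add: CV_T = sqrt(CV_pre² + CV_A²). A small sketch with hypothetical component CVs:

    ```python
    import math

    # Hypothetical component CVs (as fractions), assuming independent
    # sources so that variances add: CV_T^2 = CV_pre^2 + CV_A^2.
    cv_analytical = 0.10      # 10% analytical variation
    cv_preanalytical = 0.45   # 45% sampling/segmentation variation

    cv_total = math.sqrt(cv_analytical**2 + cv_preanalytical**2)
    print(f"CV_T = {cv_total:.2%}")

    # 95%-uncertainty interval (±2 CV_T) around a measured concentration.
    conc = 1.8  # ng/mg, hypothetical result
    low, high = conc * (1 - 2 * cv_total), conc * (1 + 2 * cv_total)
    print(f"{conc} ng/mg -> interval [{low:.2f}, {high:.2f}] ng/mg")
    ```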

  7. Adaptive cyclically dominating game on co-evolving networks: numerical and analytic results

    Science.gov (United States)

    Choi, Chi Wun; Xu, Chen; Hui, Pak Ming

    2017-10-01

    A co-evolving and adaptive Rock (R)-Paper (P)-Scissors (S) game (ARPS), in which an agent uses one of three cyclically dominating strategies, is proposed and studied numerically and analytically. An agent takes adaptive actions to achieve a neighborhood to his advantage, either rewiring a dissatisfying link with probability p or switching strategy with probability 1 - p. Numerical results revealed two phases in the steady state: an active phase for p < p_c, and a frozen phase for p > p_c consisting of three separate clusters of agents using only R, P, and S, respectively, in which adaptive actions have terminated. A mean-field theory based on the link densities in the co-evolving network is formulated, and the trinomial closure scheme is applied to obtain analytical solutions. The analytic results agree well with simulation results on ARPS. In addition, the different probabilities of winning, losing, and drawing a game among the agents are identified as the origin of the small discrepancy between analytic and simulation results. As a result of the adaptive actions, agents of higher degrees are often those being taken advantage of. Agents with a smaller (larger) degree than the mean degree have a higher (smaller) probability of winning than losing. The results are informative for future attempts at formulating more accurate theories.

  8. Interacting Brownian Swarms: Some Analytical Results

    Directory of Open Access Journals (Sweden)

    Guillaume Sartoretti

    2016-01-01

    We consider the dynamics of swarms of scalar Brownian agents subject to local imitation mechanisms implemented using mutual rank-based interactions. For appropriate values of the underlying control parameters, the swarm propagates tightly and the distances separating successive agents are iid exponential random variables. Implicitly, the implementation of rank-based mutual interactions requires that agents have infinite interaction ranges. Using the probabilistic size of the swarm's support, we analytically estimate the critical interaction range below which flocked swarms cannot survive. In the second part of the paper, we consider the interactions between two flocked swarms of Brownian agents with finite interaction ranges. Both swarms travel with different barycentric velocities, and agents from both swarms indifferently interact with each other. For appropriate initial configurations, both swarms eventually collide (i.e., all agents interact). Depending on the values of the control parameters, one of the following patterns emerges after collision: (i) both swarms remain essentially flocked, or (ii) the swarms become ultimately quasi-free and recover their nominal barycentric speeds. We derive a set of analytical flocking conditions based on the generalized rank-based Brownian motion. An extensive set of numerical simulations corroborates our analytical findings.

  9. Analytical Validation of the ReEBOV Antigen Rapid Test for Point-of-Care Diagnosis of Ebola Virus Infection

    Science.gov (United States)

    Cross, Robert W.; Boisen, Matthew L.; Millett, Molly M.; Nelson, Diana S.; Oottamasathien, Darin; Hartnett, Jessica N.; Jones, Abigal B.; Goba, Augustine; Momoh, Mambu; Fullah, Mohamed; Bornholdt, Zachary A.; Fusco, Marnie L.; Abelson, Dafna M.; Oda, Shunichiro; Brown, Bethany L.; Pham, Ha; Rowland, Megan M.; Agans, Krystle N.; Geisbert, Joan B.; Heinrich, Megan L.; Kulakosky, Peter C.; Shaffer, Jeffrey G.; Schieffelin, John S.; Kargbo, Brima; Gbetuwa, Momoh; Gevao, Sahr M.; Wilson, Russell B.; Saphire, Erica Ollmann; Pitts, Kelly R.; Khan, Sheik Humarr; Grant, Donald S.; Geisbert, Thomas W.; Branco, Luis M.; Garry, Robert F.

    2016-01-01

    Background. Ebola virus disease (EVD) is a severe viral illness caused by Ebola virus (EBOV). The 2013–2016 EVD outbreak in West Africa is the largest recorded, with >11 000 deaths. Development of the ReEBOV Antigen Rapid Test (ReEBOV RDT) was expedited to provide a point-of-care test for suspected EVD cases. Methods. Recombinant EBOV viral protein 40 antigen was used to derive polyclonal antibodies for RDT and enzyme-linked immunosorbent assay development. ReEBOV RDT limits of detection (LOD), specificity, and interference were analytically validated on the basis of Food and Drug Administration (FDA) guidance. Results. The ReEBOV RDT specificity estimate was 95% for donor serum panels and 97% for donor whole-blood specimens. The RDT demonstrated sensitivity to 3 species of Ebolavirus (Zaire ebolavirus, Sudan ebolavirus, and Bundibugyo ebolavirus) associated with human disease, with no cross-reactivity by pathogens associated with non-EBOV febrile illness, including malaria parasites. Interference testing exhibited no reactivity by medications in common use. The LOD for antigen was 4.7 ng/test in serum and 9.4 ng/test in whole blood. Quantitative reverse transcription–polymerase chain reaction testing of nonhuman primate samples determined the range to be equivalent to 3.0 × 10⁵–9.0 × 10⁸ genomes/mL. Conclusions. The analytical validation presented here contributed to the ReEBOV RDT being the first antigen-based assay to receive FDA and World Health Organization emergency use authorization for this EVD outbreak, in February 2015. PMID:27587634

  10. Analytic result for the one-loop scalar pentagon integral with massless propagators

    International Nuclear Information System (INIS)

    Kniehl, Bernd A.; Tarasov, Oleg V.

    2010-01-01

    The method of dimensional recurrences proposed by one of the authors (O. V. Tarasov, 1996) is applied to the evaluation of the pentagon-type scalar integral with on-shell external legs and massless internal lines. For the first time, an analytic result valid for arbitrary space-time dimension d and five arbitrary kinematic variables is presented. An explicit expression in terms of the Appell hypergeometric function F₃ and the Gauss hypergeometric function ₂F₁, both admitting one-fold integral representations, is given. In the case when one kinematic variable vanishes, the integral reduces to a combination of Gauss hypergeometric functions ₂F₁. For the case when one scalar invariant is large compared to the others, the asymptotic values of the integral in terms of Gauss hypergeometric functions ₂F₁ are presented in d = 2 − 2ε, 4 − 2ε, and 6 − 2ε dimensions. For multi-Regge kinematics, the asymptotic value of the integral in d = 4 − 2ε dimensions is given in terms of the Appell function F₃ and the Gauss hypergeometric function ₂F₁. (orig.)

  11. Analytic result for the one-loop scalar pentagon integral with massless propagators

    International Nuclear Information System (INIS)

    Kniehl, Bernd A.; Tarasov, Oleg V.

    2010-01-01

    The method of dimensional recurrences proposed by Tarasov (1996, 2000) is applied to the evaluation of the pentagon-type scalar integral with on-shell external legs and massless internal lines. For the first time, an analytic result valid for arbitrary space-time dimension d and five arbitrary kinematic variables is presented. An explicit expression in terms of the Appell hypergeometric function F₃ and the Gauss hypergeometric function ₂F₁, both admitting one-fold integral representations, is given. In the case when one kinematic variable vanishes, the integral reduces to a combination of Gauss hypergeometric functions ₂F₁. For the case when one scalar invariant is large compared to the others, the asymptotic values of the integral in terms of Gauss hypergeometric functions ₂F₁ are presented in d = 2 − 2ε, 4 − 2ε, and 6 − 2ε dimensions. For multi-Regge kinematics, the asymptotic value of the integral in d = 4 − 2ε dimensions is given in terms of the Appell function F₃ and the Gauss hypergeometric function ₂F₁.

  12. Tank 48H Waste Composition and Results of Investigation of Analytical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Walker, D.D. [Westinghouse Savannah River Company, Aiken, SC (United States)]

    1997-04-02

    This report serves two purposes. First, it documents the analytical results of Tank 48H samples taken between April and August 1996. Second, it describes investigations of the precision of the sampling and analytical methods used on the Tank 48H samples.

  13. Analytical validation of an ultraviolet-visible procedure for determining lutein concentration and application to lutein-loaded nanoparticles.

    Science.gov (United States)

    Silva, Jéssica Thaís do Prado; Silva, Anderson Clayton da; Geiss, Julia Maria Tonin; de Araújo, Pedro Henrique Hermes; Becker, Daniela; Bracht, Lívia; Leimann, Fernanda Vitória; Bona, Evandro; Guerra, Gustavo Petri; Gonçalves, Odinei Hess

    2017-09-01

    Lutein is a carotenoid with known anti-inflammatory and antioxidant properties. Lutein-rich diets have been associated with neurological improvement as well as a reduced risk of vision loss due to Age-Related Macular Degeneration (AMD). Micro- and nanoencapsulation have been demonstrated to be effective techniques for protecting lutein against degradation and for improving its bioavailability. However, the actual lutein concentration inside the capsules and the encapsulation efficiency are key parameters that must be precisely known when designing in vitro and in vivo tests. In this work, an analytical procedure was validated for the determination of the actual lutein content in zein nanoparticles using ultraviolet-visible spectroscopy. Method validation followed the International Conference on Harmonisation (ICH) guidelines, which evaluate linearity, detection limit, quantification limit, accuracy and precision. The validated methodology was applied to characterize lutein-loaded nanoparticles. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. An assessment of the validity of inelastic design analysis methods by comparisons of predictions with test results

    International Nuclear Information System (INIS)

    Corum, J.M.; Clinard, J.A.; Sartory, W.K.

    1976-01-01

    The use of computer programs that employ relatively complex constitutive theories and analysis procedures to perform inelastic design calculations on fast reactor system components introduces questions of validation and acceptance of the analysis results. We may ask ourselves, "How valid are the answers?" These questions, in turn, involve the concepts of verification of computer programs as well as qualification of the computer programs and of the underlying constitutive theories and analysis procedures. This paper addresses the latter: the qualification of the analysis methods for inelastic design calculations. Some of the work underway in the United States to provide the information necessary to evaluate inelastic analysis methods and computer programs is described, and typical comparisons of analysis predictions with inelastic structural test results are presented. It is emphasized throughout that rather than asking how valid, or correct, the analytical predictions are, we might more properly question whether or not the combination of the predictions and the associated high-temperature design criteria leads to an acceptable level of structural integrity. It is believed that in this context the analysis predictions are generally valid, even though exact correlations between predictions and actual behavior are not obtained and cannot be expected. Final judgment, however, must be reserved for the design analyst in each specific case. (author)

  15. Validação de metodologia analítica para doseamento de soluções de lapachol por CLAE Validation of the analytical methodology for evaluation of lapachol in solution by HPLC

    Directory of Open Access Journals (Sweden)

    Said G. C. Fonseca

    2004-02-01

    Lapachol is a naphthoquinone found in several species of the Bignoniaceae family, possessing mainly anticancer activity. The present work consists of the development and validation of an analytical methodology for lapachol and its preparations. The results obtained here show that lapachol has a low quantification limit and that the analytical methodology is accurate, reproducible, robust and linear over the concentration range 0.5-100 µg/mL of lapachol.

  16. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    Science.gov (United States)

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. The objective of this study was to report the features of IBMWA and to discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix.

  17. Configuration and validation of an analytical model predicting secondary neutron radiation in proton therapy using Monte Carlo simulations and experimental measurements.

    Science.gov (United States)

    Farah, J; Bonfrate, A; De Marzi, L; De Oliveira, A; Delacroix, S; Martinetti, F; Trompier, F; Clairand, I

    2015-05-01

    This study focuses on the configuration and validation of an analytical model predicting leakage neutron doses in proton therapy. Using Monte Carlo (MC) calculations, a facility-specific analytical model was built to reproduce out-of-field neutron doses while separately accounting for the contribution of intra-nuclear cascade, evaporation, epithermal and thermal neutrons. This model was first trained to reproduce in-water neutron absorbed doses and in-air neutron ambient dose equivalents, H*(10), calculated using MCNPX. Its capacity in predicting out-of-field doses at any position not involved in the training phase was also checked. The model was next expanded to enable a full 3D mapping of H*(10) inside the treatment room, tested in a clinically relevant configuration and finally consolidated with experimental measurements. Following the literature approach, the work first proved that it is possible to build a facility-specific analytical model that efficiently reproduces in-water neutron doses and in-air H*(10) values with a maximum difference less than 25%. In addition, the analytical model succeeded in predicting out-of-field neutron doses in the lateral and vertical direction. Testing the analytical model in clinical configurations proved the need to separate the contribution of internal and external neutrons. The impact of modulation width on stray neutrons was found to be easily adjustable while beam collimation remains a challenging issue. Finally, the model performance agreed with experimental measurements with satisfactory results considering measurement and simulation uncertainties. Analytical models represent a promising solution that substitutes for time-consuming MC calculations when assessing doses to healthy organs. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  18. Theoretical, analytical, and statistical interpretation of environmental data

    International Nuclear Information System (INIS)

    Lombard, S.M.

    1974-01-01

    The reliability of data from radiochemical analyses of environmental samples cannot be determined from nuclear counting statistics alone. The rigorous application of the principles of propagation of errors, an understanding of the physics and chemistry of the species of interest in the environment, and the application of information from research on the analytical procedure are all necessary for a valid estimation of the errors associated with analytical results. The specific case of the determination of plutonium in soil is considered in terms of analytical problems and data reliability. (U.S.)

  19. An analytical solution for improved HIFU SAR estimation

    International Nuclear Information System (INIS)

    Dillon, C R; Vyas, U; Christensen, D A; Roemer, R B; Payne, A

    2012-01-01

    Accurate determination of the specific absorption rates (SARs) present during high intensity focused ultrasound (HIFU) experiments and treatments provides a solid physical basis for scientific comparison of results among HIFU studies and is necessary to validate and improve SAR predictive software, which will improve patient treatment planning, control and evaluation. This study develops and tests an analytical solution that significantly improves the accuracy of SAR values obtained from HIFU temperature data. SAR estimates are obtained by fitting the analytical temperature solution for a one-dimensional radial Gaussian heating pattern to the temperature versus time data following a step in applied power and evaluating the initial slope of the analytical solution. The analytical method is evaluated in multiple parametric simulations for which it consistently (except at high perfusions) yields maximum errors of less than 10% at the center of the focal zone compared with errors up to 90% and 55% for the commonly used linear method and an exponential method, respectively. For high perfusion, an extension of the analytical method estimates SAR with less than 10% error. The analytical method is validated experimentally by showing that the temperature elevations predicted using the analytical method's SAR values determined for the entire 3D focal region agree well with the experimental temperature elevations in a HIFU-heated tissue-mimicking phantom. (paper)
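
    The paper's estimator fits a one-dimensional radial-Gaussian analytical temperature solution; a simplified stand-in for the core idea is the relation SAR = c_p·(dT/dt) evaluated at t = 0, taken from the early-time slope before conduction matters. A sketch with invented data and an assumed phantom specific heat:

    ```python
    import numpy as np

    # Hypothetical temperature-vs-time samples after a step in applied power.
    t = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])                # s
    T = np.array([37.00, 37.42, 37.80, 38.14, 38.44, 38.70])    # °C

    # Early-time slope from the first few samples (before conduction matters).
    slope = np.polyfit(t[:3], T[:3], 1)[0]   # °C/s

    # SAR (W/kg) = c_p * dT/dt at t = 0, with c_p the specific heat.
    c_p = 3600.0  # J/(kg*K), hypothetical tissue-mimicking phantom value
    sar = c_p * slope
    print(f"initial slope = {slope:.3f} °C/s -> SAR ≈ {sar:.0f} W/kg")
    ```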

  20. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    Science.gov (United States)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
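
    Validating analytic derivatives against finite-difference approximations, as done here, follows a generic pattern. The sketch below uses a simple stand-in function rather than the CEA thermodynamics, comparing a hand-derived gradient with a central difference:

    ```python
    import numpy as np

    def f(x: np.ndarray) -> float:
        # Stand-in objective (e.g., a thermodynamic response).
        return np.sum(np.sin(x) * x**2)

    def grad_f(x: np.ndarray) -> np.ndarray:
        # Hand-derived analytic gradient of f.
        return np.cos(x) * x**2 + 2.0 * x * np.sin(x)

    def fd_grad(f, x, h=1e-6):
        # Central finite-difference approximation, one component at a time.
        g = np.zeros_like(x)
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (f(x + e) - f(x - e)) / (2 * h)
        return g

    x0 = np.array([0.3, 1.2, 2.5])
    err = np.max(np.abs(grad_f(x0) - fd_grad(f, x0)))
    print(f"max |analytic - FD| = {err:.2e}")  # expect ~1e-8 or smaller
    ```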

  1. Circular orbits of corotating binary black holes: Comparison between analytical and numerical results

    International Nuclear Information System (INIS)

    Damour, Thibault; Gourgoulhon, Eric; Grandclement, Philippe

    2002-01-01

    We compare recent numerical results, obtained within a 'helical Killing vector' approach, on circular orbits of corotating binary black holes to the analytical predictions made by the effective one-body (EOB) method (which has been recently extended to the case of spinning bodies). On the scale of the differences between the results obtained by different numerical methods, we find good agreement between numerical data and analytical predictions for several invariant functions describing the dynamical properties of circular orbits. This agreement is robust against the post-Newtonian accuracy used for the analytical estimates, as well as under choices of the resummation method for the EOB 'effective potential', and gets better as one uses a higher post-Newtonian accuracy. These findings open the way to a significant 'merging' of analytical and numerical methods, i.e. to matching an EOB-based analytical description of the (early and late) inspiral, up to the beginning of the plunge, to a numerical description of the plunge and merger. We illustrate also the 'flexibility' of the EOB approach, i.e. the possibility of determining some 'best fit' values for the analytical parameters by comparison with numerical data

  2. Intercalibration of analytical methods on marine environmental samples

    International Nuclear Information System (INIS)

    1988-06-01

    The pollution of the seas by various chemical substances constitutes nowadays one of the principal concerns of mankind. The International Atomic Energy Agency has organized in past years several intercomparison exercises in the framework of its Analytical Quality Control Service. The present intercomparison had a double aim: first, to give laboratories participating in this intercomparison an opportunity for checking their analytical performance. Secondly, to produce on the basis of the results of this intercomparison a reference material made of fish tissue which would be accurately certified with respect to many trace elements. Such a material could be used by analytical chemists to check the validity of new analytical procedures. In total, 53 laboratories from 29 countries reported results (585 laboratory means for 48 elements). 5 refs, 52 tabs

  3. Development and validation of a multi-analyte method for the regulatory control of carotenoids used as feed additives in fish and poultry feed.

    Science.gov (United States)

    Vincent, Ursula; Serano, Federica; von Holst, Christoph

    2017-08-01

    Carotenoids are used in animal nutrition mainly as sensory additives that favourably affect the colour of fish, birds and food of animal origin. Various analytical methods exist for their quantification in compound feed, reflecting the different physico-chemical characteristics of the carotenoid and the corresponding feed additives. They may be natural products or specific formulations containing the target carotenoids produced by chemical synthesis. In this study a multi-analyte method was developed that can be applied to the determination of all 10 carotenoids currently authorised within the European Union for compound feedingstuffs. The method functions regardless of whether the carotenoids have been added to the compound feed via natural products or specific formulations. It is comprised of three steps: (1) digestion of the feed sample with an enzyme; (2) pressurised liquid extraction; and (3) quantification of the analytes by reversed-phase HPLC coupled to a photodiode array detector in the visible range. The method was single-laboratory validated for poultry and fish feed covering a mass fraction range of the target analyte from 2.5 to 300 mg kg⁻¹. The following method performance characteristics were obtained: the recovery rate varied from 82% to 129% and precision, expressed as the relative standard deviation of intermediate precision, varied from 1.6% to 15%. Based on the acceptable performance obtained in the validation study, the multi-analyte method is considered fit for the intended purpose.

  4. Validation of analytical method to quality control and the stability study of 0.025 % eyedrops Ketotiphen

    International Nuclear Information System (INIS)

    Troche Concepcion, Yenilen; Romero Diaz, Jacqueline Aylema; Garcia Penna, Caridad M

    2010-01-01

    The Ketotiphen eyedrop is prescribed to relieve the signs and symptoms of allergic conjunctivitis due to its potent H₁-antihistaminic effect, showing some ability to inhibit the release of histamine and other mediators in cases of mastocytosis. The aim of the present paper was to develop and validate an analytical method by high-performance liquid chromatography for the quality control and stability studies of the 0.025% Ketotiphen eyedrop. The method was based on separation of the active principle on a Lichrosorb RP-18 (5 μm) column (250 x 4 mm), with UV detection at 296 nm, using as mobile phase a degassed mixture of methanol:phosphate buffer (75:25; pH 8.5) with 1 mL of isopropanol added per 1000 mL of the mixture, at a flow rate of 1.2 mL/min. The analytical method was linear, precise, specific and accurate across the concentrations studied.

  5. Improving the trust in results of numerical simulations and scientific data analytics

    Energy Technology Data Exchange (ETDEWEB)

    Cappello, Franck [Argonne National Lab. (ANL), Argonne, IL (United States); Constantinescu, Emil [Argonne National Lab. (ANL), Argonne, IL (United States); Hovland, Paul [Argonne National Lab. (ANL), Argonne, IL (United States); Peterka, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Phillips, Carolyn [Argonne National Lab. (ANL), Argonne, IL (United States); Snir, Marc [Argonne National Lab. (ANL), Argonne, IL (United States); Wild, Stefan [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-04-30

    This white paper investigates several key aspects of the trust that a user can give to the results of numerical simulations and scientific data analytics. In this document, the notion of trust is related to the integrity of numerical simulations and data analytics applications. This white paper complements the DOE ASCR report on Cybersecurity for Scientific Computing Integrity by (1) exploring the sources of trust loss; (2) reviewing the definitions of trust in several areas; (3) providing numerous cases of result alteration, some of them leading to catastrophic failures; (4) examining the current notion of trust in numerical simulation and scientific data analytics; (5) providing a gap analysis; and (6) suggesting two important research directions and their respective research topics. To simplify the presentation without loss of generality, we consider that trust in results can be lost (or the results' integrity impaired) because of any form of corruption happening during the execution of the numerical simulation or the data analytics application. In general, the sources of such corruption are threefold: errors, bugs, and attacks. Current applications are already using techniques to deal with different types of corruption. However, not all potential corruptions are covered by these techniques. We firmly believe that the current level of trust that a user has in the results is at least partially founded on ignorance of this issue or the hope that no undetected corruptions will occur during the execution. This white paper explores the notion of trust and suggests recommendations for developing a more scientifically grounded notion of trust in numerical simulation and scientific data analytics. We first formulate the problem and show that it goes beyond previous questions regarding the quality of results, such as V&V, uncertainty quantification, and data assimilation. We then explore the complexity of this difficult problem, and we sketch complementary general

  6. Analytic result for the one-loop scalar pentagon integral with massless propagators

    Energy Technology Data Exchange (ETDEWEB)

    Kniehl, Bernd A.; Tarasov, Oleg V. [Hamburg Univ. (Germany). II. Inst. fuer Theoretische Physik

    2010-01-15

    The method of dimensional recurrences proposed by one of the authors (O. V. Tarasov, 1996) is applied to the evaluation of the pentagon-type scalar integral with on-shell external legs and massless internal lines. For the first time, an analytic result valid for arbitrary space-time dimension d and five arbitrary kinematic variables is presented. An explicit expression in terms of the Appell hypergeometric function F₃ and the Gauss hypergeometric function ₂F₁, both admitting one-fold integral representations, is given. In the case when one kinematic variable vanishes, the integral reduces to a combination of Gauss hypergeometric functions ₂F₁. For the case when one scalar invariant is large compared to the others, the asymptotic values of the integral in terms of Gauss hypergeometric functions ₂F₁ are presented in d = 2 − 2ε, 4 − 2ε, and 6 − 2ε dimensions. For multi-Regge kinematics, the asymptotic value of the integral in d = 4 − 2ε dimensions is given in terms of the Appell function F₃ and the Gauss hypergeometric function ₂F₁. (orig.)

  7. Validación del método analítico para la cuantificación de bacitracina Validation of the analytical method to quantify bacitracin

    Directory of Open Access Journals (Sweden)

    Carolina Velandia-Castellanos

    2011-06-01

    An analytical method was developed and validated for the quantitative determination of 15% zinc bacitracin and 11% bacitracin methylene disalicylate by the cylinder-plate method (agar diffusion), to be used in the quality control of raw materials and pharmaceutical products. The parameters specificity, selectivity, system and method linearity, accuracy, limit of quantification and precision were assessed. Through the experimental design and the statistical evaluation of the results, it was demonstrated that the analytical method is specific, selective, linear, precise (CV < 5%) and exact (bias < 3%, Gexp < Gtab, texp < ttab) across the concentrations studied. The quantification and detection limits were 0.02 and 0.005 IU/mL, respectively. The analytical performance characteristics fulfill the requirements for the proposed analytical implementation.

  8. Validation of an analytical method applicable to study of 1 mg/mL oral Risperidone solution stability

    International Nuclear Information System (INIS)

    Abreu Alvarez, Maikel; Garcia Penna, Caridad Margarita; Martinez Miranda, Lissette

    2010-01-01

    A validated analytical method based on high-performance liquid chromatography (HPLC) was applied to the stability study of a 1 mg/mL risperidone oral solution. The method was linear, accurate, specific and exact. A stability study of the 1 mg/mL risperidone oral solution was carried out to determine its expiry date. The shelf-life study was conducted for 24 months at room temperature, whereas the accelerated stability study was conducted with the product under the influence of humidity and temperature, with analyses made over 3 months. The formula fulfilled the quality specifications described in the Pharmacopeia. The shelf-life results after 24 months showed that the product maintains the parameters determining its quality during this time, and in the accelerated studies there was no significant degradation (p > 0.05) of the product. Under the mentioned conditions the expiry date was set at 2 years.

  9. Application of Statistical Methods to Activation Analytical Results near the Limit of Detection

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Wanscher, B.

    1978-01-01

    Reporting actual numbers instead of upper limits for analytical results at or below the detection limit may produce reliable data when these numbers are subjected to appropriate statistical processing. Particularly in radiometric methods, such as activation analysis, where individual standard deviations of analytical results may be estimated, improved discrimination may be based on the Analysis of Precision. Actual experimental results from a study of the concentrations of arsenic in human skin demonstrate the power of this principle.
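
    One common formulation of such a consistency check (a generic weighted chi-square test, not necessarily the exact statistic used in the paper) compares the scatter of replicate results with their individually estimated standard deviations:

    ```python
    import numpy as np
    from scipy import stats

    # Replicate results with individually estimated standard deviations,
    # e.g., from counting statistics (hypothetical numbers).
    x = np.array([0.082, 0.095, 0.071, 0.104])   # µg/g arsenic
    s = np.array([0.012, 0.015, 0.011, 0.016])   # per-result SDs

    # Weighted mean and the T statistic (chi-square with n-1 dof).
    w = 1.0 / s**2
    xw = np.sum(w * x) / np.sum(w)
    T = np.sum((x - xw) ** 2 / s**2)
    p = stats.chi2.sf(T, df=x.size - 1)

    print(f"weighted mean = {xw:.3f}, T = {T:.2f}, p = {p:.2f}")
    # A small p would signal scatter beyond the stated uncertainties.
    ```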

  10. $W^+ W^-$ + Jet: Compact Analytic Results

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, John [Fermilab; Miller, David [Glasgow U.; Robens, Tania [Dresden, Tech. U.

    2016-01-14

    In the second run of the LHC, which started in April 2015, an accurate understanding of Standard Model processes is more crucial than ever. Processes including electroweak gauge bosons serve as standard candles for SM measurements, and equally constitute important background for BSM searches. We here present the NLO QCD virtual contributions to W+W- + jet in an analytic format obtained through unitarity methods and show results for the full process using an implementation into the Monte Carlo event generator MCFM. Phenomenologically, we investigate total as well as differential cross sections for the LHC with 14 TeV center-of-mass energy, as well as a future 100 TeV proton-proton machine. In the format presented here, the one-loop virtual contributions also serve as important ingredients in the calculation of W+W- pair production at NNLO.

  11. Verification and Validation of TMAP7

    Energy Technology Data Exchange (ETDEWEB)

    James Ambrosek

    2008-12-01

    The Tritium Migration Analysis Program, Version 7 (TMAP7) code is an update of TMAP4, an earlier version that was verified and validated in support of the International Thermonuclear Experimental Reactor (ITER) program and of the intermediate version TMAP2000. It has undergone several revisions. The current one includes radioactive decay, multiple trap capability, more realistic treatment of heteronuclear molecular formation at surfaces, processes that involve surface-only species, and a number of other improvements. Prior to code utilization, it needed to be verified and validated to ensure that the code is performing as it was intended and that its predictions are consistent with physical reality. To that end, the demonstration and comparison problems cited here show that the code results agree with analytical solutions for select problems where analytical solutions are straightforward or with results from other verified and validated codes, and that actual experimental results can be accurately replicated using reasonable models with this code. These results and their documentation in this report are necessary steps in the qualification of TMAP7 for its intended service.

  12. A Process Analytical Technology (PAT) approach to control a new API manufacturing process: development, validation and implementation.

    Science.gov (United States)

    Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric

    2014-03-01

    Pharmaceutical companies are progressively adopting and introducing the Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, aiming to build quality directly into the product by combining thorough scientific understanding and quality risk management. An analytical method based on near infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) manufacturing crystallization step, during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on the QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fitted for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline and using the accuracy profile approach. The dosing ranges were evaluated as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for the residual methanol. As the variability of the sampling method and the reference method is by nature included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. The implementation of this in-process control (IPC) method on the industrial plant, from the launch of the new API synthesis process, will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. In addition, several valuable benefits are expected, including reduction of the process time and suppression of a rather difficult sampling step and tedious off-line analyses. © 2013 Published by Elsevier B.V.
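
    A PLS calibration of the kind described can be sketched with scikit-learn on synthetic spectra. The data, component count, and cross-validation scheme below are illustrative assumptions, not the validated industrial model:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)

    # Hypothetical training set: 40 NIR spectra (200 wavelengths each)
    # with reference API contents (% w/w) from an off-line method.
    X = rng.normal(size=(40, 200))
    y = 9.0 + 3.0 * X[:, 50] + 0.1 * rng.normal(size=40)  # synthetic link

    pls = PLSRegression(n_components=3)
    y_cv = cross_val_predict(pls, X, y, cv=5).ravel()

    rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
    print(f"RMSECV = {rmsecv:.3f} % w/w")

    # Fit on all data and predict a new in-line spectrum.
    pls.fit(X, y)
    print(pls.predict(rng.normal(size=(1, 200))))
    ```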

  13. Determination of proline in honey: comparison between official methods, optimization and validation of the analytical methodology.

    Science.gov (United States)

    Truzzi, Cristina; Annibaldi, Anna; Illuminati, Silvia; Finale, Carolina; Scarponi, Giuseppe

    2014-05-01

    The study compares official spectrophotometric methods for the determination of the proline content in honey - those of the International Honey Commission (IHC) and the Association of Official Analytical Chemists (AOAC) - with the original Ough method. Results show that the extra time-consuming treatment stages added by the IHC method with respect to the Ough method are pointless. We demonstrate that the AOAC method proves to be the best in terms of accuracy and time saving. The optimized waiting time for the absorbance recording is set at 35 min from the removal of the reaction tubes from the boiling bath used in the sample treatment. The optimized method was validated in the matrix: linearity up to 1800 mg L⁻¹, limit of detection 20 mg L⁻¹, limit of quantification 61 mg L⁻¹. The method was applied to 43 unifloral honey samples from the Marche region, Italy. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Preliminary Results on Uncertainty Quantification for Pattern Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Stracuzzi, David John [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Brost, Randolph [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Chen, Maximillian Gene [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Malinas, Rebecca [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Peterson, Matthew Gregor [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Phillips, Cynthia A. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Robinson, David G. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Woodbridge, Diane [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.

  15. Analytical evaluation of atomic form factors: Application to Rayleigh scattering

    Energy Technology Data Exchange (ETDEWEB)

    Safari, L., E-mail: laleh.safari@ist.ac.at [IST Austria (Institute of Science and Technology Austria), Am Campus 1, 3400 Klosterneuburg (Austria); Department of Physics, University of Oulu, Box 3000, FI-90014 Oulu (Finland); Santos, J. P. [Laboratório de Instrumentação, Engenharia Biomédica e Física da Radiação (LIBPhys-UNL), Departamento de Física, Faculdade de Ciências e Tecnologia, FCT, Universidade Nova de Lisboa, 2829-516 Caparica (Portugal); Amaro, P. [Laboratório de Instrumentação, Engenharia Biomédica e Física da Radiação (LIBPhys-UNL), Departamento de Física, Faculdade de Ciências e Tecnologia, FCT, Universidade Nova de Lisboa, 2829-516 Caparica (Portugal); Physikalisches Institut, Universität Heidelberg, D-69120 Heidelberg (Germany); Jänkälä, K. [Department of Physics, University of Oulu, Box 3000, FI-90014 Oulu (Finland); Fratini, F. [Department of Physics, University of Oulu, Box 3000, FI-90014 Oulu (Finland); Institute of Atomic and Subatomic Physics, TU Wien, Stadionallee 2, 1020 Wien (Austria); Departamento de Física, Instituto de Ciências Exatas, Universidade Federal de Minas Gerais, 31270-901 Belo Horizonte, MG (Brazil)

    2015-05-15

    Atomic form factors are widely used for the characterization of targets and specimens, from crystallography to biology. By using recent mathematical results, here we derive an analytical expression for the atomic form factor within the independent particle model constructed from nonrelativistic screened hydrogenic wave functions. The range of validity of this analytical expression is checked by comparing the analytically obtained form factors with the ones obtained within the Hartree-Fock method. As an example, we apply our analytical expression for the atomic form factor to evaluate the differential cross section for Rayleigh scattering off neutral atoms.
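
    For the simplest case, the nonrelativistic hydrogenic 1s form factor has the textbook closed form F(q) = [1 + (q·a₀/(2Z))²]⁻², and the form-factor approximation gives dσ/dΩ = (r_e²/2)(1 + cos²θ)|F(q)|² for Rayleigh scattering. A sketch using these standard expressions (not the authors' full independent-particle-model result):

    ```python
    import numpy as np

    A0 = 0.529177e-10   # Bohr radius, m
    R_E = 2.817940e-15  # classical electron radius, m

    def form_factor_1s(q, Z=1):
        # Textbook form factor of a hydrogenic 1s electron,
        # F(q) = [1 + (q*a0/(2Z))^2]^(-2), with q in 1/m.
        return (1.0 + (q * A0 / (2.0 * Z)) ** 2) ** -2

    def rayleigh_dcs(theta, k, Z=1):
        # Form-factor approximation to the Rayleigh differential cross
        # section: d(sigma)/d(Omega) = (r_e^2/2)(1 + cos^2 theta)|F(q)|^2.
        q = 2.0 * k * np.sin(theta / 2.0)  # momentum transfer, 1/m
        return 0.5 * R_E**2 * (1 + np.cos(theta) ** 2) * form_factor_1s(q, Z) ** 2

    k = 2 * np.pi / 1.0e-10  # photon wavevector for lambda = 1 Å
    for deg in (10, 45, 90):
        th = np.radians(deg)
        print(f"theta = {deg:3d} deg, dsigma/dOmega = {rayleigh_dcs(th, k):.3e} m^2/sr")
    ```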

  16. Radiochemical verification and validation in the environmental data collection process

    International Nuclear Information System (INIS)

    Rosano-Reece, D.; Bottrell, D.; Bath, R.J.

    1994-01-01

    A credible and cost effective environmental data collection process should produce analytical data which meets regulatory and program specific requirements. Analytical data, which support the sampling and analysis activities at hazardous waste sites, undergo verification and independent validation before the data are submitted to regulators. Understanding the difference between verification and validation and their respective roles in the sampling and analysis process is critical to the effectiveness of a program. Verification is deciding whether the measurement data obtained are what was requested. The verification process determines whether all the requirements were met. Validation is more complicated than verification. It attempts to assess the impacts on data use, especially when requirements are not met. Validation becomes part of the decision-making process. Radiochemical data consists of a sample result with an associated error. Therefore, radiochemical validation is different and more quantitative than is currently possible for the validation of hazardous chemical data. Radiochemical data include both results and uncertainty that can be statistically compared to identify significance of differences in a more technically defensible manner. Radiochemical validation makes decisions about analyte identification, detection, and uncertainty for a batch of data. The process focuses on the variability of the data in the context of the decision to be made. The objectives of this paper are to present radiochemical verification and validation for environmental data and to distinguish the differences between the two operations

  17. Aplikasi Analytical Hierarchy Process Pada Pemilihan Metode Analisis Zat Organik Dalam Air

    Directory of Open Access Journals (Sweden)

    Dino Rimantho

    2016-07-01

    Water is one of the food products analyzed in water chemistry and environmental laboratories. One of the parameters analyzed is organic substances. A number of samples disproportionate to the available analytical capacity can cause delays in test results. The Analytical Hierarchy Process was applied to evaluate the analytical methods used. The alternative methods tested include the titrimetric method, spectrophotometry, and total organic carbon (TOC). Respondents consisted of the deputy technical manager, the laboratory coordinator, and two senior analysts. The preferred alternative obtained was the TOC method. Based on the results, the proposed improvement is adoption of the TOC method, with a 10-15 minute analysis time and the use of CRMs to ensure the validity of the analysis results.
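
    AHP derives priority weights from a pairwise comparison matrix via its principal eigenvector and checks judgment consistency with Saaty's consistency ratio. A sketch with a hypothetical judgment matrix for the three candidate methods:

    ```python
    import numpy as np

    # Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for three
    # alternatives: titrimetry, spectrophotometry, TOC.
    A = np.array([
        [1.0, 1/2, 1/4],
        [2.0, 1.0, 1/3],
        [4.0, 3.0, 1.0],
    ])

    # Priority weights = principal right eigenvector, normalized.
    vals, vecs = np.linalg.eig(A)
    i = np.argmax(vals.real)
    w = np.abs(vecs[:, i].real)
    w /= w.sum()

    # Consistency ratio: CI = (lambda_max - n)/(n - 1); Saaty's RI(n=3) = 0.58.
    n = A.shape[0]
    ci = (vals[i].real - n) / (n - 1)
    cr = ci / 0.58
    print(f"weights = {np.round(w, 3)}, CR = {cr:.3f}  (CR < 0.1 is acceptable)")
    ```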

  18. Analytical method development and validation for quantification of uranium by Fourier Transform Infrared Spectroscopy (FTIR) for routine quality control analysis

    International Nuclear Information System (INIS)

    Pereira, Elaine; Silva, Ieda de S.; Gomide, Ricardo G.; Pires, Maria Aparecida F.

    2015-01-01

    This work presents a low-cost, simple and new methodology for the direct determination of uranium in different matrices: an organic phase (UO₂(NO₃)₂·2TBP, uranyl nitrate-TBP complex) and an aqueous phase (UO₂(NO₃)₂, NTU, uranyl nitrate), based on Fourier transform infrared spectroscopy (FTIR) using the KBr pellet technique. Analytical validation is essential to define whether a developed methodology is completely adjusted to the objectives for which it is intended, and it is considered one of the main instruments of quality control. The parameters used in the validation process were: selectivity, linearity, limits of detection (LD) and quantitation (LQ), precision (repeatability and intermediate precision), accuracy and robustness. The method for uranium in the organic phase (UO₂(NO₃)₂·2TBP in hexane, embedded in KBr) was linear (r = 0.9989) over the range 1.0 g L⁻¹ to 14.3 g L⁻¹, with an LD of 92.1 mg L⁻¹ and an LQ of 113.1 mg L⁻¹, precise (RSD < 1.6%, p-value < 0.05) and accurate (recovery of 100.1%-102.9%). The method for uranium in the aqueous phase (UO₂(NO₃)₂ embedded in KBr) was linear (r = 0.9964) over the range 5.4 g L⁻¹ to 51.2 g L⁻¹, with an LD of 835 mg L⁻¹ and an LQ of 958 mg L⁻¹, precise (RSD < 1.0%, p-value < 0.05) and accurate (recovery of 99.1%-102.0%). The FTIR method is robust with regard to most of the variables analyzed, as the differences between results obtained under nominal and modified conditions were lower than the critical value for all analytical parameters studied. Some process samples were analyzed by FTIR and compared with gravimetric and X-ray fluorescence (XRF) analyses, showing similar results for all three methods. The statistical tests (Student's t and Fisher's F) showed that the techniques are equivalent. (author)

  19. Maritime Analytics Prototype: Phase 3 Validation

    Science.gov (United States)

    2014-01-01

    [Fragmentary abstract recovered from the report: notes on providing a flexible analysis-set hierarchy encoded as directories or groups [C.3.1.4n], GUI improvements, and problems zooming and panning on the timeline [C.1.2.1c, C.1.2.4e, C.1.3.1c, C.1.1.4c, C.1.1.4b]; cites J. Thomas and K. Cook, Illuminating the Path: The Research and Development Agenda for Visual Analytics, IEEE, 2005.]

  20. Validation of an analytical method for determining halothane in urine as an instrument for evaluating occupational exposure

    International Nuclear Information System (INIS)

    Gonzalez Chamorro, Rita Maria; Jaime Novas, Arelis; Diaz Padron, Heliodora

    2010-01-01

    Occupational exposure to harmful substances may cause significant changes in the normal physiology of the organism when adequate safety measures are not taken in time at a workplace where the risk may be present. Among the chemical risks that may affect workers' health are the inhalable anesthetic agents. With the objective of taking the first steps toward the introduction of an epidemiological surveillance system for this personnel, an analytical method for determining this anesthetic in urine was validated under the instrumental conditions available in our laboratory. The following parameters were taken into account in the validation: specificity, linearity, precision, accuracy, detection limit and quantification limit, and the uncertainty of the method was calculated. The validation showed that the technique is specific and precise; the detection limit was 0.118 μg/L and the quantification limit 0.354 μg/L. The global uncertainty was 0.243 and the expanded uncertainty 0.486. The validated method, together with the subsequent introduction of biological exposure limits, will serve as an auxiliary diagnostic tool allowing periodic control of personnel exposure.

  1. Analytical model for screening potential CO2 repositories

    Science.gov (United States)

    Okwen, R.T.; Stewart, M.T.; Cunningham, J.A.

    2011-01-01

    Assessing potential repositories for geologic sequestration of carbon dioxide using numerical models can be complicated, costly, and time-consuming, especially when faced with the challenge of selecting a repository from a multitude of potential repositories. This paper presents a set of simple analytical equations (model), based on the work of previous researchers, that could be used to evaluate the suitability of candidate repositories for subsurface sequestration of carbon dioxide. We considered the injection of carbon dioxide at a constant rate into a confined saline aquifer via a fully perforated vertical injection well. The validity of the analytical model was assessed via comparison with the TOUGH2 numerical model. The metrics used in comparing the two models include (1) spatial variations in formation pressure and (2) vertically integrated brine saturation profile. The analytical model and TOUGH2 show excellent agreement in their results when similar input conditions and assumptions are applied in both. The analytical model neglects capillary pressure and the pressure dependence of fluid properties. However, simulations in TOUGH2 indicate that little error is introduced by these simplifications. Sensitivity studies indicate that the agreement between the analytical model and TOUGH2 depends strongly on (1) the residual brine saturation, (2) the difference in density between carbon dioxide and resident brine (buoyancy), and (3) the relationship between relative permeability and brine saturation. The results achieved suggest that the analytical model is valid when the relationship between relative permeability and brine saturation is linear or quasi-linear and when the irreducible saturation of brine is zero or very small. © 2011 Springer Science+Business Media B.V.
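
    The paper's actual equations are not reproduced in this record. As a generic stand-in for this kind of screening estimate, the sketch below evaluates the classic steady-state radial (Thiem-type) pressure buildup for single-phase injection through a fully perforated vertical well; the function, symbols, and numbers are all illustrative assumptions, not the authors' model.

        import math

        def pressure_buildup(Q, mu, k, H, r, R_out):
            """Steady-state radial pressure rise at radius r [m] for volumetric
            injection rate Q [m^3/s], fluid viscosity mu [Pa.s], permeability
            k [m^2], formation thickness H [m], and undisturbed outer radius
            R_out [m]."""
            return Q * mu / (2.0 * math.pi * k * H) * math.log(R_out / r)

        # Illustrative numbers: ~10 kg/s of CO2 at ~700 kg/m^3, 100 mD, 50 m thick
        Q = 10.0 / 700.0  # m^3/s
        dP = pressure_buildup(Q, mu=6e-5, k=1e-13, H=50.0, r=100.0, R_out=10_000.0)
        print(f"Pressure buildup at 100 m: {dP / 1e5:.2f} bar")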

  2. CARVEDILOL POPULATION PHARMACOKINETIC ANALYSIS – APPLIED VALIDATION PROCEDURE

    Directory of Open Access Journals (Sweden)

    Aleksandra Catić-Đorđević

    2013-09-01

    Full Text Available Carvedilol is a nonselective beta blocker/alpha-1 blocker used for the treatment of essential hypertension, chronic stable angina, unstable angina and ischemic left ventricular dysfunction. The aim of this study was to describe carvedilol population pharmacokinetic (PK) analysis as well as the validation of the analytical procedure, which is an important step in this approach. In contemporary clinical practice, population PK analysis is often more important than the standard PK approach in setting up a mathematical model that describes the PK parameters. It also includes the variables that have particular importance in the drug's pharmacokinetics, such as sex, body mass, dosage, pharmaceutical form, pathophysiological state, diseases associated with the organism, or the presence of a specific polymorphism in an isoenzyme important for biotransformation of the drug. One of the most frequently used approaches in population PK analysis is nonlinear mixed-effects modeling (NONMEM). The analytical method used in the data collection period is of great importance for the implementation of a population PK analysis of carvedilol in order to obtain reliable data that can be useful in clinical practice. High-performance liquid chromatography (HPLC) analysis of carvedilol is used to confirm the identity of the drug, provide quantitative results, and monitor the efficacy of the therapy. Analytical procedures used in other studies could not be fully implemented in our research, so it was necessary to perform certain modifications and to validate the method with the aim of using the obtained results for the purpose of a population pharmacokinetic analysis. The validation process is the logical terminal phase of analytical procedure development that establishes the applicability of the procedure itself. The goal of validation is to ensure the consistency of the method and the accuracy of results, and to confirm the selection of the analytical method for a given sample.

  3. Validation of QuEChERS analytical technique for organochlorines and synthetic pyrethroids in fruits and vegetables using GC-ECD.

    Science.gov (United States)

    Dubey, J K; Patyal, S K; Sharma, Ajay

    2018-03-19

    In the present-day scenario of increasing awareness of and concern about pesticides, it is very important to ensure the quality of the data being generated in pesticide residue analysis. To impart confidence in the products, quality assurance and quality control are used as an integral part of quality management. In order to ensure better quality of results in pesticide residue analysis, validation of the analytical methods to be used is extremely important. Keeping in view the importance of method validation, QuEChERS (quick, easy, cheap, effective, rugged, and safe), a multiresidue method for the extraction of 13 organochlorines and seven synthetic pyrethroids from fruits and vegetables followed by GC-ECD quantification, was validated so that the method could be used for the analysis of samples received in the laboratory. The method has been validated as per the guidelines issued by SANCO (from the French Santé for Health and Consommateurs for Consumers) in accordance with their document SANCO/XXXX/2013. The various parameters analyzed, viz. linearity, specificity, repeatability, reproducibility, and ruggedness, were found to have acceptable values, with a percent RSD of less than 10%. The limit of quantification (LOQ) was established as 0.01 mg kg⁻¹ for the organochlorines and 0.05 mg kg⁻¹ for the synthetic pyrethroids. The uncertainty of measurement (MU) for all these compounds ranged between 1 and 10%. Matrix-matched calibration was used to compensate for the matrix effect on the quantification of the compounds. The overall recovery of the method ranged between 80 and 120%. These results demonstrate the applicability and acceptability of this method for the routine estimation of residues of these 20 pesticides in fruits and vegetables by the laboratory.
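
    For the recovery and precision figures quoted above, a minimal sketch of the usual spiked-replicate arithmetic is given below; the replicate values are invented.

        import statistics

        # Measured concentrations (mg/kg) from replicates spiked at 0.05 mg/kg
        spike_level = 0.05
        replicates = [0.047, 0.051, 0.049, 0.053, 0.048, 0.050]

        mean = statistics.mean(replicates)
        rsd_percent = 100 * statistics.stdev(replicates) / mean  # percent RSD
        recovery_percent = 100 * mean / spike_level              # percent recovery

        print(f"recovery = {recovery_percent:.1f}%, RSD = {rsd_percent:.1f}%")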

  4. Analytical Plan for Roman Glasses

    Energy Technology Data Exchange (ETDEWEB)

    Strachan, Denis M.; Buck, Edgar C.; Mueller, Karl T.; Schwantes, Jon M.; Olszta, Matthew J.; Thevuthasan, Suntharampillai; Heeren, Ronald M.

    2011-01-01

    Roman glasses that have been in the sea or underground for about 1800 years can serve as the independent “experiment” that is needed for validation of codes and models that are used in performance assessment. Two sets of Roman-era glasses have been obtained for this purpose. One set comes from the sunken vessel the Iulia Felix; the second from recently excavated glasses from a Roman villa in Aquileia, Italy. The specimens contain glass artifacts and attached sediment or soil. In the case of the Iulia Felix glasses quite a lot of analytical work has been completed at the University of Padova, but from an archaeological perspective. The glasses from Aquileia have not been so carefully analyzed, but they are similar to other Roman glasses. Both glass and sediment or soil need to be analyzed and are the subject of this analytical plan. The glasses need to be analyzed with the goal of validating the model used to describe glass dissolution. The sediment and soil need to be analyzed to determine the profile of elements released from the glass. This latter need represents a significant analytical challenge because of the trace quantities that need to be analyzed. Both pieces of information will yield important information useful in the validation of the glass dissolution model and the chemical transport code(s) used to determine the migration of elements once released from the glass. In this plan, we outline the analytical techniques that should be useful in obtaining the needed information and suggest a useful starting point for this analytical effort.

  5. Validation of an analytical method for the determination of spiramycin, virginiamycin and tylosin in feeding-stuffs by thin-layer chromatography and bio-autography

    NARCIS (Netherlands)

    Vincent, U.; Gizzi, G.; Holst, von C.; Jong, de J.; Michard, J.

    2007-01-01

    An inter-laboratory validation was carried out to determine the performance characteristics of an analytical method based on thin-layer chromatography (TLC) coupled to microbiological detection (bio-autography) for screening feed samples for the presence of spiramycin, tylosin and virginiamycin.

  6. Calculations for Adjusting Endogenous Biomarker Levels During Analytical Recovery Assessments for Ligand-Binding Assay Bioanalytical Method Validation.

    Science.gov (United States)

    Marcelletti, John F; Evans, Cindy L; Saxena, Manju; Lopez, Adriana E

    2015-07-01

    It is often necessary to adjust for detectable endogenous biomarker levels in spiked validation samples (VS) and in selectivity determinations during bioanalytical method validation for ligand-binding assays (LBA) with a matrix like normal human serum (NHS). Described herein are case studies of biomarker analyses using multiplex LBA which highlight the challenges associated with such adjustments when calculating percent analytical recovery (%AR). The LBA test methods were the Meso Scale Discovery V-PLEX® proinflammatory and cytokine panels, with NHS as the test matrix. The NHS matrix blank exhibited varied endogenous content of the 20 individual cytokines before spiking, ranging from undetectable to readily quantifiable. Addition and subtraction methods for adjusting endogenous cytokine levels in %AR calculations are both used in the bioanalytical field. The two methods were compared in %AR calculations following spiking and analysis of VS for cytokines having detectable endogenous levels in NHS. Calculations for %AR obtained by subtracting quantifiable endogenous biomarker concentrations from the respective total analytical VS values yielded reproducible and credible conclusions. The addition method, in contrast, yielded %AR conclusions that were frequently unreliable and discordant with values obtained with the subtraction adjustment method. It is shown that subtraction of the assay signal attributable to matrix is a feasible alternative when endogenous biomarker levels are below the limit of quantitation but above the limit of detection. These analyses confirm that the subtraction method is preferable over the addition method for adjusting for detectable endogenous biomarker levels when calculating %AR for biomarker LBA.
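
    The two adjustment conventions contrasted in this abstract reduce to simple arithmetic; the sketch below shows both, with invented concentrations, and illustrates how they diverge when the measured total does not equal the endogenous level plus the nominal spike.

        def ar_subtraction(total_measured, endogenous, spike_nominal):
            """%AR with the endogenous level subtracted from the measured total."""
            return 100 * (total_measured - endogenous) / spike_nominal

        def ar_addition(total_measured, endogenous, spike_nominal):
            """%AR with the endogenous level added to the nominal spike."""
            return 100 * total_measured / (endogenous + spike_nominal)

        # Invented example: 80 pg/mL endogenous, 100 pg/mL spike, 150 measured
        endo, spike, total = 80.0, 100.0, 150.0
        print(f"subtraction method: {ar_subtraction(total, endo, spike):.1f}%")  # 70.0%
        print(f"addition method:    {ar_addition(total, endo, spike):.1f}%")     # 83.3%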

  7. Quality data validation: Comprehensive approach to environmental data validation

    International Nuclear Information System (INIS)

    Matejka, L.A. Jr.

    1993-01-01

    Environmental data validation consists of an assessment of three major areas: analytical method validation; field procedures and documentation review; and evaluation of the level of achievement of data quality objectives, based in part on analysis of the PARCC parameters and the expected applications of the data. A program utilizing a matrix association of required levels of validation effort and analytical levels versus applications of the environmental data was developed, in conjunction with DOE-ID guidance documents, to implement actions under the Federal Facilities Agreement and Consent Order in effect at the Idaho National Engineering Laboratory. This was an effort to bring consistent quality to the INEL-wide Environmental Restoration Program and database in an efficient and cost-effective manner. This program, documenting all phases of the review process, is described here.

  8. An analytical model on thermal performance evaluation of counter flow wet cooling tower

    Directory of Open Access Journals (Sweden)

    Wang Qian

    2017-01-01

    Full Text Available This paper proposes an analytical model for the simultaneous heat and mass transfer processes in a counter-flow wet cooling tower, under the assumption that the enthalpy of saturated air is a linear function of the water surface temperature. The performance of the proposed analytical model is validated for some typical cases. The validation reveals that, when the cooling range is within a certain interval, the proposed model is not only comparable to the accurate model but also reduces computational complexity. In addition, the thermal performance of counter-flow wet cooling towers in power plants is calculated with the proposed analytical model. The results show that the proposed analytical model can be applied to evaluate and predict the thermal performance of counter-flow wet cooling towers.
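
    The linear saturated-air-enthalpy assumption mentioned above underlies the well-known effectiveness-NTU treatment of counter-flow wet cooling towers (e.g., Braun's model); the sketch below is that generic treatment, not necessarily the authors' exact formulation, and all inputs are illustrative.

        import math

        def tower_outlet_temp(m_w, m_a, NTU, T_w_in, h_a_in, h_sat, dh_dT, cp_w=4186.0):
            """Counter-flow wet cooling tower via an effectiveness-NTU analogy,
            assuming saturated-air enthalpy is linear in water temperature.
            m_w, m_a : water / dry-air mass flow rates [kg/s]
            h_a_in   : inlet air enthalpy [J/kg dry air]
            h_sat    : saturated-air enthalpy at the inlet water temperature [J/kg]
            dh_dT    : slope of the saturation-enthalpy curve [J/(kg.K)]"""
            m_star = m_a * dh_dT / (m_w * cp_w)       # capacity-rate ratio
            if abs(m_star - 1.0) < 1e-9:
                eff = NTU / (1.0 + NTU)
            else:
                e = math.exp(-NTU * (1.0 - m_star))
                eff = (1.0 - e) / (1.0 - m_star * e)  # counter-flow effectiveness
            Q = eff * m_a * (h_sat - h_a_in)          # heat rejected [W]
            return T_w_in - Q / (m_w * cp_w)          # outlet water temperature [C]

        # Illustrative inputs: 40 C water in, typical warm humid air
        print(tower_outlet_temp(m_w=50.0, m_a=40.0, NTU=1.5, T_w_in=40.0,
                                h_a_in=60e3, h_sat=166e3, dh_dT=5.0e3))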

  9. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals

    DEFF Research Database (Denmark)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G

    2018-01-01

    for the combination of analytical bias and imprecision; Method 2 is based on the Microsoft Excel formula NORMINV, including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results: Method 2 gives the correct results, with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion: The Microsoft Excel formula NORMINV is useful for the estimation
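
    A Python equivalent of the NORMINV-based calculation (scipy's norm.ppf and norm.cdf) is sketched below: given a normalized analytical bias and imprecision, it returns the fraction of reference individuals falling outside the original reference limits. The 2.5%/97.5% limits are the usual convention; the example numbers are invented.

        from scipy.stats import norm

        def fraction_outside(bias, imprecision):
            """Fraction of a reference population outside the original
            2.5-97.5% reference limits when results acquire a normalized
            analytical bias and an added analytical imprecision (both in
            units of the combined biological SD)."""
            lo, hi = norm.ppf(0.025), norm.ppf(0.975)  # original limits
            sd_tot = (1.0 + imprecision**2) ** 0.5     # total SD incl. analytical part
            below = norm.cdf((lo - bias) / sd_tot)
            above = 1.0 - norm.cdf((hi - bias) / sd_tot)
            return below + above

        # With no analytical error the nominal 5% lies outside; bias and
        # imprecision both increase that fraction.
        print(f"{fraction_outside(0.0, 0.0):.3%}")   # 5.000%
        print(f"{fraction_outside(0.25, 0.5):.3%}")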

  10. Validation of an analytical method for the determination of aldehydes and acetone present in the ambient air at the metropolitan area of Costa Rica

    International Nuclear Information System (INIS)

    Rojas Marin, Jose Felix

    2010-01-01

    The analytical method was validated for the simultaneous determination of 15 carbonyl compounds, the main aldehydes and ketones present in ambient air. The compounds were captured on cartridges packed with silica gel impregnated with 2,4-dinitrophenylhydrazine (DNPH) at a constant flow of about 1 L min⁻¹. The carbonyl compounds present formed the respective derivatives, which were then eluted with acetonitrile (solid-phase extraction). The extracts were analyzed by high-performance liquid chromatography with an ultraviolet detector at a wavelength of 360 nm. The following results were obtained during method validation: linearity from 0.03 mg/L to 15 mg/L; limits of detection and quantification of 0.02 mg/L and 0.06 mg/L; for trueness, no significant bias at a confidence level of 95%; and precision, in terms of repeatability and reproducibility of the analytical method, of around 1%. Two sampling campaigns were carried out in the dry and rainy seasons of 2009 in areas of San Jose, Heredia and Belen. The predominant compounds were found to be acetone, acetaldehyde and formaldehyde, the latter being the most abundant in the city of San Jose; the other compounds were not present in significant amounts. There is a strong correlation between formaldehyde and acetaldehyde, suggesting that they stem from a common source, possibly vehicle emissions. (author) [es

  11. Complete analytic results for radiative-recoil corrections to ground-state muonium hyperfine splitting

    International Nuclear Information System (INIS)

    Karshenboim, S.G.; Shelyuto, V.A.; Eides, M.E.

    1988-01-01

    Analytic expressions are obtained for radiative corrections to the hyperfine splitting related to the muon line. The corresponding contribution amounts to (Z²α)(Zα)(m/M)[(9/2)ζ(3) − 3π² ln 2 + 39/8] in units of the Fermi hyperfine splitting energy. A complete analytic result for all radiative-recoil corrections is also presented
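
    As a quick numerical check of the bracketed coefficient (this evaluation is ours, not quoted from the paper):

        import math

        # Evaluate (9/2)*zeta(3) - 3*pi^2*ln(2) + 39/8 from the quoted expression
        zeta3 = sum(1.0 / n**3 for n in range(1, 200_000))  # zeta(3) ~ 1.2020569
        coeff = 4.5 * zeta3 - 3.0 * math.pi**2 * math.log(2.0) + 39.0 / 8.0
        print(f"{coeff:.4f}")  # ~ -10.24, i.e. a negative correction in Fermi units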

  12. An introduction to use of the USACE HTRW program's data validation guidelines engineering manual

    International Nuclear Information System (INIS)

    Becker, L.D.; Coats, K.H.

    1994-01-01

    Data validation has been defined by regulatory agencies as a systematic process (consisting of data editing, screening, checking, auditing, verification, certification, and review) for comparing data to established criteria in order to provide assurance that data are adequate for their intended use. A problem for the USACE HTRW Program was that clearly defined data validation guidelines were available only for analytical data quality level IV. These functional data validation guidelines were designed for the validation of data produced using protocols from the US EPA's Contract Laboratory Program (CLP). Unfortunately, USACE experience demonstrated that these level IV functional data validation guidelines were being used to validate data not produced under the CLP. The resulting data validation product was less than satisfactory for USACE HTRW needs. Therefore, the HTRW-MCX initiated an Engineering Manual (EM) for the validation of analytical data at quality levels other than IV. This EM is entitled "USACE HTRW Data Validation Guidelines." Use of the EM is required for the validation of analytical data relating to projects under the jurisdiction of the Department of the Army, Corps of Engineers, Hazardous, Toxic, and Radioactive Waste Program. These data validation guidelines include procedures and checklists for the technical review of analytical data at quality levels I, II, III, and V.

  13. Advances in classical and analytical mechanics: A review of the author's results

    Directory of Open Access Journals (Sweden)

    Hedrih-Stevanović Katica R.

    2013-01-01

    Full Text Available A review, in the author's subjective choice, of the author's scientific results in the areas of classical mechanics, analytical mechanics of discrete hereditary systems, analytical mechanics of discrete fractional-order system vibrations, elastodynamics, nonlinear dynamics, and hybrid system dynamics is presented. The main original results are presented through the mathematical methods of mechanics, with examples of applications to solving problems of the dynamics of real mechanical systems abstracted to theoretical models of discrete or continuum mechanical systems, as well as hybrid systems. The paper also presents a series of methods and scientific results authored by professors Mitropolyski, Andjelić and Rašković, as well as the author's original scientific research results obtained by the methods of her professors. The vector method, based on mass inertia moment vectors and the corresponding deviational vector components for a pole and an oriented axis, defined in 1991 by K. Hedrih, is presented. Results in the construction of the analytical dynamics of hereditary discrete systems, obtained in collaboration with O. A. Goroshko, are presented. A selection of results of the author's postgraduate students and doctoral candidates in the area of nonlinear dynamics is also given. A list of scientific projects headed by the author is presented, together with a list of doctoral dissertations and master of science theses containing scientific research results obtained under the supervision of the author or of her first doctoral candidates. [Project of the Ministry of Science of the Republic of Serbia, No. ON174001: Dynamics of hybrid systems with complex structures]

  14. Roll-up of validation results to a target application.

    Energy Technology Data Exchange (ETDEWEB)

    Hills, Richard Guy

    2013-09-01

    Suites of experiments are performed over a validation hierarchy to test computational simulation models for complex applications. Experiments within the hierarchy can be performed at conditions and configurations different from those of the intended application, with each experiment testing only part of the physics relevant to the application. The purpose of the present work is to develop methodology to roll up validation results to an application, and to assess the impact the validation hierarchy design has on the roll-up results. The roll-up is accomplished through the development of a meta-model that relates validation measurements throughout a hierarchy to the desired response quantities for the target application. The meta-model is developed using the computational simulation models for the experiments and the application. The meta-model approach is applied to a series of example transport problems that represent complete and incomplete coverage of the physics of the target application by the validation experiments.
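
    The report's specific meta-model is not given in this record. As a loosely analogous sketch, the code below fits a linear map from simulated experiment responses to a simulated application response over an ensemble of model-parameter draws, then applies that map to measured experiment values to roll up a prediction for the application; every dimension, name, and number here is an assumption for illustration.

        import numpy as np

        rng = np.random.default_rng(0)

        # Ensemble of plausible model-parameter draws propagated through the
        # simulation models (all synthetic stand-ins): three experiment
        # responses and one application response per draw.
        theta = rng.normal(size=(200, 2))
        experiments = np.column_stack([
            theta[:, 0] + 0.05 * rng.normal(size=200),
            theta[:, 1] + 0.05 * rng.normal(size=200),
            0.5 * theta[:, 0] - 0.2 * theta[:, 1],
        ])
        application = 2.0 * theta[:, 0] - theta[:, 1]

        # Linear meta-model: application ~ [1, experiments] @ beta (least squares)
        X = np.column_stack([np.ones(len(experiments)), experiments])
        beta, *_ = np.linalg.lstsq(X, application, rcond=None)

        # Roll-up: feed the *measured* experiment results through the meta-model
        measured = np.array([0.3, -0.1, 0.2])
        predicted_app = np.concatenate(([1.0], measured)) @ beta
        print(f"rolled-up application prediction: {predicted_app:.3f}")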

  15. Validation of the replica trick for simple models

    Science.gov (United States)

    Shinzato, Takashi

    2018-04-01

    We discuss the replica analytic continuation using several simple models in order to prove mathematically the validity of the replica analysis, which is used in a wide range of fields related to large-scale complex systems. While replica analysis consists of two analytical techniques—the replica trick (or replica analytic continuation) and the thermodynamical limit (and/or order parameter expansion)—we focus our study on replica analytic continuation, which is the mathematical basis of the replica trick. We apply replica analysis to solve a variety of analytical models, and examine the properties of replica analytic continuation. Based on the positive results for these models we propose that replica analytic continuation is a robust procedure in replica analysis.
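
    For reference, the analytic continuation at issue is the standard replica identity, stated here in its usual textbook form (not quoted from the paper), where Z is the partition function and the angle brackets denote the average over quenched disorder:

        \[
          \langle \ln Z \rangle
          = \lim_{n \to 0} \frac{\langle Z^{n} \rangle - 1}{n}
          = \lim_{n \to 0} \frac{\partial}{\partial n} \ln \langle Z^{n} \rangle ,
        \]

    the moments \langle Z^{n} \rangle being computed for integer n and then continued to real n -> 0.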

  16. Analytical support of plant specific SAMG development validation of SAMG using MELCOR 1.8.5

    International Nuclear Information System (INIS)

    Duspiva, Jiri

    2006-01-01

    There are two NPPs in operation in the Czech Republic. Both NPPs have already implemented EOPs, developed in collaboration with WESE. The project on SAMG development has started and follows the previous one for EOPs, also with WESE as the leading organization. Plant-specific SAMGs for the Temelin and Dukovany NPPs are based on the WOG generic SAMGs. The analytical support for plant-specific SAMG development is performed by NRI Rez within the validation process. The basic conditions, and how they are fulfilled by NRI Rez, concern the analysts, the analytical tools, and their applications. A more detailed description is devoted to the approach of preparing the MELCOR code application for the evaluation of hydrogen risk, the validation of the current set of passive autocatalytic hydrogen recombiners, and the definition of proposals to amend the hydrogen removal system. This kind of parametric calculation requires a very wide set of runs. It is not feasible with the whole-plant model; decoupling the calculation by storing the mass and energy sources into the containment is the only practical way. An example of this decoupling for a LOCA scenario is shown. It includes seven sources: heat losses from the primary and secondary circuits, fluid blowdown through the cold-leg break, fission products blowdown through the cold-leg break, fluid blowdown through the break in the reactor pressure vessel bottom head, fission products through the break in the reactor pressure vessel bottom head, melt ejection from the reactor pressure vessel to the cavity, and gas masses and heat losses from the corium in the cavity. The stand-alone containment analysis was tested in two configurations, with and without taking fission products into account. Testing showed very good agreement of all calculations until lower head failure and acceptable agreement after that. Some problematic features also appeared. The stand-alone test with fission products was possible only after changes in the source code.

  17. Development and validation of a simple high-performance liquid chromatography analytical method for simultaneous determination of phytosterols, cholesterol and squalene in parenteral lipid emulsions.

    Science.gov (United States)

    Novak, Ana; Gutiérrez-Zamora, Mercè; Domenech, Lluís; Suñé-Negre, Josep M; Miñarro, Montserrat; García-Montoya, Encarna; Llop, Josep M; Ticó, Josep R; Pérez-Lozano, Pilar

    2018-02-01

    A simple analytical method for the simultaneous determination of phytosterols, cholesterol and squalene in lipid emulsions was developed, owing to increased interest in their clinical effects. Method development was based on commonly used stationary phases (C18, C8 and phenyl) and mobile phases (mixtures of acetonitrile, methanol and water) under isocratic conditions. Differences in stationary phases resulted in peak overlapping or coelution of different peaks. The best separation of all analyzed compounds was achieved on a Zorbax Eclipse XDB C8 column (150 × 4.6 mm, 5 μm; Agilent) with ACN-H₂O-MeOH, 80:19.5:0.5 (v/v/v). In order to achieve a shorter time of analysis, the method was further optimized and a gradient separation was established. The optimized analytical method was validated and tested for routine use in lipid emulsion analyses. Copyright © 2017 John Wiley & Sons, Ltd.

  18. Analytical results for a hole in an antiferromagnet

    International Nuclear Information System (INIS)

    Li, Y.M.; d'Ambrumenil, N.; Su, Z.B.

    1996-04-01

    The Green's function for a hole moving in an antiferromagnet is derived analytically in the long-wavelength limit. We find that the infrared divergence is eliminated in two and higher dimensions, so that the quasiparticle weight is finite. Our results also suggest that the hole motion is polaronic in nature, with a bandwidth proportional to t²/J exp[−c(t/J)²] (c is a constant) for J/t ≳ 0.5. The connection of the long-wavelength approximation to the first-order approximation in the cumulant expansion is also clarified. (author). 23 refs, 2 figs

  19. 42 CFR 476.84 - Changes as a result of DRG validation.

    Science.gov (United States)

    2010-10-01

    § 476.84 Changes as a result of DRG validation. A provider or practitioner may obtain a review by a QIO ... in DRG assignment as a result of QIO validation activities.

  20. Analytical performances of food microbiology laboratories - critical analysis of 7 years of proficiency testing results.

    Science.gov (United States)

    Abdel Massih, M; Planchon, V; Polet, M; Dierick, K; Mahillon, J

    2016-02-01

    Based on the results of 19 food microbiology proficiency testing (PT) schemes, this study aimed to assess laboratory performances, to highlight the main sources of unsatisfactory analytical results, and to suggest areas of improvement. The 2009-2015 results of the REQUASUD and IPH PT schemes, involving a total of 48 laboratories, were analysed. On average, the laboratories failed to detect or enumerate foodborne pathogens in 3.0% of the tests. Thanks to a close collaboration with the PT participants, the causes of outliers could be identified in 74% of the cases. The main causes of erroneous PT results were pre-analytical (handling of the samples, timing of analysis), analytical (unsuitable methods, confusion of samples, errors in colony counting or confirmation), or post-analytical mistakes (calculation and encoding of results). PT schemes are a privileged observation post from which to highlight analytical problems that would otherwise remain unnoticed. In this perspective, this comprehensive study of PT results provides insight into the sources of systematic errors encountered during the analyses. The study draws the attention of the laboratories to the main causes of analytical errors and suggests practical solutions to avoid them, for educational purposes. The observations support the hypothesis that regular participation in PT, when followed by feedback and appropriate corrective actions, can play a key role in quality improvement and provide more confidence in laboratory testing results. © 2015 The Society for Applied Microbiology.
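
    Laboratory performance in such PT schemes is conventionally scored with z-scores (ISO 13528 style); a minimal sketch with invented numbers follows.

        def z_score(result, assigned_value, sigma_pt):
            """ISO 13528-style z-score: |z| <= 2 satisfactory,
            2 < |z| < 3 questionable, |z| >= 3 unsatisfactory."""
            return (result - assigned_value) / sigma_pt

        # Invented example: log10 CFU/g counts from one PT round
        assigned, sigma = 5.30, 0.25
        for lab, x in {"lab A": 5.41, "lab B": 4.62, "lab C": 6.12}.items():
            z = z_score(x, assigned, sigma)
            verdict = ("satisfactory" if abs(z) <= 2
                       else "questionable" if abs(z) < 3
                       else "unsatisfactory")
            print(f"{lab}: z = {z:+.2f} ({verdict})")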

  1. Irregular analytical errors in diagnostic testing - a novel concept.

    Science.gov (United States)

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    Isotope-dilution mass spectrometry methods are increasingly used for pre-market validation of routine diagnostic assays (these tests also involve substantial sets of clinical validation samples). Based on this definition/terminology, we list recognized causes of irregular analytical error as a risk catalog for clinical chemistry in this article. These issues include reproducible individual analytical errors (e.g. caused by anti-reagent antibodies) and non-reproducible, sporadic errors (e.g. errors due to an incorrect pipetting volume caused by air bubbles in a sample), both of which can lead to inaccurate results and risks for patients.

  2. Validation for chromatographic and electrophoretic methods

    OpenAIRE

    Ribani, Marcelo; Bottoli, Carla Beatriz Grespan; Collins, Carol H.; Jardim, Isabel Cristina Sales Fontes; Melo, Lúcio Flávio Costa

    2004-01-01

    The validation of an analytical method is fundamental to implementing a quality control system in any analytical laboratory. As the separation techniques, GC, HPLC and CE, are often the principal tools used in such determinations, procedure validation is a necessity. The objective of this review is to describe the main aspects of validation in chromatographic and electrophoretic analysis, showing, in a general way, the similarities and differences between the guidelines established by the dif...

  3. Analytical method and result of radiation exposure for depressurization accident of HTTR

    International Nuclear Information System (INIS)

    Sawa, K.; Shiozawa, S.; Mikami, H.

    1990-01-01

    The Japan Atomic Energy Research Institute (JAERI) is now proceeding with the construction design of the High Temperature Engineering Test Reactor (HTTR). Since the HTTR has some characteristics different from those of LWRs, the analytical methods of radiation exposure in accidents provided for LWRs cannot be applied directly. This paper describes the analytical method of radiation exposure developed by JAERI for the depressurization accident, which is the severest accident with respect to radiation exposure among the design basis accidents of the HTTR. The result is also described in this paper.

  4. Tank 241-S-102, Core 232 analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    STEEN, F.H.

    1998-11-04

    This document is the analytical laboratory report for tank 241-S-102 push mode core segments collected between March 5, 1998 and April 2, 1998. The segments were subsampled and analyzed in accordance with the Tank 241-S-102 Retained Gas Sampler System Sampling and Analysis Plan (TSAP) (McCain, 1998), Letter of Instruction for Compatibility Analysis of Samples from Tank 241-S-102 (LOI) (Thompson, 1998) and the Data Quality Objectives for Tank Farms Waste Compatibility Program (DQO) (Mulkey and Miller, 1998). The analytical results are included in the data summary table (Table 1).

  5. Bio-analytical method development and validation of Rasagiline by high performance liquid chromatography tandem mass spectrometry detection and its application to pharmacokinetic study

    Directory of Open Access Journals (Sweden)

    Ravi Kumar Konda

    2012-10-01

    Full Text Available A suitable bio-analytical method based on liquid-liquid extraction has been developed and validated for the quantification of Rasagiline in human plasma. Rasagiline-13C3 mesylate was used as the internal standard for Rasagiline. A Zorbax Eclipse Plus C18 column (2.1 mm × 50 mm, 3.5 μm) provided chromatographic separation of the analyte, followed by detection with mass spectrometry. The method involved a simple isocratic chromatographic condition and mass spectrometric detection in the positive ionization mode using an API-4000 system. The total run time was 3.0 min. The proposed method has been validated over the linear range of 5-12000 pg/mL for Rasagiline. The intra-run and inter-run precision values were within 1.3%-2.9% and 1.6%-2.2%, respectively, for Rasagiline. The overall recovery for Rasagiline and the Rasagiline-13C3 mesylate analog was 96.9% and 96.7%, respectively. This validated method was successfully applied to a bioequivalence and pharmacokinetic study in human volunteers under fasting conditions. Keywords: High performance liquid chromatography, Mass spectrometry, Rasagiline, Liquid-liquid extraction

  6. Review of analytical results from the proposed agent disposal facility site, Aberdeen Proving Ground

    Energy Technology Data Exchange (ETDEWEB)

    Brubaker, K.L.; Reed, L.L.; Myers, S.W.; Shepard, L.T.; Sydelko, T.G.

    1997-09-01

    Argonne National Laboratory reviewed the analytical results from 57 composite soil samples collected in the Bush River area of Aberdeen Proving Ground, Maryland. A suite of 16 analytical tests involving 11 different SW-846 methods was used to detect a wide range of organic and inorganic contaminants. One method (BTEX) was considered redundant, and two "single-number" methods (TPH and TOX) were found to lack the required specificity to yield unambiguous results, especially in a preliminary investigation. Volatile analytes detected at the site include 1,1,2,2-tetrachloroethane, trichloroethylene, and tetrachloroethylene, all of which probably represent residual site contamination from past activities. Other volatile analytes detected include toluene, tridecane, methylene chloride, and trichlorofluoromethane. These compounds are probably not associated with site contamination but likely represent cross-contamination or, in the case of tridecane, a naturally occurring material. Semivolatile analytes detected include three different phthalates and low part-per-billion amounts of the pesticide DDT and its degradation product DDE. The pesticide could represent residual site contamination from past activities, and the phthalates are likely due, in part, to cross-contamination during sample handling. A number of high-molecular-weight hydrocarbons and hydrocarbon derivatives were detected and were probably naturally occurring compounds. 4 refs., 1 fig., 8 tabs.

  7. Validation of NAA Method for Urban Particulate Matter

    International Nuclear Information System (INIS)

    Woro Yatu Niken Syahfitri; Muhayatun; Diah Dwiana Lestiani; Natalia Adventini

    2009-01-01

    Nuclear analytical techniques have been applied in many countries for the determination of environmental pollutants. NAA (neutron activation analysis) is a nuclear analytical technique that offers low detection limits, high specificity, high precision and accuracy for the large majority of naturally occurring elements, non-destructive and simultaneous multi-element determination, and the ability to handle small sample sizes (< 1 mg). To ensure the quality and reliability of the method, validation needs to be performed. A standard reference material, SRM NIST 1648 Urban Particulate Matter, was used to validate the NAA method, with accuracy and precision tests as validation parameters. The particulate matter was validated for 18 elements: Ti, I, V, Br, Mn, Na, K, Cl, Cu, Al, As, Fe, Co, Zn, Ag, La, Cr, and Sm. The results showed that the percent relative standard deviations of the measured elemental concentrations range from 2 to 14.8% for most of the elements analyzed, whereas the HorRat values lie in the range 0.3-1.3. Accuracy test results showed that the relative bias ranged from -11.1 to 3.6%. Based on the validation results, it can be stated that the NAA method is reliable for the characterization of particulate matter and other samples of similar matrix to support air quality monitoring. (author)
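
    The HorRat values quoted above compare observed precision against the Horwitz prediction; a minimal sketch of that calculation, with an invented measurement, follows.

        def horrat(measured_rsd_percent, mass_fraction):
            """HorRat = observed %RSD / Horwitz-predicted %RSD, where the
            Horwitz curve gives PRSD = 2 * C**(-0.1505) with C the analyte
            mass fraction (g/g). Values of roughly 0.3-1.3 are typically
            considered acceptable."""
            prsd = 2.0 * mass_fraction ** -0.1505
            return measured_rsd_percent / prsd

        # Invented example: Zn at 4700 mg/kg (C = 4.7e-3) measured with 5.1% RSD
        print(f"HorRat = {horrat(5.1, 4.7e-3):.2f}")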

  8. Soviet-designed pressurized water reactor symptomatic emergency operating instruction analytical procedure: approach, methodology, development and application

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1999-01-01

    A symptom approach to the analytical validation of symptom-based EOPs includes: (1) Identification of critical safety functions to the maintenance of fission product barrier integrity; (2) Identification of the symptoms which manifest an impending challenge to critical safety function maintenance; (3) Development of a symptomatic methodology to delineate bounding plant transient response modes; (4) Specification of bounding scenarios; (5) Development of a systematic calculational approach consistent with the objectives of the methodology; (6) Performance of thermal-hydraulic computer code calculations implementing the analytical methodology; (7) Interpretation of the analytical results on the basis of information available to the operator; (8) Application of the results to the validation of the proposed operator actions; (9) Production of a technical basis document justifying the proposed operator actions. (author)

  9. Validation of an analytical method for the determination of the sodium content in foods

    International Nuclear Information System (INIS)

    Valverde Montero, Ericka; Silva Trejos, Paulina

    2012-01-01

    The analytical methodology for the quantitative determination of sodium in foods by flame atomic absorption spectrometry was validated. Samples of 0.5 g were digested in a microwave oven with 5.0 mL of nitric acid (HNO₃) at 65% by mass. The linear range was from 0.043 mg/L to 0.70 mg/L, with a correlation coefficient of 0.998. The detection and quantification limits were 0.025 mg/L and 0.043 mg/L, respectively, with a calibration sensitivity of 0.805 L mg⁻¹ and an analytical sensitivity of 44 L mg⁻¹. Precision was evaluated in terms of repeatability, giving a value of 2.9% RSDr. Trueness was determined using three NIST® certified standards: SRM 1846 Infant Formula, with a reported sodium value of (2310 ± 130) mg/kg; SRM 8414 Bovine Muscle Powder, with a reported sodium value of (0.210 ± 0.008)%; and SRM 8415 Whole Egg Powder, with a reported sodium value of (0.377 ± 0.034)% by mass. The bias averaged between -0.010 and 0.009 mg/L. From the list of foods selected for the study, whole milk powder, white wheat bread, fresh cheese, and mozzarella cheese presented the highest sodium contents, with concentrations ranging from 106 to 452 mg Na/100 g. (author) [es

  10. Development and validation of an analytical method for the quality control and stability of 10% phenylephrine plus 1% tropicamide eyedrops

    International Nuclear Information System (INIS)

    Garcia Penna, Caridad Margarita; Botet Garcia, Martha; Troche Concepcion, Yenilen

    2011-01-01

    An analytical high-performance liquid chromatography method was developed and validated for the quality control and stability study of 10% phenylephrine plus 1% tropicamide eyedrops. To quantify both active principles simultaneously in the finished product, separation was carried out on a Lichrosorb RP-18 (15 μm) (260 × 4 mm) chromatography column, with ultraviolet detection at 253 nm, using a mobile phase composed of methanol:distilled water (1:1) with 1.1 g of sodium 1-octanesulfonate per litre and the pH adjusted to 3.0 with phosphoric acid, and quantification against a reference sample using the external standard method. The analytical method developed was linear, precise, specific, and accurate over the range of concentrations studied, as established for the quality control and stability study of the finished product, since no analytical methods had previously been designed for these purposes.

  11. Validated spectroscopic methods for determination of anti-histaminic drug azelastine in pure form: Analytical application for quality control of its pharmaceutical preparations

    Science.gov (United States)

    El-Masry, Amal A.; Hammouda, Mohammed E. A.; El-Wasseef, Dalia R.; El-Ashry, Saadia M.

    2018-02-01

    Two simple, sensitive, rapid, validated, and cost-effective spectroscopic methods were established for the quantification of the antihistaminic drug azelastine (AZL) in bulk powder as well as in pharmaceutical dosage forms. In the first method (A), the absorbance difference between acidic and basic solutions was measured at 228 nm, whereas in the second method (B), the binary complex formed between AZL and eosin Y in acetate buffer solution (pH 3) was measured at 550 nm. The criteria that have a critical influence on the intensity of absorption were studied in depth and optimized so as to achieve the highest absorption. The proposed methods obeyed Beer's law in the concentration ranges of 2.0-20.0 μg·mL⁻¹ and 0.5-15.0 μg·mL⁻¹, with % recovery ± S.D. of (99.84 ± 0.87) and (100.02 ± 0.78) for methods (A) and (B), respectively. Furthermore, the proposed methods were easily applied to the quality control of pharmaceutical preparations without any interference from the co-formulated additives, and the analytical results were compatible with those obtained by the comparison method, with no significant difference as ensured by Student's t-test and the variance ratio F-test. Validation of the proposed methods was performed according to the ICH guidelines in terms of linearity, limit of quantification, limit of detection, accuracy, precision, and specificity, and the analytical results were persuasive. [Figure captions: absorption spectra of AZL (16 μg·mL⁻¹) in 0.1 M HCl and in 0.1 M NaOH; difference absorption spectrum of AZL (16 μg·mL⁻¹) in 0.1 M NaOH vs 0.1 M HCl; absorption spectrum of the eosin binary complex with AZL (10 μg·mL⁻¹).]

  12. Analytical solutions of the electrostatically actuated curled beam problem

    KAUST Repository

    Younis, Mohammad I.

    2014-07-24

    This work presents analytical expressions for the electrostatically actuated, initially deformed cantilever beam problem. The formulation is based on the continuous Euler-Bernoulli beam model combined with a single-mode Galerkin approximation. We derive simple analytical expressions for two commonly observed deformed-beam configurations: the curled and tilted configurations. The derived analytical formulas are validated by comparing their results to experimental data and to numerical results of a multi-mode reduced-order model. The derived expressions do not involve any complicated integrals or complex terms and can be conveniently used by designers for quick, yet accurate, estimations. The formulas are found to yield accurate results for the most commonly encountered microbeams, with initial tip deflections of a few microns. For largely deformed beams, we found that these formulas yield less accurate results due to the limitations of the single-mode approximation. In such cases, multi-mode reduced-order models are shown to yield accurate results. © 2014 Springer-Verlag Berlin Heidelberg.

  13. Pesticides residues in water treatment plant sludge: validation of analytical methodology using liquid chromatography coupled to Tandem mass spectrometry (LC-MS/MS)

    International Nuclear Information System (INIS)

    Moracci, Luiz Fernando Soares

    2008-01-01

    The evolving scenario of Brazilian agriculture brings benefits to the population and demands technological advances in this field. New pesticides are constantly being introduced, encouraging scientific studies aimed at determining and evaluating their impacts on the population and on the environment. In this work, the evaluated sample was the sludge resulting from a water treatment plant located in the Vale do Ribeira, Sao Paulo, Brazil. The technique used was reversed-phase liquid chromatography coupled to electrospray ionization tandem mass spectrometry. The compounds were previously liquid-extracted from the matrix. The development of the methodology required data processing in order to turn the data into reliable information, involving the concepts of validation of chemical analysis. The evaluated parameters were selectivity, linearity, range, sensitivity, accuracy, precision, limit of detection, limit of quantification, and robustness. The qualitative and quantitative results obtained were statistically treated and are presented. The developed and validated methodology is simple. Even exploring the sensitivity of the analytical technique, the target compounds were not detected in the sludge of the WTP. One possible explanation is that these compounds are present at very low concentrations, are degraded under the conditions of the water treatment process, or are not completely retained by the WTP. (author)

  14. Validation of a new analytical procedure for determination of residual solvents in [18F]FDG by gas chromatography

    International Nuclear Information System (INIS)

    Costa, Flávia M.; Costa, Cassiano L.S.; Silva, Juliana B.; Ferreira, Soraya M.Z.M.D.

    2017-01-01

    Fludeoxyglucose F 18 ([¹⁸F]FDG) is the radiopharmaceutical most used for positron emission tomography, especially in oncology. Organic solvents such as ether, ethanol and acetonitrile may be used in the synthesis of [¹⁸F]FDG; however, they may not be completely removed during the purification steps. The determination of residual solvents in [¹⁸F]FDG is required by the European Pharmacopoeia (EP) and the United States Pharmacopeia (USP) monographs. While the procedure described in the EP is quite general, the one described in the USP requires a long run time (about 13 minutes). In this work, a simple and fast (4-minute) analytical procedure was developed and validated for the determination of residual solvents in [¹⁸F]FDG. Analyses were carried out in a Perkin Elmer gas chromatograph equipped with a flame ionization detector. The separation was obtained on a 0.53 mm × 30 m fused-silica column. Validation included the evaluation of various parameters, such as specificity, linearity and range, limits of detection and quantitation, precision (repeatability and intermediate precision), accuracy, and robustness. Results were found to be within acceptable limits, indicating that the developed procedure is suitable for its intended application. Considering the short half-life of fluorine-18 (109.7 minutes), this new method could be a valuable alternative for the routine quality control of [¹⁸F]FDG. (author)

  15. On the Analytical Solution of Non-Orthogonal Stagnation Point Flow towards a Stretching Sheet

    DEFF Research Database (Denmark)

    Kimiaeifar, Amin; Bagheri, G. H.; Barari, Amin

    2011-01-01

    An analytical solution for non-orthogonal stagnation point flow of a steady, viscous, incompressible fluid is presented. The governing nonlinear partial differential equations for the flow field are reduced to ordinary differential equations by using similarity transformations existing in the literature and are solved analytically by means of the Homotopy Analysis Method (HAM). The comparison of results from this paper with those published in the literature confirms the precise accuracy of the HAM. The resulting analytical equation from HAM is valid for the entire physical domain and effective

  16. Development and validation of HPLC analytical method for quantitative determination of metronidazole in human plasma

    International Nuclear Information System (INIS)

    Safdar, K.A.; Shyum, S.B.; Usman, S.

    2016-01-01

    The objective of the present study was to develop a simple, rapid and sensitive reversed-phase high-performance liquid chromatographic (RP-HPLC) analytical method with a UV detection system for the quantitative determination of metronidazole in human plasma. The chromatographic separation was performed using a C18 RP column (250 mm × 4.6 mm, 5 μm) as the stationary phase and 0.01 M potassium dihydrogen phosphate buffered at pH 3.0 and acetonitrile (83:17, v/v) as the mobile phase, at a flow rate of 1.0 mL/min. UV detection was carried out at 320 nm. The method was validated as per the US FDA guideline for bioanalytical method validation and was found to be selective, without interferences from mobile phase components, impurities and the biological matrix. The method was found to be linear over the concentration range of 0.2812 μg/mL to 18.0 μg/mL (r² = 0.9987) with an adequate level of accuracy and precision. The samples were found to be stable under various recommended laboratory and storage conditions. Therefore, the method can be used with an adequate level of confidence and assurance for bioavailability, bioequivalence and other pharmacokinetic studies of metronidazole in humans. (author)

  17. Comparison of gamma knife validation film analysis results from different film dose analysis software

    International Nuclear Information System (INIS)

    Cheng Xiaojun; Zhang Conghua; Liu Han; Dai Fuyou; Hu Chuanpeng; Liu Cheng; Yao Zhongfu

    2011-01-01

    Objective: To compare the analytical results of different film dose analysis software packages for the same gamma knife, analyze the reasons for the differences, and explore measurements and means for quality control and quality assurance during gamma knife testing and result analysis. Methods: The Moon Deity gamma knife was tested with Kodak EDR2 film and the γ-Star gamma knife with GAFCHROMIC® EBT film. All validation films were scanned into an image format suitable for the dose analysis software with an EPSON PERFECTION V750 PRO scanner. Images of the Moon Deity gamma knife were analyzed with Robot Knife Adjuvant 1.09 and Fas-09 1.0, and images of the γ-Star gamma knife with Fas-09 and MATLAB 7.0. Results: There was no significant difference in the maximum deviation of the radiation field size (full width at half maximum, FWHM) from its nominal value between Robot Knife Adjuvant and Fas-09 for the Moon Deity gamma knife (t=-2.133, P>0.05). Analysis of the penumbra width of the radiation field for collimators of different sizes indicated that the differences were significant (t=-8.154, P<0.05). There was no significant difference in the maximum deviation of FWHM from its nominal value between Fas-09 and MATLAB for the γ-Star gamma knife (t=-1.384, P>0.05). However, following national standards, analysis of the φ4 mm collimator gave different results with the two software packages: the Fas-09 result was not qualified while the MATLAB result was qualified. Analysis of the penumbra width for collimators of different sizes indicated that the differences were significant (t=3.074, P<0.05). The images were processed with Fas-09. Analysis of the images before and after processing indicated no significant difference in the maximum deviation of FWHM from its nominal value (t=0.647, P>0.05), and the analysis of the penumbra width of the radiation field indicates that there is

  18. Modification and validation of an analytical source model for external beam radiotherapy Monte Carlo dose calculations

    Energy Technology Data Exchange (ETDEWEB)

    Davidson, Scott E., E-mail: sedavids@utmb.edu [Radiation Oncology, The University of Texas Medical Branch, Galveston, Texas 77555 (United States); Cui, Jing [Radiation Oncology, University of Southern California, Los Angeles, California 90033 (United States); Kry, Stephen; Ibbott, Geoffrey S.; Followill, David S. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States); Deasy, Joseph O. [Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, New York 10065 (United States); Vicic, Milos [Department of Applied Physics, University of Belgrade, Belgrade 11000 (Serbia); White, R. Allen [Bioinformatics and Computational Biology, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States)

    2016-08-15

    Purpose: A dose calculation tool, which combines the accuracy of the dose planning method (DPM) Monte Carlo code with the versatility of a practical analytical multisource model, and which was previously reported, has been improved and validated for the Varian 6 and 10 MV linear accelerators (linacs). The calculation tool can be used to calculate doses in advanced clinical application studies. One shortcoming of current clinical trials that report dose from patient plans is the lack of a standardized dose calculation methodology. Because commercial treatment planning systems (TPSs) have their own dose calculation algorithms, and the clinical trial participant who uses these systems is responsible for commissioning the beam model, variation exists in the reported calculated dose distributions. Today's modern linac is manufactured to tight specifications, so variability within a linac model is quite low. The expectation is that a single dose calculation tool for a specific linac model can be used to accurately recalculate dose from patient plans that have been submitted to the clinical trial community from any institution. The calculation tool would provide for a more meaningful outcome analysis. Methods: The analytical source model was described by a primary point source, a secondary extra-focal source, and a contaminant electron source. Off-axis energy softening and fluence effects were also included. Hyperbolic functions have been incorporated into the model to correct for the changes in output and in electron contamination with field size. A multileaf collimator (MLC) model is included to facilitate phantom and patient dose calculations. An offset to the MLC leaf positions was used to correct for the rudimentary assumed primary point source. Results: Dose calculations of the depth dose and profiles for field sizes 4 × 4 to 40 × 40 cm agree with measurement within 2% of the maximum dose or 2 mm distance to agreement (DTA) for 95% of the data

  19. Extension of analytical indicial aerodynamics to generic trapezoidal wings in subsonic flow

    Directory of Open Access Journals (Sweden)

    Andrea DA RONCH

    2018-04-01

    Full Text Available Analytical indicial aerodynamic functions are calculated for several trapezoidal wings in subsonic flow, with a Mach number 0.3 ≤ Ma ≤ 0.7. The formulation herein proposed extends well-known aerodynamic theories, which are limited to thin aerofoils in incompressible flow, to generic trapezoidal wing planforms. Firstly, a thorough study is executed to assess the accuracy and limitation of analytical predictions, using unsteady results from two state-of-the-art computational fluid dynamics solvers as cross-validated benchmarks. Indicial functions are calculated for a step change in the angle of attack and for a sharp-edge gust, each for four wing configurations and three Mach numbers. Then, analytical and computational indicial responses are used to predict dynamic derivatives and the maximum lift coefficient following an encounter with a one-minus-cosine gust. It is found that the analytical results are in excellent agreement with the computational results for all test cases. In particular, the deviation of the analytical results from the computational results is within the scatter or uncertainty in the data arising from using two computational fluid dynamics solvers. This indicates the usefulness of the developed analytical theories. Keywords: Analytical approach, CFD, Compressible flow, Gust response, Indicial aerodynamics, Trapezoidal wing

  20. Analytical Model for High Impedance Fault Analysis in Transmission Lines

    Directory of Open Access Journals (Sweden)

    S. Maximov

    2014-01-01

    Full Text Available A high impedance fault (HIF) normally occurs when an overhead power line physically breaks and falls to the ground. Such faults are difficult to detect because they often draw small currents which cannot be detected by conventional overcurrent protection. Furthermore, an electric arc accompanies HIFs, resulting in fire hazard, damage to electrical devices, and risk to human life. This paper presents an analytical model to analyze the interaction between the electric arc associated with HIFs and a transmission line. A joint analytical solution to the wave equation for a transmission line and a nonlinear equation for the arc model is presented. The analytical model is validated by means of comparisons between measured and calculated results. Several case studies are presented which support the soundness and accuracy of the proposed model.
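    The abstract does not reproduce the nonlinear arc equation, so as a stand-in the sketch below integrates a Mayr-type arc-conductance model driven by a prescribed sinusoidal current; the model choice and all parameter values are assumptions for illustration only.

        import numpy as np
        from scipy.integrate import solve_ivp

        theta, P0 = 1e-3, 50e3           # arc time constant (s) and cooling power (W), assumed
        Irms, w = 100.0, 2 * np.pi * 60  # driving current amplitude and frequency, assumed

        def mayr(t, y):
            # Mayr model: dg/dt = (i^2/P0 - g) / theta, g = arc conductance
            g = y[0]
            i = np.sqrt(2) * Irms * np.sin(w * t)
            return [(i ** 2 / P0 - g) / theta]

        sol = solve_ivp(mayr, (0, 0.05), [0.01], max_step=1e-5)
        i = np.sqrt(2) * Irms * np.sin(w * sol.t)
        v_arc = i / np.clip(sol.y[0], 1e-9, None)   # arc voltage = i / g
        print(f"peak arc voltage ~ {np.abs(v_arc).max():.0f} V")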

  1. Validated sampling strategy for assessing contaminants in soil stockpiles

    International Nuclear Information System (INIS)

    Lame, Frank; Honders, Ton; Derksen, Giljam; Gadella, Michiel

    2005-01-01

    Dutch legislation on the reuse of soil requires a sampling strategy to determine the degree of contamination. This sampling strategy was developed in three stages. Its main aim is to obtain a single analytical result, representative of the true mean concentration of the soil stockpile. The development process started with an investigation into how sample pre-treatment could be used to obtain representative results from composite samples of heterogeneous soil stockpiles. Combining a large number of random increments allows stockpile heterogeneity to be fully represented in the sample. The resulting pre-treatment method was then combined with a theoretical approach to determine the necessary number of increments per composite sample. At the second stage, the sampling strategy was evaluated using computerised models of contaminant heterogeneity in soil stockpiles. The now theoretically based sampling strategy was implemented by the Netherlands Centre for Soil Treatment in 1995. It was applied to all types of soil stockpiles, ranging from clean to heavily contaminated, over a period of four years. This resulted in a database containing the analytical results of 2570 soil stockpiles. At the final stage these results were used for a thorough validation of the sampling strategy. It was concluded that the model approach has indeed resulted in a sampling strategy that achieves analytical results representative of the mean concentration of soil stockpiles. - A sampling strategy that ensures analytical results representative of the mean concentration in soil stockpiles is presented and validated
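    The core idea, that a composite built from many random increments drives the sampling error of the single analytical result down roughly as 1/sqrt(n), can be illustrated with a small simulation; the lognormal heterogeneity model and all numbers below are hypothetical, not the Dutch stockpile data.

        import numpy as np

        rng = np.random.default_rng(1)
        true_mean = 50.0   # hypothetical mean contaminant concentration (mg/kg)
        stockpile = rng.lognormal(np.log(true_mean) - 0.5, 1.0, size=100_000)

        for n_increments in (1, 10, 50, 100):
            # Each composite is the mean of n random increments; repeating the
            # sampling shows how the spread of the reported result shrinks.
            idx = rng.integers(0, stockpile.size, (5000, n_increments))
            composites = stockpile[idx].mean(axis=1)
            print(f"n={n_increments:3d}: mean {composites.mean():5.1f}, "
                  f"SD {composites.std():4.1f} mg/kg")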

  2. Surveillance of emerging drugs of abuse in Hong Kong: validation of an analytical tool.

    Science.gov (United States)

    Tang, Magdalene H Y; Ching, C K; Tse, M L; Ng, Carol; Lee, Caroline; Chong, Y K; Wong, Watson; Mak, Tony W L

    2015-04-01

    To validate a locally developed chromatography-based method to monitor emerging drugs of abuse whilst performing regular drug testing in abusers. Cross-sectional study. Eleven regional hospitals, seven social service units, and a tertiary level clinical toxicology laboratory in Hong Kong. A total of 972 drug abusers and high-risk individuals were recruited from acute, rehabilitation, and high-risk settings between 1 November 2011 and 31 July 2013. A subset of the participants was of South Asian ethnicity. In total, 2000 urine or hair specimens were collected. Proof of concept that surveillance of emerging drugs of abuse can be performed whilst conducting routine drug of abuse testing in patients. The method was successfully applied to 2000 samples with three emerging drugs of abuse detected in five samples: PMMA (paramethoxymethamphetamine), TFMPP [1-(3-trifluoromethylphenyl)piperazine], and methcathinone. The method also detected conventional drugs of abuse, with codeine, methadone, heroin, methamphetamine, and ketamine being the most frequently detected drugs. Other findings included the observation that South Asians had significantly higher rates of using opiates such as heroin, methadone, and codeine; and that ketamine and cocaine had significantly higher detection rates in acute subjects compared with the rehabilitation population. This locally developed analytical method is a valid tool for simultaneous surveillance of emerging drugs of abuse and routine drug monitoring of patients at minimal additional cost and effort. Continued, proactive surveillance and early identification of emerging drugs will facilitate prompt clinical, social, and legislative management.

  3. Statistically qualified neuro-analytic failure detection method and system

    Science.gov (United States)

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    2002-03-02

    An apparatus and method for monitoring a process involve the development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic modification of the deterministic model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation-error minimization technique. Stochastic modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates the measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
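    The validation step mentions sequential probability ratio tests (SPRT). A textbook Wald SPRT for a mean shift in Gaussian model residuals is sketched below in Python; the hypothesis means, error levels and data are illustrative, not the patent's configuration.

        import numpy as np

        def sprt(residuals, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
            """Wald SPRT: H0 residual mean = mu0 (healthy) vs H1 mean = mu1 (fault).
            Returns the decision and the sample index at which it was reached."""
            A, B = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
            llr = 0.0
            for k, r in enumerate(residuals, 1):
                llr += (mu1 - mu0) * (r - (mu0 + mu1) / 2) / sigma ** 2
                if llr >= A:
                    return "fault detected", k
                if llr <= B:
                    return "no fault", k
            return "undecided", len(residuals)

        rng = np.random.default_rng(0)
        print(sprt(rng.normal(0.0, 1.0, 200)))   # healthy process
        print(sprt(rng.normal(1.0, 1.0, 200)))   # shifted (faulty) process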

  4. Anisotropic Multishell Analytical Modeling of an Intervertebral Disk Subjected to Axial Compression.

    Science.gov (United States)

    Demers, Sébastien; Nadeau, Sylvie; Bouzid, Abdel-Hakim

    2016-04-01

    Studies on intervertebral disk (IVD) response to various loads and postures are essential to understand the disk's mechanical function and to suggest preventive and corrective actions in the workplace. The experimental and finite-element (FE) approaches are well suited for such studies, but validating their findings is difficult, partly due to the lack of alternative methods. Analytical modeling could allow methodological triangulation and help validate FE models. This paper presents an analytical method based on thin-shell, beam-on-elastic-foundation and composite-material theories to evaluate the stresses in the anulus fibrosus (AF) of an axisymmetric disk composed of multiple thin lamellae. Large deformations of the soft tissues are accounted for using an iterative method, and the anisotropic material properties are derived from a published biaxial experiment. The results are compared to those obtained by FE modeling. They demonstrate the capability of the analytical model to evaluate the stresses at any location of the simplified AF, and show that anisotropy reduces stresses in the lamellae. This novel model is a preliminary step in developing valuable analytical models of IVDs and provides a distinctive groundwork able to sustain future refinements. This paper also suggests important features that may be included to improve model realism.

  5. Analytic webs support the synthesis of ecological data sets.

    Science.gov (United States)

    Ellison, Aaron M; Osterweil, Leon J; Clarke, Lori; Hadley, Julian L; Wise, Alexander; Boose, Emery; Foster, David R; Hanson, Allen; Jensen, David; Kuzeja, Paul; Riseman, Edward; Schultz, Howard

    2006-06-01

    A wide variety of data sets produced by individual investigators are now synthesized to address ecological questions that span a range of spatial and temporal scales. It is important to facilitate such syntheses so that "consumers" of data sets can be confident that both input data sets and synthetic products are reliable. Necessary documentation to ensure the reliability and validation of data sets includes both familiar descriptive metadata and formal documentation of the scientific processes used (i.e., process metadata) to produce usable data sets from collections of raw data. Such documentation is complex and difficult to construct, so it is important to help "producers" create reliable data sets and to facilitate their creation of required metadata. We describe a formal representation, an "analytic web," that aids both producers and consumers of data sets by providing complete and precise definitions of scientific processes used to process raw and derived data sets. The formalisms used to define analytic webs are adaptations of those used in software engineering, and they provide a novel and effective support system for both the synthesis and the validation of ecological data sets. We illustrate the utility of an analytic web as an aid to producing synthetic data sets through a worked example: the synthesis of long-term measurements of whole-ecosystem carbon exchange. Analytic webs are also useful validation aids for consumers because they support the concurrent construction of a complete, Internet-accessible audit trail of the analytic processes used in the synthesis of the data sets. Finally we describe our early efforts to evaluate these ideas through the use of a prototype software tool, SciWalker. We indicate how this tool has been used to create analytic webs tailored to specific data-set synthesis and validation activities, and suggest extensions to it that will support additional forms of validation. The process metadata created by SciWalker is

  6. Validation of an analytical method for the determination of polycyclic aromatic hydrocarbons by high-performance liquid chromatography in PM10 and PM2.5 particles

    International Nuclear Information System (INIS)

    Herrera Murillo, Jorge; Chaves Villalobos, Maria del Carmen

    2012-01-01

    An analytical method for polycyclic aromatic hydrocarbons (PAHs) in PM10 and PM2.5 particles collected from air was validated using high-performance liquid chromatography (HPLC). The PAHs covered by the methodology are: naphthalene, acenaphthylene, fluorene, acenaphthene, phenanthrene, anthracene, fluoranthene, pyrene, benzo(a)anthracene, chrysene, benzo(b)fluoranthene, benzo(k)fluoranthene, benzo(a)pyrene, dibenzo(a,h)anthracene, benzo(g,h,i)perylene and indeno(1,2,3-cd)pyrene. For these compounds, the detection and quantification limits were between 0.02 and 0.1 mg/L. A DIONEX ICS 3000 system was used, with two detectors in series: an ultraviolet detector, model VWD-1, and a fluorescence detector, model RF-2000, separating the different absorption and emission signals for proper identification of the individual compounds. For all the compounds analyzed, the recovery factors were found not to be significantly different from each other, and the repeatability and reproducibility were found to be suitable for an analytical method, especially for the lighter PAHs. (author) [es

  7. Analytical Method Validation of High-Performance Liquid Chromatography and Stability-Indicating Study of Medroxyprogesterone Acetate Intravaginal Sponges

    Directory of Open Access Journals (Sweden)

    Nidal Batrawi

    2017-02-01

    Full Text Available Medroxyprogesterone acetate is widely used in veterinary medicine as an intravaginal dosage form for the synchronization of the breeding cycle in ewes and goats. The main goal of this study was to develop a reverse-phase high-performance liquid chromatography method for the quantification of medroxyprogesterone acetate in veterinary vaginal sponges. A single high-performance liquid chromatography/UV isocratic run was used for the analytical assay of the active ingredient medroxyprogesterone. The chromatographic system consisted of a reverse-phase C18 column as the stationary phase and a mixture of 60% acetonitrile and 40% potassium dihydrogen phosphate buffer as the mobile phase; the pH was adjusted to 5.6. The method was validated according to the International Council for Harmonisation (ICH) guidelines. Forced degradation studies were also performed to evaluate the stability-indicating properties and specificity of the method. Medroxyprogesterone was eluted at 5.9 minutes. The linearity of the method was confirmed in the range of 0.0576 to 0.1134 mg/mL (R² > 0.999). The limit of quantification was shown to be 3.9 µg/mL. Precision and accuracy were found to be %RSD < 0.2 and 98% to 102%, respectively. A medroxyprogesterone capacity factor of 2.1, tailing factor of 1.03, and resolution of 3.9 were obtained, in accordance with ICH guidelines. Based on the obtained results, a rapid, precise, accurate, sensitive, and cost-effective analysis procedure was proposed for the quantitative determination of medroxyprogesterone in vaginal sponges. This analytical method is the only available method to analyse medroxyprogesterone in a veterinary intravaginal dosage form.
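    The reported system-suitability figures follow from standard chromatographic formulas. The Python sketch below shows the arithmetic; the dead time and peak widths are hypothetical values chosen only to reproduce numbers of the reported magnitude.

        def capacity_factor(t_r, t_0):
            return (t_r - t_0) / t_0                # k = (tR - t0) / t0

        def tailing_factor(w_005, f):
            return w_005 / (2 * f)                  # USP tailing, width at 5% height

        def resolution(t_r1, t_r2, w_1, w_2):
            return 2 * (t_r2 - t_r1) / (w_1 + w_2)  # baseline peak widths

        t0 = 1.90   # assumed unretained marker time (min)
        print(round(capacity_factor(5.9, t0), 2))           # ~2.1 as reported
        print(round(tailing_factor(0.247, 0.12), 2))        # ~1.03 with assumed widths
        print(round(resolution(4.7, 5.9, 0.30, 0.31), 2))   # ~3.9 with assumed widths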

  8. Experimental verification and analytical calculation of unbalanced magnetic force in permanent magnet machines

    Directory of Open Access Journals (Sweden)

    Kyung-Hun Shin

    2017-05-01

    Full Text Available In this study, an exact analytical solution based on Fourier analysis is proposed to compute the unbalanced magnetic force (UMF) in a permanent magnet (PM) machine. The magnetic field solutions are obtained by using a magnetic vector potential and by selecting the appropriate boundary conditions. Based on these field solutions, the force characteristics are also determined analytically. All analytical results were extensively validated against nonlinear two-dimensional finite element analysis and experimental results. Using the proposed method, we investigated the influence of machine parameters on the UMF. The proposed method should therefore be very useful for UMF analysis in the initial design and optimization of PM machines.

  9. Validation of a new analytical procedure for determination of residual solvents in [¹⁸F]FDG by gas chromatography

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Flávia M.; Costa, Cassiano L.S.; Silva, Juliana B.; Ferreira, Soraya M.Z.M.D., E-mail: flaviabiomedica@yahoo.com.br [Centro de Desenvolvimento da Tecnologia Nuclear (UPPR/CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Unidade de Pesquisa e Produção de Radiofármacos

    2017-07-01

    Fludeoxyglucose F 18 ([¹⁸F]FDG) is the most used radiopharmaceutical for positron emission tomography, especially in oncology. Organic solvents such as ether, ethanol and acetonitrile might be used in the synthesis of [¹⁸F]FDG; however, they might not be completely removed during the purification steps. The determination of residual solvents in [¹⁸F]FDG is required by the European Pharmacopoeia (EP) and the United States Pharmacopeia (USP) monographs. While the procedure described in the EP is quite general, the one described in the USP requires a long runtime (about 13 minutes). In this work a simple and fast (4-minute) analytical procedure was developed and validated for the determination of residual solvents in [¹⁸F]FDG. Analyses were carried out in a Perkin Elmer gas chromatograph equipped with a flame ionization detector. The separation was obtained on a 30 m × 0.53 mm fused-silica column. Validation included the evaluation of various parameters: specificity, linearity and range, limits of detection and quantitation, precision (repeatability and intermediate precision), accuracy, and robustness. Results were found to be within acceptable limits, indicating that the developed procedure is suitable for its intended application. Considering the short half-life of fluorine-18 (109.7 minutes), this new method could be a valuable alternative for the routine quality control of [¹⁸F]FDG. (author)

  10. Analytical Validation of the ReEBOV Antigen Rapid Test for Point-of-Care Diagnosis of Ebola Virus Infection.

    Science.gov (United States)

    Cross, Robert W; Boisen, Matthew L; Millett, Molly M; Nelson, Diana S; Oottamasathien, Darin; Hartnett, Jessica N; Jones, Abigal B; Goba, Augustine; Momoh, Mambu; Fullah, Mohamed; Bornholdt, Zachary A; Fusco, Marnie L; Abelson, Dafna M; Oda, Shunichiro; Brown, Bethany L; Pham, Ha; Rowland, Megan M; Agans, Krystle N; Geisbert, Joan B; Heinrich, Megan L; Kulakosky, Peter C; Shaffer, Jeffrey G; Schieffelin, John S; Kargbo, Brima; Gbetuwa, Momoh; Gevao, Sahr M; Wilson, Russell B; Saphire, Erica Ollmann; Pitts, Kelly R; Khan, Sheik Humarr; Grant, Donald S; Geisbert, Thomas W; Branco, Luis M; Garry, Robert F

    2016-10-15

    Ebola virus disease (EVD) is a severe viral illness caused by Ebola virus (EBOV). The 2013-2016 EVD outbreak in West Africa is the largest recorded, with >11 000 deaths. Development of the ReEBOV Antigen Rapid Test (ReEBOV RDT) was expedited to provide a point-of-care test for suspected EVD cases. Recombinant EBOV viral protein 40 antigen was used to derive polyclonal antibodies for RDT and enzyme-linked immunosorbent assay development. ReEBOV RDT limits of detection (LOD), specificity, and interference were analytically validated on the basis of Food and Drug Administration (FDA) guidance. The ReEBOV RDT specificity estimate was 95% for donor serum panels and 97% for donor whole-blood specimens. The RDT demonstrated sensitivity to 3 species of Ebolavirus (Zaire ebolavirus, Sudan ebolavirus, and Bundibugyo ebolavirus) associated with human disease, with no cross-reactivity by pathogens associated with non-EBOV febrile illness, including malaria parasites. Interference testing exhibited no reactivity by medications in common use. The LOD for antigen was 4.7 ng/test in serum and 9.4 ng/test in whole blood. Quantitative reverse transcription-polymerase chain reaction testing of nonhuman primate samples determined the range to be equivalent to 3.0 × 10⁵ to 9.0 × 10⁸ genomes/mL. The analytical validation presented here contributed to the ReEBOV RDT being the first antigen-based assay to receive FDA and World Health Organization emergency use authorization for this EVD outbreak, in February 2015.

  11. Validating and Determining the Weight of Items Used for Evaluating Clinical Governance Implementation Based on Analytic Hierarchy Process Model

    Directory of Open Access Journals (Sweden)

    Elaheh Hooshmand

    2015-10-01

    Full Text Available Background The purpose of implementing a system such as Clinical Governance (CG) is to integrate, establish and globalize distinct policies in order to improve quality through increasing the professional knowledge and accountability of healthcare professionals toward providing clinical excellence. Since CG is related to change, and change requires money and time, CG implementation has to focus on priority areas that are most in need of change. The purpose of the present study was to validate and determine the significance of items used for evaluating CG implementation. Methods The present study was descriptive-quantitative in method and design. Items used for evaluating CG implementation were first validated by the Delphi method and then compared with one another and ranked based on the Analytic Hierarchy Process (AHP) model. Results The items validated for evaluating CG implementation in Iran include performance evaluation, training and development, personnel motivation, clinical audit, clinical effectiveness, risk management, resource allocation, policies and strategies, external audit, information system management, research and development, CG structure, implementation prerequisites, the management of patients' non-medical needs, complaints and patients' participation in the treatment process. The most important items by degree of significance were training and development, performance evaluation, and risk management. The least important items were the management of patients' non-medical needs, patients' participation in the treatment process, and research and development. Conclusion The fundamental requirements of CG implementation include an effective policy at the national level, avoiding perfectionism, using the expertise and potential of the entire country, and coordinating this model with other quality improvement models such as accreditation and patient safety.
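    In an AHP ranking of this kind, item weights come from the principal eigenvector of a pairwise-comparison matrix and are checked with a consistency ratio. A minimal Python sketch follows; the 3 × 3 judgement matrix is a toy example, not the study's data.

        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],      # hypothetical Saaty-scale judgements
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                         # priority weights

        n = A.shape[0]
        CI = (eigvals.real[k] - n) / (n - 1)        # consistency index
        RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index
        print("weights:", np.round(w, 3), "CR =", round(CI / RI, 3))  # CR < 0.1 acceptable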

  12. Development and validation of an analytical method for histamine determination in fish, using reverse-phase high-performance liquid chromatography with ultraviolet detection

    International Nuclear Information System (INIS)

    Valverde Chavarria, J. C.

    1997-01-01

    The reaction and analysis conditions for the derivatization of histamine with the o-phthalaldehyde (OPA) reagent were determined and optimized, and it was shown that the derivative formed can be quantified at 333 nm. Suitable chromatographic conditions were established for the determination of histamine in fish by reverse-phase high-performance liquid chromatography (HPLC), using pre-column derivatization with the OPA reagent and ultraviolet detection at 333 nm. The conditions of the proposed methodology were optimized and the analytical performance parameters were validated for the quantification of histamine at the mg g-1 level. The applicability of the methodology was demonstrated by determining histamine in samples of fresh fish [es

  13. The German cervical cancer screening model: development and validation of a decision-analytic model for cervical cancer screening in Germany.

    Science.gov (United States)

    Siebert, Uwe; Sroczynski, Gaby; Hillemanns, Peter; Engel, Jutta; Stabenow, Roland; Stegmaier, Christa; Voigt, Kerstin; Gibis, Bernhard; Hölzel, Dieter; Goldie, Sue J

    2006-04-01

    We sought to develop and validate a decision-analytic model for the natural history of cervical cancer for the German health care context and to apply it to cervical cancer screening. We developed a Markov model for the natural history of cervical cancer and cervical cancer screening in the German health care context. The model reflects current German practice standards for screening, diagnostic follow-up and treatment regarding cervical cancer and its precursors. Data for disease progression and cervical cancer survival were obtained from the literature and German cancer registries. Accuracy of Papanicolaou (Pap) testing was based on meta-analyses. We performed internal and external model validation using observed epidemiological data for unscreened women from different German cancer registries. The model predicts life expectancy, incidence of detected cervical cancer cases, lifetime cervical cancer risks and mortality. The model predicted a lifetime cervical cancer risk of 3.0% and a lifetime cervical cancer mortality of 1.0%, with a peak cancer incidence of 84/100,000 at age 51 years. These results were similar to observed data from German cancer registries, German literature data and results from other international models. Based on our model, annual Pap screening could prevent 98.7% of diagnosed cancer cases and 99.6% of deaths due to cervical cancer in women completely adherent to screening and compliant to treatment. Extending the screening interval from 1 year to 2, 3 or 5 years resulted in reduced screening effectiveness. This model provides a tool for evaluating the long-term effectiveness of different cervical cancer screening tests and strategies.
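    The structure of such a model can be conveyed by a minimal Markov cohort sketch with three states; the annual transition probabilities below are hypothetical placeholders, not the calibrated German parameters.

        import numpy as np

        # States: 0 = well, 1 = cervical cancer, 2 = dead (absorbing)
        P = np.array([[0.9985, 0.0005, 0.0010],    # hypothetical annual probabilities
                      [0.0,    0.85,   0.15  ],
                      [0.0,    0.0,    1.0   ]])

        cohort = np.array([1.0, 0.0, 0.0])   # everyone starts well at age 20
        lifetime_cases, life_years = 0.0, 0.0
        for age in range(20, 100):           # annual cycles to age 100
            lifetime_cases += cohort[0] * P[0, 1]   # new cancers this cycle
            cohort = cohort @ P
            life_years += cohort[:2].sum()          # person-years still alive

        print(f"lifetime cancer risk: {lifetime_cases:.2%}")
        print(f"life expectancy from age 20: {life_years:.1f} years")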

  14. Analytical and experimental investigations of magnetohydrodynamic flows near the entrance to a strong magnetic field

    International Nuclear Information System (INIS)

    Picologlou, B.F.; Reed, C.B.; Dauzvardis, P.V.; Walker, J.S.

    1986-01-01

    A program of analytical and experimental investigations in MHD flows has been established at Argonne National Lab. (ANL) within the framework of the Blanket Technology Program. An experimental facility for such investigations has been built and is being operated at ANL. The investigations carried out on the Argonne Liquid-Metal engineering EXperiment (ALEX) are complemented by analysis carried out at the Univ. of Illinois. The first phase of the experimental program is devoted to investigations of well-defined cases for which analytical solutions exist. Such testing will allow validation and increased confidence in the theory. Because analytical solutions exist for only a few cases, which do not cover the entire range of anticipated flow behavior, confining testing to these cases will not be an adequate validation of the theory. For this reason, this phase involves testing and a companion analytical effort aimed toward obtaining solutions for a broad range of cases, which, although simple in geometry, are believed to encompass the range of flow phenomena relevant to fusion. This parallel approach is necessary so that analysis will guide and help plan the experiments, whereas the experimental results will provide information needed to validate and/or refine the analysis

  15. Dried blood spot specimen quality and validation of a new pre-analytical processing method for qualitative HIV-1 PCR, KwaZulu-Natal, South Africa

    Science.gov (United States)

    Parboosing, Raveen; Siyaca, Ntombizandile; Moodley, Pravikrishnen

    2016-01-01

    Background Poor quality dried blood spot (DBS) specimens are usually rejected by virology laboratories, affecting early infant diagnosis of HIV. The practice of combining two incompletely-filled DBS in one specimen preparation tube during pre-analytical specimen processing (i.e., the two-spot method) has been implemented to reduce the number of specimens being rejected for insufficient volume. Objectives This study analysed laboratory data to describe the quality of DBS specimens and the use of the two-spot method over a one-year period, then validated the two-spot method against the standard (one-spot) method. Methods Data on HIV-1 PCR test requests submitted in 2014 to the Department of Virology at Inkosi Albert Luthuli Central Hospital in KwaZulu-Natal province, South Africa were analysed to describe reasons for specimen rejection, as well as results of the two-spot method. The accuracy, lower limit of detection and precision of the two-spot method were assessed. Results Of the 88 481 specimens received, 3.7% were rejected for pre-analytical problems. Of those, 48.9% were rejected as a result of insufficient specimen volume. Two health facilities had significantly more specimen rejections than other facilities. The two-spot method prevented 10 504 specimen rejections. The Pearson correlation coefficient comparing the standard to the two-spot method was 0.997. Conclusions The two-spot method was comparable with the standard method of pre-analytical specimen processing. Two health facilities were identified for targeted retraining on specimen quality. The two-spot method of DBS specimen processing can be used as an adjunct to retraining, to reduce the number of specimens rejected and improve linkage to care. PMID:28879108

  16. Analytic results for planar three-loop integrals for massive form factors

    Energy Technology Data Exchange (ETDEWEB)

    Henn, Johannes M. [PRISMA Cluster of Excellence, Johannes Gutenberg Universität Mainz,55099 Mainz (Germany); Kavli Institute for Theoretical Physics, UC Santa Barbara,Santa Barbara (United States); Smirnov, Alexander V. [Research Computing Center, Moscow State University,119992 Moscow (Russian Federation); Smirnov, Vladimir A. [Skobeltsyn Institute of Nuclear Physics of Moscow State University,119992 Moscow (Russian Federation); Institut für Theoretische Teilchenphysik, Karlsruhe Institute of Technology (KIT),76128 Karlsruhe (Germany)

    2016-12-28

    We use the method of differential equations to analytically evaluate all planar three-loop Feynman integrals relevant for form factor calculations involving massive particles. Our results for ninety master integrals at general q² are expressed in terms of multiple polylogarithms, and results for fifty-one master integrals at the threshold q² = 4m² are expressed in terms of multiple polylogarithms of argument one, with indices equal to zero or to a sixth root of unity.

  17. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    Science.gov (United States)

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  18. Analytical validation and reference intervals for freezing point depression osmometer measurements of urine osmolality in dogs.

    Science.gov (United States)

    Guerrero, Samantha; Pastor, Josep; Tvarijonaviciute, Asta; Cerón, José Joaquín; Balestra, Graziano; Caldin, Marco

    2017-11-01

    Urine osmolality (UOsm) is considered the most accurate measure of urine concentration and is used to assess body fluid homeostasis and renal function. We performed analytical validation of freezing point depression measurement of canine UOsm, established reference intervals (RIs), and determined the effect of age, sex, and reproductive status on UOsm in dogs. Clinically healthy dogs (n = 1,991) were retrospectively selected and stratified into groups by age (young [0-12 mo], adult [13-84 mo], and senior [>84 mo]), sex (female and male), and reproductive status (intact and neutered). RIs were calculated for each age group; in senior dogs the RI was 366-2,178 mOsm/kg. Intra- and inter-assay coefficients of variation were determined as part of the analytical validation. Senior dogs had a significantly lower UOsm than young and adult dogs.

  19. Comparison of analytical and Monte Carlo calculations of multi-photon effects in bremsstrahlung emission by high-energy electrons

    DEFF Research Database (Denmark)

    Mangiarotti, Alessio; Sona, Pietro; Ballestrero, Sergio

    2012-01-01

    Approximate analytical calculations of multi-photon effects in the spectrum of total radiated energy by high-energy electrons crossing thin targets are compared to the results of Monte Carlo type simulations. The limits of validity of the analytical expressions found in the literature are established.

  20. Validation of Pressure Drop Models for PHWR-type Fuel Elements

    International Nuclear Information System (INIS)

    Brasnarof Daniel; Daverio, H.

    2003-01-01

    In the present work, a one-dimensional analytical pressure drop model and the COBRA code are validated with experimental data from CANDU and Atucha fuel bundles in low- and high-pressure experimental test loops. The models show very good agreement with the experimental data, with less than 5% discrepancy. The analytical model results were also compared with COBRA code results, showing small differences between them over a wide range of pressure, temperature and mass flow.
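    A one-dimensional pressure drop model of this kind typically sums friction and form losses along the bundle. The Python sketch below shows the generic calculation; the McAdams friction correlation, the circular-channel approximation and all operating values are assumptions, not the validated CANDU/Atucha models.

        import math

        def pressure_drop(m_dot, rho, mu, D_h, L, K_local=0.0):
            """Friction + form pressure drop (Pa) for a single channel."""
            A = math.pi * D_h ** 2 / 4.0    # flow area approximated as circular
            v = m_dot / (rho * A)           # bulk velocity
            Re = rho * v * D_h / mu
            f = 0.184 * Re ** -0.2          # McAdams correlation, turbulent flow
            return (f * L / D_h + K_local) * 0.5 * rho * v ** 2

        dp = pressure_drop(m_dot=0.1, rho=750.0, mu=9e-5, D_h=0.008, L=0.5, K_local=0.6)
        print(f"pressure drop ~ {dp / 1e3:.1f} kPa")   # illustrative conditions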

  1. Tank 241-T-203, core 190 analytical results for the final report

    International Nuclear Information System (INIS)

    Steen, F.H.

    1997-01-01

    This document is the analytical laboratory report for tank 241-T-203 push mode core segments collected on April 17, 1997 and April 18, 1997. The segments were subsampled and analyzed in accordance with the Tank 241-T-203 Push Mode Core Sampling and Analysis Plan (TSAP) (Schreiber, 1997a), the Safety Screening Data Quality Objective (DQO) (Dukelow et al., 1995) and the Letter of Instruction for Core Sample Analysis of Tanks 241-T-201, 241-T-202, 241-T-203, and 241-T-204 (LOI) (Hall, 1997). The analytical results are included in the data summary report (Table 1). None of the samples submitted for Differential Scanning Calorimetry (DSC), Total Alpha Activity (AT) and Total Organic Carbon (TOC) exceeded notification limits as stated in the TSAP (Schreiber, 1997a). The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems (TWRS) Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997b) and are not considered in this report

  2. ValidatorDB: database of up-to-date validation results for ligands and non-standard residues from the Protein Data Bank.

    Science.gov (United States)

    Sehnal, David; Svobodová Vařeková, Radka; Pravda, Lukáš; Ionescu, Crina-Maria; Geidl, Stanislav; Horský, Vladimír; Jaiswal, Deepti; Wimmerová, Michaela; Koča, Jaroslav

    2015-01-01

    Following the discovery of serious errors in the structure of biomacromolecules, structure validation has become a key topic of research, especially for ligands and non-standard residues. ValidatorDB (freely available at http://ncbr.muni.cz/ValidatorDB) offers a new step in this direction, in the form of a database of validation results for all ligands and non-standard residues from the Protein Data Bank (all molecules with seven or more heavy atoms). Model molecules from the wwPDB Chemical Component Dictionary are used as reference during validation. ValidatorDB covers the main aspects of validation of annotation, and additionally introduces several useful validation analyses. The most significant is the classification of chirality errors, allowing the user to distinguish between serious issues and minor inconsistencies. Other such analyses are able to report, for example, completely erroneous ligands, alternate conformations or complete identity with the model molecules. All results are systematically classified into categories, and statistical evaluations are performed. In addition to detailed validation reports for each molecule, ValidatorDB provides summaries of the validation results for the entire PDB, for sets of molecules sharing the same annotation (three-letter code) or the same PDB entry, and for user-defined selections of annotations or PDB entries. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  3. Fine structure and analytical quantum-defect wave functions

    International Nuclear Information System (INIS)

    Kostelecky, V.A.; Nieto, M.M.; Truax, D.R.

    1988-01-01

    We investigate the domain of validity of previously proposed analytical wave functions for atomic quantum-defect theory. This is done by considering the fine-structure splitting of alkali-metal and singly ionized alkaline-earth atoms. The Landé formula is found to be naturally incorporated. A supersymmetric-type integer is necessary for finite results. Calculated splittings correctly reproduce the principal features of experimental values for alkali-like atoms.

  4. Semi-Analytical Benchmarks for MCNP6

    Energy Technology Data Exchange (ETDEWEB)

    Grechanuk, Pavel Aleksandrovi [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-11-07

    Code verification is an extremely important process that involves proving or disproving the validity of code algorithms by comparing them against analytical results of the underlying physics or mathematical theory on which the code is based. Monte Carlo codes such as MCNP6 must undergo verification and testing upon every release to ensure that the codes are properly simulating nature. Specifically, MCNP6 has multiple sets of problems with known analytic solutions that are used for code verification. Monte Carlo codes primarily specify either current boundary sources or a volumetric fixed source, either of which can be very complicated functions of space, energy, direction and time. Thus, most of the challenges with modeling analytic benchmark problems in Monte Carlo codes come from identifying the correct source definition to properly simulate the correct boundary conditions. The problems included in this suite all deal with mono-energetic neutron transport without energy loss, in a homogeneous material. The variables that differ between the problems are source type (isotropic/beam), medium dimensionality (infinite/semi-infinite), etc.
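    The flavour of such verification can be shown with the simplest analytic benchmark: mono-energetic neutrons normally incident on a purely absorbing slab, where the transmitted fraction is exactly exp(-sigma_t * x). The Monte Carlo estimate in the Python sketch below should agree within statistics; the cross-section and thickness are arbitrary illustrative values, not problems from the MCNP6 suite.

        import numpy as np

        rng = np.random.default_rng(42)
        sigma_t = 1.0      # total macroscopic cross-section (1/cm), pure absorber
        thickness = 3.0    # slab thickness (cm), normally incident beam source

        analytic = np.exp(-sigma_t * thickness)

        n = 1_000_000      # sample free paths; paths longer than the slab transmit
        transmitted = (rng.exponential(1.0 / sigma_t, n) > thickness).mean()

        print(f"analytic {analytic:.5f}  MC {transmitted:.5f} "
              f"(+/- {np.sqrt(transmitted * (1 - transmitted) / n):.5f})")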

  5. Development and validation of simple RP-HPLC-PDA analytical protocol for zileuton assisted with Design of Experiments for robustness determination

    OpenAIRE

    Saurabh B. Ganorkar; Dinesh M. Dhumal; Atul A. Shirkhedkar

    2017-01-01

    A simple, rapid, sensitive, robust, stability-indicating RP-HPLC-PDA analytical protocol was developed and validated for the analysis of zileuton racemate in bulk and in tablet formulation. Method development and resolution of degradation products from forced hydrolytic (acidic, basic, neutral), oxidative, photolytic (acidic, basic, neutral, solid state) and thermal (dry heat) degradation were achieved on an LC-GC Qualisil BDS C18 column (250 mm × 4.6 mm × 5 μm) in isocratic mode at ambient temperature.

  6. Decentral gene expression analysis: analytical validation of the Endopredict genomic multianalyte breast cancer prognosis test

    Directory of Open Access Journals (Sweden)

    Kronenwett Ralf

    2012-10-01

    Full Text Available Abstract Background EndoPredict (EP) is a clinically validated multianalyte gene expression test to predict distant metastasis in ER-positive, HER2-negative breast cancer treated with endocrine therapy alone. The test is based on the combined analysis of 12 genes in formalin-fixed, paraffin-embedded (FFPE) tissue by reverse transcription-quantitative real-time PCR (RT-qPCR). Recently, it was shown that EP is feasible for reliable decentralized assessment of gene expression. The aim of this study was the analytical validation of the performance characteristics of the assay and its verification in a molecular-pathological routine laboratory. Methods Gene expression values to calculate the EP score were assayed by one-step RT-qPCR using RNA from FFPE tumor tissue. Limit of blank, limit of detection, linear range, and PCR efficiency were assessed for each of the 12 PCR assays using serial sample dilutions. Different breast cancer samples were used to evaluate RNA input range, precision and inter-laboratory variability. Results PCR assays were linear up to Cq values between 35.1 and 37.2. Amplification efficiencies ranged from 75% to 101%. The RNA input range without considerable change of the EP score was between 0.16 and 18.5 ng/μl. Analysis of precision (variation of day, time of day, instrument, operator, and reagent lot) resulted in a total noise (standard deviation) of 0.16 EP score units on a scale from 0 to 15. The major part of the total noise (SD 0.14) was caused by the replicate-to-replicate noise of the PCR assays (repeatability) and was not associated with different operating conditions (reproducibility). Performance characteristics established in the manufacturer's laboratory were verified in a routine molecular pathology laboratory. Comparison of 10 tumor samples analyzed in two different laboratories showed a Pearson coefficient of 0.995 and a mean deviation of 0.15 score units. Conclusions The EP test showed reproducible performance

  7. Clinical validation of an epigenetic assay to predict negative histopathological results in repeat prostate biopsies.

    Science.gov (United States)

    Partin, Alan W; Van Neste, Leander; Klein, Eric A; Marks, Leonard S; Gee, Jason R; Troyer, Dean A; Rieger-Christ, Kimberly; Jones, J Stephen; Magi-Galluzzi, Cristina; Mangold, Leslie A; Trock, Bruce J; Lance, Raymond S; Bigley, Joseph W; Van Criekinge, Wim; Epstein, Jonathan I

    2014-10-01

    The DOCUMENT multicenter trial in the United States validated the performance of an epigenetic test as an independent predictor of prostate cancer risk to guide decision making for repeat biopsy. Confirming an increased negative predictive value could help avoid unnecessary repeat biopsies. We evaluated the archived, cancer negative prostate biopsy core tissue samples of 350 subjects from a total of 5 urological centers in the United States. All subjects underwent repeat biopsy within 24 months with a negative (controls) or positive (cases) histopathological result. Centralized blinded pathology evaluation of the 2 biopsy series was performed in all available subjects from each site. Biopsies were epigenetically profiled for GSTP1, APC and RASSF1 relative to the ACTB reference gene using quantitative methylation specific polymerase chain reaction. Predetermined analytical marker cutoffs were used to determine assay performance. Multivariate logistic regression was used to evaluate all risk factors. The epigenetic assay resulted in a negative predictive value of 88% (95% CI 85-91). In multivariate models correcting for age, prostate specific antigen, digital rectal examination, first biopsy histopathological characteristics and race the test proved to be the most significant independent predictor of patient outcome (OR 2.69, 95% CI 1.60-4.51). The DOCUMENT study validated that the epigenetic assay was a significant, independent predictor of prostate cancer detection in a repeat biopsy collected an average of 13 months after an initial negative result. Due to its 88% negative predictive value adding this epigenetic assay to other known risk factors may help decrease unnecessary repeat prostate biopsies.

  8. Two-dimensional analytical solution for nodal calculation of nuclear reactors

    International Nuclear Information System (INIS)

    Silva, Adilson C.; Pessoa, Paulo O.; Silva, Fernando C.; Martinez, Aquilino S.

    2017-01-01

    Highlights: • A proposal for a coarse-mesh nodal method is presented. • The proposal uses the analytical solution of the two-dimensional neutron diffusion equation. • The solution is performed over homogeneous nodes with the dimensions of the fuel assembly. • The solution uses four average fluxes on the node surfaces as boundary conditions. • The results show good accuracy and efficiency. - Abstract: In this paper, the two-dimensional (2D) neutron diffusion equation is analytically solved for two energy groups (2G). The spatial domain of the reactor core is divided into a set of nodes with uniform nuclear parameters. To iteratively determine the multiplication factor and the neutron flux in the reactor, we combine the analytical solution of the neutron diffusion equation with the iterative power method. The analytical solution is obtained for the different types of regions that compose the reactor, such as fuel and reflector regions. Four average fluxes on the node surfaces are used as boundary conditions for the analytical solution. Discontinuity factors on the node surfaces, derived from the homogenization process, are applied to preserve the average reaction rates and the net current in the fuel assembly (FA). To validate the results obtained by the analytical solution, the relative power density distribution in the FAs is determined from the neutron flux distribution and compared with reference values. The results show good accuracy and efficiency.
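    For illustration, the power method step can be condensed to finding the dominant eigenvalue of M^-1 F, with M the neutron loss operator and F the fission operator. The two-group, infinite-medium Python sketch below uses typical PWR-like homogenised constants that are hypothetical, not the paper's data.

        import numpy as np

        M = np.array([[0.0263,  0.0  ],    # fast group: Sigma_a1 + Sigma_s12
                      [-0.0165, 0.114]])   # thermal group: -inscatter, Sigma_a2
        F = np.array([[0.0065, 0.195],     # nu*Sigma_f (all fission neutrons born fast)
                      [0.0,    0.0  ]])

        A = np.linalg.solve(M, F)          # A = M^-1 F
        phi = np.ones(2)
        for _ in range(100):               # power iteration
            psi = A @ phi
            k = psi @ phi / (phi @ phi)    # Rayleigh-quotient estimate of k
            phi = psi / np.linalg.norm(psi)

        print(f"k-infinity ~ {k:.4f}, fast/thermal flux ratio {phi[0] / phi[1]:.2f}")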

  9. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening.

    Science.gov (United States)

    Lawton, Zachary E; Traub, Angelica; Fatigante, William L; Mancias, Jose; O'Leary, Adam E; Hall, Seth E; Wieland, Jamie R; Oberacher, Herbert; Gizzi, Michael C; Mulligan, Christopher C

    2017-06-01

    Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables the coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitates that the analytical performance of technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly-interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.

  11. Synthetic salt cake standards for analytical laboratory quality control

    International Nuclear Information System (INIS)

    Schilling, A.E.; Miller, A.G.

    1980-01-01

    The validation of analytical results in the characterization of Hanford Nuclear Defense Waste requires the preparation of synthetic waste for standard reference materials. Two independent synthetic salt cake standards have been prepared to monitor laboratory quality control for the chemical characterization of high-level salt cake and sludge waste in support of Rockwell Hanford Operations' High-Level Waste Management Program. Each synthetic salt cake standard contains 15 characterized chemical species and was subjected to an extensive verification/characterization program in two phases. Phase I consisted of an initial verification of each analyte in salt cake form in order to determine the current analytical capability for chemical analysis. Phase II consisted of a final characterization of those chemical species in solution form where conflicting verification data were observed. The 95 percent confidence interval on the mean for the following analytes within each standard is provided: sodium, nitrate, nitrite, phosphate, carbonate, sulfate, hydroxide, chromate, chloride, fluoride, aluminum, plutonium-239/240, strontium-90, cesium-137, and water

  12. Tank 241-AW-105, grab samples, analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final report for tank 241-AW-105 grab samples. Twenty grab samples were collected from risers 10A and 15A on August 20 and 21, 1996, of which eight were designated for the K Basin sludge compatibility and mixing studies. This document presents the analytical results for the remaining twelve samples. Analyses were performed in accordance with the Compatibility Grab Sampling and Analysis Plan (TSAP) and the Data Quality Objectives for the Tank Farms Waste Compatibility Program (DQO). The results for the previous sampling of this tank were reported in WHC-SD-WM-DP-149, Rev. 0, 60-Day Waste Compatibility Safety Issue and Final Results for Tank 241-AW-105, Grab Samples 5AW-95-1, 5AW-95-2 and 5AW-95-3. Three supernate samples exceeded the TOC notification limit (30,000 µg C/g dry weight). Appropriate notifications were made. No immediate notifications were required for any other analyte. The TSAP requested analyses for polychlorinated biphenyls (PCBs) for all liquid and centrifuged solid subsamples. The PCB analysis of the liquid samples has been delayed and will be presented in a revision to this document

  13. Coupled thermodynamic-dynamic semi-analytical model of free piston Stirling engines

    Energy Technology Data Exchange (ETDEWEB)

    Formosa, F., E-mail: fabien.formosa@univ-savoie.f [Laboratoire SYMME, Universite de Savoie, BP 80439, 74944 Annecy le Vieux Cedex (France)

    2011-05-15

    Research highlights: → The free piston Stirling behaviour relies on its thermal and dynamic features. → A global semi-analytical model for preliminary design is developed. → The model compared with NASA RE-1000 experimental data shows good correlations. -- Abstract: The study of free piston Stirling engines (FPSE) requires both accurate thermodynamic and dynamic modelling to predict their performance. The steady-state behaviour of the engine partly relies on non-linear dissipative phenomena such as pressure drop losses within heat exchangers, which are dependent on the temperature within the associated components. An analytical thermodynamic model which encompasses the effectiveness and the flaws of the heat exchangers and the regenerator has been previously developed and validated. A semi-analytical dynamic model of FPSE is developed and presented in this paper. The thermodynamic model is used to define the thermal variables that are used in the dynamic model, which evaluates the kinematic results. Thus, a coupled iterative strategy has been used to perform a global simulation. The global modelling approach has been validated using the experimental data available from the NASA RE-1000 Stirling engine prototype. The resulting coupled thermodynamic-dynamic model, using a standardized description of the engine, allows efficient and realistic preliminary design of FPSE.

  15. Development and validation of analytical method for Naftopidil in human plasma by LC–MS/MS

    Directory of Open Access Journals (Sweden)

    Pritam S. Jain

    2015-09-01

    Full Text Available A highly sensitive and simple high-performance liquid chromatographic-tandem mass spectrometric (LC-MS/MS) assay was developed and validated for the quantification of Naftopidil in human plasma. Naftopidil is extracted from human plasma with methyl tert-butyl ether and analyzed using reversed-phase gradient elution on a Discovery C18, 5 μm (50 × 4.6 mm) column. Methanol:2 mM ammonium formate (90:10) was used as the mobile phase, and detection was performed by MS using electrospray ionization in positive mode. Propranolol was used as the internal standard. The lower limit of quantification is 0.495 ng/mL. The calibration curve is linear over the concentration range of 0.495-200.577 ng/mL of plasma. This novel LC-MS/MS method shows satisfactory accuracy and precision and is sufficiently sensitive for the performance of pharmacokinetic studies in humans.

  16. Analytical method validation of GC-FID for the simultaneous measurement of hydrocarbons (C2-C4) in their gas mixture

    Directory of Open Access Journals (Sweden)

    Oman Zuas

    2016-09-01

    Full Text Available An accurate gas chromatography-flame ionization detection (GC-FID) method was validated for the simultaneous analysis of light hydrocarbons (C2-C4) in their gas mixture. The validation parameters were evaluated based on the ISO/IEC 17025 definition, including method selectivity, repeatability, accuracy, linearity, limit of detection (LOD), limit of quantitation (LOQ), and ruggedness. Under the optimum analytical conditions, the analysis of the gas mixture revealed that each target component was well separated, with high selectivity. The method was also found to be precise and accurate. The method linearity was high, with good correlation coefficients (R² ≥ 0.999) for all target components. It can be concluded that the developed GC-FID method is reliable and suitable for the determination of light C2-C4 hydrocarbons (including ethylene, propane, propylene, isobutane, and n-butane) in their gas mixture. The validated method has been successfully applied to the estimation of light C2-C4 hydrocarbons in natural gas samples, showing high repeatability, with relative standard deviation (RSD) less than 1.0%, and good selectivity, with no interference from other components observed.
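    Two of the validation parameters named above, LOD and LOQ, are commonly derived from the residual standard deviation of the calibration line (3.3·s/slope and 10·s/slope in the ICH convention). A short Python sketch with synthetic calibration data; the concentrations and detector responses are invented for illustration, not the paper's measurements.

        import numpy as np

        conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])          # e.g. % mol, synthetic
        area = np.array([51.0, 99.5, 201.0, 503.0, 998.0])   # detector response, synthetic

        slope, intercept = np.polyfit(conc, area, 1)
        resid = area - (slope * conc + intercept)
        s = resid.std(ddof=2)              # residual SD of the regression

        r2 = 1 - (resid ** 2).sum() / ((area - area.mean()) ** 2).sum()
        lod = 3.3 * s / slope              # ICH-style detection limit
        loq = 10.0 * s / slope             # ICH-style quantitation limit
        print(f"R^2 = {r2:.5f}  LOD = {lod:.3f}  LOQ = {loq:.3f} (conc units)")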

  17. Sharing the Data along with the Responsibility: Examining an Analytic Scale-Based Model for Assessing School Climate.

    Science.gov (United States)

    Shindler, John; Taylor, Clint; Cadenas, Herminia; Jones, Albert

    This study was a pilot effort to examine the efficacy of an analytic trait scale school climate assessment instrument and democratic change system in two urban high schools. Pilot study results indicate that the instrument shows promising soundness in that it exhibited high levels of validity and reliability. In addition, the analytic trait format…

  18. Analytical solution for a coaxial plasma gun: Weak coupling limit

    International Nuclear Information System (INIS)

    Dietz, D.

    1987-01-01

    The analytical solution of the system of coupled ODEs which describes the time evolution of an ideal (i.e., zero resistance) coaxial plasma gun operating in the snowplow mode is obtained in the weak coupling limit, i.e., when the gun is fully influenced by the driving (RLC) circuit in which it resides but the circuit is negligibly influenced by the gun. Criteria for the validity of this limit are derived and numerical examples are presented. Although others have obtained approximate, asymptotic and numerical solutions of the equations, the present analytical results seem not to have appeared previously in the literature.
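    A numerical sketch of the weak-coupling limit described above: the RLC circuit is first integrated as if the gun were absent, and the resulting current then drives a snowplow momentum equation for the current sheet. The axial force expression F = μ0 I² ln(b/a)/(4π) is the standard one for a coaxial geometry; all parameter values are illustrative, not taken from the paper.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    mu0 = 4e-7 * np.pi
    L0, R0, C0, V0 = 50e-9, 5e-3, 100e-6, 10e3   # circuit parameters (illustrative)
    a, b = 0.02, 0.05                             # inner/outer electrode radii (m)
    rho, m0 = 1e-4, 1e-4                          # fill density (kg/m^3), initial sheet mass (kg)
    A = np.pi * (b**2 - a**2)                     # annular cross-section

    def circuit(t, y):
        q, i = y                                  # capacitor charge, discharge current
        return [-i, (q / C0 - R0 * i) / L0]       # dq/dt = -i ; L di/dt + R i = q/C

    tspan = (0.0, 50e-6)
    sol_c = solve_ivp(circuit, tspan, [C0 * V0, 0.0], dense_output=True, rtol=1e-8)

    def gun(t, y):
        z, v = y
        i = sol_c.sol(t)[1]                          # circuit unaffected by the gun
        F = mu0 * i**2 * np.log(b / a) / (4 * np.pi)  # axial force on the sheet
        m = m0 + rho * A * z                          # snowplowed mass
        return [v, (F - rho * A * v**2) / m]          # d(mv)/dt = F

    sol_g = solve_ivp(gun, tspan, [0.0, 0.0], rtol=1e-8)
    print(f"sheet position {sol_g.y[0, -1]:.3f} m, "
          f"velocity {sol_g.y[1, -1]:.0f} m/s at t = 50 us")
    ```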

  19. No Impact of the Analytical Method Used for Determining Cystatin C on Estimating Glomerular Filtration Rate in Children.

    Science.gov (United States)

    Alberer, Martin; Hoefele, Julia; Benz, Marcus R; Bökenkamp, Arend; Weber, Lutz T

    2017-01-01

    Measurement of inulin clearance is considered to be the gold standard for determining kidney function in children, but this method is time-consuming and expensive. The glomerular filtration rate (GFR) is, on the other hand, easier to estimate using various creatinine- and/or cystatin C (Cys C)-based formulas. However, for the determination of serum creatinine (Scr) and Cys C, different and non-interchangeable analytical methods exist. Given that different analytical methods for the determination of creatinine and Cys C were used to validate existing GFR formulas, clinicians should be aware of the type used in their local laboratory. In this study, we compared GFR results calculated on the basis of different GFR formulas, using either Scr and Cys C values as determined by the analytical method originally employed for validation or values obtained by an alternative analytical method, to evaluate any possible effects on performance. Cys C values determined by means of an immunoturbidimetric assay were used for calculating the GFR using equations in which this analytical method had originally been used for validation. Additionally, these same values were then used in other GFR formulas that had originally been validated using a nephelometric immunoassay for determining Cys C. The effect of using either the compatible or the possibly incompatible analytical method for determining Cys C in the calculation of GFR was assessed in comparison with the GFR measured by creatinine clearance (CrCl). Unexpectedly, using GFR equations that employed Cys C values derived from a possibly incompatible analytical method did not result in a significant difference concerning the classification of patients as having normal or reduced GFR compared to the classification obtained on the basis of CrCl. Sensitivity and specificity were adequate. On the other hand, formulas using Cys C values derived from a compatible analytical method partly showed insufficient
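    The agreement check described above reduces to a binary-classification comparison. A minimal sketch, assuming a reduced-GFR cutoff of 90 mL/min/1.73 m² (a common pediatric convention, not stated in the record) and placeholder patient values:

    ```python
    import numpy as np

    crcl = np.array([120., 95., 88., 60., 105., 45., 130., 78.])   # reference CrCl
    egfr = np.array([115., 90., 92., 55., 100., 50., 125., 70.])   # estimating equation

    cutoff = 90.0
    ref_reduced = crcl < cutoff    # "truth" per creatinine clearance
    est_reduced = egfr < cutoff    # classification by the GFR formula

    tp = np.sum(est_reduced & ref_reduced)
    tn = np.sum(~est_reduced & ~ref_reduced)
    fp = np.sum(est_reduced & ~ref_reduced)
    fn = np.sum(~est_reduced & ref_reduced)

    print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
    ```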

  20. An analytical model of leakage neutron equivalent dose for passively-scattered proton radiotherapy and validation with measurements.

    Science.gov (United States)

    Schneider, Christopher; Newhauser, Wayne; Farah, Jad

    2015-05-18

    Exposure to stray neutrons increases the risk of second cancer development after proton therapy. Previously reported analytical models of this exposure were difficult to configure and had not been investigated below 100 MeV proton energy. The purposes of this study were to test an analytical model of neutron equivalent dose per therapeutic absorbed dose (H/D) at 75 MeV and to improve the model by reducing the number of configuration parameters and making it continuous in proton energy from 100 to 250 MeV. To develop the analytical model, we used previously published H/D values in water from Monte Carlo simulations of a general-purpose beamline for proton energies from 100 to 250 MeV. We also configured and tested the model on in-air neutron equivalent doses measured for a 75 MeV ocular beamline. Predicted H/D values from the analytical model and Monte Carlo agreed well from 100 to 250 MeV (10% average difference). Predicted H/D values from the analytical model also agreed well with measurements at 75 MeV (15% average difference). The results indicate that analytical models can give fast, reliable calculations of neutron exposure after proton therapy. This ability is absent in treatment planning systems but vital to second cancer risk estimation.

  1. A Performance Analytical Strategy for Network-on-Chip Router with Input Buffer Architecture

    Directory of Open Access Journals (Sweden)

    WANG, J.

    2012-11-01

    Full Text Available In this paper, a performance analytical strategy is proposed for a Network-on-Chip router with input buffer architecture. First, an analytical model is developed based on a semi-Markov process. For a non-work-conserving router with small buffer size, the model can be used to analyze the schedule delay and the average service time for each buffer when the related parameters are given. Then, the average packet delay in the router is calculated using the model. Finally, we validate the effectiveness of our strategy by simulation. By comparing our analytical results to simulation results, we show that our strategy successfully captures the Network-on-Chip router performance and performs better than the state of the art. Therefore, our strategy can be used as an efficient performance analysis tool for Network-on-Chip design.

  2. Modeling Run Test Validity: A Meta-Analytic Approach

    National Research Council Canada - National Science Library

    Vickers, Ross

    2002-01-01

    .... This study utilized data from 166 samples (N = 5,757) to test the general hypothesis that differences in testing methods could account for the cross-situational variation in validity. Only runs >2 km...

  3. Analytical results for entanglement in the five-qubit anisotropic Heisenberg model

    International Nuclear Information System (INIS)

    Wang Xiaoguang

    2004-01-01

    We solve the eigenvalue problem of the five-qubit anisotropic Heisenberg model, without use of Bethe's ansatz, and give analytical results for the entanglement and mixedness of two nearest-neighbor qubits. The entanglement takes its maximum at Δ=1 (Δ>1) for the case of zero (finite) temperature, with Δ being the anisotropy parameter. In contrast, the mixedness takes its minimum at Δ=1 (Δ>1) for the case of zero (finite) temperature.
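    The quoted behaviour can be cross-checked numerically. The sketch below builds the five-qubit anisotropic Heisenberg (XXZ) ring (periodic boundary conditions assumed), takes the lowest eigenstate for the zero-temperature case, and evaluates the Wootters concurrence of two nearest-neighbor qubits; if the ground level is degenerate, a proper thermal or symmetrized average would be needed instead of a single eigenvector.

    ```python
    import numpy as np

    sx = np.array([[0, 1], [1, 0]], complex)
    sy = np.array([[0, -1j], [1j, 0]])
    sz = np.array([[1, 0], [0, -1]], complex)

    def site_op(op, i, n):
        """Operator `op` acting on qubit i of an n-qubit register."""
        m = np.array([[1]], complex)
        for k in range(n):
            m = np.kron(m, op if k == i else np.eye(2))
        return m

    N, delta = 5, 1.0
    H = np.zeros((2**N, 2**N), complex)
    for i in range(N):
        j = (i + 1) % N                      # ring (assumed)
        for s, coef in ((sx, 1.0), (sy, 1.0), (sz, delta)):
            H += coef * site_op(s, i, N) @ site_op(s, j, N)

    psi = np.linalg.eigh(H)[1][:, 0]         # lowest eigenstate

    # Reduced density matrix of qubits 0 and 1 (first two tensor factors)
    M = psi.reshape(4, 2**(N - 2))
    rho = M @ M.conj().T

    # Wootters concurrence
    yy = np.kron(sy, sy)
    R = rho @ yy @ rho.conj() @ yy
    lam = np.sort(np.sqrt(np.abs(np.linalg.eigvals(R))))[::-1]
    C = max(0.0, lam[0] - lam[1:].sum())
    print(f"nearest-neighbor concurrence at Delta={delta}: {C:.4f}")
    ```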

  4. Method validation in pharmaceutical analysis: from theory to practical optimization

    Directory of Open Access Journals (Sweden)

    Jaqueline Kaleian Eserian

    2015-01-01

    Full Text Available The validation of analytical methods is required to obtain high-quality data. For the pharmaceutical industry, method validation is crucial to ensure product quality as regards both therapeutic efficacy and patient safety. The most critical step in validating a method is to establish a protocol containing well-defined procedures and criteria. A well planned and organized protocol, such as the one proposed in this paper, results in a rapid and concise method validation procedure for quantitative high performance liquid chromatography (HPLC) analysis. (Type: Commentary)

  5. Dried blood spot specimen quality and validation of a new pre-analytical processing method for qualitative HIV-1 PCR, KwaZulu-Natal, South Africa.

    Science.gov (United States)

    Govender, Kerusha; Parboosing, Raveen; Siyaca, Ntombizandile; Moodley, Pravikrishnen

    2016-01-01

    Poor quality dried blood spot (DBS) specimens are usually rejected by virology laboratories, affecting early infant diagnosis of HIV. The practice of combining two incompletely-filled DBS in one specimen preparation tube during pre-analytical specimen processing (i.e., the two-spot method) has been implemented to reduce the number of specimens being rejected for insufficient volume. This study analysed laboratory data to describe the quality of DBS specimens and the use of the two-spot method over a one-year period, then validated the two-spot method against the standard (one-spot) method. Data on HIV-1 PCR test requests submitted in 2014 to the Department of Virology at Inkosi Albert Luthuli Central Hospital in KwaZulu-Natal province, South Africa were analysed to describe reasons for specimen rejection, as well as results of the two-spot method. The accuracy, lower limit of detection and precision of the two-spot method were assessed. Of the 88 481 specimens received, 3.7% were rejected for pre-analytical problems. Of those, 48.9% were rejected as a result of insufficient specimen volume. Two health facilities had significantly more specimen rejections than other facilities. The two-spot method prevented 10 504 specimen rejections. The Pearson correlation coefficient comparing the standard to the two-spot method was 0.997. The two-spot method was comparable with the standard method of pre-analytical specimen processing. Two health facilities were identified for targeted retraining on specimen quality. The two-spot method of DBS specimen processing can be used as an adjunct to retraining, to reduce the number of specimens rejected and improve linkage to care.

  6. Complex dynamics of memristive circuits: Analytical results and universal slow relaxation

    Science.gov (United States)

    Caravelli, F.; Traversa, F. L.; Di Ventra, M.

    2017-02-01

    Networks with memristive elements (resistors with memory) are being explored for a variety of applications ranging from unconventional computing to models of the brain. However, analytical results that highlight the role of the graph connectivity on the memory dynamics are still few, thus limiting our understanding of these important dynamical systems. In this paper, we derive an exact matrix equation of motion that takes into account all the network constraints of a purely memristive circuit, and we employ it to derive analytical results regarding its relaxation properties. We are able to describe the memory evolution in terms of orthogonal projection operators onto the subspace of fundamental loop space of the underlying circuit. This orthogonal projection explicitly reveals the coupling between the spatial and temporal sectors of the memristive circuits and compactly describes the circuit topology. For the case of disordered graphs, we are able to explain the emergence of a power-law relaxation as a superposition of exponential relaxation times with a broad range of scales using random matrices. This power law is also universal, namely independent of the topology of the underlying graph but dependent only on the density of loops. In the case of circuits subject to alternating voltage instead, we are able to obtain an approximate solution of the dynamics, which is tested against a specific network topology. These results suggest a much richer dynamics of memristive networks than previously considered.
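    The power-law mechanism invoked above, a superposition of exponential relaxations with rates spread over many scales, is easy to reproduce numerically. In the sketch below the rates are drawn from a density p(r) ∝ r^(α−1) as a generic stand-in for the loop-density-dependent spectrum discussed in the paper, which yields an apparent t^(−α) decay over the corresponding time window:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    alpha, rmin, rmax, n = 0.5, 1e-3, 1e3, 200_000

    # Inverse-transform sampling from p(r) ~ r**(alpha - 1) on [rmin, rmax]
    u = rng.random(n)
    rates = (u * (rmax**alpha - rmin**alpha) + rmin**alpha) ** (1 / alpha)

    # Superpose the exponential relaxations and fit a log-log slope in the
    # window 1/rmax << t << 1/rmin, where the power law should emerge
    t = np.logspace(-2, 2, 60)
    relax = np.array([np.exp(-rates * ti).mean() for ti in t])
    mask = (t > 0.1) & (t < 10.0)
    slope = np.polyfit(np.log(t[mask]), np.log(relax[mask]), 1)[0]
    print(f"apparent exponent {slope:.2f} (expected ~ -{alpha})")
    ```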

  7. Study of a vibrating plate: comparison between experimental (ESPI) and analytical results

    Science.gov (United States)

    Romero, G.; Alvarez, L.; Alanís, E.; Nallim, L.; Grossi, R.

    2003-07-01

    Real-time electronic speckle pattern interferometry (ESPI) was used for tuning and visualization of natural frequencies of a trapezoidal plate. The plate was excited to resonant vibration by a sinusoidal acoustical source, which provided a continuous range of audio frequencies. Fringe patterns produced during the time-average recording of the vibrating plate—corresponding to several resonant frequencies—were registered. From these interferograms, calculations of vibrational amplitudes by means of zero-order Bessel functions were performed in some particular cases. The system was also studied analytically. The analytical approach developed is based on the Rayleigh-Ritz method and on the use of non-orthogonal right triangular co-ordinates. The deflection of the plate is approximated by a set of beam characteristic orthogonal polynomials generated by using the Gram-Schmidt procedure. A high degree of correlation between computational analysis and experimental results was observed.
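    In time-average ESPI, fringe brightness varies as J0²(4πa/λ) for out-of-plane vibration amplitude a (normal illumination and observation assumed), so dark fringes sit at the zeros of the zero-order Bessel function. A short sketch mapping fringe order to amplitude, with an assumed 532 nm wavelength (not stated in the record):

    ```python
    import numpy as np
    from scipy.special import jn_zeros

    lam = 532e-9                  # laser wavelength (m), assumed value
    orders = np.arange(1, 8)
    # n-th dark fringe: 4*pi*a/lambda equals the n-th zero of J0
    amplitudes = jn_zeros(0, len(orders)) * lam / (4 * np.pi)

    for n, a in zip(orders, amplitudes):
        print(f"dark fringe {n}: vibration amplitude = {a * 1e9:.0f} nm")
    ```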

  8. Practical approach to a procedure for judging the results of analytical verification measurements

    International Nuclear Information System (INIS)

    Beyrich, W.; Spannagel, G.

    1979-01-01

    For practical safeguards a particularly transparent procedure is described to judge analytical differences between declared and verified values based on experimental data relevant to the actual status of the measurement technique concerned. Essentially it consists of two parts: Derivation of distribution curves for the occurrence of interlaboratory differences from the results of analytical intercomparison programmes; and judging of observed differences using criteria established on the basis of these probability curves. By courtesy of the Euratom Safeguards Directorate, Luxembourg, the applicability of this judging procedure has been checked in practical data verification for safeguarding; the experience gained was encouraging and implementation of the method is intended. Its reliability might be improved further by evaluation of additional experimental data. (author)

  9. 42 CFR 478.15 - QIO review of changes resulting from DRG validation.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false QIO review of changes resulting from DRG validation... review of changes resulting from DRG validation. (a) General rules. (1) A provider or practitioner dissatisfied with a change to the diagnostic or procedural coding information made by a QIO as a result of DRG...

  10. Analytic results for asymmetric random walk with exponential transition probabilities

    International Nuclear Information System (INIS)

    Gutkowicz-Krusin, D.; Procaccia, I.; Ross, J.

    1978-01-01

    We present here exact analytic results for a random walk on a one-dimensional lattice with asymmetric, exponentially distributed jump probabilities. We derive the generating functions of such a walk for a perfect lattice and for a lattice with absorbing boundaries. We obtain solutions for some interesting moment properties, such as mean first passage time, drift velocity, dispersion, and branching ratio for absorption. The symmetric exponential walk is solved as a special case. The scaling of the mean first passage time with the size of the system for the exponentially distributed walk is determined by the symmetry and is independent of the range.
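    Results of this kind are convenient to cross-check by Monte Carlo. A sketch with exponentially distributed jump lengths, rightward probability p, and absorbing boundaries; all parameter values are illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    p, mean_jump = 0.6, 1.0          # rightward probability, mean jump length
    L, x0, n_walk = 50.0, 25.0, 5000  # boundaries at 0 and L, start at x0

    steps_to_absorb, absorbed_right = [], 0
    for _ in range(n_walk):
        x, n = x0, 0
        while 0.0 < x < L:
            jump = rng.exponential(mean_jump)
            x += jump if rng.random() < p else -jump
            n += 1
        steps_to_absorb.append(n)
        absorbed_right += x >= L

    print(f"mean first passage time = {np.mean(steps_to_absorb):.1f} steps")
    print(f"branching ratio (right absorption) = {absorbed_right / n_walk:.3f}")
    ```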

  11. An analytical model for backscattered luminance in fog: comparisons with Monte Carlo computations and experimental results

    International Nuclear Information System (INIS)

    Taillade, Frédéric; Dumont, Eric; Belin, Etienne

    2008-01-01

    We propose an analytical model for backscattered luminance in fog and derive an expression for the visibility signal-to-noise ratio as a function of meteorological visibility distance. The model uses single scattering processes. It is based on the Mie theory and the geometry of the optical device (emitter and receiver). In particular, we present an overlap function and take the phase function of fog into account. The results of the backscattered luminance obtained with our analytical model are compared to simulations made using the Monte Carlo method based on multiple scattering processes. An excellent agreement is found in that the discrepancy between the results is smaller than the Monte Carlo standard uncertainties. If we take no account of the geometry of the optical device, the results of the model-estimated backscattered luminance differ from the simulations by a factor of 20. We also conclude that the signal-to-noise ratio computed with the Monte Carlo method and our analytical model is in good agreement with experimental results, since the mean difference between the calculations and experimental measurements is smaller than the experimental uncertainty.

  12. An Analytical Model of Leakage Neutron Equivalent Dose for Passively-Scattered Proton Radiotherapy and Validation with Measurements

    International Nuclear Information System (INIS)

    Schneider, Christopher; Newhauser, Wayne; Farah, Jad

    2015-01-01

    Exposure to stray neutrons increases the risk of second cancer development after proton therapy. Previously reported analytical models of this exposure were difficult to configure and had not been investigated below 100 MeV proton energy. The purposes of this study were to test an analytical model of neutron equivalent dose per therapeutic absorbed dose (H/D) at 75 MeV and to improve the model by reducing the number of configuration parameters and making it continuous in proton energy from 100 to 250 MeV. To develop the analytical model, we used previously published H/D values in water from Monte Carlo simulations of a general-purpose beamline for proton energies from 100 to 250 MeV. We also configured and tested the model on in-air neutron equivalent doses measured for a 75 MeV ocular beamline. Predicted H/D values from the analytical model and Monte Carlo agreed well from 100 to 250 MeV (10% average difference). Predicted H/D values from the analytical model also agreed well with measurements at 75 MeV (15% average difference). The results indicate that analytical models can give fast, reliable calculations of neutron exposure after proton therapy. This ability is absent in treatment planning systems but vital to second cancer risk estimation

  13. An Analytical Model of Leakage Neutron Equivalent Dose for Passively-Scattered Proton Radiotherapy and Validation with Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Christopher; Newhauser, Wayne, E-mail: newhauser@lsu.edu [Department of Physics and Astronomy, Louisiana State University and Agricultural and Mechanical College, 202 Nicholson Hall, Baton Rouge, LA 70803 (United States); Mary Bird Perkins Cancer Center, 4950 Essen Lane, Baton Rouge, LA 70809 (United States); Farah, Jad [Institut de Radioprotection et de Sûreté Nucléaire, Service de Dosimétrie Externe, BP-17, 92262 Fontenay-aux-Roses (France)

    2015-05-18

    Exposure to stray neutrons increases the risk of second cancer development after proton therapy. Previously reported analytical models of this exposure were difficult to configure and had not been investigated below 100 MeV proton energy. The purposes of this study were to test an analytical model of neutron equivalent dose per therapeutic absorbed dose (H/D) at 75 MeV and to improve the model by reducing the number of configuration parameters and making it continuous in proton energy from 100 to 250 MeV. To develop the analytical model, we used previously published H/D values in water from Monte Carlo simulations of a general-purpose beamline for proton energies from 100 to 250 MeV. We also configured and tested the model on in-air neutron equivalent doses measured for a 75 MeV ocular beamline. Predicted H/D values from the analytical model and Monte Carlo agreed well from 100 to 250 MeV (10% average difference). Predicted H/D values from the analytical model also agreed well with measurements at 75 MeV (15% average difference). The results indicate that analytical models can give fast, reliable calculations of neutron exposure after proton therapy. This ability is absent in treatment planning systems but vital to second cancer risk estimation.

  14. Environmental concentrations of engineered nanomaterials: Review of modeling and analytical studies

    International Nuclear Information System (INIS)

    Gottschalk, Fadri; Sun, TianYin; Nowack, Bernd

    2013-01-01

    Scientific consensus predicts that the worldwide use of engineered nanomaterials (ENM) leads to their release into the environment. We reviewed the available literature concerning environmental concentrations of six ENMs (TiO2, ZnO, Ag, fullerenes, CNT and CeO2) in surface waters, wastewater treatment plant effluents, biosolids, sediments, soils and air. Presently, a dozen modeling studies provide environmental concentrations for ENM, and a handful of analytical works can be used as a basis for a preliminary validation. There are still major knowledge gaps (e.g. on ENM production, application and release) that affect the modeled values, but overall an agreement on the order of magnitude of the environmental concentrations can be reached. True validation of the modeled values is difficult because trace analytical methods that are specific for ENM detection and quantification are not available. The modeled and measured results are not always comparable due to the different forms and sizes of particles that these two approaches target. -- Highlights: •Modeled environmental concentrations of engineered nanomaterials are reviewed. •Measured environmental concentrations of engineered nanomaterials are reviewed. •Possible validation of modeled data by measurements is critically evaluated. •Different approaches in modeling and measurement methods complicate validation. -- Modeled and measured environmental concentrations of engineered nanomaterials are reviewed and critically discussed

  15. Analytic theory for the selection of 2-D needle crystal at arbitrary Peclet number

    Science.gov (United States)

    Tanveer, Saleh

    1989-01-01

    An accurate analytic theory is presented for the velocity selection of a two-dimensional needle crystal for arbitrary Peclet number for small values of the surface tension parameter. The velocity selection is caused by the effect of transcendentally small terms which are determined by analytic continuation to the complex plane and analysis of nonlinear equations. The work supports the general conclusion of previous small Peclet number analytical results of other investigators, though there are some discrepancies in details. It also addresses questions raised on the validity of selection theory owing to assumptions made on shape corrections at large distances from the tip.

  16. Rigorous results of low-energy models of the analytic S-matrix theory

    International Nuclear Information System (INIS)

    Meshcheryakov, V.A.

    1974-01-01

    Results of analytic S-matrix theory, mainly dealing with the static limit of dispersion relations, are applied to pion-nucleon scattering in the low-energy region. Various approaches to solving equations of the Chew-Low type are discussed. It is concluded that interesting results are obtained by reducing the equations to a system of nonlinear difference equations, the crucial element of this approach being the study of functions on the whole Riemann surface. Boundary and crossing symmetry conditions are studied. (HFdV)

  17. Analytic result for the two-loop six-point NMHV amplitude in N=4 super Yang-Mills theory

    CERN Document Server

    Dixon, Lance J.; Henn, Johannes M.

    2012-01-01

    We provide a simple analytic formula for the two-loop six-point ratio function of planar N = 4 super Yang-Mills theory. This result extends the analytic knowledge of multi-loop six-point amplitudes beyond those with maximal helicity violation. We make a natural ansatz for the symbols of the relevant functions appearing in the two-loop amplitude, and impose various consistency conditions, including symmetry, the absence of spurious poles, the correct collinear behaviour, and agreement with the operator product expansion for light-like (super) Wilson loops. This information reduces the ansatz to a small number of relatively simple functions. In order to fix these parameters uniquely, we utilize an explicit representation of the amplitude in terms of loop integrals that can be evaluated analytically in various kinematic limits. The final compact analytic result is expressed in terms of classical polylogarithms, whose arguments are rational functions of the dual conformal cross-ratios, plus precisely two function...

  18. On Conducting Construct Validity Meta-Analyses for the Rorschach: A Reply to Tibon Czopp and Zeligman (2016).

    Science.gov (United States)

    Mihura, Joni L; Meyer, Gregory J; Dumitrascu, Nicolae; Bombel, George

    2016-01-01

    We respond to Tibon Czopp and Zeligman's (2016) critique of our systematic reviews and meta-analyses of 65 Rorschach Comprehensive System (CS) variables published in Psychological Bulletin (2013). The authors endorsed our supportive findings but critiqued the same methodology when used for the 13 unsupported variables. Unfortunately, their commentary was based on significant misunderstandings of our meta-analytic method and results, such as thinking we used introspectively assessed criteria in classifying levels of support and reporting only a subset of our externally assessed criteria. We systematically address their arguments that our construct label and criterion variable choices were inaccurate and, therefore, meta-analytic validity for these 13 CS variables was artificially low. For example, the authors created new construct labels for these variables that they called "the customary CS interpretation," but did not describe their methodology nor provide evidence that their labels would result in better validity than ours. They cite studies they believe we should have included; we explain how these studies did not fit our inclusion criteria and that including them would have actually reduced the relevant CS variables' meta-analytic validity. Ultimately, criticisms alone cannot change meta-analytic support from negative to positive; Tibon Czopp and Zeligman would need to conduct their own construct validity meta-analyses.

  19. Bimodal fuzzy analytic hierarchy process (BFAHP) for coronary heart disease risk assessment.

    Science.gov (United States)

    Sabahi, Farnaz

    2018-04-04

    Rooted deeply in medical multiple criteria decision-making (MCDM), risk assessment is very important, especially when applied to the risk of being affected by deadly diseases such as coronary heart disease (CHD). CHD risk assessment is a stochastic, uncertain, and highly dynamic process influenced by various known and unknown variables. In recent years, there has been great interest in the fuzzy analytic hierarchy process (FAHP), a popular methodology for dealing with uncertainty in MCDM. This paper proposes a new FAHP, the bimodal fuzzy analytic hierarchy process (BFAHP), which augments fuzzy numbers with two aspects of knowledge, probability and validity, to better deal with uncertainty. In BFAHP, fuzzy validity is computed by aggregating the validities of relevant risk factors based on expert knowledge and collective intelligence. By considering both soft and statistical data, we compute the fuzzy probability of risk factors using the Bayesian formulation. In the BFAHP approach, these fuzzy validities and fuzzy probabilities are used to construct a reciprocal comparison matrix. We then aggregate fuzzy probabilities and fuzzy validities in a pairwise manner for each risk factor and each alternative. BFAHP decides between affected and not affected by ranking high and low risks. For evaluation, the proposed approach is applied to the risk of being affected by CHD using a real dataset of 152 patients from Iranian hospitals. Simulation results confirm that adding validity in a fuzzy manner yields more confidence in the results and is clinically useful, especially in the face of incomplete information, when compared with actual results. Applied to CHD risk assessment on this dataset, the proposed BFAHP yields a high accuracy rate, above 85%, for correct prediction. In addition, this paper recognizes that the risk factors of diastolic blood pressure in men and high-density lipoprotein in women are more important in CHD than other risk factors. Copyright © 2018 Elsevier Inc. All

  20. Analytical calculation of detailed model parameters of cast resin dry-type transformers

    International Nuclear Information System (INIS)

    Eslamian, M.; Vahidi, B.; Hosseinian, S.H.

    2011-01-01

    Highlights: → In this paper high frequency behavior of cast resin dry-type transformers was simulated. → Parameters of detailed model were calculated using analytical method and compared with FEM results. → A lab transformer was constructed in order to compare theoretical and experimental results. -- Abstract: Non-flammable characteristic of cast resin dry-type transformers make them suitable for different kind of usages. This paper presents an analytical method of how to obtain parameters of detailed model of these transformers. The calculated parameters are compared and verified with the corresponding FEM results and if it was necessary, correction factors are introduced for modification of the analytical solutions. Transient voltages under full and chopped test impulses are calculated using the obtained detailed model. In order to validate the model, a setup was constructed for testing on high-voltage winding of cast resin dry-type transformer. The simulation results were compared with the experimental data measured from FRA and impulse tests.

  1. Simple and Accurate Analytical Solutions of the Electrostatically Actuated Curled Beam Problem

    KAUST Repository

    Younis, Mohammad I.

    2014-08-17

    We present analytical solutions of the electrostatically actuated initially deformed cantilever beam problem. We use a continuous Euler-Bernoulli beam model combined with a single-mode Galerkin approximation. We derive simple analytical expressions for two commonly observed deformed beam configurations: the curled and tilted configurations. The derived analytical formulas are validated by comparing their results to experimental data in the literature and numerical results of a multi-mode reduced order model. The derived expressions do not involve any complicated integrals or complex terms and can be conveniently used by designers for quick, yet accurate, estimations. The formulas are found to yield accurate results for most commonly encountered microbeams with initial tip deflections of a few microns. For largely deformed beams, we found that these formulas yield less accurate results due to the limitations of the single-mode approximations they are based on. In such cases, multi-mode reduced order models need to be utilized.

  2. Analytical and experimental vibration analysis of BWR pressure vessel internals

    International Nuclear Information System (INIS)

    Krutzik, N.; Schad, O.

    1975-01-01

    This report attempts to evaluate the validity and quality of several analytical methods in the light of presently available experimental data for boiling water reactor pressure vessel internals. The experimental checks were performed after the numerical analysis was completed and confirmed the accuracy of the numerical results. The analytical investigations were done with two- and three-dimensional finite element programmes, with which the effect of the mass distribution, including added (virtual) masses, on the dynamic response could be studied in depth. The experimental data were collected at various plants and with different mass correlations. Besides evaluating the dynamic characteristics of the components, tests were also performed to evaluate the vibrations of the pressure vessel relative to the main structure. After analysing the extensive recorded data, a much better understanding of the response under a variety of loading and boundary conditions could be gained. The comparison of the analytical and experimental results made a broad qualitative evaluation possible. (Auth.)

  3. Analytical calculation of the torque exerted between two perpendicularly magnetized magnets

    Science.gov (United States)

    Allag, H.; Yonnet, J.-P.; Latreche, M. E. H.

    2011-04-01

    Analytical expressions of the torque on cuboidal permanent magnets are given. The only hypothesis is that the magnetizations are rigid, uniform, and perpendicularly oriented. The analytical calculation is made by replacing magnetizations by distributions of magnetic charges on the magnet poles. The torque expressions are obtained using the Lorentz force method. The results are valid for any relative magnet position, and the torque can be obtained with respect to any reference point. Although these expressions seem rather complicated, they enable an extremely fast and accurate torque calculation on a permanent magnet in the presence of a magnetic field of another permanent magnet.
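    The charge-model route described above also lends itself to a direct numerical check: discretize the pole faces into patches of magnetic charge σ = ±M and sum the Coulomb-like interactions F = μ0 q1 q2 r̂/(4π r²). The sketch below does this for two perpendicularly magnetized cuboids; geometry and magnetization values are illustrative, and the torque is taken about the second magnet's center as the reference point.

    ```python
    import numpy as np

    mu0 = 4e-7 * np.pi

    def pole_patches(center, size, m_axis, M, n=12):
        """(points, charges) for the two pole faces of a cuboid magnet
        magnetized with magnitude M along axis m_axis (0, 1 or 2)."""
        ax = [i for i in range(3) if i != m_axis]
        u = (np.arange(n) + 0.5) / n - 0.5
        g1, g2 = np.meshgrid(u * size[ax[0]], u * size[ax[1]])
        dA = size[ax[0]] * size[ax[1]] / n**2
        pts, qs = [], []
        for sgn in (+1, -1):                    # +M face and -M face
            p = np.zeros((n * n, 3))
            p[:, ax[0]], p[:, ax[1]] = g1.ravel(), g2.ravel()
            p[:, m_axis] = sgn * size[m_axis] / 2
            pts.append(p + center)
            qs.append(np.full(n * n, sgn * M * dA))
        return np.vstack(pts), np.concatenate(qs)

    # magnet 1: M along z; magnet 2: M along x (perpendicular), offset in z
    p1, q1 = pole_patches(np.zeros(3), (0.02, 0.02, 0.01), 2, 1e6)
    c2 = np.array([0.005, 0.0, 0.025])
    p2, q2 = pole_patches(c2, (0.02, 0.02, 0.01), 0, 1e6)

    r = p2[:, None, :] - p1[None, :, :]          # patch-to-patch vectors
    d = np.linalg.norm(r, axis=2)
    F_pair = mu0 / (4 * np.pi) * (q2[:, None] * q1[None, :] / d**3)[..., None] * r
    F2 = F_pair.sum(axis=(0, 1))                 # total force on magnet 2
    tau2 = np.cross(p2 - c2, F_pair.sum(axis=1)).sum(axis=0)  # torque about c2

    print("force on magnet 2 (N):", np.round(F2, 4))
    print("torque about magnet-2 center (N*m):", np.round(tau2, 6))
    ```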

  4. Analytic theory of alternate multilayer gratings operating in single-order regime.

    Science.gov (United States)

    Yang, Xiaowei; Kozhevnikov, Igor V; Huang, Qiushi; Wang, Hongchang; Hand, Matthew; Sawhney, Kawal; Wang, Zhanshan

    2017-07-10

    Using the coupled wave approach (CWA), we introduce an analytical theory for alternate multilayer gratings (AMG) operating in the single-order regime, in which only one diffraction order is excited. Differing from previous studies analogizing the AMG to crystals, we conclude that a symmetrical structure, i.e. equal thickness of the two multilayer materials, is not the optimal design for an AMG and may result in a significant reduction in diffraction efficiency. The peculiarities of the AMG compared with other multilayer gratings are analyzed, and the influence of the multilayer materials on diffraction efficiency is considered. The validity conditions of the analytical theory are also discussed.

  5. Analytical Determining Of The Steinmetz Equivalent Diagram Elements Of Single-Phase Transformer

    Directory of Open Access Journals (Sweden)

    T. Aly Saandy

    2015-08-01

    Full Text Available This article presents an analytical calculation methodology for the Steinmetz equivalent diagram elements applied to the prediction of eddy current loss in a single-phase transformer. Based on electrical circuit theory, the active and reactive powers consumed by the core are expressed analytically as functions of the electromagnetic parameters, such as resistivity and permeability, and the geometrical dimensions of the core. The proposed modeling approach is established with the parallel-series duality. The equivalent diagram elements empirically determined by Steinmetz are analytically expressed using the expressions for the no-load transformer consumption. To verify the relevance of the model, validations by both simulations at different powers and measurements were carried out to determine the resistance and reactance of the core. The obtained results are in good agreement with the theoretical approach and the practical results.

  6. Robustness study in SSNTD method validation: indoor radon quality

    Energy Technology Data Exchange (ETDEWEB)

    Dias, D.C.S.; Silva, N.C.; Bonifácio, R.L., E-mail: danilacdias@gmail.com [Comissao Nacional de Energia Nuclear (LAPOC/CNEN), Pocos de Caldas, MG (Brazil). Laboratorio de Pocos de Caldas

    2017-07-01

    Quality control practices are indispensable to organizations aiming to reach analytical excellence. Method validation is an essential component to quality systems in laboratories, serving as a powerful tool for standardization and reliability of outcomes. This paper presents a study of robustness conducted over a SSNTD technique validation process, with the goal of developing indoor radon measurements at the highest level of quality. This quality parameter indicates how well a technique is able to provide reliable results in face of unexpected variations along the measurement. In this robustness study, based on the Youden method, 7 analytical conditions pertaining to different phases of the SSNTD technique (with focus on detector etching) were selected. Based on the ideal values for each condition as reference, extreme levels regarded as high and low were prescribed to each condition. A partial factorial design of 8 unique etching procedures was defined, where each presented their own set of high and low condition values. The Youden test provided 8 indoor radon concentration results, which allowed percentage estimations that indicate the potential influence of each analytical condition on the SSNTD technique. As expected, detector etching factors such as etching solution concentration, temperature and immersion time were identified as the most critical parameters to the technique. Detector etching is a critical step in the SSNTD method – one that must be carefully designed during validation and meticulously controlled throughout the entire process. (author)

  7. Robustness study in SSNTD method validation: indoor radon quality

    International Nuclear Information System (INIS)

    Dias, D.C.S.; Silva, N.C.; Bonifácio, R.L.

    2017-01-01

    Quality control practices are indispensable to organizations aiming to reach analytical excellence. Method validation is an essential component to quality systems in laboratories, serving as a powerful tool for standardization and reliability of outcomes. This paper presents a study of robustness conducted over a SSNTD technique validation process, with the goal of developing indoor radon measurements at the highest level of quality. This quality parameter indicates how well a technique is able to provide reliable results in face of unexpected variations along the measurement. In this robustness study, based on the Youden method, 7 analytical conditions pertaining to different phases of the SSNTD technique (with focus on detector etching) were selected. Based on the ideal values for each condition as reference, extreme levels regarded as high and low were prescribed to each condition. A partial factorial design of 8 unique etching procedures was defined, where each presented their own set of high and low condition values. The Youden test provided 8 indoor radon concentration results, which allowed percentage estimations that indicate the potential influence of each analytical condition on the SSNTD technique. As expected, detector etching factors such as etching solution concentration, temperature and immersion time were identified as the most critical parameters to the technique. Detector etching is a critical step in the SSNTD method – one that must be carefully designed during validation and meticulously controlled throughout the entire process. (author)
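    A minimal sketch of the Youden-type effect computation used in such a robustness study: eight runs, seven two-level analytical conditions arranged in a standard 8-run screening design (Plackett-Burman type, commonly used for Youden ruggedness tests), with the effect of each condition taken as the difference between the means of the results at its high and low levels. The design matrix below is the standard one; the radon-concentration results are placeholders, not values from the study.

    ```python
    import numpy as np

    # Standard 8-run Plackett-Burman design: cyclic shifts of
    # (+ + + - + - -) plus an all-minus run; columns are orthogonal.
    design = np.array([
        [+1, +1, +1, -1, +1, -1, -1],
        [-1, +1, +1, +1, -1, +1, -1],
        [-1, -1, +1, +1, +1, -1, +1],
        [+1, -1, -1, +1, +1, +1, -1],
        [-1, +1, -1, -1, +1, +1, +1],
        [+1, -1, +1, -1, -1, +1, +1],
        [+1, +1, -1, +1, -1, -1, +1],
        [-1, -1, -1, -1, -1, -1, -1],
    ])
    results = np.array([102., 98., 95., 104., 97., 101., 93., 99.])  # Bq/m^3, placeholders

    effects = design.T @ results / 4    # mean(high) - mean(low) per condition
    for k, e in enumerate(effects, 1):
        print(f"condition {k}: effect = {e:+.1f} Bq/m^3 "
              f"({100 * e / results.mean():+.1f}% of grand mean)")
    ```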

  8. Piezoresistive Cantilever Performance-Part I: Analytical Model for Sensitivity.

    Science.gov (United States)

    Park, Sung-Jin; Doll, Joseph C; Pruitt, Beth L

    2010-02-01

    An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors.

  9. Piezoresistive Cantilever Performance—Part I: Analytical Model for Sensitivity

    Science.gov (United States)

    Park, Sung-Jin; Doll, Joseph C.; Pruitt, Beth L.

    2010-01-01

    An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors. PMID:20336183

  10. Analytical solution to convection-radiation of a continuously moving fin with temperature-dependent thermal conductivity

    Directory of Open Access Journals (Sweden)

    Moradi Amir

    2013-01-01

    Full Text Available In this article, the simultaneous convection-radiation heat transfer of a moving fin of variable thermal conductivity is studied. The differential transformation method (DTM) is applied to obtain an analytic solution for heat transfer in fins with two different profiles, rectangular and exponential. The accuracy of the analytic solution is validated by comparing it with the numerical solution obtained by the fourth-order Runge-Kutta method. The analytical and numerical results are shown for different values of the embedding parameters. DTM results show that the series converges rapidly with high accuracy. The results indicate that the fin tip temperature increases when the ambient temperature increases. Conversely, the fin tip temperature decreases with an increase in the Peclet number and the convection-conduction and radiation-conduction parameters. It is shown that the fin tip temperature of the exponential profile is higher than that of the rectangular one. The numerical data and the analytical method are in good agreement with each other.
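    As a hedged sketch of the numerical benchmark mentioned above, one common nondimensional form of the moving-fin equation with temperature-dependent conductivity k = k_a(1 + βθ), convection (Nc), radiation (Nr, zero sink temperature assumed) and a Peclet number Pe for the fin motion can be solved with a collocation BVP solver; the exact equation and boundary conditions used in the paper may differ, and all parameter values here are illustrative:

    ```python
    import numpy as np
    from scipy.integrate import solve_bvp

    # d/dX[(1 + beta*theta) dtheta/dX] - Pe dtheta/dX - Nc*theta - Nr*theta^4 = 0
    # theta(0) = 1 (base),  dtheta/dX(1) = 0 (adiabatic tip, assumed)
    beta, Pe, Nc, Nr = 0.2, 1.0, 1.0, 0.5

    def rhs(X, y):
        theta, dtheta = y
        d2 = (Pe * dtheta + Nc * theta + Nr * theta**4
              - beta * dtheta**2) / (1 + beta * theta)
        return np.vstack([dtheta, d2])

    def bc(ya, yb):
        return np.array([ya[0] - 1.0, yb[1]])

    X = np.linspace(0, 1, 50)
    y0 = np.vstack([np.exp(-X), -np.exp(-X)])    # smooth initial guess
    sol = solve_bvp(rhs, bc, X, y0)
    print(f"fin tip temperature theta(1) = {sol.y[0, -1]:.4f}")
    ```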

  11. Sewage-based epidemiology in monitoring the use of new psychoactive substances: Validation and application of an analytical method using LC-MS/MS.

    Science.gov (United States)

    Kinyua, Juliet; Covaci, Adrian; Maho, Walid; McCall, Ann-Kathrin; Neels, Hugo; van Nuijs, Alexander L N

    2015-09-01

    Sewage-based epidemiology (SBE) employs the analysis of sewage to detect and quantify drug use within a community. While SBE has been applied repeatedly for the estimation of classical illicit drugs, only a few studies have investigated new psychoactive substances (NPS). These compounds mimic the effects of illicit drugs by introducing slight modifications to the chemical structures of controlled illicit drugs. We describe the optimization, validation, and application of an analytical method using liquid chromatography coupled to positive electrospray tandem mass spectrometry (LC-ESI-MS/MS) for the determination of seven NPS in sewage: methoxetamine (MXE), butylone, ethylone, methylone, methiopropamine (MPA), 4-methoxymethamphetamine (PMMA), and 4-methoxyamphetamine (PMA). Sample preparation was performed using solid-phase extraction (SPE) with Oasis MCX cartridges. The LC separation was done with a HILIC (150 x 3 mm, 5 µm) column, which ensured good resolution of the analytes with a total run time of 19 min. The lower limit of quantification (LLOQ) was between 0.5 and 5 ng/L for all compounds. The method was validated by evaluating the following parameters: sensitivity, selectivity, linearity, accuracy, precision, recoveries and matrix effects. The method was applied to sewage samples collected from sewage treatment plants in Belgium and Switzerland, in which all investigated compounds were detected except MPA and PMA. Furthermore, a consistent presence of MXE has been observed in most of the sewage samples at levels higher than the LLOQ. Copyright © 2015 John Wiley & Sons, Ltd.
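    Such measured concentrations typically feed the standard SBE back-calculation of population-normalized loads. A minimal sketch, with all inputs illustrative rather than taken from the study:

    ```python
    def population_load(conc_ng_L, flow_m3_day, correction, population):
        """Per-capita load in mg/day/1000 inhabitants: concentration (ng/L)
        times daily flow, scaled by a correction factor for metabolic
        excretion/in-sewer stability, normalized to the population served."""
        load_mg_day = conc_ng_L * flow_m3_day * 1000 * correction / 1e6
        return load_mg_day / (population / 1000)

    # e.g. methoxetamine (MXE) at 15 ng/L in a plant serving 200,000 people
    # with 50,000 m^3/day inflow (placeholder values; a correction factor of
    # 1.0 is assumed because MXE excretion data are scarce)
    print(f"{population_load(15, 50_000, 1.0, 200_000):.2f} mg/day/1000 inh.")
    ```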

  12. Predictive analytics and child protection: constraints and opportunities.

    Science.gov (United States)

    Russell, Jesse

    2015-08-01

    This paper considers how predictive analytics might inform, assist, and improve decision making in child protection. Predictive analytics reflects recent increases in data quantity and data diversity, along with advances in computing technology. While the use of data and statistical modeling is not new to child protection decision making, its use in child protection is experiencing growth, and efforts to leverage predictive analytics for better decision making in child protection are increasing. Past experiences, constraints and opportunities are reviewed. For predictive analytics to make the most impact on child protection practice and outcomes, it must embrace established criteria of validity, equity, reliability, and usefulness. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Hanford Environmental Restoration data validation process for chemical and radiochemical analyses

    International Nuclear Information System (INIS)

    Adams, M.R.; Bechtold, R.A.; Clark, D.E.; Angelos, K.M.; Winter, S.M.

    1993-10-01

    Detailed procedures for validation of chemical and radiochemical data are used to assure consistent application of validation principles and support a uniform database of quality environmental data. During application of these procedures, it was determined that laboratory data packages were frequently missing certain types of documentation causing subsequent delays in meeting critical milestones in the completion of validation activities. A quality improvement team was assembled to address the problems caused by missing documentation and streamline the entire process. The result was the development of a separate data package verification procedure and revisions to the data validation procedures. This has resulted in a system whereby deficient data packages are immediately identified and corrected prior to validation and revised validation procedures which more closely match the common analytical reporting practices of laboratory service vendors

  14. Analytic theory for the selection of a two-dimensional needle crystal at arbitrary Peclet number

    Science.gov (United States)

    Tanveer, S.

    1989-01-01

    An accurate analytic theory is presented for the velocity selection of a two-dimensional needle crystal for arbitrary Peclet number for small values of the surface tension parameter. The velocity selection is caused by the effect of transcendentally small terms which are determined by analytic continuation to the complex plane and analysis of nonlinear equations. The work supports the general conclusion of previous small Peclet number analytical results of other investigators, though there are some discrepancies in details. It also addresses questions raised on the validity of selection theory owing to assumptions made on shape corrections at large distances from the tip.

  15. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O' Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  16. New analytical results in the electromagnetic response of composite superconducting wire in parallel fields

    NARCIS (Netherlands)

    Niessen, E.M.J.; Niessen, E.M.J.; Zandbergen, P.J.

    1993-01-01

    Analytical results are presented concerning the electromagnetic response of a composite superconducting wire in fields parallel to the wire axis, using the Maxwell equations supplemented with constitutive equations. The problem is nonlinear due to the nonlinearity in the constitutive equation.

  17. Validation of the Classroom Behavior Inventory

    Science.gov (United States)

    Blunden, Dale; And Others

    1974-01-01

    Factor-analytic methods were used to assess construct validity of the Classroom Behavior Inventory, a scale for rating behaviors associated with hyperactivity. The Classroom Behavior Inventory measures three dimensions of behavior: Hyperactivity, Hostility, and Sociability. Significant concurrent validity was obtained for only one Classroom Behavior…

  18. Decentral gene expression analysis: analytical validation of the Endopredict genomic multianalyte breast cancer prognosis test

    International Nuclear Information System (INIS)

    Kronenwett, Ralf; Brase, Jan C; Weber, Karsten E; Fisch, Karin; Müller, Berit M; Schmidt, Marcus; Filipits, Martin; Dubsky, Peter; Petry, Christoph; Dietel, Manfred; Denkert, Carsten; Bohmann, Kerstin; Prinzler, Judith; Sinn, Bruno V; Haufe, Franziska; Roth, Claudia; Averdick, Manuela; Ropers, Tanja; Windbergs, Claudia

    2012-01-01

    EndoPredict (EP) is a clinically validated multianalyte gene expression test to predict distant metastasis in ER-positive, HER2-negative breast cancer treated with endocrine therapy alone. The test is based on the combined analysis of 12 genes in formalin-fixed, paraffin-embedded (FFPE) tissue by reverse transcription-quantitative real-time PCR (RT-qPCR). Recently, it was shown that EP is feasible for reliable decentralized assessment of gene expression. The aim of this study was the analytical validation of the performance characteristics of the assay and its verification in a molecular-pathological routine laboratory. Gene expression values to calculate the EP score were assayed by one-step RT-qPCR using RNA from FFPE tumor tissue. Limit of blank, limit of detection, linear range, and PCR efficiency were assessed for each of the 12 PCR assays using serial sample dilutions. Different breast cancer samples were used to evaluate RNA input range, precision and inter-laboratory variability. PCR assays were linear up to Cq values between 35.1 and 37.2. Amplification efficiencies ranged from 75% to 101%. The RNA input range without considerable change of the EP score was between 0.16 and 18.5 ng/μl. Analysis of precision (variation of day, daytime, instrument, operator, reagent lots) resulted in a total noise (standard deviation) of 0.16 EP score units on a scale from 0 to 15. The major part of the total noise (SD 0.14) was caused by the replicate-to-replicate noise of the PCR assays (repeatability) and was not associated with different operating conditions (reproducibility). Performance characteristics established in the manufacturer’s laboratory were verified in a routine molecular pathology laboratory. Comparison of 10 tumor samples analyzed in two different laboratories showed a Pearson coefficient of 0.995 and a mean deviation of 0.15 score units. The EP test showed reproducible performance characteristics with good precision and negligible laboratory

  19. Kawerau fluid chemistry : analytical results

    International Nuclear Information System (INIS)

    Mroczek, E.K.; Christenson, B.W.; Mountain, B.; Stewart, M.K.

    2001-01-01

    This report summarises the water and gas analytical data collected from Kawerau geothermal field 1998-2000 under the Sustainable Management of Geothermal and Mineral Resources (GMR) Project, Objective 2 'Understanding New Zealand Geothermal Systems'. The work is part of the continuing effort to characterise the chemical, thermal and isotopic signatures of the deep magmatic heat sources which drive our geothermal systems. At Kawerau there is clear indication that the present-day heat source relates to young volcanism within the field. However, being at the margins of the explored reservoir, little is presently known of the characteristics of that heat source. The Kawerau study follows on directly from the recently completed work characterising the geochemical signatures of the Ohaaki hydrothermal system. In the latter study the interpretation of the radiogenic noble gas isotope systematics was of fundamental importance in characterising the magmatic heat source. Unfortunately the collaboration with LLNL, which analysed the isotopes, could not be extended to include the Kawerau data. The gas samples have been archived and will be analysed once a new collaborator is found to continue the work. The purpose of the present compilation is to facilitate the final completion of the study by ensuring the data is accessible in one report. (author). 5 refs., 2 figs., 9 tabs

  20. Evaluation of analytical results on DOE Quality Assessment Program Samples

    International Nuclear Information System (INIS)

    Jaquish, R.E.; Kinnison, R.R.; Mathur, S.P.; Sastry, R.

    1985-01-01

    Criteria were developed for evaluating the participants' analytical results in the DOE Quality Assessment Program (QAP). Historical data from previous QAP studies were analyzed using descriptive statistical methods to determine the interlaboratory precision that had been attained. Performance criteria used in other similar programs were also reviewed. Using these data, precision values and control limits were recommended for each type of analysis performed in the QA program. Results of the analyses performed by the QAP participants on the November 1983 samples were statistically analyzed and evaluated. The Environmental Measurements Laboratory (EML) values were used as the known values, and 3-sigma precision values were used as control limits. Results were submitted by 26 participating laboratories for 49 different radionuclide-media combinations. The participants reported 419 results and of these, 350 (84%) were within control limits. Special attention was given to the data from gamma spectral analysis of air filters and water samples. Both normal probability and box plots were prepared for each nuclide to help evaluate the distribution of the data. Results that were outside the expected range were identified, and suggestions were made that laboratories check calculations and procedures for these results.

  1. Analytical Methodology for the Determination of Radium Isotopes in Environmental Samples

    International Nuclear Information System (INIS)

    2010-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004, the environment programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. Measurements of radium isotopes are important for radiological and environmental protection, geochemical and geochronological investigations, hydrology, etc. The suite of isotopes creates and stimulates continuing interest in the development of new methods for determination of radium in various media. In this publication, the four most routinely used analytical methods for radium determination in biological and environmental samples, i.e. alpha spectrometry, gamma spectrometry, liquid scintillation spectrometry and mass spectrometry, are reviewed

  2. Analytical dynamic modeling of fast trilayer polypyrrole bending actuators

    International Nuclear Information System (INIS)

    Amiri Moghadam, Amir Ali; Moavenian, Majid; Tahani, Masoud; Torabi, Keivan

    2011-01-01

    Analytical modeling of conjugated polymer actuators with complicated electro-chemo-mechanical dynamics is an interesting area for research, due to the wide range of applications including biomimetic robots and biomedical devices. Although there have been extensive reports on modeling the electrochemical dynamics of polypyrrole (PPy) bending actuators, mechanical dynamics modeling of the actuators remains unexplored. PPy actuators can operate with low voltage while producing large displacement in comparison to robotic joints; they do not have friction or backlash, but they suffer from some disadvantages such as creep and hysteresis. In this paper, a complete analytical dynamic model for fast trilayer polypyrrole bending actuators has been proposed and named the analytical multi-domain dynamic actuator (AMDDA) model. First an electrical admittance model of the actuator is obtained based on a distributed RC line; subsequently a proper mechanical dynamic model is derived based on Hamilton's principle. The proposed modeling approach is validated against recently published experimental results.

  3. Tank 241-AN-104, cores 163 and 164 analytical results for the final report

    International Nuclear Information System (INIS)

    Steen, F.H.

    1997-01-01

    This document is the analytical laboratory report for tank 241-AN-104 push mode core segments collected between August 8, 1996 and September 12, 1996. The segments were subsampled and analyzed in accordance with the Tank 241-AN-104 Push Mode Core Sampling and Analysis Plan (TSAP) (Winkelman, 1996), the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995) and the Flammable Gas Data Quality Objective (DQO) (Benar, 1995). The analytical results are included in a data summary table. None of the samples submitted for Differential Scanning Calorimetry (DSC), Total Alpha Activity (AT), Total Organic Carbon (TOC) and plutonium (239/240 Pu) analyses exceeded notification limits as stated in the TSAP. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997) and are not considered in this report.
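
    The 95% confidence interval on the mean referred to above is a standard small-sample calculation; a minimal sketch using a t-interval (illustrative numbers, not tank data):

    ```python
    import numpy as np
    from scipy import stats

    # Illustrative duplicate analyses for a single analyte (made-up values).
    x = np.array([0.42, 0.45, 0.40, 0.47])
    mean, sem = x.mean(), stats.sem(x)   # sample mean and its standard error
    low, high = stats.t.interval(0.95, df=x.size - 1, loc=mean, scale=sem)
    print(f"mean = {mean:.3f}, 95% CI = ({low:.3f}, {high:.3f})")
    ```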

  4. Process and results of analytical framework and typology development for POINT

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik; Lehtonen, Markku; Bauler, Tom

    2009-01-01

    POINT is a project about how indicators are used in practice; to what extent and in what way indicators actually influence, support, or hinder policy and decision making processes, and what could be done to enhance the positive role of indicators in such processes. The project needs an analytical......, a set of core concepts and associated typologies, a series of analytic schemes proposed, and a number of research propositions and questions for the subsequent empirical work in POINT....

  5. Analytical results for non-Hermitian parity–time-symmetric and ...

    Indian Academy of Sciences (India)

    Abstract. We investigate both the non-Hermitian parity–time-(PT-)symmetric and Hermitian asymmetric volcano potentials, and present the analytical solution in terms of the confluent Heun function. Under certain special conditions, the confluent Heun function can be terminated as a polynomial, thereby leading to certain ...

  6. Automation of analytical systems in power cycles

    International Nuclear Information System (INIS)

    Staub Lukas

    2008-01-01

    'Automation' is a widely used term in instrumentation and is often applied to signal exchange, PLC and SCADA systems. Common use, however, does not necessarily describe autonomous operation of analytical devices. We define an automated analytical system as a black box with an input (sample) and an output (measured value). In addition, we need dedicated status lines for assessing the validity of the input to our black box and of the output passed to subsequent systems. We will discuss input parameters, automated analytical processes and output parameters. Further consideration will be given to signal exchange and integration into the operating routine of a power plant. Local control loops (chemical dosing) and the automation of sampling systems are not discussed here. (author)

  7. Experimental, Numerical, and Analytical Slosh Dynamics of Water and Liquid Nitrogen in a Spherical Tank

    Science.gov (United States)

    Storey, Jedediah Morse

    2016-01-01

    Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and to improving the performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many experimental and numerical studies of water slosh have been conducted; however, slosh data for cryogenic liquids are lacking. Water and cryogenic liquid nitrogen are used in various ground-based tests with a spherical tank to characterize damping, slosh mode frequencies, and slosh forces. A single ring baffle is installed in the tank for some of the tests. Analytical models for slosh modes, slosh forces, and baffle damping are constructed based on prior work. Select experiments are simulated using commercial CFD software, and the numerical results are compared to the analytical and experimental results for the purposes of validation and methodology improvement.

  8. Nuclear forensics: strategies and analytical techniques

    International Nuclear Information System (INIS)

    Marin, Rafael C.; Sarkis, Jorge E.S.; Pestana, Rafael C.B.

    2013-01-01

    The development of nuclear forensics as a field of science arose in response to international demand for methods to investigate the illicit trafficking of nuclear materials. After being seized, unknown nuclear material is collected and analyzed by a set of analytical methods. The fingerprints of these materials can be identified and further used during the investigations. Data interpretation is an extensive process aiming to validate the hypotheses made by the experts, and can help confirm the origin of seized nuclear materials at the end of the process or investigation. This work presents the set of measures and analytical methods that have been inherited by nuclear forensics from several fields of science. The main characteristics of these methods are evaluated and the analytical techniques employed to determine the fingerprint of nuclear materials are described. (author)

  9. Development, validation and application of a sensitive analytical method for residue determination and dissipation of imidacloprid in sugarcane under tropical field condition.

    Science.gov (United States)

    Ramasubramanian, T; Paramasivam, M; Nirmala, R

    2016-06-01

    A simple and sensitive analytical method has been developed and validated for the determination of trace amounts of imidacloprid in/on sugarcane sett, stalk and leaf. The method optimized in the present study requires less organic solvent and less time, making it suitable for high-throughput analyses involving large numbers of samples. The limit of detection (LOD) and limit of quantification (LOQ) of the method were 0.003 and 0.01 mg/kg, respectively. The recovery was greater than 93% and the relative standard deviation less than 4%, showing that the method is sufficiently precise and accurate to determine imidacloprid residues in sugarcane sett, stalk and leaf. The dissipation and translocation of imidacloprid residues from treated cane setts to leaf and stalk were studied using this method. In sugarcane setts, imidacloprid residues persisted for up to 120 days, with a half-life of 15.4 days at the recommended dose (70 g a.i./ha). The residues were found to translocate from setts to stalk and leaf, where they were detectable for up to 105 days. Dipping of sugarcane setts in imidacloprid at its recommended dose may result in better protection of cane setts and the established crop because of the higher initial deposit (>100 mg/kg) and longer persistence (>120 days).
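
    Given the reported half-life, first-order dissipation kinetics predict the residue curve directly; a minimal sketch (the initial deposit is set to the reported ~100 mg/kg, other choices are arbitrary):

    ```python
    import numpy as np

    half_life = 15.4              # days, as reported for sugarcane setts
    k = np.log(2) / half_life     # first-order dissipation rate constant
    c0 = 100.0                    # initial deposit, mg/kg (reported as >100)

    for t in (0, 30, 60, 90, 120):
        print(f"day {t:3d}: {c0 * np.exp(-k * t):7.2f} mg/kg")
    ```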

  10. Tank 241-AX-103, cores 212 and 214 analytical results for the final report

    International Nuclear Information System (INIS)

    Steen, F.H.

    1998-01-01

    This document is the analytical laboratory report for tank 241-AX-103 push mode core segments collected between July 30, 1997 and August 11, 1997. The segments were subsampled and analyzed in accordance with the Tank 241-AX-103 Push Mode Core Sampling and Analysis Plan (TSAP) (Comer, 1997), the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995) and the Data Quality Objective to Support Resolution of the Organic Complexant Safety Issue (Organic DQO) (Turner, et al., 1995). The analytical results are included in the data summary table (Table 1). None of the samples submitted for Differential Scanning Calorimetry (DSC), Total Alpha Activity (AT), plutonium 239 (Pu239), and Total Organic Carbon (TOC) exceeded notification limits as stated in the TSAP (Conner, 1997). The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997) and are not considered in this report.

  11. Validation of multi-element isotope dilution ICPMS for the analysis of basalts

    Energy Technology Data Exchange (ETDEWEB)

    Willbold, M.; Jochum, K.P.; Raczek, I.; Amini, M.A.; Stoll, B.; Hofmann, A.W. [Max-Planck-Institut fuer Chemie, Mainz (Germany)

    2003-09-01

    In this study we have validated a newly developed multi-element isotope dilution (ID) ICPMS method for the simultaneous analysis of up to 12 trace elements in geological samples. By evaluating the analytical uncertainty of individual components using certified reference materials, we have quantified the overall analytical uncertainty of the multi-element ID ICPMS method at 1-2%. Individual components include sampling/weighing, purity of reagents, purity of spike solutions, calibration of spikes, determination of isotopic ratios, instrumental sources of error, correction of the mass discrimination effect, values of constants, and operator bias. We have used the ID-determined trace elements for internal standardization to indirectly improve the analysis of 14 other, mainly mono-isotopic, trace elements by external calibration. The overall analytical uncertainty for those data is about 2-3%. In addition, we have analyzed USGS and MPI-DING geological reference materials (BHVO-1, BHVO-2, KL2-G, ML3B-G) to quantify the overall bias of the measurement procedure. Trace element analysis of the geological reference materials yielded results that mostly agree within about 2-3% with the reference values. Since these results match the conclusions obtained from the investigation of the overall analytical uncertainty, we take this as a measure of the validity of multi-element ID ICPMS. (orig.)
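
    An overall uncertainty of this kind is consistent with combining independent relative component uncertainties in quadrature; a minimal sketch (the individual component values below are invented for illustration):

    ```python
    import math

    # Illustrative relative standard uncertainties (%) for components of the
    # kind named in the record; the individual values are assumptions.
    components = {
        "weighing": 0.2, "reagent purity": 0.3, "spike calibration": 0.8,
        "isotope ratios": 0.7, "mass discrimination": 0.5, "instrument": 0.6,
    }
    u_c = math.sqrt(sum(u**2 for u in components.values()))
    print(f"combined relative uncertainty = {u_c:.1f}%")  # ~1.4%, within 1-2%
    ```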

  12. Arnol'd tongues for a resonant injection-locked frequency divider: analytical and numerical results

    DEFF Research Database (Denmark)

    Bartuccelli, Michele; Deane, Jonathan H.B.; Gentile, Guido

    2010-01-01

    ...Arnol'd tongues in the frequency–amplitude plane. In particular, we provide exact analytical formulae for the widths of the tongues, which correspond to the plateaux of the devil's staircase picture. The results account for numerical and experimental findings presented in the literature for special driving terms...

  13. Tank 241-TX-118, core 236 analytical results for the final report

    International Nuclear Information System (INIS)

    ESCH, R.A.

    1998-01-01

    This document is the analytical laboratory report for tank 241-TX-118 push mode core segments collected between April 1, 1998 and April 13, 1998. The segments were subsampled and analyzed in accordance with the Tank 241-TX-118 Push Mode Core Sampling and Analysis Plan (TSAP) (Benar, 1997), the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995), the Data Quality Objective to Support Resolution of the Organic Complexant Safety Issue (Organic DQO) (Turner, et al., 1995) and the Historical Model Evaluation Data Requirements (Historical DQO) (Sipson, et al., 1995). The analytical results are included in the data summary table (Table 1). None of the samples submitted for Differential Scanning Calorimetry (DSC) and Total Organic Carbon (TOC) exceeded notification limits as stated in the TSAP (Benar, 1997). One sample exceeded the Total Alpha Activity (AT) analysis notification limit of 38.4 µCi/g (based on a bulk density of 1.6): core 236 segment 1, lower half solids (S98T001524). Appropriate notifications were made. Plutonium 239/240 analysis was requested as a secondary analysis. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997) and are not considered in this report.

  14. Impurities in biogas - validation of analytical methods for siloxanes; Foeroreningar i biogas - validering av analysmetodik foer siloxaner

    Energy Technology Data Exchange (ETDEWEB)

    Arrhenius, Karine; Magnusson, Bertil; Sahlin, Eskil [SP Technical Research Institute of Sweden, Boraas (Sweden)

    2011-11-15

    Biogas produced from digesters or landfills contains impurities which can be harmful to components that come into contact with the biogas during its utilization. Among these impurities, siloxanes are often mentioned. During combustion, siloxanes are converted to silicon dioxide, which accumulates on the heated surfaces of combustion equipment. Silicon dioxide is a solid compound that will remain in the engine and cause damage. Consequently, it is necessary to develop methods for the accurate determination of these compounds in biogases. In the first part of this report, a method for the analysis of siloxanes in biogases was validated. Sampling was performed directly at the plant by drawing a small volume of biogas onto an adsorbent tube over a short period of time. These tubes were subsequently sent to the laboratory for analysis. The purpose of method validation is to demonstrate that the established method is fit for purpose, i.e. that the method, as used by the laboratory generating the data, will provide data that meet a set of criteria concerning precision and accuracy. Finally, the uncertainty of the method was calculated. In the second part of this report, the validated method was applied to real samples collected at waste water treatment plants, co-digestion plants and plants digesting other wastes (agricultural waste). Results are presented at the end of this report. As expected, the biogases from waste water treatment plants contained considerably higher concentrations of siloxanes than biogases from co-digestion plants and plants digesting agricultural wastes. The concentration of siloxanes in upgraded biogas was low, regardless of which feedstock was digested and which upgrading technique was used.

  15. Approximate, analytic solutions of the Bethe equation for charged particle range

    OpenAIRE

    Swift, Damian C.; McNaney, James M.

    2009-01-01

    By either performing a Taylor expansion or making a polynomial approximation, the Bethe equation for charged particle stopping power in matter can be integrated analytically to obtain the range of charged particles in the continuous deceleration approximation. Ranges match reference data to the expected accuracy of the Bethe model. In the non-relativistic limit, the energy deposition rate was also found analytically. The analytic relations can be used to complement and validate numerical solutions.
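
    For context, the range in the continuous-deceleration (continuous-slowing-down) approximation is the integral of the reciprocal stopping power; written here with the standard non-relativistic Bethe formula in conventional notation (not copied from the paper):

    ```latex
    R(E_0) = \int_0^{E_0} \frac{\mathrm{d}E}{\left|\mathrm{d}E/\mathrm{d}x\right|},
    \qquad
    -\frac{\mathrm{d}E}{\mathrm{d}x} = \frac{4\pi e^4 z^2 n_e}{m_e v^2}\,
    \ln\!\left(\frac{2 m_e v^2}{I}\right)
    ```

    where z is the projectile charge, v its speed, n_e the electron density of the medium and I its mean excitation energy; the Taylor or polynomial approximations described above are what make this integral tractable in closed form.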

  16. Analytical Solution for the Anisotropic Rabi Model: Effects of Counter-Rotating Terms

    Science.gov (United States)

    Zhang, Guofeng; Zhu, Hanjie

    2015-03-01

    The anisotropic Rabi model, which was proposed recently, differs from the original Rabi model in that the rotating and counter-rotating terms are governed by two different coupling constants. This feature allows us to vary the counter-rotating interaction independently and explore its effects on various quantum properties. In this paper, we eliminate the counter-rotating terms approximately and obtain analytical energy spectra and wavefunctions. These analytical results agree well with numerical calculations over a wide range of parameters, including the ultrastrong coupling regime. In the weak counter-rotating coupling limit we find that the counter-rotating terms can be treated as shifts to the parameters of the Jaynes-Cummings model. This modification shows the validity of the rotating-wave approximation under the assumption of near-resonance and relatively weak coupling. Moreover, analytical expressions for several physical quantities are also derived, and the results show the breakdown of the U(1) symmetry and the deviation from the Jaynes-Cummings model.
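
    For orientation, the anisotropic Rabi Hamiltonian is commonly written with separate rotating and counter-rotating couplings (standard notation, assumed here rather than quoted from the record):

    ```latex
    H = \omega\, a^{\dagger}a + \frac{\Omega}{2}\,\sigma_z
      + g_1\left(a^{\dagger}\sigma_- + a\,\sigma_+\right)
      + g_2\left(a^{\dagger}\sigma_+ + a\,\sigma_-\right)
    ```

    Setting g_2 = 0 recovers the Jaynes-Cummings model, while g_1 = g_2 recovers the original Rabi model.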

  17. Analytical results of radiochemistry of the JRR-3M

    International Nuclear Information System (INIS)

    Yoshijima, Tetsuo; Tanaka, Sumitoshi

    1997-07-01

    The JRR-3 was modified and upgraded to enhance its experimental capabilities in 1990, becoming the JRR-3M. The JRR-3M is a pool-type research reactor, moderated and cooled by light water, with a maximum thermal power of 20 MWt and a thermal neutron flux of about 2×10¹⁴ n/cm²·s. The core internal structure and fuel cladding tubes are made of aluminum alloy. The cooling systems comprise the primary cooling system, secondary cooling system, heavy water reflector system and helium gas system. The primary piping system, reactor pool and heavy water reflector system are constructed of type 304 stainless steel. The main objectives of radiochemistry are to check for general corrosion of structural materials and to detect failed fuel elements, for safe operation of the reactor plant. In this report, analytical results of radiochemistry and an evaluation of the radionuclides in the cooling systems of the JRR-3M are described. (author)

  18. Measurement of HDO Products Using GC-TCD: Towards Obtaining Reliable Analytical Data

    Directory of Open Access Journals (Sweden)

    Zuas Oman

    2018-03-01

    This paper reports the development and validation of a gas chromatography with thermal conductivity detector (GC-TCD) method for the measurement of the gaseous products of hydrodeoxygenation (HDO). The method validation parameters include selectivity, precision (repeatability and reproducibility), accuracy, linearity, limit of detection (LoD), limit of quantitation (LoQ), and robustness. The results showed that the developed method was able to separate the target components (H2, CO2, CH4 and CO) from their mixtures without any special sample treatment. The validated method was selective, precise, accurate, and robust. Application of the developed and validated GC-TCD method to the measurement of by-product components of HDO of bio-oil showed good performance, with relative standard deviations (RSD) of less than 1.0% for all target components, implying that the process of method development and validation provides a trustworthy way of obtaining reliable analytical data.

  19. Analytical model of a burst assembly algorithm for the VBR in the OBS networks

    International Nuclear Information System (INIS)

    Shargabi, M.A.A.; Mellah, H.; Abid, A.

    2008-01-01

    This paper proposes an analytical model for the number of bursts aggregated over a period of time in OBS networks. The model considers the case of VBR traffic with two different sending rates, the SCR and the PCR. The model is validated using extensive simulations, whose results are in total agreement with the results obtained from the proposed model. (author)

  20. Principles of validation of diagnostic assays for infectious diseases

    International Nuclear Information System (INIS)

    Jacobson, R.H.

    1998-01-01

    Assay validation requires a series of inter-related processes. Assay validation is an experimental process: reagents and protocols are optimized by experimentation to detect the analyte with accuracy and precision. Assay validation is a relative process: its diagnostic sensitivity and diagnostic specificity are calculated relative to test results obtained from reference animal populations of known infection/exposure status. Assay validation is a conditional process: classification of animals in the target population as infected or uninfected is conditional upon how well the reference animal population used to validate the assay represents the target population; accurate predictions of the infection status of animals from test results (PV+ and PV-) are conditional upon the estimated prevalence of disease/infection in the target population. Assay validation is an incremental process: confidence in the validity of an assay increases over time when use confirms that it is robust, as demonstrated by accurate and precise results; the assay may also achieve increasing levels of validity as it is upgraded and extended by adding reference populations of known infection status. Assay validation is a continuous process: the assay remains valid only insofar as it continues to provide accurate and precise results as proven through statistical verification. Therefore, the work required for validation of diagnostic assays for infectious diseases does not end with a time-limited series of experiments based on a few reference samples; rather, assuring valid test results from an assay requires constant vigilance and maintenance of the assay, along with reassessment of its performance characteristics for each unique population of animals to which it is applied. (author)
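
    The predictive values (PV+ and PV-) mentioned above follow from diagnostic sensitivity, diagnostic specificity and prevalence via Bayes' rule; a minimal sketch (the example numbers are arbitrary):

    ```python
    def predictive_values(se, sp, prev):
        """Positive and negative predictive values from diagnostic sensitivity,
        diagnostic specificity and estimated prevalence (standard Bayes relations)."""
        ppv = se * prev / (se * prev + (1 - sp) * (1 - prev))
        npv = sp * (1 - prev) / (sp * (1 - prev) + (1 - se) * prev)
        return ppv, npv

    ppv, npv = predictive_values(se=0.95, sp=0.98, prev=0.05)
    print(f"PV+ = {ppv:.2f}, PV- = {npv:.3f}")
    ```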

  1. Validation Test Results for Orthogonal Probe Eddy Current Thruster Inspection System

    Science.gov (United States)

    Wincheski, Russell A.

    2007-01-01

    Recent nondestructive evaluation efforts within NASA have focused on an inspection system for the detection of intergranular cracking originating in the relief radius of Primary Reaction Control System (PRCS) thrusters. Of particular concern is deep cracking in this area, which could lead to combustion leakage in the event of through-wall cracking from the relief radius into an acoustic cavity of the combustion chamber. In order to reliably detect such defects while ensuring minimal false positives during inspection, the Orthogonal Probe Eddy Current (OPEC) system has been developed and an extensive validation study performed. This report describes the validation procedure, sample set, and inspection results, as well as comparing the validation flaws with the response from naturally occurring damage.

  2. Microplastics in the environment: Challenges in analytical chemistry - A review.

    Science.gov (United States)

    Silva, Ana B; Bastos, Ana S; Justino, Celine I L; da Costa, João P; Duarte, Armando C; Rocha-Santos, Teresa A P

    2018-08-09

    Microplastics can be present in the environment as manufactured microplastics (known as primary microplastics) or as the result of the continuous weathering of plastic litter, which yields progressively smaller plastic fragments (known as secondary microplastics). Herein, we discuss the numerous issues associated with the analysis of microplastics, and to a lesser extent of nanoplastics, in environmental samples (water, sediments, and biological tissues), from their sampling and sample handling to their identification and quantification. The analytical quality control and quality assurance associated with the validation of analytical methods and the use of reference materials for the quantification of microplastics are also discussed, as well as the current challenges within this field of research and possible routes to overcome such limitations. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Polarographic validation of chemical speciation models

    International Nuclear Information System (INIS)

    Duffield, J.R.; Jarratt, J.A.

    2001-01-01

    It is well established that the chemical speciation of an element in a given matrix, or system of matrices, is of fundamental importance in controlling the transport behaviour of the element. Therefore, to accurately understand and predict the transport of elements and compounds in the environment it is a requirement that both the identities and concentrations of trace element physico-chemical forms can be ascertained. These twin requirements present the analytical scientist with considerable challenges given the labile equilibria, the range of time scales (from nanoseconds to years) and the range of concentrations (ultra-trace to macro) that may be involved. As a result of this analytical variability, chemical equilibrium modelling has become recognised as an important predictive tool in chemical speciation analysis. However, this technique requires firm underpinning by the use of complementary experimental techniques for the validation of the predictions made. The work reported here has been undertaken with the primary aim of investigating possible methodologies that can be used for the validation of chemical speciation models. However, in approaching this aim, direct chemical speciation analyses have been made in their own right. Results will be reported and analysed for the iron(II)/iron(III)-citrate proton system (pH 2 to 10; total [Fe] = 3 mmol dm⁻³; total [citrate³⁻] = 10 mmol dm⁻³) in which equilibrium constants have been determined using glass electrode potentiometry, speciation is predicted using the PHREEQE computer code, and validation of predictions is achieved by determination of iron complexation and redox state with associated concentrations. (authors)

  4. Diffusion weighted MRI by spatiotemporal encoding: Analytical description and in vivo validations

    Science.gov (United States)

    Solomon, Eddy; Shemesh, Noam; Frydman, Lucio

    2013-07-01

    Diffusion-weighted (DW) MRI is a powerful modality for studying microstructure in normal and pathological tissues. The accuracy derived from DW MRI depends on the acquisition of quality images, and on a precise assessment of the b-values involved. Conventional DW MRI tends to be of limited use in regions suffering from large magnetic field or chemical shift heterogeneities, which severely distort the MR images. In this study we propose novel sequences based on SPatio-temporal ENcoding (SPEN), which overcome such shortcomings owing to SPEN's inherent robustness to offsets. SPEN, however, relies on the simultaneous application of gradients and radiofrequency-swept pulses, which may impart different diffusion weightings along the spatial axes. These will be further complicated in DW measurements by the diffusion-sensitizing gradients, and will in general lead to complex, spatially dependent b-values. This study presents a formalism for analyzing these diffusion-weighted SPEN (dSPEN) data, which takes into account the concomitant effects of the adiabatic pulses, of the imaging as well as diffusion gradients, and of the cross-terms between them. These analytical b-value derivations are subjected to experimental validation in phantom systems and ex vivo spinal cords. Excellent agreement is found between the theoretical predictions and the dSPEN experiments. The ensuing methodology is then demonstrated by in vivo mapping of diffusion in the human breast, an organ where conventional k-space DW acquisition methods are challenged by both field and chemical shift heterogeneities. These studies demonstrate the increased robustness of dSPEN vis-à-vis comparable DW echo planar imaging, and the value of this new methodology for medium- or high-field diffusion measurements in heterogeneous systems.
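
    For reference, the conventional b-value of a pulsed-gradient spin-echo experiment (the standard Stejskal-Tanner expression, not the dSPEN-specific result derived in the paper) is:

    ```latex
    b = \gamma^2 G^2 \delta^2 \left(\Delta - \frac{\delta}{3}\right)
    ```

    where γ is the gyromagnetic ratio, G the diffusion-gradient amplitude, δ its duration and Δ the separation between the gradient pulses; the dSPEN formalism generalizes this by adding the chirped-pulse, imaging-gradient and cross-term contributions.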

  5. Some analytical results for toroidal magnetic field coils with elongated minor cross-sections

    International Nuclear Information System (INIS)

    Raeder, J.

    1976-09-01

    The problem of determining the shape of a flexible current filament forming part of an ideal toroidal magnetic field coil is solved in a virtually analytical form. Analytical formulae for characteristic coil dimensions, stored magnetic energies, inductances and forces are derived for the so-called D-coils. The analytically calculated inductances of ideal D-coils are compared with numerically calculated ones for the case of finite numbers of D-shaped current filaments. Finally, the magnetic energies stored in ideal rectangular, elliptic and D-coils are compared. (orig.)

  6. Interpretation of results for tumor markers on the basis of analytical imprecision and biological variation

    DEFF Research Database (Denmark)

    Sölétormos, G; Schiøler, V; Nielsen, D

    1993-01-01

    Interpretation of results for CA 15.3, carcinoembryonic antigen (CEA), and tissue polypeptide antigen (TPA) during breast cancer monitoring requires data on intra- (CVP) and inter- (CVG) individual biological variation, analytical imprecision (CVA), and indices of individuality. The average CVP...
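
    The quantities named above combine in the standard formulas of the biological-variation literature (these are the conventional expressions, not taken from this paper): the index of individuality II and the reference change value RCV,

    ```latex
    II = \frac{\sqrt{CV_A^2 + CV_P^2}}{CV_G},
    \qquad
    RCV = \sqrt{2}\, z \sqrt{CV_A^2 + CV_P^2}
    ```

    where z is the standard normal deviate for the chosen significance level (e.g. 1.96 for 95%). A low II indicates that population-based reference intervals are of limited use for monitoring an individual patient, which is why serial tumor-marker results are interpreted against the RCV instead.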

  7. Validation of an analytical method for simultaneous high-precision measurements of greenhouse gas emissions from wastewater treatment plants using a gas chromatography-barrier discharge detector system.

    Science.gov (United States)

    Pascale, Raffaella; Caivano, Marianna; Buchicchio, Alessandro; Mancini, Ignazio M; Bianco, Giuliana; Caniani, Donatella

    2017-01-13

    Wastewater treatment plants (WWTPs) emit CO₂ and N₂O, which contribute to climate change and global warming. Over the last few years, awareness of greenhouse gas (GHG) emissions from WWTPs has increased. Moreover, the development of valid, reliable, and high-throughput analytical methods for simultaneous gas analysis is an essential requirement for environmental applications. In the present study, an analytical method based on a gas chromatograph (GC) equipped with a barrier ionization discharge (BID) detector was developed for the first time. This new method analyses CO₂ and N₂O simultaneously, with precisions, measured as relative standard deviations (RSD%), equal to or less than 6.6% and 5.1%, respectively. The method's detection limits are 5.3 ppmv for CO₂ and 62.0 ppbv for N₂O. The method's selectivity, linearity, accuracy, repeatability, intermediate precision, limit of detection and limit of quantification were good at trace concentration levels. After validation, the method was applied to a real case of N₂O and CO₂ emissions from a WWTP, confirming its suitability as a standard procedure for simultaneous GHG analysis in environmental samples containing CO₂ levels of less than 12,000 mg/L. Copyright © 2016 Elsevier B.V. All rights reserved.
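
    Detection and quantification limits like those above are often estimated from the calibration line using the ICH 3.3σ/S and 10σ/S conventions; a minimal sketch (the numbers are placeholders, not the paper's data):

    ```python
    # ICH-style estimates from a calibration line: LOD = 3.3*sigma/S and
    # LOQ = 10*sigma/S, where sigma is the residual standard deviation of
    # the regression and S its slope.
    sigma = 0.8    # assumed residual standard deviation (signal units)
    slope = 0.5    # assumed calibration slope (signal per ppm)

    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope
    print(f"LOD = {lod:.1f} ppm, LOQ = {loq:.1f} ppm")
    ```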

  8. On the validity and practical applicability of derivative analyticity relations

    International Nuclear Information System (INIS)

    Kolar, P.; Fischer, J.

    1983-09-01

    We examine derivative analyticity relations (DAR), which were originally proposed by Bronzan as an alternative to dispersion relations and in which the dispersion integral is replaced by a tangent series of derivatives. We characterize the class of functions satisfying DAR, and show that outside this class the dispersion integral represents a Borel-like sum of tangent series. We point out difficulties connected with the application of DAR. (author)

  9. Worst-case study for cleaning validation of equipment in the radiopharmaceutical production of lyophilized reagents: Methodology validation of total organic carbon

    International Nuclear Information System (INIS)

    Porto, Luciana Valeria Ferrari Machado

    2015-01-01

    ... (repeatability and intermediate precision) and accuracy (recovery); the optimized conditions were as follows: 4% acidifying reagent, 2.5 mL oxidizing reagent, 4.5 minutes integration curve time, 3 minutes sparge time, and linearity in the 40-1000 μg L⁻¹ range, with correlation coefficient (r) and coefficient of determination (r²) greater than 0.99. The DL and QL for NPOC were 14.25 ppb and 47.52 ppb, respectively; repeatability was between 0.11 and 4.47%, intermediate precision between 0.59 and 3.80%, and accuracy between 97.05 and 102.90%. The analytical curve for Mibi was linear in the 100-800 μg L⁻¹ range, with r and r² greater than 0.99, presenting parameters similar to those of the NPOC analytical curves. The results obtained in this study demonstrated that the worst-case approach to cleaning validation is a simple and effective way to reduce the complexity and slowness of the validation process, and provides a reduction of the costs involved in these activities. All results obtained in the NPOC method validation assays met the requirements and specifications recommended by the RE 899/2003 Resolution from ANVISA to consider the method validated. (author)

  10. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    Science.gov (United States)

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality-by-design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results, owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses the different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system, and relates it to product quality by design and pharmaceutical analytical technology (PAT).

  11. Local analytic geometry

    CERN Document Server

    Abhyankar, Shreeram Shankar

    1964-01-01

    This book provides, for use in a graduate course or for self-study by graduate students, a well-motivated treatment of several topics, especially the following: (1) algebraic treatment of several complex variables; (2) geometric approach to algebraic geometry via analytic sets; (3) survey of local algebra; (4) survey of sheaf theory. The book has been written in the spirit of Weierstrass. Power series play the dominant role. The treatment, being algebraic, is not restricted to complex numbers, but remains valid over any complete-valued field. This makes it applicable to situations arising from

  12. Improvement of the decision efficiency of the accuracy profile by means of a desirability function for analytical methods validation. Application to a diacetyl-monoxime colorimetric assay used for the determination of urea in transdermal iontophoretic extracts.

    Science.gov (United States)

    Rozet, E; Wascotte, V; Lecouturier, N; Préat, V; Dewé, W; Boulanger, B; Hubert, Ph

    2007-05-22

    Validation of analytical methods is a widely used and regulated step for each analytical method. However, the classical approaches used to demonstrate a method's ability to quantify do not necessarily fulfill this objective. For this reason an innovative methodology was recently introduced, based on the tolerance interval and the accuracy profile, which guarantees that a pre-defined proportion of future measurements obtained with the method will fall within the acceptance limits. The accuracy profile is an effective decision tool for assessing the validity of analytical methods, and the methodology for building such a profile is detailed here. However, as with any visual tool, it retains an element of subjectivity. It was therefore necessary to make the decision process objective, in order to quantify the degree of adequacy of an accuracy profile and to allow a thorough comparison between such profiles. To achieve this, we developed a global desirability index based on the three most important validation criteria: trueness, precision and range. The global index allows the classification of the different accuracy profiles obtained according to their respective response functions. A diacetyl-monoxime colorimetric assay for the determination of urea in transdermal iontophoretic extracts is used to illustrate these improvements.
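
    A global desirability index of this kind is typically a geometric mean of per-criterion desirabilities on [0, 1]; a minimal Derringer-style sketch (the mapping ranges and example values are assumptions, not the paper's):

    ```python
    import numpy as np

    def desirability(value, low, high, maximize=True):
        """Linear Derringer-type desirability mapping a criterion onto [0, 1]."""
        d = (value - low) / (high - low)
        if not maximize:
            d = 1.0 - d
        return float(np.clip(d, 0.0, 1.0))

    # Illustrative criterion values for one accuracy profile.
    d_trueness  = desirability(98.5, low=90.0, high=100.0)               # % recovery
    d_precision = desirability(3.0, low=0.0, high=15.0, maximize=False)  # RSD %
    d_range     = desirability(0.8, low=0.0, high=1.0)                   # fraction of target range

    global_index = (d_trueness * d_precision * d_range) ** (1 / 3)       # geometric mean
    print(f"global desirability index = {global_index:.2f}")
    ```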

  13. Analytical simulation of the cantilever-type energy harvester

    Directory of Open Access Journals (Sweden)

    Jie Mei

    2016-01-01

    This article describes an analytical model of a cantilever-type energy harvester based on Euler–Bernoulli beam theory. Starting from the Hamiltonian form of the total energy equation, the bending mode shapes and electromechanical dynamic equations are derived. By solving the constitutive electromechanical dynamic equation, the frequency transfer functions of the output voltage and power can be obtained. Through a case study of a unimorph piezoelectric energy harvester, this analytical modeling method has been validated against the finite element method.

  14. Determining passive cooling limits in CPV using an analytical thermal model

    Science.gov (United States)

    Gualdi, Federico; Arenas, Osvaldo; Vossier, Alexis; Dollet, Alain; Aimez, Vincent; Arès, Richard

    2013-09-01

    We propose an original thermal analytical model aiming to predict the practical limits of passive cooling systems for high concentration photovoltaic modules. The analytical model is described and validated by comparison with a commercial 3D finite element model. The limiting performances of flat plate cooling systems in natural convection are then derived and discussed.

  15. An interlaboratory transfer of a multi-analyte assay between continents.

    Science.gov (United States)

    Georgiou, Alexandra; Dong, Kelly; Hughes, Stephen; Barfield, Matthew

    2015-01-01

    Alex has worked at GlaxoSmithKline for the past 15 years and currently works within the bioanalytical and toxicokinetic group in the United Kingdom. Alex's role in previous years has been the in-house support of preclinical and clinical bioanalysis, from method development through to sample analysis activities, as well as acting as PI for GLP bioanalysis and toxicokinetics. For the past two years, Alex has applied this analytical and regulatory experience to the outsourcing of preclinical bioanalysis, toxicokinetics and clinical bioanalysis, working closely with multiple bioanalytical and in-life CRO partners worldwide. Alex supports DMPK and Safety Assessment outsourcing activities for GSK across multiple therapeutic areas, from the first GLP study through to late-stage clinical PK studies. Transfer and cross-validation of an existing analytical assay between a laboratory providing current analytical support and a laboratory needed for new or additional support can present the bioanalyst with numerous challenges. These challenges can be technical or logistical in nature and may prove to be significant when transferring an assay between laboratories on different continents. Part of GlaxoSmithKline's strategy to improve confidence in providing quality data is to cross-validate between laboratories. If the cross-validation fails predefined acceptance criteria, a subsequent investigation follows, which may also prove challenging. The importance of thorough planning and good communication throughout assay transfer, cross-validation and any subsequent investigations is illustrated in this case study.

  16. Method Validation Procedure in Gamma Spectroscopy Laboratory

    International Nuclear Information System (INIS)

    El Samad, O.; Baydoun, R.

    2008-01-01

    The present work describes the methodology followed for the application of the ISO 17025 standard in the gamma spectroscopy laboratory at the Lebanese Atomic Energy Commission, including the management and technical requirements. A set of documents, written procedures and records was prepared to satisfy the management requirements. For the technical requirements, internal method validation was carried out through the estimation of trueness, repeatability, minimum detectable activity and combined uncertainty, while participation in IAEA proficiency tests assures the external method validation, especially since the gamma spectroscopy laboratory is a member of the ALMERA network (Analytical Laboratories for the Measurement of Environmental Radioactivity). Some of these results are presented in this paper. (author)
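
    Minimum detectable activity is conventionally estimated with the Currie formula; a minimal sketch (the counting parameters below are invented for illustration):

    ```python
    import math

    def mda_bq(background_counts, efficiency, count_time_s, gamma_yield=1.0):
        """Currie-style minimum detectable activity in Bq:
        L_D = 2.71 + 4.65*sqrt(B) counts, divided by (efficiency * yield * time)."""
        l_d = 2.71 + 4.65 * math.sqrt(background_counts)
        return l_d / (efficiency * gamma_yield * count_time_s)

    print(f"MDA = {mda_bq(background_counts=400, efficiency=0.05, count_time_s=3600):.3f} Bq")
    ```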

  17. 100 Area excavation treatability study data validation report

    International Nuclear Information System (INIS)

    Frain, J.M.

    1994-01-01

    This report presents the results of sampling and chemical analyses at the Hanford Reservation. The samples were analyzed by Thermo-Analytic Laboratories and Roy F. Weston Laboratories using US Environmental Protection Agency CLP protocols. Sample analyses included: volatile organics; semivolatile organics; inorganics; and general chemical parameters. The data from the chemical analyses were reviewed and validated to verify that the reported sample results were of sufficient quality to support decisions regarding remedial actions performed at this site.

  18. Analytic Validation of Immunohistochemistry Assays: New Benchmark Data From a Survey of 1085 Laboratories.

    Science.gov (United States)

    Stuart, Lauren N; Volmar, Keith E; Nowak, Jan A; Fatheree, Lisa A; Souers, Rhona J; Fitzgibbons, Patrick L; Goldsmith, Jeffrey D; Astles, J Rex; Nakhleh, Raouf E

    2017-09-01

    A cooperative agreement between the College of American Pathologists (CAP) and the United States Centers for Disease Control and Prevention was undertaken to measure laboratories' awareness and implementation of an evidence-based laboratory practice guideline (LPG) on immunohistochemical (IHC) validation practices published in 2014. The objective was to establish new benchmark data on IHC laboratory practices. A 2015 survey on IHC assay validation practices was sent to laboratories subscribed to specific CAP proficiency testing programs and to additional nonsubscribing laboratories that perform IHC testing. Specific questions were designed to capture laboratory practices not addressed in a 2010 survey. The analysis was based on responses from 1085 laboratories that perform IHC staining. Ninety-six percent (809 of 844) always documented validation of IHC assays. Sixty percent (648 of 1078) had separate procedures for predictive and nonpredictive markers, 42.7% (220 of 515) had procedures for laboratory-developed tests, 50% (349 of 697) had procedures for testing cytologic specimens, and 46.2% (363 of 785) had procedures for testing decalcified specimens. Minimum case numbers were specified by 85.9% (720 of 838) of laboratories for nonpredictive markers and 76% (584 of 768) for predictive markers. Median concordance requirements were 95% for both types. For initial validation, 75.4% (538 of 714) of laboratories adopted the 20-case minimum for nonpredictive markers and 45.9% (266 of 579) adopted the 40-case minimum for predictive markers as outlined in the 2014 LPG. The most common method for validation was correlation with morphology and expected results. Laboratories also reported which assay changes necessitated revalidation and their minimum case requirements. These benchmark data on current IHC validation practices and procedures may help laboratories understand the issues and influence further refinement of LPG recommendations.

  19. Nonlinear heat conduction equations with memory: Physical meaning and analytical results

    Science.gov (United States)

    Artale Harris, Pietro; Garra, Roberto

    2017-06-01

    We study nonlinear heat conduction equations with memory effects within the framework of the fractional calculus approach to the generalized Maxwell-Cattaneo law. Our main aim is to derive the governing equations of heat propagation, considering both the empirical temperature dependence of the thermal conductivity coefficient (which introduces nonlinearity) and memory effects, according to the general theory of Gurtin and Pipkin of finite-velocity thermal propagation with memory. In this framework, we consider in detail two different approaches to the generalized Maxwell-Cattaneo law, based on the application of long-tail Mittag-Leffler memory functions and power-law relaxation functions, leading to nonlinear time-fractional telegraph- and wave-type equations. We also discuss some explicit analytical results for the model equations, based on the generalized separation-of-variables method, and discuss their meaning in relation to some well-known results of the ordinary case.
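
    In the Gurtin-Pipkin framework referenced above, the heat flux depends on the history of the temperature gradient through a relaxation kernel (standard form; the Mittag-Leffler kernel is one of the two cases the paper considers):

    ```latex
    \mathbf{q}(x,t) = -\int_{0}^{t} K(t-\tau)\, \nabla T(x,\tau)\, \mathrm{d}\tau,
    \qquad
    K(t) \propto E_{\alpha}\!\left(-\left(t/\tau_0\right)^{\alpha}\right)
    ```

    where E_α denotes the Mittag-Leffler function and τ_0 a relaxation time; substituting this flux into the energy balance is what yields time-fractional telegraph-type equations of the kind studied here.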

  20. Detailed validation in PCDDF analysis. ISO17025 data from Brazil

    Energy Technology Data Exchange (ETDEWEB)

    Kernick Carvalhaes, G.; Azevedo, J.A.; Azevedo, G.; Machado, M.; Brooks, P. [Analytical Solutions, Rio de Janeiro (Brazil)

    2004-09-15

    In defining method validation we can refer to ISO standard 8402: 'validation' is the 'confirmation by the examination and supplying of objective evidence that the particular requirements for a specific intended use are fulfilled'. This concept is extremely important for guaranteeing the quality of results. Method validation is based on the combined use of different validation procedures, but in this selection we have to analyze the cost-benefit conditions. We must focus on the critical elements, and these critical factors must be the essential elements for providing good properties and results. If we have a solid validation methodology and a study of the sources of uncertainty of our analytical method, we can generate results with confidence and veracity. On examining these two considerations, method validation and uncertainty calculations, we found that there are very few articles and papers on these subjects, and it is even more difficult to find such materials on dioxins and furans. This short paper describes a validation and uncertainty calculation methodology using traditional studies with a few adaptations, and it presents a new idea: the recovery study as a source of uncertainty.

  1. A three-dimensional (3D) analytical model for subthreshold characteristics of uniformly doped FinFET

    Science.gov (United States)

    Tripathi, Shweta; Narendar, Vadthiya

    2015-07-01

    In this paper, a three-dimensional (3D) analytical model for the subthreshold characteristics of a doped FinFET is presented. The separation of variables technique is used to solve the 3D Poisson's equation analytically with appropriate boundary conditions, so as to obtain an expression for the channel potential. The potential distribution function thus obtained has been employed in deriving the subthreshold current and subthreshold slope models. The channel potential characteristics have been studied as a function of various device parameters such as gate length, gate oxide thickness and channel doping. The proposed analytical model results have been validated by comparison with simulation data obtained using the 3D device simulator ATLAS™ from Silvaco.

  2. An accurate analytic description of neutrino oscillations in matter

    Science.gov (United States)

    Akhmedov, E. Kh.; Niro, Viviana

    2008-12-01

    A simple closed-form analytic expression for the probability of two-flavour neutrino oscillations in matter with an arbitrary density profile is derived. Our formula is based on a perturbative expansion and allows an easy calculation of higher order corrections. The expansion parameter is small when the density changes relatively slowly along the neutrino path and/or the neutrino energy is not very close to the Mikheyev-Smirnov-Wolfenstein (MSW) resonance energy. Our approximation is not equivalent to the adiabatic approximation and actually goes beyond it. We demonstrate the validity of our results using a few model density profiles, including the PREM density profile of the Earth. It is shown that by combining the results obtained from the expansions valid below and above the MSW resonance, one can obtain a very good description of neutrino oscillations in matter over the entire energy range, including the resonance region.
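
    For reference, the constant-density two-flavour baseline that such expressions generalize is the standard oscillation probability (textbook notation, not the paper's formula):

    ```latex
    P_{\nu_e \to \nu_\mu}(L)
      = \sin^2 2\theta_m \,\sin^2\!\left(\frac{\Delta m_m^2\, L}{4E}\right)
    ```

    where the mixing angle and mass-squared splitting take their matter-modified (MSW) values θ_m and Δm²_m; the perturbative expansion described above handles profiles in which these quantities vary along the neutrino path.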

  3. A Table Lookup Method for Exact Analytical Solutions of Nonlinear Fractional Partial Differential Equations

    Directory of Open Access Journals (Sweden)

    Ji Juan-Juan

    2017-01-01

    A table lookup method for solving nonlinear fractional partial differential equations (fPDEs) is proposed in this paper. By looking up the corresponding tables, we can quickly obtain the exact analytical solutions of fPDEs using this method. To illustrate the validity of the method, we apply it to construct the exact analytical solutions of four nonlinear fPDEs, namely, the time fractional simplified MCH equation, the space-time fractional combined KdV-mKdV equation, the (2+1)-dimensional time fractional Zoomeron equation, and the space-time fractional ZKBBM equation. As a result, many new types of exact analytical solutions are obtained, including triangular periodic solutions, hyperbolic function solutions, singular solutions, multiple solitary wave solutions, and Jacobi elliptic function solutions.

  4. Semi-analytical Model for Estimating Absorption Coefficients of Optically Active Constituents in Coastal Waters

    Science.gov (United States)

    Wang, D.; Cui, Y.

    2015-12-01

    The objectives of this paper are to validate the applicability of a multi-band quasi-analytical algorithm (QAA) for retrieving the absorption coefficients of optically active constituents in turbid coastal waters, and to further improve the retrieval using a proposed semi-analytical model (SAA). The SAA derives ap(531) and ag(531) semi-analytically, in contrast to the QAA procedure, in which ap(531) and ag(531) are derived from the empirical retrieval results for a(531) and a(551). The two models are calibrated and evaluated against datasets taken from 19 independent cruises on the West Florida Shelf in 1999-2003, provided by SeaBASS. The results indicate that the SAA model outperforms the QAA model in absorption retrieval: using the SAA model to retrieve the absorption coefficients of optically active constituents on the West Florida Shelf decreases the random uncertainty of estimation by >23.05% relative to the QAA model. This study demonstrates the potential of the SAA model for estimating the absorption coefficients of optically active constituents even in turbid coastal waters. Keywords: Remote sensing; Coastal Water; Absorption Coefficient; Semi-analytical Model

  5. Triangular dislocation: an analytical, artefact-free solution

    Science.gov (United States)

    Nikkhoo, Mehdi; Walter, Thomas R.

    2015-05-01

    Displacements and stress-field changes associated with earthquakes, volcanoes, landslides and human activity are often simulated using numerical models in an attempt to understand the underlying processes and their governing physics. The application of elastic dislocation theory to these problems, however, may be biased because of numerical instabilities in the calculations. Here, we present a new method that is free of artefact singularities and numerical instabilities in analytical solutions for triangular dislocations (TDs) in both full-space and half-space. We apply the method to both the displacement and the stress fields. The entire 3-D Euclidean space ℝ³ is divided into two complementary subspaces, in the sense that in each one, a particular analytical formulation fulfils the requirements for the ideal, artefact-free solution for a TD. The primary advantage of the presented method is that the development of our solutions involves neither numerical approximations nor series expansion methods. As a result, the final outputs are independent of the scale of the input parameters, including the size and position of the dislocation as well as its corresponding slip vector components. Our solutions are therefore well suited for application at various scales in geoscience, physics and engineering. We validate the solutions through comparison to other well-known analytical methods and provide the MATLAB codes.

  6. Validation of cell voltage and water content in a PEM (polymer electrolyte membrane) fuel cell model using neutron imaging for different operating conditions

    International Nuclear Information System (INIS)

    Salva, J. Antonio; Iranzo, Alfredo; Rosa, Felipe; Tapia, Elvira

    2016-01-01

    This work presents a one dimensional analytical model developed for a 50 cm² PEM (polymer electrolyte membrane) fuel cell with five-channel serpentine flow field. The different coupled physical phenomena such as electrochemistry, mass transfer of hydrogen, oxygen and water (two phases) together with heat transfer have been solved simultaneously. The innovation of this work is that the model has been validated with two different variables simultaneously and quantitatively in order to ensure the accuracy of the results. The selected variables are the cell voltage and the water content within the membrane MEA (Membrane Electrode Assembly) and GDL (gas diffusion layers) experimentally measured by means of neutron radiography. The results show a good agreement for a comprehensive set of different operating conditions of cell temperature, pressure, reactants relative humidity and cathode stoichiometry. The analytical model has a relative error less than 3.5% for the value of the cell voltage and the water content within the GDL + MEA for all experiments performed. This result presents a new standard of validation in the state of the art of PEM fuel cell modeling, where two variables are simultaneously and quantitatively validated with experimental results. The developed analytical model has been used in order to analyze the behavior of the PEM fuel cell under different values of relative humidity. - Highlights: • One dimensional analytical model has been developed for a PEM fuel cell. • The model is validated with two different variables simultaneously. • New standard of validation is proposed.

  7. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    Science.gov (United States)

    2015-01-01

    Background: Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results: In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions: Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893

  8. Quasi-normal frequencies: Semi-analytic results for highly damped modes

    International Nuclear Information System (INIS)

    Skakala, Jozef; Visser, Matt

    2011-01-01

    Black hole highly-damped quasi-normal frequencies (QNFs) are very often of the form ω_n = (offset) + i n (gap). We have investigated the genericity of this phenomenon for the Schwarzschild-de Sitter (SdS) black hole by considering a model potential that is piecewise Eckart (piecewise Pöschl-Teller), and developing an analytic 'quantization condition' for the highly-damped quasi-normal frequencies. We find that the ω_n = (offset) + i n (gap) behaviour is common but not universal, with the controlling feature being whether or not the ratio of the surface gravities is a rational number. We furthermore observed that the relation between rational ratios of surface gravities and periodicity of QNFs is very generic, and also occurs within different analytic approaches applied to various types of black hole spacetimes. These observations are of direct relevance to any physical situation where highly-damped quasi-normal modes are important.

  9. Temperature based validation of the analytical model for the estimation of the amount of heat generated during friction stir welding

    Directory of Open Access Journals (Sweden)

    Milčić Dragan S.

    2012-01-01

    Friction stir welding is a solid-state welding technique that utilizes the thermomechanical influence of a rotating welding tool on the parent material, resulting in a monolithic joint (weld). At the contact between the welding tool and the parent material, significant stirring and deformation of the parent material occurs, and during this process mechanical energy is partially transformed into heat. The generated heat affects the temperature of the welding tool and the parent material, so the proposed analytical model for estimating the amount of generated heat can be verified via temperature: the analytically determined heat is used for a numerical estimation of the temperature of the parent material, and this temperature is compared with the experimentally determined temperature. The numerical solution is obtained using the finite difference method (an explicit scheme with adaptive grid), considering the influence of temperature on the material's conductivity, the contact conditions between the welding tool and the parent material, the material flow around the welding tool, etc. The analytical model shows that 60-100% of the mechanical power delivered to the welding tool is transformed into heat, while the comparison of results shows a maximal relative difference between the analytical and experimental temperatures of about 10%.
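
    To make the numerical side concrete, a minimal explicit finite-difference update for 1-D heat conduction is sketched below (uniform grid and invented parameter values; the scheme described above is adaptive and temperature-dependent, which this sketch does not reproduce):

    ```python
    import numpy as np

    alpha = 1e-5                 # thermal diffusivity, m^2/s (assumed constant here)
    dx = 1e-3                    # grid spacing, m
    dt = 0.4 * dx**2 / alpha     # time step respecting the stability limit dt <= dx^2/(2*alpha)

    T = np.full(101, 25.0)       # initial temperature field, deg C
    T[50] = 500.0                # hypothetical heat input under the tool

    r = alpha * dt / dx**2       # grid Fourier number of the scheme (here 0.4)
    for _ in range(100):         # march 100 explicit steps
        T[1:-1] += r * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    print(T[45:56].round(1))
    ```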

  10. ExEP yield modeling tool and validation test results

    Science.gov (United States)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul

    2017-09-01

    EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Program's Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with the physics-based tests such as photometry and integration time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4 m unobscured telescope with an idealized coronagraph and an exoplanet population from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters, such as working angle, photon counts and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required interpretation by a user, the test revealed problems in the L2 halo orbit and in the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers of EXOSIMS and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class, to WFIRST, up to large mission concepts such as HabEx and LUVOIR.
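
    The unit-test layer of such a plan can be pictured with a short sketch. The function below is a hypothetical stand-in, not the actual EXOSIMS API; the pattern shown is the deterministic, hand-checkable physics test described above:

```python
import unittest

def integration_time(c_planet, c_background, snr):
    """Toy photon-counting estimate of exposure time to reach a target SNR.

    SNR = c_p*t / sqrt((c_p + c_b)*t)  =>  t = snr**2 * (c_p + c_b) / c_p**2.
    A hypothetical stand-in for the far more detailed EXOSIMS calculation.
    """
    return snr**2 * (c_planet + c_background) / c_planet**2

class PhotometryTests(unittest.TestCase):
    def test_integration_time_matches_hand_calculation(self):
        # Deterministic check, analogous to validating known RV planets at quadrature.
        t = integration_time(c_planet=10.0, c_background=40.0, snr=5.0)
        self.assertAlmostEqual(t, 12.5, places=9)      # 25 * 50 / 100

    def test_more_background_requires_more_time(self):
        self.assertGreater(integration_time(10.0, 100.0, 5.0),
                           integration_time(10.0, 10.0, 5.0))

if __name__ == "__main__":
    unittest.main()
```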

  11. Analytical Modelling and Simulation of Photovoltaic Panels and Arrays

    Directory of Open Access Journals (Sweden)

    H. Bourdoucen

    2007-12-01

    In this paper, an analytical model for PV panels and arrays based on extracted physical parameters of solar cells is developed. The proposed model has the advantage of simplifying the mathematical modelling of different configurations of cells and panels without losing efficiency of PV system operation. The effects of external parameters, mainly temperature and solar irradiance, have been considered in the modelling. Due to their critical effects on the operation of the panel, the effects of series and shunt resistances were also studied. The developed analytical model has been easily implemented, simulated and validated using both Spice and Matlab packages for different series and parallel configurations of cells and panels. The results obtained with these two programs are in total agreement, which makes the proposed model very useful for researchers and designers for quick and accurate sizing of PV panels and arrays.
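
    As a rough illustration of this kind of physics-based panel model (not the authors' exact formulation), a single-diode cell equation scaled to a series string can be solved by damped fixed-point iteration; all parameter values below are generic placeholders:

```python
import numpy as np

# Single-diode PV sketch: I = Iph - I0*(exp((V + I*Rs)/(n*Ns*Vt)) - 1) - (V + I*Rs)/Rsh
q, k_B = 1.602e-19, 1.381e-23

def panel_current(V, G=1000.0, T=298.15, Ns=36, Iph_stc=5.0, I0=1e-7,
                  n=1.3, Rs=0.2, Rsh=300.0, iters=60):
    """Solve the implicit I-V relation by damped fixed-point iteration."""
    Vt = k_B * T / q
    Iph = Iph_stc * G / 1000.0                  # photocurrent scales with irradiance
    I = np.zeros_like(V)
    for _ in range(iters):
        I_new = Iph - I0 * (np.exp((V + I * Rs) / (n * Ns * Vt)) - 1.0) - (V + I * Rs) / Rsh
        I = 0.5 * (I + I_new)                   # damping keeps the update stable
    return I

V = np.linspace(0.0, 21.0, 200)
I = panel_current(V)
P = V * I
print(f"Isc ~ {I[0]:.2f} A, Pmax ~ {P.max():.1f} W at {V[np.argmax(P)]:.1f} V")
```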

  12. Automatic-heuristic and executive-analytic processing during reasoning: Chronometric and dual-task considerations.

    Science.gov (United States)

    De Neys, Wim

    2006-06-01

    Human reasoning has been shown to overly rely on intuitive, heuristic processing instead of a more demanding analytic inference process. Four experiments tested the central claim of current dual-process theories that analytic operations involve time-consuming executive processing whereas the heuristic system would operate automatically. Participants solved conjunction fallacy problems and indicative and deontic selection tasks. Experiment 1 established that making correct analytic inferences demanded more processing time than did making heuristic inferences. Experiment 2 showed that burdening the executive resources with an attention-demanding secondary task decreased correct, analytic responding and boosted the rate of conjunction fallacies and indicative matching card selections. Results were replicated in Experiments 3 and 4 with a different secondary-task procedure. Involvement of executive resources for the deontic selection task was less clear. Findings validate basic processing assumptions of the dual-process framework and complete the correlational research programme of K. E. Stanovich and R. F. West (2000).

  13. Validating and determining the weight of items used for evaluating clinical governance implementation based on analytic hierarchy process model.

    Science.gov (United States)

    Hooshmand, Elaheh; Tourani, Sogand; Ravaghi, Hamid; Vafaee Najar, Ali; Meraji, Marziye; Ebrahimipour, Hossein

    2015-04-08

    The purpose of implementing a system such as Clinical Governance (CG) is to integrate, establish and globalize distinct policies in order to improve quality through increasing the professional knowledge and accountability of healthcare professionals toward providing clinical excellence. Since CG is related to change, and change requires money and time, CG implementation has to be focused on priority areas that are in the most dire need of change. The purpose of the present study was to validate and determine the significance of items used for evaluating CG implementation. The present study was descriptive-quantitative in method and design. Items used for evaluating CG implementation were first validated by the Delphi method and then compared with one another and ranked based on the Analytic Hierarchy Process (AHP) model. The items that were validated for evaluating CG implementation in Iran include performance evaluation, training and development, personnel motivation, clinical audit, clinical effectiveness, risk management, resource allocation, policies and strategies, external audit, information system management, research and development, CG structure, implementation prerequisites, the management of patients' non-medical needs, complaints and patients' participation in the treatment process. The most important items based on their degree of significance were training and development, performance evaluation, and risk management. The least important items included the management of patients' non-medical needs, patients' participation in the treatment process and research and development. The fundamental requirements of CG implementation included having an effective policy at the national level, avoiding perfectionism, using the expertise and potential of the entire country and the coordination of this model with other models of quality improvement such as accreditation and patient safety. © 2015 by Kerman University of Medical Sciences.
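
    The AHP ranking step in such a study reduces to an eigenvector computation on a reciprocal pairwise-comparison matrix, sketched below for a hypothetical three-item example (the judgement values are illustrative, not the study's data):

```python
import numpy as np

# AHP sketch: priority weights from a reciprocal pairwise-comparison matrix
# (hypothetical judgements for three items on Saaty's 1-9 scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenpair
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                # normalized priority weights

n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)        # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index
print("weights:", np.round(w, 3), " CR =", round(CI / RI, 3))  # CR < 0.1 acceptable
```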

  14. Final Report on the Analytical Results for Tank Farm Samples in Support of Salt Dissolution Evaluation

    International Nuclear Information System (INIS)

    Hobbs, D.T.

    1996-01-01

    Recent processing of dilute solutions through the 2H-Evaporator system caused dissolution of salt in Tank 38H, the concentrate receipt tank. This report documents the analytical results for samples taken from this evaporator system.

  15. Translating tumor biology into personalized treatment planning: analytical performance characteristics of the Oncotype DX® Colon Cancer Assay

    Directory of Open Access Journals (Sweden)

    Krishnakumar Jayadevi

    2010-12-01

    Background The Oncotype DX® Colon Cancer Assay is a new diagnostic test for determining the likelihood of recurrence in stage II colon cancer patients after surgical resection using fixed paraffin embedded (FPE) primary colon tumor tissue. Like the Oncotype DX Breast Cancer Assay, this is a high complexity, multi-analyte, reverse transcription (RT) polymerase chain reaction (PCR) assay that measures the expression levels of specific cancer-related genes. By capturing the biology underlying each patient's tumor, the Oncotype DX Colon Cancer Assay provides a Recurrence Score (RS) that reflects an individualized risk of disease recurrence. Here we describe its analytical performance using pre-determined performance criteria, which is a critical component of molecular diagnostic test validation. Results All analytical measurements met pre-specified performance criteria. PCR amplification efficiency for all 12 assays was high, ranging from 96% to 107%, while linearity was demonstrated over an 11 log2 concentration range for all assays. Based on estimated components of variance for FPE RNA pools, analytical reproducibility and precision demonstrated low SDs for individual genes (0.16 to 0.32 CTs), gene groups (≤0.05 normalized/aggregate CTs) and RS (≤1.38 RS units). Conclusions Analytical performance characteristics shown here for both individual genes and gene groups in the Oncotype DX Colon Cancer Assay demonstrate consistent translation of the specific biology of individual tumors into clinically useful diagnostic information. The results of these studies illustrate how the analytical capability of the Oncotype DX Colon Cancer Assay has enabled clinical validation of a test to determine individualized recurrence risk after colon cancer surgery.

  16. Analytical Lie-algebraic solution of a 3D sound propagation problem in the ocean

    Energy Technology Data Exchange (ETDEWEB)

    Petrov, P.S., E-mail: petrov@poi.dvo.ru [Il'ichev Pacific Oceanological Institute, 43 Baltiyskaya str., Vladivostok, 690041 (Russian Federation); Prants, S.V., E-mail: prants@poi.dvo.ru [Il'ichev Pacific Oceanological Institute, 43 Baltiyskaya str., Vladivostok, 690041 (Russian Federation); Petrova, T.N., E-mail: petrova.tn@dvfu.ru [Far Eastern Federal University, 8 Sukhanova str., 690950, Vladivostok (Russian Federation)

    2017-06-21

    The problem of sound propagation in a shallow sea with variable bottom slope is considered. The sound pressure field produced by a time-harmonic point source in such an inhomogeneous 3D waveguide is expressed in the form of a modal expansion. The expansion coefficients are computed using the adiabatic mode parabolic equation theory. The mode parabolic equations are solved explicitly, and the analytical expressions for the modal coefficients are obtained using a Lie-algebraic technique. - Highlights: • A group-theoretical approach is applied to a problem of sound propagation in a shallow sea with variable bottom slope. • An analytical solution of this problem is obtained in the form of a modal expansion with analytical expressions for the coefficients. • Our result is the only analytical solution of the 3D sound propagation problem with no translational invariance. • This solution can be used for the validation of numerical propagation models.

  17. Thermodynamics of atomic and ionized hydrogen: analytical results versus equation-of-state tables and Monte Carlo data.

    Science.gov (United States)

    Alastuey, A; Ballenegger, V

    2012-12-01

    We compute thermodynamical properties of a low-density hydrogen gas within the physical picture, in which the system is described as a quantum electron-proton plasma interacting via the Coulomb potential. Our calculations are done using the exact scaled low-temperature (SLT) expansion, which provides a rigorous extension of the well-known virial expansion, valid in the fully ionized phase, into the Saha regime where the system is partially or fully recombined into hydrogen atoms. After recalling the SLT expansion of the pressure [A. Alastuey et al., J. Stat. Phys. 130, 1119 (2008)], we obtain the SLT expansions of the chemical potential and of the internal energy, up to order exp(-|E_{H}|/kT) included (E_{H}≃-13.6 eV). Those truncated expansions describe the first five nonideal corrections to the ideal Saha law. They account exactly, up to the considered order, for all effects of interactions and thermal excitations, including the formation of bound states (atom H, ions H^{-} and H_{2}^{+}, molecule H_{2},⋯) and atom-charge and atom-atom interactions. Among the five leading corrections, three are easy to evaluate, while the remaining ones involve well-defined internal partition functions for the molecule H_{2} and ions H^{-} and H_{2}^{+}, for which no closed-form analytical formulas currently exist. We provide accurate low-temperature approximations for those partition functions by using known values of rotational and vibrational energies. We then compare the predictions of the SLT expansion, for the pressure and the internal energy, with, on the one hand, the equation-of-state tables obtained within the opacity program at Livermore (OPAL) and, on the other hand, data from path integral quantum Monte Carlo (PIMC) simulations. In general, a good agreement is found. At low densities, the simple analytical SLT formulas reproduce the values of the OPAL tables up to the last digit in a large range of temperatures, while at higher densities (ρ∼10^{-2} g/cm^{3}), some

  18. Tank 241-BY-109, cores 201 and 203, analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final laboratory report for tank 241-BY-109 push mode core segments collected between June 6, 1997 and June 17, 1997. The segments were subsampled and analyzed in accordance with the Tank Push Mode Core Sampling and Analysis Plan (Bell, 1997) and the Tank Safety Screening Data Quality Objective (Dukelow et al., 1995). The analytical results are included.

  19. Role of maintenance of analytical instruments in the proceedings of quality control laboratory

    International Nuclear Information System (INIS)

    Haribabu, A.; Sailoo, C.C.; Balaji Rao, Y.; Subba Rao, Y.

    2015-01-01

    Control Laboratory, the centralized analytical facility of the Nuclear Fuel Complex (NFC), is engaged in the chemical qualification of all nuclear materials processed or produced at NFC. The primary responsibility of the Control Laboratory is to provide timely analytical results for raw materials, intermediates and final products to all the production plants of NFC for downstream processing. An annual analytical load of nearly five lakh (500,000) estimations is carried out at the laboratory. For this purpose, a gamut of analytical facilities, ranging from classical methods like gravimetry and volumetry to fully automated state-of-the-art analytical instruments like ICP-AES, gas analysers, flame and graphite furnace AAS, Direct Reading Emission Spectrometer (DRES), RF GD-OES, TIMS, WD-XRFS, ED-XRFS, laser-based PSD analyser, laser fluorimeter, UV-Vis spectrophotometer, gamma ray spectrometer, ion chromatography and gas chromatography, is used to acquire analytical data to assess the suitability of products for their intended use. Depending on the application, analysts validate their procedures, calibrate their instruments, and perform additional instrument checks, such as system suitability tests and analysis of in-process quality control check samples. With the increasing sophistication and automation of analytical instruments, an increasing demand has been placed on maintenance engineers to qualify these instruments for the purpose.

  20. Verification and validation of the THYTAN code for the graphite oxidation analysis in the HTGR systems

    International Nuclear Information System (INIS)

    Shimazaki, Yosuke; Isaka, Kazuyoshi; Nomoto, Yasunobu; Seki, Tomokazu; Ohashi, Hirofumi

    2014-12-01

    The analytical models for the evaluation of graphite oxidation were implemented into the THYTAN code, which employs a mass balance and a node-link computational scheme to evaluate tritium behavior in High Temperature Gas-cooled Reactor (HTGR) systems for hydrogen production, in order to analyze graphite oxidation during air or water ingress accidents in HTGR systems. This report describes the analytical models of the THYTAN code in terms of the graphite oxidation analysis and its verification and validation (V and V) results. Mass transfer from the gas mixture in the coolant channel to the graphite surface, diffusion in the graphite, graphite oxidation by air or water, chemical reaction, and release from the primary circuit to the containment vessel by a safety valve were modeled to calculate the mass balance in the graphite and in the gas mixture in the coolant channel. The computed solutions using the THYTAN code for simple questions were compared to analytical results from a hand calculation to verify the algorithms for each implemented model. A representative graphite oxidation experiment was analyzed using the THYTAN code, and the results were compared to the experimental data and to computed solutions using the GRACE code, which was used for the safety analysis of the High Temperature Engineering Test Reactor (HTTR), with regard to the corrosion depth of the graphite and the oxygen concentration at the outlet of the test section, to validate the analytical models of the THYTAN code. The comparison of the THYTAN code results with the analytical solutions, experimental data and GRACE code results showed good agreement. (author)

  1. Potential of accuracy profile for method validation in inductively coupled plasma spectrochemistry

    International Nuclear Information System (INIS)

    Mermet, J.M.; Granier, G.

    2012-01-01

    Method validation is usually performed over a range of concentrations for which analytical criteria must be verified. One important criterion in quantitative analysis is accuracy, i.e. the contribution of both trueness and precision. The study of accuracy over this range is called an accuracy profile and provides experimental tolerance intervals. Comparison with acceptability limits fixed by the end user defines a validity domain. This work describes the computation involved in the building of the tolerance intervals, particularly for intermediate precision with within-laboratory experiments and for reproducibility with interlaboratory studies. The computation is based on ISO 5725-4 and on previously published work. Moreover, the bias uncertainty is also computed to verify the bias contribution to accuracy. The various types of accuracy profile behavior are exemplified with results obtained using ICP-MS and ICP-AES. This procedure allows the analyst to define unambiguously a validity domain for a given accuracy. However, because the experiments are time-consuming, the accuracy profile method is mainly dedicated to method validation. - Highlights: ► An analytical method is defined by its accuracy, i.e. both trueness and precision. ► The accuracy as a function of analyte concentration is an accuracy profile. ► Profile basic concepts are explained for trueness and intermediate precision. ► Profile-based tolerance intervals have to be compared with acceptability limits. ► Typical accuracy profiles are given for both ICP-AES and ICP-MS techniques.
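
    A tolerance interval of the kind used in accuracy profiles can be sketched as follows; the recovery data are invented, and the β-expectation formulas follow a simplified SFSTP-style two-way layout (p series × n replicates), so this is an illustration rather than a validated implementation:

```python
import numpy as np
from scipy import stats

# Accuracy-profile sketch: 90% beta-expectation tolerance interval at one
# concentration level from p series x n replicates of recovery data (%).
recov = np.array([[98.9, 100.6, 99.8],      # series 1 (invented data)
                  [101.2, 100.1, 102.0],    # series 2
                  [99.5, 98.7, 100.3]])     # series 3
p, n = recov.shape
ms_between = n * recov.mean(axis=1).var(ddof=1)    # mean square between series
ms_within = recov.var(axis=1, ddof=1).mean()       # mean square within series
s2_b = max((ms_between - ms_within) / n, 0.0)      # between-series variance component
s2_ip = s2_b + ms_within                           # intermediate-precision variance

R = s2_b / ms_within
B2 = (R + 1.0) / (n * R + 1.0)
df = (R + 1.0)**2 / ((R + 1.0 / n)**2 / (p - 1) + (1.0 - 1.0 / n) / (p * n))
k = stats.t.ppf(0.95, df) * np.sqrt(1.0 + 1.0 / (p * n * B2))
lo, hi = recov.mean() - k * np.sqrt(s2_ip), recov.mean() + k * np.sqrt(s2_ip)
print(f"mean recovery {recov.mean():.1f}%, tolerance interval [{lo:.1f}, {hi:.1f}]%")
print("inside +/-10% acceptability limits:", lo >= 90.0 and hi <= 110.0)
```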

  2. A Review of Level 2 Parent-Report Instruments Used to Screen Children Aged 1.5-5 for Autism: A Meta-Analytic Update

    Science.gov (United States)

    Hampton, Justin; Strand, Paul S.

    2015-01-01

    The present study utilized meta-analytic procedures to estimate the diagnostic validity of instruments used to screen young children, ages 1.5-5 years, for autism. Five scales met inclusion criteria, and data from 18 studies contributed to the meta-analysis. Results revealed that 4 of 5 scales met criteria for "good" validity, including two…

  3. Analytical and numerical study of validation test-cases for multi-physic problems: application to magneto-hydro-dynamic

    Directory of Open Access Journals (Sweden)

    D Cébron

    2016-04-01

    The present paper is concerned with the numerical simulation of magneto-hydro-dynamic (MHD) problems with industrial tools. MHD received attention some twenty to thirty years ago as a possible alternative in propulsion applications; MHD-propelled ships have even been designed for that purpose. However, such propulsion systems proved to be of low efficiency, and fundamental research in the area has progressively received much less attention over the past decades. Numerical simulation of MHD problems could, however, provide interesting solutions in the field of turbulent flow control. The development of recent efficient numerical techniques for multi-physic applications provides a promising tool for the engineer for that purpose. In the present paper, some elementary test cases in laminar flow with magnetic forcing terms are analysed; the equations of the coupled problem are exposed, analytical solutions are derived in each case, and these are compared to numerical solutions obtained with a numerical tool for multi-physic applications. The present work can be seen as a validation of numerical tools (based on the finite element method) for academic as well as industrial application purposes.

  4. Validation results of satellite mock-up capturing experiment using nets

    Science.gov (United States)

    Medina, Alberto; Cercós, Lorenzo; Stefanescu, Raluca M.; Benvenuto, Riccardo; Pesce, Vincenzo; Marcon, Marco; Lavagna, Michèle; González, Iván; Rodríguez López, Nuria; Wormnes, Kjetil

    2017-05-01

    The PATENDER activity (Net parametric characterization and parabolic flight), funded by the European Space Agency (ESA) via its Clean Space initiative, aimed to validate a simulation tool for designing nets for capturing space debris. This validation was performed through a set of experiments under microgravity conditions in which a net was launched, capturing and wrapping a satellite mock-up. This paper presents the architecture of the thrown-net dynamics simulator, together with the set-up of the deployment experiment and its trajectory reconstruction results on a parabolic flight (Novespace A-310, June 2015). The simulator has been implemented within the Blender framework in order to provide a highly configurable tool, able to reproduce different scenarios for Active Debris Removal missions. The experiment was performed over thirty parabolas offering around 22 s of zero-g conditions. A flexible meshed fabric structure (the net), ejected from a container and propelled by corner masses (the bullets) arranged around its circumference, was launched at different initial velocities and launching angles using a pneumatic-based dedicated mechanism (representing the chaser satellite) against a target mock-up (the target satellite). High-speed motion cameras recorded the experiment, allowing 3D reconstruction of the net motion. The net knots were coloured to allow post-processing of the images using colour segmentation, stereo matching and iterative closest point (ICP) knot tracking. The final objective of the activity was the validation of the net deployment and wrapping simulator using images recorded during the parabolic flight. The high-resolution images acquired were post-processed to accurately determine the initial conditions and to generate the reference data (position and velocity of all knots of the net along its deployment and wrapping of the target mock-up) for the simulator validation. The simulator has been properly

  5. Quantitative Comparison of Ternary Eutectic Phase-Field Simulations with Analytical 3D Jackson-Hunt Approaches

    Science.gov (United States)

    Steinmetz, Philipp; Kellner, Michael; Hötzer, Johannes; Nestler, Britta

    2018-02-01

    For the analytical description of the relationship between undercoolings, lamellar spacings and growth velocities during the directional solidification of ternary eutectics in 2D and 3D, different extensions based on the theory of Jackson and Hunt are reported in the literature. Besides analytical approaches, the phase-field method has been established to study the spatially complex microstructure evolution during the solidification of eutectic alloys. The understanding of the fundamental mechanisms controlling the morphology development in multiphase, multicomponent systems is of high interest. For this purpose, a comparison is made between the analytical extensions and three-dimensional phase-field simulations of directional solidification in an ideal ternary eutectic system. Based on the observed accordance in two-dimensional validation cases, the experimentally reported, inherently three-dimensional chain-like pattern is investigated in extensive simulation studies. The results are quantitatively compared with the analytical results reported in the literature, and with a newly derived approach which uses equal undercoolings. Good agreement in the undercooling-spacing characteristics between the simulations and the analytical Jackson-Hunt approaches is found. The results show that the applied phase-field model, which is based on the grand potential approach, is able to describe the analytically predicted relationship between the undercooling and the lamellar arrangements during the directional solidification of a ternary eutectic system in 3D.
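
    For orientation, the classical binary Jackson-Hunt relation that such ternary extensions generalize links the interface undercooling ΔT to the growth velocity v and lamellar spacing λ (K_c and K_r are system-dependent constants; the extremum follows from dΔT/dλ = 0):

```latex
\Delta T \;=\; K_c\, v\, \lambda \;+\; \frac{K_r}{\lambda},
\qquad
\lambda_{\mathrm{ext}} = \sqrt{\frac{K_r}{K_c\, v}},
\qquad
\Delta T_{\mathrm{ext}} = 2\sqrt{K_c\, K_r\, v}.
```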

  6. Analytic method study of point-reactor kinetic equation when cold start-up

    International Nuclear Information System (INIS)

    Zhang Fan; Chen Wenzhen; Gui Xuewen

    2008-01-01

    The reactor cold start-up is a process of inserting reactivity by lifting control rods discontinuously. Inserting too much reactivity will cause a short period and may cause an overpressure accident in the primary loop. It is therefore very important to understand the rule of neutron density variation and to find out the relationships among the speed of lifting control rods and the duration and speed of the neutron density response. It is also helpful for operators to grasp this rule in order to avoid a start-up accident. This paper starts with the one-group delayed neutron point-reactor kinetics equations and provides their analytic solution when reactivity is introduced by lifting control rods discontinuously. The analytic expression is validated by comparison with practical data. It is shown that the analytic solution agrees well with the numerical solution. Using this analytical solution, the relationships among the neutron density response, the speed of lifting control rods and its duration are also studied. By comparing the results with those under the condition of step-inserted reactivity, useful conclusions are drawn.
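
    The analytic solution for the simplest such case, a single reactivity step with one delayed-neutron group, can be sketched via eigendecomposition of the 2×2 kinetics matrix; the parameter values below are illustrative, not those of the paper, which treats the full discontinuous rod-lifting sequence:

```python
import numpy as np

# One-group delayed-neutron point kinetics with a step reactivity insertion,
# solved exactly via eigendecomposition of the 2x2 kinetics matrix.
beta, lam, Lam = 0.0065, 0.08, 1.0e-4     # delayed fraction, decay const [1/s], generation time [s]
rho = 0.3 * beta                           # step reactivity, well below prompt critical

M = np.array([[(rho - beta) / Lam, lam],
              [beta / Lam,        -lam]])
x0 = np.array([1.0, beta / (lam * Lam)])   # equilibrium initial state, n(0) = 1

w, V = np.linalg.eig(M)                    # the two real roots of the inhour equation
c = np.linalg.solve(V, x0)                 # expansion coefficients from initial conditions
n_t = lambda t: float((V[0] * c * np.exp(w * t)).sum().real)

for t in (0.0, 0.1, 1.0, 10.0):
    print(f"t = {t:5.1f} s   n/n0 = {n_t(t):.4f}")   # prompt jump, then stable period
```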

  7. Intercalibration of analytical methods on marine environmental samples. Results of MEDPOL-II exercise for the intercalibration of chlorinated hydrocarbon measurements on mussel homogenate (MA-M-2/OC)

    International Nuclear Information System (INIS)

    1986-10-01

    Mussels have been considered good indicators of chlorinated hydrocarbon pollution of the marine environment, and this led to the development of mussel watch programmes in many countries in the late seventies. These intercalibration exercises were arranged in order to increase the quality of the analytical capabilities of environmental laboratories. The samples MA-M-2/OC of Mediterranean mussels with chlorinated hydrocarbon content were checked by 27 laboratories. It was judged highly suitable for these laboratories to have at their disposal a reference material made of mussel tissue with robust estimations of the true values with respect to several chlorinated hydrocarbons. Such a material would allow chemists to check the validity of new analytical procedures.

  8. Results Of Analytical Sample Crosschecks For Next Generation Solvent Extraction Samples Isopar L Concentration And pH

    International Nuclear Information System (INIS)

    Peters, T.; Fink, S.

    2011-01-01

    As part of the implementation process for the Next Generation Cesium Extraction Solvent (NGCS), SRNL and F/H Lab performed a series of analytical cross-checks to ensure that the components in the NGCS solvent system do not constitute an undue analytical challenge. For measurement of entrained Isopar® L in aqueous solutions, both labs performed similarly, with results more reliable at higher concentrations (near 50 mg/L). Low bias occurred in both labs, as seen previously in comparable blind studies for the baseline solvent system. SRNL recommends considering the use of Teflon™ caps on all sample containers used for this purpose. For pH measurements, the labs showed reasonable agreement but considerable positive bias for dilute boric acid solutions. SRNL recommends consideration of an alternate analytical method for qualification of boric acid concentrations.

  9. Analytical method (HPLC) validation used for identification and assay of the pharmaceutical active ingredient, Tylosin tartrate for veterinary use, and its finite product Tilodem 50, hydrosoluble powder

    Directory of Open Access Journals (Sweden)

    Maria Neagu

    2010-12-01

    At SC DELOS IMPEX '96 SRL, the quality assessment of the active pharmaceutical ingredient (API) for the finite product Tilodem 50 - hydrosoluble powder - was accomplished in accordance with the latest edition of the European Pharmacopoeia. The method of analysis used for this purpose was the compendial method 'Tylosin tartrate for veterinary use' of the European Pharmacopoeia edition in force, as a variant developed and validated in house. The parameters included in the validation methodology for the chromatographic method are the following: selectivity, linearity, linearity range, detection and quantification limits, precision, repeatability (intra-day), inter-day reproducibility, accuracy, robustness, solution stability and system suitability. According to the European Pharmacopoeia, the active pharmaceutical ingredient is consistent, in terms of quality, if it contains a minimum of 80% Tylosin A, with the sum of Tylosins A, B, C and D at a minimum of 95%. Identification and determination of each component separately (Tylosin A, B, C, D) is possible by chromatographic separation (HPLC). The validation of the analytical method is presented below.
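
    Several of the listed parameters reduce to simple regression statistics on a calibration series. The sketch below shows a generic ICH-style computation of linearity, detection limit and quantification limit from invented calibration data; it is not the laboratory's actual procedure:

```python
import numpy as np

# ICH-style linearity / LOD / LOQ sketch from an invented calibration series.
conc = np.array([50.0, 75.0, 100.0, 125.0, 150.0])          # % of nominal
area = np.array([1012.0, 1518.0, 2031.0, 2522.0, 3041.0])   # peak areas (invented)

slope, intercept = np.polyfit(conc, area, 1)
resid = area - (slope * conc + intercept)
s_res = np.sqrt((resid**2).sum() / (len(conc) - 2))          # residual standard deviation
r = np.corrcoef(conc, area)[0, 1]

lod = 3.3 * s_res / slope                                    # ICH Q2 formulas
loq = 10.0 * s_res / slope
print(f"slope {slope:.2f}, r = {r:.5f}, LOD = {lod:.1f}%, LOQ = {loq:.1f}%")
```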

  10. Verification and validation of decision support software: Expert Choice™ and PCM™

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Q.H.; Martin, J.D.

    1994-11-04

    This report documents the verification and validation of two decision support programs: Expert Choice™ and PCM™. Both programs use the Analytic Hierarchy Process (AHP) -- or pairwise comparison technique -- developed by Dr. Thomas L. Saaty. In order to provide an independent method for validating the two programs, the pairwise comparison algorithm was implemented in a standard mathematical program. A standard data set -- selecting a car to purchase -- was used with each of the three programs for validation. The results show that both commercial programs performed correctly.

  11. Simplified Analytical Methods to Analyze Lock Gates Submitted to Ship Collisions and Earthquakes

    Directory of Open Access Journals (Sweden)

    Buldgen Loic

    2015-09-01

    This paper presents two simplified analytical methods to analyze lock gates submitted to two different accidental loads. The case of an impact involving a vessel is investigated first. In this situation, the resistance of the struck gate is evaluated by assuming a local and a global deforming mode. The super-element method is used in the first case, while an equivalent beam model is simultaneously introduced to capture the overall bending motion of the structure. The second accidental load considered in this paper is the seismic action, for which an analytical method is presented to evaluate the total hydrodynamic pressure applied on a lock gate during an earthquake, due account being taken of the fluid-structure interaction. For each of these two actions, numerical validations are presented and the analytical results are compared to finite-element solutions.

  12. Resonant amplification of neutrino transitions in the Sun: exact analytical results

    International Nuclear Information System (INIS)

    Toshev, S.; Petkov, P.

    1988-01-01

    We investigate in detail the Mikheyev-Smirnov-Wolfenstein explanation of the solar neutrino puzzle using analytical expressions for the neutrino transition probabilities in matter with exponentially varying electron number density

  13. Analytic processor model for fast design-space exploration

    NARCIS (Netherlands)

    Jongerius, R.; Mariani, G.; Anghel, A.; Dittmann, G.; Vermij, E.; Corporaal, H.

    2015-01-01

    In this paper, we propose an analytic model that takes as inputs a) a parametric microarchitecture-independent characterization of the target workload, and b) a hardware configuration of the core and the memory hierarchy, and returns as output an estimation of processor-core performance. To validate

  14. In-house validation of a liquid chromatography-tandem mass spectrometry method for the determination of selective androgen receptor modulators (SARMS) in bovine urine.

    Science.gov (United States)

    Schmidt, Kathrin S; Mankertz, Joachim

    2018-06-01

    A sensitive and robust LC-MS/MS method allowing the rapid screening and confirmation of selective androgen receptor modulators in bovine urine was developed and successfully validated according to Commission Decision 2002/657/EC, chapter 3.1.3 'alternative validation', by applying a matrix-comprehensive in-house validation concept. The confirmation of the analytes in the validation samples was achieved both on the basis of the MRM ion ratios, as laid down in Commission Decision 2002/657/EC, and by comparison of their enhanced product ion (EPI) spectra with a reference mass spectral library, making use of the QTRAP technology. Here, in addition to the MRM survey scan, EPI spectra were generated in a data-dependent way according to an information-dependent acquisition criterion. Moreover, stability studies according to an isochronous approach proved the stability of the analytes in solution and in matrix for at least the duration of the validation study. To identify factors that have a significant influence on the test method in routine analysis, a factorial effect analysis was performed. To this end, factors considered relevant for the method in routine analysis (e.g. operator, storage duration of the extracts before measurement, different cartridge lots and different hydrolysis conditions) were systematically varied on two levels. The examination of the extent to which these factors influence the measurement results of the individual analytes showed that none of the validation factors exerts a significant influence on the measurement results.
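
    A two-level factorial effect analysis of the sort described can be illustrated compactly: each factor's main effect is the difference between the mean response at its high and low settings. The design and recovery values below are invented, and the factor names merely echo the abstract:

```python
import numpy as np
from itertools import product

# Two-level factorial effect sketch: three factors varied on levels -1/+1
# (factor names echo the abstract; recoveries are invented).
factors = ["operator", "storage time", "cartridge lot"]
design = np.array(list(product([-1, 1], repeat=3)))     # full 2^3 design
recovery = np.array([98.1, 97.6, 99.0, 98.4, 97.9, 98.8, 98.5, 97.5])

for name, col in zip(factors, design.T):
    effect = recovery[col == 1].mean() - recovery[col == -1].mean()
    print(f"main effect of {name:13s}: {effect:+.2f} %")
# Effects that are small relative to method precision indicate factors
# without significant influence, as concluded in the study.
```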

  15. Electron Beam Return-Current Losses in Solar Flares: Initial Comparison of Analytical and Numerical Results

    Science.gov (United States)

    Holman, Gordon

    2010-01-01

    Accelerated electrons play an important role in the energetics of solar flares. Understanding the process or processes that accelerate these electrons to high, nonthermal energies also depends on understanding the evolution of these electrons between the acceleration region and the region where they are observed through their hard X-ray or radio emission. Energy losses in the co-spatial electric field that drives the current-neutralizing return current can flatten the electron distribution toward low energies. This in turn flattens the corresponding bremsstrahlung hard X-ray spectrum toward low energies. The lost electron beam energy also enhances heating in the coronal part of the flare loop. Extending earlier work by Knight & Sturrock (1977), Emslie (1980), Diakonov & Somov (1988), and Litvinenko & Somov (1991), I have derived analytical and semi-analytical results for the nonthermal electron distribution function and the self-consistent electric field strength in the presence of a steady-state return current. I review these results, presented previously at the 2009 SPD Meeting in Boulder, CO, and compare them and computed X-ray spectra with numerical results obtained by Zharkova & Gordovskii (2005, 2006). The physical significance of similarities and differences in the results will be emphasized. This work is supported by NASA's Heliophysics Guest Investigator Program and the RHESSI Project.

  16. Development and Validation of a Learning Analytics Framework: Two Case Studies Using Support Vector Machines

    Science.gov (United States)

    Ifenthaler, Dirk; Widanapathirana, Chathuranga

    2014-01-01

    Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…

  17. A chromatographic method validation to quantify tablets Mephenesine of national production

    International Nuclear Information System (INIS)

    Suarez Perez, Yania; Izquierdo Castro, Adalberto; Milian Sanchez, Jana Daria

    2009-01-01

    The authors validated an analytical method by high performance liquid chromatography (HPLC) for the quantification of Mephenesine in recently reformulated 500 mg tablets. With regard to its application to quality control, validation included the following parameters: linearity, accuracy, precision, and selectivity. Results were satisfactory within the 50-150% range. For its use in subsequent studies of chemical stability, stability-indicating selectivity and sensitivity were assessed. The estimated detection and quantification limits were appropriate, and the method was selective versus the possible degradation products. (Author)

  18. A Novel Analytic Technique for the Service Station Reliability in a Discrete-Time Repairable Queue

    Directory of Open Access Journals (Sweden)

    Renbin Liu

    2013-01-01

    This paper presents a decomposition technique for the service station reliability in a discrete-time repairable Geom^X/G/1 queueing system, in which the server takes exhaustive service and a multiple adaptive delayed vacation discipline. Using this novel analytic technique, some important reliability indices and reliability relation equations of the service station are derived. Furthermore, the structures of the service station indices are also found. Finally, special cases and numerical examples validate the derived results and show that our analytic technique is applicable to the reliability analysis of some complex discrete-time repairable bulk arrival queueing systems.

  19. Analytical Modeling Of The Steinmetz Coefficient For Single-Phase Transformer Eddy Current Loss Prediction

    Directory of Open Access Journals (Sweden)

    T. Aly Saandy

    2015-08-01

    This article presents an analytical calculation methodology for the Steinmetz coefficient applied to the prediction of eddy current loss in a single-phase transformer. Based on electrical circuit theory, the active power consumed by the core is expressed analytically as a function of the electrical parameters, such as resistivity, and of the geometrical dimensions of the core. The proposed modeling approach is established with the series-parallel duality. The required coefficient is identified from the empirical Steinmetz data based on the experimental active power expression. To verify the relevance of the model, validations by both simulations and measurements at two different frequencies were carried out. The obtained results are in good agreement with both the theoretical approach and the practical results.
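
    The identification step can be illustrated with a least-squares fit of the eddy-current term of the classical loss decomposition, P_e = k_e·f²·B_m²; the loss data below are invented, and this sketches only the general idea, not the authors' duality-based derivation:

```python
import numpy as np

# Identify an eddy-current loss coefficient k_e in P_e = k_e * f^2 * Bm^2
# from (frequency, peak flux density, loss) triples. Data are invented.
f = np.array([50.0, 50.0, 60.0, 60.0, 100.0])    # Hz
Bm = np.array([1.0, 1.5, 1.0, 1.5, 1.0])         # T
P = np.array([2.6, 5.8, 3.7, 8.4, 10.3])         # W/kg (invented measurements)

x = (f * Bm)**2                                   # regressor f^2 * Bm^2
k_e = (x @ P) / (x @ x)                           # least squares through the origin
print(f"k_e ~ {k_e:.2e} W/kg per (Hz T)^2")
print("fitted losses:", np.round(k_e * x, 2))
```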

  20. Unified analytical threshold voltage model for non-uniformly doped dual metal gate fully depleted silicon-on-insulator MOSFETs

    Science.gov (United States)

    Rao, Rathnamala; Katti, Guruprasad; Havaldar, Dnyanesh S.; DasGupta, Nandita; DasGupta, Amitava

    2009-03-01

    The paper describes a unified analytical threshold voltage model for non-uniformly doped, dual metal gate (DMG) fully depleted silicon-on-insulator (FDSOI) MOSFETs based on the solution of the 2D Poisson's equation. The 2D Poisson's equation is solved analytically for appropriate boundary conditions using the separation of variables technique. The solution is then extended to obtain the threshold voltage of the FDSOI MOSFET. The model is able to handle any kind of non-uniform doping, viz. vertical, lateral, as well as laterally asymmetric channel (LAC) profiles in the SOI film, in addition to the DMG structure. The analytical results are validated against numerical simulations using the device simulator MEDICI.

  1. A National Residue Control Plan from the analytical perspective-The Brazilian case

    International Nuclear Information System (INIS)

    Mauricio, Angelo de Q; Lins, Erick S.; Alvarenga, Marcelo B.

    2009-01-01

    Food safety is a strategic topic entailing not only national public health aspects but also competitiveness in international trade. An important component of any food safety program is the control and monitoring of residues posed by certain substances involved in food production. In turn, a National Residue Control Plan (NRCP) relies on an appropriate laboratory network, not only to generate analytical results, but also more broadly to verify and co-validate the controls built along the food production chain. Therefore laboratories operating under a NRCP should work in close cooperation with inspection bodies, fostering the critical alignment of the whole system with the principles of risk analysis. Beyond producing technically valid results, these laboratories should arguably be able to assist in the prediction and establishment of targets for official control. In pursuit of analytical excellence, the Brazilian government has developed a strategic plan for Official Agricultural Laboratories. Inserted in a national agenda for agricultural risk analysis, the plan has succeeded in raising laboratory budget by approximately 200%, it has started a rigorous program for personnel capacity-building, it has initiated strategic cooperation with international reference centres, and finally, it has completely renewed instrumental resources and rapidly triggered a program aimed at full laboratory compliance with ISO/IEC 17025 requirements.

  4. Analytical Method Development and Validation for the Quantification of Acetone and Isopropyl Alcohol in the Tartaric Acid Base Pellets of Dipyridamole Modified Release Capsules by Using Headspace Gas Chromatographic Technique

    Directory of Open Access Journals (Sweden)

    Sriram Valavala

    2018-01-01

    A simple, sensitive, accurate and robust headspace gas chromatographic method was developed for the quantitative determination of acetone and isopropyl alcohol in tartaric acid-based pellets of dipyridamole modified release capsules. The residual solvents acetone and isopropyl alcohol are used in the manufacturing process of the tartaric acid-based pellets of dipyridamole modified release capsules, in consideration of the solubility of dipyridamole and the excipients at the different manufacturing stages. The method was developed and optimized using a fused silica DB-624 (30 m × 0.32 mm × 1.8 µm) column with a flame ionization detector. The method validation was carried out according to guideline Q2 on validation of analytical procedures of the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH). All the validation characteristics met the acceptance criteria. Hence, the developed and validated method can be applied for the intended routine analysis.

  5. A program wide framework for evaluating data driven teaching and learning - earth analytics approaches, results and lessons learned

    Science.gov (United States)

    Wasser, L. A.; Gold, A. U.

    2017-12-01

    There is a deluge of earth systems data available to address cutting edge science problems, yet specific skills are required to work with these data. The earth analytics education program, a core component of Earth Lab at the University of Colorado - Boulder, is building a data-intensive program that provides training in realms including 1) interdisciplinary communication and collaboration, 2) earth science domain knowledge including geospatial science and remote sensing, and 3) reproducible, open science workflows ("earth analytics"). The earth analytics program includes an undergraduate internship, undergraduate and graduate level courses, and a professional certificate / degree program. All programs share the goal of preparing a STEM workforce for successful earth analytics driven careers. We are developing a program-wide evaluation framework that assesses the effectiveness of data-intensive instruction combined with domain science learning, to better understand and improve data-intensive teaching approaches using blends of online, in situ, asynchronous and synchronous learning. We are using targeted search engine optimization (SEO) to increase visibility and in turn program reach. Finally, our design targets longitudinal program impacts on participant career tracks over time. Here we present results from the evaluation of both an interdisciplinary undergraduate / graduate level earth analytics course and an undergraduate internship. Early results suggest that a blended approach to learning and teaching, including both synchronous in-person teaching and active hands-on classroom learning combined with asynchronous learning in the form of online materials, leads to student success. Further, we present our model for longitudinal tracking of participants' career focus over time to better understand long-term program impacts. We also demonstrate the impact of SEO on online content reach and program visibility.

  6. Parametric validations of analytical lifetime estimates for radiation belt electron diffusion by whistler waves

    Directory of Open Access Journals (Sweden)

    A. V. Artemyev

    2013-04-01

    The lifetimes of electrons trapped in Earth's radiation belts can be calculated from quasi-linear pitch-angle diffusion by whistler-mode waves, provided that their frequency spectrum is broad enough and/or their average amplitude is not too large. Extensive comparisons between improved analytical lifetime estimates and full numerical calculations have been performed in a broad parameter range representative of a large part of the magnetosphere from L ~ 2 to 6. The effects of observed very oblique whistler waves are taken into account in both numerical and analytical calculations. Analytical lifetimes (and pitch-angle diffusion coefficients) are found to be in good agreement with full numerical calculations based on CRRES and Cluster hiss and lightning-generated wave measurements inside the plasmasphere and Cluster lower-band chorus wave measurements in the outer belt for electron energies ranging from 100 keV to 5 MeV. Comparisons with lifetimes recently obtained from electron flux measurements on SAMPEX, SCATHA, SAC-C and DEMETER also show reasonable agreement.

  7. Heavy-quark QCD vacuum polarisation function. Analytical results at four loops

    International Nuclear Information System (INIS)

    Kniehl, B.A.; Kotikov, A.V.

    2006-07-01

    The first two moments of the heavy-quark vacuum polarisation function at four loops in quantum chromodynamics are found in fully analytical form by evaluating the missing massive four-loop tadpole master integrals. (orig.)

  8. The MISTRA experiment for field containment code validation: first results

    International Nuclear Information System (INIS)

    Caron-Charles, M.; Blumenfeld, L.

    2001-01-01

    The MISTRA facility is a large-scale experiment designed for the purpose of validating multi-dimensional thermal-hydraulics codes. A short description of the facility, the set-up of the instrumentation and the test program are presented. Then the first experimental results, studying helium injection into the containment, and their calculations are detailed. (author)

  9. Steady-state analytical model of suspended p-type 3C-SiC bridges under consideration of Joule heating

    Science.gov (United States)

    Balakrishnan, Vivekananthan; Dinh, Toan; Phan, Hoang-Phuong; Kozeki, Takahiro; Namazu, Takahiro; Viet Dao, Dzung; Nguyen, Nam-Trung

    2017-07-01

    This paper reports an analytical model and its validation for a released microscale heater made of 3C-SiC thin films. A model of the equivalent electrical and thermal parameters was developed for the two-layer, multi-segment heat and electric conduction. The model is based on a 1D energy equation, which considers the temperature-dependent resistivity and allows for the prediction of the voltage-current and power-current characteristics of the microheater. The steady-state analytical model was validated by experimental characterization. The results, in particular the nonlinearity caused by the temperature dependency, are in good agreement. The low power consumption, of the order of 0.18 mW at approximately 310 K, indicates the potential use of the structure as a thermal sensor in portable applications.
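
    The reported nonlinearity, resistance changing with Joule self-heating, can be captured by a minimal lumped electro-thermal model with a linear temperature coefficient of resistance; the closed-form fixed point below is a sketch with illustrative values, not the paper's two-layer multi-segment model:

```python
import numpy as np

# Lumped electro-thermal sketch of a suspended microheater:
#   R(dT) = R0 * (1 + alpha*dT),   dT = Rth * I**2 * R(dT)
# whose fixed point is dT = Rth*I**2*R0 / (1 - alpha*Rth*I**2*R0).
# Values are illustrative, not the measured 3C-SiC parameters.
R0 = 5.0e3          # resistance at ambient [ohm]
alpha = 2.0e-3      # linear temperature coefficient of resistance [1/K]
Rth = 1.0e6         # thermal resistance of the suspended bridge [K/W]

I = np.linspace(0.0, 120e-6, 7)               # drive current [A]
g = alpha * Rth * R0 * I**2                   # self-heating strength (must stay < 1)
dT = Rth * R0 * I**2 / (1.0 - g)              # temperature rise [K]
V = I * R0 * (1.0 + alpha * dT)               # nonlinear V-I characteristic
for i_uA, v, t in zip(I * 1e6, V, dT):
    print(f"I = {i_uA:6.1f} uA   V = {v:6.3f} V   dT = {t:5.1f} K")
```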

  10. Simple quasi-analytical holonomic homogenization model for the non-linear analysis of in-plane loaded masonry panels: Part 2, structural implementation and validation

    Science.gov (United States)

    Milani, G.; Bertolesi, E.

    2017-07-01

    The simple quasi-analytical holonomic homogenization approach for the non-linear analysis of in-plane loaded masonry presented in Part 1 is here implemented at a structural level and validated. For this implementation, a Rigid Body and Spring Mass model (RBSM) is adopted, relying on a numerical model constituted by rigid elements interconnected by homogenized inelastic normal and shear springs placed at the interfaces between adjoining elements. This approach is also known as HRBSM. The inherent advantage is that it is not necessary to solve a homogenization problem at each load step in each Gauss point, and a direct implementation into a commercial software package by means of an external user-supplied subroutine is straightforward. In order to gain insight into the capabilities of the present approach to reasonably reproduce masonry behavior at a structural level, non-linear static analyses are conducted on a shear wall, for which experimental and numerical data are available in the technical literature. Quite accurate results are obtained with a very limited computational effort.

  11. The environmental evaluation of substation based on the fuzzy analytic hierarchy process

    Science.gov (United States)

    Qian, Wenxiao; Zuo, Xiujiang; Chen, Yuandong; Ye, Ming; Fang, Zhankai; Yang, Fan

    2018-02-01

    This paper studies the different influences of substations on the environment and puts forward an index system of environmental protection based on the fuzzy analytic hierarchy process. A comprehensive environmental evaluation of a substation is carried out through investigation and measurement of the current environmental factors, and the statistical data validate the effectiveness and feasibility of this evaluation index system. The results indicate that the proposed model has high efficiency.

  12. Radiant Energy Measurements from a Scaled Jet Engine Axisymmetric Exhaust Nozzle for a Baseline Code Validation Case

    Science.gov (United States)

    Baumeister, Joseph F.

    1994-01-01

    A non-flowing, electrically heated test rig was developed to verify computer codes that calculate radiant energy propagation from nozzle geometries that represent aircraft propulsion nozzle systems. Since there are a variety of analysis tools used to evaluate thermal radiation propagation from partially enclosed nozzle surfaces, an experimental benchmark test case was developed for code comparison. This paper briefly describes the nozzle test rig and the developed analytical nozzle geometry used to compare the experimental and predicted thermal radiation results. A major objective of this effort was to make available the experimental results and the analytical model in a format to facilitate conversion to existing computer code formats. For code validation purposes this nozzle geometry represents one validation case for one set of analysis conditions. Since each computer code has advantages and disadvantages based on scope, requirements, and desired accuracy, the usefulness of this single nozzle baseline validation case can be limited for some code comparisons.

  13. Application of analytical procedure on system reliability, GO-FLOW

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Fukuto, Junji; Mitomo, Nobuo; Miyazaki, Keiko; Matsukura, Hiroshi; Kobayashi, Michiyuki

    2000-01-01

    At the Ship Research Institute, research and development of the GO-FLOW procedure, a system reliability analysis method with various advanced functions that occupies a main part of probabilistic safety assessment (PSA), has been promoted. In this study, aiming at a fundamental upgrade of the GO-FLOW procedure as an important evaluation technique for performing PSA below level 3, a safety assessment system using GO-FLOW was developed, together with an analytical function coupling the dynamic behavior analysis of the system with stochastic phenomenological changes. In fiscal year 1998, various functions were prepared and verified, such as adding dependences between headings, rearranging in time order, assigning the same heading to plural positions, and calculating occurrence frequency over elapsed time. Regarding the accident-sequence simulation analysis function, it was confirmed that the analysis covers all main accident sequences in the improved marine reactor MRX. In addition, a function for producing analysis input data nearly automatically was also prepared. As a result, the problem that conventional analytical results were not always easy to understand for anyone but a PSA expert was solved, and understanding of the accident phenomena, verification of the validity of the analysis, feedback to the analysis, and feedback to design can be easily carried out. (G.K.)

  14. Translating tumor biology into personalized treatment planning: analytical performance characteristics of the Oncotype DX Colon Cancer Assay.

    Science.gov (United States)

    Clark-Langone, Kim M; Sangli, Chithra; Krishnakumar, Jayadevi; Watson, Drew

    2010-12-23

    The Oncotype DX Colon Cancer Assay is a new diagnostic test for determining the likelihood of recurrence in stage II colon cancer patients after surgical resection using fixed paraffin embedded (FPE) primary colon tumor tissue. Like the Oncotype DX Breast Cancer Assay, this is a high complexity, multi-analyte, reverse transcription (RT) polymerase chain reaction (PCR) assay that measures the expression levels of specific cancer-related genes. By capturing the biology underlying each patient's tumor, the Oncotype DX Colon Cancer Assay provides a Recurrence Score (RS) that reflects an individualized risk of disease recurrence. Here we describe its analytical performance using pre-determined performance criteria, which is a critical component of molecular diagnostic test validation. All analytical measurements met pre-specified performance criteria. PCR amplification efficiency for all 12 assays was high, ranging from 96% to 107%, while linearity was demonstrated over an 11 log2 concentration range for all assays. Based on estimated components of variance for FPE RNA pools, analytical reproducibility and precision demonstrated low SDs for individual genes (0.16 to 0.32 CTs), gene groups (≤ 0.05 normalized/aggregate CTs) and RS (≤ 1.38 RS units). Analytical performance characteristics shown here for both individual genes and gene groups in the Oncotype DX Colon Cancer Assay demonstrate consistent translation of specific biology of individual tumors into clinically useful diagnostic information. The results of these studies illustrate how the analytical capability of the Oncotype DX Colon Cancer Assay has enabled clinical validation of a test to determine individualized recurrence risk after colon cancer surgery.
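
    A dilution-series slope is the standard way such amplification efficiencies are derived: on a log2 input axis, a CT-versus-input slope of -1 corresponds to perfect doubling (100% efficiency). A small illustrative sketch with synthetic data (slope and noise values are hypothetical, not the assay's):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic dilution series: CT measured over an 11-log2 range of RNA input
    log2_input = np.arange(12)                         # 0 .. 11 doublings
    ct = 35.0 - 0.98 * log2_input + rng.normal(0, 0.05, 12)

    slope = np.polyfit(log2_input, ct, 1)[0]
    # The per-cycle amplification factor is 2**(-1/slope) = 1 + efficiency,
    # so slope = -1 on a log2 axis means perfect doubling
    efficiency = 2 ** (-1.0 / slope) - 1
    print(f"slope = {slope:.3f}, efficiency = {efficiency:.1%}")   # ~103%
    ```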

  15. A comprehensive analytical solution of the nonlinear pendulum

    International Nuclear Information System (INIS)

    Ochs, Karlheinz

    2011-01-01

    In this paper, an analytical solution for the differential equation of the simple but nonlinear pendulum is derived. This solution is valid for any time and is not limited to any special initial instant or initial values. Moreover, the solution holds whether or not the pendulum swings over. The method of approach is based on Jacobi elliptic functions and starts from the solution of a pendulum that swings over; thanks to a meticulous sign-correction term, this solution remains valid when the pendulum does not swing over.
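
    The oscillating-case solution can be written directly in terms of Jacobi elliptic functions, which SciPy exposes. A minimal sketch (assuming release from rest at amplitude θ0 < π; the paper's sign-correction machinery for the swing-over case is not reproduced here), checked against direct numerical integration:

    ```python
    import numpy as np
    from scipy.special import ellipj, ellipk
    from scipy.integrate import solve_ivp

    omega0 = 1.0                # sqrt(g/L)
    theta0 = np.radians(150)    # release angle, oscillating case (< pi)

    k = np.sin(theta0 / 2)
    m = k**2                    # SciPy's elliptic parameter m = k^2
    t = np.linspace(0, 20, 400)

    # Closed form for release from rest: sin(theta/2) = k * cd(omega0*t, m), cd = cn/dn
    sn, cn, dn, _ = ellipj(omega0 * t, m)
    theta = 2 * np.arcsin(k * cn / dn)

    # Exact period, and agreement with a high-accuracy numerical reference
    print("period:", 4 * ellipk(m) / omega0)
    ref = solve_ivp(lambda t, y: [y[1], -omega0**2 * np.sin(y[0])],
                    (0, 20), [theta0, 0.0], t_eval=t, rtol=1e-10, atol=1e-12)
    print("max deviation:", np.max(np.abs(theta - ref.y[0])))   # ~1e-8 rad
    ```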

  16. Comparison of Analytical and Measured Performance Results on Network Coding in IEEE 802.11 Ad-Hoc Networks

    DEFF Research Database (Denmark)

    Zhao, Fang; Médard, Muriel; Hundebøll, Martin

    2012-01-01

    …CATWOMAN, which can run on standard WiFi hardware. We present an analytical model to evaluate the performance of COPE in simple networks, and our results show the excellent predictive quality of this model. By closely examining the performance in two simple topologies, we observe that the coding gain results…

  17. Validation and assessment of uncertainty of chemical tests as a tool for the reliability analysis of wastewater IPEN

    International Nuclear Information System (INIS)

    Silva, Renan A.; Martins, Elaine A.J.; Furusawa, Helio A.

    2011-01-01

    The validation of analytical methods has become an indispensable tool for analysis in chemical laboratories and is required for accreditation. However, even when a laboratory uses validated methods, those methods may still produce results that deviate from reality, making it necessary to attach a quantitative attribute (a value) that indicates the degree of confidence in the analytical method used. This measure assigned to the result of a measurement is called measurement uncertainty, and it is estimated with a stated level of confidence; an analytical result has limited significance if a proper assessment of its uncertainty is not carried out. One of the activities of this work was to develop a program to support the validation and evaluation of uncertainty in chemical analysis. The program was written in the Visual Basic programming language, and the uncertainty evaluation method follows the concepts of the GUM (Guide to the Expression of Uncertainty in Measurement). This uncertainty evaluation program will be applied to chemical analyses in support of the characterization of the Nuclear Fuel Cycle developed by IPEN and to the study of organic substances in wastewater associated with the professional activities of the Institute: in the first case, primarily for the determination of total uranium, and in the second case, for substances generated by human activities that are listed in resolution 357/2005. The PDCA cycle was adopted as the working strategy to improve the efficiency of each step and minimize errors during the experimental part. The program should be validated to meet the requirements of standards such as ISO/IEC 17025. It is projected for use in other analytical procedures, both in the Nuclear Fuel Cycle and in IPEN's chemical waste control and management program

  18. Validation and assessment of uncertainty of chemical tests as a tool for the reliability analysis of wastewater IPEN

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Renan A.; Martins, Elaine A.J.; Furusawa, Helio A., E-mail: elaine@ipen.br, E-mail: helioaf@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    The validation of analytical methods has become an indispensable tool for analysis in chemical laboratories and is required for accreditation. However, even when a laboratory uses validated methods, those methods may still produce results that deviate from reality, making it necessary to attach a quantitative attribute (a value) that indicates the degree of confidence in the analytical method used. This measure assigned to the result of a measurement is called measurement uncertainty, and it is estimated with a stated level of confidence; an analytical result has limited significance if a proper assessment of its uncertainty is not carried out. One of the activities of this work was to develop a program to support the validation and evaluation of uncertainty in chemical analysis. The program was written in the Visual Basic programming language, and the uncertainty evaluation method follows the concepts of the GUM (Guide to the Expression of Uncertainty in Measurement). This uncertainty evaluation program will be applied to chemical analyses in support of the characterization of the Nuclear Fuel Cycle developed by IPEN and to the study of organic substances in wastewater associated with the professional activities of the Institute: in the first case, primarily for the determination of total uranium, and in the second case, for substances generated by human activities that are listed in resolution 357/2005. The PDCA cycle was adopted as the working strategy to improve the efficiency of each step and minimize errors during the experimental part. The program should be validated to meet the requirements of standards such as ISO/IEC 17025. It is projected for use in other analytical procedures, both in the Nuclear Fuel Cycle and in IPEN's chemical waste control and management program

  19. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
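
    Of the benchmark types recommended for code verification, manufactured (or classical analytical) solutions are the easiest to demonstrate end to end: choose a solution, derive the source term it implies, and confirm that the discrete solver converges to it at the expected order. A self-contained sketch for a 1-D Poisson problem (illustrative only, not an example from the paper):

    ```python
    import numpy as np

    # Manufactured solution u(x) = sin(pi x) on [0, 1] with u(0) = u(1) = 0;
    # substituting into -u'' = f forces the source term f(x) = pi^2 sin(pi x)
    u_exact = lambda x: np.sin(np.pi * x)
    f = lambda x: np.pi**2 * np.sin(np.pi * x)

    def max_error(n):
        x = np.linspace(0.0, 1.0, n + 1)
        h = 1.0 / n
        # Second-order central-difference Laplacian on the interior nodes
        A = (np.diag(2.0 * np.ones(n - 1)) - np.diag(np.ones(n - 2), 1)
             - np.diag(np.ones(n - 2), -1)) / h**2
        u = np.zeros(n + 1)
        u[1:-1] = np.linalg.solve(A, f(x[1:-1]))
        return np.max(np.abs(u - u_exact(x)))

    errs = np.array([max_error(n) for n in (16, 32, 64, 128)])
    print(errs)
    print(np.log2(errs[:-1] / errs[1:]))   # observed order should approach 2
    ```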

  20. Validation Results for LEWICE 3.0

    Science.gov (United States)

    Wright, William B.

    2005-01-01

    A research project is underway at NASA Glenn to produce computer software that can accurately predict ice growth under any meteorological conditions for any aircraft surface. This report presents results from version 3.0 of this software, which is called LEWICE. This version differs from previous releases in that it incorporates additional thermal analysis capabilities, a pneumatic boot model, and interfaces to computational fluid dynamics (CFD) flow solvers, and it has an empirical model for the supercooled large droplet (SLD) regime. An extensive, quantifiable comparison of the results against the database of ice shapes and collection efficiency generated in the NASA Glenn Icing Research Tunnel (IRT) has also been performed. The complete set of data used for this comparison will eventually be available in a contractor report. This paper shows the differences in collection efficiency between LEWICE 3.0 and experimental data. Due to the large amount of validation data available, a separate report is planned for the ice shape comparison. This report first describes the LEWICE 3.0 model for water collection. A semi-empirical approach was used to incorporate first-order physical effects of large droplet phenomena into the icing software. Comparisons are then made to every single-element two-dimensional case in the water collection database. Each condition was run using the following five assumptions: 1) potential flow, no splashing; 2) potential flow, no splashing, with 21-bin drop size distributions and a lift correction (angle of attack adjustment); 3) potential flow, with splashing; 4) Navier-Stokes, no splashing; and 5) Navier-Stokes, with splashing. Quantitative comparisons are shown for impingement limit, maximum water catch, and total collection efficiency. The results show that the predicted results are within the accuracy limits of the experimental data for the majority of cases.

  1. DEVELOPMENT AND VALIDATION OF AN HPLC-DAD ANALYTICAL METHOD TO QUANTIFY 5-METHOXYFLAVONES IN METHANOLIC EXTRACTS OF Vochysia divergens POHL CULTURED UNDER STRESS CONDITIONS

    Directory of Open Access Journals (Sweden)

    Letícia Pereira Pimenta

    Vochysia divergens Pohl, known as "Cambara" in Brazil, is an invasive species that is expanding throughout the Pantanal in Brazil, forming mono-dominant communities. This expansion is affecting the agricultural areas that support the typical seasonal flood and drought conditions of this biome. This article describes the development and validation of an HPLC-DAD analytical method to quantify 5-methoxyflavones in methanolic extracts of greenhouse-grown V. divergens associated with one of two endophytic fungal species, Zopfiella tetraspora (Zt) or Melanconiella elegans (Me), and later subjected to water stress. The developed method gave good validation parameters and was successfully applied to quantify the flavones 3',5-dimethoxy luteolin-7-O-β-glucopyranoside (1), 5-methoxy luteolin (2), and 3',5-dimethoxy luteolin (3) in the target extracts. Inoculation of the plant with Zt decreased the concentration of flavone 1 in the extract by 2.69-fold as compared to the control, whereas inoculation with Zt or Me did not significantly alter the contents of flavones 2 and 3. Therefore, the aerial parts of germinated V. divergens plants inoculated with either Zt or Me responded differently in terms of the production of flavones. These results shed light on the symbiosis between fungal microorganisms and V. divergens, which most likely influences the response of V. divergens to changes in the availability of water in the Pantanal.

  2. A Complete Validated Learning Analytics Framework: Designing Issues from Data Preparation Perspective

    Science.gov (United States)

    Tlili, Ahmed; Essalmi, Fathi; Jemni, Mohamed; Kinshuk; Chen, Nian-Shing

    2018-01-01

    With the rapid growth of online education in recent years, Learning Analytics (LA) has gained increasing attention from researchers and educational institutions as an area which can improve the overall effectiveness of learning experiences. However, the lack of guidelines on what should be taken into consideration during application of LA hinders…

  3. Analytical method comparisons for the accurate determination of PCBs in sediments

    Energy Technology Data Exchange (ETDEWEB)

    Numata, M.; Yarita, T.; Aoyagi, Y.; Yamazaki, M.; Takatsu, A. [National Metrology Institute of Japan, Tsukuba (Japan)

    2004-09-15

    The National Metrology Institute of Japan in the National Institute of Advanced Industrial Science and Technology (NMIJ/AIST) has been developing several matrix reference materials, for example sediments, water and biological tissues, for the determination of heavy metals and organometallic compounds. The matrix compositions of those certified reference materials (CRMs) are similar to the compositions of actual samples, which makes them useful for validating analytical procedures. "Primary methods of measurement" are essential to obtain accurate and SI-traceable certified values for the reference materials, because these methods have the highest quality of measurement. However, inappropriate analytical operations, such as incomplete extraction of analytes or cross-contamination during analytical procedures, will cause errors in analytical results, even if one of the primary methods, isotope dilution, is utilized. To avoid possible procedural bias in the certification of reference materials, we employ more than two analytical methods which have been optimized beforehand. Because the accurate determination of trace POPs in the environment is important to evaluate their risk, reliable CRMs are required by environmental chemists. Therefore, we have also been preparing matrix CRMs for the determination of POPs. To establish accurate analytical procedures for the certification of POPs, extraction is one of the critical steps, as described above. In general, conventional extraction techniques for the determination of POPs, such as Soxhlet extraction (SOX) and saponification (SAP), have been well characterized and introduced as official methods for environmental analysis. On the other hand, emerging techniques, such as microwave-assisted extraction (MAE), pressurized fluid extraction (PFE) and supercritical fluid extraction (SFE), give higher recovery yields of analytes with relatively short extraction times and small amounts of solvent, by reason of the high

  4. Development and validation of simple RP-HPLC-PDA analytical protocol for zileuton assisted with Design of Experiments for robustness determination

    Directory of Open Access Journals (Sweden)

    Saurabh B. Ganorkar

    2017-02-01

    A simple, rapid, sensitive, robust, stability-indicating RP-HPLC-PDA analytical protocol was developed and validated for the analysis of zileuton racemate in bulk and in tablet formulation. Method development and resolution of the degradation products from forced hydrolytic (acidic, basic, neutral), oxidative, photolytic (acidic, basic, neutral, solid state) and thermal (dry heat) degradation were achieved on an LC-GC Qualisil BDS C18 column (250 mm × 4.6 mm × 5 μm) in isocratic mode at ambient temperature, employing a mobile phase of methanol and 0.2% (v/v) orthophosphoric acid in a ratio of 80:20 (v/v) at a flow rate of 1.0 mL min−1, with detection at 260 nm. 'Design of Experiments' (DOE) employing a 'Central Composite Design' (CCD) and 'Response Surface Methodology' (RSM) was applied as an advancement over the traditional 'One Variable at a Time' (OVAT) approach to evaluate the effects of variations in selected factors (methanol content, flow rate, concentration of orthophosphoric acid); robustness was interpreted graphically, and statistical interpretation was achieved with Multiple Linear Regression (MLR) and ANOVA. The method succeeded over the validation parameters linearity, precision, accuracy, limit of detection, limit of quantitation, and robustness, and was applied effectively to the analysis of in-house zileuton tablets.
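
    As a rough illustration of this robustness workflow (not the authors' data), a face-centred central composite design in the three stated factors can be generated and fitted with a quadratic MLR response-surface model in a few lines; the response values below are invented:

    ```python
    import numpy as np
    from itertools import product

    # Coded levels: x1 = methanol content, x2 = flow rate, x3 = H3PO4 concentration
    factorial = np.array(list(product([-1, 1], repeat=3)), float)  # 2^3 corner runs
    axial = np.vstack([v * np.eye(3) for v in (-1, 1)])            # face-centred axial runs
    center = np.zeros((3, 3))                                      # replicated centre runs
    X = np.vstack([factorial, axial, center])

    # Hypothetical responses, e.g. tailing factor measured at each of the 17 runs
    y = np.array([1.21, 1.18, 1.25, 1.22, 1.13, 1.10, 1.16, 1.12,
                  1.20, 1.15, 1.19, 1.14, 1.18, 1.15, 1.17, 1.17, 1.16])

    # Full quadratic model: intercept, linear, interaction and square terms
    x1, x2, x3 = X.T
    M = np.column_stack([np.ones(len(X)), x1, x2, x3,
                         x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])
    beta, *_ = np.linalg.lstsq(M, y, rcond=None)
    terms = ["b0", "x1", "x2", "x3", "x1x2", "x1x3", "x2x3", "x1^2", "x2^2", "x3^2"]
    print(dict(zip(terms, beta.round(4))))  # small coefficients indicate robustness
    ```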

  5. A novel stress distribution analytical model of O-ring seals under different properties of materials

    International Nuclear Information System (INIS)

    Wu, Di; Wang, Shao Ping; Wang, Xing Jian

    2017-01-01

    Elastomeric O-ring seals have been widely used as sealing elements in hydraulic systems. The sealing performance of O-ring seals is related to the stress distribution, which depends on the squeeze rate and internal pressure and varies with the properties of the O-ring material. Thus, in order to study the sealing performance of O-ring seals, it is necessary to describe the analytic relationship between the stress distribution and the material properties. For this purpose, a novel stress distribution analytical model (SDAM) is proposed in this paper. The model utilizes two complex stress functions to describe the stress distribution in O-ring seals. The proposed SDAM can express the analytical relationship between the stress distribution and not only Young's modulus but also Poisson's ratio. Finally, comparisons between finite element analysis and the SDAM validate that the proposed model can effectively reveal the stress distribution for different O-ring material properties.

  6. A novel stress distribution analytical model of O-ring seals under different properties of materials

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Di; Wang, Shao Ping; Wang, Xing Jian [School of Automation Science and Electrical Engineering, Beihang University, Beijing (China)

    2017-01-15

    Elastomeric O-ring seals have been widely used as sealing elements in hydraulic systems. The sealing performance of O-ring seals is related to the stress distribution, which depends on the squeeze rate and internal pressure and varies with the properties of the O-ring material. Thus, in order to study the sealing performance of O-ring seals, it is necessary to describe the analytic relationship between the stress distribution and the material properties. For this purpose, a novel stress distribution analytical model (SDAM) is proposed in this paper. The model utilizes two complex stress functions to describe the stress distribution in O-ring seals. The proposed SDAM can express the analytical relationship between the stress distribution and not only Young's modulus but also Poisson's ratio. Finally, comparisons between finite element analysis and the SDAM validate that the proposed model can effectively reveal the stress distribution for different O-ring material properties.

  7. A Validated Reverse Phase HPLC Analytical Method for Quantitation of Glycoalkaloids in Solanum lycocarpum and Its Extracts

    Directory of Open Access Journals (Sweden)

    Renata Fabiane Jorge Tiossi

    2012-01-01

    Solanum lycocarpum (Solanaceae) is native to the Brazilian Cerrado. Fruits of this species contain the glycoalkaloids solasonine (SN) and solamargine (SM), which display antiparasitic and anticancer properties. A method has been developed for the extraction and HPLC-UV analysis of SN and SM in different parts of S. lycocarpum, mainly comprising ripe and unripe fruits, leaf, and stem. This analytical method was validated and gave a good detection response, with linearity over a dynamic range of 0.77–1000.00 μg mL−1 and recovery in the range of 80.92–91.71%, allowing reliable quantitation of the target compounds. Unripe fruits displayed higher concentrations of glycoalkaloids (1.04% ± 0.01 SN and 0.69% ± 0.00 SM) than ripe fruits (0.83% ± 0.02 SN and 0.60% ± 0.01 SM). Quantitation of the glycoalkaloids in the alkaloidic extract gave 45.09% ± 1.14 SN and 44.37% ± 0.60 SM.

  8. Strain accumulation in a prototypic LMFBR nozzle: Experimental and analytical correlation

    International Nuclear Information System (INIS)

    Woodward, W.S.; Dhalia, A.K.; Berton, P.A.

    1986-01-01

    At an early stage in the design of the primary inlet nozzle for the Intermediate Heat Exchanger (IHX) of the Fast Flux Test Facility (FFTF), it was predicted that the inelastic strain accumulation during elevated temperature operation (1050 °F/566 °C) would exceed the ASME Code design allowables. Therefore, a proof test of a prototypic FFTF IHX nozzle was performed in the Westinghouse Creep Ratcheting Test Facility (CRTF) to measure the ratchet strain increments during the most severe postulated FFTF plant thermal transients. In addition, analytical procedures similar to those used in the plant design, were used to predict strain accumulation in the CRTF nozzle. This paper describes how the proof test was successfully completed, and it shows that both the test measurements and analytical predictions confirm that the FFTF IHX nozzle, subjected to postulated thermal and mechanical loadings, complies with the ASME Code strain limits. Also, these results provide a measure of validation for the analytical procedures used in the design of FFTF as well as demonstrate the structural adequacy of the FFTF IHX primary inlet nozzle

  9. Polarimetric and angular light-scattering from dense media: Comparison of a vectorial radiative transfer model with analytical, stochastic and experimental approaches

    International Nuclear Information System (INIS)

    Riviere, Nicolas; Ceolato, Romain; Hespel, Laurent

    2013-01-01

    Our work presents computations via a vectorial radiative transfer model of the polarimetric and angular light scattered by a stratified dense medium with small and intermediate optical thickness. We report the validation of this model using analytical results and different computational methods like stochastic algorithms. Moreover, we check the model with experimental data from a specific scatterometer developed at the Onera. The advantages and disadvantages of a radiative approach are discussed. This paper represents a step toward the characterization of particles in dense media involving multiple scattering. -- Highlights: • A vectorial radiative transfer model to simulate the light scattered by stratified layers is developed. • The vectorial radiative transfer equation is solved using an adding–doubling technique. • The results are compared to analytical and stochastic data. • Validation with experimental data from a scatterometer developed at Onera is presented

  10. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components, either in service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of this wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field strain distributions generated by optical techniques such as digital image correlation and thermoelastic stress analysis, as well as by analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10¹ to 10² image descriptors instead of the 10⁵ or 10⁶ pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
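
    A toy version of the idea (illustrative only): treat two full-field strain maps as images, keep just the low-order Fourier magnitudes as descriptors, and compare those few numbers instead of every pixel; Zernike moments would play the same role on a circular domain:

    ```python
    import numpy as np

    def descriptors(field, n_modes=8):
        """Low-order 2-D Fourier magnitudes of a full-field map."""
        F = np.fft.fftshift(np.fft.fft2(field))
        c0, c1 = F.shape[0] // 2, F.shape[1] // 2
        h = n_modes // 2
        return np.abs(F[c0 - h:c0 + h, c1 - h:c1 + h]).ravel() / field.size

    # Hypothetical "measured" and "predicted" strain fields on a 256 x 256 grid
    x, y = np.meshgrid(np.linspace(-1, 1, 256), np.linspace(-1, 1, 256))
    measured = np.exp(-4.0 * (x**2 + y**2)) \
               + 0.01 * np.random.default_rng(0).normal(size=x.shape)
    predicted = np.exp(-4.2 * (x**2 + y**2))

    d_m, d_p = descriptors(measured), descriptors(predicted)
    # 64 descriptors stand in for ~65k pixels; compare them with a normalised RMS
    print(np.sqrt(np.mean((d_m - d_p)**2)) / np.sqrt(np.mean(d_m**2)))
    ```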

  11. Pre-analytical issues in the haemostasis laboratory: guidance for the clinical laboratories.

    Science.gov (United States)

    Magnette, A; Chatelain, M; Chatelain, B; Ten Cate, H; Mullier, F

    2016-01-01

    Ensuring quality has become a daily requirement in laboratories. In haemostasis, even more than in other disciplines of biology, quality is determined by a pre-analytical step that encompasses all procedures, starting with the formulation of the medical question, and includes patient preparation, sample collection, handling, transportation, processing, and storage until the time of analysis. This step, based on a variety of manual activities, is the most vulnerable part of the total testing process, is a major determinant of the reliability and validity of results in haemostasis, and constitutes the most important source of erroneous or uninterpretable results. Pre-analytical errors may occur throughout the testing process and arise from unsuitable, inappropriate or wrongly handled procedures. Problems may arise during the collection of blood specimens, such as misidentification of the sample, use of inadequate devices or needles, incorrect order of draw, prolonged tourniquet placement, unsuccessful attempts to locate the vein, incorrect use of additive tubes, collection of samples unsuitable in quality or quantity, inappropriate mixing of a sample, etc. Some factors can alter the result of a sample constituent after collection, during transportation, preparation and storage. Laboratory errors can often have serious adverse consequences. Lack of standardized procedures for sample collection accounts for most of the errors encountered within the total testing process. They can also have clinical consequences and a significant impact on patient care, especially for specialized tests, as these are often considered "diagnostic". Controlling pre-analytical variables is critical, since this has a direct influence on the quality of results and on their clinical reliability. The accurate standardization of the pre-analytical phase is of pivotal importance for achieving reliable results of coagulation tests and should reduce the side effects of the influence

  12. Analytical and experimental comparisons of modal properties of a flood water storage tank

    International Nuclear Information System (INIS)

    Thinnes, G.L.; Dooley, W.T.; Gorman, V.W.

    1986-01-01

    Comparisons of measured frequencies, mode shapes, and damping from experimental modal testing and analytical predictions have been performed on a vertically standing 90,000 liter flood water storage tank. The purpose of the study was to compare the accuracy of analytical calculations with experimentally obtained data. The need for this comparison arises because safety assessments of the integrity of such vessels are normally based upon analyses which have not usually been validated by experiments. The tank was excited using random input from an electromagnetic shaker. Data reduction was performed using frequency response functions. Analyses, including modal analysis calculations, were performed on the tank for three water level conditions using finite element methods. Results of the analyses are presented, comparisons to test data are shown, and conclusions and recommendations are made as a result of these studies. 5 refs., 8 figs., 2 tabs

  13. An integrated approach to validation of safeguards and security program performance

    International Nuclear Information System (INIS)

    Altman, W.D.; Hunt, J.S.; Hockert, J.W.

    1988-01-01

    Department of Energy (DOE) requirements for safeguards and security programs are becoming increasingly performance oriented. Master Safeguards and Security Agreements specify performance levels for systems protecting DOE security interests. In order to measure and validate security system performance, Lawrence Livermore National Laboratory (LLNL) has developed cost-effective validation tools and a comprehensive validation approach that synthesizes information gained from different activities, such as force-on-force exercises, limited-scope performance tests, equipment testing, vulnerability analyses, and computer modeling, into an overall assessment of the performance of the protection system. The analytic approach employs logic diagrams adapted from the fault and event trees used in probabilistic risk assessment. The synthesis of the results from the various validation activities is accomplished using a method developed by LLNL based upon Bayes' theorem
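
    The abstract does not spell out the LLNL synthesis method, but the Bayesian building block it rests on is easy to sketch: treat the outcomes of each validation activity as trials and update a prior on a protection element's success probability. A minimal Beta-Binomial illustration with invented test counts:

    ```python
    from scipy.stats import beta

    # Hypothetical pooled outcomes for one protection element, e.g. detections
    # in limited-scope performance tests and force-on-force exercises
    successes, trials = 18, 20

    # Uniform Beta(1, 1) prior on the element's success probability,
    # updated by the binomial likelihood to Beta(1 + s, 1 + f)
    post = beta(1 + successes, 1 + (trials - successes))
    print(post.mean(), post.interval(0.90))  # point estimate, 90% credible interval
    ```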

  14. Analytical Validation of a Highly Quantitative, Sensitive, Accurate, and Reproducible Assay (HERmark®) for the Measurement of HER2 Total Protein and HER2 Homodimers in FFPE Breast Cancer Tumor Specimens

    Directory of Open Access Journals (Sweden)

    Jeffrey S. Larson

    2010-01-01

    We report here the results of the analytical validation of assays that measure HER2 total protein (H2T) and HER2 homodimer (H2D) expression in formalin-fixed paraffin-embedded (FFPE) breast cancer tumors as well as cell line controls. The assays are based on the VeraTag technology platform and are commercially available through a central CAP-accredited clinical reference laboratory. The accuracy of H2T measurements spans a broad dynamic range (2–3 logs), as evaluated by comparison with cross-validating technologies. The measurement of H2T expression demonstrates a sensitivity that is approximately 7–10 times greater than conventional immunohistochemistry (IHC; HercepTest). The HERmark assay is a quantitative assay that sensitively and reproducibly measures continuous H2T and H2D protein expression levels and therefore may have the potential to stratify patients more accurately with respect to response to HER2-targeted therapies than current methods, which rely on semiquantitative protein measurements (IHC) or on indirect assessments of gene amplification (FISH).

  15. D4.1 Learning analytics: theoretical background, methodology and expected results

    NARCIS (Netherlands)

    Tammets, Kairit; Laanpere, Mart; Eradze, Maka; Brouns, Francis; Padrón-Nápoles, Carmen; De Rosa, Rosanna; Ferrari, Chiara

    2014-01-01

    The purpose of the EMMA project is to showcase excellence in innovative teaching methodologies and learning approaches through the large-scale piloting of MOOCs on different subjects. The main objectives related to the implementation of learning analytics in the EMMA project are to: ● develop the

  16. Analytic validation and comparison of three commercial immunoassays for measurement of plasma atrial/A-type natriuretic peptide concentration in horses

    DEFF Research Database (Denmark)

    Trachsel, D S; Schwarzwald, C C; Grenacher, B

    2014-01-01

    Measurement of atrial/A-type natriuretic peptide (ANP) concentrations may be of use for the assessment of cardiac disease, and reliable data on the analytic performance of available assays are needed. To assess the suitability for clinical use of commercially available ANP assays, intra-assay and inter-assay … Bland-Altman analyses. For all assays, precision was moderate but acceptable and dilution parallelism was good. All assays showed analytic performance similar to other immunoassays used in veterinary medicine. However, the results from the three assays were poorly comparable. Our study highlights the need…

  17. Relativistic quantum mechanic calculation of photoionization cross-section of hydrogenic and non-hydrogenic states using analytical potentials

    International Nuclear Information System (INIS)

    Rodriguez, R.; Gil, J.M.; Rubiano, J.G.; Florido, R.; Martel, P.; Minguez, E.

    2005-01-01

    The photoionization process is a subject of special importance in many areas of physics. Numerical methods must be used to obtain photoionization cross-sections for non-hydrogenic levels. The amount of atomic data required to calculate them is huge, so self-consistent calculations increase computing time considerably. Analytical potentials are a useful alternative because they avoid the iterative procedures typical of self-consistent models. In this work, we present a relativistic quantum calculation of photoionization cross-sections for isolated ions based on an analytical potential for obtaining the required atomic data, which is valid for both hydrogenic and non-hydrogenic ions. Comparisons are made between our results and others obtained using either widely used analytical expressions for the cross-sections or more sophisticated calculations.

  18. Validation of the method for investigation of radiopharmaceuticals for in vitro use

    International Nuclear Information System (INIS)

    Vranjes, S.; Jovanovic, M.; Orlic, M.; Lazic, E. E-mail address of corresponding author: sanjav@vin.bg.ac.yu

    2005-01-01

    The aim of this study was to validate an analytical method for the determination of the total radioactivity and radioactive concentration of ¹²⁵I-triiodothyronine, a radiopharmaceutical for in vitro use. The analytical parameters selectivity, accuracy, linearity and range of this method were determined. The values obtained for all parameters are reasonable for analytical methods; therefore, this method can be used for further investigation. (author)

  19. Analytic manifolds in uniform algebras

    International Nuclear Information System (INIS)

    Tonev, T.V.

    1988-12-01

    Here we extend the result of Bear and Hile concerning the version of Bishop's famous theorem for one-dimensional analytic structures in two directions: to n-dimensional complex analytic manifolds, n > 1, and to generalized analytic manifolds. 14 refs.

  20. Analytical Validation of Accelerator Mass Spectrometry for Pharmaceutical Development: the Measurement of Carbon-14 Isotope Ratio

    International Nuclear Information System (INIS)

    Keck, B.D.; Ognibene, T.; Vogel, J.S.

    2010-01-01

    Accelerator mass spectrometry (AMS) is an isotope-based measurement technology that utilizes carbon-14 labeled compounds in the pharmaceutical development process to measure compounds at very low concentrations, empowers microdosing as an investigational tool, and extends the utility of ¹⁴C-labeled compounds to dramatically lower levels. It is a form of isotope ratio mass spectrometry that can provide either measurements of total compound equivalents or, when coupled to separation technology such as chromatography, quantitation of specific compounds. The properties of AMS as a measurement technique are investigated here, and the parameters of method validation are shown. AMS, independent of any separation technique to which it may be coupled, is shown to be accurate, linear, precise, and robust. As the sensitivity and universality of AMS is constantly being explored and expanded, this work underpins many areas of pharmaceutical development, including drug metabolism as well as absorption, distribution and excretion of pharmaceutical compounds, as a fundamental step in drug development. The validation parameters for pharmaceutical analyses were examined for the accelerator mass spectrometry measurement of the ¹⁴C/C ratio, independent of chemical separation procedures. The isotope ratio measurement was specific (owing to the ¹⁴C label), stable across sample storage conditions for at least one year, and linear over 4 orders of magnitude, with an analytical range from one tenth Modern to at least 2000 Modern (instrument specific). Further, accuracy was excellent, between 1 and 3 percent, while precision, expressed as coefficient of variation, is between 1 and 6%, determined primarily by radiocarbon content and the time spent analyzing a sample. Sensitivity, expressed as LOD and LLOQ, was 1 and 10 attomoles of carbon-14 (which can be expressed as compound equivalents); for a typical small molecule labeled at 10% incorporation with ¹⁴C, this corresponds to 30 fg equivalents. AMS

  1. Analytical Validation of Accelerator Mass Spectrometry for Pharmaceutical Development: the Measurement of Carbon-14 Isotope Ratio.

    Energy Technology Data Exchange (ETDEWEB)

    Keck, B D; Ognibene, T; Vogel, J S

    2010-02-05

    Accelerator mass spectrometry (AMS) is an isotope-based measurement technology that utilizes carbon-14 labeled compounds in the pharmaceutical development process to measure compounds at very low concentrations, empowers microdosing as an investigational tool, and extends the utility of ¹⁴C-labeled compounds to dramatically lower levels. It is a form of isotope ratio mass spectrometry that can provide either measurements of total compound equivalents or, when coupled to separation technology such as chromatography, quantitation of specific compounds. The properties of AMS as a measurement technique are investigated here, and the parameters of method validation are shown. AMS, independent of any separation technique to which it may be coupled, is shown to be accurate, linear, precise, and robust. As the sensitivity and universality of AMS is constantly being explored and expanded, this work underpins many areas of pharmaceutical development, including drug metabolism as well as absorption, distribution and excretion of pharmaceutical compounds, as a fundamental step in drug development. The validation parameters for pharmaceutical analyses were examined for the accelerator mass spectrometry measurement of the ¹⁴C/C ratio, independent of chemical separation procedures. The isotope ratio measurement was specific (owing to the ¹⁴C label), stable across sample storage conditions for at least one year, and linear over 4 orders of magnitude, with an analytical range from one tenth Modern to at least 2000 Modern (instrument specific). Further, accuracy was excellent, between 1 and 3 percent, while precision, expressed as coefficient of variation, is between 1 and 6%, determined primarily by radiocarbon content and the time spent analyzing a sample. Sensitivity, expressed as LOD and LLOQ, was 1 and 10 attomoles of carbon-14 (which can be expressed as compound equivalents); for a typical small molecule labeled at 10% incorporation with ¹⁴C, this corresponds to 30 fg

  2. Analytical method development of nifedipine and its degradants binary mixture using high performance liquid chromatography through a quality by design approach

    Science.gov (United States)

    Choiri, S.; Ainurofiq, A.; Ratri, R.; Zulmi, M. U.

    2018-03-01

    Nifedipine (NIF) is a photo-labile drug that degrades easily on exposure to sunlight. This research aimed to develop an analytical method using high-performance liquid chromatography, implemented within a quality by design approach, to obtain an effective, efficient, and validated analytical method for NIF and its degradants. A 2² full factorial design with a curvature (center point) was applied to optimize the analytical conditions for NIF and its degradants. Mobile phase composition (MPC) and flow rate (FR) were the factors examined against the system suitability parameters. The selected condition was validated by cross-validation using a leave-one-out technique. Alteration of the MPC significantly affected retention time, while an increase in FR reduced the tailing factor. In addition, the interaction of both factors increased the theoretical plates and the resolution of NIF and its degradants. The selected analytical condition for NIF and its degradants was validated over the range 1–16 µg/mL, showing good linearity, precision and accuracy, and was efficient, with an analysis time within 10 min.
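
    For illustration (with invented retention times, not the study's data), the main effects, interaction, and curvature check supported by a 2² full factorial with centre points can be computed directly:

    ```python
    import numpy as np

    # Coded 2^2 full factorial in MPC (x1) and FR (x2), plus centre-point replicates
    x1 = np.array([-1, 1, -1, 1])
    x2 = np.array([-1, -1, 1, 1])
    rt = np.array([9.8, 7.1, 8.6, 5.7])      # hypothetical retention times (min)
    rt_center = np.array([7.9, 8.0, 7.8])    # hypothetical centre-point runs

    effect_mpc = rt[x1 == 1].mean() - rt[x1 == -1].mean()
    effect_fr = rt[x2 == 1].mean() - rt[x2 == -1].mean()
    interaction = rt[x1 * x2 == 1].mean() - rt[x1 * x2 == -1].mean()

    # Curvature: factorial mean vs centre mean (non-zero suggests a quadratic term)
    curvature = rt.mean() - rt_center.mean()
    print(effect_mpc, effect_fr, interaction, curvature)
    ```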

  3. Analytical method by high resolution liquid chromatography for the quality control of French Macaw

    International Nuclear Information System (INIS)

    Garcia Penna, Caridad M; Torres Amaro, Leonid; Menendez Castillo, Rosa; Sanchez, Esther; Martinez Espinosa, Vivian; Gonzalez, Maria Lidia; Rodriguez, Carlos

    2007-01-01

    An analytical method by high resolution liquid chromatography with ultraviolet detection at 340 nm, applicable to the quality control of the dried drug of French Macaw (Senna alata L. Roxb.), was developed and validated. The method, used to quantify the main components, sennosides A and B, proved to be specific, linear, precise and accurate. (Author)

  4. Data validation summary report for the 100-HR-3 Round 8, Phases 1 and 2 groundwater sampling task

    International Nuclear Information System (INIS)

    1996-01-01

    This report presents a summary of data validation results on groundwater samples collected for the 100-HR-3 Round 8 Groundwater Sampling task. The analyses performed for this project consisted of: metals, general chemistry, and radiochemistry. The laboratories conducting the analyses were Quanterra Environmental Services (QES) and Lockheed Analytical Services. As required by the contract and the WHC statement of work (WHC 1994), data validation was conducted using the Westinghouse data validation procedures for chemical and radiochemical analyses (WHC 1993a and 1993b). Sample results were validated to levels A and D as described in the data validation procedures. At the completion of validation and verification of each data package, a data validation summary was prepared and transmitted with the original documentation to Environmental Restoration Contract (ERC) for inclusion in the project QA record

  5. 42 CFR 493.571 - Disclosure of accreditation, State and CMS validation inspection results.

    Science.gov (United States)

    2010-10-01

    § 493.571 Disclosure of accreditation, State and CMS validation inspection results. (a) Accreditation organization inspection results. CMS may disclose accreditation organization inspection results to…

  6. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    Science.gov (United States)

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
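
    Since the observed statistic is a proportion, the core data analysis reduces to binomial point estimates and intervals per test concentration. A minimal sketch using the Wilson score interval (one common choice; the report's exact interval method may differ):

    ```python
    import numpy as np

    def poi_wilson(identified, replicates, z=1.96):
        """POI point estimate with an approximate 95% Wilson score interval."""
        p = identified / replicates
        denom = 1 + z**2 / replicates
        centre = (p + z**2 / (2 * replicates)) / denom
        half = z * np.sqrt(p * (1 - p) / replicates
                           + z**2 / (4 * replicates**2)) / denom
        return p, (max(0.0, centre - half), min(1.0, centre + half))

    # e.g. 11 of 12 replicates of the target botanical material identified
    print(poi_wilson(11, 12))
    ```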

  7. Dark matter as a dynamic effect due to a non-minimal gravitational coupling with matter (II): Numerical results

    International Nuclear Information System (INIS)

    Paramos, J; Bertolami, O

    2010-01-01

    Following the previous contribution discussing the rich phenomenology of models possessing a non-minimal coupling between matter and geometry, with emphasis on its characteristics and analytical results, the obtained 'dark matter' mimicking mechanism is numerically studied. This allows for ascertaining the order of magnitude of the relevant parameters, leading to a validation of the analytical results and the discussion of possible cosmological implications and deviation from universality.

  8. Twist-2 at seven loops in planar N=4 SYM theory: full result and analytic properties

    Energy Technology Data Exchange (ETDEWEB)

    Marboe, Christian [School of Mathematics, Trinity College Dublin,College Green, Dublin 2 (Ireland); Institut für Mathematik und Institut für Physik, Humboldt-Universität zu Berlin,IRIS Adlershof, Zum Großen Windkanal 6, 12489 Berlin (Germany); Velizhanin, Vitaly [Theoretical Physics Division, NRC “Kurchatov Institute”,Petersburg Nuclear Physics Institute, Orlova Roscha,Gatchina, 188300 St. Petersburg (Russian Federation); Institut für Mathematik und Institut für Physik, Humboldt-Universität zu Berlin,IRIS Adlershof, Zum Großen Windkanal 6, 12489 Berlin (Germany)

    2016-11-04

    The anomalous dimension of twist-2 operators of arbitrary spin in planar N=4 SYM theory is found at seven loops by using the quantum spectral curve to compute values at fixed spin, and reconstructing the general result using the LLL-algorithm together with modular arithmetic. The result of the analytic continuation to negative spin is presented, and its relation with the recently computed correction to the BFKL and double-logarithmic equation is discussed.
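
    One ingredient of such reconstructions is recovering an exact rational coefficient from its image modulo a large prime; a compact sketch of the standard half-GCD rational-reconstruction step (the LLL-based recombination used in the paper goes beyond this illustration):

    ```python
    from math import isqrt

    def rational_reconstruct(a, m):
        """Recover p/q with |p|, q <= sqrt(m/2) from a = p * q^(-1) mod m."""
        bound = isqrt(m // 2)
        r0, r1 = m, a % m
        s0, s1 = 0, 1
        while r1 > bound:                    # truncated extended Euclid
            q = r0 // r1
            r0, r1 = r1, r0 - q * r1
            s0, s1 = s1, s0 - q * s1
        if s1 == 0 or abs(s1) > bound:
            return None                      # failed; a larger modulus is needed
        return (r1, s1) if s1 > 0 else (-r1, -s1)

    m = 2**31 - 1                            # a large prime modulus
    p, q = 355, 113
    a = p * pow(q, -1, m) % m                # pow(q, -1, m) needs Python >= 3.8
    print(rational_reconstruct(a, m))        # (355, 113)
    ```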

  9. Fast analytical scatter estimation using graphics processing units.

    Science.gov (United States)

    Ingleby, Harry; Lippuner, Jonas; Rickey, Daniel W; Li, Yue; Elbakri, Idris

    2015-01-01

    To develop a fast patient-specific analytical estimator of first-order Compton and Rayleigh scatter in cone-beam computed tomography, implemented using graphics processing units. The authors developed an analytical estimator for first-order Compton and Rayleigh scatter in a cone-beam computed tomography geometry. The estimator was coded using NVIDIA's CUDA environment for execution on an NVIDIA graphics processing unit. Performance of the analytical estimator was validated by comparison with high-count Monte Carlo simulations for two different numerical phantoms. Monoenergetic analytical simulations were compared with monoenergetic and polyenergetic Monte Carlo simulations. Analytical and Monte Carlo scatter estimates were compared both qualitatively, from visual inspection of images and profiles, and quantitatively, using a scaled root-mean-square difference metric. Reconstruction of simulated cone-beam projection data of an anthropomorphic breast phantom illustrated the potential of this method as a component of a scatter correction algorithm. The monoenergetic analytical and Monte Carlo scatter estimates showed very good agreement. The monoenergetic analytical estimates showed good agreement for Compton single scatter and reasonable agreement for Rayleigh single scatter when compared with polyenergetic Monte Carlo estimates. For a voxelized phantom with dimensions 128 × 128 × 128 voxels and a detector with 256 × 256 pixels, the analytical estimator required 669 seconds for a single projection, using a single NVIDIA 9800 GX2 video card. Accounting for first-order scatter in cone-beam image reconstruction improves the contrast-to-noise ratio of the reconstructed images. The analytical scatter estimator, implemented using graphics processing units, provides rapid and accurate estimates of single scatter; with further acceleration and a method to account for multiple scatter, it may be useful for practical scatter correction schemes.

  10. Translating tumor biology into personalized treatment planning: analytical performance characteristics of the Oncotype DX® Colon Cancer Assay

    International Nuclear Information System (INIS)

    Clark-Langone, Kim M; Sangli, Chithra; Krishnakumar, Jayadevi; Watson, Drew

    2010-01-01

    The Oncotype DX® Colon Cancer Assay is a new diagnostic test for determining the likelihood of recurrence in stage II colon cancer patients after surgical resection using fixed paraffin embedded (FPE) primary colon tumor tissue. Like the Oncotype DX Breast Cancer Assay, this is a high complexity, multi-analyte, reverse transcription (RT) polymerase chain reaction (PCR) assay that measures the expression levels of specific cancer-related genes. By capturing the biology underlying each patient's tumor, the Oncotype DX Colon Cancer Assay provides a Recurrence Score (RS) that reflects an individualized risk of disease recurrence. Here we describe its analytical performance using pre-determined performance criteria, which is a critical component of molecular diagnostic test validation. All analytical measurements met pre-specified performance criteria. PCR amplification efficiency for all 12 assays was high, ranging from 96% to 107%, while linearity was demonstrated over an 11 log2 concentration range for all assays. Based on estimated components of variance for FPE RNA pools, analytical reproducibility and precision demonstrated low SDs for individual genes (0.16 to 0.32 CTs), gene groups (≤ 0.05 normalized/aggregate CTs) and RS (≤ 1.38 RS units). Analytical performance characteristics shown here for both individual genes and gene groups in the Oncotype DX Colon Cancer Assay demonstrate consistent translation of specific biology of individual tumors into clinically useful diagnostic information. The results of these studies illustrate how the analytical capability of the Oncotype DX Colon Cancer Assay has enabled clinical validation of a test to determine individualized recurrence risk after colon cancer surgery.

  11. Analytic model for ultrasound energy receivers and their optimal electric loads II: Experimental validation

    Science.gov (United States)

    Gorostiaga, M.; Wapler, M. C.; Wallrabe, U.

    2017-10-01

    In this paper, we verify the two optimal electric load concepts based on the zero reflection condition and on the power maximization approach for ultrasound energy receivers. We test a high loss 1-3 composite transducer, and find that the measurements agree very well with the predictions of the analytic model for plate transducers that we have developed previously. Additionally, we also confirm that the power maximization and zero reflection loads are very different when the losses in the receiver are high. Finally, we compare the optimal load predictions by the KLM and the analytic models with frequency dependent attenuation to evaluate the influence of the viscosity.

  12. Fuzzy-logic based strategy for validation of multiplex methods: example with qualitative GMO assays.

    Science.gov (United States)

    Bellocchi, Gianni; Bertholet, Vincent; Hamels, Sandrine; Moens, W; Remacle, José; Van den Eede, Guy

    2010-02-01

    This paper illustrates the advantages that a fuzzy-based aggregation method can bring to the validation of a multiplex method for GMO detection (DualChip GMO kit, Eppendorf). Guidelines for the validation of chemical, biochemical, pharmaceutical and genetic methods have been developed, and ad hoc validation statistics are available and routinely used for in-house and inter-laboratory testing and for decision-making. Fuzzy logic allows the information obtained from independent validation statistics to be summarised into one synthetic indicator of overall method performance. The microarray technology, introduced for simultaneous identification of multiple GMOs, poses specific validation issues (patterns of performance for a variety of GMOs at different concentrations). A fuzzy-based indicator for overall evaluation is illustrated in this paper and applied to validation data for different genetically modified elements, and observations on the analytical results are discussed. The fuzzy-logic based rules were shown to improve the interpretation of results and to facilitate the overall evaluation of the multiplex method.
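
    A schematic of the aggregation idea (membership functions, weights, and statistics below are invented, not the DualChip study's): map each validation statistic onto [0, 1] through a membership function oriented toward its favourable limit, then combine the memberships with weights into one synthetic indicator:

    ```python
    import numpy as np

    def favourability(value, unfav, fav):
        """Clipped linear membership: 0 at the unfavourable limit, 1 at the favourable."""
        return float(np.clip((value - unfav) / (fav - unfav), 0.0, 1.0))

    # Invented statistics for one GM element: (measured, unfavourable, favourable)
    stats = {
        "sensitivity":         (0.98, 0.90, 1.00),
        "specificity":         (0.96, 0.90, 1.00),
        "false negative rate": (0.02, 0.10, 0.00),
        "false positive rate": (0.05, 0.10, 0.00),
    }
    weights = {"sensitivity": 0.30, "specificity": 0.30,
               "false negative rate": 0.25, "false positive rate": 0.15}

    indicator = sum(weights[k] * favourability(*stats[k]) for k in stats)
    print(f"overall performance indicator: {indicator:.2f}")  # 1.0 = fully favourable
    ```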

  13. Piecewise linear emulator of the nonlinear Schroedinger equation and the resulting analytic solutions for Bose-Einstein condensates

    International Nuclear Information System (INIS)

    Theodorakis, Stavros

    2003-01-01

    We emulate the cubic term Ψ³ in the nonlinear Schroedinger equation by a piecewise linear term, thus reducing the problem to a set of uncoupled linear inhomogeneous differential equations. The resulting analytic expressions constitute an excellent approximation to the exact solutions, as is explicitly shown in the case of the kink, the vortex, and a δ function trap. Such a piecewise linear emulation can be used for any differential equation where the only nonlinearity is a Ψ³ one. In particular, it can be used for the nonlinear Schroedinger equation in the presence of harmonic traps, giving analytic Bose-Einstein condensate solutions that reproduce very accurately the numerically calculated ones in one, two, and three dimensions.

  14. Proficiency Testing by Interlaboratory Comparison Performed in 2010-2015 for Neutron Activation Analysis and Other Analytical Techniques

    International Nuclear Information System (INIS)

    2017-12-01

    The IAEA supports its Member States in increasing the utilization of their research reactors. Small and medium sized reactors are mostly used for neutron activation analysis (NAA). Although markets for NAA laboratories have been identified, demonstration of valid analytical results and of the organizational quality of the work process are preconditions for expanding the stakeholder community, particularly in commercial routine application of this powerful technique. The IAEA has implemented a new mechanism for supporting NAA laboratories in demonstrating their analytical performance through participation in proficiency testing schemes by interlaboratory comparison. This activity makes possible the identification of deviations and non-conformities and their causes, and the implementation of effective approaches to eliminate them. Over 30 laboratories participated between 2010 and 2015 in consecutive proficiency tests organized by the IAEA in conjunction with the Wageningen Evaluating Programmes for Analytical Laboratories (WEPAL) to assess their analytical performance. This publication reports the findings and includes the lessons learned from this activity. An attached CD-ROM contains individual papers from many participating laboratories, sharing their results and the experience gained through participation.

  15. Role of the IAEA's ALMERA network in harmonization of analytical procedures applicable worldwide for radiological emergencies

    International Nuclear Information System (INIS)

    Pitois, A.; Osvath, I.; Tarjan, S.; Groening, M.; Osborn, D.

    2016-01-01

    The International Atomic Energy Agency (IAEA) coordinates and provides analytical support to the worldwide network of Analytical Laboratories for the Measurement of Environmental Radioactivity (ALMERA), consisting at the end of 2015 of 154 laboratories in 85 countries. This network, established by the IAEA in 1995, aims to provide timely and reliable measurement results of environmental radioactivity in routine monitoring and emergency situations. The IAEA supports the ALMERA laboratories in their routine and emergency response environmental monitoring activities by organizing proficiency tests and inter-laboratory comparison exercises, developing validated analytical procedures for environmental radioactivity measurement, and organizing training courses and workshops. The network also acts as a forum for sharing knowledge and expertise. The aim of this paper is to describe the current status of ALMERA analytical method development activities for radiological emergencies and the plans for further development in the field.

  16. An analytic parton shower. Algorithms, implementation and validation

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, Sebastian

    2012-06-15

    The realistic simulation of particle collisions is an indispensable tool to interpret the data measured at high-energy colliders, for example the now running Large Hadron Collider at CERN. Collisions at these colliders are usually simulated in the form of exclusive events. This thesis focuses on the perturbative QCD part involved in the simulation of these events, particularly parton showers and the consistent combination of parton showers and matrix elements. We present an existing parton shower algorithm for emissions off final state partons along with some major improvements. Moreover, we present a new parton shower algorithm for emissions off incoming partons. The aim of these particular algorithms, called analytic parton shower algorithms, is to be able to calculate the probabilities for branchings and for whole events after the event has been generated. This allows a reweighting procedure to be applied after the events have been simulated. We give a detailed description of the algorithms, their implementation and the interfaces to the event generator WHIZARD. Moreover, we discuss the implementation of a MLM-type matching procedure and an interface to the shower and hadronization routines from PYTHIA. Finally, we compare several predictions by our implementation to experimental measurements at LEP, Tevatron and LHC, as well as to predictions obtained using PYTHIA. (orig.)

  17. An analytic parton shower. Algorithms, implementation and validation

    International Nuclear Information System (INIS)

    Schmidt, Sebastian

    2012-06-01

    The realistic simulation of particle collisions is an indispensable tool to interpret the data measured at high-energy colliders, for example the now running Large Hadron Collider at CERN. Collisions at these colliders are usually simulated in the form of exclusive events. This thesis focuses on the perturbative QCD part involved in the simulation of these events, particularly parton showers and the consistent combination of parton showers and matrix elements. We present an existing parton shower algorithm for emissions off final state partons along with some major improvements. Moreover, we present a new parton shower algorithm for emissions off incoming partons. The aim of these particular algorithms, called analytic parton shower algorithms, is to be able to calculate the probabilities for branchings and for whole events after the event has been generated. This allows a reweighting procedure to be applied after the events have been simulated. We give a detailed description of the algorithms, their implementation and the interfaces to the event generator WHIZARD. Moreover, we discuss the implementation of a MLM-type matching procedure and an interface to the shower and hadronization routines from PYTHIA. Finally, we compare several predictions by our implementation to experimental measurements at LEP, Tevatron and LHC, as well as to predictions obtained using PYTHIA. (orig.)

  18. Dependability validation by means of fault injection: method, implementation, application

    International Nuclear Information System (INIS)

    Arlat, Jean

    1990-01-01

    This dissertation presents theoretical and practical results concerning the use of fault injection as a means for testing fault tolerance in the framework of the experimental dependability validation of computer systems. The dissertation first presents the state of the art of published work on fault injection, encompassing both hardware (fault simulation, physical fault injection) and software (mutation testing) issues. Next, the major attributes of fault injection (faults and their activation, experimental readouts and measures) are characterized taking into account: i) the abstraction levels used to represent the system during the various phases of its development (analytical, empirical and physical models), and ii) the validation objectives (verification and evaluation). An evaluation method is subsequently proposed that combines the analytical modeling approaches (Monte Carlo simulations, closed-form expressions, Markov chains) used for the representation of the fault occurrence process and the experimental fault injection approaches (fault simulation and physical injection) characterizing the error processing and fault treatment provided by the fault tolerance mechanisms. An experimental tool - MESSALINE - is then defined and presented. This tool enables physical faults to be injected in a hardware and software prototype of the system to be validated. Finally, the application of MESSALINE for testing two fault-tolerant systems possessing very dissimilar features and the utilization of the experimental results obtained - both as design feedback and for dependability measures evaluation - are used to illustrate the relevance of the method. (author)

  19. Tank 241-T-204, core 188 analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    Nuzum, J.L.

    1997-07-24

    TANK 241-T-204, CORE 188, ANALYTICAL RESULTS FOR THE FINAL REPORT. This document is the final laboratory report for Tank 241-T-204. Push mode core segments were removed from Riser 3 between March 27, 1997, and April 11, 1997. Segments were received and extruded at the 222-S Laboratory. Analyses were performed in accordance with the Tank 241-T-204 Push Mode Core Sampling and Analysis Plan (TSAP) (Winkleman, 1997), the Letter of Instruction for Core Sample Analysis of Tanks 241-T-201, 241-T-202, 241-T-203, and 241-T-204 (LOI) (Bell, 1997), and the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995). None of the subsamples submitted for total alpha activity or differential scanning calorimetry (DSC) analyses exceeded the notification limits stated in the DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group and are not considered in this report.

  20. Analytical model for Stirling cycle machine design

    Energy Technology Data Exchange (ETDEWEB)

    Formosa, F. [Laboratoire SYMME, Universite de Savoie, BP 80439, 74944 Annecy le Vieux Cedex (France); Despesse, G. [Laboratoire Capteurs Actionneurs et Recuperation d' Energie, CEA-LETI-MINATEC, Grenoble (France)

    2010-10-15

    In order to study further the promising free piston Stirling engine architecture, there is a need for an analytical thermodynamic model which could be used in a dynamical analysis for preliminary design. To obtain more realistic values, the models have to take into account the heat losses and irreversibilities of the engine. An analytical model which encompasses the critical flaws of the regenerator and, furthermore, the heat exchanger effectivenesses has been developed. This model has been validated using the whole range of experimental data available from the General Motors GPU-3 Stirling engine prototype. The effects of the technological and operating parameters on Stirling engine performance have been investigated. In addition to the regenerator influence, the effect of the cooler effectiveness is underlined. (author)

  1. WEB ANALYTICS COMBINED WITH EYE TRACKING FOR SUCCESSFUL USER EXPERIENCE DESIGN: A CASE STUDY

    OpenAIRE

    Magdalena BORYS; Monika CZWÓRNÓG; Tomasz RATAJCZYK

    2016-01-01

    The authors propose a new approach for the mobile user experience design process by means of web analytics and eye-tracking. The proposed method was applied to design the LUT mobile website. In the method, to create the mobile website design, data of various users and their behaviour were gathered and analysed using the web analytics tool. Next, based on the findings from web analytics, the mobile prototype for the website was created and validated in eye-tracking usability testing. The analy...

  2. Analytical Results for Scaling Properties of the Spectrum of the Fibonacci Chain

    Science.gov (United States)

    Piéchon, Frédéric; Benakli, Mourad; Jagannathan, Anuradha

    1995-06-01

    We solve the approximate renormalization group found by Niu and Nori for a quasiperiodic tight-binding Hamiltonian on the Fibonacci chain. This enables us to characterize analytically the spectral properties of this model.

  3. Fields, particles and analyticity: recent results or 30 goldberg (ER) variations on B.A.C.H

    International Nuclear Information System (INIS)

    Bros, J.

    1991-01-01

    As is known, Axiomatic Field Theory (A) implies double analyticity of the n-point functions in space-time and energy-momentum Complex Variables (C), with various interconnections by Fourier-Laplace analysis. When the latter is replaced by Harmonic Analysis (H) on spheres and hyperboloids, a new kind of double analyticity results from (A) (i.e. from locality, spectral condition, temperateness and invariance): complex angular momentum is thereby introduced (a missing chapter in (A)). Exploitation of Asymptotic Completeness via Bethe-Salpeter-type equations (B) leads to new developments of the previous theme on (A, C, H) (complex angular momentum) and of other themes on (A, C) (crossing, Haag-Swieca property, etc.). Various aspects of (A) + (B) have been implemented in Constructive Field Theory (composite spectrum, asymptotic properties, etc.) by a combination of specific techniques and of model-independent methods.

  4. Fields, particles and analyticity: recent results or 30 goldberg (ER) variations on B.A.C.H

    Energy Technology Data Exchange (ETDEWEB)

    Bros, J

    1992-12-31

    As is known, Axiomatic Field Theory (A) implies double analyticity of the n-point functions in space-time and energy-momentum Complex Variables (C), with various interconnections by Fourier-Laplace analysis. When the latter is replaced by Harmonic Analysis (H) on spheres and hyperboloids, a new kind of double analyticity results from (A) (i.e. from locality, spectral condition, temperateness and invariance): complex angular momentum is thereby introduced (a missing chapter in (A)). Exploitation of Asymptotic Completeness via Bethe-Salpeter-type equations (B) leads to new developments of the previous theme on (A, C, H) (complex angular momentum) and of other themes on (A, C) (crossing, Haag-Swieca property, etc.). Various aspects of (A) + (B) have been implemented in Constructive Field Theory (composite spectrum, asymptotic properties, etc.) by a combination of specific techniques and of model-independent methods.

  5. Manufacturing data analytics using a virtual factory representation.

    Science.gov (United States)

    Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun

    2017-01-01

    Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise, but also huge data collection and analysis efforts. This paper presents an approach, within the frameworks of Design Science Research Methodology and prototyping, to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing via reduction of the development effort. Manufacturing simulation models are presented both as data analytics applications themselves and as support for other data analytics applications, serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. The virtual factory goes beyond traditional simulation models of factories to include multi-resolution modelling capabilities, thus allowing analysis at varying levels of detail. A path is proposed for implementation of the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.

  6. Croatian Analytical Terminology

    Directory of Open Access Journals (Sweden)

    Kaštelan-Macan, M.

    2008-04-01

    Full Text Available Results of analytical research are necessary in all human activities. They are inevitable in making decisions in environmental chemistry, agriculture, forestry, veterinary medicine, the pharmaceutical industry, and biochemistry. Without analytical measurements the quality of materials and products cannot be assessed, so that analytical chemistry is an essential part of technical sciences and disciplines. The language of Croatian science, and analytical chemistry within it, was one of the goals of our predecessors. Due to the political situation, they did not succeed entirely, but for the scientists in independent Croatia this is a duty, because language is one of the most important features of the Croatian identity. The awareness of the need to introduce Croatian terminology was systematically developed in the second half of the 19th century, along with the founding of scientific societies and the wish of scientists to write their scientific works in Croatian, so that the results of their research might be applied in the economy. Many authors of textbooks from the 19th and the first half of the 20th century contributed to Croatian analytical terminology (F. Rački, B. Šulek, P. Žulić, G. Pexidr, J. Domac, G. Janeček, F. Bubanović, V. Njegovan and others). M. Deželić published the first systematic chemical terminology in 1940, adjusted to the IUPAC recommendations. In the second half of the 20th century textbooks in classic analytical chemistry were written by V. Marjanović-Krajovan, M. Gyiketta-Ogrizek, S. Žilić and others. I. Filipović wrote the General and Inorganic Chemistry textbook and the Laboratory Handbook (in collaboration with P. Sabioncello) and contributed greatly to establishing the terminology in instrumental analytical methods. The sources of Croatian nomenclature in modern analytical chemistry today are translated textbooks by Skoog, West and Holler, as well as by Günzler and Gremlich, and original textbooks by S. Turina, Z.

  7. The impact of pre-analytical variables on the stability of neurofilament proteins in CSF, determined by a novel validated SinglePlex Luminex assay and ELISA.

    Science.gov (United States)

    Koel-Simmelink, Marleen J A; Vennegoor, Anke; Killestein, Joep; Blankenstein, Marinus A; Norgren, Niklas; Korth, Carsten; Teunissen, Charlotte E

    2014-01-15

    Neurofilament (Nf) proteins have been shown to be promising biomarkers for monitoring and predicting disease progression for various neurological diseases. The aim of this study was to evaluate the effects of pre-analytical variables on the concentration of neurofilament heavy (NfH) and neurofilament light (NfL) proteins. For NfH, an in-house newly developed and validated SinglePlex Luminex assay was used; ELISA was used to analyze NfL. For the NfL ELISA assay, the intra- and inter-assay variation was 1.5% and 16.7%, respectively. Analytical performance of the NfH SinglePlex Luminex assay in terms of sensitivity (6.6 pg/mL), recovery in cerebrospinal fluid (CSF) (between 90 and 104%), linearity (from 6.6 to 1250 pg/mL), and inter- and intra-assay variation (<8%) was good. Concentrations of both NfL and NfH appeared not to be negatively affected by blood contamination, repeated freeze-thaw cycles (up to 4), delayed processing (up to 24 hours) or long-term storage at -20°C, 4°C, and room temperature. A decrease in concentration of both neurofilament proteins was observed during storage for up to 21 days at 37°C, which was significant by day 5. The newly developed NfH SinglePlex Luminex assay has a good sensitivity and is robust. Moreover, both NfH and NfL are stable under the most prevalent pre-analytical variations. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. Analytical dose modeling for preclinical proton irradiation of millimetric targets.

    Science.gov (United States)

    Vanstalle, Marie; Constanzo, Julie; Karakaya, Yusuf; Finck, Christian; Rousseau, Marc; Brasse, David

    2018-01-01

    Due to the considerable development of proton radiotherapy, several proton platforms have emerged to irradiate small animals in order to study the biological effectiveness of proton radiation. A dedicated analytical treatment planning tool was developed in this study to accurately calculate the delivered dose given the specific constraints imposed by the small dimensions of the irradiated areas. The treatment planning system (TPS) developed in this study is based on an analytical formulation of the Bragg peak and uses experimental range values of protons. The method was validated after comparison with experimental data from the literature and then compared to Monte Carlo simulations conducted using Geant4. Three examples of treatment planning, performed with phantoms made of water targets and bone-slab insert, were generated with the analytical formulation and Geant4. Each treatment planning was evaluated using dose-volume histograms and gamma index maps. We demonstrate the value of the analytical function for mouse irradiation, which requires a targeting accuracy of 0.1 mm. Using the appropriate database, the analytical modeling limits the errors caused by misestimating the stopping power. For example, 99% of a 1-mm tumor irradiated with a 24-MeV beam receives the prescribed dose. The analytical dose deviations from the prescribed dose remain within the dose tolerances stated by report 62 of the International Commission on Radiation Units and Measurements for all tested configurations. In addition, the gamma index maps show that the highly constrained targeting accuracy of 0.1 mm for mouse irradiation leads to a significant disagreement between Geant4 and the reference. This simulated treatment planning is nevertheless compatible with a targeting accuracy exceeding 0.2 mm, corresponding to rat and rabbit irradiations. Good dose accuracy for millimetric tumors is achieved with the analytical calculation used in this work. These volume sizes are typical in mouse

  9. Analytical Modeling of Triple-Metal Hetero-Dielectric DG SON TFET

    Science.gov (United States)

    Mahajan, Aman; Dash, Dinesh Kumar; Banerjee, Pritha; Sarkar, Subir Kumar

    2018-02-01

    In this paper, a 2-D analytical model of triple-metal hetero-dielectric DG TFET is presented by combining the concepts of triple material gate engineering and hetero-dielectric engineering. Three metals with different work functions are used as both front- and back gate electrodes to modulate the barrier at source/channel and channel/drain interface. In addition to this, front gate dielectric consists of high-K HfO2 at source end and low-K SiO2 at drain side, whereas back gate dielectric is replaced by air to further improve the ON current of the device. Surface potential and electric field of the proposed device are formulated solving 2-D Poisson's equation and Young's approximation. Based on this electric field expression, tunneling current is obtained by using Kane's model. Several device parameters are varied to examine the behavior of the proposed device. The analytical model is validated with TCAD simulation results for proving the accuracy of our proposed model.
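
    For reference, one commonly used local form of Kane's band-to-band tunneling generation rate, of the kind the abstract invokes (prefactors and exponents vary between formulations, so this is indicative only):

        \[
        G_{\mathrm{BTBT}} = A\,\frac{|E|^{2}}{\sqrt{E_{g}}}\,
        \exp\!\left(-\,\frac{B\,E_{g}^{3/2}}{|E|}\right)
        \]
        % E: local electric field obtained from the surface-potential solution,
        % E_g: band gap, A and B: material-dependent Kane parameters.
        % The tunneling current then follows by integrating q*G_BTBT over the
        % tunneling volume at the source/channel junction.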

  10. Distribution of Steps with Finite-Range Interactions: Analytic Approximations and Numerical Results

    Science.gov (United States)

    González, Diego Luis; Jaramillo, Diego Felipe; Téllez, Gabriel; Einstein, T. L.

    2013-03-01

    While most Monte Carlo simulations assume that only nearest-neighbor steps interact elastically, most analytic frameworks (especially the generalized Wigner distribution) posit that each step elastically repels all others. In addition to the elastic repulsions, we allow for possible surface-state-mediated interactions. We investigate analytically and numerically how next-nearest neighbor (NNN) interactions and, more generally, interactions out to q'th nearest neighbor alter the form of the terrace-width distribution and of pair correlation functions (i.e. the sum over n'th neighbor distribution functions), which we investigated recently [2]. For physically plausible interactions, we find modest changes when NNN interactions are included and generally negligible changes when more distant interactions are allowed. We discuss methods for extracting from simulated experimental data the characteristic scale-setting terms in assumed potential forms.
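
    For orientation, the generalized Wigner surmise against which such changes are usually measured (standard notation; fixing unit mean terrace width determines the constants):

        \[
        P_{\varrho}(s) = a\,s^{\varrho}\,e^{-b s^{2}},
        \qquad
        b = \left[\frac{\Gamma\!\left(\tfrac{\varrho+2}{2}\right)}
                       {\Gamma\!\left(\tfrac{\varrho+1}{2}\right)}\right]^{2},
        \qquad
        a = \frac{2\,b^{(\varrho+1)/2}}{\Gamma\!\left(\tfrac{\varrho+1}{2}\right)}
        \]
        % s: terrace width divided by its mean; the exponent rho encodes the
        % dimensionless step-step repulsion strength (A-tilde = rho(rho-2)/4
        % in one common convention).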

  11. Pre-analytical and analytical validations and clinical applications of a miniaturized, simple and cost-effective solid phase extraction combined with LC-MS/MS for the simultaneous determination of catecholamines and metanephrines in spot urine samples.

    Science.gov (United States)

    Li, Xiaoguang Sunny; Li, Shu; Kellermann, Gottfried

    2016-10-01

    It remains a challenge to simultaneously quantify catecholamines and metanephrines in a simple, sensitive and cost-effective manner due to pre-analytical and analytical constraints. Herein, we describe such a method consisting of a miniaturized sample preparation and selective LC-MS/MS detection by the use of second morning spot urine samples. Ten microliters of second morning urine sample were subjected to solid phase extraction on an Oasis HLB microplate upon complexation with phenylboronic acid. The analytes were well-resolved on a Luna PFP column followed by tandem mass spectrometric detection. Full validation and the suitability of spot urine sampling and biological variation were investigated. The extraction recovery and matrix effect are 74.1-97.3% and 84.1-119.0%, respectively. The linearity range is 2.5-500, 0.5-500, 2.5-1250, 2.5-1250 and 0.5-1250 ng/mL for norepinephrine, epinephrine, dopamine, normetanephrine and metanephrine, respectively. The intra- and inter-assay imprecisions are ≤9.4% for spiked quality control samples, and the respective recoveries are 97.2-112.5% and 95.9-104.0%. The Deming regression slope is 0.90-1.08, and the mean Bland-Altman percentage difference is from -3.29 to 11.85 between a published and the proposed method (n=50). The correlation observed between the spot and 24 h urine collections is significant (n=20, p<0.0001, r: 0.84-0.95, slope: 0.61-0.98). No statistical differences are found in day-to-day biological variability (n=20). Reference intervals are established for an apparently healthy population (n=88). The developed method, being practical, sensitive, reliable and cost-effective, is expected to set a new stage for routine testing, basic research and clinical applications. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Validation of Mean Drift Forces Computed with the BEM Code NEMOH

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg

    This report covers a simple investigation of mean drift forces found by use of the boundary element method code NEMOH. The results from NEMOH are compared to analytical results from the literature and to numerical values found from the commercial software package WADAM by DNV-GL. The work was conducted under the project "Mooring Solutions for Large Wave Energy Converters", during "Work Package 4: Full Dynamic Analysis". The validation compares results from a simple sphere and from a vertical cylinder.

  13. Method validation for chemical composition determination by electron microprobe with wavelength dispersive spectrometer

    Science.gov (United States)

    Herrera-Basurto, R.; Mercader-Trejo, F.; Muñoz-Madrigal, N.; Juárez-García, J. M.; Rodriguez-López, A.; Manzano-Ramírez, A.

    2016-07-01

    The main goal of method validation is to demonstrate that the method is suitable for its intended purpose. One of the advantages of analytical method validation is the level of confidence it provides in the measurement results reported to satisfy a specific objective. Elemental composition determination by wavelength dispersive spectrometer (WDS) microanalysis has been used over extremely wide areas, mainly in the field of materials science and in impurity determinations in geological, biological and food samples. However, little information is reported about the validation of the applied methods. Herein, results of the in-house method validation for elemental composition determination by WDS are shown. SRM 482, a binary Cu-Au alloy of different compositions, was used during the validation protocol following the recommendations for method validation proposed by Eurachem. This paper can be taken as a reference for the evaluation of the validation parameters most frequently requested for accreditation under the requirements of the ISO/IEC 17025 standard: selectivity, limit of detection, linear interval, sensitivity, precision, trueness and uncertainty. A model for uncertainty estimation was proposed, including systematic and random errors. In addition, parameters evaluated during the validation process were also considered as part of the uncertainty model.
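
    A minimal sketch of an uncertainty budget of the kind the abstract describes, combining systematic and random components in quadrature per GUM practice; the component names and values are illustrative, not from the validation itself:

        import math

        # Hypothetical standard-uncertainty components for one Cu mass
        # fraction result (all in wt %); values are assumptions.
        components = {
            "repeatability":      0.15,  # random, from replicate measurements
            "reference_material": 0.10,  # systematic, from the SRM certificate
            "calibration":        0.08,  # systematic, from the calibration fit
            "drift":              0.05,  # systematic, instrument drift bound
        }

        u_combined = math.sqrt(sum(u**2 for u in components.values()))
        U_expanded = 2.0 * u_combined  # coverage factor k = 2 (~95 % level)
        print(f"u_c = {u_combined:.3f} wt %, U (k=2) = {U_expanded:.3f} wt %")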

  14. Analytical modeling of Schottky tunneling source impact ionization MOSFET with reduced breakdown voltage

    Directory of Open Access Journals (Sweden)

    Sangeeta Singh

    2016-03-01

    Full Text Available In this paper, we have investigated a novel Schottky tunneling source impact ionization MOSFET (STS-IMOS) to lower the breakdown voltage of the conventional impact ionization MOS (IMOS) and developed an analytical model for the same. In STS-IMOS there is an accumulative effect of both impact ionization and source-induced barrier tunneling. The silicide source offers very low parasitic resistance, the outcome of which is an increase in the voltage drop across the intrinsic region for the same applied bias. This reduces the operating voltage and hence the device exhibits a significant reduction in both breakdown and threshold voltage. STS-IMOS shows high immunity against hot electron damage, and as a result device reliability increases significantly. The analytical model for the impact ionization current (Iii) is developed based on the integration of the ionization integral (M). Similarly, to get the Schottky tunneling current (ITun) expression, the Wentzel–Kramers–Brillouin (WKB) approximation is employed. Analytical models for threshold voltage and subthreshold slope are optimized against Schottky barrier height (ϕB) variation. The expression for the drain current is computed as a function of gate-to-drain bias via an integral expression. It is validated by comparing it with technology computer-aided design (TCAD) simulation results as well. In essence, this analytical framework provides the physical background for better understanding of STS-IMOS and its performance estimation.
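
    For context, textbook forms of the two building blocks the abstract invokes: the ionization integral behind Iii and the WKB tunneling probability behind ITun (notation indicative; the paper's model supplies the device-specific field profiles):

        \[
        M = \frac{1}{1 - \displaystyle\int_{0}^{L_{i}} \alpha\big(E(x)\big)\,dx},
        \qquad
        T_{\mathrm{WKB}} \simeq
        \exp\!\left(-2\int_{x_{1}}^{x_{2}} |\kappa(x)|\,dx\right)
        \]
        % M: avalanche multiplication factor over the intrinsic region L_i,
        % with breakdown as the ionization integral approaches 1;
        % kappa(x): imaginary wavevector under the Schottky barrier between
        % the classical turning points x_1 and x_2.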

  15. Advanced, Analytic, Automated (AAA) Measurement of Engagement during Learning

    Science.gov (United States)

    D'Mello, Sidney; Dieterle, Ed; Duckworth, Angela

    2017-01-01

    It is generally acknowledged that engagement plays a critical role in learning. Unfortunately, the study of engagement has been stymied by a lack of valid and efficient measures. We introduce the advanced, analytic, and automated (AAA) approach to measure engagement at fine-grained temporal resolutions. The AAA measurement approach is grounded in…

  16. Application of advanced nuclear and instrumental analytical techniques for characterisation of environmental materials

    International Nuclear Information System (INIS)

    Sudersanan, M.; Pawaskar, P.B.; Kayasth, S.R.; Kumar, S.C.

    2002-01-01

    Full text: Increasing realisation about the toxic effects of metal ions in environmental materials has given an impetus to research on analytical techniques for their characterization. The large number of analytes present at very low levels has necessitated the use of sensitive, selective and element-specific techniques for their characterization. The concern about precision and accuracy in such analyses, which have a socio-economic bearing, has emphasized the use of Certified Reference Materials and the use of a multi-technique approach for the unambiguous characterization of analytes. The recent work carried out at the Analytical Chemistry Division, BARC, on these aspects is presented in this paper. Increasing use of fossil fuels has led to the generation of large quantities of fly ash, which poses problems of safe disposal. The utilization of these materials for land filling is an attractive option, but the presence of trace amounts of toxic metals like mercury, arsenic, lead etc. may cause environmental problems. In view of the inhomogeneous nature of the material, efficient sample processing is an important factor, in addition to the validation of the results by the use of proper standards. Analysis was carried out on fly ash samples received as reference materials and also as samples from commercial sources using a combination of both nuclear techniques like INAA and RNAA as well as other techniques like AAS, ICPAES, cold vapour AAS for mercury and the hydride generation technique for arsenic. Similar analysis using nuclear techniques was employed for the characterization of air particulates. Biological materials often serve as sensitive indicator materials for pollution measurements. They are also employed for studies on the uptake of toxic metals like U, Th, Cd, Pb, Hg etc. The presence of large amounts of organic materials in them necessitates an appropriate sample dissolution procedure. In view of the possibility of loss of certain analytes like Cd, Hg, As, by high

  17. SEMI-ANALYTIC GALAXY EVOLUTION (SAGE): MODEL CALIBRATION AND BASIC RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Croton, Darren J.; Stevens, Adam R. H.; Tonini, Chiara; Garel, Thibault; Bernyk, Maksym; Bibiano, Antonio; Hodkinson, Luke; Mutch, Simon J.; Poole, Gregory B.; Shattow, Genevieve M. [Centre for Astrophysics and Supercomputing, Swinburne University of Technology, P.O. Box 218, Hawthorn, Victoria 3122 (Australia)

    2016-02-15

    This paper describes a new publicly available codebase for modeling galaxy formation in a cosmological context, the “Semi-Analytic Galaxy Evolution” model, or sage for short. sage is a significant update to the 2006 model of Croton et al. and has been rebuilt to be modular and customizable. The model will run on any N-body simulation whose trees are organized in a supported format and contain a minimum set of basic halo properties. In this work, we present the baryonic prescriptions implemented in sage to describe the formation and evolution of galaxies, and their calibration for three N-body simulations: Millennium, Bolshoi, and GiggleZ. Updated physics include the following: gas accretion, ejection due to feedback, and reincorporation via the galactic fountain; a new gas cooling–radio mode active galactic nucleus (AGN) heating cycle; AGN feedback in the quasar mode; a new treatment of gas in satellite galaxies; and galaxy mergers, disruption, and the build-up of intra-cluster stars. Throughout, we show the results of a common default parameterization on each simulation, with a focus on the local galaxy population.

  18. SEMI-ANALYTIC GALAXY EVOLUTION (SAGE): MODEL CALIBRATION AND BASIC RESULTS

    International Nuclear Information System (INIS)

    Croton, Darren J.; Stevens, Adam R. H.; Tonini, Chiara; Garel, Thibault; Bernyk, Maksym; Bibiano, Antonio; Hodkinson, Luke; Mutch, Simon J.; Poole, Gregory B.; Shattow, Genevieve M.

    2016-01-01

    This paper describes a new publicly available codebase for modeling galaxy formation in a cosmological context, the “Semi-Analytic Galaxy Evolution” model, or sage for short. sage is a significant update to the 2006 model of Croton et al. and has been rebuilt to be modular and customizable. The model will run on any N-body simulation whose trees are organized in a supported format and contain a minimum set of basic halo properties. In this work, we present the baryonic prescriptions implemented in sage to describe the formation and evolution of galaxies, and their calibration for three N-body simulations: Millennium, Bolshoi, and GiggleZ. Updated physics include the following: gas accretion, ejection due to feedback, and reincorporation via the galactic fountain; a new gas cooling–radio mode active galactic nucleus (AGN) heating cycle; AGN feedback in the quasar mode; a new treatment of gas in satellite galaxies; and galaxy mergers, disruption, and the build-up of intra-cluster stars. Throughout, we show the results of a common default parameterization on each simulation, with a focus on the local galaxy population.

  19. Analytical results from salt batch 9 routine DSSHT and SEHT monthly samples

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-06-01

    Strip Effluent Hold Tank (SEHT) and Decontaminated Salt Solution Hold Tank (DSSHT) samples from several of the “microbatches” of Integrated Salt Disposition Project (ISDP) Salt Batch (“Macrobatch”) 9 have been analyzed for 238Pu, 90Sr, 137Cs, cations (Inductively Coupled Plasma Emission Spectroscopy - ICPES), and anions (Ion Chromatography Anions - IC-A). The analytical results from the current microbatch samples are similar to those from previous macrobatch samples. The Cs removal continues to be acceptable, with decontamination factors (DF) averaging 25700 (107% RSD). The bulk chemistry of the DSSHT and SEHT samples do not show any signs of unusual behavior, other than lacking the anticipated degree of dilution that is calculated to occur during Modular Caustic-Side Solvent Extraction Unit (MCU) processing.
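
    A minimal sketch of the decontamination factor (DF) bookkeeping implied by the abstract, where DF is the ratio of feed Cs activity to decontaminated-product activity; all values below are illustrative placeholders, not SRNL data:

        import statistics

        feed_cs137 = 1.0e5                  # hypothetical feed 137Cs activity (arb. units)
        dssht_cs137 = [4.2, 3.6, 3.9, 4.4]  # hypothetical product activities per microbatch

        dfs = [feed_cs137 / x for x in dssht_cs137]
        mean_df = statistics.mean(dfs)
        rsd = 100.0 * statistics.stdev(dfs) / mean_df
        print(f"mean DF = {mean_df:.0f}, RSD = {rsd:.0f}%")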

  20. Simulation analysis of impact tests of steel plate reinforced concrete and reinforced concrete slabs against aircraft impact and its validation with experimental results

    International Nuclear Information System (INIS)

    Sadiq, Muhammad; Xiu Yun, Zhu; Rong, Pan

    2014-01-01

    Highlights: • Simulation analysis is carried out with two constitutive concrete models. • The Winfrith model can better simulate the nonlinear response of concrete than the CSCM model. • The performance of steel plate reinforced concrete is better than reinforced concrete. • The thickness of safety related structures can be reduced by adopting steel plates. • Analysis results, mainly the concrete material models, should be validated. - Abstract: Steel plate reinforced concrete and reinforced concrete structures are used in nuclear power plants for protection against the impact of an aircraft. In order to compare the impact resistance performance of steel plate reinforced concrete and reinforced concrete slab panels, simulation analysis of 1/7.5 scale model impact tests is carried out using the finite element code ANSYS/LS-DYNA. The damage modes of all finite element models, velocity time history curves of the aircraft engine and damage to the aircraft model are compared with the impact test results for steel plate reinforced concrete and reinforced concrete slab panels. The results indicate that the finite element simulation results correlate well with the experimental results, especially for the Winfrith constitutive concrete model. Also, the impact resistance performance of steel plate reinforced concrete slab panels is better than that of reinforced concrete slab panels; in particular, the rear face steel plate is very effective in preventing the perforation and scabbing of concrete compared with conventional reinforced concrete structures. In this way, the thickness of steel plate reinforced concrete structures can be reduced in important structures like nuclear power plants against the impact of an aircraft. The study also demonstrates the methodology to validate the analysis procedure with experimental and analytical studies, which may be effectively employed to predict the precise response of safety related structures against aircraft impact.

  1. Framework for pedagogical learning analytics

    OpenAIRE

    Heilala, Ville

    2018-01-01

    Learning analytics is an emergent technological practice and a multidisciplinary scientific discipline whose goal is to facilitate effective learning and knowledge of learning. In this design science research, I combine the knowledge discovery process, a concept of pedagogical knowledge, the ethics of learning analytics and microservice architecture. The result is a framework for pedagogical learning analytics. The framework is applied and evaluated in the context of agency analytics. The framework ...

  2. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  3. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the
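
    Both records above recommend code-verification benchmarks based on manufactured solutions. A minimal sketch of the method of manufactured solutions (MMS) using sympy; the chosen exact solution and operator are an arbitrary illustration, not from the paper:

        import sympy as sp

        x, t, k = sp.symbols("x t k")
        u_manufactured = sp.sin(sp.pi * x) * sp.exp(-t)   # chosen exact solution

        # Target operator: u_t - k * u_xx = S. Manufacture the source term S
        # that forces the chosen u to be the exact solution; feeding (u, S)
        # to the code under test lets one verify its order of accuracy.
        S = sp.diff(u_manufactured, t) - k * sp.diff(u_manufactured, x, 2)
        print(sp.simplify(S))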

  4. Macro elemental analysis of food samples by nuclear analytical technique

    Science.gov (United States)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analytical technique compared with other detection methods. Thus, EDXRF spectrometry is applicable for food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological functions. Therefore, the determination of Ca and K content in various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF for food analysis. The analytical performance of non-destructive EDXRF was compared with other analytical techniques: neutron activation analysis and atomic absorption spectrometry. The comparison of methods was performed as a cross-check of the analysis results and to overcome the limitations of the three methods. Analysis results showed that Ca contents found in food using EDXRF and AAS were not significantly different, with p-value 0.9687, whereas the p-value of K between EDXRF and NAA was 0.6575. The correlation between those results was also examined. The Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore the EDXRF method can be used as an alternative method for the determination of Ca and K in food samples.
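
    A minimal sketch, with made-up numbers, of the comparison statistics quoted above (a Pearson correlation plus a paired-test p-value of the kind used to judge whether two methods differ significantly):

        from scipy import stats

        # Hypothetical paired Ca results for the same food samples (mg/100 g).
        edxrf = [410.0, 388.0, 512.0, 450.0, 305.0]
        aas   = [405.0, 395.0, 505.0, 458.0, 300.0]

        r, _ = stats.pearsonr(edxrf, aas)     # agreement between methods
        t, p = stats.ttest_rel(edxrf, aas)    # paired test across samples
        print(f"Pearson r = {r:.4f}, paired t-test p = {p:.4f}")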

  5. Validation of Slosh Modeling Approach Using STAR-CCM+

    Science.gov (United States)

    Benson, David J.; Ng, Wanyi

    2018-01-01

    Without an adequate understanding of propellant slosh, the spacecraft attitude control system may be inadequate to control the spacecraft, or there may be an unexpected loss of science observation time due to higher slosh settling times. Computational fluid dynamics (CFD) is used to model propellant slosh. STAR-CCM+ is a commercially available CFD code. This paper seeks to validate the CFD modeling approach via a comparison between STAR-CCM+ liquid slosh modeling results and experimentally, empirically, and analytically derived results. The geometries examined are a bare right cylinder tank and a right cylinder with a single ring baffle.

  6. Why do ultrasoft repulsive particles cluster and crystallize? Analytical results from density-functional theory.

    Science.gov (United States)

    Likos, Christos N; Mladek, Bianca M; Gottwald, Dieter; Kahl, Gerhard

    2007-06-14

    We demonstrate the accuracy of the hypernetted chain closure and of the mean-field approximation for the calculation of the fluid-state properties of systems interacting by means of bounded and positive pair potentials with oscillating Fourier transforms. Subsequently, we prove the validity of a bilinear, random-phase density functional for arbitrary inhomogeneous phases of the same systems. On the basis of this functional, we calculate analytically the freezing parameters of the latter. We demonstrate explicitly that the stable crystals feature a lattice constant that is independent of density and whose value is dictated by the position of the negative minimum of the Fourier transform of the pair potential. This property is equivalent with the existence of clusters, whose population scales proportionally to the density. We establish that regardless of the form of the interaction potential and of the location on the freezing line, all cluster crystals have a universal Lindemann ratio Lf=0.189 at freezing. We further make an explicit link between the aforementioned density functional and the harmonic theory of crystals. This allows us to establish an equivalence between the emergence of clusters and the existence of negative Fourier components of the interaction potential. Finally, we make a connection between the class of models at hand and the system of infinite-dimensional hard spheres, when the limits of interaction steepness and space dimension are both taken to infinity in a particularly described fashion.
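
    Schematically, in the mean-field (random-phase) language of the abstract (a sketch of the functional and the selection rule, not the paper's full derivation):

        \[
        \beta F_{\mathrm{ex}}[\rho] =
        \frac{1}{2}\int d\mathbf{r}\int d\mathbf{r}'\,
        \rho(\mathbf{r})\,\rho(\mathbf{r}')\,\beta v(|\mathbf{r}-\mathbf{r}'|)
        \]
        % Freezing is driven by the wavenumber k_* at which the Fourier
        % transform of v(r) attains its negative minimum; the lattice constant
        % a ~ 2*pi/k_* is therefore density-independent, and the cluster
        % occupancy grows proportionally to the overall density rho.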

  7. Development and validation of a spectroscopic method for the ...

    African Journals Online (AJOL)

    Development and validation of a spectroscopic method for the simultaneous analysis of ... advanced analytical methods such as high pressure liquid ..... equipment. DECLARATIONS ... high-performance liquid chromatography. J Chromatogr.

  8. Assessment model validity document - HYDRASTAR. A stochastic continuum program for groundwater flow

    Energy Technology Data Exchange (ETDEWEB)

    Gylling, B. [Kemakta Konsult AB, Stockholm (Sweden); Eriksson, Lars [Equa Simulation AB, Sundbyberg (Sweden)

    2001-12-01

    The present document addresses validation of the stochastic continuum model HYDRASTAR designed for Monte Carlo simulations of groundwater flow in fractured rocks. Here, validation is defined as a process to demonstrate that a model concept is fit for its purpose. Preferably, the validation is carried out by comparison of model predictions with independent field observations and experimental measurements. In addition, other sources can also be used to confirm that the model concept gives acceptable results. One method is to compare results with the ones achieved using other model concepts for the same set of input data. Another method is to compare model results with analytical solutions. The model concept HYDRASTAR has been used in several studies, including performance assessments of hypothetical repositories for spent nuclear fuel. In the performance assessments, the main tasks for HYDRASTAR have been to calculate groundwater travel time distributions, repository flux distributions, path lines and their exit locations. The results have then been used by other model concepts to calculate the near field release and far field transport. The aim and framework for the validation process include describing the applicability of the model concept for its purpose in order to build confidence in the concept. Preferably, this is made by comparisons of simulation results with the corresponding field experiments or field measurements. Here, two comparisons with experimental results are reported. In both cases the agreement was reasonably fair. In the broader and more general context of the validation process, HYDRASTAR results have been compared with other models and analytical solutions. Commonly, the approximation calculations agree well with the medians of model ensemble results. Additional indications that HYDRASTAR is suitable for its purpose were obtained from the comparisons with results from other model concepts. Several verification studies have been made for

  9. Self-adaptive numerical integrator for analytic functions

    International Nuclear Information System (INIS)

    Garribba, S.; Quartapelle, L.; Reina, G.

    1978-01-01

    A new adaptive algorithm for the integration of analytical functions is presented. The algorithm processes the integration interval by generating local subintervals whose length is controlled through a feedback loop. The control is obtained by means of a relation derived on an analytical basis and valid for an arbitrary integration rule: two different estimates of an integral are used to compute the interval length necessary to obtain an integral estimate with accuracy within the assigned error bounds. The implied method for local generation of subintervals and an effective assumption of error partition among subintervals give rise to an adaptive algorithm provided with a highly accurate and very efficient integration procedure. The particular algorithm obtained by choosing the 6-point Gauss-Legendre integration rule is considered and extensive comparisons are made with other outstanding integration algorithms
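
    A minimal sketch in the spirit of the described algorithm (not the authors' implementation): two Gauss-Legendre estimates of different order supply the local error estimate that drives the feedback loop controlling subinterval length, with the error budget partitioned in proportion to subinterval length.

        import numpy as np

        def gauss(f, a, b, n):
            """Fixed-order Gauss-Legendre estimate of the integral of f on [a, b]."""
            xs, ws = np.polynomial.legendre.leggauss(n)
            mid, half = 0.5 * (a + b), 0.5 * (b - a)
            return half * np.sum(ws * f(mid + half * xs))

        def adaptive(f, a, b, tol):
            total, x, h = 0.0, a, b - a
            while x < b:
                h = min(h, b - x)
                coarse, fine = gauss(f, x, x + h, 3), gauss(f, x, x + h, 6)
                if abs(fine - coarse) <= tol * h / (b - a):  # error share per length
                    total, x, h = total + fine, x + h, 2.0 * h   # accept, grow step
                else:
                    h *= 0.5                                      # reject, shrink
            return total

        print(adaptive(np.exp, 0.0, 1.0, 1e-10))   # ~ e - 1 = 1.7182818...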

  10. Two analytical models for evaluating performance of Gigabit Ethernet Hosts

    International Nuclear Information System (INIS)

    Salah, K.

    2006-01-01

    Two analytical models are developed to study the impact of interrupt overhead on operating system performance of network hosts when subjected to Gigabit network traffic. Under heavy network traffic, the system performance will be negatively affected due to interrupt overhead caused by incoming traffic. In particular, excessive latency and significant degradation in system throughput can be experienced. Also, user applications may livelock as the CPU power is mostly consumed by interrupt handling and protocol processing. In this paper we present and compare two analytical models that capture host behavior and evaluate its performance. The first model is based on Markov processes and queuing theory, while the second, which is more accurate but more complex, is a pure Markov process. For the most part both models give mathematically equivalent closed-form solutions for a number of important system performance metrics. These metrics include throughput, latency and stability condition, CPU utilization of interrupt handling and protocol processing, and CPU availability for user applications. The analysis yields insight into understanding and predicting the impact of system and network choices on the performance of interrupt-driven systems when subjected to light and heavy network loads. More importantly, our analytical work can also be valuable in improving host performance. The paper gives guidelines and recommendations to address design and implementation issues. Simulation and reported experimental results show that our analytical models are valid and give a good approximation. (author)
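
    A deliberately simplified toy model (far cruder than the paper's Markov analysis) that still reproduces the qualitative livelock behaviour described above: interrupt handling preempts protocol processing, so as the arrival rate grows the CPU share left for the protocol stack, and hence the delivered throughput, collapses. Both cost constants are assumptions.

        T_INT = 5e-6     # assumed cost of handling one interrupt (s)
        T_PROTO = 10e-6  # assumed protocol-processing cost per packet (s)

        def throughput(lam):
            """Delivered packets/s for offered load lam (packets/s)."""
            cpu_for_protocol = max(0.0, 1.0 - lam * T_INT)  # interrupts preempt
            return min(lam, cpu_for_protocol / T_PROTO)

        for lam in (10e3, 50e3, 100e3, 150e3, 190e3):
            print(f"arrival {lam:8.0f} pps -> delivered {throughput(lam):8.0f} pps")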

  11. Validation of analytical methods for the quality control of Naproxen suppositories

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar; Hernandez Contreras, Orestes Yuniel

    2011-01-01

    The analysis methods that will be used for the quality control of the future Cuban-made Naproxen suppositories for adults and children were developed for the first time in this paper. A method based on direct ultraviolet spectrophotometry was put forward, which proved to be specific, linear, accurate and precise for the quality control of Naproxen suppositories, taking into account the presence of chromophore groups in their structure. Likewise, the direct semi-aqueous acid-base volumetry method aimed at the quality control of the Naproxen raw material was changed and adapted to the quality control of suppositories. The validation process demonstrated the adequate specificity of this method with respect to the formulation components, as well as its linearity, accuracy and precision in the 1-3 mg/mL range. The final results of both methods were compared and no statistically significant differences among the replicates for each dose were found; therefore, both may be used in the quality control of Naproxen suppositories.

  12. Analytical modeling of pressure transient behavior for coalbed methane transport in anisotropic media

    International Nuclear Information System (INIS)

    Wang, Lei; Wang, Xiaodong

    2014-01-01

    Owing to the anisotropic nature of coal media, it is meaningful to evaluate pressure transient behavior and flow characteristics within coals. In this article, a complete analytical model called the elliptical flow model is established by combining the theory of elliptical flow in anisotropic media and Fick's laws for the diffusion of coalbed methane. To investigate pressure transient behavior, analytical solutions were first obtained through introducing a series of special functions (Mathieu functions), which are extremely complex and hard to calculate. Thus, a computer program was developed to establish type curves, on which the effects of the parameters, including the anisotropy coefficient, storage coefficient, transfer coefficient and rate constant, were analyzed in detail. Calculation results show that the existence of anisotropy would cause great pressure depletion. To validate the new analytical solutions, previous results were used for comparison, and good agreement was found between the solutions obtained in this work and those in the literature. Finally, a case study is used to explain the effects of the parameters, including the rock total compressibility coefficient, coal medium porosity and anisotropic permeability, sorption time constant, Langmuir volume and fluid viscosity, on bottom-hole pressure behavior. It is necessary to coordinate these parameters so as to reduce the pressure depletion. (paper)

  13. Critical evaluation of analytical models for stochastic heating in dual-frequency capacitive discharges

    International Nuclear Information System (INIS)

    Sharma, S; Turner, M M

    2013-01-01

    Dual-frequency capacitive discharges are widespread in the semiconductor industry and are used, for example, in etching of semiconductor materials to manufacture microchips. In low-pressure dual radio-frequency capacitive discharges, stochastic heating is an important phenomenon. Recent theoretical work on this problem using several different approaches has produced results that are broadly in agreement insofar as scaling with the discharge parameters is concerned, but there remains some disagreement in detail concerning the absolute size of the effect for the case of dual-frequency capacitive discharges. In this work, we investigate the dependence of stochastic heating on various discharge parameters with the help of particle-in-cell (PIC) simulation. The dual-frequency analytical models are in fair agreement with PIC results for values of the low-frequency current density amplitude J_lf (or dimensionless control parameter H_lf ∼ 5) typical of many modern experiments. However, for higher values of J_lf (or higher H_lf), new physical phenomena (like field reversal, reflection of ions, etc.) appear and the simulation results deviate from existing dual-frequency analytical models. On the other hand, for lower J_lf (or lower H_lf) the simulation results again deviate from the analytical models. This work thus produces a relatively extensive set of simulation data that may be used to validate theories over a wide range of parameters. (paper)

  14. Vibration Based Diagnosis for Planetary Gearboxes Using an Analytical Model

    Directory of Open Access Journals (Sweden)

    Liu Hong

    2016-01-01

    Full Text Available The application of conventional vibration-based diagnostic techniques to planetary gearboxes is a challenge because of the complexity of frequency components in the measured spectrum, which is the result of relative motions between the rotary planets and the fixed accelerometer. In practice, since the fault signatures are usually contaminated by noises and vibrations from other mechanical components of gearboxes, the diagnostic efficacy may further deteriorate. Thus, it is essential to develop a novel vibration-based scheme to diagnose gear failures for planetary gearboxes. Following a brief literature review, the paper begins with the introduction of an analytical model of planetary gear-sets developed by the authors in previous works, which can predict the distinct behaviors of fault-introduced sidebands. This analytical model is easy to implement because the only prerequisite information is the basic geometry of the planetary gear-set. Afterwards, an automated diagnostic scheme is proposed to cope with the challenges associated with the characteristic configuration of planetary gearboxes. The proposed vibration-based scheme integrates the analytical model, a denoising algorithm, and frequency domain indicators into one synergistic system for the detection and identification of damaged gear teeth in planetary gearboxes. Its performance is validated with dynamic simulations and experimental data from a planetary gearbox test rig.
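
    A minimal sketch of the kinematic bookkeeping such an analytical model starts from (a fixed ring gear is assumed; the tooth counts and input speed are illustrative, not the paper's test rig, and fault-frequency conventions vary across the literature):

        Z_SUN, Z_RING, N_PLANETS = 20, 80, 4
        f_sun = 30.0                                   # sun (input) speed, Hz

        f_carrier = f_sun * Z_SUN / (Z_SUN + Z_RING)   # fixed-ring kinematics
        f_mesh = f_carrier * Z_RING                    # gear-mesh frequency
        Z_PLANET = (Z_RING - Z_SUN) // 2               # standard geometry relation

        # Local-fault frequencies that modulate the mesh tone into sidebands:
        f_ring_fault = N_PLANETS * f_carrier           # every planet passes the flaw
        f_sun_fault = N_PLANETS * (f_sun - f_carrier)  # sun speed relative to carrier
        f_planet_fault = f_mesh / Z_PLANET             # per mesh; conventions vary

        print(f"f_carrier = {f_carrier:.2f} Hz, f_mesh = {f_mesh:.1f} Hz")
        print(f"ring fault {f_ring_fault:.1f} Hz, sun fault {f_sun_fault:.1f} Hz, "
              f"planet fault {f_planet_fault:.1f} Hz")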

  15. Microfluidic paper-based analytical devices for potential use in quantitative and direct detection of disease biomarkers in clinical analysis.

    Science.gov (United States)

    Lim, Wei Yin; Goh, Boon Tong; Khor, Sook Mei

    2017-08-15

    Clinicians, working in the health-care diagnostic systems of developing countries, currently face the challenges of rising costs, increased number of patient visits, and limited resources. A significant trend is using low-cost substrates to develop microfluidic devices for diagnostic purposes. Various fabrication techniques, materials, and detection methods have been explored to develop these devices. Microfluidic paper-based analytical devices (μPADs) have gained attention for sensing multiplex analytes, confirming diagnostic test results, rapid sample analysis, and reducing the volume of samples and analytical reagents. μPADs, which can provide accurate and reliable direct measurement without sample pretreatment, can reduce patient medical burden and yield rapid test results, aiding physicians in choosing appropriate treatment. The objectives of this review are to provide an overview of the strategies used for developing paper-based sensors with enhanced analytical performances and to discuss the current challenges, limitations, advantages, disadvantages, and future prospects of paper-based microfluidic platforms in clinical diagnostics. μPADs, with validated and justified analytical performances, can potentially improve the quality of life by providing inexpensive, rapid, portable, biodegradable, and reliable diagnostics. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Surrogate analyte approach for quantitation of endogenous NAD⁺ in human acidified blood samples using liquid chromatography coupled with electrospray ionization tandem mass spectrometry.

    Science.gov (United States)

    Liu, Liling; Cui, Zhiyi; Deng, Yuzhong; Dean, Brian; Hop, Cornelis E C A; Liang, Xiaorong

    2016-02-01

    A high-performance liquid chromatography tandem mass spectrometry (LC-MS/MS) assay for the quantitative determination of NAD⁺ in human whole blood using a surrogate analyte approach was developed and validated. Human whole blood was acidified using 0.5 N perchloric acid at a ratio of 1:3 (v:v, blood:perchloric acid) during sample collection. 25 μL of acidified blood was extracted using a protein precipitation method and the resulting extracts were analyzed using reverse-phase chromatography and positive electrospray ionization mass spectrometry. ¹³C₅-NAD⁺ was used as the surrogate analyte for the authentic analyte, NAD⁺. The standard curve ranging from 0.250 to 25.0 μg/mL in acidified human blood for ¹³C₅-NAD⁺ was fitted to a 1/x² weighted linear regression model. The LC-MS/MS response between the surrogate analyte and the authentic analyte at the same concentration was obtained before and after the batch run. This response factor was not applied when determining the NAD⁺ concentration from the ¹³C₅-NAD⁺ standard curve, since the percent difference was less than 5%. The precision and accuracy of the LC-MS/MS assay based on the five analytical QC levels were well within the acceptance criteria from both FDA and EMA guidance for bioanalytical method validation. The average extraction recovery of ¹³C₅-NAD⁺ was 94.6% across the curve range. The matrix factor was 0.99 for both high and low QC, indicating minimal ion suppression or enhancement. The validated assay was used to measure the baseline level of NAD⁺ in 29 male and 21 female human subjects. This assay was also used to study the circadian effect on the endogenous level of NAD⁺ in 10 human subjects. Copyright © 2015 Elsevier B.V. All rights reserved.
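
    As a minimal, generic sketch of the 1/x²-weighted linear calibration described above (synthetic standard concentrations and responses; not data from the study):

      # 1/x^2-weighted linear calibration curve (synthetic, illustrative data).
      import numpy as np

      conc = np.array([0.25, 0.5, 2.5, 10.0, 25.0])         # ug/mL (assumed)
      resp = np.array([0.021, 0.043, 0.212, 0.853, 2.115])  # area ratios (assumed)

      w = 1.0 / conc**2                       # 1/x^2 weights
      X = np.column_stack([np.ones_like(conc), conc])
      W = np.diag(w)
      # Weighted least squares: beta = (X^T W X)^-1 (X^T W y)
      intercept, slope = np.linalg.solve(X.T @ W @ X, X.T @ W @ resp)
      print(f"y = {slope:.5f}x + {intercept:.5f}")

      # Back-calculate an unknown sample from its measured response:
      print(f"estimated conc: {(1.234 - intercept) / slope:.3f} ug/mL")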

  17. Psychometric Validation of Stress and Compliance Scale for ...

    African Journals Online (AJOL)

    Purpose: To provide factorial analytical findings, and to construct validation and normative .... participants were asked to fold the questionnaire ... both cross loaded onto both factors. .... Compas B, Grant K. Ey S. Psychosocial stress and child.

  18. Cellular Scanning Strategy for Selective Laser Melting: Capturing Thermal Trends with a Low-Fidelity, Pseudo-Analytical Model

    Directory of Open Access Journals (Sweden)

    Sankhya Mohanty

    2014-01-01

    Full Text Available Simulations of additive manufacturing processes are known to be computationally expensive. The resulting large runtimes prohibit their application in secondary analyses requiring several complete simulations, such as optimization studies and sensitivity analyses. In this paper, a low-fidelity pseudo-analytical model is introduced to enable such secondary analyses. The model mimics a finite element model and captures the thermal trends associated with the process. The model has been validated and subsequently applied in a small optimization case study. The pseudo-analytical modelling technique is established as a fast tool for primary modelling investigations.
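
    One classical low-fidelity analytical building block in this spirit is the Rosenthal moving point-source solution; a minimal sketch follows (this is not the authors' model, and the material and process values below are assumptions):

      # Quasi-steady Rosenthal point-source temperature field, a common
      # low-fidelity analytical stand-in for laser-melting thermal trends.
      import numpy as np

      def rosenthal_T(x, y, z, t, P=200.0, eta=0.35, v=1.0,
                      k=20.0, alpha=5e-6, T0=300.0):
          """Temperature (K) for absorbed power eta*P (W) moving at v (m/s)
          along +x; k in W/(m K), alpha in m^2/s (all values assumed)."""
          w = x - v * t                       # moving-frame coordinate
          R = np.sqrt(w**2 + y**2 + z**2)
          return T0 + eta * P / (2.0 * np.pi * k * R) \
                 * np.exp(-v * (R + w) / (2.0 * alpha))

      # Surface temperature 200 um behind the beam centre at t = 0:
      print(rosenthal_T(x=-200e-6, y=0.0, z=0.0, t=0.0))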

  19. 42 CFR 476.85 - Conclusive effect of QIO initial denial determinations and changes as a result of DRG validations.

    Science.gov (United States)

    2010-10-01

    ... determinations and changes as a result of DRG validations. 476.85 Section 476.85 Public Health CENTERS FOR... denial determinations and changes as a result of DRG validations. A QIO initial denial determination or change as a result of DRG validation is final and binding unless, in accordance with the procedures in...

  20. Experimental, numerical, and analytical studies on the seismic response of steel-plate concrete (SC) composite shear walls

    Science.gov (United States)

    Epackachi, Siamak

    The seismic performance of rectangular steel-plate concrete (SC) composite shear walls is assessed for application to buildings and mission-critical infrastructure. The SC walls considered in this study were composed of two steel faceplates and infill concrete. The steel faceplates were connected together and to the infill concrete using tie rods and headed studs, respectively. The research focused on the in-plane behavior of flexure- and flexure-shear-critical SC walls. An experimental program was executed in the NEES laboratory at the University at Buffalo and was followed by numerical and analytical studies. In the experimental program, four large-size specimens were tested under displacement-controlled cyclic loading. The design variables considered in the testing program included wall thickness, reinforcement ratio, and slenderness ratio. The aspect ratio (height-to-length) of the four walls was 1.0. Each SC wall was installed on top of a re-usable foundation block. A bolted baseplate to RC foundation connection was used for all four walls. The walls were identified to be flexure- and flexure-shear critical. The progression of damage in the four walls was identical, namely, cracking and crushing of the infill concrete at the toes of the walls, outward buckling and yielding of the steel faceplates near the base of the wall, and tearing of the faceplates at their junctions with the baseplate. A robust finite element model was developed in LS-DYNA for nonlinear cyclic analysis of the flexure- and flexure-shear-critical SC walls. The DYNA model was validated using the results of the cyclic tests of the four SC walls. The validated and benchmarked models were then used to conduct a parametric study, which investigated the effects of wall aspect ratio, reinforcement ratio, wall thickness, and uniaxial concrete compressive strength on the in-plane response of SC walls. Simplified analytical models, suitable for preliminary analysis and design of SC walls, were also developed.

  1. Analysis of environmental contamination resulting from catastrophic incidents: part 2. Building laboratory capability by selecting and developing analytical methodologies.

    Science.gov (United States)

    Magnuson, Matthew; Campisano, Romy; Griggs, John; Fitz-James, Schatzi; Hall, Kathy; Mapp, Latisha; Mullins, Marissa; Nichols, Tonya; Shah, Sanjiv; Silvestri, Erin; Smith, Terry; Willison, Stuart; Ernst, Hiba

    2014-11-01

    Catastrophic incidents can generate a large number of samples of analytically diverse types, including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for sample analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to acceptable levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high quality analytical results. This paper illustrates the result of applying this principle, in the form of a compendium of analytical methods for contaminants of interest. The compendium is based on experience with actual incidents, where appropriate and available. This paper also discusses efforts aimed at adaptation of existing methods to increase fitness-for-purpose and development of innovative methods when necessary. The contaminants of interest are primarily those potentially released through catastrophes resulting from malicious activity

  2. Analytical model and design of spoke-type permanent-magnet machines accounting for saturation and nonlinearity of magnetic bridges

    Science.gov (United States)

    Liang, Peixin; Chai, Feng; Bi, Yunlong; Pei, Yulong; Cheng, Shukang

    2016-11-01

    Based on a subdomain model, this paper presents an analytical method for predicting the no-load magnetic field distribution, back-EMF and torque in general spoke-type motors with magnetic bridges. Taking into account the saturation and nonlinearity of the magnetic material, the magnetic bridges are treated as equivalent fan-shaped saturation regions. To obtain standard boundary conditions, a lumped-parameter magnetic circuit model and an iterative method are employed to calculate the permeability. The final field domain is divided into five types of simple subdomains. Based on the method of separation of variables, the analytical expression for each subdomain is derived. The analytical results for the magnetic field distribution, back-EMF and torque are verified by the finite element method, which confirms the validity of the proposed model for facilitating motor design and optimization.
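
    The separation-of-variables step rests on the general solution of Laplace's equation in polar coordinates, whose generic form (a standard textbook result, not the paper's specific subdomain expressions) is:

      % General separated solution of Laplace's equation for the magnetic
      % vector potential in polar coordinates (generic textbook form):
      \nabla^2 A = \frac{\partial^2 A}{\partial r^2}
        + \frac{1}{r}\frac{\partial A}{\partial r}
        + \frac{1}{r^2}\frac{\partial^2 A}{\partial\theta^2} = 0,
      \qquad
      A(r,\theta) = A_0 + B_0 \ln r
        + \sum_{n=1}^{\infty}\bigl(a_n r^{n} + b_n r^{-n}\bigr)
          \bigl(c_n \cos n\theta + d_n \sin n\theta\bigr),

    where the coefficients are fixed by the interface and boundary conditions between adjacent subdomains.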

  3. Tank 241-U-106, cores 147 and 148, analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    Steen, F.H.

    1996-09-27

    This document is the final report deliverable for tank 241-U-106 push mode core segments collected between May 8, 1996 and May 10, 1996 and received by the 222-S Laboratory between May 14, 1996 and May 16, 1996. The segments were subsampled and analyzed in accordance with the Tank 241-U-106 Push Mode Core Sampling and Analysis Plan (TSAP), the Historical Model Evaluation Data Requirements (Historical DQO), the Data Quality Objective to Support Resolution of the Organic Complexant Safety Issue (Organic DQO) and the Safety Screening Data Quality Objective (DQO). The analytical results are included in Table 1.

  4. Three-neutrino oscillations in matter: Analytical results in the adiabatic approximation

    International Nuclear Information System (INIS)

    Petcov, S.T.; Toshev, S.

    1987-01-01

    Analytical expressions for the probabilities of the transitions between different neutrino flavours in matter in the case of three lepton families and small vacuum mixing angles are obtained in the adiabatic approximation. A brief discussion of the characteristic features of the Mikheyev-Smirnov-Wolfenstein effect in the system of the three neutrino flavours ν_e, ν_μ and ν_τ is also given. (orig.)

  5. Monte Carlo and analytical model predictions of leakage neutron exposures from passively scattered proton therapy

    International Nuclear Information System (INIS)

    Pérez-Andújar, Angélica; Zhang, Rui; Newhauser, Wayne

    2013-01-01

    Purpose: Stray neutron radiation is of concern after radiation therapy, especially in children, because of the high risk it might carry for secondary cancers. Several previous studies predicted the stray neutron exposure from proton therapy, mostly using Monte Carlo simulations. Promising attempts to develop analytical models have also been reported, but these were limited to only a few proton beam energies. The purpose of this study was to develop an analytical model to predict leakage neutron equivalent dose from passively scattered proton beams in the 100-250 MeV interval. Methods: To develop and validate the analytical model, the authors used values of equivalent dose per therapeutic absorbed dose (H/D) predicted with Monte Carlo simulations. The authors also characterized the behavior of the mean neutron radiation-weighting factor, w_R, as a function of depth in a water phantom and distance from the beam central axis. Results: The simulated and analytical predictions agreed well. On average, the percentage difference between the analytical model and the Monte Carlo simulations was 10% for the energies and positions studied. The authors found that w_R was highest at the shallowest depth and decreased with depth until around 10 cm, where it started to increase slowly with depth. This was consistent among all energies. Conclusion: Simple analytical methods are promising alternatives to complex and slow Monte Carlo simulations to predict H/D values. The authors' results also provide improved understanding of the behavior of w_R, which strongly depends on depth but is nearly independent of lateral distance from the beam central axis.

  6. Analytical and Experimental Study for Validation of the Device to Confine BN Reactor Melted Fuel

    International Nuclear Information System (INIS)

    Rogozhkin, S.; Osipov, S.; Sobolev, V.; Shepelev, S.; Kozhaev, A.; Mavrin, M.; Ryabov, A.

    2013-01-01

    To validate the design and confirm the design characteristics of the special retaining device (core catcher) used to protect the BN reactor vessel in the case of a severe beyond-design-basis accident with core melting, computational and experimental studies were carried out. The Tray test facility, which uses water as coolant, was developed and fabricated by OKBM, and experimental studies were performed. To verify the methodological approach used for the computational study, experimental results obtained in the Tray test facility were compared with numerical simulation results obtained with the STAR-CCM+ CFD code.

  7. Development and validation of analytical methodology for determination of polycyclic aromatic hydrocarbons (PAHS) in sediments. Assesment of Pedroso Park dam, Santo Andre, SP; Desenvolvimento e validacao de metodologia analitica para determinacao de hidrocarbonetos policiclicos aromaticos (HPAS) em sedimentos. Avaliacao da represa do Parque Pedroso, Santo Andre, SP

    Energy Technology Data Exchange (ETDEWEB)

    Brito, Carlos Fernando de

    2009-07-01

    The polycyclic aromatic hydrocarbons (PAHs), being considered persistent contaminants, ubiquitous in the environment and of recognized genotoxicity, have stimulated research activities aimed at determining and evaluating their sources, transport, processing, biological effects and accumulation in compartments of aquatic and terrestrial ecosystems. In this work, the matrix studied was sediment collected at the Pedroso Park dam in Santo Andre, SP. The analytical technique employed was reverse-phase liquid chromatography with a UV/Vis detector. Statistical treatment of the data was established while developing the methodology so as to ensure reliable results. The steps involved were evaluated using the concept of validation of chemical testing. The parameters selected for the analytical validation were selectivity, linearity, working range, sensitivity, accuracy, precision, limit of detection, limit of quantification and robustness. These parameters showed satisfactory results, allowing application of the methodology, which is a simple method that minimizes contamination and the loss of compounds through over-handling. For the PAHs tested, no positive results above the limit of detection were found in any of the samples collected in the first phase. In the second collection, however, small amounts were found, mainly acenaphthylene, fluorene and benzo[a]anthracene. Although the area is preserved, slight signs of contamination can be discerned. (author)

  8. Studies on the spectral interference of gadolinium on different analytes in inductively coupled plasma atomic emission spectroscopy

    International Nuclear Information System (INIS)

    Sengupta, Arijit; Thulasidas, S.K.; Natarajan, V.; Airan, Yougant

    2015-01-01

    Due to their multi-electronic nature, rare earth elements are prone to spectral interference in ICP-AES, which leads to erroneous determination of analytes in the presence of such a matrix. This interference is very significant when the analytes are to be determined at trace level in the presence of emission-rich matrix elements. An attempt was made to understand the spectral interference of Gd on 29 common analytes, namely Ag, Al, B, Ba, Bi, Ca, Cd, Ce, Co, Cr, Cu, Dy, Fe, Ga, Gd, In, La, Li, Lu, Mg, Mn, Na, Nd, Ni, Pb, Pr, Sr, Tl and Zn, using ICP-AES with a capacitive Charged Coupled Device (CCD) as detector. The present study includes identification of suitable interference-free analytical lines for these analytes, evaluation of a correction factor for each analytical line, and determination of the tolerance levels of these analytical lines, along with an ICP-AES-based methodology for the simultaneous determination of Gd. Based on the spectral interference study, an ICP-AES-based method was developed for the determination of these analytes at trace level in the presence of a Gd matrix without chemical separation. The developed methodology was further validated using synthetic samples prepared from commercially available reference material solutions of the individual elements; the results were found to be satisfactory. The method was also compared with other existing techniques.
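
    As a generic illustration of how such a line-specific correction factor is applied (hypothetical values; the study's actual factors are specific to each analytical line):

      # Subtracting the matrix contribution from an apparent concentration
      # using a spectral-interference correction factor (values assumed).
      def corrected_conc(apparent_ppm, gd_matrix_ppm, correction_factor):
          """correction_factor: apparent analyte conc. per unit Gd conc."""
          return apparent_ppm - correction_factor * gd_matrix_ppm

      print(corrected_conc(apparent_ppm=1.20, gd_matrix_ppm=500.0,
                           correction_factor=0.0004))  # -> 1.00 ppm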

  9. Analytical Validation of Quantitative Real-Time PCR Methods for Quantification of Trypanosoma cruzi DNA in Blood Samples from Chagas Disease Patients.

    Science.gov (United States)

    Ramírez, Juan Carlos; Cura, Carolina Inés; da Cruz Moreira, Otacilio; Lages-Silva, Eliane; Juiz, Natalia; Velázquez, Elsa; Ramírez, Juan David; Alberti, Anahí; Pavia, Paula; Flores-Chávez, María Delmans; Muñoz-Calderón, Arturo; Pérez-Morales, Deyanira; Santalla, José; Marcos da Matta Guedes, Paulo; Peneau, Julie; Marcet, Paula; Padilla, Carlos; Cruz-Robles, David; Valencia, Edward; Crisante, Gladys Elena; Greif, Gonzalo; Zulantay, Inés; Costales, Jaime Alfredo; Alvarez-Martínez, Miriam; Martínez, Norma Edith; Villarroel, Rodrigo; Villarroel, Sandro; Sánchez, Zunilda; Bisio, Margarita; Parrado, Rudy; Maria da Cunha Galvão, Lúcia; Jácome da Câmara, Antonia Cláudia; Espinoza, Bertha; Alarcón de Noya, Belkisyole; Puerta, Concepción; Riarte, Adelina; Diosque, Patricio; Sosa-Estani, Sergio; Guhl, Felipe; Ribeiro, Isabela; Aznar, Christine; Britto, Constança; Yadón, Zaida Estela; Schijman, Alejandro G

    2015-09-01

    An international study was performed by 26 experienced PCR laboratories from 14 countries to assess the performance of duplex quantitative real-time PCR (qPCR) strategies on the basis of TaqMan probes for detection and quantification of parasitic loads in peripheral blood samples from Chagas disease patients. Two methods were studied: Satellite DNA (SatDNA) qPCR and kinetoplastid DNA (kDNA) qPCR. Both methods included an internal amplification control. Reportable range, analytical sensitivity, limits of detection and quantification, and precision were estimated according to international guidelines. In addition, inclusivity and exclusivity were estimated with DNA from stocks representing the different Trypanosoma cruzi discrete typing units and Trypanosoma rangeli and Leishmania spp. Both methods were challenged against 156 blood samples provided by the participant laboratories, including samples from acute and chronic patients with varied clinical findings, infected by oral route or vectorial transmission. kDNA qPCR showed better analytical sensitivity than SatDNA qPCR with limits of detection of 0.23 and 0.70 parasite equivalents/mL, respectively. Analyses of clinical samples revealed a high concordance in terms of sensitivity and parasitic loads determined by both SatDNA and kDNA qPCRs. This effort is a major step toward international validation of qPCR methods for the quantification of T. cruzi DNA in human blood samples, aiming to provide an accurate surrogate biomarker for diagnosis and treatment monitoring for patients with Chagas disease. Copyright © 2015 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  10. Analytical model and design of spoke-type permanent-magnet machines accounting for saturation and nonlinearity of magnetic bridges

    Energy Technology Data Exchange (ETDEWEB)

    Liang, Peixin; Chai, Feng [State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin 150001 (China); Department of Electrical Engineering, Harbin Institute of Technology, Harbin 150001 (China); Bi, Yunlong [Department of Electrical Engineering, Harbin Institute of Technology, Harbin 150001 (China); Pei, Yulong, E-mail: peiyulong1@163.com [Department of Electrical Engineering, Harbin Institute of Technology, Harbin 150001 (China); Cheng, Shukang [State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin 150001 (China); Department of Electrical Engineering, Harbin Institute of Technology, Harbin 150001 (China)

    2016-11-01

    Based on a subdomain model, this paper presents an analytical method for predicting the no-load magnetic field distribution, back-EMF and torque in general spoke-type motors with magnetic bridges. Taking into account the saturation and nonlinearity of the magnetic material, the magnetic bridges are treated as equivalent fan-shaped saturation regions. To obtain standard boundary conditions, a lumped-parameter magnetic circuit model and an iterative method are employed to calculate the permeability. The final field domain is divided into five types of simple subdomains. Based on the method of separation of variables, the analytical expression for each subdomain is derived. The analytical results for the magnetic field distribution, back-EMF and torque are verified by the finite element method, which confirms the validity of the proposed model for facilitating motor design and optimization. - Highlights: • The no-load magnetic field of spoke-type motors is first calculated by an analytical method. • A magnetic circuit model and an iterative method are employed to calculate the permeability. • The analytical expression for each subdomain is derived. • The proposed method can effectively reduce the duration of the predesign stages.

  11. Analytical model and design of spoke-type permanent-magnet machines accounting for saturation and nonlinearity of magnetic bridges

    International Nuclear Information System (INIS)

    Liang, Peixin; Chai, Feng; Bi, Yunlong; Pei, Yulong; Cheng, Shukang

    2016-01-01

    Based on a subdomain model, this paper presents an analytical method for predicting the no-load magnetic field distribution, back-EMF and torque in general spoke-type motors with magnetic bridges. Taking into account the saturation and nonlinearity of the magnetic material, the magnetic bridges are treated as equivalent fan-shaped saturation regions. To obtain standard boundary conditions, a lumped-parameter magnetic circuit model and an iterative method are employed to calculate the permeability. The final field domain is divided into five types of simple subdomains. Based on the method of separation of variables, the analytical expression for each subdomain is derived. The analytical results for the magnetic field distribution, back-EMF and torque are verified by the finite element method, which confirms the validity of the proposed model for facilitating motor design and optimization. - Highlights: • The no-load magnetic field of spoke-type motors is first calculated by an analytical method. • A magnetic circuit model and an iterative method are employed to calculate the permeability. • The analytical expression for each subdomain is derived. • The proposed method can effectively reduce the duration of the predesign stages.

  12. A new analytical method for quantification of olive and palm oil in blends with other vegetable edible oils based on the chromatographic fingerprints from the methyl-transesterified fraction.

    Science.gov (United States)

    Jiménez-Carvelo, Ana M; González-Casado, Antonio; Cuadros-Rodríguez, Luis

    2017-03-01

    A new analytical method for the quantification of olive oil and palm oil in blends with other vegetable edible oils (canola, safflower, corn, peanut, seeds, grapeseed, linseed, sesame and soybean) using normal-phase liquid chromatography and chemometric tools was developed. The procedure for obtaining the chromatographic fingerprint of the methyl-transesterified fraction of each blend is described. The multivariate quantification methods used were Partial Least Squares Regression (PLS-R) and Support Vector Regression (SVR). The quantification results were evaluated by several parameters, such as the Root Mean Square Error of Validation (RMSEV), Mean Absolute Error of Validation (MAEV) and Median Absolute Error of Validation (MdAEV). It should be highlighted that, with the newly proposed analytical method, the chromatographic analysis takes only eight minutes; the results obtained show the potential of this method and allow quantification of mixtures of olive oil and palm oil with other vegetable oils. Copyright © 2016 Elsevier B.V. All rights reserved.
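
    A minimal, generic sketch of fitting a PLS regression to chromatographic fingerprints and computing the validation-error metrics named above (synthetic data; scikit-learn is assumed as the implementation, not the study's software):

      # PLS regression on fingerprint data with RMSEV/MAEV/MdAEV metrics
      # (synthetic, illustrative data only).
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.normal(size=(60, 200))     # 60 blends x 200 fingerprint points
      y = rng.uniform(0, 100, size=60)   # % olive oil in blend (synthetic)

      X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)
      pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
      err = y_val - pls.predict(X_val).ravel()

      rmsev = np.sqrt(np.mean(err**2))   # Root Mean Square Error of Validation
      maev = np.mean(np.abs(err))        # Mean Absolute Error of Validation
      mdaev = np.median(np.abs(err))     # Median Absolute Error of Validation
      print(rmsev, maev, mdaev)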

  13. Analytical expression for position sensitivity of linear response beam position monitor having inter-electrode cross talk

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Mukesh, E-mail: mukeshk@rrcat.gov.in [Beam Diagnostics Section, Indus Operations, Beam Dynamics & Diagnostics Division, Raja Ramanna Centre for Advanced Technology, Indore, 452013 MP (India); Homi Bhabha National Institute, Training School Complex, Anushakti Nagar, Mumbai 400 094 (India); Ojha, A.; Garg, A.D.; Puntambekar, T.A. [Beam Diagnostics Section, Indus Operations, Beam Dynamics & Diagnostics Division, Raja Ramanna Centre for Advanced Technology, Indore, 452013 MP (India); Senecha, V.K. [Homi Bhabha National Institute, Training School Complex, Anushakti Nagar, Mumbai 400 094 (India); Ion Source Lab., Proton Linac & Superconducting Cavities Division, Raja Ramanna Centre for Advanced Technology, Indore, 452013 MP (India)

    2017-02-01

    According to the quasi-electrostatic model of a linear response capacitive beam position monitor (BPM), the position sensitivity of the device depends only on the aperture of the device and is independent of processing frequency and load impedance. In practice, however, due to inter-electrode capacitive coupling (cross talk), the actual position sensitivity of the device decreases with increasing frequency and load impedance. We have taken the inter-electrode capacitance into account to derive and propose a new analytical expression for the position sensitivity as a function of frequency and load impedance. The sensitivity of a linear response shoe-box type BPM has been obtained through simulation using CST Studio Suite to verify and confirm the validity of the new analytical equation. Good agreement between the simulation results and the new analytical expression suggests that this method can be exploited for the proper design of BPMs.
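
    For orientation, the position in a linear-response BPM is commonly estimated from the difference-over-sum of opposing electrode signals scaled by the sensitivity; a minimal generic sketch (the sensitivity value is an arbitrary assumption, not the paper's result):

      # Difference-over-sum position estimate for a linear-response BPM
      # (generic textbook relation; sensitivity value assumed).
      def bpm_position(v_right, v_left, sensitivity_per_mm=0.08):
          """Return beam offset in mm from two opposing electrode amplitudes."""
          normalized = (v_right - v_left) / (v_right + v_left)
          return normalized / sensitivity_per_mm

      print(bpm_position(1.05, 0.95))  # ~0.625 mm for the assumed sensitivity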

  14. Dual metal gate tunneling field effect transistors based on MOSFETs: A 2-D analytical approach

    Science.gov (United States)

    Ramezani, Zeinab; Orouji, Ali A.

    2018-01-01

    A novel 2-D analytical drain current model for Dual Metal Gate Tunneling Field Effect Transistors based on MOSFETs (DMG-TFETs) is presented in this paper. The proposed tunneling FET is derived from a MOSFET structure by employing an additional electrode in the source region with an appropriate work function to induce holes in the N+ source region and hence convert it into a P+ source region. The electric field is derived and utilized to extract the expression for the drain current by analytically integrating the band-to-band tunneling generation rate in the tunneling region, based on the potential profile obtained by solving Poisson's equation. Through this model, the effects of the thin-film thickness and gate voltage on the potential and the electric field, and the effect of the thin-film thickness on the tunneling current, can be studied. To validate the present model, the analytical results were compared with the SILVACO ATLAS device simulator, and good agreement was found.

  15. Validated analytical methodology for the simultaneous determination of a wide range of pesticides in human blood using GC-MS/MS and LC-ESI/MS/MS and its application in two poisoning cases.

    Science.gov (United States)

    Luzardo, Octavio P; Almeida-González, Maira; Ruiz-Suárez, Norberto; Zumbado, Manuel; Henríquez-Hernández, Luis A; Meilán, María José; Camacho, María; Boada, Luis D

    2015-09-01

    Pesticides are frequently responsible for human poisoning, and often information on the substance involved is lacking. The great variety of pesticides that could be responsible for an intoxication makes it necessary to develop powerful and versatile analytical methodologies that allow identification of the unknown toxic substance. Here we developed a methodology for the simultaneous identification and quantification of 109 highly toxic pesticides in human blood. The application of this analytical scheme would help minimize the cost of this type of chemical identification while maximizing the chances of identifying the pesticide involved. In the methodology presented here, we use a liquid-liquid extraction, followed by a single purification step, and quantitation of analytes by a combination of liquid and gas chromatography, both coupled to triple quadrupole mass spectrometry operated in multiple reaction monitoring mode. The methodology has been fully validated, and its applicability has been demonstrated in two recent cases involving one self-poisoning fatality and one non-fatal homicidal attempt. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.

  16. Validation of pesticide multi-residue analysis method on cucumber

    International Nuclear Information System (INIS)

    2011-01-01

    In this study we aimed to validate a multi-pesticide residue analysis method for cucumber. Before real sample injection, a system suitability test was performed on the gas chromatograph (GC). For this purpose, a sensitive pesticide mixture was used for GC-NPD, and performance parameters such as the number of effective theoretical plates, resolution factor, asymmetry, tailing and selectivity were estimated. The system was found suitable for calibration and sample injection. Samples were fortified at the levels of 0.02, 0.2, 0.8 and 1 mg/kg with a mixture of dichlorvos, malathion and chlorpyrifos. In the fortification step, ¹⁴C-carbaryl was also added to the homogenized analytical portions so that ¹⁴C-labelled pesticides could be used to determine extraction efficiency. The basic analytical steps, namely ethyl acetate extraction, filtration, evaporation and cleanup, were then performed. The GPC calibration using ¹⁴C-carbaryl and the fortification mixture (dichlorvos, malathion and chlorpyrifos) showed that the pesticide fraction comes through the column in the 8-23 ml fractions. The recoveries of ¹⁴C-carbaryl after the extraction and cleanup steps were 92.63-111.73% and 74.83-102.22%, respectively. The stability of pesticides during analysis is an important factor. In this study, a stability test was performed that included the matrix effect. Our calculations and t-test results showed that the above-mentioned pesticides were not stable during sample processing under our laboratory conditions, and it was found that sample comminution with dry ice may improve stability. In the other part of the study, ¹⁴C-chlorpyrifos was used to determine the homogeneity of analytical portions taken from laboratory samples. Use of ¹⁴C-labelled pesticides allows quick quantification of the analyte, even without clean-up. The analytical results show that after sample processing with a Waring blender, the analytical portions were homogeneous. Sample processing uncertainty depending on quantity of

  17. Big data analytics to improve cardiovascular care: promise and challenges.

    Science.gov (United States)

    Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M

    2016-06-01

    The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.

  18. Strengthening of reinforced concrete beams with basalt-based FRP sheets: An analytical assessment

    International Nuclear Information System (INIS)

    Nerilli, Francesca; Vairo, Giuseppe

    2016-01-01

    In this paper the effectiveness of the flexural strengthening of RC beams through basalt fiber-reinforced sheets is investigated. The non-linear flexural response of RC beams strengthened with FRP composites applied at the traction side is described via an analytical formulation. Validation results and some comparative analyses confirm soundness and consistency of the proposed approach, and highlight the good mechanical performances (in terms of strength and ductility enhancement of the beam) produced by basalt-based reinforcements in comparison with traditional glass or carbon FRPs.

  19. Strengthening of reinforced concrete beams with basalt-based FRP sheets: An analytical assessment

    Energy Technology Data Exchange (ETDEWEB)

    Nerilli, Francesca [Unicusano - Università degli Studi Niccolò Cusano Telematica Roma, 00166 Rome (Italy); Vairo, Giuseppe [Università degli Studi di Roma “Tor Vergata”- (DICII), 00133 Rome (Italy)

    2016-06-08

    In this paper the effectiveness of the flexural strengthening of RC beams through basalt fiber-reinforced sheets is investigated. The non-linear flexural response of RC beams strengthened with FRP composites applied at the traction side is described via an analytical formulation. Validation results and some comparative analyses confirm soundness and consistency of the proposed approach, and highlight the good mechanical performances (in terms of strength and ductility enhancement of the beam) produced by basalt-based reinforcements in comparison with traditional glass or carbon FRPs.

  20. Strengthening of reinforced concrete beams with basalt-based FRP sheets: An analytical assessment

    Science.gov (United States)

    Nerilli, Francesca; Vairo, Giuseppe

    2016-06-01

    In this paper the effectiveness of the flexural strengthening of RC beams through basalt fiber-reinforced sheets is investigated. The non-linear flexural response of RC beams strengthened with FRP composites applied at the traction side is described via an analytical formulation. Validation results and some comparative analyses confirm soundness and consistency of the proposed approach, and highlight the good mechanical performances (in terms of strength and ductility enhancement of the beam) produced by basalt-based reinforcements in comparison with traditional glass or carbon FRPs.

  1. Pancultural self-enhancement reloaded: a meta-analytic reply to Heine (2005).

    Science.gov (United States)

    Sedikides, Constantine; Gaertner, Lowell; Vevea, Jack L

    2005-10-01

    C. Sedikides, L. Gaertner, and Y. Toguchi (2003) reported findings favoring the universality of self-enhancement. S. J. Heine (2005) challenged the authors' research on evidential and logical grounds. In response, the authors carried out 2 meta-analytic investigations. The results backed the C. Sedikides et al. (2003) theory and findings. Both Westerners and Easterners self-enhanced tactically. Westerners self-enhanced on attributes relevant to the cultural ideal of individualism, whereas Easterners self-enhanced on attributes relevant to the cultural ideal of collectivism (in both cases, because of the personal importance of the ideal). Self-enhancement motivation is universal, although its manifestations are strategically sensitive to cultural context. The authors respond to other aspects of Heine's critique by discussing why researchers should empirically validate the comparison dimension (individualistic vs. collectivistic) and defending why the better-than-average effect is a valid measure of self-enhancement.

  2. Analytical model of tilted driver–pickup coils for eddy current nondestructive evaluation

    Science.gov (United States)

    Cao, Bing-Hua; Li, Chao; Fan, Meng-Bao; Ye, Bo; Tian, Gui-Yun

    2018-03-01

    A driver-pickup probe possesses better sensitivity and flexibility due to the individual optimization of each coil, and is frequently found in eddy current (EC) array probes. In this work, a tilted non-coaxial driver-pickup probe above a multilayered conducting plate is analytically modeled with a spatial transformation for eddy current nondestructive evaluation. The core of the formulation is to obtain the projection of the magnetic vector potential (MVP) from the driver coil onto the vector along the tilted pickup coil, which is divided into two key steps. The first step is to project the MVP along the pickup coil onto a horizontal plane, and the second is to build the relationship between the projected MVP and the MVP along the driver coil. Afterwards, an analytical model for the case of a layered plate is established with the reflection and transmission theory of electromagnetic fields. The calculated values from the resulting model indicate good agreement with those from the finite element model (FEM) and experiments, which validates the developed analytical model. Project supported by the National Natural Science Foundation of China (Grant Nos. 61701500, 51677187, and 51465024).

  3. Green analytical chemistry introduction to chloropropanols determination at no economic and analytical performance costs?

    Science.gov (United States)

    Jędrkiewicz, Renata; Orłowski, Aleksander; Namieśnik, Jacek; Tobiszewski, Marek

    2016-01-15

    In this study we rank analytical procedures for 3-monochloropropane-1,2-diol determination in soy sauces by the PROMETHEE method. Multicriteria decision analysis was performed for three different scenarios - metrological, economic and environmental - by applying different weights to the decision-making criteria. All three scenarios indicate the capillary electrophoresis-based procedure as the most preferable. Apart from that, the details of the ranking results differ among the three scenarios. A second run of rankings was done for scenarios that include only metrological, economic or environmental criteria, neglecting the others. These results show that the green analytical chemistry-based selection correlates with the economic one, while there is no correlation with the metrological one. This implies that green analytical chemistry can be brought into laboratories without analytical performance costs, and that it is even supported by economic reasons. Copyright © 2015 Elsevier B.V. All rights reserved.
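
    A minimal sketch of the PROMETHEE-style outranking computation (illustrative scores, criteria and weights; not the study's data or its exact preference functions):

      # PROMETHEE-style outranking with a simple linear preference function.
      import numpy as np

      scores = np.array([                  # rows: procedures, cols: criteria
          [0.9, 0.4, 0.7],                 # e.g. CE-based procedure (assumed)
          [0.6, 0.8, 0.5],
          [0.3, 0.6, 0.9],
      ])
      weights = np.array([0.5, 0.3, 0.2])  # scenario-dependent weights (assumed)

      n = len(scores)
      pref = np.zeros((n, n))
      for a in range(n):
          for b in range(n):
              d = scores[a] - scores[b]    # criterion-wise differences
              pref[a, b] = np.sum(weights * np.clip(d, 0.0, 1.0))

      phi_plus = pref.sum(axis=1) / (n - 1)   # positive outranking flow
      phi_minus = pref.sum(axis=0) / (n - 1)  # negative outranking flow
      net_flow = phi_plus - phi_minus
      print(np.argsort(-net_flow))            # procedures, best first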

  4. Application of Multi-Analyte Methods for Pesticide Formulations

    Energy Technology Data Exchange (ETDEWEB)

    Lantos, J.; Virtics, I. [Plant Protection & Soil Conservation Service of Szabolcs-Szatmár-Bereg County, Nyíregyháza (Hungary)

    2009-07-15

    The application of multi-analyte methods for pesticide formulations by GC analysis is discussed. HPLC was used to determine active ingredients. HPLC elution sequences were related to individual n-octanol/water partition coefficients. Real laboratory data are presented and evaluated with regard to validation requirements. The retention time data of pesticides on different HPLC columns under gradient and isocratic conditions are compared to illustrate the applicability of the methodologies. (author)

  5. An analytical model for annular flow boiling heat transfer in microchannel heat sinks

    International Nuclear Information System (INIS)

    Megahed, A.; Hassan, I.

    2009-01-01

    An analytical model has been developed to predict the flow boiling heat transfer coefficient in microchannel heat sinks. The new analytical model predicts the two-phase heat transfer coefficient during the annular flow regime based on the separated flow model. In contrast to the majority of annular flow heat transfer models, it is based on fundamental conservation principles. The model considers the characteristics of the microchannel heat sink during annular flow and eliminates the use of any empirical closure relations. Comparison with the limited experimental data available validates the usefulness of this analytical model; the model predicts the experimental data with a mean absolute error of 8%. (author)

  6. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  7. Impact mechanics of ship collisions and validations with experimental results

    DEFF Research Database (Denmark)

    Zhang, Shengming; Villavicencio, R.; Zhu, L.

    2017-01-01

    Closed-form analytical solutions for the energy released for deforming and crushing ofstructures and the impact impulse during ship collisions were developed and published inMarine Structures in 1998 [1]. The proposed mathematical models have been used bymany engineers and researchers although th...

  8. A semi-analytical study on helical springs made of shape memory polymer

    International Nuclear Information System (INIS)

    Baghani, M; Naghdabadi, R; Arghavani, J

    2012-01-01

    In this paper, the responses of shape memory polymer (SMP) helical springs under axial force are studied both analytically and numerically. In the analytical solution, we first derive the response of a cylindrical tube under torsional loading. This solution can be used for helical springs in which both the curvature and pitch effects are negligible, which is the case for helical springs with large ratios of the mean coil radius to the cross-sectional radius (spring index) and small pitch angles. Making use of this solution simplifies the analysis of the helical spring to that of the torsion of a straight bar with circular cross section. The 3D phenomenological constitutive model recently proposed for SMPs is also reduced to the 1D shear case. Thus, an analytical solution for the torsional response of SMP tubes in a full cycle of stress-free strain recovery is derived. In addition, the curvature effect is added to the formulation and the SMP helical spring is analyzed using the exact solution presented for torsion of curved SMP tubes; in this modified solution, the effect of the direct shear force is also considered. In the numerical analysis, the 3D constitutive equations are implemented in a finite element program and a full cycle of stress-free strain recovery of an SMP (extension or compression) helical spring is simulated. Analytical and numerical results are compared, and it is shown that the analytical solution gives accurate stress distributions in the cross section of the helical SMP spring as well as the global load–deflection response. Case studies are presented to show the validity of the analytical method. (paper)
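
    For reference, the reduction of a small-pitch, large-index helical spring to a straight bar in torsion uses the standard textbook relations (not specific to the SMP constitutive model):

      % Standard helical-spring torsion relations (textbook results):
      T = F R, \qquad
      \tau_{\max} = \frac{T\,(d/2)}{J} = \frac{16 F R}{\pi d^{3}}, \qquad
      \delta = \frac{64 F R^{3} N}{G d^{4}},

    where F is the axial force, R the mean coil radius, d the wire diameter, N the number of active coils, and G the (here, history-dependent) shear modulus.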

  9. Individualism: a valid and important dimension of cultural differences between nations.

    Science.gov (United States)

    Schimmack, Ulrich; Oishi, Shigehiro; Diener, Ed

    2005-01-01

    Oyserman, Coon, and Kemmelmeier's (2002) meta-analysis suggested problems in the measurement of individualism and collectivism. Studies using Hofstede's individualism scores show little convergent validity with more recent measures of individualism and collectivism. We propose that the lack of convergent validity is due to national differences in response styles. Whereas Hofstede statistically controlled for response styles, Oyserman et al.'s meta-analysis relied on uncorrected ratings. Data from an international student survey demonstrated convergent validity between Hofstede's individualism dimension and horizontal individualism when response styles were statistically controlled, whereas uncorrected scores correlated highly with the individualism scores in Oyserman et al.'s meta-analysis. Uncorrected horizontal individualism scores and meta-analytic individualism scores did not correlate significantly with nations' development, whereas corrected horizontal individualism scores and Hofstede's individualism dimension were significantly correlated with development. This pattern of results suggests that individualism is a valid construct for cross-cultural comparisons, but that the measurement of this construct needs improvement.

  10. Toxicologic evaluation of analytes from Tank 241-C-103

    International Nuclear Information System (INIS)

    Mahlum, D.D.; Young, J.Y.; Weller, R.E.

    1994-11-01

    Westinghouse Hanford Company requested PNL to assemble a toxicology review panel (TRP) to evaluate analytical data compiled by WHC and provide advice concerning potential health effects associated with exposure to tank-vapor constituents. The team's objectives were to (1) review procedures used for sampling vapors from tanks, (2) identify constituents in tank-vapor samples that could be related to symptoms reported by workers, (3) evaluate the toxicological implications of those constituents by comparison to established toxicological databases, (4) provide advice for additional analytical efforts, and (5) support other activities as requested by WHC. The TRP represents a wide range of expertise, including toxicology, industrial hygiene, and occupational medicine. The TRP prepared a list of target analytes that chemists at the Oregon Graduate Institute/Sandia (OGI), Oak Ridge National Laboratory (ORNL), and PNL used to establish validated methods for quantitative analysis of head-space vapors from Tank 241-C-103. This list was used by the analytical laboratories to develop appropriate analytical methods for samples from Tank 241-C-103. Target compounds on the list included acetone, acetonitrile, ammonia, benzene, 1,3-butadiene, butanal, n-butanol, hexane, 2-hexanone, methylene chloride, nitric oxide, nitrogen dioxide, nitrous oxide, dodecane, tridecane, propane nitrile, sulfur oxide, tributyl phosphate, and vinylidene chloride. The TRP considered constituent concentrations, current exposure limits, reliability of data relative to toxicity, consistency of the analytical data, and whether the material was carcinogenic or teratogenic. A final consideration in the analyte selection process was to include representative chemicals for each class of compounds found.

  11. Results of an interlaboratory comparison of analytical methods for contaminants of emerging concern in water.

    Science.gov (United States)

    Vanderford, Brett J; Drewes, Jörg E; Eaton, Andrew; Guo, Yingbo C; Haghani, Ali; Hoppe-Jones, Christiane; Schluesener, Michael P; Snyder, Shane A; Ternes, Thomas; Wood, Curtis J

    2014-01-07

    An evaluation of existing analytical methods used to measure contaminants of emerging concern (CECs) was performed through an interlaboratory comparison involving 25 research and commercial laboratories. In total, 52 methods were used in the single-blind study to determine method accuracy and comparability for 22 target compounds, including pharmaceuticals, personal care products, and steroid hormones, all at ng/L levels in surface and drinking water. Method biases varied by compound; caffeine, NP, OP, and triclosan had false positive rates >15%. In addition, some methods reported false positives for 17β-estradiol and 17α-ethynylestradiol in unspiked drinking water and deionized water, respectively, at levels higher than published predicted no-effect concentrations for these compounds in the environment. False negatives and false positives were generally attributable to contamination, misinterpretation of background interferences, and/or inappropriate setting of detection/quantification levels for analysis at low ng/L levels. The results of both comparisons were collectively assessed to identify parameters that resulted in the best overall method performance. Liquid chromatography-tandem mass spectrometry coupled with the calibration technique of isotope dilution was able to accurately quantify most compounds with an average bias of <10% for both matrices. These findings suggest that this method of analysis is suitable at environmentally relevant levels for most of the compounds studied. This work underscores the need for robust, standardized analytical methods for CECs to improve data quality, increase comparability between studies, and help reduce false positive and false negative rates.

  12. WEB ANALYTICS COMBINED WITH EYE TRACKING FOR SUCCESSFUL USER EXPERIENCE DESIGN: A CASE STUDY

    Directory of Open Access Journals (Sweden)

    Magdalena BORYS

    2016-12-01

    Full Text Available The authors propose a new approach to the mobile user experience design process by means of web analytics and eye tracking. The proposed method was applied to the design of the LUT mobile website: data on various users and their behaviour were gathered and analysed using a web analytics tool; next, based on the findings from the web analytics, a mobile prototype of the website was created and validated in eye-tracking usability testing. The analysis of participants' behaviour during the eye-tracking sessions allowed the prototype to be improved.

  13. Analytic Approximate Solutions for Unsteady Two-Dimensional and Axisymmetric Squeezing Flows between Parallel Plates

    Directory of Open Access Journals (Sweden)

    Mohammad Mehdi Rashidi

    2008-01-01

    Full Text Available The flow of a viscous incompressible fluid between two parallel plates due to the normal motion of the plates is investigated. The unsteady Navier-Stokes equations are reduced to a nonlinear fourth-order differential equation by using similarity solutions. The homotopy analysis method (HAM) is used to solve this nonlinear equation analytically. The convergence of the obtained series solution is carefully analyzed. The validity of the solutions is verified against numerical results obtained with the fourth-order Runge-Kutta method.
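
    As a sketch of the kind of numerical check mentioned, a similarity equation of this type can be solved as a boundary-value problem; the equation form below (a common squeeze-film similarity ODE with squeeze number S) and the boundary conditions are assumptions for illustration, not taken verbatim from the paper:

      # Fourth-order squeeze-film similarity ODE solved as a BVP, the kind of
      # numerical reference used to verify HAM series solutions (form assumed).
      import numpy as np
      from scipy.integrate import solve_bvp

      S = 1.0  # squeeze number (assumed)

      def rhs(eta, y):
          f, f1, f2, f3 = y
          f4 = S * (eta * f3 + 3.0 * f2 + f1 * f2 - f * f3)
          return np.vstack([f1, f2, f3, f4])

      def bc(ya, yb):
          # f(0)=0, f''(0)=0 (symmetry); f(1)=1, f'(1)=0 (moving plate)
          return np.array([ya[0], ya[2], yb[0] - 1.0, yb[1]])

      eta = np.linspace(0.0, 1.0, 41)
      y0 = np.zeros((4, eta.size))
      y0[0] = eta                      # simple initial guess
      sol = solve_bvp(rhs, bc, eta, y0)
      print(sol.status, sol.y[0, -1])  # status 0 means converged; f(1) ~ 1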

  14. 42 CFR 476.94 - Notice of QIO initial denial determination and changes as a result of a DRG validation.

    Science.gov (United States)

    2010-10-01

    ... changes as a result of a DRG validation. 476.94 Section 476.94 Public Health CENTERS FOR MEDICARE... changes as a result of a DRG validation. (a) Notice of initial denial determination—(1) Parties to be... retrospective review, (excluding DRG validation and post procedure review), within 3 working days of the initial...

  15. Helios: Understanding Solar Evolution Through Text Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Randazzese, Lucien [SRI International, Menlo Park, CA (United States)

    2016-12-02

    This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text analytic, and machine-learning based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach to the work was inspired by previous work by the same team to measure and predict the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or “breakthroughs,” in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand curated set of full-text documents from Thomson Scientific and other sources.

  16. The Validity of Conscientiousness Is Overestimated in the Prediction of Job Performance.

    Science.gov (United States)

    Kepes, Sven; McDaniel, Michael A

    2015-01-01

    Sensitivity analyses refer to investigations of the degree to which the results of a meta-analysis remain stable when conditions of the data or the analysis change. To the extent that results remain stable, one can refer to them as robust. Sensitivity analyses are rarely conducted in the organizational science literature. Despite conscientiousness being a valued predictor in employment selection, sensitivity analyses have not been conducted with respect to meta-analytic estimates of the correlation (i.e., validity) between conscientiousness and job performance. To address this deficiency, we reanalyzed the largest collection of conscientiousness validity data in the personnel selection literature and conducted a variety of sensitivity analyses. Publication bias analyses demonstrated that the validity of conscientiousness is moderately overestimated (by around 30%; a correlation difference of about .06). The misestimation of the validity appears to be due primarily to suppression of small effect sizes in the journal literature. These inflated validity estimates result in an overestimate of the dollar utility of personnel selection by millions of dollars and should be of considerable concern for organizations. The fields of management and applied psychology seldom conduct sensitivity analyses. Through the use of sensitivity analyses, this paper documents that the existing literature overestimates the validity of conscientiousness in the prediction of job performance. Our data show that effect sizes from journal articles are largely responsible for this overestimation.

  17. 2D Analytical Modeling of Magnetic Vector Potential in Surface Mounted and Surface Inset Permanent Magnet Machines

    Directory of Open Access Journals (Sweden)

    A. Jabbari

    2017-12-01

    Full Text Available A 2D analytical method for magnetic vector potential calculation in inner rotor surface-mounted and surface-inset permanent magnet machines, considering slotting effects, magnetization orientation, and winding layout, is proposed in this paper. The analytical method is based on the resolution of the Laplace and Poisson equations, as well as Maxwell's equations, in quasi-Cartesian coordinates by using the sub-domain method and hyperbolic functions. The developed method is applied to the performance computation of two prototype surface-mounted permanent magnet motors and two prototype surface-inset permanent magnet motors. Radial and parallel magnetization orientations are considered for each type of motor. The results of these models are validated by the finite element method (FEM).

  18. A closed-form analytical model for predicting 3D boundary layer displacement thickness for the validation of viscous flow solvers

    Science.gov (United States)

    Kumar, V. R. Sanal; Sankar, Vigneshwaran; Chandrasekaran, Nichith; Saravanan, Vignesh; Natarajan, Vishnu; Padmanabhan, Sathyan; Sukumaran, Ajith; Mani, Sivabalan; Rameshkumar, Tharikaa; Nagaraju Doddi, Hema Sai; Vysaprasad, Krithika; Sharan, Sharad; Murugesh, Pavithra; Shankar, S. Ganesh; Nejaamtheen, Mohammed Niyasdeen; Baskaran, Roshan Vignesh; Rahman Mohamed Rafic, Sulthan Ariff; Harisrinivasan, Ukeshkumar; Srinivasan, Vivek

    2018-02-01

    A closed-form analytical model is developed for estimating the 3D boundary-layer displacement thickness of an internal flow system at the Sanal flow choking condition for adiabatic flows obeying the physics of compressible viscous fluids. At this unique condition, the boundary-layer-blockage-induced fluid-throat choking and the adiabatic wall-friction-induced flow choking occur at a single sonic-fluid-throat location. The novelty of this model is that, without sacrificing the flow physics, it predicts the exact boundary-layer blockage of both 2D and 3D cases at the sonic fluid throat from the known values of the inlet Mach number, the adiabatic index of the gas, and the inlet port diameter of the internal flow system. We found that the 3D blockage factor is 47.33% lower than the 2D blockage factor with air as the working fluid. We concluded that the exact prediction of the boundary-layer displacement thickness at the sonic fluid throat provides a means to correctly pinpoint the causes of errors in viscous flow solvers. The methodology presented herein will play a pivotal role in the physical and biological sciences by enabling credible verification, calibration, and validation of viscous flow solvers for high-fidelity 2D/3D numerical simulations of real-world flows. Furthermore, our closed-form analytical model will be useful to solid and hybrid rocket designers for the grain-port-geometry optimization of new-generation single-stage-to-orbit dual-thrust motors with the highest possible propellant loading density within a given envelope, without manifestation of the Sanal flow choking that can lead to shock waves and catastrophic failure.

  19. A closed-form analytical model for predicting 3D boundary layer displacement thickness for the validation of viscous flow solvers

    Directory of Open Access Journals (Sweden)

    V. R. Sanal Kumar

    2018-02-01

    Full Text Available A closed-form analytical model is developed for estimating the 3D boundary-layer displacement thickness of an internal flow system at the Sanal flow choking condition for adiabatic flows obeying the physics of compressible viscous fluids. At this unique condition, the boundary-layer-blockage-induced fluid-throat choking and the adiabatic wall-friction-induced flow choking occur at a single sonic-fluid-throat location. The novelty of this model is that, without sacrificing the flow physics, it predicts the exact boundary-layer blockage of both 2D and 3D cases at the sonic fluid throat from the known values of the inlet Mach number, the adiabatic index of the gas, and the inlet port diameter of the internal flow system. We found that the 3D blockage factor is 47.33% lower than the 2D blockage factor with air as the working fluid. We concluded that the exact prediction of the boundary-layer displacement thickness at the sonic fluid throat provides a means to correctly pinpoint the causes of errors in viscous flow solvers. The methodology presented herein will play a pivotal role in the physical and biological sciences by enabling credible verification, calibration, and validation of viscous flow solvers for high-fidelity 2D/3D numerical simulations of real-world flows. Furthermore, our closed-form analytical model will be useful to solid and hybrid rocket designers for the grain-port-geometry optimization of new-generation single-stage-to-orbit dual-thrust motors with the highest possible propellant loading density within a given envelope, without manifestation of the Sanal flow choking that can lead to shock waves and catastrophic failure.

  20. On the analytical modeling of the nonlinear vibrations of pretensioned space structures

    Science.gov (United States)

    Housner, J. M.; Belvin, W. K.

    1983-01-01

    Pretensioned structures are receiving considerable attention as candidate large space structures. A typical example is a hoop-column antenna. The large number of preloaded members requires efficient analytical methods for concept validation and design. Validation through analysis is especially important, since ground testing may be limited by gravity effects and structural size. The objective of the present investigation is to examine the analytical modeling of pretensioned members undergoing nonlinear vibrations. Two approximate nonlinear analyses are developed to model general structural arrangements which include beam-columns and pretensioned cables attached to a common nucleus, such as may occur at a joint of a pretensioned structure. Attention is given to structures undergoing nonlinear steady-state oscillations due to sinusoidal excitation forces. Three analyses (linear, quasi-linear, and nonlinear) are conducted and applied to study the response of a relatively simple cable-stiffened structure.

  1. Validation of a BOTDR-based system for the detection of smuggling tunnels

    Science.gov (United States)

    Elkayam, Itai; Klar, Assaf; Linker, Raphael; Marshall, Alec M.

    2010-04-01

    Cross-border smuggling tunnels enable unmonitored movement of people, drugs and weapons and pose a very serious threat to homeland security. Recently, Klar and Linker (2009) [SPIE paper No. 731603] presented an analytical study of the feasibility of a Brillouin Optical Time Domain Reflectometry (BOTDR) based system for the detection of small sized smuggling tunnels. The current study extends this work by validating the analytical models against real strain measurements in soil obtained from small scale experiments in a geotechnical centrifuge. The soil strains were obtained using an image analysis method that tracked the displacement of discrete patches of soil through a sequence of digital images of the soil around the tunnel during the centrifuge test. The results of the present study are in agreement with those of a previous study which was based on synthetic signals generated using empirical and analytical models from the literature.

  2. ELISA validation and determination of cut-off level for chloramphenicol residues in honey

    Directory of Open Access Journals (Sweden)

    Biernacki Bogumił

    2015-09-01

    Full Text Available An analytical validation of a screening ELISA for the detection of chloramphenicol (CAP) in honey was conducted according to Commission Decision 2002/657/EC and the Guidelines for the Validation of Screening Methods for Residues of Veterinary Medicines. The analyte was extracted from honey with a water and ethyl acetate mixture, and CAP concentrations were measured photometrically at 450 nm. The recovery rate of the analyte from spiked samples was 79%. The cut-off level of CAP in honey was established as the minimum recovery (0.17 units). The detection capability (CCβ) was fixed at 0.25 μg kg−1. No relevant interference from matrix effects or from structurally related substances, including florfenicol and thiamphenicol, was observed. The ELISA method should be useful for the determination of CAP residues in honey monitoring programmes.

  3. Determination of polychlorinated dibenzodioxins and polychlorinated dibenzofurans (PCDDs/PCDFs) in food and feed using a bioassay. Result of a validation study

    Energy Technology Data Exchange (ETDEWEB)

    Gizzi, G.; Holst, C. von; Anklam, E. [Commission of the European Communities, Geel (Belgium). Joint Research Centre, Inst. for Reference Materials and Measurement, Food Safety and Quality Unit]; Hoogenboom, R. [RIKILT-Institute of Food Safety, Wageningen (Netherlands)]; Rose, M. [Defra Central Science Laboratory, Sand Hutton, York (United Kingdom)]

    2004-09-15

    It is estimated that more than 90% of dioxins consumed by humans come from foods derived from animals. The European Commission, through a Council Regulation (No 2375/2001) and a Directive (2001/102/EC), both revised by the Commission Recommendation (2002/201/EC), has set maximum levels for dioxins in food and feedstuffs. To implement the regulation, dioxin-monitoring programmes for food and feedstuffs will be undertaken by the Member States, requiring the analysis of large numbers of samples. Food and feed companies will have to control their products before putting them on the market. Monitoring for the presence of dioxins in food and feeds needs fast and cheap screening methods in order to select samples with potentially high levels of dioxins, which can then be analysed by a confirmatory method such as HRGC/HRMS. Bioassays like the DR CALUX® assay have been claimed to provide a suitable alternative for the screening of large numbers of samples, reducing costs and the required time of analysis. These methods have to comply with the specific characteristics laid down in two Commission Directives (2002/69/EC; 2002/70/EC) establishing the requirements for the determination of dioxins and dioxin-like PCBs for the official control of food and feedstuffs. The European Commission's Joint Research Centre is pursuing validation of alternative techniques in food and feed materials. In order to evaluate the applicability of the DR CALUX® technique as a screening method in compliance with the Commission Directives, a validation study was organised in collaboration with CSL and RIKILT. The aim of validating an analytical method is first to determine its performance characteristics (e.g. variability, bias, rate of false positive and false negative results), and secondly to evaluate if the method is fit for the purpose. Two approaches are commonly used: an in-house validation is preferentially performed first in order to establish whether the method is fit for purpose.

  4. Assay Validation For Quantitation of Sn 2+ In Radiopharmaceutical Kits

    International Nuclear Information System (INIS)

    Muthalib, A; Ramli, Martalena; Herlina; Sarmini, Endang; Suharmadi; Besari, Canti

    1998-01-01

    An assay validation for the quantitation of Sn2+ in radiopharmaceutical kits, based on indirect iodometric titration, is described. The method is based on the oxidation of Sn2+ using a known excess of iodine, with the unreacted excess iodine titrated with thiosulphate. Typical analytical parameters considered in this assay validation are precision, accuracy, selectivity or specificity, range, and linearity. The precision of the analytical method is quite good, with coefficients of variation in the range of 1.0% to 6.9% for 10 runs of analysis, except for one run with a coefficient of 10.2%. The method has an accuracy of 95.6%-99%, expressed as percent recoveries, at theoretical Sn2+ amounts of 463 μg to 2318 μg.
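
    The back-titration arithmetic behind such an assay is straightforward; the sketch below works one hypothetical example, with all volumes and concentrations invented.

    ```python
    # Indirect iodometric back-titration arithmetic for Sn2+ (illustrative).
    # Reactions assumed:
    #   Sn2+ + I2        -> Sn4+ + 2 I-        (1:1)
    #   I2   + 2 S2O3^2- -> 2 I-  + S4O6^2-    (1:2)
    M_SN = 118.71  # molar mass of tin, g/mol

    def sn2_mass_ug(v_i2_ml, c_i2_mol_l, v_thio_ml, c_thio_mol_l):
        """Return micrograms of Sn2+ in the sample."""
        n_i2_total = v_i2_ml / 1000.0 * c_i2_mol_l                 # mol I2 added
        n_i2_excess = 0.5 * (v_thio_ml / 1000.0 * c_thio_mol_l)    # mol I2 left over
        n_sn = n_i2_total - n_i2_excess                            # mol Sn2+ oxidized
        return n_sn * M_SN * 1e6

    # e.g. 10.00 mL of 0.0025 M I2, back-titrated with 8.40 mL of 0.005 M thiosulphate
    print(f"{sn2_mass_ug(10.00, 0.0025, 8.40, 0.005):.0f} ug Sn2+")  # ~475 ug
    ```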

  5. Network Traffic Analysis With Query Driven VisualizationSC 2005HPC Analytics Results

    Energy Technology Data Exchange (ETDEWEB)

    Stockinger, Kurt; Wu, Kesheng; Campbell, Scott; Lau, Stephen; Fisk, Mike; Gavrilov, Eugene; Kent, Alex; Davis, Christopher E.; Olinger, Rick; Young, Rob; Prewett, Jim; Weber, Paul; Caudell, Thomas P.; Bethel, E. Wes; Smith, Steve

    2005-09-01

    Our analytics challenge is to identify, characterize, and visualize anomalous subsets of large collections of network connection data. We use a combination of HPC resources, advanced algorithms, and visualization techniques. To effectively and efficiently identify the salient portions of the data, we rely on a multi-stage workflow that includes data acquisition, summarization (feature extraction), novelty detection, and classification. Once these subsets of interest have been identified and automatically characterized, we use a state-of-the-art high-dimensional query system to extract data subsets for interactive visualization. Our approach is equally useful for other large-data analysis problems where it is more practical to identify interesting subsets of the data for visualization than to render all data elements. By reducing the size of the rendering workload, we enable highly interactive and useful visualizations. As a result of this work we were able to analyze six months' worth of data interactively, with response times two orders of magnitude shorter than with conventional methods.

  6. Methods used by Elsam for monitoring precision and accuracy of analytical results

    Energy Technology Data Exchange (ETDEWEB)

    Hinnerskov Jensen, J [Soenderjyllands Hoejspaendingsvaerk, Faelleskemikerne, Aabenraa (Denmark)]

    1996-12-01

    Performing round robins at regular intervals is the primary method used by Elsam for monitoring the precision and accuracy of analytical results. The first round robin was started in 1974, and today 5 round robins are running. These are focused on: boiler water and steam, lubricating oils, coal, ion chromatography, and dissolved gases in transformer oils. Besides the power plant laboratories in Elsam, the participants are power plant laboratories from the rest of Denmark, industrial and commercial laboratories in Denmark, and finally foreign laboratories. The calculated standard deviations or reproducibilities are compared with acceptable values. These values originate from ISO, ASTM and the like, or from own experience. Besides providing the laboratories with a tool to check their momentary performance, the round robins are very well suited for evaluating systematic developments on a long-term basis. By splitting up the uncertainty according to methods, sample preparation/analysis, etc., knowledge can be extracted from the round robins for use in many other situations. (au)
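
    One common way to score a laboratory against its round-robin peers is the proficiency-testing z-score of ISO 13528. The abstract does not name Elsam's exact statistic, so the sketch below is illustrative, with invented data.

    ```python
    # Illustrative z-score check of round-robin results against the group,
    # in the spirit of ISO 13528 proficiency testing. Data invented.
    import statistics

    def z_score(lab_result, assigned_value, sigma_target):
        """z = (x - X) / sigma_p; |z| <= 2 satisfactory, 2 < |z| < 3 questionable."""
        return (lab_result - assigned_value) / sigma_target

    results = [4.92, 5.10, 5.03, 4.88, 5.21, 4.97, 5.05]  # mg/kg, all labs
    assigned = statistics.median(results)   # robust assigned value
    sigma_p = 0.10                          # acceptable SD, e.g. from ISO/ASTM

    for x in results:
        z = z_score(x, assigned, sigma_p)
        flag = "ok" if abs(z) <= 2 else ("questionable" if abs(z) < 3 else "action")
        print(f"x={x:.2f}  z={z:+.1f}  {flag}")
    ```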

  7. Paraxial light distribution in the focal region of a lens: a comparison of several analytical solutions and a numerical result

    Science.gov (United States)

    Wu, Yang; Kelly, Damien P.

    2014-12-01

    The distribution of the complex field in the focal region of a lens is a classical optical diffraction problem. Today, it remains of significant theoretical importance for understanding the properties of imaging systems. In the paraxial regime, it is possible to find analytical solutions in the neighborhood of the focus when a plane wave is incident on a focusing lens whose finite extent is limited by a circular aperture. For example, in Born and Wolf's treatment of this problem, two different but mathematically equivalent analytical solutions are presented that describe the 3D field distribution using infinite sums of Lommel functions of the U and V types. An alternative solution expresses the distribution in terms of Zernike polynomials, and was presented by Nijboer in 1947. More recently, Cao derived an alternative analytical solution by expanding the Fresnel kernel using a Taylor series expansion. In practical calculations, however, only a finite number of terms from these infinite series expansions is actually used to calculate the distribution in the focal region. In this manuscript, we compare and contrast each of these different solutions to a numerically calculated result, paying particular attention to how quickly each solution converges for a range of different spatial locations behind the focusing lens. We also examine the time taken to calculate each of the analytical solutions. The numerical solution is calculated in a polar coordinate system and is semi-analytic: the integration over the angle is solved analytically, while the radial coordinate is sampled with a finite sampling interval and then numerically integrated. This produces an infinite set of replicas in the diffraction plane, located in circular rings centered on the optical axis, each with a radius proportional to its replica order. These circular replicas are shown to be fundamentally different from the replicas that arise in a Cartesian coordinate system.

  8. Assuring the Quality of Test Results in the Field of Nuclear Techniques and Ionizing Radiation. The Practical Implementation of Section 5.9 of the EN ISO/IEC 17025 Standard

    Science.gov (United States)

    Cucu, Daniela; Woods, Mike

    2008-08-01

    The paper aims to present a practical approach for testing laboratories to ensure the quality of their test results. It is based on the experience gained in assessing a large number of testing laboratories, discussing with management and staff, reviewing results obtained in national and international PTs and ILCs and exchanging information in the EA laboratory committee. According to EN ISO/IEC 17025, an accredited laboratory has to implement a programme to ensure the quality of its test results for each measurand. Pre-analytical, analytical and post-analytical measures shall be applied in a systematic manner. They shall include both quality control and quality assurance measures. When designing the quality assurance programme a laboratory should consider pre-analytical activities (like personnel training, selection and validation of test methods, qualifying equipment), analytical activities ranging from sampling, sample preparation, instrumental analysis and post-analytical activities (like decoding, calculation, use of statistical tests or packages, management of results). Designed on different levels (analyst, quality manager and technical manager), including a variety of measures, the programme shall ensure the validity and accuracy of test results, the adequacy of the management system, prove the laboratory's competence in performing tests under accreditation and last but not least show the comparability of test results. Laboratory management should establish performance targets and review periodically QC/QA results against them, implementing appropriate measures in case of non-compliance.

  9. Assuring the Quality of Test Results in the Field of Nuclear Techniques and Ionizing Radiation. The Practical Implementation of Section 5.9 of the EN ISO/IEC 17025 Standard

    International Nuclear Information System (INIS)

    Cucu, Daniela; Woods, Mike

    2008-01-01

    The paper aims to present a practical approach for testing laboratories to ensure the quality of their test results. It is based on the experience gained in assessing a large number of testing laboratories, discussing with management and staff, reviewing results obtained in national and international PTs and ILCs and exchanging information in the EA laboratory committee. According to EN ISO/IEC 17025, an accredited laboratory has to implement a programme to ensure the quality of its test results for each measurand. Pre-analytical, analytical and post-analytical measures shall be applied in a systematic manner. They shall include both quality control and quality assurance measures. When designing the quality assurance programme a laboratory should consider pre-analytical activities (like personnel training, selection and validation of test methods, qualifying equipment), analytical activities ranging from sampling, sample preparation, instrumental analysis and post-analytical activities (like decoding, calculation, use of statistical tests or packages, management of results). Designed on different levels (analyst, quality manager and technical manager), including a variety of measures, the programme shall ensure the validity and accuracy of test results, the adequacy of the management system, prove the laboratory's competence in performing tests under accreditation and last but not least show the comparability of test results. Laboratory management should establish performance targets and review periodically QC/QA results against them, implementing appropriate measures in case of non-compliance.

  10. Modelling by partial least squares the relationship between the HPLC mobile phases and analytes on phenyl column.

    Science.gov (United States)

    Markopoulou, Catherine K; Kouskoura, Maria G; Koundourellis, John E

    2011-06-01

    Twenty-five descriptors and 61 structurally different analytes have been used in a partial least squares (PLS) projection to latent structures model in order to study their chromatographic interaction mechanism on a phenyl column. According to the model, 240 retention times of the analytes, expressed as the Y variable (log k) at different % MeOH mobile-phase concentrations, have been correlated with their most important theoretical structural or molecular descriptors. The goodness-of-fit was estimated by the coefficient of multiple determination r² (0.919) and the root mean square error of estimation (RMSEE = 0.1283), with a predictive ability (Q²) of 0.901. The model was further validated using cross-validation (CV), by 20 response permutations (r² intercept 0.0146, Q² intercept −0.136), and by external prediction. The contribution of certain mechanistic interactions between the analytes, the mobile phase and the column, whether proportional or counterbalancing, is also studied. To evaluate the influence on Y of every variable in a PLS model, the VIP (variable importance in the projection) plot provides evidence that lipophilicity (expressed as Log D, Log P), polarizability, refractivity and the eluting power of the mobile phase are dominant in the retention mechanism on a phenyl column. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
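
    A minimal sketch of such a PLS workflow, assuming scikit-learn in place of the chemometrics package actually used, and with synthetic descriptor data rather than the study's 25 descriptors and measured retention times:

    ```python
    # PLS regression of log k on molecular descriptors, with R2 and
    # cross-validated Q2. Data are synthetic placeholders.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    X = rng.normal(size=(61, 25))             # 61 analytes x 25 descriptors
    y = X[:, 0] * 0.8 - X[:, 1] * 0.3 + rng.normal(scale=0.1, size=61)  # log k

    pls = PLSRegression(n_components=3).fit(X, y)
    r2 = pls.score(X, y)                      # goodness-of-fit (R^2)

    y_cv = cross_val_predict(pls, X, y, cv=7)  # cross-validated predictions
    q2 = 1 - np.sum((y - y_cv.ravel())**2) / np.sum((y - y.mean())**2)

    print(f"R2 = {r2:.3f}, Q2 = {q2:.3f}")
    ```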

  11. Analytic modeling, simulation and interpretation of broadband beam coupling impedance bench measurements

    Energy Technology Data Exchange (ETDEWEB)

    Niedermayer, U., E-mail: niedermayer@temf.tu-darmstadt.de [Institut für Theorie Elektromagnetischer Felder (TEMF), Technische Universität Darmstadt, Schloßgartenstraße 8, 64289 Darmstadt (Germany)]; Eidam, L. [Institut für Theorie Elektromagnetischer Felder (TEMF), Technische Universität Darmstadt, Schloßgartenstraße 8, 64289 Darmstadt (Germany)]; Boine-Frankenheim, O. [Institut für Theorie Elektromagnetischer Felder (TEMF), Technische Universität Darmstadt, Schloßgartenstraße 8, 64289 Darmstadt (Germany); GSI Helmholtzzentrum für Schwerionenforschung, Planckstraße 1, 64291 Darmstadt (Germany)]

    2015-03-11

    First, a generalized theoretical approach towards beam coupling impedances and stretched-wire measurements is introduced. Applied to a circularly symmetric setup, this approach allows beam and wire impedances to be compared. The conversion formulas for TEM scattering parameters from measurements to impedances are thoroughly analyzed and compared to the analytical beam impedance solution. A proof of validity for the distributed impedance formula is given. The interaction of the beam or the TEM wave with dispersive material such as ferrite is discussed. The dependence of the obtained beam impedance on the relativistic velocity β is investigated and found to be material-property dependent. Second, numerical simulations of wakefields and scattering parameters are compared. The applicability of scattering parameter conversion formulas for finite device length is investigated. Laboratory measurement results for a circularly symmetric test setup, i.e. a ferrite ring, are shown and compared to analytic and numeric models. The optimization of the measurement process and error-reduction strategies are discussed.

  12. Analytical modeling and analysis of magnetic field and torque for novel axial flux eddy current couplers with PM excitation

    Science.gov (United States)

    Li, Zhao; Wang, Dazhi; Zheng, Di; Yu, Linxin

    2017-10-01

    Rotational permanent magnet eddy current couplers are promising devices for torque and speed transmission without any mechanical contact. In this study, flux-concentration disk-type permanent magnet eddy current couplers with a double conductor rotor are investigated. Given the computational cost of the accurate three-dimensional finite element method, this paper proposes a mixed two-dimensional analytical modeling approach. Based on this approach, closed-form expressions of the magnetic field, eddy current, electromagnetic force and torque for such devices are obtained. Finally, a three-dimensional finite element method is employed to validate the analytical results. In addition, a prototype is manufactured and tested for the torque-speed characteristic.

  13. Validation of a Blood-Based Laboratory Test to Aid in the Confirmation of a Diagnosis of Schizophrenia

    Directory of Open Access Journals (Sweden)

    Emanuel Schwarz

    2010-05-01

    Full Text Available We describe the validation of a serum-based test developed by Rules-Based Medicine which can be used to help confirm the diagnosis of schizophrenia. In preliminary studies using multiplex immunoassay profiling technology, we identified a disease signature comprising 51 analytes which could distinguish schizophrenia (n = 250) from control (n = 230) subjects. In the next stage, these analytes were developed as a refined 51-plex immunoassay panel for validation using a large independent cohort of schizophrenia (n = 577) and control (n = 229) subjects. The resulting test yielded an overall sensitivity of 83% and specificity of 83%, with a receiver operating characteristic area under the curve (ROC-AUC) of 89%. These 51 immunoassays and the associated decision rule delivered a sensitive and specific prediction for the presence of schizophrenia in patients compared to matched healthy controls.
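
    The headline figures (sensitivity, specificity, ROC-AUC) can be reproduced for any panel-plus-decision-rule in a few lines. The sketch below uses synthetic data and a logistic-regression rule, not the Rules-Based Medicine classifier.

    ```python
    # Evaluating a multi-analyte panel's decision rule on a validation cohort.
    # Synthetic stand-in for 51 immunoassay values per subject.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score, confusion_matrix

    rng = np.random.default_rng(1)
    n_case, n_ctrl = 577, 229
    X = np.vstack([rng.normal(0.4, 1.0, size=(n_case, 51)),   # cases
                   rng.normal(0.0, 1.0, size=(n_ctrl, 51))])  # controls
    y = np.r_[np.ones(n_case), np.zeros(n_ctrl)]

    clf = LogisticRegression(max_iter=1000).fit(X, y)
    p = clf.predict_proba(X)[:, 1]

    tn, fp, fn, tp = confusion_matrix(y, p >= 0.5).ravel()
    print(f"sensitivity={tp/(tp+fn):.2f} specificity={tn/(tn+fp):.2f} "
          f"AUC={roc_auc_score(y, p):.2f}")
    ```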

  14. Fluxball magnetic field analysis using a hybrid analytical/FEM/BEM with equivalent currents

    International Nuclear Information System (INIS)

    Fernandes, João F.P.; Camilo, Fernando M.; Machado, V. Maló

    2016-01-01

    In this paper, a fluxball electric machine is analyzed with respect to magnetic flux, force, and torque. A novel method is proposed, based on a special hybrid FEM/BEM (Finite Element Method/Boundary Element Method) with equivalent currents, using an analytical treatment for the source field determination. The method can be applied to evaluate the magnetic field in axisymmetric problems in the presence of several magnetic materials. Results obtained with a commercial finite element analysis tool are presented to validate the proposed method. - Highlights: • The fluxball machine magnetic field is analyzed by a new FEM/BEM/analytical method. • The method is adequate for axisymmetric non-homogeneous magnetic field problems. • The source magnetic field is evaluated considering a non-magnetic equivalent problem. • Material magnetization vectors are accounted for by using equivalent currents. • A strong reduction of the finite element domain is achieved.

  15. Independent verification and validation testing of the FLASH computer code, Version 3.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test: correctness of the FORTRAN coding, computational accuracy, and suitability for simulating actual hydrologic conditions. The testing was performed using a structured evaluation protocol which consisted of: blind testing, independent applications, and graduated difficulty of test cases. Both quantitative and qualitative testing was performed, by evaluating relative root mean square values and by graphical comparison of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and the correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. In conclusion, all aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.
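
    A minimal sketch of the quantitative comparison described above: the relative root-mean-square difference between code output and an analytical solution. The head profile and tolerance are illustrative only.

    ```python
    # Relative RMS error of a numerical solution against a reference.
    import numpy as np

    def relative_rms(numerical, analytical):
        numerical = np.asarray(numerical, dtype=float)
        analytical = np.asarray(analytical, dtype=float)
        return np.sqrt(np.mean((numerical - analytical) ** 2)) / np.sqrt(
            np.mean(analytical ** 2))

    # e.g. a 1-D saturated-flow head profile vs. the exact linear solution
    x = np.linspace(0.0, 1.0, 11)
    h_exact = 10.0 - 5.0 * x              # exact head, arbitrary units
    h_code = h_exact + np.random.default_rng(2).normal(0, 0.02, x.size)
    print(f"relative RMS = {relative_rms(h_code, h_exact):.4f}")
    ```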

  16. Thresholds for statistical and clinical significance in systematic reviews with meta-analytic methods

    DEFF Research Database (Denmark)

    Jakobsen, Janus Christian; Wetterslev, Jorn; Winkel, Per

    2014-01-01

    BACKGROUND: Thresholds for statistical significance when assessing meta-analysis results are insufficiently demonstrated by traditional 95% confidence intervals and P-values. Assessment of intervention effects in systematic reviews with meta-analysis deserves greater rigour. METHODS: Methodologies for assessing statistical and clinical significance of intervention effects in systematic reviews were considered. Balancing simplicity and comprehensiveness, an operational procedure was developed, based mainly on The Cochrane Collaboration methodology and the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) guidelines. RESULTS: We propose an eight-step procedure for better validation of meta-analytic results in systematic reviews: (1) obtain the 95% confidence intervals and the P-values from both fixed-effect and random-effects meta-analyses and report the most...
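
    Step (1) of the proposed procedure can be sketched directly: pooled estimates and 95% confidence intervals from a fixed-effect and a DerSimonian-Laird random-effects model, using the standard inverse-variance formulas and invented study data.

    ```python
    # Fixed-effect vs. DerSimonian-Laird random-effects pooling (illustrative).
    import numpy as np

    y = np.array([0.30, 0.15, 0.45, 0.10, 0.25])    # study effect estimates
    v = np.array([0.01, 0.02, 0.015, 0.03, 0.012])  # within-study variances

    def pooled(y, v):
        w = 1.0 / v
        est = np.sum(w * y) / np.sum(w)
        se = np.sqrt(1.0 / np.sum(w))
        return est, est - 1.96 * se, est + 1.96 * se

    fe, fe_lo, fe_hi = pooled(y, v)                 # fixed effect

    w = 1.0 / v                                     # DerSimonian-Laird tau^2
    q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
    tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    re, re_lo, re_hi = pooled(y, v + tau2)          # random effects

    print(f"fixed:  {fe:.3f} [{fe_lo:.3f}, {fe_hi:.3f}]")
    print(f"random: {re:.3f} [{re_lo:.3f}, {re_hi:.3f}]  tau^2={tau2:.4f}")
    ```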

  17. Analytical method for optimization of maintenance policy based on available system failure data

    International Nuclear Information System (INIS)

    Coria, V.H.; Maximov, S.; Rivas-Dávalos, F.; Melchor, C.L.; Guardado, J.L.

    2015-01-01

    An analytical optimization method for a preventive maintenance (PM) policy with minimal repair at failure, periodic maintenance, and replacement is proposed for systems with historical failure time data influenced by a current PM policy. The method includes a new imperfect PM model based on the Weibull distribution and incorporates both the current maintenance interval T0 and the optimal maintenance interval T to be found. The Weibull parameters are analytically estimated using maximum likelihood estimation. Based on this model, the optimal number of PM actions and the optimal maintenance interval for minimizing the expected cost over an infinite time horizon are also analytically determined. A number of examples are presented, involving different failure time data and current maintenance intervals, to analyze how the proposed analytical optimization method for a periodic PM policy performs in response to changes in the distribution of the failure data and in the current maintenance interval. - Highlights: • An analytical optimization method for preventive maintenance (PM) policy is proposed. • A new imperfect PM model is developed. • The Weibull parameters are analytically estimated using maximum likelihood. • The optimal maintenance interval and number of PM are also analytically determined. • The model is validated by several numerical examples.
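
    The parameter-estimation step can be sketched with an off-the-shelf fit; the paper derives the maximum-likelihood estimates analytically, whereas the example below leans on scipy and synthetic failure times.

    ```python
    # Maximum-likelihood Weibull fit to historical failure times (illustrative).
    import numpy as np
    from scipy.stats import weibull_min

    rng = np.random.default_rng(3)
    failure_times = weibull_min.rvs(c=1.8, scale=1000.0, size=50,
                                    random_state=rng)  # synthetic hours-to-failure

    # floc=0 pins the location parameter so only shape (beta) and scale (eta)
    # are estimated, as in the usual two-parameter Weibull model
    beta, _, eta = weibull_min.fit(failure_times, floc=0)
    print(f"shape beta = {beta:.2f}, scale eta = {eta:.0f} h")
    ```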

  18. Weak-field asymptotic theory of tunneling ionization: benchmark analytical results for two-electron atoms

    International Nuclear Information System (INIS)

    Trinh, Vinh H; Morishita, Toru; Tolstikhin, Oleg I

    2015-01-01

    The recently developed many-electron weak-field asymptotic theory of tunneling ionization of atoms and molecules in an external static electric field (Tolstikhin et al 2014, Phys. Rev. A 89, 013421) is extended to the first-order terms in the asymptotic expansion in field. To highlight the results, here we present a simple analytical formula giving the rate of tunneling ionization of two-electron atoms H − and He. Comparison with fully-correlated ab initio calculations available for these systems shows that the first-order theory works quantitatively in a wide range of fields up to the onset of over-the-barrier ionization and hence is expected to find numerous applications in strong-field physics. (fast track communication)

  19. Analytical Modelling of Wireless Power Transfer (WPT) Systems for Electric Vehicle Application

    Energy Technology Data Exchange (ETDEWEB)

    Chinthavali, Madhu Sudhan [ORNL]; Campbell, Steven L [ORNL]

    2016-01-01

    This paper presents an analytical model for a wireless power transfer system used in electric vehicle applications. The equivalent circuit model for each major component of the system is described, including the input voltage source, resonant network, transformer, nonlinear diode rectifier load, etc. Based on the circuit model, the primary-side compensation capacitance, equivalent input impedance, and active/reactive power are calculated, which provides a guideline for parameter selection. Moreover, the voltage gain curve from dc input to dc output is derived as well. A hardware prototype with a series-parallel resonant stage was built to verify the developed model. The experimental results from the hardware are compared with the model-predicted results to show the validity of the model.
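
    For orientation, the simplest parameter-selection rule implied by such models is picking a compensation capacitor to resonate with the coil at the operating frequency. The sketch below uses the plain series-resonance condition with assumed values, ignoring the topology-specific corrections a series-parallel stage with a rectifier load would need.

    ```python
    # Series-resonance sizing of a primary compensation capacitor (assumed values).
    import math

    f0 = 85e3      # target resonant frequency, Hz (assumed)
    L_p = 120e-6   # primary coil inductance, H (assumed)

    C_p = 1.0 / ((2 * math.pi * f0) ** 2 * L_p)  # resonance: w0^2 * L * C = 1
    print(f"C_p = {C_p * 1e9:.1f} nF")           # ~29.2 nF for these values
    ```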

  20. MARS Validation Plan and Status

    International Nuclear Information System (INIS)

    Ahn, Seung-hoon; Cho, Yong-jin

    2008-01-01

    The KINS Reactor Thermal-hydraulic Analysis System (KINS-RETAS) under development is directed toward a realistic analysis approach of best-estimate (BE) codes and realistic assumptions. In this system, MARS is the pivotal component, providing the BE thermal-hydraulic (T-H) response in the core and reactor coolant system to various operational transients and accident conditions. As required for other BE codes, qualification is essential to ensure reliable and reasonable accuracy for a targeted MARS application. Validation is a key element of code qualification, and determines the capability of a computer code in predicting the major phenomena expected to occur. The MARS validation was made by its developer KAERI, on the basic premise that its backbone code RELAP5/MOD3.2 is well qualified against analytical solutions and test or operational data. A screening was made to select the test data for MARS validation; some models transplanted from RELAP5, if already validated and found to be acceptable, were screened out from assessment. This seems reasonable, but it does not demonstrate whether the code's adequacy complies with the software QA guidelines. In particular, there may be considerable difficulty in validating life-cycle products such as code updates or modifications. This paper presents the plan for MARS validation and the current implementation status.

  1. Formative assessment and learning analytics

    NARCIS (Netherlands)

    Tempelaar, D.T.; Heck, A.; Cuypers, H.; van der Kooij, H.; van de Vrie, E.; Suthers, D.; Verbert, K.; Duval, E.; Ochoa, X.

    2013-01-01

    Learning analytics seeks to enhance the learning process through systematic measurements of learning related data, and informing learners and teachers of the results of these measurements, so as to support the control of the learning process. Learning analytics has various sources of information,

  2. Analytically derived weighting factors for transmission tomography cone beam projections

    International Nuclear Information System (INIS)

    Yao Weiguang; Leszczynski, Konrad

    2009-01-01

    Weighting factors, which define the contributions of individual voxels of a 3D object to individual projection elements (pixels) on the detector, are the basic elements required in iterative tomographic reconstruction from transmission projections. Exact, or as accurate as possible, values for the weighting factors are required in high-resolution reconstructions. The geometric complexity of the problem, however, makes it difficult to obtain exact weighting factor values. In this work, we derive an analytical expression for the weighting factors in cone beam projection geometry. The resulting formula is validated and applied to reconstruction from megavoltage and kilovoltage x-ray cone beam projections. The reconstruction speed and accuracy are significantly improved by using the weighting factor values.
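
    One geometric ingredient of any such weighting factor is the chord length of a ray through a voxel. Below is a minimal slab-method sketch, not the paper's closed-form expression; a full system matrix would repeat this per voxel-pixel pair.

    ```python
    # Intersection length of a ray with one axis-aligned voxel (slab method).
    import numpy as np

    def ray_voxel_length(origin, direction, vmin, vmax):
        """Length of ray origin + t*direction (t >= 0) inside box [vmin, vmax]."""
        direction = direction / np.linalg.norm(direction)
        with np.errstate(divide="ignore"):
            t1 = (vmin - origin) / direction
            t2 = (vmax - origin) / direction
        t_near = np.max(np.minimum(t1, t2))
        t_far = np.min(np.maximum(t1, t2))
        return max(0.0, t_far - max(t_near, 0.0))

    # ray from a source through a 1 mm^3 voxel centred at the origin
    length = ray_voxel_length(np.array([-10.0, 0.2, 0.1]),
                              np.array([1.0, 0.0, 0.0]),
                              np.array([-0.5, -0.5, -0.5]),
                              np.array([0.5, 0.5, 0.5]))
    print(f"chord length = {length:.3f} mm")  # 1.000 for this geometry
    ```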

  3. Solar neutrino masses and mixing from bilinear R-parity broken supersymmetry: Analytical versus numerical results

    Science.gov (United States)

    Díaz, M.; Hirsch, M.; Porod, W.; Romão, J.; Valle, J.

    2003-07-01

    We give an analytical calculation of solar neutrino masses and mixing at one-loop order within bilinear R-parity breaking supersymmetry, and compare our results to the exact numerical calculation. Our method is based on a systematic perturbative expansion of R-parity violating vertices to leading order. We find in general quite good agreement between the approximate and full numerical calculations, but the approximate expressions are much simpler to implement. Our formalism works especially well for the case of the large mixing angle Mikheyev-Smirnov-Wolfenstein solution, now strongly favored by the recent KamLAND reactor neutrino data.

  4. An analytic solution of the static problem of inclined risers conveying fluid

    KAUST Repository

    Alfosail, Feras

    2016-05-28

    We use the method of matched asymptotic expansion to develop an analytic solution to the static problem of clamped–clamped inclined risers conveying fluid. The inclined riser is modeled as an Euler–Bernoulli beam taking into account its self-weight, mid-plane stretching, an applied axial tension, and the internal fluid velocity. The solution consists of three parts: an outer solution valid away from the two boundaries and two inner solutions valid near the two ends. The three solutions are then matched and combined into a so-called composite expansion. A Newton–Raphson method is used to determine the value of the mid-plane stretching corresponding to each applied tension and internal velocity. The analytic solution is in good agreement with solutions obtained by other methods for large values of applied tension. Therefore, it can be used to replace other mathematical solution methods that suffer from numerical limitations and high computational cost. © 2016 Springer Science+Business Media Dordrecht

  5. Tank 241-T-105, cores 205 and 207 analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final laboratory report for tank 241-T-105 push mode core segments collected between June 24, 1997 and June 30, 1997. The segments were subsampled and analyzed in accordance with the Tank Push Mode Core Sampling and Analysis Plan (TSAP) (Field, 1997), the Tank Safety Screening Data Quality Objective (Safety DQO) (Dukelow, et al., 1995) and Tank 241-T-105 Sample Analysis (memo) (Field, 1997a). The analytical results are included in Table 1. None of the subsamples submitted for differential scanning calorimetry (DSC) analysis or total alpha activity (AT) exceeded the notification limits stated in the TSAP (Field, 1997). The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems (TWRS) Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997) and are not considered in this report.

  6. Large leak sodium-water reaction code SWACS and its validation

    International Nuclear Information System (INIS)

    Miyake, O.; Shindo, Y.; Hiroi, H.; Tanabe, H.; Sato, M.

    1982-01-01

    A computer code, SWACS, for analyzing the large leak accident of LMFBR steam generators has been developed and validated. Data from five tests performed in the SWAT-3 test facility were compared with code results. In each of the SWAT-3 tests, a double-ended guillotine rupture of one tube was simulated in a helical coil steam generator model with a test vessel scaled 1/2.5 relative to the prototype SG. The analytical results, including the initial pressure spike, the propagated pressure in the secondary system, and the quasi-steady pressure, indicate that the overall large-leak event can be predicted in reasonably good agreement with the measurements.

  7. Experimental validation of the twins prediction program for rolling noise. Pt.2: results

    NARCIS (Netherlands)

    Thompson, D.J.; Fodiman, P.; Mahé, H.

    1996-01-01

    Two extensive measurement campaigns have been carried out to validate the TWINS prediction program for rolling noise, as described in part 1 of this paper. This second part presents the experimental results of vibration and noise during train pass-bys and compares them with predictions from the TWINS model.

  8. Analytical solution of the thermo-mechanical stresses in a multilayered composite pressure vessel considering the influence of the closed ends

    International Nuclear Information System (INIS)

    Zhang, Q.; Wang, Z.W.; Tang, C.Y.; Hu, D.P.; Liu, P.Q.; Xia, L.Z.

    2012-01-01

    Limited work has been reported on determining the thermo-mechanical stresses in a multilayered composite pressure vessel when the influence of its closed ends is considered. In this study, an analytical solution was derived for determining the stress distribution of a multilayered composite pressure vessel subjected to an internal fluid pressure and a thermal load, based on thermo-elasticity theory. In the solution, a pseudo extrusion pressure was proposed to emulate the effect of the closed ends of the pressure vessel. To validate the analytical solution, the stress distribution of the pressure vessel was also computed using finite element (FE) method. It was found that the analytical results were in good agreement with the computational ones, and the effect of thermal load on the stress distribution was discussed in detail. The proposed analytical solution provides an exact means to design multilayered composite pressure vessels. Highlights: ► The thermal-mechanical stress was derived for a multilayered pressure vessel. ► A new pseudo extrusion pressure was proposed to emulate the effect of closed ends. ► The analytical results are in good agreement with the computational ones using FEM. ► The solution provides an exact way to design the multilayered pressure vessel.
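
    For orientation, the classical single-layer Lamé solution is the building block that such multilayer analyses generalize. Below is a minimal sketch with illustrative inputs, omitting the thermal terms and the closed-end axial coupling treated in the paper.

    ```python
    # Lame thick-walled cylinder stresses under internal/external pressure.
    import numpy as np

    def lame_stresses(r, a, b, p_in, p_out):
        """Radial and hoop stress at radius r for a cylinder a <= r <= b."""
        A = (p_in * a**2 - p_out * b**2) / (b**2 - a**2)
        B = (p_in - p_out) * a**2 * b**2 / (b**2 - a**2)
        return A - B / r**2, A + B / r**2   # sigma_r, sigma_theta

    r = np.linspace(0.05, 0.08, 4)                    # m
    sr, st = lame_stresses(r, 0.05, 0.08, 20e6, 0.0)  # 20 MPa internal pressure
    for ri, s1, s2 in zip(r, sr, st):
        print(f"r={ri:.3f} m  sigma_r={s1/1e6:+7.2f} MPa  "
              f"sigma_theta={s2/1e6:+7.2f} MPa")
    ```

    As a sanity check, the radial stress equals -p_in at the inner wall and -p_out at the outer wall, which the printed values reproduce.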

  9. A Semi-Analytical Method for the PDFs of A Ship Rolling in Random Oblique Waves

    Science.gov (United States)

    Liu, Li-qin; Liu, Ya-liu; Xu, Wan-hai; Li, Yan; Tang, You-gang

    2018-03-01

    The PDFs (probability density functions) and probability of a ship rolling under the random parametric and forced excitations were studied by a semi-analytical method. The rolling motion equation of the ship in random oblique waves was established. The righting arm obtained by the numerical simulation was approximately fitted by an analytical function. The irregular waves were decomposed into two Gauss stationary random processes, and the CARMA (2, 1) model was used to fit the spectral density function of parametric and forced excitations. The stochastic energy envelope averaging method was used to solve the PDFs and the probability. The validity of the semi-analytical method was verified by the Monte Carlo method. The C11 ship was taken as an example, and the influences of the system parameters on the PDFs and probability were analyzed. The results show that the probability of ship rolling is affected by the characteristic wave height, wave length, and the heading angle. In order to provide proper advice for the ship's manoeuvring, the parametric excitations should be considered appropriately when the ship navigates in the oblique seas.

  10. Update of Standard Practices for New Method Validation in Forensic Toxicology.

    Science.gov (United States)

    Wille, Sarah M R; Coucke, Wim; De Baere, Thierry; Peters, Frank T

    2017-01-01

    International agreement concerning validation guidelines is important for obtaining quality forensic bioanalytical research and routine applications, as it all starts with the reporting of reliable analytical data. Standards for fundamental validation parameters are provided in guidelines such as those from the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), the German-speaking Gesellschaft für Toxikologie und Forensische Chemie (GTFCH) and the Scientific Working Group for Forensic Toxicology (SWGTOX). These validation parameters include selectivity, matrix effects, method limits, calibration, accuracy and stability, as well as other parameters such as carryover, dilution integrity and incurred sample reanalysis. It is, however, not easy for laboratories to implement these guidelines in practice, as these international guidelines remain nonbinding protocols that depend on the applied analytical technique and that need to be adapted to the analyst's method requirements and the application type. In this manuscript, a review of the current guidelines and literature concerning bioanalytical validation parameters in a forensic context is given and discussed. In addition, suggestions for the experimental set-up, the pros and cons of statistical approaches, and adequate acceptance criteria for the validation of bioanalytical applications are given. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  11. Learning Style Scales: a valid and reliable questionnaire

    Directory of Open Access Journals (Sweden)

    Abdolghani Abdollahimohammad

    2014-08-01

    Full Text Available Purpose: Learning-style instruments assist students in developing their own learning strategies and outcomes, in eliminating learning barriers, and in acknowledging peer diversity. Only a few psychometrically validated learning-style instruments are available. This study aimed to develop a valid and reliable learning-style instrument for nursing students. Methods: A cross-sectional survey study was conducted in two nursing schools in two countries. A purposive sample of 156 undergraduate nursing students participated in the study. Face and content validity were obtained from an expert panel. The LSS construct was established using principal axis factoring (PAF) with oblimin rotation, a scree plot test, and parallel analysis (PA). The reliability of the LSS was tested using Cronbach's α, corrected item-total correlations, and test-retest. Results: Factor analysis revealed five components, confirmed by PA and a relatively clear curve on the scree plot. Component strength and interpretability were also confirmed. The factors were labeled as perceptive, solitary, analytic, competitive, and imaginative learning styles. Cronbach's α was > 0.70 for all subscales in both study populations. The corrected item-total correlations were > 0.30 for the items in each component. Conclusion: The LSS is a valid and reliable inventory for evaluating learning style preferences in nursing students in various multicultural environments.
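
    The reliability statistic used above is easy to compute directly. A small sketch of Cronbach's α, with an invented 5-respondent, 6-item response matrix:

    ```python
    # Cronbach's alpha for a set of Likert items (rows = respondents).
    import numpy as np

    def cronbach_alpha(items):
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
        return k / (k - 1) * (1 - item_vars / total_var)

    responses = np.array([[4, 5, 4, 4, 5, 4],
                          [2, 3, 2, 3, 2, 2],
                          [5, 5, 4, 5, 5, 5],
                          [3, 3, 3, 2, 3, 3],
                          [4, 4, 5, 4, 4, 5]])
    print(f"alpha = {cronbach_alpha(responses):.2f}")  # > 0.70 indicates adequacy
    ```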

  12. Separation of very hydrophobic analytes by micellar electrokinetic chromatography IV. Modeling of the effective electrophoretic mobility from carbon number equivalents and octanol-water partition coefficients.

    Science.gov (United States)

    Huhn, Carolin; Pyell, Ute

    2008-07-11

    It is investigated whether the relationships derived within a previously developed optimization scheme for micellar electrokinetic chromatography can be used to model the effective electrophoretic mobilities of analytes that differ strongly in their properties (polarity and type of interaction with the pseudostationary phase). The modeling is based on two parameter sets: (i) carbon number equivalents or octanol-water partition coefficients as analyte descriptors, and (ii) four coefficients describing properties of the separation electrolyte (based on retention data for a homologous series of alkyl phenyl ketones used as reference analytes). The applicability of the proposed model is validated by comparing experimental and calculated effective electrophoretic mobilities. The results demonstrate that the model can effectively be used to predict the effective electrophoretic mobilities of neutral analytes from the determined carbon number equivalents or from octanol-water partition coefficients, provided that the solvation parameters of the analytes of interest are similar to those of the reference analytes.

  13. Perioperative and ICU Healthcare Analytics within a Veterans Integrated System Network: a Qualitative Gap Analysis.

    Science.gov (United States)

    Mudumbai, Seshadri; Ayer, Ferenc; Stefanko, Jerry

    2017-08-01

    Health care facilities are implementing analytics platforms as a way to document quality of care. However, few gap analyses exist for platforms specifically designed for patients treated in the Operating Room, Post-Anesthesia Care Unit, and Intensive Care Unit (ICU). As part of a quality improvement effort, we undertook a gap analysis of an existing analytics platform within the Veterans Healthcare Administration. The objectives were to identify themes associated with (1) current clinical use cases and stakeholder needs; (2) information flow and pain points; and (3) recommendations for future analytics development. Methods consisted of semi-structured interviews, in two phases, with a diverse set (n = 9) of support personnel and end users from five facilities across a Veterans Integrated Service Network. Phase 1 identified underlying needs and previous experiences with the analytics platform across various roles and operational responsibilities. Phase 2 validated preliminary feedback, lessons learned, and recommendations for improvement. Emerging themes suggested that the existing system met a small pool of national reporting requirements. However, pain points were identified in accessing data across several information system silos and in performing multiple manual validation steps on data content. Notable recommendations included enhancing systems integration to create "one-stop shopping" for data, and developing a capability to perform trend analysis. Our gap analysis suggests that analytics platforms designed for surgical and ICU patients should employ approaches similar to those being used for primary care patients.

  14. A new self-firing MOS-thyristor device: optimization of the turn-off performance and experimental results

    Energy Technology Data Exchange (ETDEWEB)

    Breil, M.; Sanchez, J.L.; Austin, P.; Laur, J.P.

    1998-12-01

    In this paper, a new integrated self-firing and controlled turn-off MOS-thyristor structure is investigated. An analytical model describing the turn-off operation and parasitic latch-up has been developed, allowing the physical and geometrical parameters that govern the main electrical characteristics to be identified and optimized. The analytical model is validated by 2D simulations using PISCES. The technological fabrication process is optimized by 2D simulations using SUPREM IV. Electrical characterization results for fabricated test structures are presented. (authors) 6 refs.

  15. Comparison of the effectiveness of analytical wake models for wind farm with constant and variable hub heights

    International Nuclear Information System (INIS)

    Wang, Longyan; Tan, Andy C.C.; Cholette, Michael; Gu, Yuantong

    2016-01-01

    Highlights: • The effectiveness of three analytical wake models is studied. • The results of the analytical wake models are compared with CFD simulations. • The results of the CFD simulations are verified by comparison to offshore wind farm observation data. • Onshore wind farms with both constant and different hub height turbines are analyzed. • The PARK model is able to predict the total wind farm power production well with a tuned surface roughness value. - Abstract: Extensive wind farm power losses have been observed due to wake interactions between wind turbines. By applying analytical wake models, which describe the wind speed deficits in the wake quantitatively, the power losses can be regained to a large extent through wind farm layout optimization, and this has been extensively reported in the literature. Nevertheless, the effectiveness of analytical wake models in predicting wind farm power production has rarely been studied and compared for wind farms with both constant and variable wind turbine hub heights. In this study, the effectiveness of three different analytical wake models (the PARK model, the Larsen model and the B-P model) is thoroughly compared over a wide range of wake properties. After validation against observation data from an offshore wind farm, CFD simulations are used to verify the effectiveness of the analytical wake models for an onshore wind farm. The results show that when using the PARK model the surface roughness value (z0) must be carefully tuned to achieve good performance in predicting the wind farm power production. For the other two analytical wake models, their effectiveness varies depending on the situation of the wind farm (offshore or onshore) and the wind turbine hub heights (constant or variable). It was found that the results of the B-P model agree well with the CFD simulations for the offshore wind farm, but not for the onshore wind farm. The Larsen model is more accurate for the wind farm with variable wind turbine hub heights.
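
    The single-wake deficit underlying the PARK model is the Jensen formula, in which the wake expands linearly with a decay constant k and the velocity deficit falls with the square of the expansion. A sketch with illustrative values (onshore k is typically around 0.075, offshore around 0.04):

    ```python
    # Jensen/PARK single-wake velocity at a point directly downstream.
    import math

    def park_wake_speed(u_inf, ct, d_rotor, x_downstream, k=0.075):
        """u = U * (1 - (1 - sqrt(1 - Ct)) / (1 + 2*k*x/D)^2)."""
        deficit = (1 - math.sqrt(1 - ct)) / (1 + 2 * k * x_downstream / d_rotor) ** 2
        return u_inf * (1 - deficit)

    u = park_wake_speed(u_inf=8.0, ct=0.8, d_rotor=80.0, x_downstream=400.0)
    print(f"waked wind speed = {u:.2f} m/s")  # ~6.56 m/s for these inputs
    ```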

  16. A multi-band semi-analytical algorithm for estimating chlorophyll-a concentration in the Yellow River Estuary, China.

    Science.gov (United States)

    Chen, Jun; Quan, Wenting; Cui, Tingwei

    2015-01-01

    In this study, two sample semi-analytical algorithms and one new unified multi-band semi-analytical algorithm (UMSA) for estimating chlorophyll-a (Chla) concentration were constructed by specifying optimal wavelengths. The three algorithms, namely the three-band semi-analytical algorithm (TSA), the four-band semi-analytical algorithm (FSA), and the UMSA algorithm, were calibrated and validated with a dataset collected in the Yellow River Estuary between September 1 and 10, 2009. By comparing the accuracy of the TSA, FSA, and UMSA algorithms, it was found that the UMSA algorithm had a superior performance in comparison with the other two. Using the UMSA algorithm to retrieve Chla concentration in the Yellow River Estuary decreased the NRMSE (normalized root mean square error) by 25.54% compared with the FSA algorithm, and by 29.66% compared with the TSA algorithm. These are very significant improvements upon previous methods. Additionally, the study revealed that the TSA and FSA algorithms are merely more specific forms of the UMSA algorithm. Owing to the special form of the UMSA algorithm, if the same bands were used for both the TSA and UMSA algorithms, or for the FSA and UMSA algorithms, the UMSA algorithm would theoretically produce superior results in comparison with the TSA and FSA algorithms. Thus, good results may also be produced if the UMSA algorithm were applied to predict Chla concentration for the datasets of Gitelson et al. (2008) and Le et al. (2009).
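
    The TSA follows the classic three-band form in which Chla is regressed against (1/R(λ1) − 1/R(λ2)) · R(λ3), with red, red-edge, and NIR bands. The band positions and calibration coefficients below are hypothetical placeholders, not the paper's values.

    ```python
    # Three-band semi-analytical Chla estimate (illustrative coefficients).
    def three_band_index(r665, r708, r753):
        """Gitelson-style three-band combination from reflectances."""
        return (1.0 / r665 - 1.0 / r708) * r753

    def chla_estimate(r665, r708, r753, slope=117.4, intercept=23.2):
        # slope/intercept are hypothetical; real values come from calibration
        return slope * three_band_index(r665, r708, r753) + intercept

    print(f"Chla ~ {chla_estimate(0.030, 0.045, 0.040):.1f} mg m^-3")
    ```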

  17. Validity of proposed DSM-5 diagnostic criteria for nicotine use disorder: results from 734 Israeli lifetime smokers

    Science.gov (United States)

    Shmulewitz, D.; Wall, M.M.; Aharonovich, E.; Spivak, B.; Weizman, A.; Frisch, A.; Grant, B. F.; Hasin, D.

    2013-01-01

    Background The fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) proposes aligning nicotine use disorder (NUD) criteria with those for other substances, by including the current DSM fourth edition (DSM-IV) nicotine dependence (ND) criteria, three abuse criteria (neglect roles, hazardous use, interpersonal problems) and craving. Although NUD criteria indicate one latent trait, evidence is lacking on: (1) validity of each criterion; (2) validity of the criteria as a set; (3) comparative validity between DSM-5 NUD and DSM-IV ND criterion sets; and (4) NUD prevalence. Method Nicotine criteria (DSM-IV ND, abuse and craving) and external validators (e.g. smoking soon after awakening, number of cigarettes per day) were assessed with a structured interview in 734 lifetime smokers from an Israeli household sample. Regression analysis evaluated the association between validators and each criterion. Receiver operating characteristic analysis assessed the association of the validators with the DSM-5 NUD set (number of criteria endorsed) and tested whether DSM-5 or DSM-IV provided the most discriminating criterion set. Changes in prevalence were examined. Results Each DSM-5 NUD criterion was significantly associated with the validators, with strength of associations similar across the criteria. As a set, DSM-5 criteria were significantly associated with the validators, were significantly more discriminating than DSM-IV ND criteria, and led to increased prevalence of binary NUD (two or more criteria) over ND. Conclusions All findings address previous concerns about the DSM-IV nicotine diagnosis and its criteria and support the proposed changes for DSM-5 NUD, which should result in improved diagnosis of nicotine disorders. PMID:23312475

  18. Design and experimental results of the 1-T Bitter Electromagnet Testing Apparatus (BETA)

    Science.gov (United States)

    Bates, E. M.; Birmingham, W. J.; Romero-Talamás, C. A.

    2018-05-01

    The Bitter Electromagnet Testing Apparatus (BETA) is a 1-Tesla (T) technical prototype of the 10 T Adjustable Long Pulsed High-Field Apparatus. This paper highlights BETA's final design specifications, including the electromagnetic, thermal, and stress analyses, and discusses the design and fabrication of BETA's core, vessel, cooling, and electrical subsystems. The electrical system of BETA is built around a scalable solid-state DC breaker circuit. Experimental results demonstrate stable operation of BETA at 1 T and are compared with both the analytical design calculations and finite element calculations. The experimental results validate the analytical magnet design methods developed at the Dusty Plasma Laboratory. The theoretical steady-state maxima and the limits of BETA's design are also explored.
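
    The analytical design methods themselves are not reproduced in the abstract, but a standard closed-form result for an idealized Bitter coil gives a feel for the kind of calculation involved. Assuming the classic Bitter current distribution J(r) = J0*a1/r over inner radius a1, outer radius a2, and half-length b, integrating the on-axis field of each current loop gives B0 = mu0*J0*a1*[asinh(b/a1) - asinh(b/a2)]. The Python sketch below evaluates this; the geometry and current density are hypothetical round numbers, not BETA's actual parameters.

        import numpy as np

        MU0 = 4e-7 * np.pi  # vacuum permeability [T*m/A]

        def bitter_center_field(J0, a1, a2, b):
            # Central field of an idealized Bitter solenoid with J(r) = J0*a1/r,
            # inner radius a1, outer radius a2, half-length b (all in SI units).
            return MU0 * J0 * a1 * (np.arcsinh(b / a1) - np.arcsinh(b / a2))

        # Hypothetical geometry: 5 cm bore radius, 15 cm outer radius,
        # 20 cm half-length, 2e7 A/m^2 peak current density -> roughly 1.25 T.
        print(bitter_center_field(J0=2e7, a1=0.05, a2=0.15, b=0.20), "T")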

  19. An Analytical Method for the Abel Inversion of Asymmetrical Gaussian Profiles

    International Nuclear Information System (INIS)

    Xu Guosheng; Wan Baonian

    2007-01-01

    An analytical algorithm for fast calculation of the Abel inversion for density profile measurements in tokamaks is developed. Based upon the assumptions that the particle source is negligibly small in the plasma core region, that the density profile can be approximated by an asymmetrical Gaussian distribution controlled by a single parameter V0/D, and that V0/D is constant along the radial direction, the analytical algorithm is derived and examined against a test profile. Its validity is confirmed by benchmarking against the standard Abel inversion method and the theoretical profile. The scope of application and the error analysis are also discussed in detail.
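
    The paper's closed-form inversion for the asymmetrical Gaussian is not given in the abstract, so the Python sketch below illustrates the underlying operation with the standard derivative form of the inverse Abel transform, benchmarked on a symmetric Gaussian whose forward transform is known in closed form. The grid size and the simple midpoint quadrature (which keeps the integrand away from the y = r singularity) are illustrative choices, not the paper's method.

        import numpy as np

        # Forward Abel transform of n(r) = exp(-(r/a)^2) has the closed form
        # I(y) = a*sqrt(pi)*exp(-(y/a)^2), which serves as the benchmark input.
        a = 1.0
        y = np.linspace(0.0, 5.0 * a, 600)
        I = a * np.sqrt(np.pi) * np.exp(-(y / a) ** 2)

        def abel_invert(y, I):
            # Inverse Abel transform: n(r) = -(1/pi) * int_r^R I'(t)/sqrt(t^2 - r^2) dt.
            # Midpoint quadrature keeps the integrand away from the t = r singularity.
            dI = np.gradient(I, y)
            n = np.zeros_like(y)
            for i in range(len(y) - 1):
                tm = 0.5 * (y[i:-1] + y[i + 1:])    # sub-interval midpoints, all > y[i]
                dIm = 0.5 * (dI[i:-1] + dI[i + 1:])
                dt = np.diff(y[i:])
                n[i] = -np.sum(dIm * dt / np.sqrt(tm ** 2 - y[i] ** 2)) / np.pi
            return n

        n = abel_invert(y, I)
        # Reconstruction agrees with exp(-(r/a)^2) to within a few percent on this grid.
        print(np.max(np.abs(n[:400] - np.exp(-(y[:400] / a) ** 2))))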

  20. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    Energy Technology Data Exchange (ETDEWEB)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z. [and others]

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing the information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third-floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, equipment, and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions, so that new techniques, equipment, and tools could be evaluated without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes were addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods, equipment, and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods were validated and four to eight technicians were trained on each. Fine-tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians, since working in the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).