WorldWideScience

Sample records for validate analytical results

  1. Analytical method for the identification and assay of 12 phthalates in cosmetic products: application of the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques".

    Science.gov (United States)

    Gimeno, Pascal; Maggio, Annie-Françoise; Bousquet, Claudine; Quoirez, Audrey; Civade, Corinne; Bonnet, Pierre-Antoine

    2012-08-31

    Esters of phthalic acid, more commonly named phthalates, may be present in cosmetic products as ingredients or contaminants. Their presence as contaminants can be due to the manufacturing process, to the raw materials used, or to the migration of phthalates from packaging when plastic (polyvinyl chloride, PVC) is used. Eight phthalates (DBP, DEHP, BBP, DMEP, DnPP, DiPP, DPP, and DiBP), classified H360 or H361, are forbidden in cosmetics according to the European regulation on cosmetics 1223/2009. A GC/MS method was developed for the assay of 12 phthalates in cosmetics, including the 8 regulated phthalates. Analyses are carried out on a GC/MS system in electron impact ionization mode (EI). The separation of phthalates is obtained on a cross-linked 5%-phenyl/95%-dimethylpolysiloxane capillary column, 30 m × 0.25 mm (i.d.) × 0.25 μm film thickness, using a temperature gradient. Phthalate quantification is performed by external calibration using an internal standard. Validation elements obtained on standard solutions highlight satisfactory system conformity (resolution > 1.5), a common quantification limit of 0.25 ng injected, acceptable linearity between 0.5 μg mL⁻¹ and 5.0 μg mL⁻¹, and precision and accuracy in agreement with in-house specifications. Cosmetic samples ready for analytical injection are analyzed after dilution in ethanol, whereas more complex cosmetic matrices, like milks and creams, are assayed after a liquid/liquid extraction using tert-butyl methyl ether (TBME). Depending on the type of cosmetic analyzed, the common limits of quantification for the 12 phthalates were set at 0.5 or 2.5 μg g⁻¹. All samples were assayed using the analytical approach described in the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques". This analytical protocol is particularly well adapted when it is not possible to prepare reconstituted sample matrices. Copyright © 2012
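
    The quantification scheme described above (external calibration with an internal standard) can be sketched in a few lines. The concentrations and response ratios below are invented for illustration and are not the paper's data; only the 0.5-5.0 μg mL⁻¹ working range is taken from the abstract.

```python
import numpy as np

# Hypothetical calibration standards (ug/mL), within the cited 0.5-5.0 range
conc = np.array([0.5, 1.0, 2.0, 3.0, 5.0])
# Hypothetical response ratios: analyte peak area / internal-standard peak area
ratio = np.array([0.11, 0.21, 0.40, 0.61, 1.01])

# Linear calibration: response ratio = slope * concentration + intercept
slope, intercept = np.polyfit(conc, ratio, 1)

def quantify(sample_ratio):
    """Back-calculate a sample concentration (ug/mL) from its response ratio."""
    return (sample_ratio - intercept) / slope

c = quantify(0.50)  # a sample with response ratio 0.50
```

    The fitted slope is the relative response factor; an unknown is quantified by measuring its analyte/internal-standard peak-area ratio and inverting the line.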

  2. Analytical results for Abelian projection

    International Nuclear Information System (INIS)

    Ogilivie, Michael C.

    1999-01-01

    Analytic methods for Abelian projection are developed, and a number of results related to string tension measurements are obtained. It is proven that even without gauge fixing, Abelian projection yields string tensions of the underlying non-Abelian theory. Strong arguments are given for similar results in the case where gauge fixing is employed. The subgroup used for projection need only contain the center of the gauge group, and need not be Abelian. While gauge fixing is shown to be in principle unnecessary for the success of Abelian projection, it is computationally advantageous for the same reasons that improved operators, e.g., the use of fat links, are advantageous in Wilson loop measurements

  3. Kawerau fluid chemistry : analytical results

    International Nuclear Information System (INIS)

    Mroczek, E.K.; Christenson, B.W.; Mountain, B.; Stewart, M.K.

    2001-01-01

    This report summarises the water and gas analytical data collected from the Kawerau geothermal field between 1998 and 2000 under the Sustainable Management of Geothermal and Mineral Resources (GMR) Project, Objective 2 'Understanding New Zealand Geothermal Systems'. The work is part of the continuing effort to characterise the chemical, thermal and isotopic signatures of the deep magmatic heat sources which drive our geothermal systems. At Kawerau there is a clear indication that the present-day heat source relates to young volcanism within the field. However, being at the margins of the explored reservoir, little is presently known of the characteristics of that heat source. The Kawerau study follows on directly from the recently completed work characterising the geochemical signatures of the Ohaaki hydrothermal system. In the latter study the interpretation of the radiogenic noble gas isotope systematics was of fundamental importance in characterising the magmatic heat source. Unfortunately the collaboration with LLNL, which analysed the isotopes, could not be extended to include the Kawerau data. The gas samples have been archived and will be analysed once a new collaborator is found to continue the work. The purpose of the present compilation is to facilitate the final completion of the study by ensuring the data are accessible in one report. (author). 5 refs., 2 figs., 9 tabs

  4. Validation of Analytical Damping Ratio by Fatigue Stress Limit

    Science.gov (United States)

    Foong, Faruq Muhammad; Chung Ket, Thein; Beng Lee, Ooi; Aziz, Abdul Rashid Abdul

    2018-03-01

    The optimisation process of a vibration energy harvester is usually restricted to experimental approaches due to the lack of an analytical equation to describe the damping of a system. This study derives an analytical equation describing the first-mode damping ratio of a clamped-free cantilever beam under harmonic base excitation, by combining the transverse equation of motion of the beam with the damping-stress equation. Unlike other common damping determination methods, this equation is independent of experimental inputs or finite element simulations and can be solved using a simple iterative convergence method. The derived equation was determined to be correct for cases where the maximum bending stress in the beam is below the fatigue limit stress of the beam. However, an increasing trend in the error between the experimental and analytical results was observed at high stress levels. Hence, the fatigue limit stress was used as a parameter to define the validity of the analytical equation.

  5. Cryptography based on neural networks - analytical results

    International Nuclear Information System (INIS)

    Rosen-Zvi, Michal; Kanter, Ido; Kinzel, Wolfgang

    2002-01-01

    The mutual learning process between two parity feed-forward networks with discrete and continuous weights is studied analytically, and we find that the number of steps required to achieve full synchronization between the two networks in the case of discrete weights is finite. The synchronization process is shown to be non-self-averaging and the analytical solution is based on random auxiliary variables. The learning time of an attacker that is trying to imitate one of the networks is examined analytically and is found to be much longer than the synchronization time. Analytical results are found to be in agreement with simulations. (letter to the editor)
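
    The mutual-learning setting analysed here can be illustrated with a small simulation. This is a hedged sketch of a generic tree parity machine pair, not necessarily the paper's exact model: the network sizes, weight bound L, and Hebbian rule below are illustrative choices. Two networks receive the same random inputs and update only when their outputs agree; synchronization is reached in a finite number of steps.

```python
import numpy as np

rng = np.random.default_rng(0)
K, N, L = 3, 10, 3          # hidden units, inputs per unit, weight bound

def output(w, x):
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = -1   # break ties deterministically
    return sigma, int(np.prod(sigma))

def hebbian(w, x, sigma, tau):
    # update only the hidden units that agree with the network output
    for k in range(K):
        if sigma[k] == tau:
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

wA = rng.integers(-L, L + 1, size=(K, N))
wB = rng.integers(-L, L + 1, size=(K, N))

steps = 0
while steps < 100_000 and not np.array_equal(wA, wB):
    x = rng.choice([-1, 1], size=(K, N))   # common public input
    sA, tA = output(wA, x)
    sB, tB = output(wB, x)
    if tA == tB:                           # learn only when outputs agree
        hebbian(wA, x, sA, tA)
        hebbian(wB, x, sB, tB)
    steps += 1
```

    An attacker following the same rule without influencing the inputs synchronizes much more slowly, which is the asymmetry the cryptographic protocol exploits.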

  6. Development and validation of analytical methods for dietary supplements

    International Nuclear Information System (INIS)

    Sullivan, Darryl; Crowley, Richard

    2006-01-01

    The expanding use of innovative botanical ingredients in dietary supplements and foods has resulted in a flurry of research aimed at the development and validation of analytical methods for accurate measurement of active ingredients. The pressing need for these methods is being met through an expansive collaborative initiative involving industry, government, and analytical organizations. This effort has resulted in the validation of several important assays, as well as important advances in method engineering procedures that have improved the efficiency of the process. The initiative has also allowed researchers to overcome many of the barriers that have hindered accurate analysis, such as the lack of reference standards and comparative data. As the availability of nutraceutical products continues to increase, these methods will provide consumers and regulators with the scientific information needed to assure safety and dependable labeling

  7. Risk analysis of analytical validations by probabilistic modification of FMEA

    DEFF Research Database (Denmark)

    Barends, D.M.; Oldenhof, M.T.; Vredenbregt, M.J.

    2012-01-01

    Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling not only detecting technical risks, but also risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection...... and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring...... of severity. In an example, the results of traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeited tablets are re-interpretated by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence...

  8. Valid, legally defensible data from your analytical laboratories

    International Nuclear Information System (INIS)

    Gay, D.D.; Allen, V.C.

    1989-01-01

    This paper discusses the definition of valid, legally defensible data. The authors describe the expectations of project managers and what should be gleaned from the laboratory in regard to analytical data

  9. Maritime Analytics Prototype: Phase 3 Validation

    Science.gov (United States)

    2014-01-01

    different so we need a flexible analysis set hierarchy encoded as directories or groups – like a recipe [C.3.1.4n] Improve the GUI:  Provide more...Problems zooming and panning on the timeline [C.1.2.1c, C.1.2.4e, C.1.3.1c, C.1.1.4c, C.1.1.4b]  Selected the wrong year and then the vessel...Scholtz_VAMetrics_2006.pdf] [21] J. Thomas, and K. Cook , Illuminating the Path, the Research and Development Agenda for Visual analytics: IEEE, 2005. [22

  10. Dynamically triangulated surfaces - some analytical results

    International Nuclear Information System (INIS)

    Kostov, I.K.

    1987-01-01

    We give a brief review of the analytical results concerning the model of dynamically triangulated surfaces. We will discuss the possible types of critical behaviour (depending on the dimension D of the embedding space) and the exact solutions obtained for D=0 and D=-2. The latter are important as a check of the Monte Carlo simulations applied to study the model in more physical dimensions. They also give some general insight into its critical properties

  11. Compact tokamak reactors. Part 1 (analytic results)

    International Nuclear Information System (INIS)

    Wootton, A.J.; Wiley, J.C.; Edmonds, P.H.; Ross, D.W.

    1996-01-01

    We discuss the possible use of tokamaks for thermonuclear power plants, in particular tokamaks with low aspect ratio and copper toroidal field coils. Three approaches are presented. First we review and summarize the existing literature. Second, using simple analytic estimates, the size of the smallest tokamak to produce an ignited plasma is derived. This steady state energy balance analysis is then extended to determine the smallest tokamak power plant, by including the power required to drive the toroidal field, and considering two extremes of plasma current drive efficiency. The analytic results will be augmented by a numerical calculation which permits arbitrary plasma current drive efficiency, the results of which will be presented in Part II. Third, a scaling from any given reference reactor design to a copper toroidal field coil device is discussed. Throughout the paper the importance of various restrictions is emphasized, in particular plasma current drive efficiency, plasma confinement, plasma safety factor, plasma elongation, plasma beta, neutron wall loading, blanket availability and recirculating electric power. We conclude that the latest published reactor studies, which show little advantage in using low aspect ratio unless remarkably high efficiency plasma current drive and low safety factor are combined, can be reproduced with the analytic model

  12. Risk analysis by FMEA as an element of analytical validation.

    Science.gov (United States)

    van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M

    2009-12-05

    We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs for authenticity to a Failure Mode and Effects Analysis (FMEA), including technical risks as well as risks related to human failure. An FMEA team broke down the NIR analytical method into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D), and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were calculated as Risk Priority Numbers (RPNs) = O × D × S. Failure modes with the highest RPN scores were subjected to corrective actions and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks.
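
    The RPN arithmetic described above is simple to reproduce. The failure modes and scores below are invented for illustration; only the formula RPN = O × D × S and the 1-10 scales come from the abstract.

```python
# Hypothetical failure modes with (O, D, S) scores, each on a 1-10 scale
failure_modes = {
    "wrong reference spectrum loaded": (4, 6, 8),
    "sample mislabelled by operator":  (5, 7, 9),
    "instrument lamp drift":           (2, 3, 5),
}

# RPN = occurrence x detection x severity
rpn = {name: o * d * s for name, (o, d, s) in failure_modes.items()}
worst = max(rpn, key=rpn.get)   # failure mode with the highest priority
```

    In the procedure above, the highest-RPN modes would receive corrective action and the scoring would then be repeated.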

  13. Risk analysis of analytical validations by probabilistic modification of FMEA.

    Science.gov (United States)

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling not only detecting technical risks, but also risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring of severity. In an example, the results of traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are re-interpreted by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure mode(s) can be estimated quantitatively, for each individual failure mode, for a set of failure modes, and the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. Consistency of FMEA used in the validation of analytical procedures.

    Science.gov (United States)

    Oldenhof, M T; van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Vredenbregt, M J; Weda, M; Barends, D M

    2011-02-20

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection-Mass Spectrometry (HPLC-DAD-MS) analytical procedure used in the quality control of medicines. Each team was free to define its own ranking scales for the probability of severity (S), occurrence (O), and detection (D) of failure modes. We calculated Risk Priority Numbers (RPNs) and identified failure modes above the 90th percentile of RPN values as needing urgent corrective action, and failure modes falling between the 75th and 90th percentiles as needing necessary corrective action. Team 1 and Team 2 identified five and six failure modes needing urgent corrective action respectively, two of which were identified by both teams. Of the failure modes needing necessary corrective action, about a third were identified by both teams. These results show inconsistency in the outcome of the FMEA. To improve consistency, we recommend that FMEA is always carried out under the supervision of an experienced FMEA-facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating that this inconsistency is not always a drawback. Copyright © 2010 Elsevier B.V. All rights reserved.
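
    The percentile rule used above is easy to make concrete. The RPN values below are invented; only the 90th/75th percentile cut-offs come from the abstract.

```python
import numpy as np

# Hypothetical RPN values for ten failure modes
rpns = np.array([315, 192, 30, 120, 240, 60, 96, 150, 48, 210])

p75, p90 = np.percentile(rpns, [75, 90])

urgent = rpns[rpns > p90]                        # needing urgent corrective action
necessary = rpns[(rpns > p75) & (rpns <= p90)]   # needing necessary corrective action
```

    Because each team defines its own scales, the percentile thresholds are relative to each team's own RPN distribution, which is one source of the inconsistency the study reports.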

  15. Consistency of FMEA used in the validation of analytical procedures

    DEFF Research Database (Denmark)

    Oldenhof, M.T.; van Leeuwen, J.F.; Nauta, Maarten

    2011-01-01

    is always carried out under the supervision of an experienced FMEA-facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating......In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection...

  16. Analytical difficulties facing today's regulatory laboratories: issues in method validation.

    Science.gov (United States)

    MacNeil, James D

    2012-08-01

    The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as development and maintenance of expertise, maintenance and updating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context which includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation of the method, or on the design of a validation scheme for a complex multi-residue method, require a well-considered strategy, based on a current knowledge of international guidance documents and regulatory requirements, as well as the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of change or modification of a method, whether that change or modification may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameters will require re-validation. Some typical situations involving change in methods are discussed and a decision process proposed for selection of appropriate validation measures. © 2012 John Wiley & Sons, Ltd.

  17. $W^+ W^-$ + Jet: Compact Analytic Results

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, John [Fermilab]; Miller, David [Glasgow U.]; Robens, Tania [Dresden, Tech. U.]

    2016-01-14

    In the second run of the LHC, which started in April 2015, an accurate understanding of Standard Model processes is more crucial than ever. Processes including electroweak gauge bosons serve as standard candles for SM measurements, and equally constitute important background for BSM searches. We here present the NLO QCD virtual contributions to W+W- + jet in an analytic format obtained through unitarity methods and show results for the full process using an implementation into the Monte Carlo event generator MCFM. Phenomenologically, we investigate total as well as differential cross sections for the LHC with 14 TeV center-of-mass energy, as well as a future 100 TeV proton-proton machine. In the format presented here, the one-loop virtual contributions also serve as important ingredients in the calculation of W+W- pair production at NNLO.

  18. Quantifying the measurement uncertainty of results from environmental analytical methods.

    Science.gov (United States)

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.
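
    As a minimal illustration of the closing point, when the reproducibility standard deviation dominates, the uncertainty budget reduces to a few lines. The numbers and the bias component below are invented; the coverage factor k = 2 is the conventional choice for approximately 95 % coverage.

```python
import math

result = 42.0    # hypothetical measured value, e.g. mg/L
s_R = 1.5        # reproducibility standard deviation from validation data
u_bias = 0.8     # hypothetical standard uncertainty of the bias correction

# Combine components by root-sum-of-squares, then expand with k = 2 (~95 %)
u_c = math.sqrt(s_R**2 + u_bias**2)
k = 2
U = k * u_c      # report as: result +/- U
```

    Using the reproducibility standard deviation this way is exactly the shortcut the paper argues for when the method is operationally defined.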

  19. Development and Validation of Analytical Method for Losartan ...

    African Journals Online (AJOL)

    Development and Validation of Analytical Method for Losartan-Copper Complex Using UV-Vis Spectrophotometry. ... Tropical Journal of Pharmaceutical Research ... Purpose: To develop a new spectrophotometric method for the analysis of losartan potassium in pharmaceutical formulations by making its complex with ...

  20. Analytical thermal model validation for Cassini radioisotope thermoelectric generator

    International Nuclear Information System (INIS)

    Lin, E.I.

    1997-01-01

    The Saturn-bound Cassini spacecraft is designed to rely, without precedent, on the waste heat from its three radioisotope thermoelectric generators (RTGs) to warm the propulsion module subsystem, and the RTG end dome temperature is a key determining factor of the amount of waste heat delivered. A previously validated SINDA thermal model of the RTG was the sole guide to understanding its complex thermal behavior, but displayed large discrepancies against some initial thermal development test data. A careful revalidation effort led to significant modifications and adjustments of the model, which resulted in a doubling of the radiative heat transfer from the heat source support assemblies to the end domes and brought the end dome and flange temperature predictions to within 2 °C of the pertinent test data. The increased inboard end dome temperature has a considerable impact on thermal control of the spacecraft central body. The validation process offers an example of physically-driven analytical model calibration with test data from not only an electrical simulator but also a nuclear-fueled flight unit, and has established the end dome temperatures of a flight RTG where no in-flight or ground-test data existed before

  1. Measuring Students' Writing Ability on a Computer-Analytic Developmental Scale: An Exploratory Validity Study

    Science.gov (United States)

    Burdick, Hal; Swartz, Carl W.; Stenner, A. Jackson; Fitzgerald, Jill; Burdick, Don; Hanlon, Sean T.

    2013-01-01

    The purpose of the study was to explore the validity of a novel computer-analytic developmental scale, the Writing Ability Developmental Scale. On the whole, collective results supported the validity of the scale. It was sensitive to writing ability differences across grades and sensitive to within-grade variability as compared to human-rated…

  2. Pre-analytical and analytical aspects affecting clinical reliability of plasma glucose results.

    Science.gov (United States)

    Pasqualetti, Sara; Braga, Federica; Panteghini, Mauro

    2017-07-01

    The measurement of plasma glucose (PG) plays a central role in recognizing disturbances in carbohydrate metabolism, with established decision limits that are globally accepted. This requires that PG results are reliable and unequivocally valid no matter where they are obtained. To control the pre-analytical variability of PG and prevent in vitro glycolysis, the use of citrate as rapidly effective glycolysis inhibitor has been proposed. However, the commercial availability of several tubes with studies showing different performance has created confusion among users. Moreover, and more importantly, studies have shown that tubes promptly inhibiting glycolysis give PG results that are significantly higher than tubes containing sodium fluoride only, used in the majority of studies generating the current PG cut-points, with a different clinical classification of subjects. From the analytical point of view, to be equivalent among different measuring systems, PG results should be traceable to a recognized higher-order reference via the implementation of an unbroken metrological hierarchy. In doing this, it is important that manufacturers of measuring systems consider the uncertainty accumulated through the different steps of the selected traceability chain. In particular, PG results should fulfil analytical performance specifications defined to fit the intended clinical application. Since PG has tight homeostatic control, its biological variability may be used to define these limits. Alternatively, given the central diagnostic role of the analyte, an outcome model showing the impact of analytical performance of test on clinical classifications of subjects can be used. Using these specifications, performance assessment studies employing commutable control materials with values assigned by reference procedure have shown that the quality of PG measurements is often far from desirable and that problems are exacerbated using point-of-care devices. Copyright © 2017 The Canadian
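
    The biological-variation route to performance specifications mentioned above can be made concrete. The within-subject and between-subject CVs below are approximate literature figures for glucose, used only for illustration, and the 0.5 and 0.25 multipliers are the widely used "desirable" specification factors derived from biological variation.

```python
import math

# Approximate biological-variation figures for plasma glucose (illustrative)
CVi = 5.7   # within-subject CV, %
CVg = 7.8   # between-subject CV, %

# Widely used "desirable" specifications derived from biological variation
desirable_imprecision = 0.5 * CVi                    # allowable analytical CV, %
desirable_bias = 0.25 * math.sqrt(CVi**2 + CVg**2)   # allowable bias, %
```

    A measuring system whose imprecision or bias exceeds these limits would, under this model, add clinically meaningful noise on top of the analyte's tight homeostatic variation.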

  3. Interacting Brownian Swarms: Some Analytical Results

    Directory of Open Access Journals (Sweden)

    Guillaume Sartoretti

    2016-01-01

    We consider the dynamics of swarms of scalar Brownian agents subject to local imitation mechanisms implemented using mutual rank-based interactions. For appropriate values of the underlying control parameters, the swarm propagates tightly and the distances separating successive agents are iid exponential random variables. Implicitly, the implementation of rank-based mutual interactions requires that agents have infinite interaction ranges. Using the probabilistic size of the swarm's support, we analytically estimate the critical interaction range below which flocked swarms cannot survive. In the second part of the paper, we consider the interactions between two flocked swarms of Brownian agents with finite interaction ranges. Both swarms travel with different barycentric velocities, and agents from both swarms indifferently interact with each other. For appropriate initial configurations, both swarms eventually collide (i.e., all agents interact). Depending on the values of the control parameters, one of the following patterns emerges after collision: (i) both swarms remain essentially flocked, or (ii) the swarms become ultimately quasi-free and recover their nominal barycentric speeds. We derive a set of analytical flocking conditions based on the generalized rank-based Brownian motion. An extensive set of numerical simulations corroborates our analytical findings.

  4. Milestone M4900: Simulant Mixing Analytical Results

    Energy Technology Data Exchange (ETDEWEB)

    Kaplan, D.I.

    2001-07-26

    This report addresses Milestone M4900, "Simulant Mixing Sample Analysis Results," and contains the data generated during the "Mixing of Process Heels, Process Solutions, and Recycle Streams: Small-Scale Simulant" task. The Task Technical and Quality Assurance Plan for this task is BNF-003-98-0079A. A report with a narrative description and discussion of the data will be issued separately.

  5. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    Science.gov (United States)

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics, such as accuracy and precision, are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually to guarantee that a high percentage of the test results of future samples will be close to their true values. Alternate "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow for assessing method validity, taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in literature to test the "fit for purpose" hypothesis, the majority of the methods are not designed to protect the risk of accepting unsuitable methods, thus having the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, selection of either the generalized pivotal quantity or β-content (0.9) method for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current
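
    The generalized-pivotal-quantity idea can be sketched with a short Monte Carlo. Everything numeric below (sample size, mean and SD of recoveries, the ±15 % acceptance limit, the 90 % target proportion) is invented for illustration; the pivotal quantities for a normal mean and standard deviation are standard, and this is a sketch of the approach rather than the paper's exact procedure.

```python
import math
import numpy as np

rng = np.random.default_rng(1)

def norm_cdf(x):
    # standard normal CDF via the error function (avoids a scipy dependency)
    return 0.5 * (1.0 + np.vectorize(math.erf)(x / math.sqrt(2.0)))

# Hypothetical validation data: n recoveries (%) with sample mean and SD
n, xbar, s = 12, 101.0, 3.0
lam = 15.0      # acceptance limit: results within 100 +/- 15 %

# Generalized pivotal quantities for sigma and mu of a normal model
M = 20_000
g_sigma = s * np.sqrt((n - 1) / rng.chisquare(n - 1, M))
g_mu = xbar - rng.standard_normal(M) * g_sigma / np.sqrt(n)

# GPQ for the proportion of future results falling inside the limits
pi = (norm_cdf((100.0 + lam - g_mu) / g_sigma)
      - norm_cdf((100.0 - lam - g_mu) / g_sigma))

lower_bound = float(np.quantile(pi, 0.05))   # 95 % lower confidence bound
valid = lower_bound >= 0.90                  # "fit for purpose" decision
```

    Declaring the method valid only when the lower confidence bound on this proportion exceeds the target is what controls the consumer's risk discussed in the abstract.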

  6. Path integral analysis of Jarzynski's equality: Analytical results

    Science.gov (United States)

    Minh, David D. L.; Adib, Artur B.

    2009-02-01

    We apply path integrals to study nonequilibrium work theorems in the context of Brownian dynamics, deriving in particular the equations of motion governing the most typical and most dominant trajectories. For the analytically soluble cases of a moving harmonic potential and a harmonic oscillator with a time-dependent natural frequency, we find such trajectories, evaluate the work-weighted propagators, and validate Jarzynski’s equality.
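
    The moving-harmonic-potential case lends itself to a direct numerical check of Jarzynski's equality. The sketch below (overdamped Langevin dynamics with illustrative parameter values) is not the paper's path-integral calculation: since dragging the trap leaves its shape unchanged, ΔF = 0 and the exponential work average should converge to 1.

```python
import numpy as np

rng = np.random.default_rng(2)
k, gamma, kT, v = 1.0, 1.0, 1.0, 1.0    # illustrative parameter values
beta = 1.0 / kT
dt, n_steps, n_traj = 1e-3, 1000, 20_000

# Start from equilibrium in the trap U(x, 0) = k/2 * x^2
x = rng.normal(0.0, np.sqrt(kT / k), n_traj)
W = np.zeros(n_traj)

for i in range(n_steps):
    t = i * dt
    # work increment dW = (dU/dt) dt = -k v (x - v t) dt
    W += -k * v * (x - v * t) * dt
    # overdamped Langevin (Euler-Maruyama) step
    x += (-k * (x - v * t) / gamma * dt
          + np.sqrt(2.0 * kT * dt / gamma) * rng.standard_normal(n_traj))

jarzynski = float(np.exp(-beta * W).mean())   # should be close to 1
```

    The mean work itself is positive (dissipation), yet the exponential average recovers exp(-βΔF) = 1, which is the content of the equality being validated.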

  7. Oxcarbazepine: validation and application of an analytical method

    Directory of Open Access Journals (Sweden)

    Paula Cristina Rezende Enéas

    2010-06-01

    Full Text Available Oxcarbazepine (OXC) is an important anticonvulsant and mood stabilizing drug. A pharmacopoeial monograph for OXC is not yet available and therefore the development and validation of a new analytical method for quantification of this drug is essential. In the present study, a UV spectrophotometric method for the determination of OXC was developed. The various parameters, such as linearity, precision, accuracy and specificity, were studied according to International Conference on Harmonization guidelines. Batches of 150 mg OXC capsules were prepared and analyzed using the validated UV method. The formulations were also evaluated for parameters including drug-excipient compatibility, flowability, uniformity of weight, disintegration time, assay, uniformity of content and the amount of drug dissolved during the first hour.

  8. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed here; it estimates the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples from the field of epidemics transmitted by vectors whose biting activity is temporally cyclical, showing how the method can be used: to estimate whether an approximation over- or under-fits the original model; to invalidate an approximation; and to rank candidate approximations by quality. The application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer-scale models. Copyright © 2010 Elsevier Inc. All rights reserved.

  9. Validation of an advanced analytical procedure applied to the measurement of environmental radioactivity.

    Science.gov (United States)

    Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van

    2018-04-01

    In this work, an advanced analytical procedure was applied to calculate radioactivity in spiked water samples in a close geometry gamma spectroscopy. The procedure used the MCNP-CP code to calculate the coincidence summing correction factor (CSF). The CSF results were validated against a deterministic method using the ETNA code for both p-type HPGe detectors, showing good agreement between the two codes. Finally, the validity of the developed procedure was confirmed by a proficiency test calculating the activities of various radionuclides. The radioactivity measurements with both detectors using the advanced analytical procedure received "Accepted" status in the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    Science.gov (United States)

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    2017-01-01

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  11. Results from the First Validation Phase of CAP code

    International Nuclear Information System (INIS)

    Choo, Yeon Joon; Hong, Soon Joon; Hwang, Su Hyun; Kim, Min Ki; Lee, Byung Chul; Ha, Sang Jun; Choi, Hoon

    2010-01-01

    The second stage of the Safety Analysis Code Development for Nuclear Power Plants project was launched in April 2010 and is scheduled to run through 2012; its scope covers code validation through licensing preparation. As part of this project, CAP (Containment Analysis Package) will follow the same procedures. CAP's validation work is organized hierarchically into four steps using: 1) fundamental phenomena; 2) principal phenomena (mixing and transport) and components in containment; 3) demonstration tests in small-, middle-, and large-scale facilities and International Standard Problems; 4) comparison with other containment codes such as GOTHIC or CONTEMPT. In addition, collecting experimental data related to containment phenomena and constructing a database from them is one of the major tasks of the second stage of this project. The validation of the fundamental phenomena is expected to reveal both the current capability of the CAP code and directions for future improvement. For this purpose, simple but significant problems with exact analytical solutions were selected and calculated. In this paper, some results of the validation problems for the selected fundamental phenomena are summarized and briefly discussed.

  12. Preliminary results of testing bioassay analytical performance standards

    International Nuclear Information System (INIS)

    Fisher, D.R.; Robinson, A.V.; Hadley, R.T.

    1983-08-01

    The analytical performance of both in vivo and in vitro bioassay laboratories is being studied to determine the capability of these laboratories to meet the minimum criteria for accuracy and precision specified in the draft ANSI Standard N13.30, Performance Criteria for Radiobioassay. This paper presents preliminary results of the first round of testing

  13. Analytical validation of the Gentian NGAL particle-enhanced turbidimetric immunoassay (PETIA)

    Directory of Open Access Journals (Sweden)

    Gian Luca Salvagno

    2017-08-01

    Full Text Available Objectives: This study was designed to validate the analytical performance of the new Gentian particle-enhanced turbidimetric immunoassay (PETIA) for measuring neutrophil gelatinase-associated lipocalin (NGAL) in serum samples. Design and methods: Analytical validation of the Gentian NGAL assay was carried out on a Roche Cobas c501 and was based on assessment of the limit of blank (LOB), limit of detection (LOD), functional sensitivity, imprecision, linearity and concordance with the BioPorto NGAL test. Results: The LOB and LOD of Gentian NGAL were found to be 3.8 ng/mL and 6.3 ng/mL, respectively. An analytical coefficient of variation (CV) of 20% corresponded to a NGAL value of 10 ng/mL. The intra-assay and inter-assay imprecision (CV) were between 0.4 and 5.2% and 0.6 and 7.1%, respectively, and the total imprecision (CV) was 3.7%. The linearity was optimal at NGAL concentrations between 37 and 1420 ng/mL (r=1.00; p<0.001). An excellent correlation was observed between values measured with Gentian NGAL and BioPorto NGAL in 74 routine serum samples (r=0.993). The mean percentage bias of the Gentian assay versus the BioPorto assay was +3.1% (95% CI, +1.6% to +4.5%). Conclusions: These results show that Gentian NGAL may be a viable alternative to other commercial immunoassays for both routine and urgent assessment of serum NGAL. Keywords: Neutrophil gelatinase-associated lipocalin, NGAL, Analytical validation, Acute kidney injury
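
    The LOB and LOD figures quoted above follow the usual parametric convention (as in the CLSI EP17 guideline): LOB = mean of blanks + 1.645 × SD of blanks, and LOD = LOB + 1.645 × SD of a low-concentration sample. A sketch with hypothetical blank readings (the numbers are illustrative, not Gentian data):

    ```python
    from statistics import mean, stdev

    def limit_of_blank(blank_readings):
        """Parametric LOB: ~95th percentile of blank results (Gaussian)."""
        return mean(blank_readings) + 1.645 * stdev(blank_readings)

    def limit_of_detection(lob, low_sample_sd):
        """Lowest concentration reliably distinguishable from blank."""
        return lob + 1.645 * low_sample_sd

    blanks = [2.1, 2.9, 3.4, 2.6, 3.0]   # hypothetical blank signals, ng/mL
    lob = limit_of_blank(blanks)          # ~3.6 ng/mL
    lod = limit_of_detection(lob, 1.2)    # ~5.6 ng/mL, hypothetical low-sample SD
    ```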

  14. Evaluation and analytical validation of a handheld digital refractometer for urine specific gravity measurement

    Directory of Open Access Journals (Sweden)

    Sara P. Wyness

    2016-08-01

    Full Text Available Objectives: Refractometers are commonly used to determine urine specific gravity (SG) in the assessment of hydration status and urine specimen validity testing. Few comprehensive performance evaluations are available demonstrating refractometer capability from a clinical laboratory perspective. The objective of this study was therefore to conduct an analytical validation of a handheld digital refractometer used for human urine SG testing. Design and methods: A MISCO Palm Abbe™ refractometer was used for all experiments, including device familiarization, carryover, precision, accuracy, linearity, analytical sensitivity, evaluation of potential substances which contribute to SG (i.e. "interference"), and reference interval evaluation. A manual refractometer, urine osmometer, and a solute score (sum of urine chloride, creatinine, glucose, potassium, sodium, total protein, and urea nitrogen, all in mg/dL) were used as comparative methods for accuracy assessment. Results: Significant carryover was not observed. A wash step was still included as good laboratory practice. Low imprecision (%CV < 0.01) was demonstrated using low and high QC material. Accuracy studies showed strong correlation to manual refractometry. Linear correlation was also demonstrated between SG, osmolality, and solute score. Linearity of Palm Abbe performance was verified with observed error of ≤0.1%. Increases in SG were observed with increasing concentrations of albumin, creatinine, glucose, hemoglobin, sodium chloride, and urea. Transference of a previously published urine SG reference interval of 1.0020–1.0300 was validated. Conclusions: The Palm Abbe digital refractometer was a fast, simple, and accurate way to measure urine SG. Analytical validity was confirmed by the present experiments. Keywords: Specific gravity, Osmolality, Digital refractometry, Hydration, Sports medicine, Urine drug testing, Urine adulteration

  15. Modeling Run Test Validity: A Meta-Analytic Approach

    National Research Council Canada - National Science Library

    Vickers, Ross

    2002-01-01

    .... This study utilized data from 166 samples (N = 5,757) to test the general hypothesis that differences in testing methods could account for the cross-situational variation in validity. Only runs >2 km...

  16. On the validity and practical applicability of derivative analyticity relations

    International Nuclear Information System (INIS)

    Kolar, P.; Fischer, J.

    1983-09-01

    We examine derivative analyticity relations (DAR), which were originally proposed by Bronzan as an alternative to dispersion relations and in which the dispersion integral is replaced by a tangent series of derivatives. We characterize the class of functions satisfying DAR, and show that outside this class the dispersion integral represents a Borel-like sum of tangent series. We point out difficulties connected with the application of DAR. (author)

  17. Analytical validation of a new point-of-care assay for serum amyloid A in horses.

    Science.gov (United States)

    Schwartz, D; Pusterla, N; Jacobsen, S; Christopher, M M

    2018-01-17

    Serum amyloid A (SAA) is a major acute phase protein in horses. A new point-of-care (POC) test for SAA (Stablelab) is available, but studies evaluating its analytical accuracy are lacking. To evaluate the analytical performance of the SAA POC test by 1) determining linearity and precision, 2) comparing results in whole blood with those in serum or plasma, and 3) comparing POC results with those obtained using a previously validated turbidimetric immunoassay (TIA). Assay validation. Analytical validation of the POC test was done in accordance with American Society of Veterinary Clinical Pathology guidelines using residual equine serum/plasma and whole blood samples from the Clinical Pathology Laboratory at the University of California-Davis. A TIA was used as the reference method. We also evaluated the effect of haematocrit (HCT). The POC test was linear for SAA concentrations of up to at least 1000 μg/mL (r = 0.991). Intra-assay CVs were 13, 18 and 15% at high (782 μg/mL), intermediate (116 μg/mL) and low (64 μg/mL) concentrations. Inter-assay (inter-batch) CVs were 45, 14 and 15% at high (1372 μg/mL), intermediate (140 μg/mL) and low (56 μg/mL) concentrations. SAA results in whole blood were significantly lower than those in serum/plasma (P = 0.0002), but were positively correlated (r = 0.908) and not affected by HCT (P = 0.261); proportional negative bias was observed in samples with SAA>500 μg/mL. The difference between methods exceeded the 95% confidence interval of the combined imprecision of both methods (15%). Analytical validation could not be performed in whole blood, the sample most likely to be used stall side. The POC test has acceptable accuracy and precision in equine serum/plasma with SAA concentrations of up to at least 1000 μg/mL. Low inter-batch precision at high concentrations may affect serial measurements, and the use of the same test batch and sample type (serum/plasma or whole blood) is recommended. Comparison of results between the
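
    The intra- and inter-assay figures quoted in this record are coefficients of variation; computing a %CV from replicate measurements is a one-liner. A minimal sketch (the replicate values are hypothetical, not from the study):

    ```python
    from statistics import mean, stdev

    def cv_percent(replicates):
        """Coefficient of variation (%) of replicate measurements."""
        return 100.0 * stdev(replicates) / mean(replicates)

    # Hypothetical intra-assay replicates at an intermediate SAA level (ug/mL):
    replicates = [98.0, 102.0, 100.0, 104.0, 96.0]
    print(round(cv_percent(replicates), 2))  # -> 3.16
    ```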

  18. Analytical techniques and method validation for the measurement of selected semivolatile and nonvolatile organofluorochemicals in air.

    Science.gov (United States)

    Reagen, William K; Lindstrom, Kent R; Thompson, Kathy L; Flaherty, John M

    2004-09-01

    The widespread use of semi- and nonvolatile organofluorochemicals in industrial facilities, concern about their persistence, and relatively recent advancements in liquid chromatography/mass spectrometry (LC/MS) technology have led to the development of new analytical methods to assess potential worker exposure to airborne organofluorochemicals. Techniques were evaluated for the determination of 19 organofluorochemicals and for total fluorine in ambient air samples. Due to the potential biphasic nature of most of these fluorochemicals when airborne, Occupational Safety and Health Administration (OSHA) versatile sampler (OVS) tubes were used to simultaneously trap fluorochemical particulates and vapors from workplace air. Analytical methods were developed for OVS air samples to quantitatively analyze for total fluorine using oxygen bomb combustion/ion selective electrode and for 17 organofluorochemicals using LC/MS and gas chromatography/mass spectrometry (GC/MS). The experimental design for this validation was based on the National Institute of Occupational Safety and Health (NIOSH) Guidelines for Air Sampling and Analytical Method Development and Evaluation, with some revisions of the experimental design. The study design incorporated experiments to determine analytical recovery and stability, sampler capacity, the effect of some environmental parameters on recoveries, storage stability, limits of detection, precision, and accuracy. Fluorochemical mixtures were spiked onto each OVS tube over a range of 0.06-6 microg for each of 12 compounds analyzed by LC/MS and 0.3-30 microg for 5 compounds analyzed by GC/MS. These ranges allowed reliable quantitation at 0.001-0.1 mg/m3 in general for LC/MS analytes and 0.005-0.5 mg/m3 for GC/MS analytes when 60 L of air are sampled. The organofluorochemical exposure guideline (EG) is currently 0.1 mg/m3 for many analytes, with one exception being ammonium perfluorooctanoate (EG is 0.01 mg/m3). Total fluorine results may be used

  19. Preliminary Results on Uncertainty Quantification for Pattern Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Stracuzzi, David John [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Brost, Randolph [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Chen, Maximillian Gene [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Malinas, Rebecca [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Peterson, Matthew Gregor [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Phillips, Cynthia A. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Robinson, David G. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Woodbridge, Diane [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.

  20. Validated analytical modeling of diesel engine regulated exhaust CO emission rate

    Directory of Open Access Journals (Sweden)

    Waleed F Faris

    2016-06-01

    Full Text Available Although vehicle analytical models are often favored for their explainable mathematical trends, no analytical model of the regulated diesel exhaust CO emission rate for trucks had previously been developed. This research develops and validates, for trucks, an analytical model of the steady-speed regulated diesel exhaust CO emission rate. It has been found that the steady-speed CO exhaust emission rate is based on (1) CO2 dissociation, (2) the water–gas shift reaction, and (3) the incomplete combustion of hydrocarbons. It has also been found that the steady-speed CO exhaust emission rate based on CO2 dissociation is considerably less than the rate based on the water–gas shift reaction, and that the water–gas shift reaction is the dominant source of CO exhaust emission. The study shows that the average percentage deviation of the steady-speed simulated results from the corresponding field data is 1.7% for all freeway cycles, with a 99% coefficient of determination at the 95% confidence level. This deviation of the simulated results from field data outperforms its counterpart in widely recognized models such as the comprehensive modal emissions model and VT-Micro for all freeway cycles.
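
    The quoted 1.7% average deviation and 99% coefficient of determination are standard goodness-of-fit measures computed from paired simulated/field values. A sketch of both (the sample numbers are invented for illustration):

    ```python
    def mean_pct_deviation(simulated, observed):
        """Average absolute percentage deviation of simulation from field data."""
        pairs = list(zip(simulated, observed))
        return 100.0 * sum(abs(s - o) / o for s, o in pairs) / len(pairs)

    def r_squared(simulated, observed):
        """Coefficient of determination of simulated vs. observed values."""
        m = sum(observed) / len(observed)
        ss_res = sum((o - s) ** 2 for s, o in zip(simulated, observed))
        ss_tot = sum((o - m) ** 2 for o in observed)
        return 1.0 - ss_res / ss_tot

    sim = [10.2, 19.8, 30.3]   # hypothetical CO emission rates, simulated
    obs = [10.0, 20.0, 30.0]   # hypothetical field measurements
    ```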

  1. Validation Results for LEWICE 3.0

    Science.gov (United States)

    Wright, William B.

    2005-01-01

    A research project is underway at NASA Glenn to produce computer software that can accurately predict ice growth under any meteorological conditions for any aircraft surface. This report presents results from version 3.0 of this software, which is called LEWICE. This version differs from previous releases in that it incorporates additional thermal analysis capabilities, a pneumatic boot model, interfaces to computational fluid dynamics (CFD) flow solvers, and an empirical model for the supercooled large droplet (SLD) regime. An extensive, quantitative comparison against the database of ice shapes and collection efficiencies generated in the NASA Glenn Icing Research Tunnel (IRT) has also been performed. The complete set of data used for this comparison will eventually be available in a contractor report. This paper shows the differences in collection efficiency between LEWICE 3.0 and experimental data. Due to the large amount of validation data available, a separate report is planned for ice shape comparison. This report first describes the LEWICE 3.0 model for water collection. A semi-empirical approach was used to incorporate first-order physical effects of large droplet phenomena into the icing software. Comparisons are then made to every single-element two-dimensional case in the water collection database. Each condition was run using the following five assumptions: 1) potential flow, no splashing; 2) potential flow, no splashing, with 21-bin drop size distributions and a lift correction (angle of attack adjustment); 3) potential flow, with splashing; 4) Navier-Stokes, no splashing; and 5) Navier-Stokes, with splashing. Quantitative comparisons are shown for impingement limit, maximum water catch, and total collection efficiency. The results show that the predicted results are within the accuracy limits of the experimental data for the majority of cases.

  2. Analytic results for asymmetric random walk with exponential transition probabilities

    International Nuclear Information System (INIS)

    Gutkowicz-Krusin, D.; Procaccia, I.; Ross, J.

    1978-01-01

    We present here exact analytic results for a random walk on a one-dimensional lattice with asymmetric, exponentially distributed jump probabilities. We derive the generating functions of such a walk for a perfect lattice and for a lattice with absorbing boundaries. We obtain solutions for some interesting moment properties, such as mean first passage time, drift velocity, dispersion, and branching ratio for absorption. The symmetric exponential walk is solved as a special case. The scaling of the mean first passage time with the size of the system for the exponentially distributed walk is determined by the symmetry and is independent of the range

  3. Analytical results for a hole in an antiferromagnet

    International Nuclear Information System (INIS)

    Li, Y.M.; d'Ambrumenil, N.; Su, Z.B.

    1996-04-01

    The Green's function for a hole moving in an antiferromagnet is derived analytically in the long-wavelength limit. We find that the infrared divergence is eliminated in two and higher dimensions so that the quasiparticle weight is finite. Our results also suggest that the hole motion is polaronic in nature, with a bandwidth proportional to (t²/J) exp[−c(t/J)²] (c is a constant) for J/t ≳ 0.5. The connection of the long-wavelength approximation to the first-order approximation in the cumulant expansion is also clarified. (author). 23 refs, 2 figs

  4. An analytic parton shower. Algorithms, implementation and validation

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, Sebastian

    2012-06-15

    The realistic simulation of particle collisions is an indispensable tool to interpret the data measured at high-energy colliders, for example the Large Hadron Collider now running at CERN. Collisions at these colliders are usually simulated in the form of exclusive events. This thesis focuses on the perturbative QCD part involved in the simulation of these events, particularly parton showers and the consistent combination of parton showers and matrix elements. We present an existing parton shower algorithm for emissions off final-state partons along with some major improvements. Moreover, we present a new parton shower algorithm for emissions off incoming partons. The aim of these particular algorithms, called analytic parton shower algorithms, is to be able to calculate the probabilities for branchings and for whole events after the event has been generated. This allows a reweighting procedure to be applied after the events have been simulated. We give a detailed description of the algorithms, their implementation and the interfaces to the event generator WHIZARD. Moreover, we discuss the implementation of an MLM-type matching procedure and an interface to the shower and hadronization routines from PYTHIA. Finally, we compare several predictions by our implementation to experimental measurements at LEP, Tevatron and LHC, as well as to predictions obtained using PYTHIA. (orig.)

  5. An analytic parton shower. Algorithms, implementation and validation

    International Nuclear Information System (INIS)

    Schmidt, Sebastian

    2012-06-01

    The realistic simulation of particle collisions is an indispensable tool to interpret the data measured at high-energy colliders, for example the Large Hadron Collider now running at CERN. Collisions at these colliders are usually simulated in the form of exclusive events. This thesis focuses on the perturbative QCD part involved in the simulation of these events, particularly parton showers and the consistent combination of parton showers and matrix elements. We present an existing parton shower algorithm for emissions off final-state partons along with some major improvements. Moreover, we present a new parton shower algorithm for emissions off incoming partons. The aim of these particular algorithms, called analytic parton shower algorithms, is to be able to calculate the probabilities for branchings and for whole events after the event has been generated. This allows a reweighting procedure to be applied after the events have been simulated. We give a detailed description of the algorithms, their implementation and the interfaces to the event generator WHIZARD. Moreover, we discuss the implementation of an MLM-type matching procedure and an interface to the shower and hadronization routines from PYTHIA. Finally, we compare several predictions by our implementation to experimental measurements at LEP, Tevatron and LHC, as well as to predictions obtained using PYTHIA. (orig.)

  6. Analytical validation of an ultra low-cost mobile phone microplate reader for infectious disease testing.

    Science.gov (United States)

    Wang, Li-Ju; Naudé, Nicole; Demissie, Misganaw; Crivaro, Anne; Kamoun, Malek; Wang, Ping; Li, Lei

    2018-07-01

    Most mobile health (mHealth) diagnostic devices for laboratory tests only analyze one sample at a time, which is not suitable for large-volume serology testing, especially in low-resource settings with a shortage of health professionals. In this study, we developed an ultra-low-cost, clinically accurate mobile phone microplate reader (mReader), and clinically validated this optical device for 12 infectious disease tests. The mReader optically reads 96 samples on a microplate at one time. 771 de-identified patient samples were tested in 12 serology assays for bacterial/viral infections. The mReader and the clinical instrument blindly read and analyzed all tests in parallel. The analytical accuracy and diagnostic performance of the mReader were evaluated across the clinical reportable categories by comparison with clinical laboratory testing results. The mReader exhibited 97.59-99.90% analytical accuracy. We envision that the mReader can benefit underserved areas/populations and low-resource settings in rural clinics/hospitals at a low cost (~$50 USD) with clinical-level analytical quality. It has the potential to improve health access, speed up healthcare delivery, and reduce health and education disparities by providing access to a low-cost spectrophotometer. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. An analytic solution for numerical modeling validation in electromagnetics: the resistive sphere

    Science.gov (United States)

    Swidinsky, Andrei; Liu, Lifei

    2017-11-01

    We derive the electromagnetic response of a resistive sphere to an electric dipole source buried in a conductive whole space. The solution consists of an infinite series of spherical Bessel functions and associated Legendre polynomials, and follows the well-studied problem of a conductive sphere buried in a resistive whole space in the presence of a magnetic dipole. Our result is particularly useful for controlled-source electromagnetic problems using a grounded electric dipole transmitter and can be used to check numerical methods of calculating the response of resistive targets (such as finite difference, finite volume, finite element and integral equation). While we elect to focus on the resistive sphere in our examples, the expressions in this paper are completely general and allow for arbitrary source frequency, sphere radius, transmitter position, receiver position and sphere/host conductivity contrast so that conductive target responses can also be checked. Commonly used mesh validation techniques consist of comparisons against other numerical codes, but such solutions may not always be reliable or readily available. Alternatively, the response of simple 1-D models can be tested against well-known whole space, half-space and layered earth solutions, but such an approach is inadequate for validating models with curved surfaces. We demonstrate that our theoretical results can be used as a complementary validation tool by comparing analytic electric fields to those calculated through a finite-element analysis; the software implementation of this infinite series solution is made available for direct and immediate application.

  8. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals.

    Science.gov (United States)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György

    2018-01-01

    Background Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimation of the analytical performance specifications for the quality that would be required to allow sharing common reference intervals is needed. The International Federation of Clinical Chemistry (IFCC) recommended a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which could then be used for defining analytical performance specifications as the maximum combination of analytical bias and imprecision required for sharing common reference intervals, the aim of this investigation. Methods Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision and Method 2 is based on the Microsoft Excel formula NORMINV including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results Method 2 gives the correct results with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion The Microsoft Excel formula NORMINV is useful for the estimation of analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
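
    Python's standard library offers the same inverse-normal machinery as Excel's NORMINV (statistics.NormalDist.inv_cdf), so the mechanism behind Method 2, the fraction of a Gaussian reference population pushed outside nominal 95% reference limits by analytical bias and imprecision, can be sketched as follows. This illustrates the mechanism only; the constant 4.4% acceptance level in the paper derives from the sampling variation of 120 reference individuals, which this sketch does not model, and the function name and normalization are ours:

    ```python
    from statistics import NormalDist

    std = NormalDist()
    LO, HI = std.inv_cdf(0.025), std.inv_cdf(0.975)  # nominal 95% limits (NORMINV equivalent)

    def fraction_outside(bias, imprecision):
        """Fraction of a Gaussian population outside the nominal reference
        limits when results carry the given analytical bias and added
        analytical imprecision (both normalized to the biological SD)."""
        total_sd = (1.0 + imprecision ** 2) ** 0.5
        shifted = NormalDist(mu=bias, sigma=total_sd)
        return shifted.cdf(LO) + (1.0 - shifted.cdf(HI))

    # With no analytical error, exactly 5% lies outside the 95% limits.
    print(round(fraction_outside(0.0, 0.0), 4))  # -> 0.05
    ```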

  9. Analytical results of radiochemistry of the JRR-3M

    International Nuclear Information System (INIS)

    Yoshijima, Tetsuo; Tanaka, Sumitoshi

    1997-07-01

    The JRR-3 was modified for upgrading to enhance the experimental capabilities in 1990 as JRR-3M. JRR-3M is pool type research reactor, moderated and cooled by light water with a maximum thermal power of 20 MWt and a thermal neutron flux of about 2x10 14 n/cm 2 ·sec. The core internal structure and fuel cladding tube is made by aluminum alloy. The cooling systems are composed of primary cooling system, secondary cooling system, heavy water reflector system and helium gas system. The primary piping system, reactor pool and heavy water reflector system is constructed of type 304 stainless steel. The main objectives of radiochemistry are check the general corrosion of structural materials and detection of failed fuel elements for safe operation of reactor plant. In this report analytical results of radiochemistry and evaluation of radionuclides of cooling systems in the JRR-3M are described. (author)

  10. Evaluation of analytical results on DOE Quality Assessment Program Samples

    International Nuclear Information System (INIS)

    Jaquish, R.E.; Kinnison, R.R.; Mathur, S.P.; Sastry, R.

    1985-01-01

    Criteria were developed for evaluating the participants' analytical results in the DOE Quality Assessment Program (QAP). Historical data from previous QAP studies were analyzed using descriptive statistical methods to determine the interlaboratory precision that had been attained. Performance criteria used in other similar programs were also reviewed. Using these data, precision values and control limits were recommended for each type of analysis performed in the QA program. Results of the analyses performed by the QAP participants on the November 1983 samples were statistically analyzed and evaluated. The Environmental Measurements Laboratory (EML) values were used as the known values, and 3-sigma precision values were used as control limits. Results were submitted by 26 participating laboratories for 49 different radionuclide-media combinations. The participants reported 419 results, of which 350 (84%) were within control limits. Special attention was given to the data from gamma spectral analysis of air filters and water samples. Both normal probability and box plots were prepared for each nuclide to help evaluate the distribution of the data. Results outside the expected range were identified, and it was suggested that the laboratories check their calculations and procedures for these results

  11. Validation of analytical methods for the stability studies of naproxen suppositories for infant and adult use

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar

    2011-01-01

    Analytical and validation studies were performed in this paper, with a view to their use in stability studies of future formulations of naproxen suppositories for children and adults. The factors with the greatest influence on naproxen stability were determined: the major degradation occurred in acid medium, in oxidative medium and under the action of light. A high-performance liquid chromatography method was evaluated, which proved adequate to quantify naproxen in suppositories and was selective against degradation products. The quantification limit was 3,480 μg, so the method was valid for these studies. Additionally, the parameters specificity for stability, detection limit and quantification limit were evaluated for the direct semi-aqueous acid-base method, which had formerly been validated for quality control and showed satisfactory results. Nevertheless, volumetric methods are not regarded as stability indicators; therefore, this method will be used along with the chromatographic methods of choice, thin-layer chromatography and high-performance liquid chromatography, to determine the degradation products

  12. Dosimetric validation of the anisotropic analytical algorithm for photon dose calculation: fundamental characterization in water

    International Nuclear Information System (INIS)

    Fogliata, Antonella; Nicolini, Giorgia; Vanetti, Eugenio; Clivio, Alessandro; Cozzi, Luca

    2006-01-01

    In July 2005 a new algorithm was released by Varian Medical Systems for the Eclipse planning system and installed in our institute. It is the anisotropic analytical algorithm (AAA) for photon dose calculations, a convolution/superposition model implemented for the first time in a Varian planning system. It was therefore necessary to perform validation studies at different levels with a wide investigation approach. To validate the basic performance of the AAA, a detailed analysis of data computed by the AAA configuration algorithm was carried out and the data were compared against measurements. To better appraise the performance of the AAA and the capability of its configuration to tailor machine-specific characteristics, data obtained from the pencil beam convolution (PBC) algorithm implemented in Eclipse were also added to the comparison. Since the purpose of the paper is to address the basic performance of the AAA and of its configuration procedures, only data relative to measurements in water are reported. Validation was carried out for three beams: 6 MV and 15 MV from a Clinac 2100C/D and 6 MV from a Clinac 6EX. In general, AAA calculations reproduced the measured data very well, and only small deviations were observed, on average, for all the quantities investigated for open and wedged fields. In particular, percentage depth-dose curves showed average differences between calculation and measurement smaller than 1% or 1 mm, and computed profiles in the flattened region matched measurements with deviations smaller than 1% for all beams, field sizes, depths and wedges. Percentage differences in output factors were as small as 1% on average (with a range smaller than ±2%) for all conditions. Additional tests carried out for enhanced dynamic wedges gave comparable results. The basic dosimetric validation of the AAA was therefore considered satisfactory.

  13. Analytical Performance Modeling and Validation of Intel’s Xeon Phi Architecture

    Energy Technology Data Exchange (ETDEWEB)

    Chunduri, Sudheer; Balaprakash, Prasanna; Morozov, Vitali; Vishwanath, Venkatram; Kumaran, Kalyan

    2017-01-01

    Modeling the performance of scientific applications on emerging hardware plays a central role in achieving extreme-scale computing goals. Analytical models that capture the interaction between applications and hardware characteristics are attractive because even a reasonably accurate model can be useful for performance tuning before the hardware is made available. In this paper, we develop a hardware model for Intel’s second-generation Xeon Phi architecture code-named Knights Landing (KNL) for the SKOPE framework. We validate the KNL hardware model by projecting the performance of mini-benchmarks and application kernels. The results show that our KNL model can project the performance with prediction errors of 10% to 20%. The hardware model also provides informative recommendations for code transformations and tuning.

  14. Determination of proline in honey: comparison between official methods, optimization and validation of the analytical methodology.

    Science.gov (United States)

    Truzzi, Cristina; Annibaldi, Anna; Illuminati, Silvia; Finale, Carolina; Scarponi, Giuseppe

    2014-05-01

    The study compares official spectrophotometric methods for the determination of proline content in honey - those of the International Honey Commission (IHC) and the Association of Official Analytical Chemists (AOAC) - with the original Ough method. Results show that the extra time-consuming treatment stages added by the IHC method with respect to the Ough method are pointless. We demonstrate that the AOAC method proves to be the best in terms of accuracy and time saving. The optimized waiting time for the absorbance recording is set at 35 min from the removal of the reaction tubes from the boiling bath used in the sample treatment. The optimized method was validated in the matrix: linearity up to 1800 mg L⁻¹, limit of detection 20 mg L⁻¹, limit of quantification 61 mg L⁻¹. The method was applied to 43 unifloral honey samples from the Marche region, Italy. Copyright © 2013 Elsevier Ltd. All rights reserved.
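Detection and quantification limits of the kind reported above are commonly derived from the calibration function using the ICH-style rules LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the standard deviation of the blank response and S the calibration slope. A minimal sketch; the function name and the calibration numbers below are illustrative, not the honey data from the paper:

```python
def detection_limits(sigma_blank: float, slope: float) -> tuple[float, float]:
    """ICH-style limits from a calibration line:
    LOD = 3.3 * sigma / S, LOQ = 10 * sigma / S."""
    lod = 3.3 * sigma_blank / slope
    loq = 10.0 * sigma_blank / slope
    return lod, loq

# Illustrative calibration: slope 0.0005 AU per mg/L, blank SD 0.003 AU
lod, loq = detection_limits(0.003, 0.0005)
print(round(lod, 1), round(loq, 1))  # → 19.8 60.0
```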

  15. LATUX: An Iterative Workflow for Designing, Validating, and Deploying Learning Analytics Visualizations

    Science.gov (United States)

    Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew

    2015-01-01

    Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…

  16. Analytical validation of a novel multiplex test for detection of advanced adenoma and colorectal cancer in symptomatic patients.

    Science.gov (United States)

    Dillon, Roslyn; Croner, Lisa J; Bucci, John; Kairs, Stefanie N; You, Jia; Beasley, Sharon; Blimline, Mark; Carino, Rochele B; Chan, Vicky C; Cuevas, Danissa; Diggs, Jeff; Jennings, Megan; Levy, Jacob; Mina, Ginger; Yee, Alvin; Wilcox, Bruce

    2018-05-30

    Early detection of colorectal cancer (CRC) is key to reducing associated mortality. Despite the importance of early detection, approximately 40% of individuals in the United States between the ages of 50-75 have never been screened for CRC. The low compliance with colonoscopy and fecal-based screening may be addressed with a non-invasive alternative such as a blood-based test. We describe here the analytical validation of a multiplexed blood-based assay that measures the plasma concentrations of 15 proteins to assess advanced adenoma (AA) and CRC risk in symptomatic patients. The test was developed on an electrochemiluminescent immunoassay platform employing four multi-marker panels, to be implemented in the clinic as a laboratory developed test (LDT). Under the Clinical Laboratory Improvement Amendments (CLIA) and College of American Pathologists (CAP) regulations, a United States-based clinical laboratory utilizing an LDT must establish performance characteristics relating to analytical validity prior to releasing patient test results. This report describes a series of studies demonstrating the precision, accuracy, analytical sensitivity, and analytical specificity for each of the 15 assays, as required by CLIA/CAP. In addition, the report describes studies characterizing each of the assays' dynamic range, parallelism, tolerance to common interfering substances, spike recovery, and stability to sample freeze-thaw cycles. Upon completion of the analytical characterization, a clinical accuracy study was performed to evaluate concordance of AA and CRC classifier model calls using the analytical method intended for use in the clinic. Of 434 symptomatic patient samples tested, the percent agreement with original CRC and AA calls was 87% and 92% respectively. All studies followed CLSI guidelines and met the regulatory requirements for implementation of a new LDT. 
The results provide the analytical evidence to support the implementation of the novel multi-marker test as

  17. Analytic results for the one loop NMHV H anti qqgg amplitude

    Energy Technology Data Exchange (ETDEWEB)

    Badger, Simon [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Campbell, John M. [Glasgow Univ. (United Kingdom). Dept. of Physics and Astronomy; Ellis, R. Keith [Fermilab, Batavia, IL (United States); Williams, Ciaran [Durham Univ. (United Kingdom). Dept. of Physics

    2009-10-23

    We compute the one-loop amplitude for a Higgs boson, a quark-antiquark pair and a pair of gluons of negative helicity, i.e. for the next-to-maximally helicity violating (NMHV) case, A(H, 1⁻_{anti-q}, 2⁺_q, 3⁻_g, 4⁻_g). The calculation is performed using an effective Lagrangian which is valid in the limit of very large top quark mass. As a result of this paper all amplitudes for the transition of a Higgs boson into 4 partons are now known analytically at one-loop order. (orig.)

  18. Analytic results for the one loop NMHV H anti qqgg amplitude

    International Nuclear Information System (INIS)

    Badger, Simon; Campbell, John M.; Williams, Ciaran

    2009-01-01

    We compute the one-loop amplitude for a Higgs boson, a quark-antiquark pair and a pair of gluons of negative helicity, i.e. for the next-to-maximally helicity violating (NMHV) case, A(H, 1⁻_{anti-q}, 2⁺_q, 3⁻_g, 4⁻_g). The calculation is performed using an effective Lagrangian which is valid in the limit of very large top quark mass. As a result of this paper all amplitudes for the transition of a Higgs boson into 4 partons are now known analytically at one-loop order. (orig.)

  19. Principles of Single-Laboratory Validation of Analytical Methods for Testing the Chemical Composition of Pesticides

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Underlying theoretical and practical approaches towards pesticide formulation analysis are discussed, i.e. general principles, performance characteristics, applicability of validation data, verification of method performance, and adaptation of validated methods by other laboratories. The principles of single laboratory validation of analytical methods for testing the chemical composition of pesticides are outlined. Also the theoretical background is described for performing pesticide formulation analysis as outlined in ISO, CIPAC/AOAC and IUPAC guidelines, including methodological characteristics such as specificity, selectivity, linearity, accuracy, trueness, precision and bias. Appendices I–III hereof give practical and elaborated examples on how to use the Horwitz approach and formulae for estimating the target standard deviation towards acceptable analytical repeatability. The estimation of trueness and the establishment of typical within-laboratory reproducibility are treated in greater detail by means of worked-out examples. (author)
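The Horwitz approach mentioned above predicts an acceptable between-laboratory relative standard deviation purely from the analyte's mass fraction, RSD_R(%) = 2^(1 − 0.5·log₁₀ C). A minimal sketch (the function name is ours, not from the guideline):

```python
import math

def horwitz_rsd(mass_fraction: float) -> float:
    """Horwitz predicted reproducibility RSD in percent.

    mass_fraction: analyte concentration as a dimensionless mass
    fraction (1.0 = 100%, 1e-6 = 1 ppm).
    """
    return 2.0 ** (1.0 - 0.5 * math.log10(mass_fraction))

print(round(horwitz_rsd(1.0), 2))   # pure substance → 2.0 (%)
print(round(horwitz_rsd(1e-6), 2))  # 1 ppm → 16.0 (%)
```

The target standard deviation for repeatability is then typically taken as a fixed fraction of this predicted reproducibility.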

  20. Ethical leadership: meta-analytic evidence of criterion-related and incremental validity.

    Science.gov (United States)

    Ng, Thomas W H; Feldman, Daniel C

    2015-05-01

    This study examines the criterion-related and incremental validity of ethical leadership (EL) with meta-analytic data. Across 101 samples published over the last 15 years (N = 29,620), we observed that EL demonstrated acceptable criterion-related validity with variables that tap followers' job attitudes, job performance, and evaluations of their leaders. Further, followers' trust in the leader mediated the relationships of EL with job attitudes and performance. In terms of incremental validity, we found that EL significantly, albeit weakly in some cases, predicted task performance, citizenship behavior, and counterproductive work behavior-even after controlling for the effects of such variables as transformational leadership, use of contingent rewards, management by exception, interactional fairness, and destructive leadership. The article concludes with a discussion of ways to strengthen the incremental validity of EL. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  1. Validation of an analytical methodology for the quantitative analysis of petroleum hydrocarbons in marine sediment samples

    Directory of Open Access Journals (Sweden)

    Eloy Yordad Companioni Damas

    2009-01-01

    This work describes the validation of an analytical procedure for the analysis of petroleum hydrocarbons in marine sediment samples. The proposed protocol is able to measure n-alkanes and polycyclic aromatic hydrocarbons (PAH) in samples at concentrations as low as 30 ng/g, with a precision better than 15% for most analytes. The extraction efficiency from fortified sediments varied from 65.1 to 105.6% and from 59.7 to 97.8% for n-alkanes and PAH in the ranges C16-C32 and fluoranthene-benzo(a)pyrene, respectively. The analytical protocol was applied to determine petroleum hydrocarbons in sediments collected from a marine coastal zone.

  2. Validation of the analytical method for sodium dichloroisocyanurate aimed at drinking water disinfection

    International Nuclear Information System (INIS)

    Martinez Alvarez, Luis Octavio; Alejo Cisneros, Pedro; Garcia Pereira, Reynaldo; Campos Valdez, Doraily

    2014-01-01

    Cuba has developed its first effervescent 3.5 mg sodium dichloroisocyanurate tablets, with a non-therapeutic active principle. This ingredient releases a certain amount of chlorine when dissolved in a litre of water and provides adequate disinfection of drinking water, which is ready to drink after 30 min. The aim was to develop and validate an analytical iodometric method applicable to the quality control of the effervescent 3.5 mg sodium dichloroisocyanurate tablets

  3. Wetting boundary condition for the color-gradient lattice Boltzmann method: Validation with analytical and experimental data

    Science.gov (United States)

    Akai, Takashi; Bijeljic, Branko; Blunt, Martin J.

    2018-06-01

    In the color gradient lattice Boltzmann model (CG-LBM), a fictitious-density wetting boundary condition has been widely used because of its ease of implementation. However, as we show, this may lead to inaccurate results in some cases. In this paper, a new scheme for the wetting boundary condition is proposed which can handle complicated 3D geometries. The validity of our method for static problems is demonstrated by comparing the simulated results to analytical solutions in 2D and 3D geometries with curved boundaries. Then, capillary rise simulations are performed to study dynamic problems where the three-phase contact line moves. The results are compared to experimental results in the literature (Heshmati and Piri, 2014). If a constant contact angle is assumed, the simulations agree with the analytical solution based on the Lucas-Washburn equation. However, to match the experiments, we need to implement a dynamic contact angle that varies with the flow rate.
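The Lucas-Washburn benchmark used above has a closed-form penetration length, l(t) = sqrt(γ r cosθ · t / (2μ)) for a capillary tube of radius r. A minimal sketch with illustrative water-air values (the numbers are ours, not the paper's):

```python
import math

def lucas_washburn_length(gamma: float, radius: float,
                          theta_deg: float, mu: float, t: float) -> float:
    """Imbibition length l(t) = sqrt(gamma * r * cos(theta) * t / (2 * mu)).

    gamma: surface tension (N/m), radius: tube radius (m),
    theta_deg: static contact angle (degrees), mu: viscosity (Pa s),
    t: time (s).
    """
    cos_t = math.cos(math.radians(theta_deg))
    return math.sqrt(gamma * radius * cos_t * t / (2.0 * mu))

# Water in a 100 um radius tube, fully wetting, after 1 s:
l = lucas_washburn_length(gamma=0.0728, radius=1e-4, theta_deg=0.0,
                          mu=1e-3, t=1.0)
print(round(l * 1000, 1), "mm")  # → 60.3 mm
```

The sqrt(t) growth of this solution is what a simulation with a constant contact angle should reproduce; matching the experiments additionally requires the flow-rate-dependent dynamic angle discussed above.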

  4. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals

    DEFF Research Database (Denmark)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G

    2018-01-01

    for the combination of analytical bias and imprecision and Method 2 is based on the Microsoft Excel formula NORMINV including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision...... are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results Method 2 gives the correct results with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion The Microsoft Excel formula NORMINV is useful for the estimation...

  5. Analytical support of plant specific SAMG development validation of SAMG using MELCOR 1.8.5

    International Nuclear Information System (INIS)

    Duspiva, Jiri

    2006-01-01

    There are two NPPs in operation in the Czech Republic. Both NPPs have already implemented EOPs, developed in collaboration with WESE. The project on SAMG development has started and follows the previous one for EOPs, again with WESE as the leading organization. Plant-specific SAMGs for the Temelin and Dukovany NPPs are based on the WOG generic SAMGs. The analytical support of plant-specific SAMG development is performed by NRI Rez within the validation process. The basic conditions, and how NRI Rez fulfils them, concern the analysts, the analytical tools and their applications. A more detailed description is given of the approach to preparing the MELCOR code application for the evaluation of hydrogen risk, the validation of the recent set of hydrogen passive autocatalytic recombiners and the definition of proposals to amend the hydrogen removal system. This kind of parametric calculation requires a very wide set of runs. That is not feasible with the whole plant model, so the only way is to decouple the calculation, storing the mass and energy sources into the containment. An example of this decoupling for the LOCA scenario is shown. It includes seven sources: heat losses from the primary and secondary circuits, fluid blowdown through the cold leg break, fission product blowdown through the cold leg break, fluid blowdown through the break in the reactor pressure vessel bottom head, fission products through the break in the reactor pressure vessel bottom head, melt ejection from the reactor pressure vessel to the cavity, and gas masses and heat losses from corium in the cavity. The stand-alone containment analysis was tested in two configurations, with and without fission products taken into account. Testing showed very good agreement of all calculations until lower head failure and acceptable agreement after that, although some problematic features appeared. The stand-alone test with fission products was possible only after changes in the source code

  6. Development, validation and evaluation of an analytical method for the determination of monomeric and oligomeric procyanidins in apple extracts.

    Science.gov (United States)

    Hollands, Wendy J; Voorspoels, Stefan; Jacobs, Griet; Aaby, Kjersti; Meisland, Ane; Garcia-Villalba, Rocio; Tomas-Barberan, Francisco; Piskula, Mariusz K; Mawson, Deborah; Vovk, Irena; Needs, Paul W; Kroon, Paul A

    2017-04-28

    There is a lack of data for individual oligomeric procyanidins in apples and apple extracts. Our aim was to develop, validate and evaluate an analytical method for the separation, identification and quantification of monomeric and oligomeric flavanols in apple extracts. To achieve this, we prepared two types of flavanol extracts from freeze-dried apples: one was an epicatechin-rich extract containing ∼30% (w/w) monomeric (-)-epicatechin which also contained oligomeric procyanidins (Extract A); the second was an oligomeric procyanidin-rich extract depleted of epicatechin (Extract B). The parameters considered for method optimisation were HPLC columns and conditions, sample heating, mass of extract and dilution volumes. The performance characteristics considered for method validation included standard linearity, method sensitivity, precision and trueness. Eight laboratories participated in the method evaluation. Chromatographic separation of the analytes was best achieved using a HILIC column with a binary mobile phase consisting of acidic acetonitrile and acidic aqueous methanol. The final method showed linearity for epicatechin in the range 5-100 μg/mL with a correlation coefficient >0.999. Intra-day and inter-day precision of the analytes ranged from 2 to 6% and 2 to 13%, respectively. Up to dp3, trueness of the method was >95% but decreased with increasing dp. Within-laboratory precision showed RSD values <5% and <10% for monomers and oligomers, respectively. Between-laboratory precision was 4-15% (Extract A) and 7-30% (Extract B) for monomers and oligomers, respectively. An analytical method for the separation, identification and quantification of procyanidins in an apple extract was developed, validated and assessed. The results of the inter-laboratory evaluation indicate that the method is reliable and reproducible. Copyright © 2017. Published by Elsevier B.V.

  7. Modification and validation of an analytical source model for external beam radiotherapy Monte Carlo dose calculations

    Energy Technology Data Exchange (ETDEWEB)

    Davidson, Scott E., E-mail: sedavids@utmb.edu [Radiation Oncology, The University of Texas Medical Branch, Galveston, Texas 77555 (United States); Cui, Jing [Radiation Oncology, University of Southern California, Los Angeles, California 90033 (United States); Kry, Stephen; Ibbott, Geoffrey S.; Followill, David S. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States); Deasy, Joseph O. [Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, New York 10065 (United States); Vicic, Milos [Department of Applied Physics, University of Belgrade, Belgrade 11000 (Serbia); White, R. Allen [Bioinformatics and Computational Biology, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States)

    2016-08-15

    Purpose: A dose calculation tool, which combines the accuracy of the dose planning method (DPM) Monte Carlo code and the versatility of a practical analytical multisource model reported previously, has been improved and validated for the Varian 6 and 10 MV linear accelerators (linacs). The calculation tool can be used to calculate doses in advanced clinical application studies. One shortcoming of current clinical trials that report dose from patient plans is the lack of a standardized dose calculation methodology. Because commercial treatment planning systems (TPSs) have their own dose calculation algorithms, and the clinical trial participant who uses these systems is responsible for commissioning the beam model, variation exists in the reported calculated dose distributions. Today's modern linac is manufactured to tight specifications, so variability within a linac model is quite low. The expectation is that a single dose calculation tool for a specific linac model can be used to accurately recalculate dose from patient plans that have been submitted to the clinical trial community from any institution. The calculation tool would provide for a more meaningful outcome analysis. Methods: The analytical source model was described by a primary point source, a secondary extra-focal source, and a contaminant electron source. Off-axis energy softening and fluence effects were also included. Hyperbolic functions have been incorporated into the model to correct for the changes in output and in electron contamination with field size. A multileaf collimator (MLC) model is included to facilitate phantom and patient dose calculations. An offset to the MLC leaf positions was used to correct for the rudimentary assumed primary point source. Results: Dose calculations of the depth dose and profiles for field sizes 4 × 4 to 40 × 40 cm agree with measurement within 2% of the maximum dose or 2 mm distance to agreement (DTA) for 95% of the data.

  8. SU-E-T-479: Development and Validation of Analytical Models Predicting Secondary Neutron Radiation in Proton Therapy Applications

    International Nuclear Information System (INIS)

    Farah, J; Bonfrate, A; Donadille, L; Martinetti, F; Trompier, F; Clairand, I; De Olivera, A; Delacroix, S; Herault, J; Piau, S; Vabre, I

    2014-01-01

    Purpose: Test and validation of analytical models predicting leakage neutron exposure in passively scattered proton therapy. Methods: Taking inspiration from the literature, this work attempts to build an analytical model predicting neutron ambient dose equivalents, H*(10), within the local 75 MeV ocular proton therapy facility. MC simulations were first used to model H*(10) in the beam axis plane while considering a closed final collimator and pristine Bragg peak delivery. Next, the MC-based analytical model was tested against simulation results and experimental measurements. The model was also extended in the vertical direction to enable a full 3D mapping of H*(10) inside the treatment room. Finally, the work focused on upgrading the literature model to clinically relevant configurations considering modulated beams, open collimators, patient-induced neutron fluctuations, etc. Results: The MC-based analytical model efficiently reproduced simulated H*(10) values with a maximum difference below 10%. In addition, it succeeded in predicting measured H*(10) values with differences <40%. The highest differences were registered at the closest and farthest positions from the isocenter, where the analytical model failed to faithfully reproduce the high neutron fluence and energy variations. The differences remain, however, acceptable taking into account the high measurement/simulation uncertainties and the end use of this model, i.e. radiation protection. Moreover, the model was successfully extended (differences <20% on simulations and <45% on measurements) to predict neutrons in the vertical direction with respect to the beam line, as patients are in the upright seated position during ocular treatments. Accounting for the impact of beam modulation, collimation and the presence of a patient in the beam path is far more challenging, and conversion coefficients are currently being defined to predict stray neutrons in clinically representative treatment configurations. Conclusion

  9. Median of patient results as a tool for assessment of analytical stability

    DEFF Research Database (Denmark)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft

    2015-01-01

    BACKGROUND: In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. METHOD......: Patient data from daily internal control schemes were used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable...... analytical bias based on biological variation. RESULTS: Seventy-five percent of the twenty analytes achieved on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. DISCUSSION: Patient results applied in analytical quality performance...
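The patient-median approach can be sketched as follows: pool each month's patient results, take the median, and flag months whose deviation from the long-term median exceeds the allowable bias derived from biological variation (the desirable specification is commonly 0.25·sqrt(CVI² + CVG²)). Function and variable names are illustrative, not from the paper:

```python
from statistics import median

def allowable_bias(cv_within: float, cv_between: float) -> float:
    """Desirable specification for bias (%) from within-subject (CVI)
    and between-subject (CVG) biological variation."""
    return 0.25 * (cv_within ** 2 + cv_between ** 2) ** 0.5

def flag_unstable_months(monthly_results: dict[str, list[float]],
                         target_median: float,
                         bias_limit_pct: float) -> list[str]:
    """Return the months whose patient-result median deviates from the
    long-term target median by more than the allowable bias."""
    flagged = []
    for month, results in monthly_results.items():
        deviation_pct = abs(median(results) - target_median) / target_median * 100
        if deviation_pct > bias_limit_pct:
            flagged.append(month)
    return flagged

# Illustrative: a sodium-like analyte with tight biological variation
limit = allowable_bias(0.6, 0.7)                 # roughly 0.23%
data = {"2015-01": [139, 140, 141, 140],
        "2015-02": [141, 142, 143, 142]}         # drifted upwards
print(flag_unstable_months(data, target_median=140.0,
                           bias_limit_pct=limit))  # → ['2015-02']
```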

  10. Roll-up of validation results to a target application.

    Energy Technology Data Exchange (ETDEWEB)

    Hills, Richard Guy

    2013-09-01

    Suites of experiments are performed over a validation hierarchy to test computational simulation models for complex applications. Experiments within the hierarchy can be performed at different conditions and configurations than those for an intended application, with each experiment testing only part of the physics relevant for the application. The purpose of the present work is to develop methodology to roll up validation results to an application, and to assess the impact the validation hierarchy design has on the roll-up results. The roll-up is accomplished through the development of a meta-model that relates validation measurements throughout a hierarchy to the desired response quantities for the target application. The meta-model is developed using the computational simulation models for the experiments and the application. The meta-model approach is applied to a series of example transport problems that represent complete and incomplete coverage of the physics of the target application by the validation experiments.

  11. Validation of analytical method for quality control of B12 Vitamin-10 000 injection

    International Nuclear Information System (INIS)

    Botet Garcia, Martha; Garcia Penna, Caridad Margarita; Troche Concepcion, Yenilen; Cannizares Arencibia, Yanara; Moreno Correoso, Barbara

    2009-01-01

    The analytical method reported by the USA Pharmacopeia was validated for the quality control of injectable B12 vitamin (10 000 U) by UV spectrophotometry, because this is a simpler, low-cost method allowing quality control of the finished product. The calibration curve was plotted over the 60 to 140% interval, where it was linear with a correlation coefficient equal to 0.9999; the statistical tests for intercept and slope were non-significant. A recovery of 99.7% was obtained over the studied concentration interval, where the Cochran (G) and Student (t) tests were also non-significant. The coefficient of variation in the repeatability study was 0.59% for the 6 assayed replicates, whereas in the intermediate precision analysis the Fisher and Student tests were non-significant. The analytical method was linear, precise, specific and exact over the studied concentration interval

  12. Analytical method validation for quality control and the study of the 50 mg Propylthiouracil stability

    International Nuclear Information System (INIS)

    Valdes Bendoyro, Maria Olga; Garcia Penna, Caridad Margarita; Fernandez, Juan Lugones; Garcia Borges, Lisandra; Martinez Espinosa, Vivian

    2010-01-01

    A high-performance liquid chromatography analytical method was developed and validated for the quality control and stability studies of 50 mg Propylthiouracil tablets. The method is based on separation of the active principle on a Lichrospher 100 RP-18 (5 μm, 250 x 4 mm) chromatographic column with UV detection at 272 nm, using a mobile phase composed of a degassed mixture of 0.025 M monobasic potassium phosphate buffer solution at pH 4.6 and acetonitrile in an 80:20 ratio at a flow rate of 0.5 mL/min. The analytical method was linear, precise, specific and exact over the studied concentration interval.

  13. Use of reference materials for validating analytical methods. Applied to the determination of As, Co, Na, Hg, Se and Fe using neutron activation analysis

    International Nuclear Information System (INIS)

    Munoz, L; Andonie, O; Kohnenkamp, I

    2000-01-01

    The main purpose of an analytical laboratory is to provide reliable information on the nature and composition of the materials submitted for analysis. This purpose can only be attained if analytical methodologies that have the attributes of accuracy, precision, specificity and sensitivity, among others, are used. The process by which these attributes are evaluated is called validation of the analytical method. The Chilean Nuclear Energy Commission's Neutron Activation Analysis Laboratory is applying a quality assurance program to ensure the quality of its analytical results, which also aims to attain accreditation for some of its measurements. Validation of the analytical methodologies used is an essential part of applying this program. There are many forms of validation, from comparison with reference techniques to participation in inter-comparison rounds. Certified reference materials were used in this work to validate the application of neutron activation analysis in determining As, Co, Na, Hg, Se and Fe in shellfish samples. The use of reference materials was chosen because it is a simple option that easily detects sources of systematic error. Neutron activation analysis is an instrumental analytical method that needs no chemical treatment and that is based on processes taking place in the nuclei of atoms, making matrix effects unimportant, so different biological reference materials can be used. The following certified reference materials were used for validating the method: BCR human hair 397, NRCC dogfish muscle DORM-2, NRCC dogfish liver DOLT-2, NIST oyster tissue 1566, NIES mussel 6 and BCR tuna fish 464. The reference materials were analyzed using the procedure developed for the shellfish samples and the above-mentioned elements were determined. With the results obtained, the parameters of accuracy, precision, detection limit, quantification limit and uncertainty associated with the method were determined for each
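
    A common way to use CRM results for validation is to compare each measured value with its certified value through a zeta-type score. A minimal sketch; the element values and uncertainties below are illustrative, not the certified data from the study:

```python
import math

# Hypothetical comparison of measured vs. certified CRM values (mg/kg);
# each pair is (value, one-standard-deviation uncertainty).
certified = {"As": (18.0, 1.1), "Se": (1.40, 0.09), "Fe": (142.0, 10.0)}
measured  = {"As": (17.2, 0.9), "Se": (1.52, 0.08), "Fe": (150.0, 9.0)}

zetas = {}
for element, (c_val, c_unc) in certified.items():
    m_val, m_unc = measured[element]
    # Zeta score: difference over the combined standard uncertainty.
    zetas[element] = (m_val - c_val) / math.sqrt(c_unc**2 + m_unc**2)
    status = "consistent" if abs(zetas[element]) <= 2 else "possible bias"
    print(f"{element}: zeta = {zetas[element]:+.2f} -> {status}")
```

    A |zeta| at or below about 2 indicates the measured and certified values agree within their combined uncertainty, i.e., no detectable systematic error.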

  14. Interacting steps with finite-range interactions: Analytical approximation and numerical results

    Science.gov (United States)

    Jaramillo, Diego Felipe; Téllez, Gabriel; González, Diego Luis; Einstein, T. L.

    2013-05-01

    We calculate an analytical expression for the terrace-width distribution P(s) for an interacting step system with nearest- and next-nearest-neighbor interactions. Our model is derived by mapping the step system onto a statistically equivalent one-dimensional system of classical particles. The validity of the model is tested with several numerical simulations and experimental results. We explore the effect of the range of interactions q on the functional form of the terrace-width distribution and pair correlation functions. For physically plausible interactions, we find modest changes when next-nearest neighbor interactions are included and generally negligible changes when more distant interactions are allowed. We discuss methods for extracting from simulated experimental data the characteristic scale-setting terms in assumed potential forms.

  15. Decentral gene expression analysis: analytical validation of the Endopredict genomic multianalyte breast cancer prognosis test

    Directory of Open Access Journals (Sweden)

    Kronenwett Ralf

    2012-10-01

    Full Text Available Abstract Background EndoPredict (EP) is a clinically validated multianalyte gene expression test to predict distant metastasis in ER-positive, HER2-negative breast cancer treated with endocrine therapy alone. The test is based on the combined analysis of 12 genes in formalin-fixed, paraffin-embedded (FFPE) tissue by reverse transcription-quantitative real-time PCR (RT-qPCR). Recently, it was shown that EP is feasible for reliable decentralized assessment of gene expression. The aim of this study was the analytical validation of the performance characteristics of the assay and its verification in a molecular-pathological routine laboratory. Methods Gene expression values to calculate the EP score were assayed by one-step RT-qPCR using RNA from FFPE tumor tissue. Limit of blank, limit of detection, linear range, and PCR efficiency were assessed for each of the 12 PCR assays using serial sample dilutions. Different breast cancer samples were used to evaluate RNA input range, precision and inter-laboratory variability. Results PCR assays were linear up to Cq values between 35.1 and 37.2. Amplification efficiencies ranged from 75% to 101%. The RNA input range without considerable change of the EP score was between 0.16 and 18.5 ng/μl. Analysis of precision (variation of day, day time, instrument, operator, reagent lots) resulted in a total noise (standard deviation) of 0.16 EP score units on a scale from 0 to 15. The major part of the total noise (SD 0.14) was caused by the replicate-to-replicate noise of the PCR assays (repeatability) and was not associated with different operating conditions (reproducibility). Performance characteristics established in the manufacturer’s laboratory were verified in a routine molecular pathology laboratory. Comparison of 10 tumor samples analyzed in two different laboratories showed a Pearson coefficient of 0.995 and a mean deviation of 0.15 score units. Conclusions The EP test showed reproducible performance

  16. Analytical quality control in environmental analysis - Recent results and future trends of the IAEA's analytical quality control programme

    Energy Technology Data Exchange (ETDEWEB)

    Suschny, O; Heinonen, J

    1973-12-01

    The significance of analytical results depends critically on the degree of their reliability; an assessment of this reliability is indispensable if the results are to have any meaning at all. Environmental radionuclide analysis is a relatively new analytical field in which new methods are continuously being developed and into which many new laboratories have entered during the last ten to fifteen years. The scarcity of routine methods and the lack of experience of the new laboratories have made the need for assessing the reliability of results particularly urgent in this field. The IAEA, since 1962, has provided assistance to its member states by making analytical quality control services available to their laboratories in the form of standard samples, reference materials and the organization of analytical intercomparisons. The scope of this programme has increased over the years and now includes, in addition to environmental radionuclides, non-radioactive environmental contaminants which may be analysed by nuclear methods, materials for forensic neutron activation analysis, bioassay materials and nuclear fuel. The results obtained in recent intercomparisons demonstrate the continued need for these services. (author)

  17. Communicating Qualitative Analytical Results Following Grice's Conversational Maxims

    Science.gov (United States)

    Chenail, Jan S.; Chenail, Ronald J.

    2011-01-01

    Conducting qualitative research can be seen as a developing communication act through which researchers engage in a variety of conversations. Articulating the results of qualitative data analysis results can be an especially challenging part of this scholarly discussion for qualitative researchers. To help guide investigators through this…

  18. Analytic expressions for mode conversion in a plasma with a parabolic density profile: Generalized results

    International Nuclear Information System (INIS)

    Hinkel-Lipsker, D.E.; Fried, B.D.; Morales, G.J.

    1993-01-01

    This study provides an analytic solution to the general problem of mode conversion in an unmagnetized plasma. Specifically, an electromagnetic wave of frequency ω propagating through a plasma with a parabolic density profile of scale length L_p is examined. The mode conversion points are located a distance Δ_0 from the peak of the profile, where the electron plasma frequency ω_p(z) matches the wave frequency ω. The corresponding reflection, transmission, and mode conversion coefficients are expressed analytically in terms of parabolic cylinder functions for all values of Δ_0. The method of solution is based on a source approximation technique that is valid when the electromagnetic and electrostatic scale lengths are well separated. For large Δ_0, i.e., (cL_p/ω)^(1/2) ≪ Δ_0 ≪ L_p, the appropriately scaled result [D. E. Hinkel-Lipsker et al., Phys. Fluids B 4, 559 (1992)] for a linear density profile is recovered as the parabolic cylinder functions asymptotically become Airy functions. When Δ_0 → 0, the special case of conversion at the peak of the profile [D. E. Hinkel-Lipsker et al., Phys. Fluids B 4, 1772 (1992)] is obtained.

  19. Decentral gene expression analysis: analytical validation of the Endopredict genomic multianalyte breast cancer prognosis test

    International Nuclear Information System (INIS)

    Kronenwett, Ralf; Brase, Jan C; Weber, Karsten E; Fisch, Karin; Müller, Berit M; Schmidt, Marcus; Filipits, Martin; Dubsky, Peter; Petry, Christoph; Dietel, Manfred; Denkert, Carsten; Bohmann, Kerstin; Prinzler, Judith; Sinn, Bruno V; Haufe, Franziska; Roth, Claudia; Averdick, Manuela; Ropers, Tanja; Windbergs, Claudia

    2012-01-01

    EndoPredict (EP) is a clinically validated multianalyte gene expression test to predict distant metastasis in ER-positive, HER2-negative breast cancer treated with endocrine therapy alone. The test is based on the combined analysis of 12 genes in formalin-fixed, paraffin-embedded (FFPE) tissue by reverse transcription-quantitative real-time PCR (RT-qPCR). Recently, it was shown that EP is feasible for reliable decentralized assessment of gene expression. The aim of this study was the analytical validation of the performance characteristics of the assay and its verification in a molecular-pathological routine laboratory. Gene expression values to calculate the EP score were assayed by one-step RT-qPCR using RNA from FFPE tumor tissue. Limit of blank, limit of detection, linear range, and PCR efficiency were assessed for each of the 12 PCR assays using serial sample dilutions. Different breast cancer samples were used to evaluate RNA input range, precision and inter-laboratory variability. PCR assays were linear up to Cq values between 35.1 and 37.2. Amplification efficiencies ranged from 75% to 101%. The RNA input range without considerable change of the EP score was between 0.16 and 18.5 ng/μl. Analysis of precision (variation of day, day time, instrument, operator, reagent lots) resulted in a total noise (standard deviation) of 0.16 EP score units on a scale from 0 to 15. The major part of the total noise (SD 0.14) was caused by the replicate-to-replicate noise of the PCR assays (repeatability) and was not associated with different operating conditions (reproducibility). Performance characteristics established in the manufacturer’s laboratory were verified in a routine molecular pathology laboratory. Comparison of 10 tumor samples analyzed in two different laboratories showed a Pearson coefficient of 0.995 and a mean deviation of 0.15 score units. The EP test showed reproducible performance characteristics with good precision and negligible laboratory

  20. Analytical results of Tank 38H core samples -- Fall 1999

    International Nuclear Information System (INIS)

    Swingle, R.F.

    2000-01-01

    Two samples were pulled from Tank 38H in the Fall of 1999: a variable depth sample (VDS) of the supernate was pulled in October and a core sample from the salt layer was pulled in December. Analysis of the rinse from the outside of the core sample indicated no sign of volatile or semivolatile organics. Both supernate and solids from the VDS and the dried core sample solids were analyzed for isotopes which could pose a criticality concern and also for elements which could serve as neutron poisons, as well as other elements. Results of the elemental analyses of these samples show significant quantities of the relevant elements present to mitigate the potential for nuclear criticality. However, it should be noted that the results given for the VDS solids elemental analyses may be higher than the actual concentrations in the solids, since the filter paper was dissolved along with the sample solids

  1. TMI-2 core debris analytical methods and results

    International Nuclear Information System (INIS)

    Akers, D.W.; Cook, B.A.

    1984-01-01

    A series of six grab samples was taken from the debris bed of the TMI-2 core in early September 1983. Five of these samples were sent to the Idaho National Engineering Laboratory for analysis. Presented are the analysis strategy for the samples and some of the data obtained from the early stages of examination of the samples (i.e., particle-size analysis, gamma spectrometry results, and fissile/fertile material analysis)

  2. Analytical results from Tank 38H criticality Sample HTF-093

    International Nuclear Information System (INIS)

    Wilmarth, W.R.

    2000-01-01

    Resumption of processing in the 242-16H Evaporator could cause salt dissolution in the Waste Concentration Receipt Tank (Tank 38H). Therefore, High Level Waste personnel sampled the tank at the salt surface. Results of elemental analysis of the dried sludge solids from this sample (HTF-093) show significant quantities of neutron poisons (i.e., sodium, iron, and manganese) present to mitigate the potential for nuclear criticality. Comparison of this sample with previous chemical and radiometric analyses of H-Area Evaporator samples shows high poison-to-actinide ratios

  3. Adequacy and validation of an analytical method for the quantification of lead in chamomile tisanes produced in Costa Rica

    International Nuclear Information System (INIS)

    Blanco Barrantes, Jeimy

    2014-01-01

    An analytical methodology was developed and validated to quantify lead in chamomile tisanes. Lead was quantified by flame atomic absorption spectroscopy in three brands of chamomile tisanes sold in Costa Rica to determine their safety and quality against international standards. A sample preparation method was established through comparison of different extraction approaches; acid digestion was the procedure selected, reaching an average recovery of 97.1% with a standard deviation of 2.3%. The chosen analytical procedure was then optimized and fully validated. The validation results showed that the interval yielding the best calibration curve, in terms of the correlation coefficient and an intercept statistically equal to zero, was 0.2-3.2 μg/mL (r² = 0.9996), corresponding to a range of 20% to 320% of the maximum allowed limit. In addition, the procedure was adequate in terms of accuracy (average recovery 101.1%), precision under repeatability and intermediate-precision conditions (RSD max. 9.3%), and limit of quantification (0.2551 μg/mL). The World Health Organization (WHO) safety criterion was applied to the lead concentrations in the analyzed products: none of the 9 analyzed samples of products for preparing chamomile tisanes showed lead concentrations above the limit of 10 μg/g suggested for medicinal herbs by the WHO

  4. Semi-physiologic model validation and bioequivalence trials simulation to select the best analyte for acetylsalicylic acid.

    Science.gov (United States)

    Cuesta-Gragera, Ana; Navarro-Fontestad, Carmen; Mangas-Sanjuan, Victor; González-Álvarez, Isabel; García-Arieta, Alfredo; Trocóniz, Iñaki F; Casabó, Vicente G; Bermejo, Marival

    2015-07-10

    The objective of this paper is to apply a previously developed semi-physiologic pharmacokinetic model implemented in NONMEM to simulate bioequivalence (BE) trials of acetylsalicylic acid (ASA), in order to validate the model performance against ASA human experimental data. ASA is a drug with first-pass hepatic and intestinal metabolism following Michaelis-Menten kinetics that leads to the formation of two main metabolites in two generations (first and second generation metabolites). The first aim was to adapt the semi-physiologic model for ASA in NONMEM using ASA pharmacokinetic parameters from the literature, reflecting its sequential metabolism. The second aim was to validate this model by comparing the results obtained in NONMEM simulations with published experimental data at a dose of 1000 mg. The validated model was used to simulate bioequivalence trials at 3 dose schemes (100, 1000 and 3000 mg) and with 6 test formulations with decreasing in vivo dissolution rate constants versus the reference formulation (kD = 8 to 0.25 h⁻¹). Finally, the third aim was to determine which analyte (parent drug, first generation or second generation metabolite) was more sensitive to changes in formulation performance. The validation results showed that the concentration-time curves obtained with the simulations closely reproduced the published experimental data, confirming model performance. The parent drug (ASA) was the analyte most sensitive to the decrease in pharmaceutical quality, showing the largest decrease in the Cmax and AUC ratios between test and reference formulations. Copyright © 2015 Elsevier B.V. All rights reserved.
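
    The sensitivity comparison rests on Cmax and AUC ratios between test and reference formulations. A much-simplified sketch of that idea, using a one-compartment Bateman model in place of the paper's semi-physiologic NONMEM model (all parameter values here are assumptions, not the paper's):

```python
import math

def conc_profile(ka, ke=0.3, scale=10.0, t_end=24.0, dt=0.05):
    """One-compartment oral model (Bateman equation); all parameters are
    illustrative assumptions, not values from the paper."""
    times = [i * dt for i in range(int(t_end / dt) + 1)]
    conc = [scale * ka / (ka - ke) * (math.exp(-ke * t) - math.exp(-ka * t))
            for t in times]
    return times, conc

def cmax_auc(times, conc):
    cmax = max(conc)
    # AUC by the trapezoidal rule over the sampled interval.
    auc = sum((conc[i] + conc[i + 1]) / 2 * (times[i + 1] - times[i])
              for i in range(len(times) - 1))
    return cmax, auc

t, ref = conc_profile(ka=8.0)     # reference: fast in vivo dissolution
_, tst = conc_profile(ka=0.25)    # test: slow dissolution (kD 8 -> 0.25 h^-1)

cmax_r, auc_r = cmax_auc(t, ref)
cmax_t, auc_t = cmax_auc(t, tst)
print(f"Cmax ratio = {cmax_t / cmax_r:.2f}, AUC ratio = {auc_t / auc_r:.2f}")
```

    Even in this toy model, slowing absorption depresses the Cmax ratio far more than the AUC ratio, which is why Cmax is the usual early indicator of a formulation change.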

  5. Analytical validation of operator actions in case of primary to secondary leakage for VVER-1000/V320

    Energy Technology Data Exchange (ETDEWEB)

    Andreeva, M., E-mail: m_andreeva@inrne.bas.bg; Groudev, P., E-mail: pavlinpg@inrne.bas.bg; Pavlova, M., E-mail: pavlova@inrne.bas.bg

    2015-12-15

    Highlights: • We validate operator actions in case of primary to secondary leakage. • We perform four scenarios related to the SGTR accident for VVER-1000/V320. • The reference power plant for the analyses is Unit 6 at Kozloduy NPP. • The RELAP5/MOD3.2 computer code is used in performing the analyses. • The analyses confirm the effectiveness of operator actions during PRISE. - Abstract: This paper presents the results of analytical validation of operator actions in case of a “Steam Generator Tube Rupture” (SGTR) for VVER-1000/V320 units at Kozloduy Nuclear Power Plant (KNPP), done during the development of Symptom Based Emergency Operating Procedures (SB EOPs) for this plant. The purpose of the analyses is to demonstrate the ability to terminate primary to secondary leakage and to indicate an effective strategy for preventing secondary leakage to the environment, thereby preventing radiological release. Following depressurization and cooldown of the reactor coolant system (RCS) with isolation of the affected steam generator (SG), these analyses validate options for post-SGTR cooldown by: • backup filling of the ruptured SG; • using the letdown system in the affected SG; and • opening the Fast Acting Isolation Valve (FAIV) and using the Steam Dump Facility to the Condenser (BRU-K). The results of the thermal-hydraulic analyses have been used to assist KNPP specialists in the analytical validation of EOPs. The RELAP5/MOD3.2 computer code has been used for the analyses with a VVER-1000 Nuclear Power Plant (NPP) model. A model of VVER-1000 based on Unit 6 of Kozloduy NPP has been developed for the thermal-hydraulics code RELAP5/MOD3.2 at the Institute for Nuclear Research and Nuclear Energy – Bulgarian Academy of Sciences (INRNE-BAS). This paper is possible through the participation of leading specialists from KNPP.

  6. SU-E-J-145: Validation of An Analytical Model for in Vivo Range Verification Using GATE Monte Carlo Simulation in Proton Therapy

    International Nuclear Information System (INIS)

    Lee, C; Lin, H; Chao, T; Hsiao, I; Chuang, K

    2015-01-01

    Purpose: A predicted-PET approach based on analytical filtering for proton range verification has been successfully developed and validated using FLUKA Monte Carlo (MC) codes and phantom measurements. The purpose of this study is to validate the effectiveness of the analytical filtering model for proton range verification against GATE/GEANT4 Monte Carlo simulation codes. Methods: We performed two experiments to validate the β+-isotope predictions of the analytical model against GATE/GEANT4 simulations. The first experiment evaluates the accuracy of the predicted β+-yields as a function of irradiated proton energy. In the second experiment, we simulate homogeneous phantoms of different materials irradiated by a mono-energetic pencil-like proton beam. The filtered β+-yield distributions from the analytical model are compared with the MC-simulated β+-yields in the proximal and distal fall-off ranges. Results: The results compare the filtered and MC-simulated β+-yield distributions under different conditions. First, we found that the analytical filtering can be applied over the whole range of therapeutic energies. Second, the range difference between the filtered and MC-simulated β+-yields in the distal fall-off region is within 1.5 mm for all materials used. These findings validate the usefulness of the analytical filtering model for range verification of proton therapy in GATE Monte Carlo simulations. In addition, there is a larger discrepancy between the filtered prediction and the MC-simulated β+-yields with the GATE code, especially in the proximal region; this discrepancy might result from the absence of well-established theoretical models for predicting the nuclear interactions. Conclusion: Despite the large discrepancies observed between the MC-simulated and predicted β+-yield distributions, the study proves the effectiveness of the analytical filtering model for proton range verification using

  7. Short-Term Predictive Validity of Cluster Analytic and Dimensional Classification of Child Behavioral Adjustment in School

    Science.gov (United States)

    Kim, Sangwon; Kamphaus, Randy W.; Baker, Jean A.

    2006-01-01

    A constructive debate over the classification of child psychopathology can be stimulated by investigating the validity of different classification approaches. We examined and compared the short-term predictive validity of cluster analytic and dimensional classifications of child behavioral adjustment in school using the Behavior Assessment System…

  8. Validation of multivariate classification methods using analytical fingerprints – concept and case study on organic feed for laying hens

    NARCIS (Netherlands)

    Alewijn, Martin; Voet, van der Hilko; Ruth, van Saskia

    2016-01-01

    Multivariate classification methods based on analytical fingerprints have found many applications in the food and feed area, but practical applications are still scarce due to a lack of a generally accepted validation procedure. This paper proposes a new approach for validation of this type of

  9. Development and Validation Dissolution Analytical Method of Nimesulide beta-Cyclodextrin 400 mg Tablet

    Directory of Open Access Journals (Sweden)

    Carlos Eduardo Carvalho Pereira

    2016-10-01

    Full Text Available Nimesulide (N-(4-nitro-2-phenoxyphenyl)methanesulfonamide) belongs to the class of non-steroidal anti-inflammatory drugs (NSAIDs) and to category II of the biopharmaceutical classification system. Complexation of nimesulide with β-cyclodextrin is a pharmacological strategy to increase the solubility of the drug. The objective of this study was to develop and validate an analytical dissolution methodology for the nimesulide beta-cyclodextrin 400 mg tablet that meets the ANVISA guidelines for drug registration purposes. Once developed, the dissolution methodology was validated according to the parameters of RE no. 899/2003. During method development it was found that the appropriate duration of the dissolution test was 60 minutes, and that the most suitable dissolution medium and volume was 900 mL of 1% (w/v) aqueous sodium lauryl sulfate solution. A rotation speed of 100 rpm with the paddle apparatus was the most appropriate for evaluating the dissolution of the drug. A spectrophotometric methodology was used to quantify the percentage of dissolved drug; the wavelength used for quantification was 390 nm. In the validation of the methodology, the system suitability, specificity/selectivity, linearity, precision, accuracy and robustness parameters were satisfactory, demonstrating that the developed dissolution methodology was properly executed. DOI: http://dx.doi.org/10.17807/orbital.v8i5.827

  10. Median of patient results as a tool for assessment of analytical stability.

    Science.gov (United States)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable source of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of the analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
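
    The monthly-median monitoring scheme can be sketched as follows; the patient results, target median, and bias specification below are all hypothetical:

```python
import statistics

# Hypothetical daily patient results for one analyte, grouped by month.
monthly_results = {
    "Jan": [4.1, 4.3, 4.0, 4.2, 4.4, 4.1],
    "Feb": [4.2, 4.0, 4.3, 4.1, 4.2, 4.3],
    "Mar": [4.6, 4.8, 4.7, 4.9, 4.6, 4.7],  # a drifted month
}

target_median = 4.2       # long-term median from a stable baseline period
allowable_bias_pct = 5.0  # allowable-bias specification (assumed value)

flags = {}
for month, values in monthly_results.items():
    median = statistics.median(values)
    bias_pct = 100 * (median - target_median) / target_median
    flags[month] = "REVIEW" if abs(bias_pct) > allowable_bias_pct else "ok"
    print(f"{month}: median = {median:.2f}, bias = {bias_pct:+.1f}% -> {flags[month]}")
```

    A month whose patient median drifts beyond the allowable bias is flagged for review; stable months pass without any extra control material.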

  11. Analytical Validation of a New Enzymatic and Automatable Method for d-Xylose Measurement in Human Urine Samples

    Directory of Open Access Journals (Sweden)

    Israel Sánchez-Moreno

    2017-01-01

    Full Text Available Hypolactasia, or intestinal lactase deficiency, affects more than half of the world population. Currently, xylose quantification in urine after gaxilose oral administration for the noninvasive diagnosis of hypolactasia is performed with the hand-operated nonautomatable phloroglucinol reaction. This work demonstrates that a new enzymatic xylose quantification method, based on the activity of xylose dehydrogenase from Caulobacter crescentus, represents an excellent alternative to the manual phloroglucinol reaction. The new method is automatable and facilitates the use of the gaxilose test for hypolactasia diagnosis in the clinical practice. The analytical validation of the new technique was performed in three different autoanalyzers, using buffer or urine samples spiked with different xylose concentrations. For the comparison between the phloroglucinol and the enzymatic assays, 224 urine samples of patients to whom the gaxilose test had been prescribed were assayed by both methods. A mean bias of −16.08 mg of xylose was observed when comparing the results obtained by both techniques. After adjusting the cut-off of the enzymatic method to 19.18 mg of xylose, the Kappa coefficient was found to be 0.9531, indicating an excellent level of agreement between both analytical procedures. This new assay represents the first automatable enzymatic technique validated for xylose quantification in urine.
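
    The level of agreement between the two assays above was expressed as a Kappa coefficient. A minimal sketch of Cohen's kappa for binary classifications, with invented outcome data:

```python
# Cohen's kappa for agreement between two binary classifications
# (e.g., positive/negative calls by the phloroglucinol and enzymatic
# methods); the outcome data below are invented.
def cohens_kappa(a, b):
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement from each method's marginal positive rate.
    p_a, p_b = sum(a) / n, sum(b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

method_1 = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0]  # 1 = above cut-off
method_2 = [1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 1, 0]

kappa = cohens_kappa(method_1, method_2)
print(f"kappa = {kappa:.3f}")
```

    Kappa corrects raw percent agreement for the agreement expected by chance; values above about 0.8 are conventionally read as excellent agreement, which is the interpretation given to the 0.9531 reported above.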

  12. Simultaneous determination of renal function biomarkers in urine using a validated paper-based microfluidic analytical device.

    Science.gov (United States)

    Rossini, Eduardo Luiz; Milani, Maria Izabel; Carrilho, Emanuel; Pezza, Leonardo; Pezza, Helena Redigolo

    2018-01-02

    In this paper, we describe a validated paper-based microfluidic analytical device for the simultaneous quantification of two important biomarkers of renal function in urine. This paper platform provides an inexpensive, simple, and easy to use colorimetric method for the quantification of creatinine (CRN) and uric acid (UA) in urine samples. The microfluidic paper-based analytical device (μPAD) consists of a main channel with three identical arms, each containing a circular testing zone and a circular uptake zone. Creatinine detection is based on the Jaffé reaction, in which CRN reacts with picrate to form an orange-red product. Uric acid quantification is based on the reduction of Fe³⁺ to Fe²⁺ by UA, which is detected in a colorimetric reaction using 1,10-phenanthroline. Under optimum conditions, obtained through chemometrics, the concentrations of the analytes showed good linear correlations with the effective intensities, and the method presented satisfactory repeatability. The limits of detection and the linear ranges, respectively, were 15.7 mg L⁻¹ and 50-600 mg L⁻¹ for CRN and 16.5 mg L⁻¹ and 50-500 mg L⁻¹ for UA. There were no statistically significant differences between the results obtained using the μPAD and a chromatographic comparative method (Student's t-test at 95% confidence level). Copyright © 2017 Elsevier B.V. All rights reserved.
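
    The method comparison above used a Student's t-test at the 95% confidence level. A paired-t sketch with invented paired results (2.571 is the standard two-tailed critical value for df = 5):

```python
import math
import statistics

# Invented paired creatinine results (mg/L) for the same urine samples,
# measured by the μPAD and by a chromatographic comparison method.
upad = [152, 298, 401, 355, 210, 488]
chrom = [149, 305, 395, 360, 205, 492]

diffs = [a - b for a, b in zip(upad, chrom)]
n = len(diffs)
# Paired t statistic: mean difference over its standard error.
t_stat = statistics.fmean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# Two-tailed critical value for alpha = 0.05 with df = n - 1 = 5.
t_crit = 2.571
significant = abs(t_stat) > t_crit
print(f"t = {t_stat:.3f}; significant difference: {significant}")
```

    A |t| below the critical value, as here, is the "no statistically significant difference" conclusion reported in the abstract.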

  13. Tank 48H Waste Composition and Results of Investigation of Analytical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Walker, D.D. [Westinghouse Savannah River Company, AIKEN, SC (United States)]

    1997-04-02

    This report serves two purposes. First, it documents the analytical results of Tank 48H samples taken between April and August 1996. Second, it describes investigations of the precision of the sampling and analytical methods used on the Tank 48H samples.

  14. Complete analytic results for radiative-recoil corrections to ground-state muonium hyperfine splitting

    International Nuclear Information System (INIS)

    Karshenboim, S.G.; Shelyuto, V.A.; Eides, M.E.

    1988-01-01

    Analytic expressions are obtained for radiative corrections to the hyperfine splitting related to the muon line. The corresponding contribution amounts to (Z²α)(Zα)(m/M)(9/2 ζ(3) − 3π² ln 2 + 39/8) in units of the Fermi hyperfine splitting energy. A complete analytic result for all radiative-recoil corrections is also presented.

  15. Analytical validation and reference intervals for freezing point depression osmometer measurements of urine osmolality in dogs.

    Science.gov (United States)

    Guerrero, Samantha; Pastor, Josep; Tvarijonaviciute, Asta; Cerón, José Joaquín; Balestra, Graziano; Caldin, Marco

    2017-11-01

    Urine osmolality (UOsm) is considered the most accurate measure of urine concentration and is used to assess body fluid homeostasis and renal function. We performed analytical validation of freezing point depression measurement of canine UOsm to establish reference intervals (RIs) and to determine the effect of age, sex, and reproductive status on UOsm in dogs. Clinically healthy dogs (n = 1,991) were retrospectively selected and stratified into groups by age (young [0-12 mo], adults [13-84 mo], and seniors [>84 mo]), sex (females and males), and reproductive status (intact and neutered). RIs were calculated for each age group; the RI in seniors was 366-2,178 mOsm/kg. Intra- and inter-assay coefficients of variation met the acceptance criteria. Senior dogs had a significantly lower UOsm than young and adult dogs.

  16. A Validated Analytical Model for Availability Prediction of IPTV Services in VANETs

    Directory of Open Access Journals (Sweden)

    Bernd E. Wolfinger

    2014-12-01

    In vehicular ad hoc networks (VANETs), besides the original applications typically related to traffic safety, we nowadays can observe an increasing trend toward infotainment applications, such as IPTV services. Quality of experience (QoE), as observed by the end users of IPTV, is highly important to guarantee adequate user acceptance for the service. In IPTV, QoE is mainly determined by the availability of TV channels for the users. This paper presents an efficient and rather generally applicable analytical model that allows one to predict the blocking probability of TV channels, both for channel-switching-induced, as well as for handover-induced blocking events. We present the successful validation of the model by means of simulation, and we introduce a new measure for QoE. Numerous case studies illustrate how the analytical model and our new QoE measure can be applied successfully for the dimensioning of IPTV systems, taking into account the QoE requirements of the IPTV service users in strongly diverse traffic scenarios.
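
The abstract does not reproduce the model itself; as a hedged illustration of how channel-blocking probabilities are classically computed for loss systems (the paper's VANET-specific model is more elaborate than this), the Erlang-B recurrence can be sketched as:

```python
def erlang_b(offered_load, channels):
    """Blocking probability for `offered_load` Erlangs offered to
    `channels` identical servers with no queueing (Erlang-B), computed
    with the numerically stable recurrence
    B(0) = 1;  B(n) = a*B(n-1) / (n + a*B(n-1))."""
    b = 1.0
    for n in range(1, channels + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

# Example: probability that a channel request is blocked when
# 10 Erlangs of switching/handover traffic meet 15 channels.
p_block = erlang_b(10.0, 15)
```

The recurrence avoids the factorials of the closed-form expression, so it stays stable even for hundreds of channels.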

  17. The Mistra experiment for field containment code validation first results

    International Nuclear Information System (INIS)

    Caron-Charles, M.; Blumenfeld, L.

    2001-01-01

    The MISTRA facility is a large scale experiment, designed for the purpose of thermal-hydraulics multi-D codes validation. A short description of the facility, the set up of the instrumentation and the test program are presented. Then, the first experimental results, studying helium injection in the containment and their calculations are detailed. (author)

  18. Application of Statistical Methods to Activation Analytical Results near the Limit of Detection

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Wanscher, B.

    1978-01-01

    Reporting actual numbers instead of upper limits for analytical results at or below the detection limit may produce reliable data when these numbers are subjected to appropriate statistical processing. Particularly in radiometric methods, such as activation analysis, where individual standard deviations of analytical results may be estimated, improved discrimination may be based on the Analysis of Precision. Actual experimental results from a study of the concentrations of arsenic in human skin demonstrate the power of this principle.
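
As a hedged sketch of the idea behind this approach (the exact Analysis of Precision procedure is Heydorn's, not this simplified version): results quoted with individual standard deviations can be pooled by inverse-variance weighting, and the consistency of their scatter with the quoted uncertainties checked with a chi-square-like statistic:

```python
def weighted_mean(results, sds):
    """Inverse-variance weighted mean of results quoted with
    individual standard deviations."""
    weights = [1.0 / s ** 2 for s in sds]
    return sum(w * x for w, x in zip(weights, results)) / sum(weights)

def precision_statistic(results, sds):
    """Chi-square-like statistic T: if the quoted standard deviations
    fully account for the scatter, T approximately follows a chi-square
    distribution with len(results) - 1 degrees of freedom; a much
    larger T signals unexplained variability in the data."""
    m = weighted_mean(results, sds)
    return sum(((x - m) / s) ** 2 for x, s in zip(results, sds))
```

Comparing T against the chi-square quantile is what lets near-detection-limit numbers be combined rather than discarded as "below limit".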

  19. Surveillance of emerging drugs of abuse in Hong Kong: validation of an analytical tool.

    Science.gov (United States)

    Tang, Magdalene H Y; Ching, C K; Tse, M L; Ng, Carol; Lee, Caroline; Chong, Y K; Wong, Watson; Mak, Tony W L

    2015-04-01

    To validate a locally developed chromatography-based method to monitor emerging drugs of abuse whilst performing regular drug testing in abusers. Cross-sectional study. Eleven regional hospitals, seven social service units, and a tertiary level clinical toxicology laboratory in Hong Kong. A total of 972 drug abusers and high-risk individuals were recruited from acute, rehabilitation, and high-risk settings between 1 November 2011 and 31 July 2013. A subset of the participants was of South Asian ethnicity. In total, 2000 urine or hair specimens were collected. Proof of concept that surveillance of emerging drugs of abuse can be performed whilst conducting routine drug of abuse testing in patients. The method was successfully applied to 2000 samples with three emerging drugs of abuse detected in five samples: PMMA (paramethoxymethamphetamine), TFMPP [1-(3-trifluoromethylphenyl)piperazine], and methcathinone. The method also detected conventional drugs of abuse, with codeine, methadone, heroin, methamphetamine, and ketamine being the most frequently detected drugs. Other findings included the observation that South Asians had significantly higher rates of using opiates such as heroin, methadone, and codeine; and that ketamine and cocaine had significantly higher detection rates in acute subjects compared with the rehabilitation population. This locally developed analytical method is a valid tool for simultaneous surveillance of emerging drugs of abuse and routine drug monitoring of patients at minimal additional cost and effort. Continued, proactive surveillance and early identification of emerging drugs will facilitate prompt clinical, social, and legislative management.

  20. Development and validation of HPLC analytical method for quantitative determination of metronidazole in human plasma

    International Nuclear Information System (INIS)

    Safdar, K.A.; Shyum, S.B.; Usman, S.

    2016-01-01

    The objective of the present study was to develop a simple, rapid and sensitive reversed-phase high-performance liquid chromatographic (RP-HPLC) analytical method with UV detection for the quantitative determination of metronidazole in human plasma. The chromatographic separation was performed on a C18 RP column (250 mm × 4.6 mm, 5 μm) as stationary phase, with 0.01 M potassium dihydrogen phosphate buffer (pH 3.0) and acetonitrile (83:17, v/v) as mobile phase at a flow rate of 1.0 mL/min. UV detection was carried out at 320 nm. The method was validated as per the US FDA guideline for bioanalytical method validation and was found to be selective, without interference from mobile phase components, impurities or the biological matrix. The method was linear over the concentration range of 0.2812 μg/mL to 18.0 μg/mL (r² = 0.9987) with an adequate level of accuracy and precision. The samples were found to be stable under various recommended laboratory and storage conditions. Therefore, the method can be used with an adequate level of confidence and assurance for bioavailability, bioequivalence and other pharmacokinetic studies of metronidazole in humans. (author)
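
The linearity figure (r²) and the quantification of unknowns in abstracts like this one follow standard calibration-curve arithmetic; a minimal sketch (generic least squares, not the authors' actual software) might look like:

```python
def fit_calibration(conc, resp):
    """Ordinary least-squares line resp = slope*conc + intercept,
    returning (slope, intercept, r_squared)."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(resp) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, resp))
    syy = sum((y - my) ** 2 for y in resp)
    slope = sxy / sxx
    intercept = my - slope * mx
    r2 = sxy * sxy / (sxx * syy)
    return slope, intercept, r2

def back_calc(response, slope, intercept):
    """Concentration of an unknown back-calculated from its
    detector response using the fitted calibration line."""
    return (response - intercept) / slope
```

In practice each standard level is injected in replicate and accuracy/precision are judged from the back-calculated concentrations of QC samples.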

  1. Final Report on the Analytical Results for Tank Farm Samples in Support of Salt Dissolution Evaluation

    International Nuclear Information System (INIS)

    Hobbs, D.T.

    1996-01-01

    Recent processing of dilute solutions through the 2H-Evaporator system caused dissolution of salt in Tank 38H, the concentrate receipt tank. This report documents analytical results for samples taken from this evaporator system

  2. New analytical results in the electromagnetic response of composite superconducting wire in parallel fields

    NARCIS (Netherlands)

    Niessen, E.M.J.; Niessen, E.M.J.; Zandbergen, P.J.

    1993-01-01

    Analytical results are presented concerning the electromagnetic response of a composite superconducting wire in fields parallel to the wire axis, using the Maxwell equations supplemented with constitutive equations. The problem is nonlinear due to the nonlinearity in the constitutive equation.

  3. Analytic result for the one-loop scalar pentagon integral with massless propagators

    International Nuclear Information System (INIS)

    Kniehl, Bernd A.; Tarasov, Oleg V.

    2010-01-01

    The method of dimensional recurrences proposed by one of the authors (O.V. Tarasov, 1996) is applied to the evaluation of the pentagon-type scalar integral with on-shell external legs and massless internal lines. For the first time, an analytic result valid for arbitrary space-time dimension d and five arbitrary kinematic variables is presented. An explicit expression in terms of the Appell hypergeometric function F_3 and the Gauss hypergeometric function 2F_1, both admitting one-fold integral representations, is given. In the case when one kinematic variable vanishes, the integral reduces to a combination of Gauss hypergeometric functions 2F_1. For the case when one scalar invariant is large compared to the others, the asymptotic values of the integral in terms of Gauss hypergeometric functions 2F_1 are presented in d=2-2ε, 4-2ε, and 6-2ε dimensions. For multi-Regge kinematics, the asymptotic value of the integral in d=4-2ε dimensions is given in terms of the Appell function F_3 and the Gauss hypergeometric function 2F_1. (orig.)

  4. Analytic result for the one-loop scalar pentagon integral with massless propagators

    Energy Technology Data Exchange (ETDEWEB)

    Kniehl, Bernd A.; Tarasov, Oleg V. [Hamburg Univ. (Germany). II. Inst. fuer Theoretische Physik

    2010-01-15

    The method of dimensional recurrences proposed by one of the authors (O.V. Tarasov, 1996) is applied to the evaluation of the pentagon-type scalar integral with on-shell external legs and massless internal lines. For the first time, an analytic result valid for arbitrary space-time dimension d and five arbitrary kinematic variables is presented. An explicit expression in terms of the Appell hypergeometric function F_3 and the Gauss hypergeometric function 2F_1, both admitting one-fold integral representations, is given. In the case when one kinematic variable vanishes, the integral reduces to a combination of Gauss hypergeometric functions 2F_1. For the case when one scalar invariant is large compared to the others, the asymptotic values of the integral in terms of Gauss hypergeometric functions 2F_1 are presented in d=2-2ε, 4-2ε, and 6-2ε dimensions. For multi-Regge kinematics, the asymptotic value of the integral in d=4-2ε dimensions is given in terms of the Appell function F_3 and the Gauss hypergeometric function 2F_1. (orig.)

  5. Analytic result for the one-loop scalar pentagon integral with massless propagators

    International Nuclear Information System (INIS)

    Kniehl, Bernd A.; Tarasov, Oleg V.

    2010-01-01

    The method of dimensional recurrences proposed by Tarasov (1996, 2000) is applied to the evaluation of the pentagon-type scalar integral with on-shell external legs and massless internal lines. For the first time, an analytic result valid for arbitrary space-time dimension d and five arbitrary kinematic variables is presented. An explicit expression in terms of the Appell hypergeometric function F_3 and the Gauss hypergeometric function 2F_1, both admitting one-fold integral representations, is given. In the case when one kinematic variable vanishes, the integral reduces to a combination of Gauss hypergeometric functions 2F_1. For the case when one scalar invariant is large compared to the others, the asymptotic values of the integral in terms of Gauss hypergeometric functions 2F_1 are presented in d=2-2ε, 4-2ε, and 6-2ε dimensions. For multi-Regge kinematics, the asymptotic value of the integral in d=4-2ε dimensions is given in terms of the Appell function F_3 and the Gauss hypergeometric function 2F_1.

  6. Analytical validation of a reference laboratory ELISA for the detection of feline leukemia virus p27 antigen.

    Science.gov (United States)

    Buch, Jesse S; Clark, Genevieve H; Cahill, Roberta; Thatcher, Brendon; Smith, Peter; Chandrashekar, Ramaswamy; Leutenegger, Christian M; O'Connor, Thomas P; Beall, Melissa J

    2017-09-01

    Feline leukemia virus (FeLV) is an oncogenic retrovirus of cats. Immunoassays for the p27 core protein of FeLV aid in the detection of FeLV infections. Commercial microtiter-plate ELISAs have rapid protocols and visual result interpretation, limiting their usefulness in high-throughput situations. The purpose of our study was to validate the PetChek FeLV 15 ELISA, which is designed for the reference laboratory, and incorporates sequential, orthogonal screening and confirmatory protocols. A cutoff for the screening assay was established with 100% accuracy using 309 feline samples (244 negative, 65 positive) defined by the combined results of FeLV PCR and an independent reference p27 antigen ELISA. Precision of the screening assay was measured using a panel of 3 samples (negative, low-positive, and high-positive). The intra-assay coefficient of variation (CV) was 3.9-7.9%; the inter-assay CV was 6.0-8.6%. For the confirmatory assay, the intra-assay CV was 3.0-4.7%, and the inter-assay CV was 7.4-9.7%. The analytical sensitivity for p27 antigen was 3.7 ng/mL for inactivated whole FeLV and 1.2 ng/mL for purified recombinant FeLV p27. Analytical specificity was demonstrated based on the absence of cross-reactivity to related retroviruses. No interference was observed for samples containing added bilirubin, hemoglobin, or lipids. Based on these results, the new high-throughput design of the PetChek FeLV 15 ELISA makes it suitable for use in reference laboratory settings and maintains overall analytical performance.
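
The intra- and inter-assay coefficients of variation quoted above follow the standard definition (100 × sample SD / mean); a one-line sketch of the computation from replicate measurements:

```python
import statistics

def cv_percent(replicates):
    """Coefficient of variation in percent: 100 * sample SD / mean."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Example (hypothetical optical-density replicates of a positive control)
intra_assay_cv = cv_percent([0.52, 0.50, 0.55, 0.53])
```

Intra-assay CV is computed within one run; inter-assay CV pools the same control across runs or days.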

  7. Validation of an analytical method for the determination of the sodium content in foods

    International Nuclear Information System (INIS)

    Valverde Montero, Ericka; Silva Trejos, Paulina

    2012-01-01

    The analytical methodology for the quantitative determination of sodium in foods by flame atomic absorption spectrometry was validated. Samples of 0.5 g were digested in a microwave oven with 5.0 mL of 65% (by mass) nitric acid (HNO3). The linearity range was from 0.043 mg/L to 0.70 mg/L, with a correlation coefficient of 0.998. The detection and quantification limits were 0.025 mg/L and 0.043 mg/L, respectively, with a calibration sensitivity of 0.805 L mg⁻¹ and an analytical sensitivity of 44 L mg⁻¹. Precision was evaluated in terms of repeatability, giving a value of 2.9% RSDr. Trueness was determined using three NIST certified reference standards: SRM 1846 Infant Formula, with a reported sodium value of (2310 ± 130) mg/kg; SRM 8414 Bovine Muscle Powder, with a reported sodium value of (0.210 ± 0.008)%; and SRM 8415 Whole Egg Powder, with a reported sodium value of (0.377 ± 0.034)% by mass. The bias averaged between -0.010 and 0.009 mg/L. Among the foods selected for the study, whole milk powder, white wheat bread, fresh cheese and mozzarella cheese showed the highest sodium content, with concentrations ranging from 106 to 452 mg Na/100 g. (author) [es

  8. Validation of an analytical method applicable to study of 1 mg/mL oral Risperidone solution stability

    International Nuclear Information System (INIS)

    Abreu Alvarez, Maikel; Garcia Penna, Caridad Margarita; Martinez Miranda, Lissette

    2010-01-01

    An analytical method by high-performance liquid chromatography (HPLC) was validated and applied to a stability study of a 1 mg/mL risperidone oral solution. The method was linear, precise, specific and accurate. A stability study of the 1 mg/mL risperidone oral solution was carried out to determine its expiry date. The shelf-life study was conducted for 24 months at room temperature, whereas the accelerated stability study was conducted with the product under the influence of humidity and temperature, with analysis over 3 months. The formula fulfilled the quality specifications described in the pharmacopeia. The shelf-life results after 24 months showed that the product maintains the parameters determining its quality throughout this period, and in the accelerated studies there was no significant degradation (p > 0.05). Under the mentioned conditions, the expiry date was 2 years.

  9. Analytical method development and validation for quantification of uranium by Fourier Transform Infrared Spectroscopy (FTIR) for routine quality control analysis

    International Nuclear Information System (INIS)

    Pereira, Elaine; Silva, Ieda de S.; Gomide, Ricardo G.; Pires, Maria Aparecida F.

    2015-01-01

    This work presents a new, simple and low-cost methodology for the direct determination of uranium in different matrices: an organic phase (UO2(NO3)2·2TBP, uranyl nitrate-TBP complex) and an aqueous phase (UO2(NO3)2, uranyl nitrate), based on Fourier transform infrared spectroscopy (FTIR) using the KBr pellet technique. Analytical validation is essential to establish whether a developed methodology is fully fit for its intended purpose and is considered one of the main instruments of quality control. The parameters used in the validation process were: selectivity, linearity, limits of detection (LD) and quantitation (LQ), precision (repeatability and intermediate precision), accuracy and robustness. The method for uranium in the organic phase (UO2(NO3)2·2TBP in hexane, embedded in KBr) was linear (r = 0.9989) over the range 1.0 g L⁻¹ to 14.3 g L⁻¹, with an LD of 92.1 mg L⁻¹ and an LQ of 113.1 mg L⁻¹; it was precise (RSD < 1.6% and p-value < 0.05) and accurate (recovery of 100.1%-102.9%). The method for uranium in the aqueous phase (UO2(NO3)2 embedded in KBr) was linear (r = 0.9964) over the range 5.4 g L⁻¹ to 51.2 g L⁻¹, with an LD of 835 mg L⁻¹ and an LQ of 958 mg L⁻¹; it was precise (RSD < 1.0% and p-value < 0.05) and accurate (recovery of 99.1%-102.0%). The FTIR method is robust with regard to most of the variables analyzed, as the differences between results obtained under nominal and modified conditions were lower than the critical value for all analytical parameters studied. Some process samples were analyzed by FTIR and compared with gravimetric and X-ray fluorescence (XRF) analyses, showing similar results across all three methods. Statistical tests (Student's t and Fisher's) showed that the techniques are equivalent. (author)
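
The abstract reports LD and LQ without stating the estimation convention; a common ICH-style approach (assumed here purely for illustration, not necessarily the authors' choice) derives both from the calibration slope and the standard deviation of the blank or regression intercept:

```python
def detection_limits(sigma, slope):
    """ICH-style estimates from a calibration line:
    LOD = 3.3 * sigma / S  and  LOQ = 10 * sigma / S,
    where sigma is the standard deviation of the blank response
    (or of the regression intercept) and S is the calibration slope."""
    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope
    return lod, loq
```

With this convention LOQ is always about three times LOD, which matches the LD/LQ ratios reported in many validation abstracts.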

  10. Impurities in biogas - validation of analytical methods for siloxanes; Foeroreningar i biogas - validering av analysmetodik foer siloxaner

    Energy Technology Data Exchange (ETDEWEB)

    Arrhenius, Karine; Magnusson, Bertil; Sahlin, Eskil [SP Technical Research Institute of Sweden, Boraas (Sweden)

    2011-11-15

    Biogas produced from digesters or landfills contains impurities which can be harmful to components that come into contact with the biogas during its utilization. Among these, the siloxanes are often mentioned. During combustion, siloxanes are converted to silicon dioxide, which accumulates on the heated surfaces of combustion equipment. Silicon dioxide is a solid compound that remains in the engine and causes damage. Consequently, it is necessary to develop methods for the accurate determination of these compounds in biogases. In the first part of this report, a method for the analysis of siloxanes in biogases was validated. Sampling was performed directly at the plant by drawing a small volume of biogas onto an adsorbent tube over a short period of time. These tubes were subsequently sent to the laboratory for analysis. The purpose of method validation is to demonstrate that the established method is fit for purpose. This means that the method, as used by the laboratory generating the data, will provide data that meet a set of criteria concerning precision and accuracy. Finally, the uncertainty of the method was calculated. In the second part of this report, the validated method was applied to real samples collected at wastewater treatment plants, co-digestion plants and plants digesting other wastes (agricultural waste). Results are presented at the end of this report. As expected, the biogases from wastewater treatment plants contained markedly higher concentrations of siloxanes than biogases from co-digestion plants and plants digesting agricultural wastes. The concentration of siloxanes in upgraded biogas was low, regardless of which feedstock was digested and which upgrading technique was used.

  11. Analytical Validation of Accelerator Mass Spectrometry for Pharmaceutical Development: the Measurement of Carbon-14 Isotope Ratio

    International Nuclear Information System (INIS)

    Keck, B.D.; Ognibene, T.; Vogel, J.S.

    2010-01-01

    Accelerator mass spectrometry (AMS) is an isotope-based measurement technology that utilizes carbon-14 labeled compounds in the pharmaceutical development process to measure compounds at very low concentrations, empowers microdosing as an investigational tool, and extends the utility of 14C-labeled compounds to dramatically lower levels. It is a form of isotope ratio mass spectrometry that can provide either measurements of total compound equivalents or, when coupled to separation technology such as chromatography, quantitation of specific compounds. The properties of AMS as a measurement technique are investigated here, and the parameters of method validation are shown. AMS, independent of any separation technique to which it may be coupled, is shown to be accurate, linear, precise, and robust. As the sensitivity and universality of AMS is constantly being explored and expanded, this work underpins many areas of pharmaceutical development, including drug metabolism as well as absorption, distribution and excretion of pharmaceutical compounds, as a fundamental step in drug development. The validation parameters for pharmaceutical analyses were examined for the accelerator mass spectrometry measurement of the 14C/C ratio, independent of chemical separation procedures. The isotope ratio measurement was specific (owing to the 14C label), stable across sample storage conditions for at least one year, and linear over 4 orders of magnitude, with an analytical range from one tenth Modern to at least 2000 Modern (instrument specific). Further, accuracy was excellent (within 1-3%), while precision, expressed as coefficient of variation, was between 1% and 6%, determined primarily by radiocarbon content and the time spent analyzing a sample. Sensitivity, expressed as LOD and LLOQ, was 1 and 10 attomoles of carbon-14 (which can be expressed as compound equivalents); for a typical small molecule labeled at 10% incorporation with 14C, this corresponds to 30 fg equivalents.

  12. Analytical Validation of Accelerator Mass Spectrometry for Pharmaceutical Development: the Measurement of Carbon-14 Isotope Ratio.

    Energy Technology Data Exchange (ETDEWEB)

    Keck, B D; Ognibene, T; Vogel, J S

    2010-02-05

    Accelerator mass spectrometry (AMS) is an isotope-based measurement technology that utilizes carbon-14 labeled compounds in the pharmaceutical development process to measure compounds at very low concentrations, empowers microdosing as an investigational tool, and extends the utility of 14C-labeled compounds to dramatically lower levels. It is a form of isotope ratio mass spectrometry that can provide either measurements of total compound equivalents or, when coupled to separation technology such as chromatography, quantitation of specific compounds. The properties of AMS as a measurement technique are investigated here, and the parameters of method validation are shown. AMS, independent of any separation technique to which it may be coupled, is shown to be accurate, linear, precise, and robust. As the sensitivity and universality of AMS is constantly being explored and expanded, this work underpins many areas of pharmaceutical development, including drug metabolism as well as absorption, distribution and excretion of pharmaceutical compounds, as a fundamental step in drug development. The validation parameters for pharmaceutical analyses were examined for the accelerator mass spectrometry measurement of the 14C/C ratio, independent of chemical separation procedures. The isotope ratio measurement was specific (owing to the 14C label), stable across sample storage conditions for at least one year, and linear over 4 orders of magnitude, with an analytical range from one tenth Modern to at least 2000 Modern (instrument specific). Further, accuracy was excellent (within 1-3%), while precision, expressed as coefficient of variation, was between 1% and 6%, determined primarily by radiocarbon content and the time spent analyzing a sample. Sensitivity, expressed as LOD and LLOQ, was 1 and 10 attomoles of carbon-14 (which can be expressed as compound equivalents); for a typical small molecule labeled at 10% incorporation with 14C, this corresponds to 30 fg

  13. Analytic Validation of Immunohistochemistry Assays: New Benchmark Data From a Survey of 1085 Laboratories.

    Science.gov (United States)

    Stuart, Lauren N; Volmar, Keith E; Nowak, Jan A; Fatheree, Lisa A; Souers, Rhona J; Fitzgibbons, Patrick L; Goldsmith, Jeffrey D; Astles, J Rex; Nakhleh, Raouf E

    2017-09-01

    A cooperative agreement between the College of American Pathologists (CAP) and the United States Centers for Disease Control and Prevention was undertaken to measure laboratories' awareness and implementation of an evidence-based laboratory practice guideline (LPG) on immunohistochemical (IHC) validation practices published in 2014. The objective was to establish new benchmark data on IHC laboratory practices. A 2015 survey on IHC assay validation practices was sent to laboratories subscribed to specific CAP proficiency testing programs and to additional nonsubscribing laboratories that perform IHC testing. Specific questions were designed to capture laboratory practices not addressed in a 2010 survey. The analysis was based on responses from 1085 laboratories that perform IHC staining. Ninety-six percent (809 of 844) always documented validation of IHC assays. Sixty percent (648 of 1078) had separate procedures for predictive and nonpredictive markers, 42.7% (220 of 515) had procedures for laboratory-developed tests, 50% (349 of 697) had procedures for testing cytologic specimens, and 46.2% (363 of 785) had procedures for testing decalcified specimens. Minimum case numbers were specified by 85.9% (720 of 838) of laboratories for nonpredictive markers and 76% (584 of 768) for predictive markers. Median concordance requirements were 95% for both types. For initial validation, 75.4% (538 of 714) of laboratories adopted the 20-case minimum for nonpredictive markers and 45.9% (266 of 579) adopted the 40-case minimum for predictive markers as outlined in the 2014 LPG. The most common method for validation was correlation with morphology and expected results. Laboratories also reported which assay changes necessitated revalidation and their minimum case requirements. Benchmark data on current IHC validation practices and procedures may help laboratories understand the issues and influence further refinement of LPG recommendations.

  14. Analytical performances of food microbiology laboratories - critical analysis of 7 years of proficiency testing results.

    Science.gov (United States)

    Abdel Massih, M; Planchon, V; Polet, M; Dierick, K; Mahillon, J

    2016-02-01

    Based on the results of 19 food microbiology proficiency testing (PT) schemes, this study aimed to assess laboratory performance, to highlight the main sources of unsatisfactory analytical results and to suggest areas of improvement. The 2009-2015 results of REQUASUD and IPH PT, involving a total of 48 laboratories, were analysed. On average, the laboratories failed to detect or enumerate foodborne pathogens in 3.0% of the tests. Thanks to a close collaboration with the PT participants, the causes of outliers could be identified in 74% of the cases. The main causes of erroneous PT results were pre-analytical (handling of the samples, timing of analysis), analytical (unsuitable methods, confusion of samples, errors in colony counting or confirmation) or post-analytical mistakes (calculation and encoding of results). PT schemes are a privileged observation post to highlight analytical problems, which would otherwise remain unnoticed. In this perspective, this comprehensive study of PT results provides insight into the sources of systematic errors encountered during the analyses. This study draws the attention of laboratories to the main causes of analytical errors and suggests practical solutions to avoid them, for educational purposes. The observations support the hypothesis that regular participation in PT, when followed by feedback and appropriate corrective actions, can play a key role in quality improvement and provide more confidence in laboratory testing results. © 2015 The Society for Applied Microbiology.
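
The abstract does not state the scoring scheme used to flag outliers; a common convention in proficiency testing (ISO 13528-style z-scores, assumed here only for illustration) rates each reported result against the assigned value:

```python
def z_score(result, assigned_value, sigma_pt):
    """Classic proficiency-testing z-score: the participant's deviation
    from the assigned value in units of the standard deviation for
    proficiency assessment (sigma_pt)."""
    return (result - assigned_value) / sigma_pt

def rating(z):
    """Conventional interpretation: |z| <= 2 satisfactory,
    2 < |z| < 3 questionable, |z| >= 3 unsatisfactory."""
    az = abs(z)
    if az <= 2:
        return "satisfactory"
    if az < 3:
        return "questionable"
    return "unsatisfactory"
```

For enumeration results, the scoring is usually applied to log10-transformed counts rather than raw colony counts.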

  15. Development and validation of analytical method for Naftopidil in human plasma by LC–MS/MS

    Directory of Open Access Journals (Sweden)

    Pritam S. Jain

    2015-09-01

    A highly sensitive and simple high-performance liquid chromatographic-tandem mass spectrometric (LC-MS/MS) assay was developed and validated for the quantification of naftopidil in human plasma. Naftopidil is extracted from human plasma with methyl tert-butyl ether and analyzed using reversed-phase gradient elution on a Discovery C18, 5 μm (50 × 4.6 mm) column. Methanol:2 mM ammonium formate (90:10) is used as the mobile phase, and detection was performed by MS using electrospray ionization in positive mode. Propranolol is used as the internal standard. The lower limit of quantification is 0.495 ng/mL. The calibration curves are linear over the concentration range of 0.495-200.577 ng/mL of plasma. This novel LC-MS/MS method shows satisfactory accuracy and precision and is sufficiently sensitive for the performance of pharmacokinetic studies in humans.

  16. Analytical Validation of the ReEBOV Antigen Rapid Test for Point-of-Care Diagnosis of Ebola Virus Infection

    Science.gov (United States)

    Cross, Robert W.; Boisen, Matthew L.; Millett, Molly M.; Nelson, Diana S.; Oottamasathien, Darin; Hartnett, Jessica N.; Jones, Abigal B.; Goba, Augustine; Momoh, Mambu; Fullah, Mohamed; Bornholdt, Zachary A.; Fusco, Marnie L.; Abelson, Dafna M.; Oda, Shunichiro; Brown, Bethany L.; Pham, Ha; Rowland, Megan M.; Agans, Krystle N.; Geisbert, Joan B.; Heinrich, Megan L.; Kulakosky, Peter C.; Shaffer, Jeffrey G.; Schieffelin, John S.; Kargbo, Brima; Gbetuwa, Momoh; Gevao, Sahr M.; Wilson, Russell B.; Saphire, Erica Ollmann; Pitts, Kelly R.; Khan, Sheik Humarr; Grant, Donald S.; Geisbert, Thomas W.; Branco, Luis M.; Garry, Robert F.

    2016-01-01

    Background. Ebola virus disease (EVD) is a severe viral illness caused by Ebola virus (EBOV). The 2013-2016 EVD outbreak in West Africa is the largest recorded, with >11 000 deaths. Development of the ReEBOV Antigen Rapid Test (ReEBOV RDT) was expedited to provide a point-of-care test for suspected EVD cases. Methods. Recombinant EBOV viral protein 40 antigen was used to derive polyclonal antibodies for RDT and enzyme-linked immunosorbent assay development. ReEBOV RDT limits of detection (LOD), specificity, and interference were analytically validated on the basis of Food and Drug Administration (FDA) guidance. Results. The ReEBOV RDT specificity estimate was 95% for donor serum panels and 97% for donor whole-blood specimens. The RDT demonstrated sensitivity to 3 species of Ebolavirus (Zaire ebolavirus, Sudan ebolavirus, and Bundibugyo ebolavirus) associated with human disease, with no cross-reactivity by pathogens associated with non-EBOV febrile illness, including malaria parasites. Interference testing exhibited no reactivity by medications in common use. The LOD for antigen was 4.7 ng/test in serum and 9.4 ng/test in whole blood. Quantitative reverse transcription-polymerase chain reaction testing of nonhuman primate samples determined the range to be equivalent to 3.0 × 10⁵ to 9.0 × 10⁸ genomes/mL. Conclusions. The analytical validation presented here contributed to the ReEBOV RDT being the first antigen-based assay to receive FDA and World Health Organization emergency use authorization for this EVD outbreak, in February 2015. PMID:27587634

  17. Circular orbits of corotating binary black holes: Comparison between analytical and numerical results

    International Nuclear Information System (INIS)

    Damour, Thibault; Gourgoulhon, Eric; Grandclement, Philippe

    2002-01-01

    We compare recent numerical results, obtained within a 'helical Killing vector' approach, on circular orbits of corotating binary black holes to the analytical predictions made by the effective one-body (EOB) method (which has been recently extended to the case of spinning bodies). On the scale of the differences between the results obtained by different numerical methods, we find good agreement between numerical data and analytical predictions for several invariant functions describing the dynamical properties of circular orbits. This agreement is robust against the post-Newtonian accuracy used for the analytical estimates, as well as under choices of the resummation method for the EOB 'effective potential', and gets better as one uses a higher post-Newtonian accuracy. These findings open the way to a significant 'merging' of analytical and numerical methods, i.e. to matching an EOB-based analytical description of the (early and late) inspiral, up to the beginning of the plunge, to a numerical description of the plunge and merger. We illustrate also the 'flexibility' of the EOB approach, i.e. the possibility of determining some 'best fit' values for the analytical parameters by comparison with numerical data

  18. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A.

    2008-12-17

    This document provides a listing of available sources that can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  19. ExEP yield modeling tool and validation test results

    Science.gov (United States)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul

    2017-09-01

    EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Program's Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with the physics-based tests such as photometry and integration time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4-m unobscured telescope with an idealized coronagraph and an exoplanet population from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters, such as working angle, photon counts, and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required interpretation by a user, the test revealed problems in the L2 halo orbit and the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers of EXOSIMS and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class, to WFIRST, up to large mission concepts such as HabEx and LUVOIR.
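
    The deterministic physics tests described above pin a closed-form quantity to a hand-computed value. The sketch below illustrates the style with a simplified signal-to-noise integration-time relation, t = SNR² · (Cp + Cb) / Cp² for planet count rate Cp and background rate Cb; the formula and numbers are generic illustrations, not EXOSIMS code.

```python
# A unit test in the style described above: compare a closed-form
# integration-time estimate against a hand-computed expected value.
def integration_time(snr, c_planet, c_background):
    """Time to reach a target SNR given planet and background count rates (counts/s)."""
    return snr**2 * (c_planet + c_background) / c_planet**2

def test_integration_time():
    # 10 planet counts/s, 5 background counts/s, target SNR of 5:
    # t = 25 * 15 / 100 = 3.75 s, computed by hand.
    t = integration_time(5.0, 10.0, 5.0)
    assert abs(t - 3.75) < 1e-12

test_integration_time()
print("integration-time test passed")
```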

  20. Validation of a new analytical procedure for determination of residual solvents in [18F]FDG by gas chromatography

    International Nuclear Information System (INIS)

    Costa, Flávia M.; Costa, Cassiano L.S.; Silva, Juliana B.; Ferreira, Soraya M.Z.M.D.

    2017-01-01

    Fludeoxyglucose F 18 ([18F]FDG) is the most widely used radiopharmaceutical for positron emission tomography, especially in oncology. Organic solvents such as ether, ethanol and acetonitrile might be used in the synthesis of [18F]FDG; however, they might not be completely removed during purification steps. The determination of residual solvents in [18F]FDG is required in the European Pharmacopoeia (EP) and the United States Pharmacopeia (USP) monographs. While the procedure described in the EP is quite general, the one described in the USP requires a long runtime (about 13 minutes). In this work a simple and fast (4-minute) analytical procedure was developed and validated for determination of residual solvents in [18F]FDG. Analyses were carried out in a Perkin Elmer gas chromatograph equipped with a flame ionization detector. The separation was obtained on a 0.53 mm × 30 m fused-silica column. Validation included the evaluation of various parameters, such as specificity, linearity and range, limits of detection and quantitation, precision (repeatability and intermediate precision), accuracy, and robustness. Results were found to be within acceptable limits, indicating the developed procedure is suitable for its intended application. Considering the short half-life of fluorine-18 (109.7 minutes), this new method could be a valuable alternative for routine quality control of [18F]FDG. (author)
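
    The limits of detection and quantitation evaluated in validations like this one are commonly estimated, per ICH Q2-style guidance, from a least-squares calibration line: LOD = 3.3·s/slope and LOQ = 10·s/slope, with s the residual standard deviation. A minimal sketch follows; the concentrations and peak areas are invented, not the study's data.

```python
# ICH-style LOD/LOQ estimation from a straight calibration line.
# All concentration and peak-area values are hypothetical.
import math

conc = [0.5, 1.0, 2.0, 3.0, 4.0, 5.0]             # standard levels, ug/mL
area = [52.0, 103.0, 201.0, 305.0, 398.0, 504.0]  # detector response

n = len(conc)
mx = sum(conc) / n
my = sum(area) / n
sxx = sum((x - mx) ** 2 for x in conc)
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, area))
slope = sxy / sxx
intercept = my - slope * mx

# Residual standard deviation with n - 2 degrees of freedom
residuals = [y - (intercept + slope * x) for x, y in zip(conc, area)]
s = math.sqrt(sum(r * r for r in residuals) / (n - 2))

lod = 3.3 * s / slope
loq = 10.0 * s / slope
print(f"slope = {slope:.2f}, LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL")
```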

  2. Analytical and Experimental Study for Validation of the Device to Confine BN Reactor Melted Fuel

    International Nuclear Information System (INIS)

    Rogozhkin, S.; Osipov, S.; Sobolev, V.; Shepelev, S.; Kozhaev, A.; Mavrin, M.; Ryabov, A.

    2013-01-01

    To validate the design and confirm the design characteristics of the special retaining device (core catcher) used for protection of BN reactor vessel in the case of a severe beyond-design basis accident with core melting, computational and experimental studies were carried out. The Tray test facility that uses water as coolant was developed and fabricated by OKBM; experimental studies were performed. To verify the methodical approach used for the computational study, experimental results obtained in the Tray test facility were compared with numerical simulation results obtained by the STAR-CCM+ CFD code

  3. Diffusion weighted MRI by spatiotemporal encoding: Analytical description and in vivo validations

    Science.gov (United States)

    Solomon, Eddy; Shemesh, Noam; Frydman, Lucio

    2013-07-01

    Diffusion-weighted (DW) MRI is a powerful modality for studying microstructure in normal and pathological tissues. The accuracy derived from DW MRI depends on the acquisition of quality images, and on a precise assessment of the b-values involved. Conventional DW MRI tends to be of limited use in regions suffering from large magnetic field or chemical shift heterogeneities, which severely distort the MR images. In this study we propose novel sequences based on SPatio-temporal ENcoding (SPEN), which overcome such shortcomings owing to SPEN's inherent robustness to offsets. SPEN, however, relies on the simultaneous application of gradients and radiofrequency-swept pulses, which may impart different diffusion weightings along the spatial axes. These will be further complicated in DW measurements by the diffusion-sensitizing gradients, and will in general lead to complex, spatially-dependent b-values. This study presents a formalism for analyzing these diffusion-weighted SPEN (dSPEN) data, which takes into account the concomitant effects of adiabatic pulses, of the imaging as well as diffusion gradients, and of the cross-terms between them. These analytical b-value derivations are subject to experimental validations in phantom systems and ex vivo spinal cords. Excellent agreement is found between the theoretical predictions and these dSPEN experiments. The ensuing methodology is then demonstrated by in vivo mapping of diffusion in the human breast, an organ where conventional k-space DW acquisition methods are challenged by both field and chemical shift heterogeneities. These studies demonstrate the increased robustness of dSPEN vis-à-vis comparable DW echo planar imaging, and the value of this new methodology for medium- or high-field diffusion measurements in heterogeneous systems.
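
    The spatially dependent dSPEN b-values generalize the conventional bookkeeping for a pulsed-gradient pair, where b = (γGδ)²(Δ − δ/3) (the Stejskal-Tanner expression). The sketch below evaluates only that conventional baseline term; the gradient amplitude and timings are generic illustration values, not parameters from the paper.

```python
# Conventional Stejskal-Tanner b-value for a pulsed-gradient pair:
# b = (gamma * G * delta)^2 * (Delta - delta / 3).
# All sequence parameters below are generic illustration values.
import math

gamma = 2.675e8      # 1H gyromagnetic ratio, rad s^-1 T^-1
G = 30e-3            # diffusion gradient amplitude, T/m
delta = 5e-3         # gradient lobe duration, s
Delta = 50e-3        # lobe separation, s

b = (gamma * G * delta) ** 2 * (Delta - delta / 3.0)  # s/m^2
b_s_per_mm2 = b * 1e-6                                # convert to s/mm^2
print(f"b = {b_s_per_mm2:.1f} s/mm^2")
```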

  4. Review of analytical results from the proposed agent disposal facility site, Aberdeen Proving Ground

    Energy Technology Data Exchange (ETDEWEB)

    Brubaker, K.L.; Reed, L.L.; Myers, S.W.; Shepard, L.T.; Sydelko, T.G.

    1997-09-01

    Argonne National Laboratory reviewed the analytical results from 57 composite soil samples collected in the Bush River area of Aberdeen Proving Ground, Maryland. A suite of 16 analytical tests involving 11 different SW-846 methods was used to detect a wide range of organic and inorganic contaminants. One method (BTEX) was considered redundant, and two "single-number" methods (TPH and TOX) were found to lack the required specificity to yield unambiguous results, especially in a preliminary investigation. Volatile analytes detected at the site include 1,1,2,2-tetrachloroethane, trichloroethylene, and tetrachloroethylene, all of which probably represent residual site contamination from past activities. Other volatile analytes detected include toluene, tridecane, methylene chloride, and trichlorofluoromethane. These compounds are probably not associated with site contamination but likely represent cross-contamination or, in the case of tridecane, a naturally occurring material. Semivolatile analytes detected include three different phthalates and low part-per-billion amounts of the pesticide DDT and its degradation product DDE. The pesticide could represent residual site contamination from past activities, and the phthalates are likely due, in part, to cross-contamination during sample handling. A number of high-molecular-weight hydrocarbons and hydrocarbon derivatives were detected and were probably naturally occurring compounds. 4 refs., 1 fig., 8 tabs.

  5. Validating and Determining the Weight of Items Used for Evaluating Clinical Governance Implementation Based on Analytic Hierarchy Process Model

    Directory of Open Access Journals (Sweden)

    Elaheh Hooshmand

    2015-10-01

    Background. The purpose of implementing a system such as Clinical Governance (CG) is to integrate, establish and globalize distinct policies in order to improve quality through increasing professional knowledge and the accountability of healthcare professionals toward providing clinical excellence. Since CG is related to change, and change requires money and time, CG implementation has to be focused on priority areas that are in more dire need of change. The purpose of the present study was to validate and determine the significance of items used for evaluating CG implementation. Methods. The present study was descriptive and quantitative in design. Items used for evaluating CG implementation were first validated by the Delphi method and then compared with one another and ranked based on the Analytic Hierarchy Process (AHP) model. Results. The items that were validated for evaluating CG implementation in Iran include performance evaluation, training and development, personnel motivation, clinical audit, clinical effectiveness, risk management, resource allocation, policies and strategies, external audit, information system management, research and development, CG structure, implementation prerequisites, the management of patients’ non-medical needs, complaints and patients’ participation in the treatment process. The most important items based on their degree of significance were training and development, performance evaluation, and risk management. The least important items included the management of patients’ non-medical needs, patients’ participation in the treatment process and research and development. Conclusion. The fundamental requirements of CG implementation included having an effective policy at national level, avoiding perfectionism, using the expertise and potentials of the entire country and the coordination of this model with other models of quality improvement such as accreditation and patient safety.
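
    AHP ranking of the kind used above derives priority weights from a pairwise-comparison matrix (usually via its principal eigenvector) and accepts them only if the consistency ratio is below about 0.1. A minimal sketch for three criteria follows; the matrix entries are invented for illustration and are not the study's survey data.

```python
# Minimal AHP sketch: priority weights by power iteration on a
# pairwise-comparison matrix, plus Saaty's consistency check.
# The comparison values below are hypothetical.
def ahp_weights(A, iters=100):
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(v)
        w = [x / total for x in v]          # normalized principal eigenvector
    # Principal eigenvalue estimate from A w ~= lambda w
    lam = sum(sum(A[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    return w, lam

RI = {3: 0.58}                              # Saaty's random index for n = 3
# Hypothetical comparisons: training vs performance evaluation vs risk mgmt
A = [[1.0, 2.0, 3.0],
     [0.5, 1.0, 2.0],
     [1/3, 0.5, 1.0]]
w, lam = ahp_weights(A)
ci = (lam - 3) / (3 - 1)                    # consistency index
cr = ci / RI[3]                             # consistency ratio; accept if < 0.1
print("weights:", [round(x, 3) for x in w], " CR:", round(cr, 3))
```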

  6. Adaptive cyclically dominating game on co-evolving networks: numerical and analytic results

    Science.gov (United States)

    Choi, Chi Wun; Xu, Chen; Hui, Pak Ming

    2017-10-01

    A co-evolving and adaptive Rock (R)-Paper (P)-Scissors (S) game (ARPS) in which an agent uses one of three cyclically dominating strategies is proposed and studied numerically and analytically. An agent takes adaptive actions to achieve a neighborhood to his advantage by rewiring a dissatisfying link with a probability p or switching strategy with a probability 1 - p. Numerical results revealed two phases in the steady state: an active phase for p < pc, and a frozen phase for p > pc with three separate clusters of agents using only R, P, and S, respectively, and terminated adaptive actions. A mean-field theory based on the link densities in the co-evolving network is formulated, and the trinomial closure scheme is applied to obtain analytical solutions. The analytic results agree well with simulation results on ARPS. In addition, the different probabilities of winning, losing, and drawing a game among the agents are identified as the origin of the small discrepancy between analytic and simulation results. As a result of the adaptive actions, agents of higher degrees are often those being taken advantage of. Agents with a smaller (larger) degree than the mean degree have a higher (smaller) probability of winning than losing. The results are informative for future attempts at formulating more accurate theories.
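
    The adaptive rule described above can be sketched as a toy simulation: a losing agent either rewires the offending link (probability p) or switches to the strategy that beats its opponent (probability 1 − p). The network size, update order and satisfaction rule below are simplified guesses for illustration, not the paper's exact model.

```python
# Toy co-evolving rock-paper-scissors dynamics (simplified sketch).
import random

BEATS = {'R': 'S', 'P': 'R', 'S': 'P'}   # key beats value

def step(strategies, edges, p, rng):
    i, j = rng.choice(edges)
    si, sj = strategies[i], strategies[j]
    if BEATS[sj] == si:                  # j's strategy beats i's: i is dissatisfied
        if rng.random() < p:             # rewire the dissatisfying link
            k = rng.randrange(len(strategies))
            edges[edges.index((i, j))] = (i, k)
        else:                            # switch to the strategy that beats j
            strategies[i] = next(s for s in BEATS if BEATS[s] == sj)

rng = random.Random(0)                   # fixed seed for reproducibility
n = 30
strategies = [rng.choice('RPS') for _ in range(n)]
edges = [(rng.randrange(n), rng.randrange(n)) for _ in range(60)]
for _ in range(1000):
    step(strategies, edges, 0.5, rng)
print("strategy counts:", {s: strategies.count(s) for s in 'RPS'})
```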

  7. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A

    2009-05-27

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  8. Fission product release from nuclear fuel II. Validation of ASTEC/ELSA on analytical and large scale experiments

    International Nuclear Information System (INIS)

    Brillant, G.; Marchetto, C.; Plumecocq, W.

    2013-01-01

    Highlights: • A wide range of experiments is presented for the ASTEC/ELSA code validation. • Analytical tests such as AECL, ORNL and VERCORS are considered. • A large-scale experiment, PHEBUS FPT1, is considered. • The good agreement with measurements shows the efficiency of the ASTEC modelling. • Improvements concern the FP release modelling from MOX and high burn-up UO2 fuels. - Abstract: This article is the second of two articles dedicated to the mechanisms of fission product release from a degraded core. The models of fission product release from nuclear fuel in the ASTEC code have been described in detail in the first part of this work (Brillant et al., this issue). In this contribution, the validation of ELSA, the module of ASTEC that deals with fission product and structural material release from a degraded core, is presented. A large range of experimental tests, with various temperature and conditions for the fuel surrounding atmosphere (oxidising and reducing), is thus simulated with the ASTEC code. The validation database includes several analytical experiments with both bare fuel (e.g. MCE1 experiments) and cladded fuel (e.g. HCE3, VERCORS). Furthermore, the PHEBUS large-scale experiments are used for the validation of ASTEC. The rather satisfactory comparison between ELSA calculations and experimental measurements demonstrates the efficiency of the analytical models to describe fission product release in severe accident conditions

  9. Analytical method and result of radiation exposure for depressurization accident of HTTR

    International Nuclear Information System (INIS)

    Sawa, K.; Shiozawa, S.; Mikami, H.

    1990-01-01

    The Japan Atomic Energy Research Institute (JAERI) is now proceeding with the construction design of the High Temperature Engineering Test Reactor (HTTR). Since the HTTR has some characteristics different from LWRs, analytical method of radiation exposure in accidents provided for LWRs can not be applied directly. This paper describes the analytical method of radiation exposure developed by JAERI for the depressurization accident, which is the severest accident in respect to radiation exposure among the design basis accidents of the HTTR. The result is also described in this paper

  10. Tank 241-S-102, Core 232 analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    STEEN, F.H.

    1998-11-04

    This document is the analytical laboratory report for tank 241-S-102 push mode core segments collected between March 5, 1998 and April 2, 1998. The segments were subsampled and analyzed in accordance with the Tank 241-S-102 Retained Gas Sampler System Sampling and Analysis Plan (TSAP) (McCain, 1998), Letter of Instruction for Compatibility Analysis of Samples from Tank 241-S-102 (LOI) (Thompson, 1998) and the Data Quality Objectives for Tank Farms Waste Compatibility Program (DQO) (Mulkey and Miller, 1998). The analytical results are included in the data summary table (Table 1).

  11. Tank 241-BY-109, cores 201 and 203, analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final laboratory report for tank 241-BY-109 push mode core segments collected between June 6, 1997 and June 17, 1997. The segments were subsampled and analyzed in accordance with the Tank Push Mode Core Sampling and Analysis Plan (Bell, 1997) and the Tank Safety Screening Data Quality Objective (Dukelow et al., 1995). The analytical results are included

  12. Interpretation of results for tumor markers on the basis of analytical imprecision and biological variation

    DEFF Research Database (Denmark)

    Sölétormos, G; Schiøler, V; Nielsen, D

    1993-01-01

    Interpretation of results for CA 15.3, carcinoembryonic antigen (CEA), and tissue polypeptide antigen (TPA) during breast cancer monitoring requires data on intra- (CVP) and inter- (CVG) individual biological variation, analytical imprecision (CVA), and indices of individuality. The average CVP...
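
    Interpretation frameworks built on CVA and CVP of the kind described above commonly reduce to a reference change value (RCV): the smallest difference between two serial marker results that exceeds combined analytical and within-person variation. A minimal sketch follows; the CV figures are placeholders, not the values reported for CA 15.3, CEA or TPA.

```python
# Reference change value for serial results:
# RCV = z * sqrt(2) * sqrt(CVA^2 + CVP^2), all CVs in percent.
# The example CVs below are placeholder values.
import math

def rcv(cv_analytical, cv_within, z=1.96):
    """Two-sided 95% reference change value, in percent."""
    return z * math.sqrt(2.0) * math.sqrt(cv_analytical**2 + cv_within**2)

# e.g. CVA = 5%, CVP = 10%: a change must exceed ~31% to be significant
print(f"RCV = {rcv(5.0, 10.0):.1f}%")
```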

  13. Arnol'd tongues for a resonant injection-locked frequency divider: analytical and numerical results

    DEFF Research Database (Denmark)

    Bartuccelli, Michele; Deane, Jonathan H.B.; Gentile, Guido

    2010-01-01

    …Arnol’d tongues in the frequency–amplitude plane. In particular, we provide exact analytical formulae for the widths of the tongues, which correspond to the plateaux of the devil’s staircase picture. The results account for numerical and experimental findings presented in the literature for special driving terms…

  14. Development and Validation of a Learning Analytics Framework: Two Case Studies Using Support Vector Machines

    Science.gov (United States)

    Ifenthaler, Dirk; Widanapathirana, Chathuranga

    2014-01-01

    Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…

  15. Analytical Method Validation of High-Performance Liquid Chromatography and Stability-Indicating Study of Medroxyprogesterone Acetate Intravaginal Sponges

    Directory of Open Access Journals (Sweden)

    Nidal Batrawi

    2017-02-01

    Medroxyprogesterone acetate is widely used in veterinary medicine as an intravaginal dosage form for the synchronization of the breeding cycle in ewes and goats. The main goal of this study was to develop a reverse-phase high-performance liquid chromatography method for the quantification of medroxyprogesterone acetate in veterinary vaginal sponges. A single high-performance liquid chromatography/UV isocratic run was used for the analytical assay of the active ingredient medroxyprogesterone. The chromatographic system consisted of a reverse-phase C18 column as the stationary phase and a mixture of 60% acetonitrile and 40% potassium dihydrogen phosphate buffer as the mobile phase; the pH was adjusted to 5.6. The method was validated according to the International Council for Harmonisation (ICH) guidelines. Forced degradation studies were also performed to evaluate the stability-indicating properties and specificity of the method. Medroxyprogesterone was eluted at 5.9 minutes. The linearity of the method was confirmed in the range of 0.0576 to 0.1134 mg/mL (R2 > 0.999). The limit of quantification was shown to be 3.9 µg/mL. Precision and accuracy ranges were found to be %RSD < 0.2 and 98% to 102%, respectively. A medroxyprogesterone capacity factor value of 2.1, tailing factor value of 1.03, and resolution value of 3.9 were obtained in accordance with ICH guidelines. Based on the obtained results, a rapid, precise, accurate, sensitive, and cost-effective analysis procedure was proposed for quantitative determination of medroxyprogesterone in vaginal sponges. This analytical method is the only available method to analyse medroxyprogesterone in veterinary intravaginal dosage forms.
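
    The system-suitability figures quoted above (capacity factor, tailing factor, resolution) follow from standard chromatographic arithmetic. The sketch below exercises those formulas; only the 5.9-minute retention time comes from the abstract, while the void time, peak widths and neighbouring-peak values are invented for illustration.

```python
# Standard system-suitability arithmetic (USP-style definitions).
# Only tR = 5.9 min is taken from the text; other values are hypothetical.
def capacity_factor(t_r, t_0):
    """k' = (tR - t0) / t0."""
    return (t_r - t_0) / t_0

def tailing_factor(w_005, f_005):
    """USP tailing: total width at 5% height over twice the front half-width."""
    return w_005 / (2.0 * f_005)

def resolution(t1, t2, w1, w2):
    """Rs = 2 * (t2 - t1) / (w1 + w2), using baseline peak widths."""
    return 2.0 * (t2 - t1) / (w1 + w2)

k = capacity_factor(5.9, 1.9)            # ~2.1 with an assumed 1.9-min void time
tf = tailing_factor(0.206, 0.100)        # ~1.03 (hypothetical widths, min)
rs = resolution(4.0, 5.9, 0.50, 0.47)    # ~3.9 (hypothetical neighbour peak)
print(f"k' = {k:.2f}, T = {tf:.2f}, Rs = {rs:.1f}")
```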

  16. Analytical validation of a melanoma diagnostic gene signature using formalin-fixed paraffin-embedded melanocytic lesions.

    Science.gov (United States)

    Warf, M Bryan; Flake, Darl D; Adams, Doug; Gutin, Alexander; Kolquist, Kathryn A; Wenstrup, Richard J; Roa, Benjamin B

    2015-01-01

    These studies were performed to validate the analytical performance of a gene expression signature that differentiates melanoma and nevi, using RNA expression from 14 signature genes and nine normalization genes to generate a melanoma diagnostic score (MDS). Formalin-fixed paraffin-embedded melanocytic lesions were evaluated in these studies. The overall SD of the assay was determined to be 0.69 MDS units. Individual amplicons within the signature had an average amplification efficiency of 92% and a SD of less than 0.5 CT. The MDS was reproducible across a 2000-fold dilution range of input RNA. Melanin, an inhibitor of PCR, does not interfere with the signature. These studies indicate this signature is robust and reproducible and is analytically validated on formalin-fixed paraffin-embedded melanocytic lesions.
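
    Scores built from signature and normalization genes, as above, are typically computed from qPCR cycle thresholds by referencing each signature gene to the mean of the normalizers (a ΔCt-style calculation). The MDS's actual weighting is not given in the abstract, so the sketch below is a generic illustration only, with invented Ct values.

```python
# Generic delta-Ct style score: average the normalization-gene Ct values
# and subtract from each signature-gene Ct before combining.
# This is NOT the proprietary MDS formula; values are invented.
def delta_ct_score(signature_cts, normalizer_cts):
    ref = sum(normalizer_cts) / len(normalizer_cts)
    deltas = [ct - ref for ct in signature_cts]
    return sum(deltas) / len(deltas)

score = delta_ct_score([28.0, 30.0, 26.5], [25.0, 25.5, 24.5])
print(f"unweighted delta-Ct score = {score:.2f}")
```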

  17. Experimental validation of an analytical kinetic model for edge-localized modes in JET-ITER-like wall

    Science.gov (United States)

    Guillemaut, C.; Metzger, C.; Moulton, D.; Heinola, K.; O’Mullane, M.; Balboa, I.; Boom, J.; Matthews, G. F.; Silburn, S.; Solano, E. R.; JET contributors

    2018-06-01

    The design and operation of future fusion devices relying on H-mode plasmas requires reliable modelling of edge-localized modes (ELMs) for precise prediction of divertor target conditions. An extensive experimental validation of simple analytical predictions of the time evolution of target plasma loads during ELMs has been carried out here in more than 70 JET-ITER-like wall H-mode experiments with a wide range of conditions. Comparisons of these analytical predictions with diagnostic measurements of target ion flux density, power density, impact energy and electron temperature during ELMs are presented in this paper and show excellent agreement. The analytical predictions tested here are made with the ‘free-streaming’ kinetic model (FSM) which describes ELMs as a quasi-neutral plasma bunch expanding along the magnetic field lines into the Scrape-Off Layer without collisions. Consequences of the FSM on energy reflection and deposition on divertor targets during ELMs are also discussed.

  18. Toward valid and reliable brain imaging results in eating disorders.

    Science.gov (United States)

    Frank, Guido K W; Favaro, Angela; Marsh, Rachel; Ehrlich, Stefan; Lawson, Elizabeth A

    2018-03-01

    Human brain imaging can help improve our understanding of mechanisms underlying brain function and how they drive behavior in health and disease. Such knowledge may eventually help us to devise better treatments for psychiatric disorders. However, the brain imaging literature in psychiatry and especially eating disorders has been inconsistent, and studies are often difficult to replicate. The extent or severity of extremes of eating and state of illness, which are often associated with differences in, for instance hormonal status, comorbidity, and medication use, commonly differ between studies and likely add to variation across study results. Those effects are in addition to the well-described problems arising from differences in task designs, data quality control procedures, image data preprocessing and analysis or statistical thresholds applied across studies. Which of those factors are most relevant to improve reproducibility is still a question for debate and further research. Here we propose guidelines for brain imaging research in eating disorders to acquire valid results that are more reliable and clinically useful. © 2018 Wiley Periodicals, Inc.

  19. Advances in classical and analytical mechanics: A reviews of author’s results

    Directory of Open Access Journals (Sweden)

    Hedrih-Stevanović Katica R.

    2013-01-01

    A review, in the author’s subjective choice, of her scientific results in the areas of classical mechanics, analytical mechanics of discrete hereditary systems, analytical mechanics of discrete fractional-order system vibrations, elastodynamics, nonlinear dynamics and hybrid system dynamics is presented. The main original results of the author were presented through the mathematical methods of mechanics, with examples of applications for solving problems of the dynamics of real mechanical systems abstracted to theoretical models of mechanical discrete or continuum systems, as well as hybrid systems. The paper also presents a series of methods and scientific results authored by professors Mitropolyski, Andjelić and Rašković, as well as original scientific research results that the author of this paper obtained by the methods of her professors. The vector method based on mass inertia moment vectors and corresponding deviational vector components for a pole and an oriented axis, defined in 1991 by K. Hedrih, is presented. Results in the construction of the analytical dynamics of hereditary discrete systems, obtained in collaboration with O. A. Gorosho, are presented. Also, a selection of results of the author’s postgraduate students and doctoral candidates in the area of nonlinear dynamics is presented. A list of scientific projects headed by the author of this paper is given, together with a list of the doctoral dissertations and magister of science theses that contain scientific research results obtained under the supervision of the author of this paper or her first doctoral candidates. [Project of the Ministry of Science of the Republic of Serbia, no. ON174001: Dynamics of hybrid systems with complex structures]

  20. Twist-2 at seven loops in planar N=4 SYM theory: full result and analytic properties

    Energy Technology Data Exchange (ETDEWEB)

    Marboe, Christian [School of Mathematics, Trinity College Dublin,College Green, Dublin 2 (Ireland); Institut für Mathematik und Institut für Physik, Humboldt-Universität zu Berlin,IRIS Adlershof, Zum Großen Windkanal 6, 12489 Berlin (Germany); Velizhanin, Vitaly [Theoretical Physics Division, NRC “Kurchatov Institute”,Petersburg Nuclear Physics Institute, Orlova Roscha,Gatchina, 188300 St. Petersburg (Russian Federation); Institut für Mathematik und Institut für Physik, Humboldt-Universität zu Berlin,IRIS Adlershof, Zum Großen Windkanal 6, 12489 Berlin (Germany)

    2016-11-04

    The anomalous dimension of twist-2 operators of arbitrary spin in planar N=4 SYM theory is found at seven loops by using the quantum spectral curve to compute values at fixed spin, and reconstructing the general result using the LLL-algorithm together with modular arithmetic. The result of the analytic continuation to negative spin is presented, and its relation with the recently computed correction to the BFKL and double-logarithmic equation is discussed.

  1. Improving the trust in results of numerical simulations and scientific data analytics

    Energy Technology Data Exchange (ETDEWEB)

    Cappello, Franck [Argonne National Lab. (ANL), Argonne, IL (United States); Constantinescu, Emil [Argonne National Lab. (ANL), Argonne, IL (United States); Hovland, Paul [Argonne National Lab. (ANL), Argonne, IL (United States); Peterka, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Phillips, Carolyn [Argonne National Lab. (ANL), Argonne, IL (United States); Snir, Marc [Argonne National Lab. (ANL), Argonne, IL (United States); Wild, Stefan [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-04-30

    This white paper investigates several key aspects of the trust that a user can give to the results of numerical simulations and scientific data analytics. In this document, the notion of trust is related to the integrity of numerical simulations and data analytics applications. This white paper complements the DOE ASCR report on Cybersecurity for Scientific Computing Integrity by (1) exploring the sources of trust loss; (2) reviewing the definitions of trust in several areas; (3) providing numerous cases of result alteration, some of them leading to catastrophic failures; (4) examining the current notion of trust in numerical simulation and scientific data analytics; (5) providing a gap analysis; and (6) suggesting two important research directions and their respective research topics. To simplify the presentation without loss of generality, we consider that trust in results can be lost (or the results’ integrity impaired) because of any form of corruption happening during the execution of the numerical simulation or the data analytics application. In general, the sources of such corruption are threefold: errors, bugs, and attacks. Current applications are already using techniques to deal with different types of corruption. However, not all potential corruptions are covered by these techniques. We firmly believe that the current level of trust that a user has in the results is at least partially founded on ignorance of this issue or the hope that no undetected corruptions will occur during the execution. This white paper explores the notion of trust and suggests recommendations for developing a more scientifically grounded notion of trust in numerical simulation and scientific data analytics. We first formulate the problem and show that it goes beyond previous questions regarding the quality of results such as V&V, uncertainty quantification, and data assimilation. We then explore the complexity of this difficult problem, and we sketch complementary general

  2. Impact mechanics of ship collisions and validations with experimental results

    DEFF Research Database (Denmark)

    Zhang, Shengming; Villavicencio, R.; Zhu, L.

    2017-01-01

    Closed-form analytical solutions for the energy released for deforming and crushing of structures and the impact impulse during ship collisions were developed and published in Marine Structures in 1998 [1]. The proposed mathematical models have been used by many engineers and researchers although th...

  3. Optimization of instrumental neutron activation analysis method by means of 2k experimental design technique aiming the validation of analytical procedures

    International Nuclear Information System (INIS)

    Petroni, Robson; Moreira, Edson G.

    2013-01-01

    In this study, optimization of procedures and standardization of Instrumental Neutron Activation Analysis (INAA) methods were carried out for the determination of the elements arsenic, chromium, cobalt, iron, rubidium, scandium, selenium and zinc in biological materials. The aim is to validate the analytical methods for future accreditation at the National Institute of Metrology, Quality and Technology (INMETRO). The 2k experimental design was applied to evaluate the individual contribution of selected variables of the analytical procedure to the final mass fraction result. Samples of Mussel Tissue Certified Reference Material and multi-element standards were analyzed considering the following variables: sample decay time, counting time and sample-to-detector distance. The multi-element standard concentration (comparator standard), the sample mass and the irradiation time were kept constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods to be adopted in the validation procedure of INAA methods at the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN - CNEN/SP). Optimized conditions were estimated based on the results of z-score tests, main effects and interaction effects. The results obtained with the different experimental configurations were evaluated for accuracy (precision and trueness) for each measurement. (author)
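As a loose illustration of the 2k-design analysis this record describes (the factor names follow the abstract, but every response value below is hypothetical), the main effect of each coded variable in a full factorial design is the contrast between its high-level and low-level runs:

```python
# Hypothetical sketch of a 2^k factorial screening with k = 3 coded factors:
# decay time, counting time, sample-to-detector distance (levels -1/+1).
import itertools
import numpy as np

def main_effects(responses):
    """Main effects of a full 2^k design.

    responses maps a (+1/-1,)*k tuple of coded levels to the measured
    response (here: a hypothetical mass fraction)."""
    k = len(next(iter(responses)))
    runs = list(itertools.product((-1, 1), repeat=k))
    y = np.array([responses[r] for r in runs], dtype=float)
    X = np.array(runs, dtype=float)     # design matrix of coded levels
    # Effect of factor j = mean(y | level +1) - mean(y | level -1)
    return X.T @ y / (len(runs) / 2)

# Synthetic responses: strong decay-time effect, weak other effects
levels = itertools.product((-1, 1), repeat=3)
resp = {lv: 10.0 + 0.8*lv[0] - 0.1*lv[1] + 0.05*lv[2] for lv in levels}
effects = main_effects(resp)
```

In the actual study the responses would be mass fractions measured on the certified reference material, and interaction effects would be obtained analogously from products of coded columns.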

  4. Optimization of instrumental neutron activation analysis method by means of 2{sup k} experimental design technique aiming the validation of analytical procedures

    Energy Technology Data Exchange (ETDEWEB)

    Petroni, Robson; Moreira, Edson G., E-mail: rpetroni@ipen.br, E-mail: emoreira@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

    In this study, optimization of procedures and standardization of Instrumental Neutron Activation Analysis (INAA) methods were carried out for the determination of the elements arsenic, chromium, cobalt, iron, rubidium, scandium, selenium and zinc in biological materials. The aim is to validate the analytical methods for future accreditation at the National Institute of Metrology, Quality and Technology (INMETRO). The 2{sup k} experimental design was applied to evaluate the individual contribution of selected variables of the analytical procedure to the final mass fraction result. Samples of Mussel Tissue Certified Reference Material and multi-element standards were analyzed considering the following variables: sample decay time, counting time and sample-to-detector distance. The multi-element standard concentration (comparator standard), the sample mass and the irradiation time were kept constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods to be adopted in the validation procedure of INAA methods at the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN - CNEN/SP). Optimized conditions were estimated based on the results of z-score tests, main effects and interaction effects. The results obtained with the different experimental configurations were evaluated for accuracy (precision and trueness) for each measurement. (author)

  5. Numerical simulation and experimental validation of the three-dimensional flow field and relative analyte concentration distribution in an atmospheric pressure ion source.

    Science.gov (United States)

    Poehler, Thorsten; Kunte, Robert; Hoenen, Herwart; Jeschke, Peter; Wissdorf, Walter; Brockmann, Klaus J; Benter, Thorsten

    2011-11-01

    In this study, the validation and analysis of steady state numerical simulations of the gas flows within a multi-purpose ion source (MPIS) are presented. The experimental results were obtained with particle image velocimetry (PIV) measurements in a non-scaled MPIS. Two-dimensional time-averaged velocity and turbulent kinetic energy distributions are presented for two dry gas volume flow rates. The numerical results of the validation simulations are in very good agreement with the experimental data. All significant flow features have been correctly predicted within the accuracy of the experiments. For technical reasons, the experiments were conducted at room temperature. Thus, numerical simulations of ionization conditions at two operating points of the MPIS are also presented. It is clearly shown that the dry gas volume flow rate has the most significant impact on the overall flow pattern within the APLI source; far less critical is the (larger) nebulization gas flow. In addition to the approximate solution of Reynolds-Averaged Navier-Stokes equations, a transport equation for the relative analyte concentration has been solved. The results yield information on the three-dimensional analyte distribution within the source. It becomes evident that for ion transport into the MS ion transfer capillary, electromagnetic forces are at least as important as fluid dynamic forces. However, only the fluid dynamics determines the three-dimensional distribution of analyte gas. Thus, local flow phenomena in close proximity to the spray shield strongly affect the ionization efficiency.

  6. Application of analytical quality by design principles for the determination of alkyl p-toluenesulfonates impurities in Aprepitant by HPLC. Validation using total-error concept.

    Science.gov (United States)

    Zacharis, Constantinos K; Vastardi, Elli

    2018-02-20

    In the research presented we report the development of a simple and robust liquid chromatographic method for the quantification of two genotoxic alkyl sulphonate impurities (namely methyl p-toluenesulfonate and isopropyl p-toluenesulfonate) in Aprepitant API substances using the Analytical Quality by Design (AQbD) approach. Following the steps of the AQbD protocol, the selected critical method attributes (CMAs) were the separation criteria between the critical peak pairs, the analysis time and the peak efficiencies of the analytes. The critical method parameters (CMPs) included the flow rate, the gradient slope and the acetonitrile content at the first step of the gradient elution program. Multivariate experimental designs, namely Plackett-Burman and Box-Behnken designs, were conducted sequentially for factor screening and optimization of the method parameters. The optimal separation conditions were estimated using the desirability function. The method was fully validated in the range of 10-200% of the target concentration limit of the analytes using the "total error" approach. Accuracy profiles - a graphical decision-making tool - were constructed using the results of the validation procedures. The β-expectation tolerance intervals did not exceed the acceptance criteria of ±10%, meaning that 95% of future results will be included in the defined bias limits. The relative bias ranged between -1.3% and 3.8% for both analytes, while the RSD values for repeatability and intermediate precision were less than 1.9% in all cases. The achieved limit of detection (LOD) and limit of quantification (LOQ) were adequate for the specific purpose and found to be 0.02% (corresponding to 48 μg g⁻¹ in sample) for both methyl and isopropyl p-toluenesulfonate. As proof-of-concept, the validated method was successfully applied in the analysis of several Aprepitant batches, indicating that this methodology could be used for routine quality control analyses. Copyright © 2017 Elsevier B
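A minimal numerical sketch of the "total error" accuracy-profile idea at one concentration level (the replicate recoveries and the t-quantile below are assumed for illustration; a β-expectation tolerance interval is approximated as mean ± t·s·√(1 + 1/n)):

```python
import numpy as np

target = 100.0                      # % recovery target at this level
recoveries = np.array([99.1, 100.8, 98.7, 101.2, 99.5,
                       100.3, 99.9, 100.6, 99.2])   # hypothetical replicates

n = recoveries.size
mean, s = recoveries.mean(), recoveries.std(ddof=1)
t95 = 2.306                         # two-sided 95% Student t, df = n - 1 = 8
half = t95 * s * np.sqrt(1 + 1/n)   # beta-expectation half-width
low, high = mean - half, mean + half
bias = mean - target                # bias at this level, in % of target

# The level passes if the tolerance interval stays inside the +/-10% limits
acceptable = (90.0 < low) and (high < 110.0)
```

Repeating this per concentration level and plotting (bias, low, high) against concentration gives the accuracy profile used as the decision-making tool.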

  7. An analytical model for backscattered luminance in fog: comparisons with Monte Carlo computations and experimental results

    International Nuclear Information System (INIS)

    Taillade, Frédéric; Dumont, Eric; Belin, Etienne

    2008-01-01

    We propose an analytical model for backscattered luminance in fog and derive an expression for the visibility signal-to-noise ratio as a function of meteorological visibility distance. The model uses single scattering processes. It is based on the Mie theory and the geometry of the optical device (emitter and receiver). In particular, we present an overlap function and take the phase function of fog into account. The results for the backscattered luminance obtained with our analytical model are compared to simulations made using the Monte Carlo method based on multiple scattering processes. Excellent agreement is found, in that the discrepancy between the results is smaller than the Monte Carlo standard uncertainties. If we take no account of the geometry of the optical device, the backscattered luminance estimated by the model differs from the simulations by a factor of 20. We also conclude that the signal-to-noise ratio computed with the Monte Carlo method and our analytical model is in good agreement with experimental results, since the mean difference between the calculations and experimental measurements is smaller than the experimental uncertainty
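The qualitative behaviour can be caricatured with a toy single-scattering integral. This is not the paper's model (no Mie phase function, no overlap function); it only assumes the common Koschmieder relation k = 3/V between extinction and meteorological visibility, and out-and-back exponential attenuation:

```python
import numpy as np

def backscatter_signal(V, d_max=50.0, n=5000):
    """Relative integrated single-scattering return for visibility V (m)."""
    k = 3.0 / V                                # Koschmieder extinction, 1/m
    d = np.linspace(0.5, d_max, n)             # range, m (avoid the d = 0 pole)
    integrand = k * np.exp(-2.0*k*d) / d**2    # scatter * two-way attenuation
    return integrand.sum() * (d[1] - d[0])     # simple Riemann sum

# Denser fog (smaller V, larger k) returns a stronger backscatter signal,
# which is what degrades the visibility signal-to-noise ratio
dense, light = backscatter_signal(50.0), backscatter_signal(200.0)
```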

  8. Tank 241-AN-104, cores 163 and 164 analytical results for the final report

    International Nuclear Information System (INIS)

    Steen, F.H.

    1997-01-01

    This document is the analytical laboratory report for tank 241-AN-104 push mode core segments collected between August 8, 1996 and September 12, 1996. The segments were subsampled and analyzed in accordance with the Tank 241-AN-104 Push Mode Core Sampling and Analysis Plan (TSAP) (Winkelman, 1996), the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995) and the Flammable Gas Data Quality Objective (DQO) (Benar, 1995). The analytical results are included in a data summary table. None of the samples submitted for Differential Scanning Calorimetry (DSC), Total Alpha Activity (AT), Total Organic Carbon (TOC) and plutonium analyses (239,240Pu) exceeded notification limits as stated in the TSAP. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997) and are not considered in this report

  9. Tank 241-AX-103, cores 212 and 214 analytical results for the final report

    International Nuclear Information System (INIS)

    Steen, F.H.

    1998-01-01

    This document is the analytical laboratory report for tank 241-AX-103 push mode core segments collected between July 30, 1997 and August 11, 1997. The segments were subsampled and analyzed in accordance with the Tank 241-AX-103 Push Mode Core Sampling and Analysis Plan (TSAP) (Conner, 1997), the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995) and the Data Quality Objective to Support Resolution of the Organic Complexant Safety Issue (Organic DQO) (Turner, et al., 1995). The analytical results are included in the data summary table (Table 1). None of the samples submitted for Differential Scanning Calorimetry (DSC), Total Alpha Activity (AT), plutonium 239 (Pu239), and Total Organic Carbon (TOC) exceeded notification limits as stated in the TSAP (Conner, 1997). The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997) and are not considered in this report

  10. Analytical results for entanglement in the five-qubit anisotropic Heisenberg model

    International Nuclear Information System (INIS)

    Wang Xiaoguang

    2004-01-01

    We solve the eigenvalue problem of the five-qubit anisotropic Heisenberg model, without use of Bethe's ansatz, and give analytical results for entanglement and mixedness of two nearest-neighbor qubits. The entanglement takes its maximum at Δ=1 (Δ>1) for the case of zero (finite) temperature with Δ being the anisotropic parameter. In contrast, the mixedness takes its minimum at Δ=1 (Δ>1) for the case of zero (finite) temperature
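The quoted behaviour is easy to probe numerically. The sketch below (the inverse temperature β and other numerical choices are illustrative, not taken from the paper) diagonalizes the five-qubit XXZ ring, forms a low-temperature thermal state, and evaluates the Wootters concurrence of two nearest-neighbour qubits:

```python
import numpy as np
from functools import reduce

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def embed(site_ops, n=5):
    """Tensor product with single-qubit operators at the given sites."""
    mats = [I2] * n
    for site, op in site_ops:
        mats[site] = op
    return reduce(np.kron, mats)

def xxz_ring(delta, n=5):
    """H = sum_i (sx.sx + sy.sy + delta*sz.sz) on a periodic ring."""
    H = np.zeros((2**n, 2**n), dtype=complex)
    for i in range(n):
        j = (i + 1) % n
        H += (embed([(i, sx), (j, sx)]) + embed([(i, sy), (j, sy)])
              + delta * embed([(i, sz), (j, sz)]))
    return H

def nn_concurrence(delta, beta=20.0, n=5):
    """Wootters concurrence of qubits (0, 1) in the thermal (Gibbs) state."""
    w, v = np.linalg.eigh(xxz_ring(delta, n))
    p = np.exp(-beta * (w - w.min()))
    rho = (v * (p / p.sum())) @ v.conj().T
    # Partial trace over qubits 2..n-1 leaves the nearest-neighbour pair
    rho01 = np.trace(rho.reshape(4, 2**(n-2), 4, 2**(n-2)),
                     axis1=1, axis2=3)
    yy = np.kron(sy, sy)
    lam = np.sort(np.sqrt(np.abs(np.linalg.eigvals(
        rho01 @ yy @ rho01.conj() @ yy))))[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

C = nn_concurrence(1.0)   # near the isotropic point delta = 1
```

Scanning `delta` with this function is one way to reproduce the dependence of entanglement and mixedness on the anisotropy parameter reported in the abstract.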

  11. Analytical validation of an ultraviolet-visible procedure for determining lutein concentration and application to lutein-loaded nanoparticles.

    Science.gov (United States)

    Silva, Jéssica Thaís do Prado; Silva, Anderson Clayton da; Geiss, Julia Maria Tonin; de Araújo, Pedro Henrique Hermes; Becker, Daniela; Bracht, Lívia; Leimann, Fernanda Vitória; Bona, Evandro; Guerra, Gustavo Petri; Gonçalves, Odinei Hess

    2017-09-01

    Lutein is a carotenoid presenting known anti-inflammatory and antioxidant properties. Lutein-rich diets have been associated with neurological improvement as well as reduction of the risk of vision loss due to Age-Related Macular Degeneration (AMD). Micro- and nanoencapsulation have been shown to be effective techniques for protecting lutein against degradation and for improving its bioavailability. However, the actual lutein concentration inside the capsules and the encapsulation efficiency are key parameters that must be precisely known when designing in vitro and in vivo tests. In this work an analytical procedure was validated for the determination of the actual lutein content in zein nanoparticles using ultraviolet-visible spectroscopy. Method validation followed the International Conference on Harmonisation (ICH) guidelines, which evaluate linearity, detection limit, quantification limit, accuracy and precision. The validated methodology was applied to characterize lutein-loaded nanoparticles. Copyright © 2017 Elsevier Ltd. All rights reserved.
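For a linear calibration, the ICH figures of merit reduce to simple regression statistics. A hedged sketch with hypothetical lutein calibration data, using the common estimates LOD = 3.3σ/S and LOQ = 10σ/S (S the slope, σ the residual standard deviation):

```python
import numpy as np

def calibration_stats(conc, signal):
    """Slope, intercept, LOD and LOQ from a straight-line calibration."""
    conc, signal = np.asarray(conc, float), np.asarray(signal, float)
    slope, intercept = np.polyfit(conc, signal, 1)
    resid = signal - (slope*conc + intercept)
    sigma = np.sqrt((resid**2).sum() / (len(conc) - 2))  # residual std dev
    return slope, intercept, 3.3*sigma/slope, 10*sigma/slope

# Hypothetical absorbance readings for lutein standards, ug/mL
conc = [2, 4, 6, 8, 10]
absorbance = [0.101, 0.198, 0.303, 0.405, 0.498]
slope, intercept, lod, loq = calibration_stats(conc, absorbance)
```

Linearity would additionally be judged from the correlation coefficient and the residual pattern; accuracy and precision require replicate recovery experiments as in the record above.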

  12. Analytic model for ultrasound energy receivers and their optimal electric loads II: Experimental validation

    Science.gov (United States)

    Gorostiaga, M.; Wapler, M. C.; Wallrabe, U.

    2017-10-01

    In this paper, we verify the two optimal electric load concepts based on the zero reflection condition and on the power maximization approach for ultrasound energy receivers. We test a high loss 1-3 composite transducer, and find that the measurements agree very well with the predictions of the analytic model for plate transducers that we have developed previously. Additionally, we also confirm that the power maximization and zero reflection loads are very different when the losses in the receiver are high. Finally, we compare the optimal load predictions by the KLM and the analytic models with frequency dependent attenuation to evaluate the influence of the viscosity.
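The distinction between the two load concepts can be illustrated with a loose lumped-circuit analogy (this is not the paper's acoustic plate model, and the impedance values are hypothetical): conjugate matching maximizes delivered power, while the alternative choice ZL = Zs generally delivers less when the source is lossy or reactive:

```python
# Hypothetical source impedance seen at the receiver port, and open-circuit
# voltage amplitude; both values are assumed for illustration only.
Zs = 50 + 80j   # ohms
V = 1.0         # volts

def load_power(ZL):
    I = V / (Zs + ZL)                  # series-circuit current phasor
    return 0.5 * abs(I)**2 * ZL.real   # time-averaged power in the load

P_conj = load_power(Zs.conjugate())    # power-maximizing (conjugate) load
P_match = load_power(Zs)               # non-conjugate "matched" choice
```

The two loads coincide only when the reactive part of Zs is negligible, mirroring the paper's observation that the power-maximization and zero-reflection loads diverge for a high-loss receiver.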

  13. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    Science.gov (United States)

    2015-01-01

    Background Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893

  14. Study of a vibrating plate: comparison between experimental (ESPI) and analytical results

    Science.gov (United States)

    Romero, G.; Alvarez, L.; Alanís, E.; Nallim, L.; Grossi, R.

    2003-07-01

    Real-time electronic speckle pattern interferometry (ESPI) was used for tuning and visualization of natural frequencies of a trapezoidal plate. The plate was excited to resonant vibration by a sinusoidal acoustical source, which provided a continuous range of audio frequencies. Fringe patterns produced during the time-average recording of the vibrating plate—corresponding to several resonant frequencies—were registered. From these interferograms, calculations of vibrational amplitudes by means of zero-order Bessel functions were performed in some particular cases. The system was also studied analytically. The analytical approach developed is based on the Rayleigh-Ritz method and on the use of non-orthogonal right triangular co-ordinates. The deflection of the plate is approximated by a set of beam characteristic orthogonal polynomials generated by using the Gram-Schmidt procedure. A high degree of correlation between computational analysis and experimental results was observed.

  15. Practical approach to a procedure for judging the results of analytical verification measurements

    International Nuclear Information System (INIS)

    Beyrich, W.; Spannagel, G.

    1979-01-01

    For practical safeguards, a particularly transparent procedure is described for judging analytical differences between declared and verified values, based on experimental data relevant to the actual status of the measurement technique concerned. Essentially it consists of two parts: derivation of distribution curves for the occurrence of interlaboratory differences from the results of analytical intercomparison programmes; and judging of observed differences using criteria established on the basis of these probability curves. By courtesy of the Euratom Safeguards Directorate, Luxembourg, the applicability of this judging procedure has been checked in practical data verification for safeguarding; the experience gained was encouraging and implementation of the method is intended. Its reliability might be improved further by the evaluation of additional experimental data. (author)
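The two-part procedure can be sketched as follows (the intercomparison differences below are simulated, purely to illustrate the mechanics):

```python
import numpy as np

# Part 1: distribution of interlaboratory differences (in %) derived from
# simulated intercomparison programme results
rng = np.random.default_rng(1)
intercomparison_diffs = rng.normal(0.0, 0.4, size=200)

# Judging criterion, e.g. the 95th percentile of the absolute differences
limit = np.percentile(np.abs(intercomparison_diffs), 95)

# Part 2: judge an observed declared-vs-verified difference
def judge(declared, verified):
    return "accept" if abs(declared - verified) <= limit else "investigate"

verdict_small = judge(10.00, 10.25)
verdict_large = judge(10.00, 12.00)
```

In the actual procedure the distribution curves come from real intercomparison programmes rather than a parametric simulation, which is exactly why the criteria track the current state of the measurement technique.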

  16. A Complete Validated Learning Analytics Framework: Designing Issues from Data Preparation Perspective

    Science.gov (United States)

    Tlili, Ahmed; Essalmi, Fathi; Jemni, Mohamed; Kinshuk; Chen, Nian-Shing

    2018-01-01

    With the rapid growth of online education in recent years, Learning Analytics (LA) has gained increasing attention from researchers and educational institutions as an area which can improve the overall effectiveness of learning experiences. However, the lack of guidelines on what should be taken into consideration during application of LA hinders…

  17. Urban roughness mapping validation techniques and some first results

    NARCIS (Netherlands)

    Bottema, M; Mestayer, PG

    1998-01-01

    Because of measuring problems related to evaluation of urban roughness parameters, a new approach using a roughness mapping tool has been tested: evaluation of roughness length z(o) and zero displacement z(d) from cadastral databases. Special attention needs to be given to the validation of the

  18. Validation of analytical methods for the quality control of Naproxen suppositories

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar; Hernandez Contreras, Orestes Yuniel

    2011-01-01

    The analysis methods to be used for the quality control of the future Cuban-made Naproxen suppositories for adults and children were developed for the first time in this paper. A method based on direct ultraviolet spectrophotometry was put forward, which proved to be specific, linear, accurate and precise for the quality control of Naproxen suppositories, taking into account the presence of chromophore groups in its structure. Likewise, the direct semi-aqueous acid-base volumetric method aimed at the quality control of the Naproxen raw material was modified and adapted to the quality control of suppositories. The validation process demonstrated the adequate specificity of this method with respect to the formulation components, as well as its linearity, accuracy and precision in the 1-3 mg/mL range. The final results were compared and no statistically significant differences among the replicas for each dose were found for either method; therefore, both may be used in the quality control of Naproxen suppositories

  19. Rigorous results of low-energy models of the analytic S-matrix theory

    International Nuclear Information System (INIS)

    Meshcheryakov, V.A.

    1974-01-01

    Results of analytic S-matrix theory, mainly dealing with the static limit of dispersion relations, are applied to pion-nucleon scattering in the low-energy region. Various approaches to solving equations of the Chew-Low type are discussed. It is concluded that interesting results are obtained by reducing the equations to a system of nonlinear difference equations, the crucial element of this approach being the study of functions on the whole Riemann surface. Boundary and crossing symmetry conditions are studied. (HFdV)

  20. Analytic results for planar three-loop integrals for massive form factors

    Energy Technology Data Exchange (ETDEWEB)

    Henn, Johannes M. [PRISMA Cluster of Excellence, Johannes Gutenberg Universität Mainz,55099 Mainz (Germany); Kavli Institute for Theoretical Physics, UC Santa Barbara,Santa Barbara (United States); Smirnov, Alexander V. [Research Computing Center, Moscow State University,119992 Moscow (Russian Federation); Smirnov, Vladimir A. [Skobeltsyn Institute of Nuclear Physics of Moscow State University,119992 Moscow (Russian Federation); Institut für Theoretische Teilchenphysik, Karlsruhe Institute of Technology (KIT),76128 Karlsruhe (Germany)

    2016-12-28

    We use the method of differential equations to analytically evaluate all planar three-loop Feynman integrals relevant for form factor calculations involving massive particles. Our results for ninety master integrals at general q{sup 2} are expressed in terms of multiple polylogarithms, and results for fifty-one master integrals at the threshold q{sup 2}=4m{sup 2} are expressed in terms of multiple polylogarithms of argument one, with indices equal to zero or to a sixth root of unity.

  1. Results from the Savannah River Laboratory model validation workshop

    International Nuclear Information System (INIS)

    Pepper, D.W.

    1981-01-01

    To evaluate existing and newly developed air pollution models used in DOE-funded laboratories, the Savannah River Laboratory sponsored a model validation workshop. The workshop used Kr-85 measurements and meteorology data obtained at SRL during 1975 to 1977. Individual laboratories used models to calculate daily, weekly, monthly or annual test periods. Cumulative integrated air concentrations were reported at each grid point and at each of the eight sampler locations

  2. Configuration and validation of an analytical model predicting secondary neutron radiation in proton therapy using Monte Carlo simulations and experimental measurements.

    Science.gov (United States)

    Farah, J; Bonfrate, A; De Marzi, L; De Oliveira, A; Delacroix, S; Martinetti, F; Trompier, F; Clairand, I

    2015-05-01

    This study focuses on the configuration and validation of an analytical model predicting leakage neutron doses in proton therapy. Using Monte Carlo (MC) calculations, a facility-specific analytical model was built to reproduce out-of-field neutron doses while separately accounting for the contribution of intra-nuclear cascade, evaporation, epithermal and thermal neutrons. This model was first trained to reproduce in-water neutron absorbed doses and in-air neutron ambient dose equivalents, H*(10), calculated using MCNPX. Its capacity in predicting out-of-field doses at any position not involved in the training phase was also checked. The model was next expanded to enable a full 3D mapping of H*(10) inside the treatment room, tested in a clinically relevant configuration and finally consolidated with experimental measurements. Following the literature approach, the work first proved that it is possible to build a facility-specific analytical model that efficiently reproduces in-water neutron doses and in-air H*(10) values with a maximum difference less than 25%. In addition, the analytical model succeeded in predicting out-of-field neutron doses in the lateral and vertical direction. Testing the analytical model in clinical configurations proved the need to separate the contribution of internal and external neutrons. The impact of modulation width on stray neutrons was found to be easily adjustable while beam collimation remains a challenging issue. Finally, the model performance agreed with experimental measurements with satisfactory results considering measurement and simulation uncertainties. Analytical models represent a promising solution that substitutes for time-consuming MC calculations when assessing doses to healthy organs. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  3. Development and validation of a simple high-performance liquid chromatography analytical method for simultaneous determination of phytosterols, cholesterol and squalene in parenteral lipid emulsions.

    Science.gov (United States)

    Novak, Ana; Gutiérrez-Zamora, Mercè; Domenech, Lluís; Suñé-Negre, Josep M; Miñarro, Montserrat; García-Montoya, Encarna; Llop, Josep M; Ticó, Josep R; Pérez-Lozano, Pilar

    2018-02-01

    A simple analytical method for simultaneous determination of phytosterols, cholesterol and squalene in lipid emulsions was developed owing to increased interest in their clinical effects. Method development was based on commonly used stationary (C18, C8 and phenyl) and mobile phases (mixtures of acetonitrile, methanol and water) under isocratic conditions. Differences in stationary phases resulted in peak overlapping or coelution of different peaks. The best separation of all analyzed compounds was achieved on Zorbax Eclipse XDB C8 (150 × 4.6 mm, 5 μm; Agilent) and ACN-H2O-MeOH, 80:19.5:0.5 (v/v/v). In order to achieve a shorter time of analysis, the method was further optimized and gradient separation was established. The optimized analytical method was validated and tested for routine use in lipid emulsion analyses. Copyright © 2017 John Wiley & Sons, Ltd.

  4. Validation of Prototype Continuous Real-Time Vital Signs Video Analytics Monitoring System CCATT Viewer

    Science.gov (United States)

    2018-01-26

    traditional monitors, this capability will facilitate management of a group of patients. Innovative visual analytics of the complex array of real-time... redundant system could be useful in managing hundreds of bedside monitor data sources. With too many data sources, a single central server may suffer... collection rate. 3.2 Viewer Elements Design. For detailed elements to display, as well as their color, line styles, and locations on the screen, we

  5. Validação de metodologia analítica para doseamento de soluções de lapachol por CLAE Validation of the analytical methodology for evaluation of lapachol in solution by HPLC

    Directory of Open Access Journals (Sweden)

    Said G. C. Fonseca

    2004-02-01

    Full Text Available Lapachol is a naphthoquinone found in several species of the Bignoniaceae family possessing mainly anticancer activity. The present work consists of the development and validation of analytical methodology for lapachol and its preparations. The results here obtained show that lapachol has a low quantification limit, that the analytical methodology is accurate, reproducible, robust and linear over the concentration range 0.5-100 µg/mL of lapachol.

  6. Parametric validations of analytical lifetime estimates for radiation belt electron diffusion by whistler waves

    Directory of Open Access Journals (Sweden)

    A. V. Artemyev

    2013-04-01

    Full Text Available The lifetimes of electrons trapped in Earth's radiation belts can be calculated from quasi-linear pitch-angle diffusion by whistler-mode waves, provided that their frequency spectrum is broad enough and/or their average amplitude is not too large. Extensive comparisons between improved analytical lifetime estimates and full numerical calculations have been performed in a broad parameter range representative of a large part of the magnetosphere from L ~ 2 to 6. The effects of observed very oblique whistler waves are taken into account in both numerical and analytical calculations. Analytical lifetimes (and pitch-angle diffusion coefficients) are found to be in good agreement with full numerical calculations based on CRRES and Cluster hiss and lightning-generated wave measurements inside the plasmasphere and Cluster lower-band chorus wave measurements in the outer belt for electron energies ranging from 100 keV to 5 MeV. Comparisons with lifetimes recently obtained from electron flux measurements on SAMPEX, SCATHA, SAC-C and DEMETER also show reasonable agreement.

  7. An Analytical Model of Leakage Neutron Equivalent Dose for Passively-Scattered Proton Radiotherapy and Validation with Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Christopher; Newhauser, Wayne, E-mail: newhauser@lsu.edu [Department of Physics and Astronomy, Louisiana State University and Agricultural and Mechanical College, 202 Nicholson Hall, Baton Rouge, LA 70803 (United States); Mary Bird Perkins Cancer Center, 4950 Essen Lane, Baton Rouge, LA 70809 (United States); Farah, Jad [Institut de Radioprotection et de Sûreté Nucléaire, Service de Dosimétrie Externe, BP-17, 92262 Fontenay-aux-Roses (France)

    2015-05-18

    Exposure to stray neutrons increases the risk of second cancer development after proton therapy. Previously reported analytical models of this exposure were difficult to configure and had not been investigated below 100 MeV proton energy. The purposes of this study were to test an analytical model of neutron equivalent dose per therapeutic absorbed dose (H/D) at 75 MeV and to improve the model by reducing the number of configuration parameters and making it continuous in proton energy from 100 to 250 MeV. To develop the analytical model, we used previously published H/D values in water from Monte Carlo simulations of a general-purpose beamline for proton energies from 100 to 250 MeV. We also configured and tested the model on in-air neutron equivalent doses measured for a 75 MeV ocular beamline. Predicted H/D values from the analytical model and Monte Carlo agreed well from 100 to 250 MeV (10% average difference). Predicted H/D values from the analytical model also agreed well with measurements at 75 MeV (15% average difference). The results indicate that analytical models can give fast, reliable calculations of neutron exposure after proton therapy. This ability is absent in treatment planning systems but vital to second cancer risk estimation.

  8. An Analytical Model of Leakage Neutron Equivalent Dose for Passively-Scattered Proton Radiotherapy and Validation with Measurements

    International Nuclear Information System (INIS)

    Schneider, Christopher; Newhauser, Wayne; Farah, Jad

    2015-01-01

    Exposure to stray neutrons increases the risk of second cancer development after proton therapy. Previously reported analytical models of this exposure were difficult to configure and had not been investigated below 100 MeV proton energy. The purposes of this study were to test an analytical model of neutron equivalent dose per therapeutic absorbed dose (H/D) at 75 MeV and to improve the model by reducing the number of configuration parameters and making it continuous in proton energy from 100 to 250 MeV. To develop the analytical model, we used previously published H/D values in water from Monte Carlo simulations of a general-purpose beamline for proton energies from 100 to 250 MeV. We also configured and tested the model on in-air neutron equivalent doses measured for a 75 MeV ocular beamline. Predicted H/D values from the analytical model and Monte Carlo agreed well from 100 to 250 MeV (10% average difference). Predicted H/D values from the analytical model also agreed well with measurements at 75 MeV (15% average difference). The results indicate that analytical models can give fast, reliable calculations of neutron exposure after proton therapy. This ability is absent in treatment planning systems but vital to second cancer risk estimation

  9. An analytical model of leakage neutron equivalent dose for passively-scattered proton radiotherapy and validation with measurements.

    Science.gov (United States)

    Schneider, Christopher; Newhauser, Wayne; Farah, Jad

    2015-05-18

    Exposure to stray neutrons increases the risk of second cancer development after proton therapy. Previously reported analytical models of this exposure were difficult to configure and had not been investigated below 100 MeV proton energy. The purposes of this study were to test an analytical model of neutron equivalent dose per therapeutic absorbed dose  at 75 MeV and to improve the model by reducing the number of configuration parameters and making it continuous in proton energy from 100 to 250 MeV. To develop the analytical model, we used previously published H/D values in water from Monte Carlo simulations of a general-purpose beamline for proton energies from 100 to 250 MeV. We also configured and tested the model on in-air neutron equivalent doses measured for a 75 MeV ocular beamline. Predicted H/D values from the analytical model and Monte Carlo agreed well from 100 to 250 MeV (10% average difference). Predicted H/D values from the analytical model also agreed well with measurements at 75 MeV (15% average difference). The results indicate that analytical models can give fast, reliable calculations of neutron exposure after proton therapy. This ability is absent in treatment planning systems but vital to second cancer risk estimation.
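The "average difference" figures quoted above (10% and 15%) summarize model-to-reference agreement; a minimal sketch of such a metric, with hypothetical H/D values (the function name and numbers are illustrative, not from the study):

```python
def mean_percent_difference(model, reference):
    """Average absolute percent difference between model predictions and
    reference values, of the kind used to summarize agreement above."""
    pairs = list(zip(model, reference))
    return 100.0 * sum(abs(m - r) / r for m, r in pairs) / len(pairs)

# Hypothetical H/D values in mSv/Gy (illustrative numbers only).
model_hd = [1.1, 2.0, 3.3]
mc_hd = [1.0, 2.0, 3.0]
avg_diff = mean_percent_difference(model_hd, mc_hd)  # about 6.7%
```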

  10. Electron Beam Return-Current Losses in Solar Flares: Initial Comparison of Analytical and Numerical Results

    Science.gov (United States)

    Holman, Gordon

    2010-01-01

    Accelerated electrons play an important role in the energetics of solar flares. Understanding the process or processes that accelerate these electrons to high, nonthermal energies also depends on understanding the evolution of these electrons between the acceleration region and the region where they are observed through their hard X-ray or radio emission. Energy losses in the co-spatial electric field that drives the current-neutralizing return current can flatten the electron distribution toward low energies. This in turn flattens the corresponding bremsstrahlung hard X-ray spectrum toward low energies. The lost electron beam energy also enhances heating in the coronal part of the flare loop. Extending earlier work by Knight & Sturrock (1977), Emslie (1980), Diakonov & Somov (1988), and Litvinenko & Somov (1991), I have derived analytical and semi-analytical results for the nonthermal electron distribution function and the self-consistent electric field strength in the presence of a steady-state return-current. I review these results, presented previously at the 2009 SPD Meeting in Boulder, CO, and compare them and computed X-ray spectra with numerical results obtained by Zharkova & Gordovskii (2005, 2006). The physical significance of similarities and differences in the results will be emphasized. This work is supported by NASA's Heliophysics Guest Investigator Program and the RHESSI Project.

  11. Tank 241-U-106, cores 147 and 148, analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    Steen, F.H.

    1996-09-27

    This document is the final report deliverable for tank 241-U-106 push mode core segments collected between May 8, 1996 and May 10, 1996 and received by the 222-S Laboratory between May 14, 1996 and May 16, 1996. The segments were subsampled and analyzed in accordance with the Tank 241-U-106 Push Mode Core Sampling and Analysis Plan (TSAP), the Historical Model Evaluation Data Requirements (Historical DQO), the Data Quality Objective to Support Resolution of the Organic Complexant Safety Issue (Organic DQO) and the Safety Screening Data Quality Objective (DQO). The analytical results are included in Table 1.

  12. Solar neutrino masses and mixing from bilinear R-parity broken supersymmetry: Analytical versus numerical results

    Science.gov (United States)

    Díaz, M.; Hirsch, M.; Porod, W.; Romão, J.; Valle, J.

    2003-07-01

    We give an analytical calculation of solar neutrino masses and mixing at one-loop order within bilinear R-parity breaking supersymmetry, and compare our results to the exact numerical calculation. Our method is based on a systematic perturbative expansion of R-parity violating vertices to leading order. We find in general quite good agreement between the approximate and full numerical calculations, but the approximate expressions are much simpler to implement. Our formalism works especially well for the case of the large mixing angle Mikheyev-Smirnov-Wolfenstein solution, now strongly favored by the recent KamLAND reactor neutrino data.

  13. Why do ultrasoft repulsive particles cluster and crystallize? Analytical results from density-functional theory.

    Science.gov (United States)

    Likos, Christos N; Mladek, Bianca M; Gottwald, Dieter; Kahl, Gerhard

    2007-06-14

    We demonstrate the accuracy of the hypernetted chain closure and of the mean-field approximation for the calculation of the fluid-state properties of systems interacting by means of bounded and positive pair potentials with oscillating Fourier transforms. Subsequently, we prove the validity of a bilinear, random-phase density functional for arbitrary inhomogeneous phases of the same systems. On the basis of this functional, we calculate analytically the freezing parameters of the latter. We demonstrate explicitly that the stable crystals feature a lattice constant that is independent of density and whose value is dictated by the position of the negative minimum of the Fourier transform of the pair potential. This property is equivalent to the existence of clusters, whose population scales proportionally to the density. We establish that regardless of the form of the interaction potential and of the location on the freezing line, all cluster crystals have a universal Lindemann ratio Lf=0.189 at freezing. We further make an explicit link between the aforementioned density functional and the harmonic theory of crystals. This allows us to establish an equivalence between the emergence of clusters and the existence of negative Fourier components of the interaction potential. Finally, we make a connection between the class of models at hand and the system of infinite-dimensional hard spheres, when the limits of interaction steepness and space dimension are both taken to infinity in a particularly described fashion.

  14. Complex dynamics of memristive circuits: Analytical results and universal slow relaxation

    Science.gov (United States)

    Caravelli, F.; Traversa, F. L.; Di Ventra, M.

    2017-02-01

    Networks with memristive elements (resistors with memory) are being explored for a variety of applications ranging from unconventional computing to models of the brain. However, analytical results that highlight the role of the graph connectivity on the memory dynamics are still few, thus limiting our understanding of these important dynamical systems. In this paper, we derive an exact matrix equation of motion that takes into account all the network constraints of a purely memristive circuit, and we employ it to derive analytical results regarding its relaxation properties. We are able to describe the memory evolution in terms of orthogonal projection operators onto the subspace of fundamental loop space of the underlying circuit. This orthogonal projection explicitly reveals the coupling between the spatial and temporal sectors of the memristive circuits and compactly describes the circuit topology. For the case of disordered graphs, we are able to explain the emergence of a power-law relaxation as a superposition of exponential relaxation times with a broad range of scales using random matrices. This power law is also universal, namely independent of the topology of the underlying graph but dependent only on the density of loops. In the case of circuits subject to alternating voltage instead, we are able to obtain an approximate solution of the dynamics, which is tested against a specific network topology. These results suggest a much richer dynamics of memristive networks than previously considered.
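The power-law relaxation described above, emerging as a superposition of exponential relaxations with a broad range of time scales, is a generic mechanism that can be illustrated numerically; in this sketch the exponent and rate range are arbitrary choices, not values from the paper:

```python
import numpy as np

# Superpose exponential decays whose rates span many scales.
# A rate density p(lam) ~ lam**(alpha - 1) yields relaxation ~ t**(-alpha)
# in the scaling window; on a log-spaced grid the integration measure
# contributes an extra factor of lam, hence weights ~ lam**alpha.
alpha = 0.5
lam = np.logspace(-4, 4, 4000)
weights = lam ** alpha
weights /= weights.sum()

t = np.array([0.1, 1.0, 10.0])
relax = np.array([(weights * np.exp(-lam * ti)).sum() for ti in t])

# Local log-log slope over two decades, expected to be close to -alpha.
slope = np.log(relax[2] / relax[0]) / np.log(t[2] / t[0])
```

The slope stays near -alpha as long as both cutoff rates lie far outside the observation window, which is the "broad range of scales" condition stated in the abstract.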

  15. Tank 241-AW-105, grab samples, analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final report for tank 241-AW-105 grab samples. Twenty grab samples were collected from risers 10A and 15A on August 20 and 21, 1996, of which eight were designated for the K Basin sludge compatibility and mixing studies. This document presents the analytical results for the remaining twelve samples. Analyses were performed in accordance with the Compatibility Grab Sampling and Analysis Plan (TSAP) and the Data Quality Objectives for Tank Farms Waste Compatibility Program (DQO). The results for the previous sampling of this tank were reported in WHC-SD-WM-DP-149, Rev. 0, 60-Day Waste Compatibility Safety Issue and Final Results for Tank 241-AW-105, Grab Samples 5AW-95-1, 5AW-95-2 and 5AW-95-3. Three supernate samples exceeded the TOC notification limit (30,000 µg C/g dry weight). Appropriate notifications were made. No immediate notifications were required for any other analyte. The TSAP requested analyses for polychlorinated biphenyls (PCB) for all liquid and centrifuged solid subsamples. The PCB analysis of the liquid samples has been delayed and will be presented in a revision to this document

  16. Factor Analytic Validation of the Ford, Wolvin, and Chung Listening Competence Scale

    Science.gov (United States)

    Mickelson, William T.; Welch, S. A.

    2012-01-01

    This research begins to independently and quantitatively validate the Ford, Wolvin, and Chung (2000) Listening Competency Scale. Reliability and Confirmatory Factor analyses were conducted on two independent samples. The reliability estimates were found to be below those reported by Ford, Wolvin, and Chung (2000) and below acceptable levels for…

  17. Analytic nuclear scattering theories

    International Nuclear Information System (INIS)

    Di Marzio, F.; University of Melbourne, Parkville, VIC

    1999-01-01

    A wide range of nuclear reactions are examined in an analytical version of the usual distorted wave Born approximation. This new approach provides either semi-analytic or fully analytic descriptions of the nuclear scattering processes. The resulting computational simplifications, when used within the limits of validity, allow very detailed tests of both nuclear interaction models and large-basis models of nuclear structure to be performed.

  18. Validation of an analytical method for determining halothane in urine as an instrument for evaluating occupational exposure

    International Nuclear Information System (INIS)

    Gonzalez Chamorro, Rita Maria; Jaime Novas, Arelis; Diaz Padron, Heliodora

    2010-01-01

    Occupational exposure to harmful substances can produce significant changes in the normal physiology of the organism when adequate safety measures are not taken in time in a workplace where the risk is present. Among the chemical risks that may affect workers' health are the inhalable anesthetic agents. As a first step toward introducing an epidemiological surveillance system for this personnel, an analytical method for determining this anesthetic in urine was validated under the instrumental conditions available in our laboratory. The validation took into account the following parameters: specificity, linearity, precision, accuracy, detection limit and quantification limit, and the uncertainty of the method was calculated. The validation procedure showed that the technique is specific and precise; the detection limit was 0.118 μg/L and the quantification limit 0.354 μg/L. The global uncertainty was 0.243 and the expanded uncertainty 0.486. The validated method, together with the later introduction of biological exposure limits, will serve as an auxiliary diagnostic tool allowing periodic monitoring of personnel exposure
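The figures quoted above are mutually consistent under the usual conventions (LOQ = 3 × LOD, and expanded uncertainty U = k·u with coverage factor k = 2); treating those conventions as assumptions, a quick consistency check:

```python
# Reported validation figures (from the abstract above).
lod = 0.118          # detection limit, ug/L
loq = 0.354          # quantification limit, ug/L
u_global = 0.243     # global (combined) uncertainty
u_expanded = 0.486   # expanded uncertainty

# Assumed conventions: LOQ = 3 * LOD, and U = k * u with k = 2.
assert abs(loq - 3 * lod) < 1e-9
assert abs(u_expanded - 2 * u_global) < 1e-9
```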

  19. Tank 241-TX-118, core 236 analytical results for the final report

    International Nuclear Information System (INIS)

    ESCH, R.A.

    1998-01-01

    This document is the analytical laboratory report for tank 241-TX-118 push mode core segments collected between April 1, 1998 and April 13, 1998. The segments were subsampled and analyzed in accordance with the Tank 241-TX-118 Push Mode Core sampling and Analysis Plan (TSAP) (Benar, 1997), the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995), the Data Quality Objective to Support Resolution of the Organic Complexant Safety Issue (Organic DQO) (Turner, et al, 1995) and the Historical Model Evaluation Data Requirements (Historical DQO) (Sipson, et al., 1995). The analytical results are included in the data summary table (Table 1). None of the samples submitted for Differential Scanning Calorimetry (DSC) and Total Organic Carbon (TOC) exceeded notification limits as stated in the TSAP (Benar, 1997). One sample exceeded the Total Alpha Activity (AT) analysis notification limit of 38.4microCi/g (based on a bulk density of 1.6), core 236 segment 1 lower half solids (S98T001524). Appropriate notifications were made. Plutonium 239/240 analysis was requested as a secondary analysis. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997) and are not considered in this report

  20. Results of an interlaboratory comparison of analytical methods for contaminants of emerging concern in water.

    Science.gov (United States)

    Vanderford, Brett J; Drewes, Jörg E; Eaton, Andrew; Guo, Yingbo C; Haghani, Ali; Hoppe-Jones, Christiane; Schluesener, Michael P; Snyder, Shane A; Ternes, Thomas; Wood, Curtis J

    2014-01-07

    An evaluation of existing analytical methods used to measure contaminants of emerging concern (CECs) was performed through an interlaboratory comparison involving 25 research and commercial laboratories. In total, 52 methods were used in the single-blind study to determine method accuracy and comparability for 22 target compounds, including pharmaceuticals, personal care products, and steroid hormones, all at ng/L levels in surface and drinking water. Method biases varied widely among compounds; caffeine, NP, OP, and triclosan had false positive rates >15%. In addition, some methods reported false positives for 17β-estradiol and 17α-ethynylestradiol in unspiked drinking water and deionized water, respectively, at levels higher than published predicted no-effect concentrations for these compounds in the environment. False negative rates were also generally low; false results were attributed to contamination, misinterpretation of background interferences, and/or inappropriate setting of detection/quantification levels for analysis at low ng/L levels. The results of both comparisons were collectively assessed to identify parameters that resulted in the best overall method performance. Liquid chromatography-tandem mass spectrometry coupled with the calibration technique of isotope dilution was able to accurately quantify most compounds with an average bias of <10% for both matrixes. These findings suggest that this method of analysis is suitable at environmentally relevant levels for most of the compounds studied. This work underscores the need for robust, standardized analytical methods for CECs to improve data quality, increase comparability between studies, and help reduce false positive and false negative rates.
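Method bias in such interlaboratory spike-recovery comparisons is conventionally the deviation of the measured value from the spiked (nominal) concentration; a minimal sketch with hypothetical numbers (illustrative only, not study data):

```python
def percent_bias(measured, spiked):
    """Method bias relative to the spiked (nominal) concentration, in percent."""
    return 100.0 * (measured - spiked) / spiked

# Hypothetical single-compound recoveries in ng/L (illustrative only).
low_bias = percent_bias(95.0, 100.0)    # under-recovery -> negative bias
high_bias = percent_bias(109.0, 100.0)  # over-recovery -> positive bias
```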

  1. Tank 241-T-203, core 190 analytical results for the final report

    International Nuclear Information System (INIS)

    Steen, F.H.

    1997-01-01

    This document is the analytical laboratory report for tank 241-T-203 push mode core segments collected on April 17, 1997 and April 18, 1997. The segments were subsampled and analyzed in accordance with the Tank 241-T-203 Push Mode Core Sampling and Analysis Plan (TSAP) (Schreiber, 1997a), the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995) and the Letter of Instruction for Core Sample Analysis of Tanks 241-T-201, 241-T-202, 241-T-203, and 241-T-204 (LOI) (Hall, 1997). The analytical results are included in the data summary report (Table 1). None of the samples submitted for Differential Scanning Calorimetry (DSC), Total Alpha Activity (AT) and Total Organic Carbon (TOC) exceeded notification limits as stated in the TSAP (Schreiber, 1997a). The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems (TWRS) Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997b) and are not considered in this report
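The "95% confidence interval on the mean" cited throughout these tank reports is a standard replicate-statistics calculation; a stdlib-only sketch (the replicate values and the t critical value below are illustrative assumptions, not report data):

```python
import statistics

def ci95_mean(values, t_crit):
    """Two-sided 95% confidence interval on the mean of replicate results.
    t_crit is the Student's t critical value for n - 1 degrees of freedom
    (e.g. 3.182 for n = 4); it is passed in to keep the sketch stdlib-only."""
    n = len(values)
    mean = statistics.fmean(values)
    half_width = t_crit * statistics.stdev(values) / n ** 0.5
    return mean - half_width, mean + half_width

# Hypothetical replicate analyte results (wt%); illustrative values only.
low, high = ci95_mean([1.02, 0.98, 1.05, 0.99], t_crit=3.182)
```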

  2. Calculations for Adjusting Endogenous Biomarker Levels During Analytical Recovery Assessments for Ligand-Binding Assay Bioanalytical Method Validation.

    Science.gov (United States)

    Marcelletti, John F; Evans, Cindy L; Saxena, Manju; Lopez, Adriana E

    2015-07-01

    It is often necessary to adjust for detectable endogenous biomarker levels in spiked validation samples (VS) and in selectivity determinations during bioanalytical method validation for ligand-binding assays (LBA) with a matrix like normal human serum (NHS). Described herein are case studies of biomarker analyses using multiplex LBA which highlight the challenges associated with such adjustments when calculating percent analytical recovery (%AR). The LBA test methods were the Meso Scale Discovery V-PLEX® proinflammatory and cytokine panels with NHS as test matrix. The NHS matrix blank exhibited varied endogenous content of the 20 individual cytokines before spiking, ranging from undetectable to readily quantifiable. Addition and subtraction methods for adjusting endogenous cytokine levels in %AR calculations are both used in the bioanalytical field. The two methods were compared in %AR calculations following spiking and analysis of VS for cytokines having detectable endogenous levels in NHS. Calculations for %AR obtained by subtracting quantifiable endogenous biomarker concentrations from the respective total analytical VS values yielded reproducible and credible conclusions. The addition method, in contrast, yielded %AR conclusions that were frequently unreliable and discordant with values obtained with the subtraction adjustment method. It is shown that subtraction of assay signal attributable to matrix is a feasible alternative when endogenous biomarkers levels are below the limit of quantitation, but above the limit of detection. These analyses confirm that the subtraction method is preferable over that using addition to adjust for detectable endogenous biomarker levels when calculating %AR for biomarker LBA.
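The two adjustment conventions compared above can be stated compactly; a minimal sketch with hypothetical cytokine values (the numbers are illustrative, not the study's) showing how the conventions diverge when the endogenous level is large relative to the spike:

```python
def ar_subtraction(total_measured, endogenous, spiked):
    """%AR with the quantifiable endogenous level subtracted from the total."""
    return 100.0 * (total_measured - endogenous) / spiked

def ar_addition(total_measured, endogenous, spiked):
    """%AR with the endogenous level added to the nominal spike instead."""
    return 100.0 * total_measured / (endogenous + spiked)

# Hypothetical concentrations in pg/mL: a small spike on top of a larger
# endogenous background (illustrative only).
endo, spike, total = 40.0, 10.0, 45.0
sub = ar_subtraction(total, endo, spike)  # recovers 50% of the spike
add = ar_addition(total, endo, spike)     # reads as 90% and masks the loss
```

The subtraction convention isolates recovery of the spike itself, which is why it stays interpretable even when the endogenous background dominates.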

  3. Comparison between numerical and analytical results on the required rf current for stabilizing neoclassical tearing modes

    Science.gov (United States)

    Wang, Xiaojing; Yu, Qingquan; Zhang, Xiaodong; Zhang, Yang; Zhu, Sizheng; Wang, Xiaoguang; Wu, Bin

    2018-04-01

    Numerical studies on the stabilization of neoclassical tearing modes (NTMs) by electron cyclotron current drive (ECCD) have been carried out based on reduced MHD equations, focusing on the amount of the required driven current for mode stabilization and the comparison with analytical results. The dependence of the minimum driven current required for NTM stabilization on some parameters, including the bootstrap current density, radial width of the driven current, radial deviation of the driven current from the resonant surface, and the island width when applying ECCD, are studied. By fitting the numerical results, simple expressions for these dependences are obtained. Analysis based on the modified Rutherford equation (MRE) has also been carried out, and the corresponding results have the same trend as numerical ones, while a quantitative difference between them exists. This difference becomes smaller when the applied radio frequency (rf) current is smaller.
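Fitting simple expressions to numerical scan results, as described above, is typically a least-squares fit of an assumed functional form; a hedged sketch using scipy (the power-law form, parameter names, and data are illustrative assumptions, not the paper's actual expressions):

```python
import numpy as np
from scipy.optimize import curve_fit

def power_law(w, a, b):
    """Assumed form I_min = a * w**b for the minimum driven current versus
    the radial width w of the driven current (illustrative only)."""
    return a * w ** b

# Synthetic stand-ins for numerical scan results.
w = np.array([0.5, 1.0, 2.0, 4.0])
i_min = 1.8 * w ** 0.7
params, _ = curve_fit(power_law, w, i_min, p0=(1.0, 1.0))
a_fit, b_fit = params
```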

  4. Development and validation of analytical methodology for determination of polycyclic aromatic hydrocarbons (PAHs) in sediments. Assessment of Pedroso Park dam, Santo Andre, SP

    International Nuclear Information System (INIS)

    Brito, Carlos Fernando de

    2009-01-01

    Polycyclic aromatic hydrocarbons (PAHs), because they are persistent contaminants, ubiquitous in the environment and recognized as genotoxic, have stimulated research aimed at determining and evaluating their sources, transport, processing, biological effects and accumulation in compartments of aquatic and terrestrial ecosystems. In this work, the matrix studied was sediment collected at the Pedroso Park dam in Santo Andre, SP. The analytical technique employed was reversed-phase liquid chromatography with a UV/Vis detector. Statistical treatment of the data was established while developing the methodology, ensuring reliable results. The steps involved were evaluated using the concept of Validation of Chemical Testing. The parameters selected for the analytical validation were selectivity, linearity, working range, sensitivity, accuracy, precision, limit of detection, limit of quantification and robustness. These parameters showed satisfactory results, allowing application of the methodology, which is a simple method that minimizes contamination and loss of compounds through over-handling. For the PAHs tested, no positive results above the limit of detection were found in any of the samples collected in the first phase. In the second collection, however, small amounts were found, mainly acenaphthylene, fluorene and benzo[a]anthracene. Although the area is preserved, slight signs of contamination can be seen. (author)

  5. 42 CFR 476.84 - Changes as a result of DRG validation.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Changes as a result of DRG validation. 476.84... § 476.84 Changes as a result of DRG validation. A provider or practitioner may obtain a review by a QIO... in DRG assignment as a result of QIO validation activities. ...

  6. Analytical Characterization of Rococo Paintings in Egypt: Preliminary Results from El-Gawhara Palace at Cairo

    Directory of Open Access Journals (Sweden)

    Fatma REFAAT

    2012-12-01

    Full Text Available El-Gawhara palace (1813–1814 AD) is situated south of the Mosque of Muhammad Ali in the Cairo Citadel. This palace is an important example of the best early 19th century rococo decorations in Egypt. The present study reports some of the results obtained from the application of different analytical techniques to characterize some rococo paintings at El-Gawhara palace at Cairo, Egypt. The characterization of the studied paintings was carried out by means of optical microscopy (OM), scanning electron microscopy equipped with an energy dispersive X-ray detector (SEM-EDS) and Fourier transform infrared spectroscopy (FT-IR). The obtained results allowed the identification of the chemical composition, structure and the painting technique employed in these paintings. This methodology reveals some useful information on some rococo paintings dating back to the 19th century in Egypt.

  7. Tank 241-T-204, core 188 analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    Nuzum, J.L.

    1997-07-24

    This document is the final laboratory report for Tank 241-T-204. Push mode core segments were removed from Riser 3 between March 27, 1997, and April 11, 1997. Segments were received and extruded at the 222-S Laboratory. Analyses were performed in accordance with the Tank 241-T-204 Push Mode Core Sampling and Analysis Plan (TSAP) (Winkleman, 1997), the Letter of Instruction for Core Sample Analysis of Tanks 241-T-201, 241-T-202, 241-T-203, and 241-T-204 (LOI) (Bell, 1997), and the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995). None of the subsamples submitted for total alpha activity (AT) or differential scanning calorimetry (DSC) analyses exceeded the notification limits stated in the DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group and are not considered in this report.

  8. Nonlinear heat conduction equations with memory: Physical meaning and analytical results

    Science.gov (United States)

    Artale Harris, Pietro; Garra, Roberto

    2017-06-01

    We study nonlinear heat conduction equations with memory effects within the framework of the fractional calculus approach to the generalized Maxwell-Cattaneo law. Our main aim is to derive the governing equations of heat propagation, considering both the empirical temperature-dependence of the thermal conductivity coefficient (which introduces nonlinearity) and memory effects, according to the general theory of Gurtin and Pipkin of finite velocity thermal propagation with memory. In this framework, we consider in detail two different approaches to the generalized Maxwell-Cattaneo law, based on the application of long-tail Mittag-Leffler memory function and power law relaxation functions, leading to nonlinear time-fractional telegraph and wave-type equations. We also discuss some explicit analytical results to the model equations based on the generalized separating variable method and discuss their meaning in relation to some well-known results of the ordinary case.

  9. Tank 241-T-105, cores 205 and 207 analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final laboratory report for tank 241-T-105 push mode core segments collected between June 24, 1997 and June 30, 1997. The segments were subsampled and analyzed in accordance with the Tank Push Mode Core Sampling and Analysis Plan (TSAP) (Field, 1997), the Tank Safety Screening Data Quality Objective (Safety DQO) (Dukelow, et al., 1995) and Tank 241-T-105 Sample Analysis (memo) (Field, 1997a). The analytical results are included in Table 1. None of the subsamples submitted for the differential scanning calorimetry (DSC) analysis or total alpha activity (AT) exceeded the notification limits as stated in the TSAP (Field, 1997). The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems (TWRS) Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997) and are not considered in this report

  10. A Process Analytical Technology (PAT) approach to control a new API manufacturing process: development, validation and implementation.

    Science.gov (United States)

    Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric

    2014-03-01

    Pharmaceutical companies are progressively adopting the Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, which aim to build quality directly into the product by combining thorough scientific understanding with quality risk management. An analytical method based on near-infrared (NIR) spectroscopy was developed as a PAT tool for on-line control of an API (active pharmaceutical ingredient) manufacturing crystallization step, during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on the QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fit for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline using the accuracy profile approach. The dosing ranges were evaluated as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for residual methanol. Because the variability of the sampling method and of the reference method is by nature included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. The implementation of this in-process control (IPC) method on the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. In addition, several valuable benefits are expected, including reduction of the process time and elimination of a rather difficult sampling step and tedious off-line analyses. © 2013 Published by Elsevier B.V.
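The accuracy profile approach mentioned above combines bias and precision at each concentration level into a tolerance interval that is then compared with acceptance limits. A minimal sketch under a normal-quantile approximation with hypothetical NIR predictions (the formal method uses Student-t based beta-expectation tolerance intervals):

```python
from statistics import mean, stdev, NormalDist

def accuracy_profile_level(measured, nominal, beta=0.95):
    """Relative bias (%) and an approximate beta-expectation tolerance
    interval (%) at one concentration level.

    Normal-quantile approximation for illustration only; the formal
    accuracy-profile calculation uses t-based tolerance intervals.
    """
    rel = [100.0 * (x - nominal) / nominal for x in measured]
    bias = mean(rel)
    k = NormalDist().inv_cdf(0.5 + beta / 2)   # two-sided coverage factor
    half_width = k * stdev(rel)
    return bias, (bias - half_width, bias + half_width)

# Hypothetical NIR predictions at a 10.0 % w/w API level:
bias, (tol_lo, tol_hi) = accuracy_profile_level(
    [9.9, 10.1, 10.0, 9.8, 10.2, 10.1], nominal=10.0)
```

The method is declared valid over the range where such tolerance intervals stay within the predefined acceptance limits.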

  11. Analytical Method Development and Validation of Solifenacin in Pharmaceutical Dosage Forms by RP-HPLC

    OpenAIRE

    Shaik, Rihana Parveen; Puttagunta, Srinivasa Babu; Kothapalli Bannoth, Chandrasekar; Challa, Bala Sekhara Reddy

    2014-01-01

    A new, accurate, precise, and robust HPLC method was developed and validated for the determination of solifenacin in tablet dosage form. The chromatographic separation was achieved on an Inertsil ODS 3V C18 (150 mm × 4.6 mm, 5 μm) stationary phase maintained at ambient temperature with a mobile phase combination of monobasic potassium phosphate (pH 3.5) containing 0.1% triethylamine and methanol (gradient mode) at a flow rate of 1.5 mL/min, and the detection was carried out by using UV detect...

  12. "INTRODUCING A FULL VALIDATED ANALYTICAL PROCEDURE AS AN OFFICIAL COMPENDIAL METHOD FOR FENTANYL TRANSDERMAL PATCHES"

    Directory of Open Access Journals (Sweden)

    Amir Mehdizadeh

    2005-04-01

    A simple, sensitive, and specific HPLC method and a simple, fast extraction procedure were developed for the quantitative analysis of fentanyl transdermal patches. Chloroform, methanol, and ethanol were used as extracting solvents, with recoveries of 92.1%, 94.3%, and 99.4%, respectively. Fentanyl was extracted with ethanol, and the fentanyl eluted through the C18 column was monitored by UV detection at 230 nm. Linearity was observed over the range of 0.5-10 µg/mL with a correlation coefficient (r²) of 0.9992. Both intra- and inter-day accuracy and precision were within acceptable limits. The detection limit (DL) and quantitation limit (QL) were 0.15 and 0.5 µg/mL, respectively. Other validation characteristics such as selectivity, robustness, and ruggedness were evaluated. Following method validation, a system suitability test (SST) including capacity factor (k′), plate number (N), tailing factor (T), and RSD was defined for routine testing.
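The system suitability parameters named above follow standard chromatographic definitions; a short sketch with hypothetical peak measurements (not values from this study):

```python
def capacity_factor(t_r, t_0):
    """k' = (tR - t0) / t0: retention relative to the column void time."""
    return (t_r - t_0) / t_0

def plate_number(t_r, w_base):
    """N = 16 (tR / W)^2, using the baseline (tangent) peak width."""
    return 16.0 * (t_r / w_base) ** 2

def tailing_factor(w_005, f_005):
    """USP tailing factor T = W0.05 / (2 f), with widths measured at
    5% of peak height; f is the front half-width."""
    return w_005 / (2.0 * f_005)

# Hypothetical peak: tR = 6.0 min, void time t0 = 1.5 min,
# baseline width 0.4 min, 5%-height width 0.10 min (front half 0.045 min):
k_prime = capacity_factor(6.0, 1.5)
N = plate_number(6.0, 0.4)
T = tailing_factor(0.10, 0.045)
```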

  13. Validation of the Analytical Method for the Determination of Flavonoids in Broccoli

    Directory of Open Access Journals (Sweden)

    Tuszyńska Magdalena

    2014-09-01

    A simple, accurate, and selective HPLC method was developed and validated for the determination of quercetin and kaempferol, the main flavonols in broccoli. The separation was achieved on a reversed-phase C18 column using a mobile phase composed of methanol/water (60/40) with 0.2% phosphoric acid at a flow rate of 1.0 mL/min. Detection was carried out on a DAD detector at 370 nm. The method was validated according to the requirements for new methods, which include selectivity, linearity, precision, accuracy, limit of detection, and limit of quantitation. The method demonstrates good linearity, with R² > 0.99. The recovery is within 98.07-102.15% and 97.92-101.83% for quercetin and kaempferol, respectively. The method is selective in that quercetin and kaempferol are well separated from other compounds of broccoli with good resolution. The low limits of detection and quantitation enable the detection and quantitation of these flavonoids in broccoli at low concentrations.
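The linearity criterion (R² > 0.99) comes from an ordinary least-squares fit of the calibration line; a self-contained sketch with hypothetical standards (not data from this study):

```python
def linear_fit(x, y):
    """Ordinary least-squares slope, intercept, and R^2 for a calibration line."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    r2 = 1.0 - ss_res / ss_tot
    return slope, intercept, r2

# Hypothetical calibration: standard concentrations (ug/mL) vs peak area:
conc = [1.0, 2.5, 5.0, 10.0, 25.0]
area = [12.1, 29.8, 60.5, 119.0, 301.2]
slope, intercept, r2 = linear_fit(conc, area)
```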

  14. Recent results on analytical plasma turbulence theory: Realizability, intermittency, submarginal turbulence, and self-organized criticality

    International Nuclear Information System (INIS)

    Krommes, J.A.

    2000-01-01

    Recent results and future challenges in the systematic analytical description of plasma turbulence are described. First, the importance of statistical realizability is stressed, and the development and successes of the Realizable Markovian Closure are briefly reviewed. Next, submarginal turbulence (linearly stable but nonlinearly self-sustained fluctuations) is considered, and the relevance of nonlinear instability in neutral-fluid shear flows to submarginal turbulence in magnetized plasmas is discussed. For the Hasegawa-Wakatani equations, a self-consistency loop that leads to steady-state vortex regeneration in the presence of dissipation is demonstrated, and a partial unification of recent work of Drake (for plasmas) and of Waleffe (for neutral fluids) is given. Brief remarks are made on the difficulties facing a quantitatively accurate statistical description of submarginal turbulence. Finally, possible connections between intermittency, submarginal turbulence, and self-organized criticality (SOC) are considered, and outstanding questions are identified.

  15. Analytical results from salt batch 9 routine DSSHT and SEHT monthly samples

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-06-01

    Strip Effluent Hold Tank (SEHT) and Decontaminated Salt Solution Hold Tank (DSSHT) samples from several of the “microbatches” of Integrated Salt Disposition Project (ISDP) Salt Batch (“Macrobatch”) 9 have been analyzed for 238Pu, 90Sr, 137Cs, cations (Inductively Coupled Plasma Emission Spectroscopy - ICPES), and anions (Ion Chromatography Anions - IC-A). The analytical results from the current microbatch samples are similar to those from previous macrobatch samples. The Cs removal continues to be acceptable, with decontamination factors (DF) averaging 25700 (107% RSD). The bulk chemistry of the DSSHT and SEHT samples does not show any signs of unusual behavior, other than lacking the anticipated degree of dilution that is calculated to occur during Modular Caustic-Side Solvent Extraction Unit (MCU) processing.
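The decontamination factor quoted above is the ratio of feed to product activity, and the %RSD summarizes its batch-to-batch spread. A minimal sketch with hypothetical 137Cs values (not the report's raw data):

```python
from statistics import mean, stdev

def decon_factor(feed_cs137, product_cs137):
    """Decontamination factor: feed over product 137Cs concentration
    (same activity units in numerator and denominator)."""
    return feed_cs137 / product_cs137

# Hypothetical (feed, decontaminated product) pairs for three microbatches:
dfs = [decon_factor(f, p) for f, p in [(1.0e5, 4.2), (9.8e4, 3.6), (1.1e5, 4.9)]]
df_mean = mean(dfs)
rsd_pct = 100.0 * stdev(dfs) / df_mean   # relative standard deviation
```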

  16. Weak-field asymptotic theory of tunneling ionization: benchmark analytical results for two-electron atoms

    International Nuclear Information System (INIS)

    Trinh, Vinh H; Morishita, Toru; Tolstikhin, Oleg I

    2015-01-01

    The recently developed many-electron weak-field asymptotic theory of tunneling ionization of atoms and molecules in an external static electric field (Tolstikhin et al 2014, Phys. Rev. A 89, 013421) is extended to the first-order terms in the asymptotic expansion in field. To highlight the results, here we present a simple analytical formula giving the rate of tunneling ionization of two-electron atoms H − and He. Comparison with fully-correlated ab initio calculations available for these systems shows that the first-order theory works quantitatively in a wide range of fields up to the onset of over-the-barrier ionization and hence is expected to find numerous applications in strong-field physics. (fast track communication)

  17. Methods used by Elsam for monitoring precision and accuracy of analytical results

    Energy Technology Data Exchange (ETDEWEB)

    Hinnerskov Jensen, J [Soenderjyllands Hoejspaendingsvaerk, Faelleskemikerne, Aabenraa (Denmark)

    1996-12-01

    Performing round robins at regular intervals is the primary method used by Elsam for monitoring the precision and accuracy of analytical results. The first round robin was started in 1974, and today 5 round robins are running. These are focused on: boiler water and steam, lubricating oils, coal, ion chromatography, and dissolved gases in transformer oils. Besides the power plant laboratories in Elsam, the participants are power plant laboratories from the rest of Denmark, industrial and commercial laboratories in Denmark, and finally foreign laboratories. The calculated standard deviations or reproducibilities are compared with acceptable values. These values originate from ISO, ASTM and the like, or from our own experience. Besides providing the laboratories with a tool to check their momentary performance, the round robins are very suitable for evaluating systematic developments on a long-term basis. By splitting up the uncertainty according to methods, sample preparation/analysis, etc., knowledge can be extracted from the round robins for use in many other situations. (au)
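A common way to score laboratories in such round robins (the z-score convention of ISO 13528; the abstract does not say which statistic Elsam uses, so this is illustrative) is a sketch like:

```python
def z_scores(results, assigned, sigma_p):
    """Round-robin z-scores: z = (x - X) / sigma_p, where X is the
    assigned value and sigma_p the standard deviation for proficiency;
    |z| <= 2 is usually taken as satisfactory."""
    return [(x - assigned) / sigma_p for x in results]

# Hypothetical coal-ash determinations (% ash) from four laboratories,
# assigned value 10.0 and sigma_p = 0.2:
zs = z_scores([10.1, 9.7, 10.5, 10.0], assigned=10.0, sigma_p=0.2)
flagged = [z for z in zs if abs(z) > 2.0]   # labs needing follow-up
```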

  18. Clinical validation of an epigenetic assay to predict negative histopathological results in repeat prostate biopsies.

    Science.gov (United States)

    Partin, Alan W; Van Neste, Leander; Klein, Eric A; Marks, Leonard S; Gee, Jason R; Troyer, Dean A; Rieger-Christ, Kimberly; Jones, J Stephen; Magi-Galluzzi, Cristina; Mangold, Leslie A; Trock, Bruce J; Lance, Raymond S; Bigley, Joseph W; Van Criekinge, Wim; Epstein, Jonathan I

    2014-10-01

    The DOCUMENT multicenter trial in the United States validated the performance of an epigenetic test as an independent predictor of prostate cancer risk to guide decision making for repeat biopsy. Confirming an increased negative predictive value could help avoid unnecessary repeat biopsies. We evaluated the archived, cancer negative prostate biopsy core tissue samples of 350 subjects from a total of 5 urological centers in the United States. All subjects underwent repeat biopsy within 24 months with a negative (controls) or positive (cases) histopathological result. Centralized blinded pathology evaluation of the 2 biopsy series was performed in all available subjects from each site. Biopsies were epigenetically profiled for GSTP1, APC and RASSF1 relative to the ACTB reference gene using quantitative methylation specific polymerase chain reaction. Predetermined analytical marker cutoffs were used to determine assay performance. Multivariate logistic regression was used to evaluate all risk factors. The epigenetic assay resulted in a negative predictive value of 88% (95% CI 85-91). In multivariate models correcting for age, prostate specific antigen, digital rectal examination, first biopsy histopathological characteristics and race the test proved to be the most significant independent predictor of patient outcome (OR 2.69, 95% CI 1.60-4.51). The DOCUMENT study validated that the epigenetic assay was a significant, independent predictor of prostate cancer detection in a repeat biopsy collected an average of 13 months after an initial negative result. Due to its 88% negative predictive value adding this epigenetic assay to other known risk factors may help decrease unnecessary repeat prostate biopsies. Copyright © 2014 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
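The headline negative predictive value is a simple function of the confusion-matrix counts; a sketch with hypothetical counts chosen only to echo the reported 88% (not the DOCUMENT trial's actual tabulation):

```python
def negative_predictive_value(tn, fn):
    """NPV = TN / (TN + FN): the probability that a negative assay
    result corresponds to a truly cancer-negative repeat biopsy."""
    return tn / (tn + fn)

# Hypothetical: of 220 assay-negative subjects, 194 had a negative
# repeat biopsy and 26 were found to have cancer:
npv = negative_predictive_value(tn=194, fn=26)
```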

  19. Temperature based validation of the analytical model for the estimation of the amount of heat generated during friction stir welding

    Directory of Open Access Journals (Sweden)

    Milčić Dragan S.

    2012-01-01

    Friction stir welding is a solid-state welding technique that utilizes the thermomechanical influence of the rotating welding tool on the parent material, resulting in a monolithic joint (weld). At the contact between the welding tool and the parent material, significant stirring and deformation of the parent material occurs, and during this process mechanical energy is partially transformed into heat. The generated heat affects the temperature of the welding tool and the parent material, so the proposed analytical model for estimating the amount of generated heat can be verified by temperature: the analytically determined heat is used for numerical estimation of the temperature of the parent material, and this temperature is compared to the experimentally determined temperature. The numerical solution is obtained using the finite difference method (an explicit scheme with adaptive grid), considering the influence of temperature on the material's conductivity, contact conditions between the welding tool and the parent material, material flow around the welding tool, etc. The analytical model shows that 60-100% of the mechanical power delivered to the welding tool is transformed into heat, while the comparison of results shows a maximal relative difference between the analytical and experimental temperatures of about 10%.
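The explicit finite-difference scheme mentioned above can be sketched in one dimension with constant diffusivity (the paper's model is richer: temperature-dependent conductivity, adaptive grid, and tool contact conditions):

```python
def step_heat_1d(T, dx, dt, alpha):
    """One explicit finite-difference step of dT/dt = alpha * d2T/dx2
    with fixed (Dirichlet) end temperatures. The explicit scheme is
    stable only for r = alpha*dt/dx**2 <= 0.5."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable for this dt/dx combination"
    inner = [T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
             for i in range(1, len(T) - 1)]
    return [T[0]] + inner + [T[-1]]

# Hypothetical bar with a hot spot in the middle; heat diffuses outward:
T = [0.0] * 5 + [100.0] + [0.0] * 5
for _ in range(50):
    T = step_heat_1d(T, dx=1.0, dt=0.4, alpha=1.0)
```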

  20. Tank 241-BY-112, cores 174 and 177 analytical results for the final report

    International Nuclear Information System (INIS)

    Nuzum, J.L.

    1997-01-01

    Results from bulk density tests ranged from 1.03 g/mL to 1.86 g/mL. The highest bulk density result of 1.86 g/mL was used to calculate the solid total alpha activity notification limit for this tank (33.1 µCi/g) for the total alpha (AT) analysis. Attachment 2 contains the Data Verification and Deliverable (DVD) Summary Report for AT analyses. This report summarizes results from AT analyses and provides data qualifiers and total propagated uncertainty (TPU) values for results. The TPU values are based on the uncertainties inherent in each step of the analysis process. They may be used as an additional reference to determine reasonable RPD values, which may be used to accept valid data that do not meet the TSAP acceptance criteria. A report guide is provided with the report to assist in understanding this summary report.
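The conversion from a volumetric limit to the quoted per-mass notification limit appears to divide by the bulk density. Assuming a total-alpha screening threshold of 61.5 µCi/mL (our assumption; the record does not state the volumetric value), the quoted 33.1 µCi/g follows:

```python
def solid_alpha_limit(volumetric_limit_uCi_per_mL, bulk_density_g_per_mL):
    """Convert a volumetric total-alpha limit (uCi/mL) to a per-mass
    notification limit (uCi/g). Using the highest measured bulk density
    is conservative: the denser the sample, the lower the allowed
    activity per gram."""
    return volumetric_limit_uCi_per_mL / bulk_density_g_per_mL

# Assumed 61.5 uCi/mL threshold with the record's highest density (1.86 g/mL):
limit = solid_alpha_limit(61.5, 1.86)
```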

  1. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 2: Appendices

    Science.gov (United States)

    Niiya, Karen E.; Walker, Richard E.; Pieper, Jerry L.; Nguyen, Thong V.

    1993-05-01

    This final report includes a discussion of the work accomplished during the period from Dec. 1988 through Nov. 1991. The objective of the program was to assemble existing performance and combustion stability models into a usable design methodology capable of designing and analyzing high-performance and stable LOX/hydrocarbon booster engines. The methodology was then used to design a validation engine. The capabilities and validity of the methodology were demonstrated using this engine in an extensive hot-fire test program. The engine used LOX/RP-1 propellants and was tested over a range of mixture ratios, chamber pressures, and acoustic damping device configurations. This volume contains time-domain and frequency-domain stability plots which indicate the pressure perturbation amplitudes and frequencies from approximately 30 tests of a 50K-thrust rocket engine using LOX/RP-1 propellants over a range of chamber pressures from 240 to 1750 psia with mixture ratios from 1.2 to 7.5. The data are from test configurations which used both bitune and monotune acoustic cavities and from tests with no acoustic cavities. The engine had a length of 14 inches and a contraction ratio of 2.0, using a 7.68-inch-diameter injector. The data were taken from both stable and unstable tests. All combustion instabilities were spontaneous in the first tangential mode. Although stability bombs were used and generated overpressures of approximately 20 percent, no tests were driven unstable by the bombs. The stability instrumentation included six high-frequency Kistler transducers in the combustion chamber, a high-frequency Kistler transducer in each propellant manifold, and tri-axial accelerometers. Performance data, both characteristic velocity efficiencies and energy release efficiencies, are presented for those tests of sufficient duration to record steady-state values.
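The characteristic velocity efficiency mentioned above is the ratio of measured to theoretical c*; a sketch with hypothetical test-point numbers (none of these values come from the report):

```python
def c_star(pc_psia, throat_area_in2, mdot_lbm_s, g=32.174):
    """Characteristic velocity c* = Pc * At * g / mdot in ft/s
    (US customary units, with g converting lbf to lbm*ft/s^2)."""
    return pc_psia * throat_area_in2 * g / mdot_lbm_s

# Hypothetical LOX/RP-1 test point:
c_star_meas = c_star(pc_psia=1000.0, throat_area_in2=10.0, mdot_lbm_s=60.0)
eta_c_star = c_star_meas / 5900.0   # vs an assumed theoretical c* of 5900 ft/s
```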

  2. Validation of an analytical method for the determination of aldehydes and acetone present in the ambient air at the metropolitan area of Costa Rica

    International Nuclear Information System (INIS)

    Rojas Marin, Jose Felix

    2010-01-01

    The validation of an analytical method has been conducted for the simultaneous determination of 15 carbonyl compounds, the main aldehydes and ketones present in ambient air. The compounds were captured on cartridges packed with silica gel impregnated with 2,4-dinitrophenylhydrazine (DNPH) at a constant flow of about 1 L min⁻¹. The carbonyl compounds present form the corresponding derivatives, which are then eluted with acetonitrile (solid-phase extraction). The extracts were analyzed by high-performance liquid chromatography with an ultraviolet detector at a wavelength of 360 nm. The following results were obtained during method validation: linearity from 0.03 mg L⁻¹ to 15 mg L⁻¹; limits of detection and quantification of 0.02 mg L⁻¹ and 0.06 mg L⁻¹; no significant bias in trueness at a confidence level of 95%; and precision, in terms of repeatability and reproducibility, of around 1%. Two sampling campaigns were carried out in the dry and rainy seasons of 2009 in the areas of San Jose, Heredia and Belen. The predominant compounds were found to be acetone, acetaldehyde and formaldehyde, with formaldehyde the most abundant in the city of San Jose; the others were not present in significant amounts. There is a strong correlation between formaldehyde and acetaldehyde, suggesting that they stem from a common source, possibly vehicle emissions. (author)
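Detection and quantification limits like those above are commonly estimated from the response standard deviation and calibration slope (the ICH 3.3σ/S and 10σ/S convention; the study does not state which estimator it used). A sketch with hypothetical values chosen to echo the reported order of magnitude:

```python
def lod_loq(sigma_blank, slope):
    """ICH-style estimates: LOD = 3.3*sigma/S and LOQ = 10*sigma/S,
    where sigma is the standard deviation of the response and S the
    slope of the calibration curve."""
    return 3.3 * sigma_blank / slope, 10.0 * sigma_blank / slope

# Hypothetical: response sigma 0.03 (area units), slope 5.0 per mg/L:
lod, loq = lod_loq(sigma_blank=0.03, slope=5.0)
```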

  3. Computer-aided test selection and result validation-opportunities and pitfalls

    DEFF Research Database (Denmark)

    McNair, P; Brender, J; Talmon, J

    1998-01-01

Dynamic test scheduling is concerned with pre-analytical preprocessing of the individual samples within a clinical laboratory production by means of decision algorithms. The purpose of such scheduling is to provide maximal information with minimal data production (to avoid data pollution and/or to increase cost-efficiency). Our experience shows that there is a practical limit to the extent of exploitation of the principle of dynamic test scheduling, unless it is automated in one way or the other. This paper analyses some issues of concern related to the profession of clinical biochemistry when implementing such dynamic test scheduling within a Laboratory Information System (and/or an advanced analytical workstation). The challenge is related to 1) generation of appropriately validated decision models, and 2) mastering consequences of analytical imprecision and bias.

  4. Planck early results. XIV. ERCSC validation and extreme radio sources

    DEFF Research Database (Denmark)

    Lähteenmäki, A.; Lavonen, N.; León-Tavares, J.

    2011-01-01

    Planck's all-sky surveys at 30-857 GHz provide an unprecedented opportunity to follow the radio spectra of a large sample of extragalactic sources to frequencies 2-20 times higher than allowed by past, large-area, ground-based surveys. We combine the results of the Planck Early Release Compact So...

  5. Computational fluid dynamics simulations and validations of results

    CSIR Research Space (South Africa)

    Sitek, MA

    2013-09-01

    Wind flow influence on a high-rise building is analyzed. The research covers full-scale tests, wind-tunnel experiments and numerical simulations. In the present paper computational model used in simulations is described and the results, which were...

  6. Analytical validation of a flow cytometric protocol for quantification of platelet microparticles in dogs.

    Science.gov (United States)

    Cremer, Signe E; Krogh, Anne K H; Hedström, Matilda E K; Christiansen, Liselotte B; Tarnow, Inge; Kristensen, Annemarie T

    2018-06-01

    Platelet microparticles (PMPs) are subcellular procoagulant vesicles released upon platelet activation. In people with clinical diseases, alterations in PMP concentrations have been extensively investigated, but few canine studies exist. This study aimed to validate a canine flow cytometric protocol for PMP quantification and to assess the influence of calcium on PMP concentrations. Microparticles (MPs) were quantified in citrated whole blood (WB) and platelet-poor plasma (PPP) using flow cytometry. Anti-CD61 antibody and Annexin V (AnV) were used to detect platelets and phosphatidylserine, respectively. In 13 healthy dogs, CD61+/AnV- concentrations were analyzed with and without a calcium buffer. CD61+/AnV-, CD61+/AnV+, and CD61-/AnV+ MP quantification was validated in 10 healthy dogs. The coefficients of variation (CV) for duplicate (intra-assay) and parallel (inter-assay) analyses and the detection limits (DLs) were calculated. CD61+/AnV- concentrations were higher in calcium buffer, 841,800 MP/μL (526,000-1,666,200), than without, 474,200 MP/μL (278,800-997,500), P < .05. In WB, PMPs were above the DLs and demonstrated acceptable (<20%) intra- and inter-assay CVs in 9/10 dogs: 1.7% (0.5-8.9) and 9.0% (0.9-11.9), respectively, for CD61+/AnV-, and 2.4% (0.2-8.7) and 7.8% (0.0-12.8), respectively, for CD61+/AnV+. Acceptable CVs were not seen for the CD61-/AnV+ MPs. In PPP, quantification was challenged by high inter-assay CVs and overlapping DLs, and hemolysis and lipemia interfered with quantification in 5/10 dogs. Calcium induced higher in vitro PMP concentrations, likely due to platelet activation. PMP concentrations were reliably quantified in WB, indicating the potential for clinical applications. PPP analyses were unreliable due to high inter-assay CVs and DL overlap, and were not obtainable due to hemolysis and lipemia interference. © 2018 American Society for Veterinary Clinical Pathology.
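The intra-assay CV acceptance criterion used above (<20%) is computed from duplicate measurements; a minimal sketch with hypothetical duplicate counts (not study data):

```python
from statistics import mean, stdev

def duplicate_cv_pct(a, b):
    """Intra-assay CV (%) of one duplicate pair: the pair's standard
    deviation divided by its mean."""
    m = mean([a, b])
    return 100.0 * stdev([a, b]) / m

# Hypothetical duplicate CD61+/AnV- counts (MP/uL) for one dog:
cv = duplicate_cv_pct(841_800, 858_600)
acceptable = cv < 20.0
```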

  7. Analytical Validation of the ReEBOV Antigen Rapid Test for Point-of-Care Diagnosis of Ebola Virus Infection.

    Science.gov (United States)

    Cross, Robert W; Boisen, Matthew L; Millett, Molly M; Nelson, Diana S; Oottamasathien, Darin; Hartnett, Jessica N; Jones, Abigal B; Goba, Augustine; Momoh, Mambu; Fullah, Mohamed; Bornholdt, Zachary A; Fusco, Marnie L; Abelson, Dafna M; Oda, Shunichiro; Brown, Bethany L; Pham, Ha; Rowland, Megan M; Agans, Krystle N; Geisbert, Joan B; Heinrich, Megan L; Kulakosky, Peter C; Shaffer, Jeffrey G; Schieffelin, John S; Kargbo, Brima; Gbetuwa, Momoh; Gevao, Sahr M; Wilson, Russell B; Saphire, Erica Ollmann; Pitts, Kelly R; Khan, Sheik Humarr; Grant, Donald S; Geisbert, Thomas W; Branco, Luis M; Garry, Robert F

    2016-10-15

    Ebola virus disease (EVD) is a severe viral illness caused by Ebola virus (EBOV). The 2013-2016 EVD outbreak in West Africa is the largest recorded, with >11,000 deaths. Development of the ReEBOV Antigen Rapid Test (ReEBOV RDT) was expedited to provide a point-of-care test for suspected EVD cases. Recombinant EBOV viral protein 40 antigen was used to derive polyclonal antibodies for RDT and enzyme-linked immunosorbent assay development. ReEBOV RDT limits of detection (LOD), specificity, and interference were analytically validated on the basis of Food and Drug Administration (FDA) guidance. The ReEBOV RDT specificity estimate was 95% for donor serum panels and 97% for donor whole-blood specimens. The RDT demonstrated sensitivity to 3 species of Ebolavirus (Zaire ebolavirus, Sudan ebolavirus, and Bundibugyo ebolavirus) associated with human disease, with no cross-reactivity by pathogens associated with non-EBOV febrile illness, including malaria parasites. Interference testing exhibited no reactivity by medications in common use. The LOD for antigen was 4.7 ng/test in serum and 9.4 ng/test in whole blood. Quantitative reverse transcription-polymerase chain reaction testing of nonhuman primate samples determined the range to be equivalent to 3.0 × 10⁵ to 9.0 × 10⁸ genomes/mL. The analytical validation presented here contributed to the ReEBOV RDT being the first antigen-based assay to receive FDA and World Health Organization emergency use authorization for this EVD outbreak, in February 2015. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  8. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening.

    Science.gov (United States)

    Lawton, Zachary E; Traub, Angelica; Fatigante, William L; Mancias, Jose; O'Leary, Adam E; Hall, Seth E; Wieland, Jamie R; Oberacher, Herbert; Gizzi, Michael C; Mulligan, Christopher C

    2017-06-01

    Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables the coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly-interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.

  10. Validation of analytical method to quality control and the stability study of 0.025 % eyedrops Ketotiphen

    International Nuclear Information System (INIS)

    Troche Concepcion, Yenilen; Romero Diaz, Jacqueline Aylema; Garcia Penna, Caridad M

    2010-01-01

    Ketotiphen eyedrops are prescribed to relieve the signs and symptoms of allergic conjunctivitis due to their potent H1-antihistaminic effect, showing some ability to inhibit the release of histamine and other mediators in cases of mastocytosis. The aim of the present paper was to develop and validate an analytical method, by high-performance liquid chromatography, for the quality control and stability studies of 0.025% Ketotiphen eyedrops. The method was based on separation of the active principle on a Lichrosorb RP-18 (5 μm) column (250 x 4 mm), with UV detection at 296 nm, using a mobile phase consisting of a degassed mixture of methanol:phosphate buffer (75:25; pH 8.5) with 1 mL of isopropanol added per 1,000 mL of the mixture, at a flow rate of 1.2 mL/min. The analytical method was linear, precise, specific, and accurate over the concentrations studied.

  11. Development and validation of analytical method for the estimation of nateglinide in rabbit plasma

    Directory of Open Access Journals (Sweden)

    Nihar Ranjan Pani

    2012-12-01

    Nateglinide has been widely used in the treatment of type 2 diabetes as an insulin secretagogue. A reliable, rapid, simple, and sensitive reversed-phase high-performance liquid chromatography (RP-HPLC) method was developed and validated for the determination of nateglinide in rabbit plasma. The method was developed on a Hypersil BDS C-18 column (250 mm×4.6 mm, 5 μm) using a mobile phase of 10 mM phosphate buffer (pH 2.5) and acetonitrile (35:65, v/v). The eluate was monitored with a UV-vis detector at 210 nm at a flow rate of 1 mL/min. The calibration curve was linear over the concentration range of 25-2000 ng/mL. The retention times of nateglinide and the internal standard (gliclazide) were 9.608 min and 11.821 min, respectively. The developed RP-HPLC method can be successfully applied to the determination of the pharmacokinetic parameters of nateglinide in a rabbit model. Keywords: HPLC, Nateglinide, Rabbit plasma, Pharmacokinetics

  12. The role of validated analytical methods in JECFA drug assessments and evaluation for recommending MRLs.

    Science.gov (United States)

    Boison, Joe O

    2016-05-01

    The Joint Food and Agriculture Organization and World Health Organization (FAO/WHO) Expert Committee on Food Additives (JECFA) is one of three Codex committees tasked with applying risk analysis and relying on independent scientific advice provided by expert bodies organized by FAO/WHO when developing standards. While not officially part of the Codex Alimentarius Commission structure, JECFA provides independent scientific advice to the Commission and its specialist committees, such as the Codex Committee on Residues of Veterinary Drugs in Foods (CCRVDF), in setting maximum residue limits (MRLs) for veterinary drugs. Codex methods of analysis (Types I, II, III, and IV) are defined in the Codex Procedural Manual, as are the criteria to be used for selecting methods of analysis. However, if a method is to be used under a single laboratory condition to support regulatory work, it must be validated according to an internationally recognized protocol, and the use of the method must be embedded in a quality assurance system in compliance with ISO/IEC 17025:2005. This paper examines the attributes of the methods used to generate residue depletion data for drug registration and/or licensing and for supporting regulatory enforcement initiatives that experts consider to be useful and appropriate in their assessment of methods of analysis. Copyright © 2016 Her Majesty the Queen in Right of Canada. Drug Testing and Analysis © 2016 John Wiley & Sons, Ltd.

  13. Suspended sediment assessment by combining sound attenuation and backscatter measurements - analytical method and experimental validation

    Science.gov (United States)

    Guerrero, Massimo; Di Federico, Vittorio

    2018-03-01

    The use of acoustic techniques has become common for estimating suspended sediment in water environments. An emitted beam propagates into water, producing backscatter and attenuation that depend on the concentration and size distribution of the scattering particles. Unfortunately, the actual particle size distribution (PSD) may strongly affect the accuracy of concentration quantification through the unknown coefficients of backscattering strength, ks2, and normalized attenuation, ζs. This issue was partially solved by applying the multi-frequency approach. Despite this possibility, a relevant scientific and practical question remains regarding the use of acoustic methods to investigate poorly sorted sediment in the spectrum ranging from clay to fine sand. The aim of this study is to investigate the possibility of combining measurements of sound attenuation and backscatter to determine ζs for the suspended particles and the corresponding concentration. The proposed method is only moderately dependent on the actual PSD, thus relaxing the need for frequent calibration to account for changes in the ks2 and ζs coefficients. Laboratory tests were conducted under controlled conditions to validate this measurement technique. With respect to existing approaches, the developed method more accurately estimates the concentration of suspended particles ranging from clay to fine sand and, at the same time, gives an indication of their actual PSD.
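The core of the attenuation-based part of such an inversion can be sketched in a toy form: sediment attenuation is modelled as linear in mass concentration (α_s = ζs·M), so subtracting water absorption from the measured attenuation and dividing by ζs recovers M. All coefficient values below are illustrative assumptions, and this omits the paper's joint use of backscatter to estimate ζs itself.

```python
# Toy parameters (assumed, not taken from the paper)
alpha_w = 0.05   # water absorption, dB/m
zeta_s = 0.012   # normalized sediment attenuation, (dB/m) per (g/m^3)
true_M = 250.0   # suspended-sediment mass concentration, g/m^3

# Forward model: total measured attenuation along the beam path
alpha_measured = alpha_w + zeta_s * true_M

# Inversion: subtract water absorption, divide by the sediment coefficient
M_est = (alpha_measured - alpha_w) / zeta_s
print(M_est)
```

The practical difficulty the paper addresses is precisely that ζs is unknown when the PSD varies, which is why it is estimated from the combined backscatter and attenuation measurements rather than assumed.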

  14. Tank 241-S-106, cores 183, 184 and 187 analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final laboratory report for tank 241-S-106 push mode core segments collected between February 12, 1997 and March 21, 1997. The segments were subsampled and analyzed in accordance with the Tank Push Mode Core Sampling and Analysis Plan (TSAP), the Tank Safety Screening Data Quality Objective (Safety DQO), the Historical Model Evaluation Data Requirements (Historical DQO) and the Data Quality Objective to Support Resolution of the Organic Complexant Safety Issue (Organic DQO). The analytical results are included in Table 1. Six of the twenty-four subsamples submitted for differential scanning calorimetry (DSC) analysis exceeded the notification limit of 480 J/g stated in the DQO. Appropriate notifications were made. Total organic carbon (TOC) analyses were performed on all samples that produced exotherms during the DSC analysis; all results were less than the notification limit of three weight percent TOC. No cyanide analysis was performed, per agreement with the Tank Safety Program. None of the samples submitted for total alpha activity exceeded the notification limits stated in the TSAP. Statistical evaluation of results by calculating the 95% upper confidence limit is not performed by the 222-S Laboratory and is not considered in this report. No core composites were created because there was insufficient solid material from any of the three core sampling events to generate a composite representative of the tank contents.

  15. Analytical results for a stochastic model of gene expression with arbitrary partitioning of proteins

    Science.gov (United States)

    Tschirhart, Hugo; Platini, Thierry

    2018-05-01

    In biophysics, the search for analytical solutions of stochastic models of cellular processes is often a challenging task. In recent work on models of gene expression, it was shown that a mapping based on partitioning of Poisson arrivals (PPA-mapping) can lead to exact solutions for previously unsolved problems. While the approach can be used in general when the model involves Poisson processes corresponding to creation or degradation, current applications of the method and new results derived using it have been limited to date. In this paper, we present the exact solution of a variation of the two-stage model of gene expression (with time dependent transition rates) describing the arbitrary partitioning of proteins. The methodology proposed makes full use of the PPA-mapping by transforming the original problem into a new process describing the evolution of three biological switches. Based on a succession of transformations, the method leads to a hierarchy of reduced models. We give an integral expression of the time dependent generating function as well as explicit results for the mean, variance, and correlation function. Finally, we discuss how results for time dependent parameters can be extended to the three-stage model and used to make inferences about models with parameter fluctuations induced by hidden stochastic variables.
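The underlying two-stage model (mRNA created at a constant rate and degraded; protein translated from mRNA and degraded) can be simulated directly with the standard Gillespie algorithm. The sketch below uses illustrative rate constants and does not implement the paper's PPA-mapping or time-dependent rates; it only shows the base process whose exact statistics the paper derives.

```python
import random

def gillespie_two_stage(k_m=1.0, g_m=0.1, k_p=5.0, g_p=0.05, t_end=100.0, seed=1):
    """Stochastic simulation of the two-stage gene-expression model:
    mRNA born at rate k_m, degraded at rate g_m*m; protein born at
    rate k_p*m, degraded at rate g_p*p. Rates are illustrative only."""
    rng = random.Random(seed)
    t, m, p = 0.0, 0, 0
    while t < t_end:
        rates = [k_m, g_m * m, k_p * m, g_p * p]
        total = sum(rates)              # always > 0 since k_m > 0
        t += rng.expovariate(total)     # waiting time to the next reaction
        # pick which reaction fires, with probability proportional to its rate
        x, acc = rng.uniform(0.0, total), 0.0
        for i, rate in enumerate(rates):
            acc += rate
            if x <= acc:
                break
        if i == 0:
            m += 1
        elif i == 1:
            m -= 1
        elif i == 2:
            p += 1
        else:
            p -= 1
    return m, p

m, p = gillespie_two_stage()
print(m, p)
```

At steady state the mean copy numbers approach k_m/g_m for mRNA and (k_m/g_m)(k_p/g_p) for protein, which such a simulation can be checked against.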

  16. Some analytical results for toroidal magnetic field coils with elongated minor cross-sections

    International Nuclear Information System (INIS)

    Raeder, J.

    1976-09-01

    The problem of determining the shape of a flexible current filament forming part of an ideal toroidal magnetic field coil is solved in a virtually analytical form. Analytical formulae for characteristic coil dimensions, stored magnetic energies, inductances and forces are derived for the so-called D-coils. The analytically calculated inductances of ideal D-coils are compared with numerically calculated ones for the case of finite numbers of D-shaped current filaments. Finally, the magnetic energies stored in ideal rectangular, elliptic and D-coils are compared. (orig.) [de

  17. An analytical model for nanoparticles concentration resulting from infusion into poroelastic brain tissue.

    Science.gov (United States)

    Pizzichelli, G; Di Michele, F; Sinibaldi, E

    2016-02-01

    We consider the infusion of a diluted suspension of nanoparticles (NPs) into poroelastic brain tissue, in view of relevant biomedical applications such as intratumoral thermotherapy. Indeed, the high impact of the related pathologies motivates the development of advanced therapeutic approaches, whose design also benefits from theoretical models. This study provides an analytical expression for the time-dependent NPs concentration during the infusion into poroelastic brain tissue, which also accounts for particle binding onto cells (by recalling relevant results from the colloid filtration theory). Our model is computationally inexpensive and, compared to fully numerical approaches, permits to explicitly elucidate the role of the involved physical aspects (tissue poroelasticity, infusion parameters, NPs physico-chemical properties, NP-tissue interactions underlying binding). We also present illustrative results based on parameters taken from the literature, by considering clinically relevant ranges for the infusion parameters. Moreover, we thoroughly assess the model working assumptions besides discussing its limitations. While not laying any claims of generality, our model can be used to support the development of more ambitious numerical approaches, towards the preliminary design of novel therapies based on NPs infusion into brain tissue. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Tank 241-B-108, cores 172 and 173 analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    Nuzum, J.L., Fluor Daniel Hanford

    1997-03-04

    The Data Summary Table (Table 3) included in this report compiles analytical results in compliance with all applicable DQOs. Liquid subsamples prepared for analysis by acid adjustment of the direct subsample are indicated by a `D` in the A column of Table 3. Solid subsamples prepared for analysis by a fusion digest are indicated by an `F` in the A column of Table 3. Solid subsamples prepared for analysis by a water digest are indicated by a `W` or an `I` in the A column of Table 3. Due to poor precision and accuracy in the original analysis of both Lower Half Segment 2 of Core 173 and the core composite of Core 173, the fusion and water digests were performed a second time. Precision and accuracy improved on repreparation of the Core 173 composite. Analyses of the reprepared Lower Half Segment 2 of Core 173 did not show improvement, suggesting sample heterogeneity. Results from both preparations are included in Table 3.

  19. Tank 241-TX-104, cores 230 and 231 analytical results for the final report

    International Nuclear Information System (INIS)

    Diaz, L.A.

    1998-01-01

    This document is the analytical laboratory report for tank 241-TX-104 push mode core segments collected between February 18, 1998 and February 23, 1998. The segments were subsampled and analyzed in accordance with the Tank 241-TX-104 Push Mode Core Sampling and Analysis Plan (TSAP) (McCain, 1997), the Data Quality Objective to Support Resolution of the Organic Complexant Safety Issue (Organic DQO) (Turner et al., 1995) and the Safety Screening Data Quality Objective (DQO) (Dukelow et al., 1995). The analytical results are included in the data summary table. None of the samples submitted for differential scanning calorimetry (DSC) and total alpha activity (AT) exceeded the notification limits stated in the TSAP. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997) and are not considered in this report. Appearance and sample handling: Attachment 1 is a cross-reference relating the tank farm identification numbers to the 222-S Laboratory LabCore/LIMS sample numbers. The subsamples generated in the laboratory for analyses are identified in these diagrams with their sources shown. Core 230: Three push mode core segments were removed from tank 241-TX-104 riser 9A on February 18, 1998. Segments were received by the 222-S Laboratory on February 19, 1998. Two segments were expected for this core; however, due to poor sample recovery, an additional segment was taken and identified as 2A. Core 231: Four push mode core segments were removed from tank 241-TX-104 riser 13A between February 19, 1998 and February 23, 1998. Segments were received by the 222-S Laboratory on February 24, 1998. Two segments were expected for this core; however, due to poor sample recovery, additional segments were taken and identified as 2A and 2B.
The TSAP states the core samples should be transported to the laboratory within three

  20. Heavy-quark QCD vacuum polarisation function. Analytical results at four loops

    International Nuclear Information System (INIS)

    Kniehl, B.A.; Kotikov, A.V.

    2006-07-01

    The first two moments of the heavy-quark vacuum polarisation function at four loops in quantum chromodynamics are found in fully analytical form by evaluating the missing massive four-loop tadpole master integrals. (orig.)

  1. Analytical Results for Scaling Properties of the Spectrum of the Fibonacci Chain

    Science.gov (United States)

    Piéchon, Frédéric; Benakli, Mourad; Jagannathan, Anuradha

    1995-06-01

    We solve the approximate renormalization group found by Niu and Nori for a quasiperiodic tight-binding Hamiltonian on the Fibonacci chain. This enables us to characterize analytically the spectral properties of this model.

  2. Resonant amplification of neutrino transitions in the Sun: exact analytical results

    International Nuclear Information System (INIS)

    Toshev, S.; Petkov, P.

    1988-01-01

    We investigate in detail the Mikheyev-Smirnov-Wolfenstein explanation of the solar neutrino puzzle, using analytical expressions for the neutrino transition probabilities in matter with an exponentially varying electron number density.

  3. Comparison and validation of dynamic characteristic analytical method for tubular heat exchanger

    International Nuclear Information System (INIS)

    Huang Qing; Xu Dinggeng; Chen Meng; Shen Rui

    2013-01-01

    In this study, the natural frequencies of normal residual heat removal heat exchangers are evaluated based on beam and shell-beam finite element models. The corresponding results are compared and some discrepancies are observed. These discrepancies are examined by means of a cylindrical-shell analysis, and an unreasonable treatment of the boundary conditions is accordingly pointed out. The experimental natural frequencies of the heat exchangers used in the Qinshan Phase Ⅰ Nuclear Power Plant are compared with the computational results from the shell-beam models for the corresponding heat exchangers of the C-2 program. The experimental and numerical results agree quite well, which implies that the shell-beam finite element simplification is applicable to these heat exchangers. The results indicate that the procedures introduced in this article apply to the dynamic analysis of other similar heat exchangers. (authors)

  4. Process and results of analytical framework and typology development for POINT

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik; Lehtonen, Markku; Bauler, Tom

    2009-01-01

    POINT is a project about how indicators are used in practice: to what extent and in what way indicators actually influence, support, or hinder policy and decision-making processes, and what could be done to enhance the positive role of indicators in such processes. The project needs an analytical framework to structure this work. The framework development produced a set of core concepts and associated typologies, a series of proposed analytic schemes, and a number of research propositions and questions for the subsequent empirical work in POINT.

  5. Validating and determining the weight of items used for evaluating clinical governance implementation based on analytic hierarchy process model.

    Science.gov (United States)

    Hooshmand, Elaheh; Tourani, Sogand; Ravaghi, Hamid; Vafaee Najar, Ali; Meraji, Marziye; Ebrahimipour, Hossein

    2015-04-08

    The purpose of implementing a system such as Clinical Governance (CG) is to integrate, establish and globalize distinct policies in order to improve quality through increasing the professional knowledge and accountability of healthcare professionals toward providing clinical excellence. Since CG is related to change, and change requires money and time, CG implementation has to be focused on priority areas that are in more dire need of change. The purpose of the present study was to validate and determine the significance of the items used for evaluating CG implementation. The study was descriptive-quantitative in method and design. Items used for evaluating CG implementation were first validated by the Delphi method and then compared with one another and ranked based on the Analytic Hierarchy Process (AHP) model. The items validated for evaluating CG implementation in Iran include performance evaluation, training and development, personnel motivation, clinical audit, clinical effectiveness, risk management, resource allocation, policies and strategies, external audit, information system management, research and development, CG structure, implementation prerequisites, the management of patients' non-medical needs, complaints, and patients' participation in the treatment process. The most important items, by degree of significance, were training and development, performance evaluation, and risk management. The least important items were the management of patients' non-medical needs, patients' participation in the treatment process, and research and development. The fundamental requirements of CG implementation included having an effective policy at the national level, avoiding perfectionism, using the expertise and potential of the entire country, and coordinating this model with other quality improvement models such as accreditation and patient safety. © 2015 by Kerman University of Medical Sciences.
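In AHP, item weights are obtained as the normalized principal eigenvector of a pairwise-comparison matrix, with a consistency check on the judgments. The 3×3 matrix below is a hypothetical example for three of the items (training and development, performance evaluation, risk management), not the study's actual expert judgments.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix (Saaty 1-9 scale) for three items:
# training & development, performance evaluation, risk management.
A = np.array([[1.0, 2.0, 3.0],
              [1/2, 1.0, 2.0],
              [1/3, 1/2, 1.0]])

# Priority weights = normalized principal eigenvector of A
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = vecs[:, k].real
w = w / w.sum()

# Consistency ratio; the random index RI = 0.58 for n = 3 (Saaty)
lam = vals.real.max()
ci = (lam - 3) / (3 - 1)   # consistency index
cr = ci / 0.58             # CR < 0.1 is conventionally acceptable
print(w, cr)
```

With these judgments the first item receives the largest weight, mirroring the study's finding that training and development ranked highest; the point of the CR check is to reject self-contradictory comparison matrices before using the weights.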

  6. Seabird tissue archival and monitoring project: Egg collections and analytical results 1999-2002

    Science.gov (United States)

    Vander Pol, Stacy S.; Christopher, Steven J.; Roseneau, David G.; Becker, Paul R.; Day, Russel D.; Kucklick, John R.; Pugh, Rebecca S.; Simac, Kristin S.; Weston-York, Geoff

    2003-01-01

    have been developed by STAMP (see York et al. 2001). Eggs are being collected on an annual basis for several species at nesting colonies throughout Alaska. Aliquots of these egg samples are being analyzed on a regular basis for persistent organic pollutants and mercury. Results of this work have been published in scientific journals (Christopher et al. 2002) and in conference proceedings (Kucklick et al. 2002; Vander Pol et al. 2002a, 2002b). The intent of this report is to provide an up-to-date description of STAMP. The report contains the most recent egg collection inventory, analytical data, preliminary interpretations based on these data, and a discussion of possible future directions of the project.

  7. Pressurized thermal shocks: the JRC Ispra experimental test rig and analytical results

    International Nuclear Information System (INIS)

    Jovanovic, A.; Lucia, A.C.

    1990-01-01

    The paper addresses issues of particular interest for predicting the remaining life of pressurized components exposed to pressurized thermal shock (PTS) loads, examined in analytical work performed within the MPA-JRC collaboration for PTS experimental research at JRC Ispra. These issues concern the application of damage mechanics, fracture mechanics, and artificial intelligence (including the treatment of uncertainties in PTS analysis and experiments), and are essential for further understanding and modelling of crack behaviour and component response under PTS conditions. In particular, the development of the FRAP preprocessor, and the development and implementation of a methodology for analysing local non-stationary heat transfer coefficients during a PTS, are explained in more detail. FRAP is used as a front-end to the finite element code ABAQUS for the heat transfer, stress, and fracture mechanics analyses. The ABAQUS results are then used for the probabilistic fatigue crack growth analysis performed by the COVASTOL code. (author)

  8. SEMI-ANALYTIC GALAXY EVOLUTION (SAGE): MODEL CALIBRATION AND BASIC RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Croton, Darren J.; Stevens, Adam R. H.; Tonini, Chiara; Garel, Thibault; Bernyk, Maksym; Bibiano, Antonio; Hodkinson, Luke; Mutch, Simon J.; Poole, Gregory B.; Shattow, Genevieve M. [Centre for Astrophysics and Supercomputing, Swinburne University of Technology, P.O. Box 218, Hawthorn, Victoria 3122 (Australia)

    2016-02-15

    This paper describes a new publicly available codebase for modeling galaxy formation in a cosmological context, the “Semi-Analytic Galaxy Evolution” model, or sage for short. sage is a significant update to the 2006 model of Croton et al. and has been rebuilt to be modular and customizable. The model will run on any N-body simulation whose trees are organized in a supported format and contain a minimum set of basic halo properties. In this work, we present the baryonic prescriptions implemented in sage to describe the formation and evolution of galaxies, and their calibration for three N-body simulations: Millennium, Bolshoi, and GiggleZ. Updated physics include the following: gas accretion, ejection due to feedback, and reincorporation via the galactic fountain; a new gas cooling–radio mode active galactic nucleus (AGN) heating cycle; AGN feedback in the quasar mode; a new treatment of gas in satellite galaxies; and galaxy mergers, disruption, and the build-up of intra-cluster stars. Throughout, we show the results of a common default parameterization on each simulation, with a focus on the local galaxy population.

  9. Network Traffic Analysis with Query-Driven Visualization - SC 2005 HPC Analytics Results

    Energy Technology Data Exchange (ETDEWEB)

    Stockinger, Kurt; Wu, Kesheng; Campbell, Scott; Lau, Stephen; Fisk, Mike; Gavrilov, Eugene; Kent, Alex; Davis, Christopher E.; Olinger,Rick; Young, Rob; Prewett, Jim; Weber, Paul; Caudell, Thomas P.; Bethel,E. Wes; Smith, Steve

    2005-09-01

    Our analytics challenge is to identify, characterize, and visualize anomalous subsets of large collections of network connection data. We use a combination of HPC resources, advanced algorithms, and visualization techniques. To effectively and efficiently identify the salient portions of the data, we rely on a multi-stage workflow that includes data acquisition, summarization (feature extraction), novelty detection, and classification. Once these subsets of interest have been identified and automatically characterized, we use a state-of-the-art high-dimensional query system to extract data subsets for interactive visualization. Our approach is equally useful for other large-data analysis problems where it is more practical to identify interesting subsets of the data for visualization than to render all data elements. By reducing the size of the rendering workload, we enable highly interactive and useful visualizations. As a result of this work we were able to analyze six months' worth of data interactively, with response times two orders of magnitude shorter than with conventional methods.

  10. Dynamics of tachyon fields and inflation - comparison of analytical and numerical results with observation

    Directory of Open Access Journals (Sweden)

    Milošević M.

    2016-01-01

    The role tachyon fields may play in the evolution of the early universe is discussed in this paper. We consider the evolution of a flat and homogeneous universe governed by a tachyon scalar field with a DBI-type action, and calculate the slow-roll parameters of inflation, the scalar spectral index (n), and the tensor-to-scalar ratio (r) for the given potentials. We pay special attention to the inverse power-law potential, first of all to V(x) ~ x⁻⁴, and compare the available analytical and numerical results with those obtained by observation. It is shown that the computed values of the observational parameters and the observed ones are in good agreement for high values of the constant X0. The possibility that the influence of the radion field can extend the range of acceptable values of the constant X0 into the string-theory-motivated sector of its values is briefly considered. [Project of the Serbian Ministry of Science, nos. 176021, 174020 and 43011]
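For orientation, the standard single-field slow-roll expressions usually compared with observation are sketched below. These are the canonical-scalar-field baseline formulas, not the modified tachyon/DBI versions used in the paper, where the non-canonical kinetic term alters the slow-roll parameters.

```latex
% Slow-roll parameters for a canonical scalar field with potential V(\phi)
\epsilon \simeq \frac{M_{\mathrm{Pl}}^{2}}{2}\left(\frac{V'}{V}\right)^{2},
\qquad
\eta \simeq M_{\mathrm{Pl}}^{2}\,\frac{V''}{V}
% Leading-order observables
n_{s} \simeq 1 - 6\epsilon + 2\eta, \qquad r \simeq 16\epsilon
```

Observational bounds on n_s and r then translate into constraints on the potential parameters, which is the comparison the abstract refers to.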

  11. Discordant Analytical Results Caused by Biotin Interference on Diagnostic Immunoassays in a Pediatric Hospital.

    Science.gov (United States)

    Ali, Mahesheema; Rajapakshe, Deepthi; Cao, Liyun; Devaraj, Sridevi

    2017-09-01

    Recent studies have reported that biotin interferes with certain immunoassays. In this study, we evaluated the analytical interference of biotin on immunoassays that use streptavidin-biotin binding in our pediatric hospital. We tested the effect of different concentrations of biotin (1.5-200 ng/mL) on the TSH, prolactin, ferritin, CK-MB, β-hCG, troponin I, LH, FSH, cortisol, and anti-HAV antibody (IgG and IgM) assays on the Ortho Clinical Diagnostics Vitros 5600 analyzer. Biotin (up to 200 ng/mL) did not significantly affect the troponin I and HAV assays. Biotin >6.25 ng/mL significantly affected the TSH assay (>20% bias), and prolactin was significantly affected even at low biotin levels (1.5 ng/mL). Thus, we recommend educating physicians about biotin interference in common immunoassays and adding an electronic disclaimer. © 2017 by the Association of Clinical Scientists, Inc.
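Interference of this kind is conventionally quantified as percent bias of the spiked result relative to the unspiked baseline. A minimal sketch follows; the >20% bias criterion matches the abstract, but the TSH values and the biotin-to-result mapping are hypothetical.

```python
def percent_bias(measured, baseline):
    """Percent deviation of a biotin-spiked result from the unspiked baseline."""
    return 100.0 * (measured - baseline) / baseline

# Hypothetical TSH results (mIU/L) at increasing biotin concentrations (ng/mL)
baseline = 2.50
spiked = {1.5: 2.45, 6.25: 2.30, 12.5: 1.80, 200.0: 0.60}

# Flag biotin levels whose absolute bias exceeds the 20% criterion
flagged = {b: percent_bias(v, baseline) for b, v in spiked.items()
           if abs(percent_bias(v, baseline)) > 20.0}
print(flagged)
```

In a streptavidin-biotin sandwich assay the bias from excess biotin is typically negative (falsely low results), which is what the illustrative values show.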

  12. SEMI-ANALYTIC GALAXY EVOLUTION (SAGE): MODEL CALIBRATION AND BASIC RESULTS

    International Nuclear Information System (INIS)

    Croton, Darren J.; Stevens, Adam R. H.; Tonini, Chiara; Garel, Thibault; Bernyk, Maksym; Bibiano, Antonio; Hodkinson, Luke; Mutch, Simon J.; Poole, Gregory B.; Shattow, Genevieve M.

    2016-01-01

    This paper describes a new publicly available codebase for modeling galaxy formation in a cosmological context, the “Semi-Analytic Galaxy Evolution” model, or sage for short. sage is a significant update to the 2006 model of Croton et al. and has been rebuilt to be modular and customizable. The model will run on any N-body simulation whose trees are organized in a supported format and contain a minimum set of basic halo properties. In this work, we present the baryonic prescriptions implemented in sage to describe the formation and evolution of galaxies, and their calibration for three N-body simulations: Millennium, Bolshoi, and GiggleZ. Updated physics include the following: gas accretion, ejection due to feedback, and reincorporation via the galactic fountain; a new gas cooling–radio mode active galactic nucleus (AGN) heating cycle; AGN feedback in the quasar mode; a new treatment of gas in satellite galaxies; and galaxy mergers, disruption, and the build-up of intra-cluster stars. Throughout, we show the results of a common default parameterization on each simulation, with a focus on the local galaxy population.

  13. Are factor analytical techniques used appropriately in the validation of health status questionnaires?

    DEFF Research Database (Denmark)

    de Vet, Henrica C W; Adér, Herman J; Terwee, Caroline B

    2005-01-01

    Factor analysis is widely used to evaluate whether questionnaire items can be grouped into clusters representing different dimensions of the construct under study. This review focuses on the appropriate use of factor analysis. The Medical Outcomes Study Short Form-36 (SF-36) is used as an example...... of the results and conclusions was often incomplete. Some of our results are specific for the SF-36, but the finding that both the application and the reporting of factor analysis leaves much room for improvement probably applies to other health status questionnaires as well. Optimal reporting and justification...

  14. Influence of Hemolysis on Analytic Results of Nuclear Magnetic Resonance-based Metabonomics

    Directory of Open Access Journals (Sweden)

    Qiao LIU

    2015-09-01

    Objective: To explore the changes in small-molecule metabolites and their content in plasma samples due to hemolysis, so as to analyze the influence of hemolysis of plasma samples on metabonomic studies. Methods: Healthy adult males undergoing physical examination, with no recent history of drug administration, were selected, and 10 hemolytic plasma samples and 10 hemolysis-free samples were collected from them. Hydrogen nuclear magnetic resonance (1H-NMR) spectra were collected, the Carr-Purcell-Meiboom-Gill (CPMG) pulse sequence was used to suppress the broad peaks produced by protein and lipid, and SIMCA-P+ 12.0 software was applied for pattern recognition and Pearson correlation analysis. Results: The CPMG 1H-NMR plasma metabolism spectra showed that, compared with hemolysis-free samples, hemolytic samples were evidently higher in the contents of acetate, acetone and pyruvic acid, but markedly lower in the content of glucose. In addition, the chemical shift of glycine-CH2 in the hemolysis group moved to the lower field. Orthogonal partial least-squares discriminant analysis (OPLS-DA) was further applied for pattern recognition, and the results demonstrated that the hemolysis group was prominently higher in the contents of metabolites such as leucine, valine, lysine, acetate, proline, acetone, pyruvic acid, creatine, creatinine, glycine, glycerol, serine and lactic acid, but obviously lower in the contents of isoleucine and glucose, than the hemolysis-free group. Pearson correlation analysis indicated that in hemolytic samples the contents of leucine, valine, lysine, proline, N-acetyl-glycoprotein, creatine, creatinine, glycerol and serine were higher, but that of isoleucine was lower. Conclusion: Hemolysis can lead to changes in the content of multiple metabolites and influence the analytic results of metabonomics, so in practice hemolytic samples should be excluded from study.

  15. 42 CFR 478.15 - QIO review of changes resulting from DRG validation.

    Science.gov (United States)

    2010-10-01

    § 478.15 QIO review of changes resulting from DRG validation. (a) General rules. (1) A provider or practitioner dissatisfied with a change to the diagnostic or procedural coding information made by a QIO as a result of DRG...

  16. NCI-FDA Interagency Oncology Task Force Workshop Provides Guidance for Analytical Validation of Protein-based Multiplex Assays | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    An NCI-FDA Interagency Oncology Task Force (IOTF) Molecular Diagnostics Workshop was held on October 30, 2008 in Cambridge, MA, to discuss requirements for analytical validation of protein-based multiplex technologies in the context of their intended use. The workshop, developed through NCI's Clinical Proteomic Technologies for Cancer initiative and the FDA, focused on technology-specific analytical validation processes to be addressed prior to use in clinical settings. To make the workshop unique, a case-study approach was used to discuss issues related to

  17. Dried blood spot specimen quality and validation of a new pre-analytical processing method for qualitative HIV-1 PCR, KwaZulu-Natal, South Africa

    Science.gov (United States)

    Parboosing, Raveen; Siyaca, Ntombizandile; Moodley, Pravikrishnen

    2016-01-01

    Background Poor quality dried blood spot (DBS) specimens are usually rejected by virology laboratories, affecting early infant diagnosis of HIV. The practice of combining two incompletely-filled DBS in one specimen preparation tube during pre-analytical specimen processing (i.e., the two-spot method) has been implemented to reduce the number of specimens being rejected for insufficient volume. Objectives This study analysed laboratory data to describe the quality of DBS specimens and the use of the two-spot method over a one-year period, then validated the two-spot method against the standard (one-spot) method. Methods Data on HIV-1 PCR test requests submitted in 2014 to the Department of Virology at Inkosi Albert Luthuli Central Hospital in KwaZulu-Natal province, South Africa were analysed to describe reasons for specimen rejection, as well as results of the two-spot method. The accuracy, lower limit of detection and precision of the two-spot method were assessed. Results Of the 88 481 specimens received, 3.7% were rejected for pre-analytical problems. Of those, 48.9% were rejected as a result of insufficient specimen volume. Two health facilities had significantly more specimen rejections than other facilities. The two-spot method prevented 10 504 specimen rejections. The Pearson correlation coefficient comparing the standard to the two-spot method was 0.997. Conclusions The two-spot method was comparable with the standard method of pre-analytical specimen processing. Two health facilities were identified for targeted retraining on specimen quality. The two-spot method of DBS specimen processing can be used as an adjunct to retraining, to reduce the number of specimens rejected and improve linkage to care. PMID:28879108

  18. [Comparability study of analytical results between a group of clinical laboratories].

    Science.gov (United States)

    Alsius-Serra, A; Ballbé-Anglada, M; López-Yeste, M L; Buxeda-Figuerola, M; Guillén-Campuzano, E; Juan-Pereira, L; Colomé-Mallolas, C; Caballé-Martín, I

    2015-01-01

    To describe the comparability study of the measurement levels of biological tests processed in biochemistry across Catlab's 4 laboratories. Quality requirements, coefficients of variation and total error (CV% and TE%) were established. Controls were verified against the precision requirements (CV%) for each test on each individual laboratory analyser. Fresh serum samples were used for the comparability study. The differences were analysed using a Microsoft Access® application that produces modified Bland-Altman plots. The comparison of the 32 biological parameters performed in more than one laboratory and/or analyser generated 306 Bland-Altman plots. Of these, 101 (33.1%) fell within the accepted range of values based on biological variability, and 205 (66.9%) required revision. Data were re-analysed based on consensus minimum specifications for analytical quality (consensus of the Asociación Española de Farmacéuticos Analistas (AEFA), the Sociedad Española de Bioquímica Clínica y Patología Molecular (SEQC), the Asociación Española de Biopatología Médica (AEBM) and the Sociedad Española de Hematología y Hemoterapia (SEHH), October 2013). With the new specifications, 170 comparisons (56%) met the requirements and 136 (44%) required additional review. Taking into account the number of points exceeding the requirement, random errors, the range of results in which discrepancies were detected, and the range of clinical decision, the 44% that required review were shown to be acceptable, and the 32 tests were comparable across all laboratories and analysers. The analysis of the results showed that the consensus requirements of the 4 scientific societies were met. However, each laboratory should aim to meet stricter criteria for total error. Copyright © 2015 SECA. Published by Elsevier España. All rights reserved.
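A Bland-Altman comparison of two analysers reduces to the bias (mean of the paired differences) and its 95% limits of agreement. A minimal sketch with hypothetical paired results (not Catlab data) is:

```python
import numpy as np

def bland_altman(x, y):
    """Bias and 95% limits of agreement between two analysers
    measuring the same fresh-serum samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    diff = x - y
    bias = diff.mean()
    sd = diff.std(ddof=1)                       # sample SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired glucose results (mmol/L) from two analysers
a = [4.8, 5.2, 6.1, 7.4, 9.0, 11.2]
b = [4.9, 5.1, 6.3, 7.2, 9.3, 11.0]
bias, (lo, hi) = bland_altman(a, b)
print(bias, lo, hi)
```

The plot form adds the per-sample differences against the per-sample means; acceptance is then judged by comparing the limits of agreement against specifications derived from biological variability, as in the study.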

  19. Validation of QuEChERS analytical technique for organochlorines and synthetic pyrethroids in fruits and vegetables using GC-ECD.

    Science.gov (United States)

    Dubey, J K; Patyal, S K; Sharma, Ajay

    2018-03-19

    In the present-day scenario of increasing awareness of and concern about pesticides, it is very important to ensure the quality of the data generated in pesticide residue analysis. To impart confidence in the products, quality assurance and quality control are used as an integral part of quality management, and validation of the analytical methods employed is extremely important for ensuring the quality of results. Accordingly, the QuEChERS (quick, easy, cheap, effective, rugged, and safe) multiresidue method for the extraction of 13 organochlorines and seven synthetic pyrethroids from fruits and vegetables, followed by GC-ECD quantification, was validated so that it could be used for the analysis of samples received in the laboratory. The method was validated as per the guidelines issued by SANCO (from the French "Santé" for Health and "Consommateurs" for Consumers) in accordance with their document SANCO/XXXX/2013. The parameters analysed, viz. linearity, specificity, repeatability, reproducibility, and ruggedness, were found to have acceptable values, with a relative standard deviation (RSD) of less than 10%. The limit of quantification (LOQ) was established to be 0.01 mg kg⁻¹ for the organochlorines and 0.05 mg kg⁻¹ for the synthetic pyrethroids. The uncertainty of measurement (MU) for all these compounds ranged between 1 and 10%. Matrix-matched calibration was used to compensate for matrix effects on the quantification of the compounds. The overall recovery of the method ranged between 80 and 120%. These results demonstrate the applicability and acceptability of this method for the routine estimation of residues of these 20 pesticides in fruits and vegetables by the laboratory.
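The acceptance criteria quoted above (recovery between 80 and 120%, RSD below 10%) are computed from replicate spiked samples. A minimal sketch, with hypothetical replicate recoveries for one pesticide at one spiking level:

```python
# Per-level recovery and repeatability (%RSD) from replicate spiked
# samples, the core checks of such a validation. Data are hypothetical.
import statistics

def recovery_pct(measured, spiked):
    """Mean measured concentration as a percentage of the spiked level."""
    return statistics.mean(measured) / spiked * 100

def rsd_pct(measured):
    """Relative standard deviation of the replicates, in percent."""
    return statistics.stdev(measured) / statistics.mean(measured) * 100

# Hypothetical: cypermethrin spiked at 0.05 mg/kg, five replicates
replicates = [0.047, 0.049, 0.051, 0.046, 0.050]
rec = recovery_pct(replicates, 0.05)
rsd = rsd_pct(replicates)
print(f"recovery {rec:.1f}%  RSD {rsd:.1f}%")
# Acceptance per the abstract's criteria: recovery 80-120%, RSD < 10%
assert 80 <= rec <= 120 and rsd < 10
```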

  20. Provenance validation of polished rice samples using nuclear and isotopic analytical techniques

    International Nuclear Information System (INIS)

    Pabroa, P.C.B.; Sucgang, R.J.; Mendoza, N.D.S.; Ebihara, M.; Peña, M.

    2015-01-01

    Rice (Oryza sativa) has been considered the best staple food among all cereals and is the staple food for over 3 billion people, constituting over half of the world’s population. Elemental and isotopic analysis revealed variance between Philippine and Japanese rice. Rice samples collected in Japan and in the Philippines (market survey samples from Metro Manila, and farm harvests from Aklan province and Central Luzon) were washed, dried and ground to fine powder. Elemental analyses of the samples were carried out using instrumental neutron activation analysis (INAA), while isotopic signatures of the samples were determined using isotope ratio mass spectrometry (IRMS). Results show that, compared with the unpolished rice standard NIES CRM10b, the polished Japanese and Philippine rice sampled show concentrations of elements reduced by as much as 1/10, 1/4, 1/5 and 1/3 for Mg, Mn, K and Na, respectively. Levels of Ca and Zn are not greatly affected. Arsenic, probably introduced from fertilizers used in rice fields, is found in all the Japanese rice tested at an average concentration of 0.103 μg/g and in three out of four of the Philippine rice samples at an average concentration of 0.70 μg/g. Higher levels of Br seen in two of the Philippine rice samples, at 14 and 34 μg/g, indicated a probable contamination source from the pesticide methyl bromide used during quarantine. Good correlation of isotopic signatures with geographical location for polished, but not for unpolished, rice samples from Central Luzon and Aklan indicated that provenance studies are best done on polished rice samples. The δ^{13}C isotopic signatures show that of a C3 plant, with a possibly narrow distinguishable range: Japanese rice falls within -27.5 to -28.5 while Philippine rice falls within -29 to -30. Rice provenance can thus be ascertained using elemental analysis and isotopic abundance determination, as shown by this study. (author)

  1. An assessment of the validity of inelastic design analysis methods by comparisons of predictions with test results

    International Nuclear Information System (INIS)

    Corum, J.M.; Clinard, J.A.; Sartory, W.K.

    1976-01-01

    The use of computer programs that employ relatively complex constitutive theories and analysis procedures to perform inelastic design calculations on fast reactor system components introduces questions of validation and acceptance of the analysis results. We may ask ourselves, "How valid are the answers?" These questions, in turn, involve the concepts of verification of computer programs as well as qualification of the computer programs and of the underlying constitutive theories and analysis procedures. This paper addresses the latter - the qualification of the analysis methods for inelastic design calculations. Some of the work underway in the United States to provide the information necessary to evaluate inelastic analysis methods and computer programs is described, and typical comparisons of analysis predictions with inelastic structural test results are presented. It is emphasized throughout that rather than asking how valid, or correct, the analytical predictions are, we might more properly question whether or not the combination of the predictions and the associated high-temperature design criteria leads to an acceptable level of structural integrity. It is believed that in this context the analysis predictions are generally valid, even though exact correlations between predictions and actual behavior are not obtained and cannot be expected. Final judgment, however, must be reserved for the design analyst in each specific case. (author)

  2. Comparison of EPRI safety valve test data with analytically determined hydraulic results

    International Nuclear Information System (INIS)

    Smith, L.C.; Howe, K.S.

    1983-01-01

    NUREG-0737 (November 1980) and all subsequent U.S. NRC generic follow-up letters require that all operating plant licensees and applicants verify by testing the acceptability of plant-specific pressurizer safety valve piping systems for valve operation transients. To aid in this verification process, the Electric Power Research Institute (EPRI) conducted an extensive testing program at the Combustion Engineering Test Facility. Pertinent tests simulating dynamic opening of the safety valves for representative upstream environments were carried out. Different models and sizes of safety valves were tested at the simulated operating conditions. Transducers placed at key points in the system monitored a variety of thermal, hydraulic and structural parameters, from which a more complete description of the transient can be made. The EPRI test configuration was analytically modeled using a one-dimensional thermal-hydraulic computer program that uses the method of characteristics to generate key fluid parameters as a function of space and time. The conservation equations are solved by applying both the implicit and explicit characteristic methods. Unbalanced (wave) forces were determined for each straight run of pipe bounded on each side by a turn or elbow. Blowdown forces were included where appropriate. Several parameters were varied to determine their effects on the pressure, hydraulic forces and timing of events. By comparing these quantities with the experimentally obtained data, an approximate picture of the flow dynamics is arrived at. Two cases in particular are presented: the hot and cold loop seal discharge tests made with the Crosby 6M6 spring-loaded safety valve. Included in the paper is a description of the hydraulic code, the modeling techniques and assumptions, a comparison of the numerical results with experimental data, and a qualitative description of the factors that govern pipe support loading. (orig.)

  3. Validated spectroscopic methods for determination of anti-histaminic drug azelastine in pure form: Analytical application for quality control of its pharmaceutical preparations

    Science.gov (United States)

    El-Masry, Amal A.; Hammouda, Mohammed E. A.; El-Wasseef, Dalia R.; El-Ashry, Saadia M.

    2018-02-01

    Two simple, sensitive, rapid, validated and cost-effective spectroscopic methods were established for the quantification of the antihistaminic drug azelastine (AZL) in bulk powder as well as in pharmaceutical dosage forms. In the first method (A), the absorbance difference between acidic and basic solutions was measured at 228 nm, whereas in the second method (B), the binary complex formed between AZL and Eosin Y in acetate buffer solution (pH 3) was measured at 550 nm. Different criteria that have a critical influence on the intensity of absorption were studied in depth and optimized so as to achieve the highest absorption. The proposed methods obeyed Beer's law in the concentration ranges of 2.0-20.0 μg·mL⁻¹ and 0.5-15.0 μg·mL⁻¹, with % recovery ± S.D. of (99.84 ± 0.87) and (100.02 ± 0.78) for methods (A) and (B), respectively. Furthermore, the proposed methods were easily applied for the quality control of pharmaceutical preparations without any interference from co-formulated additives, and the analytical results were compatible with those obtained by the comparison method, with no significant difference as ensured by Student's t-test and the variance ratio F-test. Validation of the proposed methods was performed according to the ICH guidelines in terms of linearity, limit of quantification, limit of detection, accuracy, precision and specificity, and the analytical results were persuasive. Graphical abstract: the absorption spectra of AZL (16 μg·mL⁻¹) in 0.1 M HCl and in 0.1 M NaOH, the difference absorption spectrum (0.1 M NaOH vs. 0.1 M HCl), and the absorption spectrum of the eosin binary complex with AZL (10 μg·mL⁻¹).
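Linearity checks of this kind come down to fitting a straight line (Beer's law) through standard concentrations and back-calculating unknowns from the fit. A sketch with hypothetical absorbance readings (not the paper's data):

```python
# Least-squares straight-line calibration and back-calculation of an
# unknown concentration. Standards and absorbances are hypothetical.
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

conc = [2.0, 5.0, 10.0, 15.0, 20.0]               # standards, ug/mL
absorbance = [0.101, 0.252, 0.498, 0.751, 1.003]  # hypothetical readings

m, c = linear_fit(conc, absorbance)
unknown = (0.400 - c) / m   # back-calculate a sample reading of 0.400
print(f"slope {m:.4f}, intercept {c:.4f}, unknown ~ {unknown:.2f} ug/mL")
```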

  4. Dried blood spot specimen quality and validation of a new pre-analytical processing method for qualitative HIV-1 PCR, KwaZulu-Natal, South Africa.

    Science.gov (United States)

    Govender, Kerusha; Parboosing, Raveen; Siyaca, Ntombizandile; Moodley, Pravikrishnen

    2016-01-01

    Poor quality dried blood spot (DBS) specimens are usually rejected by virology laboratories, affecting early infant diagnosis of HIV. The practice of combining two incompletely-filled DBS in one specimen preparation tube during pre-analytical specimen processing (i.e., the two-spot method) has been implemented to reduce the number of specimens being rejected for insufficient volume. This study analysed laboratory data to describe the quality of DBS specimens and the use of the two-spot method over a one-year period, then validated the two-spot method against the standard (one-spot) method. Data on HIV-1 PCR test requests submitted in 2014 to the Department of Virology at Inkosi Albert Luthuli Central Hospital in KwaZulu-Natal province, South Africa were analysed to describe reasons for specimen rejection, as well as results of the two-spot method. The accuracy, lower limit of detection and precision of the two-spot method were assessed. Of the 88 481 specimens received, 3.7% were rejected for pre-analytical problems. Of those, 48.9% were rejected as a result of insufficient specimen volume. Two health facilities had significantly more specimen rejections than other facilities. The two-spot method prevented 10 504 specimen rejections. The Pearson correlation coefficient comparing the standard to the two-spot method was 0.997. The two-spot method was comparable with the standard method of pre-analytical specimen processing. Two health facilities were identified for targeted retraining on specimen quality. The two-spot method of DBS specimen processing can be used as an adjunct to retraining, to reduce the number of specimens rejected and improve linkage to care.
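The method agreement reported above is a Pearson correlation between paired results from the standard and two-spot processing methods. A minimal sketch; the paired values below are hypothetical (e.g. PCR cycle-threshold-like readouts), not the study's data:

```python
# Pearson correlation between paired results from two specimen-processing
# methods. All values are hypothetical.
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

one_spot = [24.1, 27.5, 30.2, 33.8, 36.0]  # standard method (hypothetical)
two_spot = [24.4, 27.3, 30.6, 34.1, 36.2]  # two-spot method (hypothetical)
print(f"r = {pearson_r(one_spot, two_spot):.3f}")
```

An r close to 1 (the study reports 0.997) supports using the two-spot method interchangeably with the standard one.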

  5. Pesticides residues in water treatment plant sludge: validation of analytical methodology using liquid chromatography coupled to Tandem mass spectrometry (LC-MS/MS)

    International Nuclear Information System (INIS)

    Moracci, Luiz Fernando Soares

    2008-01-01

    The evolving scenario of Brazilian agriculture brings benefits to the population and demands technological advances in this field. New pesticides are constantly being introduced, encouraging scientific studies aimed at determining and evaluating their impacts on the population and on the environment. In this work, the evaluated sample was the sludge resulting from a water treatment plant (WTP) located in the Vale do Ribeira, Sao Paulo, Brazil. The technique used was reversed-phase liquid chromatography coupled to electrospray ionization tandem mass spectrometry. Compounds were first liquid-extracted from the matrix. The development of the methodology required that the data be processed into reliable information, applying the concepts of validation of chemical analysis. The evaluated parameters were selectivity, linearity, range, sensitivity, accuracy, precision, limit of detection, limit of quantification and robustness. The qualitative and quantitative results obtained were statistically treated and are presented. The developed and validated methodology is simple. Even exploiting the sensitivity of the analytical technique, the target compounds were not detected in the sludge of the WTP. These compounds may be present at concentrations below the detection limits, may be degraded under the conditions of the water treatment process, or may not be completely retained by the WTP. (author)

  6. Analytical method validation of GC-FID for the simultaneous measurement of hydrocarbons (C2-C4) in their gas mixture

    OpenAIRE

    Oman Zuas; Harry budiman; Muhammad Rizky Mulyana

    2016-01-01

    An accurate gas chromatography coupled to a flame ionization detector (GC-FID) method was validated for the simultaneous analysis of light hydrocarbons (C2-C4) in their gas mixture. The validation parameters were evaluated based on the ISO/IEC 17025 definition including method selectivity, repeatability, accuracy, linearity, limit of detection (LOD), limit of quantitation (LOQ), and ruggedness. Under the optimum analytical conditions, the analysis of gas mixture revealed that each target comp...

  7. Analytical validation of the PAM50-based Prosigna Breast Cancer Prognostic Gene Signature Assay and nCounter Analysis System using formalin-fixed paraffin-embedded breast tumor specimens

    International Nuclear Information System (INIS)

    Nielsen, Torsten; Storhoff, James; Wallden, Brett; Schaper, Carl; Ferree, Sean; Liu, Shuzhen; Gao, Dongxia; Barry, Garrett; Dowidar, Naeem; Maysuria, Malini

    2014-01-01

    NanoString’s Prosigna™ Breast Cancer Prognostic Gene Signature Assay is based on the PAM50 gene expression signature. The test outputs a risk of recurrence (ROR) score, risk category, and intrinsic subtype (Luminal A/B, HER2-enriched, Basal-like). The studies described here were designed to validate the analytical performance of the test on the nCounter Analysis System across multiple laboratories. Analytical precision was measured by testing five breast tumor RNA samples across 3 sites. Reproducibility was measured by testing replicate tissue sections from 43 FFPE breast tumor blocks across 3 sites following independent pathology review at each site. The RNA input range was validated by comparing assay results at the extremes of the specified range to the nominal RNA input level. Interference was evaluated by including non-tumor tissue in the test specimens. The measured standard deviation (SD) was less than 1 ROR unit in the analytical precision study, and the measured total SD was 2.9 ROR units in the reproducibility study. The ROR scores for RNA inputs at the extremes of the range were the same as those at the nominal input level. Assay results were stable in the presence of moderate amounts of surrounding non-tumor tissue (<70% by area). The analytical performance of NanoString’s Prosigna assay has been validated using FFPE breast tumor specimens across multiple clinical testing laboratories.

  8. Thermodynamics of atomic and ionized hydrogen: analytical results versus equation-of-state tables and Monte Carlo data.

    Science.gov (United States)

    Alastuey, A; Ballenegger, V

    2012-12-01

    We compute thermodynamic properties of a low-density hydrogen gas within the physical picture, in which the system is described as a quantum electron-proton plasma interacting via the Coulomb potential. Our calculations are done using the exact scaled low-temperature (SLT) expansion, which provides a rigorous extension of the well-known virial expansion (valid in the fully ionized phase) into the Saha regime, where the system is partially or fully recombined into hydrogen atoms. After recalling the SLT expansion of the pressure [A. Alastuey et al., J. Stat. Phys. 130, 1119 (2008)], we obtain the SLT expansions of the chemical potential and of the internal energy, up to order exp(-|E_{H}|/kT) included (E_{H}≃-13.6 eV). Those truncated expansions describe the first five nonideal corrections to the ideal Saha law. They account exactly, up to the considered order, for all effects of interactions and thermal excitations, including the formation of bound states (atom H, ions H^{-} and H_{2}^{+}, molecule H_{2},⋯) and atom-charge and atom-atom interactions. Among the five leading corrections, three are easy to evaluate, while the remaining ones involve well-defined internal partition functions for the molecule H_{2} and ions H^{-} and H_{2}^{+}, for which no closed-form analytical formulas currently exist. We provide accurate low-temperature approximations for those partition functions by using known values of rotational and vibrational energies. We then compare the predictions of the SLT expansion, for the pressure and the internal energy, with, on the one hand, the equation-of-state tables obtained within the opacity program at Livermore (OPAL) and, on the other hand, data from path integral quantum Monte Carlo (PIMC) simulations. In general, a good agreement is found. At low densities, the simple analytical SLT formulas reproduce the values of the OPAL tables up to the last digit in a large range of temperatures, while at higher densities (ρ∼10^{-2} g/cm^{3}), some

  9. Many analysts, one dataset: Making transparent how variations in analytical choices affect results

    NARCIS (Netherlands)

    Silberzahn, Raphael; Uhlmann, E.L.; Martin, D.P.; Anselmi, P.; Aust, F.; Awtrey, E.; Bahnik, Š.; Bai, F.; Bannard, C.; Bonnier, E.; Carlsson, R.; Cheung, F.; Christensen, G.; Clay, R.; Craig, M.A.; Dalla Rosa, A.; Dam, Lammertjan; Evans, M.H.; Flores Cervantes, I.; Fong, N.; Gamez-Djokic, M.; Glenz, A.; Gordon-McKeon, S.; Heaton, T.J.; Hederos, K.; Heene, M.; Hofelich Mohr, A.J.; Högden, F.; Hui, K.; Johannesson, M.; Kalodimos, J.; Kaszubowski, E.; Kennedy, D.M.; Lei, R.; Lindsay, T.A.; Liverani, S.; Madan, C.R.; Molden, D.; Molleman, Henricus; Morey, R.D.; Mulder, Laetitia; Nijstad, Bernard; Pope, N.G.; Pope, B.; Prenoveau, J.M.; Rink, Floortje; Robusto, E.; Roderique, H.; Sandberg, A.; Schlüter, E.; Schönbrodt, F.D.; Sherman, M.F.; Sommer, S.A.; Sotak, K.; Spain, S.; Spörlein, C.; Stafford, T.; Stefanutti, L.; Täuber, Susanne; Ullrich, J.; Vianello, M.; Wagenmakers, E.-J.; Witkowiak, M.; Yoon, S.; Nosek, B.A.

    2018-01-01

    Twenty-nine teams involving 61 analysts used the same dataset to address the same research question: whether soccer referees are more likely to give red cards to dark skin toned players than light skin toned players. Analytic approaches varied widely across teams, and estimated effect sizes ranged

  10. D4.1 Learning analytics: theoretical background, methodology and expected results

    NARCIS (Netherlands)

    Tammets, Kairit; Laanpere, Mart; Eradze, Maka; Brouns, Francis; Padrón-Nápoles, Carmen; De Rosa, Rosanna; Ferrari, Chiara

    2014-01-01

    The purpose of the EMMA project is to showcase excellence in innovative teaching methodologies and learning approaches through the large-scale piloting of MOOCs on different subjects. The main objectives related to the implementation of learning analytics in the EMMA project are to: ● develop the

  11. Analytical results for non-Hermitian parity–time-symmetric and ...

    Indian Academy of Sciences (India)

    Abstract. We investigate both the non-Hermitian parity–time-(PT-)symmetric and Hermitian asymmetric volcano potentials, and present the analytical solution in terms of the confluent Heun function. Under certain special conditions, the confluent Heun function can be terminated as a polynomial, thereby leading to certain ...

  12. A Validated Reverse Phase HPLC Analytical Method for Quantitation of Glycoalkaloids in Solanum lycocarpum and Its Extracts

    Directory of Open Access Journals (Sweden)

    Renata Fabiane Jorge Tiossi

    2012-01-01

    Full Text Available Solanum lycocarpum (Solanaceae) is native to the Brazilian Cerrado. Fruits of this species contain the glycoalkaloids solasonine (SN) and solamargine (SM), which display antiparasitic and anticancer properties. A method has been developed for the extraction and HPLC-UV analysis of SN and SM in different parts of S. lycocarpum, mainly comprising ripe and unripe fruits, leaf, and stem. This analytical method was validated and gave a good detection response, with linearity over a dynamic range of 0.77–1000.00 μg mL−1 and recovery in the range of 80.92–91.71%, allowing reliable quantitation of the target compounds. Unripe fruits displayed higher concentrations of glycoalkaloids (1.04% ± 0.01 of SN and 0.69% ± 0.00 of SM) than the ripe fruits (0.83% ± 0.02 of SN and 0.60% ± 0.01 of SM). Quantitation of the glycoalkaloids in the alkaloidic extract gave 45.09% ± 1.14 of SN and 44.37% ± 0.60 of SM.

  13. Analytical and numerical study of validation test-cases for multi-physic problems: application to magneto-hydro-dynamic

    Directory of Open Access Journals (Sweden)

    D Cébron

    2016-04-01

    Full Text Available The present paper is concerned with the numerical simulation of Magneto-Hydro-Dynamic (MHD) problems with industrial tools. MHD received attention some twenty to thirty years ago as a possible alternative in propulsion applications; MHD-propelled ships have even been designed for that purpose. However, such propulsion systems proved to be of low efficiency, and fundamental research in the area has progressively received much less attention over the past decades. Numerical simulation of MHD problems could, however, provide interesting solutions in the field of turbulent flow control. The development of recent efficient numerical techniques for multi-physic applications provides a promising tool for the engineer for that purpose. In the present paper, some elementary test cases in laminar flow with magnetic forcing terms are analysed; the equations of the coupled problem are exposed, analytical solutions are derived in each case and are compared to numerical solutions obtained with a numerical tool for multi-physic applications. The present work can be seen as a validation of numerical tools (based on the finite element method) for academic as well as industrial application purposes.
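For context, a classic analytic laminar-MHD validation case of the kind described is Hartmann flow between two plates, whose closed-form velocity profile can be compared point by point against a numerical solver. (This specific case is an assumption for illustration; the abstract does not name its test cases.)

```python
# Analytic Hartmann-flow profile: pressure-driven laminar flow between two
# plates with a transverse magnetic field, a standard MHD benchmark.
import math

def hartmann_profile(y, ha):
    """Normalized velocity u(y)/u(0) for channel coordinate y in [-1, 1]
    and Hartmann number ha. Walls at y = +/-1 give u = 0."""
    return (math.cosh(ha) - math.cosh(ha * y)) / (math.cosh(ha) - 1.0)

# The profile flattens as the Hartmann number grows
for ha in (1.0, 10.0):
    profile = [hartmann_profile(y / 10, ha) for y in range(-10, 11)]
    print(f"Ha={ha:4}: centreline {max(profile):.3f}, wall {profile[0]:.3f}")
```

A finite-element solution of the coupled equations can then be validated by checking its nodal velocities against this formula.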

  14. Validation of an analytical method for the determination of spiramycin, virginiamycin and tylosin in feeding-stuffs bij thin-layer chromatography and bio-autography

    NARCIS (Netherlands)

    Vincent, U.; Gizzi, G.; Holst, von C.; Jong, de J.; Michard, J.

    2007-01-01

    An inter-laboratory validation was carried out to determine the performance characteristics of an analytical method based on thin-layer chromatography (TLC) coupled to microbiological detection (bio-autography) for screening feed samples for the presence of spiramycin, tylosin and virginiamycin.

  15. STABLE CONIC-HELICAL ORBITS OF PLANETS AROUND BINARY STARS: ANALYTICAL RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Oks, E. [Physics Department, 206 Allison Lab., Auburn University, Auburn, AL 36849 (United States)

    2015-05-10

    Studies of planets in binary star systems are especially important because it was estimated that about half of binary stars are capable of supporting habitable terrestrial planets within stable orbital ranges. One-planet binary star systems (OBSS) have a limited analogy to objects studied in atomic/molecular physics: one-electron Rydberg quasimolecules (ORQ). Specifically, ORQ, consisting of two fully stripped ions of the nuclear charges Z and Z′ plus one highly excited electron, are encountered in various plasmas containing more than one kind of ion. Classical analytical studies of ORQ resulted in the discovery of classical stable electronic orbits with the shape of a helix on the surface of a cone. In the present paper we show that despite several important distinctions between OBSS and ORQ, it is possible for OBSS to have stable planetary orbits in the shape of a helix on a conical surface, whose axis of symmetry coincides with the interstellar axis; the stability is not affected by the rotation of the stars. Further, we demonstrate that the eccentricity of the stars’ orbits does not affect the stability of the helical planetary motion if the center of symmetry of the helix is relatively close to the star of the larger mass. We also show that if the center of symmetry of the conic-helical planetary orbit is relatively close to the star of the smaller mass, a sufficiently large eccentricity of the stars’ orbits can switch the planetary motion to the unstable mode and the planet would escape the system. We demonstrate that such planets are transitable for the overwhelming majority of inclinations of the plane of the stars’ orbits (i.e., the projections of the planet and the adjacent star on the plane of the sky coincide once in a while). This means that conic-helical planetary orbits at binary stars can be detected photometrically. We consider, as an example, Kepler-16 binary stars to provide illustrative numerical data on the possible parameters and the

  16. Comparison of Analytical and Measured Performance Results on Network Coding in IEEE 802.11 Ad-Hoc Networks

    DEFF Research Database (Denmark)

    Zhao, Fang; Médard, Muriel; Hundebøll, Martin

    2012-01-01

    CATWOMAN that can run on standard WiFi hardware. We present an analytical model to evaluate the performance of COPE in simple networks, and our results show the excellent predictive quality of this model. By closely examining the performance in two simple topologies, we observe that the coding gain results...

  17. Quasi-normal frequencies: Semi-analytic results for highly damped modes

    International Nuclear Information System (INIS)

    Skakala, Jozef; Visser, Matt

    2011-01-01

    Black hole highly-damped quasi-normal frequencies (QNFs) are very often of the form ω_{n} = (offset) + in (gap). We have investigated the genericity of this phenomenon for the Schwarzschild-de Sitter (SdS) black hole by considering a model potential that is piecewise Eckart (piecewise Pöschl-Teller), and developing an analytic 'quantization condition' for the highly-damped quasi-normal frequencies. We find that the ω_{n} = (offset) + in (gap) behaviour is common but not universal, with the controlling feature being whether or not the ratio of the surface gravities is a rational number. We furthermore observed that the relation between rational ratios of surface gravities and periodicity of QNFs is very generic, and also occurs within different analytic approaches applied to various types of black hole spacetimes. These observations are of direct relevance to any physical situation where highly-damped quasi-normal modes are important.

  18. Distribution of Steps with Finite-Range Interactions: Analytic Approximations and Numerical Results

    Science.gov (United States)

    GonzáLez, Diego Luis; Jaramillo, Diego Felipe; TéLlez, Gabriel; Einstein, T. L.

    2013-03-01

    While most Monte Carlo simulations assume that only nearest-neighbor steps interact elastically, most analytic frameworks (especially the generalized Wigner distribution) posit that each step elastically repels all others. In addition to the elastic repulsions, we allow for possible surface-state-mediated interactions. We investigate analytically and numerically how next-nearest-neighbor (NNN) interactions and, more generally, interactions out to the q'th nearest neighbor alter the form of the terrace-width distribution and of pair correlation functions (i.e. the sum over n'th-neighbor distribution functions, which we investigated recently [2]). For physically plausible interactions, we find modest changes when NNN interactions are included and generally negligible changes when more distant interactions are allowed. We discuss methods for extracting, from simulated experimental data, the characteristic scale-setting terms in assumed potential forms.

  19. Three-neutrino oscillations in matter: Analytical results in the adiabatic approximation

    International Nuclear Information System (INIS)

    Petcov, S.T.; Toshev, S.

    1987-01-01

    Analytical expressions for the probabilities of the transitions between different neutrino flavours in matter in the case of three lepton families and small vacuum mixing angles are obtained in the adiabatic approximation. A brief discussion of the characteristic features of the Mikheyev-Smirnov-Wolfenstein effect in the system of the three neutrino flavours ν_{e}, ν_{μ} and ν_{τ} is also given. (orig.)

  20. DEVELOPMENT AND VALIDATION OF AN HPLC-DAD ANALYTICAL METHOD TO QUANTIFY 5-METHOXYFLAVONES IN METHANOLIC EXTRACTS OF Vochysia divergens POHL CULTURED UNDER STRESS CONDITIONS

    Directory of Open Access Journals (Sweden)

    Letícia Pereira Pimenta

    Full Text Available Vochysia divergens Pohl, known as "Cambara" in Brazil, is an invasive species that is expanding throughout the Pantanal in Brazil, forming mono-dominant communities. This expansion is affecting the agricultural areas that are subject to the typical seasonal flood and drought conditions of this biome. This article describes the development and validation of an HPLC-DAD analytical method to quantify 5-methoxyflavones in methanolic extracts of greenhouse-grown V. divergens associated with one of two endophytic fungal species, Zopfiella tetraspora (Zt) or Melanconiella elegans (Me), and later subjected to water stress. The developed method gave good validation parameters and was successfully applied to quantify the flavones 3',5-dimethoxy luteolin-7-O-β-glucopyranoside (1), 5-methoxy luteolin (2), and 3',5-dimethoxy luteolin (3) in the target extracts. Inoculation of the plant with Zt decreased the concentration of flavone 1 in the extract by 2.69-fold as compared to the control. Inoculation of the plant with Zt or Me did not significantly alter the contents of flavones 2 and 3 in the extracts as compared to the control. Therefore, the aerial parts of germinated V. divergens plants inoculated with either Zt or Me responded differently in terms of the production of flavones. These results can shed light on the symbiosis between fungal microorganisms and V. divergens, which most likely influences the response of V. divergens to changes in the availability of water in the Pantanal.

  1. Analytical results of variance reduction characteristics of biased Monte Carlo for deep-penetration problems

    International Nuclear Information System (INIS)

    Murthy, K.P.N.; Indira, R.

    1986-01-01

    An analytical formulation is presented for calculating the mean and variance of transmission for a model deep-penetration problem. With this formulation, the variance reduction characteristics of two biased Monte Carlo schemes are studied. The first is the usual exponential biasing wherein it is shown that the optimal biasing parameter depends sensitively on the scattering properties of the shielding medium. The second is a scheme that couples exponential biasing to the scattering angle biasing proposed recently. It is demonstrated that the coupled scheme performs better than exponential biasing
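Exponential biasing can be illustrated on a toy, purely absorbing slab, where the transmission probability is exactly exp(-T). The model problem in the abstract also involves scattering (which drives the sensitivity of the optimal parameter), so this is only an illustrative sketch:

```python
# Exponential biasing for transmission through a purely absorbing slab of
# optical thickness T. Path lengths are sampled from a stretched
# exponential and re-weighted by the likelihood ratio, leaving the
# estimator unbiased while concentrating samples in the deep-penetration
# tail. Toy problem only; the paper's model also includes scattering.
import math, random

def transmission(T, n, stretch=1.0, rng=random.Random(1)):
    """Monte Carlo estimate (mean, sample variance) of exp(-T).
    stretch=1 is the analog game; stretch>1 is exponential biasing.
    The shared seeded RNG makes repeated runs deterministic."""
    total = total_sq = 0.0
    for _ in range(n):
        s = -stretch * math.log(rng.random())           # biased free path
        w = stretch * math.exp(-s * (1 - 1 / stretch))  # likelihood ratio
        score = w if s > T else 0.0
        total += score
        total_sq += score * score
    mean = total / n
    return mean, total_sq / n - mean * mean

T = 10.0  # ten mean free paths deep
for b in (1.0, 5.0, 10.0):
    est, var = transmission(T, 50_000, stretch=b)
    print(f"stretch {b:4}: estimate {est:.2e} "
          f"(exact {math.exp(-T):.2e}), variance {var:.1e}")
```

With stretch near T, almost every history reaches the far side with a small weight, which is exactly the variance-reduction mechanism the abstract analyses; the analog game (stretch = 1) scores almost nothing at this depth.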

  2. Results Of Analytical Sample Crosschecks For Next Generation Solvent Extraction Samples Isopar L Concentration And pH

    International Nuclear Information System (INIS)

    Peters, T.; Fink, S.

    2011-01-01

    As part of the implementation process for the Next Generation Cesium Extraction Solvent (NGCS), SRNL and F/H Lab performed a series of analytical cross-checks to ensure that the components in the NGCS solvent system do not constitute an undue analytical challenge. For measurement of entrained Isopar® L in aqueous solutions, both labs performed similarly, with results more reliable at higher concentrations (near 50 mg/L). Low bias occurred in both labs, as seen previously in comparable blind studies for the baseline solvent system. SRNL recommends considering the use of Teflon™ caps on all sample containers used for this purpose. For pH measurements, the labs showed reasonable agreement but considerable positive bias for dilute boric acid solutions. SRNL recommends considering an alternate analytical method for qualification of boric acid concentrations.

  3. Development and validation of a multi-analyte method for the regulatory control of carotenoids used as feed additives in fish and poultry feed.

    Science.gov (United States)

    Vincent, Ursula; Serano, Federica; von Holst, Christoph

    2017-08-01

    Carotenoids are used in animal nutrition mainly as sensory additives that favourably affect the colour of fish, birds and food of animal origin. Various analytical methods exist for their quantification in compound feed, reflecting the different physico-chemical characteristics of the carotenoids and the corresponding feed additives. They may be natural products or specific formulations containing the target carotenoids produced by chemical synthesis. In this study a multi-analyte method was developed that can be applied to the determination of all 10 carotenoids currently authorised within the European Union for compound feedingstuffs. The method functions regardless of whether the carotenoids have been added to the compound feed via natural products or specific formulations. It comprises three steps: (1) digestion of the feed sample with an enzyme; (2) pressurised liquid extraction; and (3) quantification of the analytes by reversed-phase HPLC coupled to a photodiode array detector in the visible range. The method was single-laboratory validated for poultry and fish feed covering a mass fraction range of the target analyte from 2.5 to 300 mg kg⁻¹. The following method performance characteristics were obtained: the recovery rate varied from 82% to 129%, and precision, expressed as the relative standard deviation of intermediate precision, varied from 1.6% to 15%. Based on the acceptable performance obtained in the validation study, the multi-analyte method is considered fit for the intended purpose.

  4. From the Phenix irradiation end to the analytical results: PROFIL R target destructive characterization

    International Nuclear Information System (INIS)

    Ferlay, G.; Dancausse, J. Ph.

    2009-01-01

    In the French long-lived radionuclide (LLRN) transmutation program, several irradiation experiments were initiated in the Phenix fast neutron reactor to obtain a better understanding of the transmutation processes. The PROFIL experiments are performed in order to collect accurate information on the total capture integral cross sections of the principal heavy isotopes and some important fission products in the spectral range of fast reactors. One of the final goals is to diminish the uncertainties on the capture cross-section of the fission products involved in reactivity losses in fast reactors. This program includes two parts: PROFIL-R irradiated in a standard fast reactor spectrum and PROFIL-M irradiated in a moderated spectrum. The PROFIL-R and PROFIL-M irradiations were completed in August 2005 and May 2008, respectively. For both irradiations more than a hundred containers with isotopes of pure actinides and other elements in different chemical forms must be characterized. This raises a technical and analytical challenge: how to recover by selective dissolution less than 5 mg of isotope powder from a container with dimensions of only a few millimeters using hot cell facilities, and how to determine analytically both trace and ultra-trace elemental and isotopic compositions with sufficient accuracy to be useful for code calculations. (authors)

  5. The Physics of Type Ia Supernova Light Curves. I. Analytic Results and Time Dependence

    International Nuclear Information System (INIS)

    Pinto, Philip A.; Eastman, Ronald G.

    2000-01-01

    We develop an analytic solution of the radiation transport problem for Type Ia supernovae (SNe Ia) and show that it reproduces bolometric light curves produced by more detailed calculations under the assumption of a constant extinction coefficient. This model is used to derive the thermal conditions in the interior of SNe Ia and to study the sensitivity of light curves to various properties of the underlying supernova explosions. Although the model is limited by simplifying assumptions, it is adequate for demonstrating that the relationship between SNe Ia maximum-light luminosity and rate of decline is most easily explained if SNe Ia span a range in mass. The analytic model is also used to examine the size of various terms in the transport equation under conditions appropriate to maximum light. For instance, the Eulerian and advective time derivatives are each shown to be of the same order of magnitude as other order v/c terms in the transport equation. We conclude that a fully time-dependent solution to the transport problem is needed in order to compute SNe Ia light curves and spectra accurate enough to distinguish subtle differences of various explosion models. (c) 2000 The American Astronomical Society

  6. DNA breathing dynamics: analytic results for distribution functions of relevant Brownian functionals.

    Science.gov (United States)

    Bandyopadhyay, Malay; Gupta, Shamik; Segal, Dvira

    2011-03-01

    We investigate DNA breathing dynamics by suggesting and examining several Brownian functionals associated with bubble lifetime and reactivity. Bubble dynamics is described as an overdamped random walk in the number of broken base pairs. The walk takes place on the Poland-Scheraga free-energy landscape. We suggest several probability distribution functions that characterize the breathing process, and adopt the recently studied backward Fokker-Planck method and the path decomposition method as elegant and flexible tools for deriving these distributions. In particular, for a bubble of an initial size x₀, we derive analytical expressions for (i) the distribution P(t_f|x₀) of the first-passage time t_f, characterizing the bubble lifetime, (ii) the distribution P(A|x₀) of the area A until the first-passage time, providing information about the effective reactivity of the bubble to processes within the DNA, (iii) the distribution P(M) of the maximum bubble size M attained before the first-passage time, and (iv) the joint probability distribution P(M, t_m) of the maximum bubble size M and the time t_m of its occurrence before the first-passage time. These distributions are analyzed in the limit of small and large bubble sizes. We supplement our analytical predictions with direct numerical simulations of the related Langevin equation, and obtain a very good agreement in the appropriate limits. The nontrivial scaling behavior of the various quantities analyzed here can, in principle, be explored experimentally.
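
The kind of Langevin simulation mentioned in the abstract can be sketched for the simplest tractable case: a bubble coordinate undergoing constant drift c toward closure with diffusion D (a stand-in for the full Poland-Scheraga landscape, which the paper actually uses). For this drifted walk the backward Fokker-Planck method gives E[t_f] = x₀/c and E[A] = x₀²/(2c) + Dx₀/c², which the Euler-Maruyama estimate below should reproduce. All parameter values are illustrative assumptions:

```python
import math
import random

def simulate(x0, c, D, dt, rng):
    """Euler-Maruyama for dX = -c dt + sqrt(2D) dW with an absorbing
    boundary at X = 0; returns the first-passage time t_f, the area A
    under the path, and the maximum bubble size M before absorption."""
    x, t, area, m = x0, 0.0, 0.0, x0
    sigma = math.sqrt(2.0 * D * dt)
    while x > 0.0:
        area += x * dt
        x += -c * dt + sigma * rng.gauss(0.0, 1.0)
        t += dt
        m = max(m, x)
    return t, area, m

rng = random.Random(1)
x0, c, D = 2.0, 1.0, 0.5    # initial bubble size, drift toward closure, diffusion
runs = [simulate(x0, c, D, 2e-3, rng) for _ in range(2000)]
mean_tf = sum(r[0] for r in runs) / len(runs)
mean_area = sum(r[1] for r in runs) / len(runs)
# backward Fokker-Planck predictions for this drifted walk:
# E[t_f] = x0 / c = 2.0,   E[A] = x0**2 / (2*c) + D * x0 / c**2 = 3.0
```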

  7. Comparison of experimental target currents with analytical model results for plasma immersion ion implantation

    International Nuclear Information System (INIS)

    En, W.G.; Lieberman, M.A.; Cheung, N.W.

    1995-01-01

    Ion implantation is a standard fabrication technique used in semiconductor manufacturing. Implantation has also been used to modify the surface properties of materials to improve their resistance to wear, corrosion and fatigue. However, conventional ion implanters require complex optics to scan a narrow ion beam across the target to achieve implantation uniformity. An alternative implantation technique, called Plasma Immersion Ion Implantation (PIII), immerses the target in a plasma. The ions are extracted from the plasma directly and accelerated by applying negative high-voltage pulses to the target. An analytical model of the voltage and current characteristics of a remote plasma is presented. The model simulates the ion, electron and secondary electron currents induced before, during and after a negative high-voltage pulse is applied to a target immersed in a plasma. The model also includes analytical relations that describe the sheath expansion and collapse due to negative high-voltage pulses. The sheath collapse is found to be important for high repetition rate pulses. Good correlation is shown between the model and experiment for a wide variety of voltage pulses and plasma conditions.

  8. Heat transfer analytical models for the rapid determination of cooling time in crystalline thermoplastic injection molding and experimental validation

    Science.gov (United States)

    Didier, Delaunay; Baptiste, Pignon; Nicolas, Boyard; Vincent, Sobotka

    2018-05-01

    Heat transfer during the cooling of a thermoplastic injected part directly affects the solidification of the polymer and consequently the quality of the part in terms of mechanical properties, geometric tolerance and surface aspect. This paper proposes to mold designers a methodology based on analytical models to quickly provide the time to reach the ejection temperature depending on the temperature and the position of cooling channels. The obtained cooling time is the first step of the thermal conception of the mold. The presented methodology is dedicated to the determination of the solidification time of a semi-crystalline polymer slab. It allows the calculation of the crystallization time of the part and is based on the analytical solution of the Stefan problem in a semi-infinite medium. The crystallization is then considered as a phase change with an effective crystallization temperature, which is obtained from Fast Scanning Calorimetry (FSC) results. The crystallization time is then corrected to take the finite thickness of the part into account. To check the accuracy of this approach, the solidification time is calculated by solving the heat conduction equation coupled to the crystallization kinetics of the polymer. The impact of the nature of the contact between the polymer and the mold is evaluated. The thermal contact resistance (TCR) appears as a significant parameter that needs to be taken into account in the cooling time calculation. The results of the simplified model, including or excluding TCR, are compared in the case of a polypropylene (PP) with experiments carried out with an instrumented mold. The methodology is then applied to a part made of PolyEtherEtherKetone (PEEK).
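
The Stefan-problem step of such a methodology can be sketched numerically: in the classical one-phase Neumann similarity solution, the solidification front advances as s(t) = 2λ√(αt), where λ solves the transcendental equation λ·exp(λ²)·erf(λ) = Ste/√π, with Ste the Stefan number cₚΔT/L. The material values below are placeholders, not the paper's data:

```python
import math

def stefan_lambda(ste):
    """Solve lam * exp(lam**2) * erf(lam) = Ste / sqrt(pi) by bisection
    (the one-phase Neumann solution's transcendental equation)."""
    target = ste / math.sqrt(math.pi)
    lo, hi = 1e-9, 5.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mid * math.exp(mid * mid) * math.erf(mid) > target:
            hi = mid        # left side is monotone increasing in lam
        else:
            lo = mid
    return 0.5 * (lo + hi)

def solidification_time(half_thickness, alpha, ste):
    """Time for the front s(t) = 2*lam*sqrt(alpha*t) to cross the half-thickness."""
    lam = stefan_lambda(ste)
    return (half_thickness / (2.0 * lam)) ** 2 / alpha

# placeholder values (assumed, not the paper's data): a 2 mm slab of a
# PP-like polymer, thermal diffusivity 1e-7 m^2/s, Stefan number 0.8
t_solid = solidification_time(1e-3, 1e-7, 0.8)
```

A quick sanity check on λ: in the small-Ste limit the equation reduces to λ ≈ √(Ste/2), and a larger Stefan number (more sensible heat relative to latent heat) gives a faster front and a shorter solidification time.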

  9. Paraxial light distribution in the focal region of a lens: a comparison of several analytical solutions and a numerical result

    Science.gov (United States)

    Wu, Yang; Kelly, Damien P.

    2014-12-01

    The distribution of the complex field in the focal region of a lens is a classical optical diffraction problem. Today, it remains of significant theoretical importance for understanding the properties of imaging systems. In the paraxial regime, it is possible to find analytical solutions in the neighborhood of the focus, when a plane wave is incident on a focusing lens whose finite extent is limited by a circular aperture. For example, in Born and Wolf's treatment of this problem, two different, but mathematically equivalent analytical solutions, are presented that describe the 3D field distribution using infinite sums of U- and V-type Lommel functions. An alternative solution expresses the distribution in terms of Zernike polynomials, and was presented by Nijboer in 1947. More recently, Cao derived an alternative analytical solution by expanding the Fresnel kernel using a Taylor series expansion. In practical calculations, however, only a finite number of terms from these infinite series expansions is actually used to calculate the distribution in the focal region. In this manuscript, we compare and contrast each of these different solutions to a numerically calculated result, paying particular attention to how quickly each solution converges for a range of different spatial locations behind the focusing lens. We also examine the time taken to calculate each of the analytical solutions. The numerical solution is calculated in a polar coordinate system and is semi-analytic. The integration over the angle is solved analytically, while the radial coordinate is sampled with a sampling interval of ? and then numerically integrated. This produces an infinite set of replicas in the diffraction plane, that are located in circular rings centered at the optical axis and each with radii given by ?, where ? is the replica order. These circular replicas are shown to be fundamentally different from the replicas that arise in a Cartesian coordinate system.

  11. One-loop Higgs plus four gluon amplitudes. Full analytic results

    International Nuclear Information System (INIS)

    Badger, Simon; Nigel Glover, E.W.; Williams, Ciaran; Mastrolia, Pierpaolo

    2009-10-01

    We consider one-loop amplitudes of a Higgs boson coupled to gluons in the limit of a large top quark mass. We treat the Higgs as the real part of a complex field φ that couples to the self-dual field strengths and compute the one-loop corrections to the φ-NMHV amplitude, which contains one gluon of positive helicity whilst the remaining three have negative helicity. We use four-dimensional unitarity to construct the cut-containing contributions and a hybrid of Feynman diagram and recursive based techniques to determine the rational piece. Knowledge of the φ-NMHV contribution completes the analytic calculation of the Higgs plus four gluon amplitude. For completeness we also include expressions for the remaining helicity configurations which have been calculated elsewhere. These amplitudes are relevant for Higgs plus jet production via gluon fusion in the limit where the top quark is large compared to all other scales in the problem. (orig.)

  12. Analytic result for the two-loop six-point NMHV amplitude in N=4 super Yang-Mills theory

    CERN Document Server

    Dixon, Lance J.; Henn, Johannes M.

    2012-01-01

    We provide a simple analytic formula for the two-loop six-point ratio function of planar N = 4 super Yang-Mills theory. This result extends the analytic knowledge of multi-loop six-point amplitudes beyond those with maximal helicity violation. We make a natural ansatz for the symbols of the relevant functions appearing in the two-loop amplitude, and impose various consistency conditions, including symmetry, the absence of spurious poles, the correct collinear behaviour, and agreement with the operator product expansion for light-like (super) Wilson loops. This information reduces the ansatz to a small number of relatively simple functions. In order to fix these parameters uniquely, we utilize an explicit representation of the amplitude in terms of loop integrals that can be evaluated analytically in various kinematic limits. The final compact analytic result is expressed in terms of classical polylogarithms, whose arguments are rational functions of the dual conformal cross-ratios, plus precisely two function...

  13. The German cervical cancer screening model: development and validation of a decision-analytic model for cervical cancer screening in Germany.

    Science.gov (United States)

    Siebert, Uwe; Sroczynski, Gaby; Hillemanns, Peter; Engel, Jutta; Stabenow, Roland; Stegmaier, Christa; Voigt, Kerstin; Gibis, Bernhard; Hölzel, Dieter; Goldie, Sue J

    2006-04-01

    We sought to develop and validate a decision-analytic model for the natural history of cervical cancer for the German health care context and to apply it to cervical cancer screening. We developed a Markov model for the natural history of cervical cancer and cervical cancer screening in the German health care context. The model reflects current German practice standards for screening, diagnostic follow-up and treatment regarding cervical cancer and its precursors. Data for disease progression and cervical cancer survival were obtained from the literature and German cancer registries. Accuracy of Papanicolaou (Pap) testing was based on meta-analyses. We performed internal and external model validation using observed epidemiological data for unscreened women from different German cancer registries. The model predicts life expectancy, incidence of detected cervical cancer cases, lifetime cervical cancer risks and mortality. The model predicted a lifetime cervical cancer risk of 3.0% and a lifetime cervical cancer mortality of 1.0%, with a peak cancer incidence of 84/100,000 at age 51 years. These results were similar to observed data from German cancer registries, German literature data and results from other international models. Based on our model, annual Pap screening could prevent 98.7% of diagnosed cancer cases and 99.6% of deaths due to cervical cancer in women completely adherent to screening and compliant to treatment. Extending the screening interval from 1 year to 2, 3 or 5 years resulted in reduced screening effectiveness. This model provides a tool for evaluating the long-term effectiveness of different cervical cancer screening tests and strategies.
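
The mechanics of such a decision-analytic Markov model can be sketched with a deliberately minimal cohort simulation. The four states and the screen-and-treat step mirror the general structure described above, but every transition probability below is an illustrative placeholder, not a calibrated value from the German model:

```python
def run_cohort(years, screen, sens=0.7):
    """Annual-cycle cohort Markov model with states well / precancer /
    cancer / dead; returns cumulative lifetime cancer incidence.
    All transition probabilities are illustrative placeholders, NOT the
    calibrated values of the German cervical cancer screening model."""
    p_well_pre, p_pre_cancer, p_pre_regress = 0.01, 0.05, 0.10
    p_cancer_death, p_other_death = 0.15, 0.01
    well, pre, cancer = 1.0, 0.0, 0.0
    cum_cancer = 0.0
    for _ in range(years):
        if screen:                   # screen-and-treat: detected precancers
            found = sens * pre       # return to the well state
            pre -= found
            well += found
        w, p, c = well, pre, cancer  # snapshot before the yearly transitions
        well = w * (1 - p_well_pre - p_other_death) + p * p_pre_regress
        pre = p * (1 - p_pre_cancer - p_pre_regress - p_other_death) + w * p_well_pre
        cancer = c * (1 - p_cancer_death) + p * p_pre_cancer
        cum_cancer += p * p_pre_cancer
    return cum_cancer

risk_unscreened = run_cohort(60, screen=False)
risk_screened = run_cohort(60, screen=True)
```

Even with these toy numbers, the model reproduces the qualitative result of the study: screening intercepts precancers before progression, so the screened cohort's lifetime cancer risk falls well below the unscreened cohort's.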

  14. 42 CFR 493.571 - Disclosure of accreditation, State and CMS validation inspection results.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health. § 493.571 Disclosure of accreditation, State and CMS validation inspection results. (a) Accreditation organization inspection results. CMS may disclose accreditation organization inspection results to...

  15. A program wide framework for evaluating data driven teaching and learning - earth analytics approaches, results and lessons learned

    Science.gov (United States)

    Wasser, L. A.; Gold, A. U.

    2017-12-01

    There is a deluge of earth systems data available to address cutting-edge science problems, yet specific skills are required to work with these data. The Earth Analytics education program, a core component of Earth Lab at the University of Colorado Boulder, is building a data-intensive program that provides training in realms including 1) interdisciplinary communication and collaboration, 2) earth science domain knowledge including geospatial science and remote sensing, and 3) reproducible, open science workflows ("earth analytics"). The earth analytics program includes an undergraduate internship, undergraduate and graduate level courses, and a professional certificate / degree program. All programs share the goal of preparing a STEM workforce for successful earth analytics driven careers. We are developing a program-wide evaluation framework that assesses the effectiveness of data-intensive instruction combined with domain science learning, to better understand and improve data-intensive teaching approaches using blends of online, in situ, asynchronous and synchronous learning. We are using targeted search engine optimization (SEO) to increase visibility and, in turn, program reach. Finally, our design targets longitudinal program impacts on participant career tracks over time. Here we present results from evaluation of both an interdisciplinary undergraduate / graduate level earth analytics course and an undergraduate internship. Early results suggest that a blended approach to learning and teaching, combining synchronous in-person teaching and active hands-on classroom learning with asynchronous learning in the form of online materials, leads to student success. Further, we present our model for longitudinal tracking of participants' career focus over time to better understand long-term program impacts. We also demonstrate the impact of SEO on online content reach and program visibility.

  16. Development, validation and application of a sensitive analytical method for residue determination and dissipation of imidacloprid in sugarcane under tropical field condition.

    Science.gov (United States)

    Ramasubramanian, T; Paramasivam, M; Nirmala, R

    2016-06-01

    A simple and sensitive analytical method has been developed and validated for the determination of trace amounts of imidacloprid in/on sugarcane sett, stalk and leaf. The method optimized in the present study requires less organic solvent and time; hence, it is suitable for high-throughput analyses involving a large number of samples. The limit of detection (LOD) and limit of quantification (LOQ) of the method were 0.003 and 0.01 mg/kg, respectively. The recovery and relative standard deviation were more than 93 % and less than 4 %, respectively. Thus, the analytical method standardized in this study is sufficiently precise and accurate to determine the residues of imidacloprid in sugarcane sett, stalk and leaf. The dissipation and translocation of imidacloprid residues from treated cane setts to leaf and stalk were studied using this method. In sugarcane setts, the residues of imidacloprid persisted up to 120 days with a half-life of 15.4 days at its recommended dose (70 g a.i./ha). The residues of imidacloprid were found to be translocated from setts to stalk and leaf, and were detected up to 105 days in both leaf and stalk. Dipping of sugarcane setts in imidacloprid at its recommended dose may result in better protection of cane setts and the established crop because of the higher initial deposit (>100 mg/kg) and longer persistence (>120 days).
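
Under the first-order kinetics implied by the reported half-life, the residue decay can be worked through directly. The values below (half-life 15.4 d, initial deposit ~100 mg/kg, LOQ 0.01 mg/kg) come from the abstract; note that extrapolating the LOQ crossing beyond the 120-day observation window assumes dissipation stays first-order, which the field data may not support:

```python
import math

def residue(c0, half_life, t):
    """First-order dissipation: C(t) = C0 * exp(-k*t), with k = ln2 / t_half."""
    k = math.log(2.0) / half_life
    return c0 * math.exp(-k * t)

def days_below(c0, threshold, half_life):
    """Time for the residue to decay from c0 down to a threshold (e.g. the LOQ)."""
    k = math.log(2.0) / half_life
    return math.log(c0 / threshold) / k

# reported values: half-life 15.4 d, initial deposit ~100 mg/kg, LOQ 0.01 mg/kg
r60 = residue(100.0, 15.4, 60.0)         # residue remaining after 60 days
t_loq = days_below(100.0, 0.01, 15.4)    # extrapolated days to fall below the LOQ
```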

  17. Development and validation of an analytical method for the quality control and stability of 10 % phenylephrine and 1 % tropicamide eyedrops

    International Nuclear Information System (INIS)

    Garcia Penna, Caridad Margarita; Botet Garcia, Martha; Troche Concepcion, Yenilen

    2011-01-01

    An analytical high-performance liquid chromatography method was developed and validated for the quality control and stability study of eyedrops containing 10 % phenylephrine plus 1 % tropicamide. To quantify both active principles simultaneously in the finished product, separation was carried out on a Lichrosorb RP-18 (15 μm) (260 x 4 mm) chromatography column, with ultraviolet detection at 253 nm, using a mobile phase composed of methanol:distilled water (1:1) with 1.1 g of sodium 1-octanesulfonate per litre and pH adjusted to 3.0 with phosphoric acid, with quantification against a reference sample by the external standard method. The analytical method developed was linear, precise, specific and accurate over the range of study concentrations established for the quality control and stability study of the finished product, since no analytical methods had previously been designed for these purposes.

  18. Development and validation of an analytical method for histamine determination in fish, using reversed-phase high-performance liquid chromatography with ultraviolet detection

    International Nuclear Information System (INIS)

    Valverde Chavarria, J. C.

    1997-01-01

    The reaction and analysis conditions for the derivatization of histamine with the o-phthalaldehyde (OPA) reagent were determined and optimized, and it was shown that the derivative formed can be quantified at 333 nm. Suitable chromatographic conditions were established for the determination of histamine in fish by reversed-phase high-performance liquid chromatography (HPLC), using pre-column derivatization of histamine with the OPA reagent and ultraviolet detection at 333 nm. The conditions of the proposed methodology were optimized and its analytical performance parameters were validated for the quantification of histamine at the mg g-1 level. The applicability of the methodology was demonstrated by determining histamine in fresh fish samples.

  19. The Viking X ray fluorescence experiment - Analytical methods and early results

    Science.gov (United States)

    Clark, B. C., III; Castro, A. J.; Rowe, C. D.; Baird, A. K.; Rose, H. J., Jr.; Toulmin, P., III; Christian, R. P.; Kelliher, W. C.; Keil, K.; Huss, G. R.

    1977-01-01

    Ten samples of the Martian regolith have been analyzed by the Viking lander X ray fluorescence spectrometers. Because of high-stability electronics, inclusion of calibration targets, and special data encoding within the instruments the quality of the analyses performed on Mars is closely equivalent to that attainable with the same instruments operated in the laboratory. Determination of absolute elemental concentrations requires gain drift adjustments, subtraction of background components, and use of a mathematical response model with adjustable parameters set by prelaunch measurements on selected rock standards. Bulk fines at both Viking landing sites are quite similar in composition, implying that a chemically and mineralogically homogeneous regolith covers much of the surface of the planet. Important differences between samples include a higher sulfur content in what appear to be duricrust fragments than in fines and a lower iron content in fines taken from beneath large rocks than those taken from unprotected surface material. Further extensive reduction of these data will allow more precise and more accurate analytical numbers to be determined and thus a more comprehensive understanding of elemental trends between samples.

  20. Crystal growth of pure substances: Phase-field simulations in comparison with analytical and experimental results

    Science.gov (United States)

    Nestler, B.; Danilov, D.; Galenko, P.

    2005-07-01

    A phase-field model for non-isothermal solidification in multicomponent systems [SIAM J. Appl. Math. 64 (3) (2004) 775-799] consistent with the formalism of classic irreversible thermodynamics is used for numerical simulations of crystal growth in a pure material. The relation of this approach to the phase-field model by Bragard et al. [Interface Science 10 (2-3) (2002) 121-136] is discussed. 2D and 3D simulations of dendritic structures are compared with the analytical predictions of the Brener theory [Journal of Crystal Growth 99 (1990) 165-170] and with recent experimental measurements of solidification in pure nickel [Proceedings of the TMS Annual Meeting, March 14-18, 2004, pp. 277-288; European Physical Journal B, submitted for publication]. 3D morphology transitions are obtained for variations in surface energy and kinetic anisotropies at different undercoolings. In computations, we investigate the convergence behaviour of a standard phase-field model and of its thin interface extension at different undercoolings and at different ratios between the diffuse interface thickness and the atomistic capillary length. The influence of the grid anisotropy is accurately analyzed by comparing a finite difference method with an adaptive finite element method.

  1. Copper and tin isotopic analysis of ancient bronzes for archaeological investigation: development and validation of a suitable analytical methodology.

    Science.gov (United States)

    Balliana, Eleonora; Aramendía, Maite; Resano, Martin; Barbante, Carlo; Vanhaecke, Frank

    2013-03-01

    Although in many cases Pb isotopic analysis can be relied on for provenance determination of ancient bronzes, sometimes the use of "non-traditional" isotopic systems, such as those of Cu and Sn, is required. The work reported on in this paper aimed at revising the methodology for Cu and Sn isotope ratio measurements in archaeological bronzes via optimization of the analytical procedures in terms of sample pre-treatment, measurement protocol, precision, and analytical uncertainty. For Cu isotopic analysis, both Zn and Ni were investigated for their merit as internal standard (IS) relied on for mass bias correction. The use of Ni as IS seems to be the most robust approach as Ni is less prone to contamination, has a lower abundance in bronzes and an ionization potential similar to that of Cu, and provides slightly better reproducibility values when applied to NIST SRM 976 Cu isotopic reference material. The possibility of carrying out direct isotopic analysis without prior Cu isolation (with AG-MP-1 anion exchange resin) was investigated by analysis of CRM IARM 91D bronze reference material, synthetic solutions, and archaeological bronzes. Both procedures (Cu isolation/no Cu isolation) provide similar δ⁶⁵Cu results with similar uncertainty budgets in all cases (±0.02-0.04 per mil in delta units, k = 2, n = 4). Direct isotopic analysis of Cu therefore seems feasible, without evidence of spectral interference or matrix-induced effect on the extent of mass bias. For Sn, a separation protocol relying on TRU-Spec anion exchange resin was optimized, providing a recovery close to 100 % without on-column fractionation. Cu was recovered quantitatively together with the bronze matrix with this isolation protocol. Isotopic analysis of this Cu fraction provides δ⁶⁵Cu results similar to those obtained upon isolation using AG-MP-1 resin. This means that Cu and Sn isotopic analysis of bronze alloys can therefore be carried out after a single chromatographic

  2. Analytical method development and validation of spectrofluorimetric and spectrophotometric determination of some antimicrobial drugs in their pharmaceuticals

    Science.gov (United States)

    Ibrahim, F.; Wahba, M. E. K.; Magdy, G.

    2018-01-01

    In this study, three novel, sensitive, simple and validated spectrophotometric and spectrofluorimetric methods are proposed for the estimation of some important antimicrobial drugs. The first two methods were developed for the estimation of two important third-generation cephalosporin antibiotics, namely cefixime and cefdinir. Both methods are based on condensation of the primary amino group of the studied drugs with acetylacetone and formaldehyde in acidic medium. The resulting products were measured spectrophotometrically (method I) and spectrofluorimetrically (method II). In method I, the absorbance was measured at 315 nm and 403 nm, with linearity ranges of 5.0-140.0 and 10.0-100.0 μg/mL for cefixime and cefdinir, respectively. In method II, the produced fluorophore was measured at λem 488 nm or 491 nm after excitation at λex 410 nm, with linearity ranges of 0.20-10.0 and 0.20-36.0 μg/mL for cefixime and cefdinir, respectively. Method III was devoted to the spectrofluorimetric estimation of nifuroxazide, based on the formation of a highly fluorescent product upon reduction of the drug with zinc powder in acidic medium. The fluorescent product was measured at λem 335 nm following excitation at λex 255 nm, with a linearity range of 0.05 to 1.6 μg/mL. The developed methods were subjected to a detailed validation procedure and were used for the estimation of the concerned drugs in their pharmaceuticals. Good agreement was found between the obtained results and those of the reported methods.

  3. Properties of transit-time interactions in magnetized plasmas: Analytic and numerical results

    International Nuclear Information System (INIS)

    Melatos, A.; Robinson, P.A.

    1993-01-01

    The recently developed perturbation theory of transit-time interactions between particles and coherent wave packets in magnetized plasmas is applied to particular field structures. Limits of validity are determined by comparison with test-particle simulations, showing that the theory is accurate everywhere except near certain well-determined resonances, for wave fields exceeding a characteristic threshold, and for particles below a particular velocity. The properties of transit-time interactions in magnetized plasmas are investigated in detail to determine their dependence on the fields and parameters of the particle motion. Resonant particle scattering is found to occur at low particle velocities when the frequency of the coherent wave packet is an integer multiple of the gyrofrequency. Two different types of resonant transit-time dissipation are also observed: one arises from transient cyclotron acceleration in the localized wave packet, the other from beating between the gyration of the particles and the oscillation of the wave packet field. Both effects involve an interplay between the field geometry and resonant oscillations

  4. Simple quasi-analytical holonomic homogenization model for the non-linear analysis of in-plane loaded masonry panels: Part 2, structural implementation and validation

    Science.gov (United States)

    Milani, G.; Bertolesi, E.

    2017-07-01

    The simple quasi-analytical holonomic homogenization approach for the non-linear analysis of in-plane loaded masonry presented in Part 1 is here implemented at a structural level and validated. For the implementation, a Rigid Body and Spring Mass model (RBSM) is adopted, relying on a numerical model consisting of rigid elements interconnected by homogenized inelastic normal and shear springs placed at the interfaces between adjoining elements. This approach is also known as HRBSM. The inherent advantage is that it is not necessary to solve a homogenization problem at each load step in each Gauss point, and a direct implementation into commercial software by means of an external user-supplied subroutine is straightforward. In order to gain insight into the capabilities of the present approach to reproduce masonry behavior at a structural level, non-linear static analyses are conducted on a shear wall for which experimental and numerical data are available in the technical literature. Quite accurate results are obtained with a very limited computational effort.

  5. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    Science.gov (United States)

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

    A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage(©) guides users through the collection of their family health history by relative, generates a pedigree, completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care to those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care when comparing the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine. Copyright © 2010 S. Karger AG, Basel.

  6. Piecewise linear emulator of the nonlinear Schroedinger equation and the resulting analytic solutions for Bose-Einstein condensates

    International Nuclear Information System (INIS)

    Theodorakis, Stavros

    2003-01-01

    We emulate the cubic term Ψ³ in the nonlinear Schroedinger equation by a piecewise linear term, thus reducing the problem to a set of uncoupled linear inhomogeneous differential equations. The resulting analytic expressions constitute an excellent approximation to the exact solutions, as is explicitly shown in the case of the kink, the vortex, and a δ function trap. Such a piecewise linear emulation can be used for any differential equation where the only nonlinearity is a Ψ³ one. In particular, it can be used for the nonlinear Schroedinger equation in the presence of harmonic traps, giving analytic Bose-Einstein condensate solutions that reproduce very accurately the numerically calculated ones in one, two, and three dimensions
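    The core idea can be illustrated numerically: replace the cubic nonlinearity by its chord-wise linear interpolant on a few segments, within each of which the equation becomes linear and inhomogeneous. This is a generic sketch of the emulation step (the segmentation is illustrative, not the one used in the paper):

```python
def piecewise_linear_cubic(x, knots):
    """Chord-wise linear emulation of the cubic term f(x) = x**3.

    On each segment [knots[i], knots[i+1]] the nonlinearity is replaced
    by the straight line through (knots[i], knots[i]**3) and
    (knots[i+1], knots[i+1]**3); within that segment an equation with
    this term is therefore a linear inhomogeneous ODE.
    """
    for a, b in zip(knots, knots[1:]):
        if a <= x <= b:
            t = (x - a) / (b - a)
            return (1 - t) * a**3 + t * b**3
    raise ValueError("x outside knot range")

knots = [i / 10 for i in range(11)]   # 10 linear segments on [0, 1]
xs = [i / 1000 for i in range(1001)]
max_err = max(abs(piecewise_linear_cubic(x, knots) - x**3) for x in xs)
print(max_err < 0.01)   # emulation error stays below 1% of the range
```

    For linear interpolation the error on a segment of width h is bounded by h²·max|f''|/8, so refining the segmentation shrinks the discrepancy quadratically, consistent with the "excellent approximation" reported above.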

  7. Fields, particles and analyticity: recent results or 30 goldberg (ER) variations on B.A.C.H

    International Nuclear Information System (INIS)

    Bros, J.

    1991-01-01

    As it is known, Axiomatic Field Theory (A) implies double analyticity of the η-point functions in space-time and energy-momentum Complex Variables (C), with various interconnections by Fourier-Laplace analysis. When the latter is replaced by Harmonic Analysis (H) on spheres and hyperboloids, a new kind of double analyticity results from (A) (i.e. from locality, spectral condition, temperateness and invariance): complex angular momentum is thereby introduced (a missing chapter in (A)). Exploitation of Asymptotic Completeness via Bethe-Salpeter-type equations (B) leads to new developments of the previous theme on (A, C, H) (complex angular momentum) and of other themes on (A, C) (crossing, Haag-Swieca property, etc.). Various aspects of (A) + (B) have been implemented in Constructive Field Theory (composite spectrum, asymptotic properties, etc.) by a combination of specific techniques and of model-independent methods

  8. Fields, particles and analyticity: recent results or 30 goldberg (ER) variations on B.A.C.H

    Energy Technology Data Exchange (ETDEWEB)

    Bros, J

    1992-12-31

    As it is known, Axiomatic Field Theory (A) implies double analyticity of the η-point functions in space-time and energy-momentum Complex Variables (C), with various interconnections by Fourier-Laplace analysis. When the latter is replaced by Harmonic Analysis (H) on spheres and hyperboloids, a new kind of double analyticity results from (A) (i.e. from locality, spectral condition, temperateness and invariance): complex angular momentum is thereby introduced (a missing chapter in (A)). Exploitation of Asymptotic Completeness via Bethe-Salpeter-type equations (B) leads to new developments of the previous theme on (A, C, H) (complex angular momentum) and of other themes on (A, C) (crossing, Haag-Swieca property, etc.). Various aspects of (A) + (B) have been implemented in Constructive Field Theory (composite spectrum, asymptotic properties, etc.) by a combination of specific techniques and of model-independent methods.

  9. Comment on 'Analytical results for a Bessel function times Legendre polynomials class integrals'

    International Nuclear Information System (INIS)

    Cregg, P J; Svedlindh, P

    2007-01-01

    A result is obtained, stemming from Gegenbauer, where the products of certain Bessel functions and exponentials are expressed in terms of an infinite series of spherical Bessel functions and products of associated Legendre functions. Closed form solutions for integrals involving Bessel functions times associated Legendre functions times exponentials, recently elucidated by Neves et al (J. Phys. A: Math. Gen. 39 L293), are then shown to result directly from the orthogonality properties of the associated Legendre functions. This result offers greater flexibility in the treatment of classical Heisenberg chains and may do so in other problems such as occur in electromagnetic diffraction theory. (comment)

  10. The structure and evolution of galacto-detonation waves - Some analytic results in sequential star formation models of spiral galaxies

    Science.gov (United States)

    Cowie, L. L.; Rybicki, G. B.

    1982-01-01

    Waves of star formation in a uniform, differentially rotating disk galaxy are treated analytically as a propagating detonation wave front. It is shown that, if single solitary waves could be excited, they would evolve asymptotically to one of two stable spiral forms, each of which rotates with a fixed pattern speed. Simple numerical solutions confirm these results. However, the pattern of waves that develops naturally from an initially localized disturbance is more complex and dies out within a few rotation periods. These results suggest a conclusive observational test for deciding whether sequential star formation is an important determinant of spiral structure in some class of galaxies.

  11. Comparison of gamma knife validation film's analysis results of different film dose analysis software

    International Nuclear Information System (INIS)

    Cheng Xiaojun; Zhang Conghua; Liu Han; Dai Fuyou; Hu Chuanpeng; Liu Cheng; Yao Zhongfu

    2011-01-01

    Objective: To compare the analytical results of different film dose analysis software packages for the same gamma knife, analyze the causes of the differences, and explore measures for quality control and quality assurance when testing a gamma knife and analyzing its results. Methods: The Moon Deity gamma knife was tested with Kodak EDR2 film and the γ-Star gamma knife with GAFCHROMIC® EBT film. All validation films were scanned with an EPSON PERFECTION V750 PRO scanner into image formats suitable for the dose analysis software. Images from the Moon Deity gamma knife were then analyzed with Robot Knife Adjuvant 1.09 and Fas-09 1.0, and images from the γ-Star gamma knife with Fas-09 and MATLAB 7.0. Results: For the Moon Deity gamma knife, there was no significant difference between Robot Knife Adjuvant and Fas-09 in the maximum deviation of the radiation field size (full width at half maximum, FWHM) from its nominal value (t=-2.133, P>0.05). Analysis of the penumbra width of the radiation field for collimators of different sizes showed significant differences (t=-8.154, P<0.05). For the γ-Star gamma knife, there was no significant difference between Fas-09 and MATLAB in the maximum deviation of FWHM from its nominal value (t=-1.384, P>0.05). However, following national standards, analysis of the φ4 mm collimator yielded different outcomes with the two software packages: the Fas-09 result was not qualified while the MATLAB result was qualified. Analysis of the penumbra width for collimators of different sizes again showed significant differences (t=3.074, P<0.05). The images were processed with Fas-09. Comparison of the pre- and post-processing images showed no significant difference in the maximum deviation of FWHM from its nominal value (t=0.647, P>0.05), and the analysis of the penumbra width of the radiation field indicated that there is

  12. ValidatorDB: database of up-to-date validation results for ligands and non-standard residues from the Protein Data Bank.

    Science.gov (United States)

    Sehnal, David; Svobodová Vařeková, Radka; Pravda, Lukáš; Ionescu, Crina-Maria; Geidl, Stanislav; Horský, Vladimír; Jaiswal, Deepti; Wimmerová, Michaela; Koča, Jaroslav

    2015-01-01

    Following the discovery of serious errors in the structure of biomacromolecules, structure validation has become a key topic of research, especially for ligands and non-standard residues. ValidatorDB (freely available at http://ncbr.muni.cz/ValidatorDB) offers a new step in this direction, in the form of a database of validation results for all ligands and non-standard residues from the Protein Data Bank (all molecules with seven or more heavy atoms). Model molecules from the wwPDB Chemical Component Dictionary are used as reference during validation. ValidatorDB covers the main aspects of validation of annotation, and additionally introduces several useful validation analyses. The most significant is the classification of chirality errors, allowing the user to distinguish between serious issues and minor inconsistencies. Other such analyses are able to report, for example, completely erroneous ligands, alternate conformations or complete identity with the model molecules. All results are systematically classified into categories, and statistical evaluations are performed. In addition to detailed validation reports for each molecule, ValidatorDB provides summaries of the validation results for the entire PDB, for sets of molecules sharing the same annotation (three-letter code) or the same PDB entry, and for user-defined selections of annotations or PDB entries. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  13. Analytical Method Validation and Determination of Pyridoxine, Nicotinamide, and Caffeine in Energy Drinks Using Thin Layer Chromatography-Densitometry

    Directory of Open Access Journals (Sweden)

    Florentinus Dika Octa Riswanto

    2015-03-01

    Food supplements containing vitamins and stimulants such as caffeine are classified as energy drinks. A TLC-densitometry method was chosen to determine pyridoxine, nicotinamide, and caffeine in energy drink samples. TLC plates of silica gel 60 F254 were used as the stationary phase, and methanol : ethyl acetate : ammonia 25% (134:77:10) was used as the mobile phase. The correlation coefficients for pyridoxine, nicotinamide, and caffeine were 0.9982, 0.9997, and 0.9966, respectively. The detection and quantitation limits for the three analytes were 4.05 and 13.51 µg/mL; 13.15 and 43.83 µg/mL; and 5.43 and 18.11 µg/mL, respectively. The recoveries of pyridoxine, nicotinamide, and caffeine were within the required range of 95-105%. The RSD values were below the limits of 5.7% for caffeine and nicotinamide and 8% for pyridoxine. The pyridoxine content in samples 1 and 2 was 33.59 ± 0.981 and 30.29 ± 2.061 µg/mL, respectively. The nicotinamide content in samples 1 and 2 was 106.53 ± 3.521 and 98.20 ± 3.648 µg/mL, respectively. The caffeine content in samples 1 and 2 was 249.50 ± 5.080 and 252.80 ± 2.640 µg/mL, respectively. Robustness test results showed which method conditions were optimal and should be applied for the analysis.
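    Detection and quantitation limits like those quoted in this record are conventionally derived from calibration data as LOD = k·σ/S and LOQ = 10·σ/S, where σ is the standard deviation of the response (e.g. of the intercept or residuals), S the calibration slope, and k is 3.3 in the ICH convention (some authors use 3). A sketch with hypothetical calibration numbers, not the paper's raw data:

```python
def detection_limits(sigma, slope):
    """ICH-style limits from calibration data:
    LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical residual std. dev. and slope of a densitometric calibration
sigma, slope = 61.4, 50.0     # response units; response per (ug/mL)
lod, loq = detection_limits(sigma, slope)
print(round(lod, 2), round(loq, 2))   # -> 4.05 12.28
```

    Since both limits scale as σ/S, a steeper calibration curve or less noisy response lowers both LOD and LOQ proportionally.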

  14. Experimental validation of the twins prediction program for rolling noise. Pt.2: results

    NARCIS (Netherlands)

    Thompson, D.J.; Fodiman, P.; Mahé, H.

    1996-01-01

    Two extensive measurement campaigns have been carried out to validate the TWINS prediction program for rolling noise, as described in part 1 of this paper. This second part presents the experimental results of vibration and noise during train pass-bys and compares them with predictions from the

  15. Planck intermediate results: IV. the XMM-Newton validation programme for new Planck galaxy clusters

    DEFF Research Database (Denmark)

    Bartlett, J.G.; Delabrouille, J.; Ganga, K.

    2013-01-01

    We present the final results from the XMM-Newton validation follow-up of new Planck galaxy cluster candidates. We observed 15 new candidates, detected with signal-to-noise ratios between 4.0 and 6.1 in the 15.5-month nominal Planck survey. The candidates were selected using ancillary data flags d...

  16. Comparison of analytical models and experimental results for single-event upset in CMOS SRAMs

    International Nuclear Information System (INIS)

    Mnich, T.M.; Diehl, S.E.; Shafer, B.D.

    1983-01-01

    In an effort to design fully radiation-hardened memories for satellite and deep-space applications, a 16K and a 2K CMOS static RAM were modeled for single-particle upset during the design stage. The modeling resulted in the addition of a hardening feedback resistor in the 16K, while the 2K design remained tentatively unaltered. Subsequent experiments, using the Lawrence Berkeley Laboratories' 88-inch cyclotron to accelerate krypton and oxygen ions, established an upset threshold for the 2K and for the 16K without added resistance, as well as a hardening threshold for the 16K with feedback resistance added. Results showed the 16K to be hardenable to a higher level than previously published data for other unhardened 16K RAMs. The data agreed fairly well with the modeling results; however, a close look suggests that modification of the simulation methodology is required to accurately predict the resistance necessary to harden the RAM cell

  17. Development of Magnetometer Digital Circuit for KSR-3 Rocket and Analytical Study on Calibration Result

    Directory of Open Access Journals (Sweden)

    Eun-Seok Lee

    2002-12-01

    This paper describes the redesign and the calibration results of the MAG digital circuit onboard the KSR-3. We enhanced the sampling rate of the magnetometer data, reduced noise, and increased the reliability of the data. Digital calibration of the magnetometer confirmed that the AIM resolution degraded by less than 1 nT relative to the analog calibration. We therefore used a numerical program to correct this problem, from which the correction and the error of the data could be calculated. These corrections will be applied to the magnetometer data after the launch of KSR-3.

  18. Analytical results for post-buckling behaviour of plates in compression and in shear

    Science.gov (United States)

    Stein, M.

    1985-01-01

    The postbuckling behavior of long rectangular isotropic and orthotropic plates is determined. By assuming trigonometric functions in one direction, the nonlinear partial differential equations of von Karman large deflection plate theory are converted into nonlinear ordinary differential equations. The ordinary differential equations are solved numerically using an available boundary value problem solver which makes use of Newton's method. Results for longitudinal compression show different postbuckling behavior between isotropic and orthotropic plates. Results for shear show that change in inplane edge constraints can cause large change in postbuckling stiffness.

  19. Validation Test Results for Orthogonal Probe Eddy Current Thruster Inspection System

    Science.gov (United States)

    Wincheski, Russell A.

    2007-01-01

    Recent nondestructive evaluation efforts within NASA have focused on an inspection system for the detection of intergranular cracking originating in the relief radius of Primary Reaction Control System (PRCS) thrusters. Of particular concern is deep cracking in this area, which could lead to combustion leakage in the event of through-wall cracking from the relief radius into an acoustic cavity of the combustion chamber. In order to reliably detect such defects while ensuring minimal false positives during inspection, the Orthogonal Probe Eddy Current (OPEC) system has been developed and an extensive validation study performed. This report describes the validation procedure, sample set, and inspection results, and compares the validation flaws with the response from naturally occurring damage.

  20. 42 CFR 476.85 - Conclusive effect of QIO initial denial determinations and changes as a result of DRG validations.

    Science.gov (United States)

    2010-10-01

    ... determinations and changes as a result of DRG validations. 476.85 Section 476.85 Public Health CENTERS FOR... denial determinations and changes as a result of DRG validations. A QIO initial denial determination or change as a result of DRG validation is final and binding unless, in accordance with the procedures in...

  1. Tank 241-SY-102, January 2000 Compatibility Grab Samples Analytical Results for the Final Report

    International Nuclear Information System (INIS)

    BELL, K.E.

    2000-01-01

    This document is the format IV, final report for the tank 241-SY-102 (SY-102) grab samples taken in January 2000 to address waste compatibility concerns. Chemical, radiochemical, and physical analyses on the tank SY-102 samples were performed as directed in Compatibility Grab Sampling and Analysis Plan for Fiscal Year 2000 (Sasaki 1999). No notification limits were exceeded. Preliminary data on samples 2SY-99-5, -6, and -7 were reported in ''Format II Report on Tank 241-SY-102 Waste Compatibility Grab Samples Taken in January 2000'' (Lockrem 2000). The data presented here represent the final results

  2. Air Monitoring Network at Tonopah Test Range: Network Description, Capabilities, and Analytical Results

    International Nuclear Information System (INIS)

    Hartwell, William T.; Daniels, Jeffrey; Nikolich, George; Shadel, Craig; Giles, Ken; Karr, Lynn; Kluesner, Tammy

    2012-01-01

    During the period April to June 2008, at the behest of the Department of Energy (DOE), National Nuclear Security Administration, Nevada Site Office (NNSA/NSO), the Desert Research Institute (DRI) constructed and deployed two portable environmental monitoring stations at the Tonopah Test Range (TTR) as part of the Environmental Restoration Project Soils Activity. DRI has operated these stations since that time. A third station was deployed in the period May to September 2011. The TTR is located within the northwest corner of the Nevada Test and Training Range (NTTR), and covers an area of approximately 725.20 km² (280 mi²). The primary objective of the monitoring stations is to evaluate whether and under what conditions there is wind transport of radiological contaminants from Soils Corrective Action Units (CAUs) associated with Operation Roller Coaster on TTR. Operation Roller Coaster was a series of tests, conducted in 1963, designed to examine the stability and dispersal of plutonium in storage and transportation accidents. These tests did not result in any nuclear explosive yield. However, the tests did result in the dispersal of plutonium and contamination of surface soils in the surrounding area.

  3. Investigation and analytical results of bituminized products in drums at filling room

    International Nuclear Information System (INIS)

    Shibata, Atsuhiro; Kato, Yoshiyuki; Sano, Yuichi; Kitajima, Takafumi; Fujita, Hideto

    1999-09-01

    This report describes the results of the investigation of the bituminized products in drums, the liquid waste in receiving tank V21, and the bituminized mixture in the extruder. The investigation of the products in drums showed that most of the unburned products filled after 28B had abnormalities, such as hardened surfaces, caves, and porous brittle products. The particle size of the salt fixed in the bituminized products depended neither on batch number nor on feed rate, indicating that fining of the salt particles caused by the decreased feed rate did not occur. The measured concentrations of metals and anions in the bituminized products showed no abnormality, and no catalytic content was recognized in the products. The infrared absorption spectra obtained from the bituminized products show that the oxidation at the incident occurred without oxygen. There was no organic phase on the surface of the liquid waste in V21. Chemical and thermal analyses of the precipitate in V21 showed no abnormality. The concentration of sodium nitrate/nitrite in the mixture collected from the extruder was lower than in normal products. These results show no chemical activation of the bituminized products. It can be concluded that the chemical characteristics of the products had little abnormality even around the incident. (author)

  4. Bio-analytical method development and validation of Rasagiline by high performance liquid chromatography tandem mass spectrometry detection and its application to pharmacokinetic study

    Directory of Open Access Journals (Sweden)

    Ravi Kumar Konda

    2012-10-01

    A bio-analytical method based on liquid–liquid extraction has been developed and validated for the quantification of Rasagiline in human plasma, with Rasagiline-13C3 mesylate used as the internal standard. A Zorbax Eclipse Plus C18 (2.1 mm × 50 mm, 3.5 μm) column provided chromatographic separation of the analyte, followed by detection with mass spectrometry. The method involved a simple isocratic chromatographic condition and mass spectrometric detection in positive ionization mode on an API-4000 system; the total run time was 3.0 min. The method has been validated over the linear range of 5–12000 pg/mL for Rasagiline. The intra-run and inter-run precision values were within 1.3%–2.9% and 1.6%–2.2%, respectively. The overall recoveries for Rasagiline and the Rasagiline-13C3 mesylate analog were 96.9% and 96.7%, respectively. The validated method was successfully applied to a bioequivalence and pharmacokinetic study in human volunteers under fasting conditions. Keywords: High performance liquid chromatography, Mass spectrometry, Rasagiline, Liquid–liquid extraction

  5. Analysis of environmental contamination resulting from catastrophic incidents: part 2. Building laboratory capability by selecting and developing analytical methodologies.

    Science.gov (United States)

    Magnuson, Matthew; Campisano, Romy; Griggs, John; Fitz-James, Schatzi; Hall, Kathy; Mapp, Latisha; Mullins, Marissa; Nichols, Tonya; Shah, Sanjiv; Silvestri, Erin; Smith, Terry; Willison, Stuart; Ernst, Hiba

    2014-11-01

    Catastrophic incidents can generate a large number of samples of analytically diverse types, including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for sample analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to acceptable levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high quality analytical results. This paper illustrates the result of applying this principle, in the form of a compendium of analytical methods for contaminants of interest. The compendium is based on experience with actual incidents, where appropriate and available. This paper also discusses efforts aimed at adaptation of existing methods to increase fitness-for-purpose and development of innovative methods when necessary. The contaminants of interest are primarily those potentially released through catastrophes resulting from malicious activity

  6. Analytical results for 544 water samples collected in the Attean Quartz Monzonite in the vicinity of Jackman, Maine

    Science.gov (United States)

    Ficklin, W.H.; Nowlan, G.A.; Preston, D.J.

    1983-01-01

    Water samples were collected in the vicinity of Jackman, Maine as a part of the study of the relationship of dissolved constituents in water to the sediments subjacent to the water. Each sample was analyzed for specific conductance, alkalinity, acidity, pH, fluoride, chloride, sulfate, phosphate, nitrate, sodium, potassium, calcium, magnesium, and silica. Trace elements determined were copper, zinc, molybdenum, lead, iron, manganese, arsenic, cobalt, nickel, and strontium. The longitude and latitude of each sample location and a sample site map are included in the report as well as a table of the analytical results.

  7. Tank 241-AP-105, cores 208, 209 and 210, analytical results for the final report

    International Nuclear Information System (INIS)

    Nuzum, J.L.

    1997-01-01

    This document is the final laboratory report for Tank 241-AP-105. Push mode core segments were removed from Risers 24 and 28 between July 2, 1997, and July 14, 1997. Segments were received and extruded at 222-S Laboratory. Analyses were performed in accordance with Tank 241-AP-105 Push Mode Core Sampling and Analysis Plan (TSAP) (Hu, 1997) and Tank Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995). None of the subsamples submitted for total alpha activity (AT), differential scanning calorimetry (DSC) analysis, or total organic carbon (TOC) analysis exceeded the notification limits as stated in TSAP and DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group, and are not considered in this report. Appearance and Sample Handling: Two cores, each consisting of four segments, were expected from Tank 241-AP-105. Three cores were sampled, and complete cores were not obtained. TSAP states core samples should be transported to the laboratory within three calendar days from the time each segment is removed from the tank. This requirement was not met for all cores. Attachment 1 illustrates subsamples generated in the laboratory for analysis and identifies their sources. This reference also relates tank farm identification numbers to their corresponding 222-S Laboratory sample numbers

  8. Tank 241-T-201, core 192 analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    Nuzum, J.L.

    1997-08-07

This document is the final laboratory report for Tank 241-T-201. Push mode core segments were removed from Riser 3 between April 24, 1997, and April 25, 1997. Segments were received and extruded at 222-S Laboratory. Analyses were performed in accordance with the Tank 241-T-201 Push Mode Core Sampling and Analysis Plan (TSAP) (Hu, 1997), the Letter of Instruction for Core Sample Analysis of Tanks 241-T-201, 241-T-202, 241-T-203, and 241-T-204 (LOI) (Bell, 1997), the Additional Core Composite Sample from Drainable Liquid Samples for Tank 241-T-201 (ACC) (Hall, 1997), and the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995). None of the subsamples submitted for total alpha activity (AT) or differential scanning calorimetry (DSC) analyses exceeded the notification limits stated in the DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group and are not considered in this report.

  9. Tank 241-AP-105, cores 208, 209 and 210, analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    Nuzum, J.L.

    1997-10-24

This document is the final laboratory report for Tank 241-AP-105. Push mode core segments were removed from Risers 24 and 28 between July 2, 1997, and July 14, 1997. Segments were received and extruded at 222-S Laboratory. Analyses were performed in accordance with the Tank 241-AP-105 Push Mode Core Sampling and Analysis Plan (TSAP) (Hu, 1997) and the Tank Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995). None of the subsamples submitted for total alpha activity (AT), differential scanning calorimetry (DSC) analysis, or total organic carbon (TOC) analysis exceeded the notification limits stated in the TSAP and DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group and are not considered in this report. Appearance and Sample Handling: Two cores, each consisting of four segments, were expected from Tank 241-AP-105. Three cores were sampled, and complete cores were not obtained. The TSAP states that core samples should be transported to the laboratory within three calendar days of the time each segment is removed from the tank. This requirement was not met for all cores. Attachment 1 illustrates the subsamples generated in the laboratory for analysis and identifies their sources. This reference also relates tank farm identification numbers to their corresponding 222-S Laboratory sample numbers.

  10. Tank 241-T-201, core 192 analytical results for the final report

    International Nuclear Information System (INIS)

    Nuzum, J.L.

    1997-01-01

This document is the final laboratory report for Tank 241-T-201. Push mode core segments were removed from Riser 3 between April 24, 1997, and April 25, 1997. Segments were received and extruded at 222-S Laboratory. Analyses were performed in accordance with the Tank 241-T-201 Push Mode Core Sampling and Analysis Plan (TSAP) (Hu, 1997), the Letter of Instruction for Core Sample Analysis of Tanks 241-T-201, 241-T-202, 241-T-203, and 241-T-204 (LOI) (Bell, 1997), the Additional Core Composite Sample from Drainable Liquid Samples for Tank 241-T-201 (ACC) (Hall, 1997), and the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995). None of the subsamples submitted for total alpha activity (AT) or differential scanning calorimetry (DSC) analyses exceeded the notification limits stated in the DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group and are not considered in this report.

  11. Tank 103, 219-S Facility at 222-S Laboratory, analytical results for the final report

    International Nuclear Information System (INIS)

    Fuller, R.K.

    1998-01-01

This is the final report for the polychlorinated biphenyls (PCB) analysis of Tank 103 (TK-103) in the 219-S Facility at 222-S Laboratory. Twenty 1-liter bottles (sample numbers S98SO00074 through S98SO00093) were received from TK-103 during two sampling events, on May 5 and May 7, 1998. The samples were centrifuged to separate the solids and liquids, and the centrifuged sludge was analyzed for PCBs as Aroclor mixtures. The results are discussed on page 6. The sample breakdown diagram (page 114) provides a cross-reference of the sample identification of the bulk samples to the laboratory identification number for the solids. The request for sample analysis (RSA) form is provided as page 117. The raw data are presented on page 43. Sample Description, Handling, and Preparation: Twenty samples were received in the laboratory in 1-liter bottles. The first 8 samples were received on May 5, 1998. There were insufficient solids to perform the requested PCB analysis, so 12 additional samples were collected and received on May 7, 1998. Breakdown and subsampling were performed on May 8, 1998. Sample number S98SO00084 was lost due to a broken bottle. Nineteen samples were centrifuged and the solids were collected in 8 centrifuge cones. After the last sample was processed, the solids were consolidated into 2 centrifuge cones. The first cone contained 9.7 grams of solids and 13.0 grams were collected in the second cone. The wet sludge from the first centrifuge cone was submitted to the laboratory for PCB analysis (sample number S98SO00102). The other sample portion (S98SO00103) was retained for possible additional analyses.

  12. Gas purity analytics, calibration studies, and background predictions towards the first results of XENON1T

    Energy Technology Data Exchange (ETDEWEB)

    Hasterok, Constanze

    2017-10-25

The XENON1T experiment aims at the direct detection of the well-motivated dark matter candidate of weakly interacting massive particles (WIMPs) scattering off xenon nuclei. The first science run of 34.2 live days has already achieved the most stringent upper limit on spin-independent WIMP-nucleon cross-sections above masses of 10 GeV, with a minimum of 7.7 × 10⁻⁴⁷ cm² at a mass of 35 GeV. Crucial for this unprecedented sensitivity are a high xenon gas purity and a good understanding of the background. In this work, a procedure is described that was developed to measure the purity of the experiment's xenon inventory of more than three tons during its initial transfer to the detector gas system. The technique of gas chromatography was employed to analyze the noble gas for impurities, with a focus on oxygen and krypton contaminations. Furthermore, studies on the calibration of the experiment's dominant background, induced by natural gamma and beta radiation, were performed. For this purpose, novel sources of radioactive isotopes that can be dissolved in the xenon were employed, namely ²²⁰Rn and tritium. The sources were analyzed in terms of their potential impact on the outcome of a dark matter search. As a result of the promising findings for ²²⁰Rn, the source was successfully deployed in the first science run of XENON1T. The first WIMP search of XENON1T is outlined in this thesis, in which a background component from interactions taking place in close proximity to the detector wall is identified, investigated and modeled. A background prediction was derived and incorporated into the background model of the WIMP search, which was found to be in good agreement with the observation.

  13. Environmental influences on fruit and vegetable intake: Results from a path analytic model

    Science.gov (United States)

    Liese, Angela D.; Bell, Bethany A.; Barnes, Timothy L.; Colabianchi, Natalie; Hibbert, James D.; Blake, Christine E.; Freedman, Darcy A.

    2014-01-01

    Objective Fruit and vegetable intake (F&V) is influenced by behavioral and environmental factors, but these have rarely been assessed simultaneously. We aimed to quantify the relative influence of supermarket availability, perceptions of the food environment, and shopping behavior on F&V intake. Design A cross-sectional study. Setting Eight-counties in South Carolina, USA, with verified locations of all supermarkets. Subjects A telephone survey of 831 household food shoppers ascertained F&V intake with a 17-item screener, primary food store location, shopping frequency, perceptions of healthy food availability, and calculated GIS-based supermarket availability. Path analysis was conducted. We report standardized beta coefficients on paths significant at the 0.05 level. Results Frequency of grocery shopping at primary food store (β=0.11) was the only factor exerting an independent, statistically significant direct effect on F&V intake. Supermarket availability was significantly associated with distance to food store (β=-0.24) and shopping frequency (β=0.10). Increased supermarket availability was significantly and positively related to perceived healthy food availability in the neighborhood (β=0.18) and ease of shopping access (β=0.09). Collectively considering all model paths linked to perceived availability of healthy foods, this measure was the only other factor to have a significant total effect on F&V intake. Conclusions While the majority of literature to date has suggested an independent and important role of supermarket availability for F&V intake, our study found only indirect effects of supermarket availability and suggests that food shopping frequency and perceptions of healthy food availability are two integral components of a network of influences on F&V intake. PMID:24192274
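In a path model such as the one above, an indirect effect is the product of the standardized coefficients along a path, and a total effect sums the direct effect and all indirect paths. A minimal sketch using two of the reported betas; treating this two-step path in isolation is an illustration only, not a reconstruction of the full model:

```python
from functools import reduce

def indirect_effect(*path_betas):
    """Indirect effect along one path = product of its standardized betas."""
    return reduce(lambda a, b: a * b, path_betas, 1.0)

# supermarket availability -> shopping frequency (beta = 0.10)
# shopping frequency -> F&V intake (beta = 0.11)
via_shopping_frequency = indirect_effect(0.10, 0.11)
print(round(via_shopping_frequency, 4))  # → 0.011
```

The small product (0.011) is consistent with the paper's conclusion that supermarket availability influences intake only weakly and indirectly.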

  14. 42 CFR 476.94 - Notice of QIO initial denial determination and changes as a result of a DRG validation.

    Science.gov (United States)

    2010-10-01

    ... changes as a result of a DRG validation. 476.94 Section 476.94 Public Health CENTERS FOR MEDICARE... changes as a result of a DRG validation. (a) Notice of initial denial determination—(1) Parties to be... retrospective review, (excluding DRG validation and post procedure review), within 3 working days of the initial...

  15. Development and validation of an analytical method for the determination of lead isotopic composition using ICP-QMS

    OpenAIRE

    Rodríguez-Salazar, M. T.; Morton Bermea, O.; Hernández-Álvarez, E.; García-Arreola, M. E.; Ortuño-Arzate, M. T.

    2010-01-01

    This work reports a method for the precise and accurate determination of Pb isotope composition in soils and geological matrices by ICP-QMS. Three reference materials (AGV-2, SRM 2709 and JSO-1) were repeatedly measured, using ICP-QMS instruments in order to assess the quality of this analytical procedure. Mass discrimination was evaluated for Pb/Pb with Pb isotope reference material NIST SRM 981, and the correction applied to the above mentioned reference materials to achieve good accuracy o...

  16. Evaluation of convergent and discriminant validity of the Russian version of MMPI-2: First results

    Directory of Open Access Journals (Sweden)

    Emma I. Mescheriakova

    2015-06-01

The paper presents the results of construct validity testing for a new version of the MMPI-2 (Minnesota Multiphasic Personality Inventory), whose restandardization started in 1982 (J.N. Butcher, W.G. Dahlstrom, J.R. Graham, A. Tellegen, B. Kaemmer) and is still going on. The professional community's interest in this new version of the Inventory is determined by its advantage over the previous one: the restructured inventory and new items offer additional opportunities for psychodiagnostics and personality assessment. The construct validity testing was carried out using three up-to-date techniques, namely the Quality of Life and Satisfaction with Life questionnaire (a short version of Ritsner's instrument adapted by E.I. Rasskazova), Janoff-Bulman's World Assumptions Scale (adapted by O. Kravtsova), and the Character Strengths Assessment questionnaire developed by E. Osin based on Peterson and Seligman's Values in Action Inventory of Strengths. These psychodiagnostic techniques were selected in line with current trends in psychology, such as its orientation to positive phenomena as well as its interpretation of subjectivity potential as the need for self-determined, self-organized, self-realized and self-controlled behavior and the ability to accomplish it. The procedure of construct validity testing involved respondents from the "norm" group, with the total sample including 205 people (62% female, 32% male). It focused on the MMPI-2 additional and expanded scales (FI, BF, FP, S and K) and six of its ten basic ones (D, Pd, Pa, Pt, Sc, Si). The results obtained confirmed the construct validity of the scales concerned, which allows the MMPI-2 to be applied to examining one's personal potential instead of a set of questionnaires, facilitating, in turn, the personality researchers' objectives. The paper discusses the first stage of this construct validity testing, the further stage highlighting the factor

  17. Development and validation of simple RP-HPLC-PDA analytical protocol for zileuton assisted with Design of Experiments for robustness determination

    OpenAIRE

    Saurabh B. Ganorkar; Dinesh M. Dhumal; Atul A. Shirkhedkar

    2017-01-01

A simple, rapid, sensitive, robust, stability-indicating RP-HPLC-PDA analytical protocol was developed and validated for the analysis of zileuton racemate in bulk and in tablet formulation. Method development and resolution of degradation products from forced hydrolytic (acidic, basic, neutral), oxidative, photolytic (acidic, basic, neutral, solid state) and thermal (dry heat) degradation were achieved on a LC – GC Qualisil BDS C18 column (250 mm × 4.6 mm × 5 μm) in isocratic mode at ambie...

  18. Validation of analytical methods in GMP: the disposable Fast Read 102® device, an alternative practical approach for cell counting

    Directory of Open Access Journals (Sweden)

    Gunetti Monica

    2012-05-01

Background: The quality and safety of advanced therapy products must be maintained throughout their production and quality control cycle to ensure their final use in patients. We validated the cell count method according to the International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) and the European Pharmacopoeia, considering the tests' accuracy, precision, repeatability, linearity and range. Methods: As the cell count is a potency test, we checked accuracy, precision, and linearity according to ICH Q2. Briefly, our experimental approach was first to evaluate the accuracy of the Fast Read 102® compared to the Bürker chamber. Once the accuracy of the alternative method was demonstrated, we checked the precision and linearity tests using only the Fast Read 102®. The data were statistically analyzed by average, standard deviation and inter- and intra-operator coefficient of variation percentages. Results: All the tests performed met the established acceptance criterion of a coefficient of variation of less than ten percent. For the cell count, the precision reached by each operator had a coefficient of variation of less than ten percent (total cells) and under five percent (viable cells). The best range of dilution, to obtain a slope line value very similar to 1, was between 1:8 and 1:128. Conclusions: Our data demonstrated that the Fast Read 102® count method is accurate, precise and ensures the linearity of the results obtained in a range of cell dilution. Under our standard method procedures, this assay may thus be considered a good quality control method for the cell count as a batch release quality control test. Moreover, the Fast Read 102® chamber is a plastic, disposable device that allows a number of samples to be counted in the same chamber. Last but not least, it overcomes the problem of chamber washing after use and so allows a cell count in a clean environment such as that in a

  19. An overview of gamma-hydroxybutyric acid: pharmacodynamics, pharmacokinetics, toxic effects, addiction, analytical methods, and interpretation of results.

    Science.gov (United States)

    Andresen, H; Aydin, B E; Mueller, A; Iwersen-Bergmann, S

    2011-09-01

Abuse of gamma-hydroxybutyric acid (GHB) has been known since the early 1990s, but is not as widespread as the consumption of other illegal drugs. However, the number of severe intoxications with fatal outcomes is comparatively high, not least because of the consumption of the currently legal precursor substances gamma-butyrolactone (GBL) and 1,4-butanediol (1,4-BD). Contrary to previous assumptions, addiction to GHB or its analogues can occur, with severe symptoms of withdrawal. Moreover, GHB can be used for drug-facilitated sexual assaults. Its pharmacological effects are generated mainly by interaction with both GABA(B) and GHB receptors, as well as its influence on other transmitter systems in the human brain. Numerous analytical methods for determining GHB using chromatographic techniques were published in recent years, and an enzymatic screening method was established. However, the short window of GHB detection in blood or urine due to its rapid metabolism is a challenge. Furthermore, despite several studies addressing this problem, evaluation of analytical results can be difficult: GHB is a metabolite of GABA (gamma-aminobutyric acid), so a differentiation between endogenous and exogenous concentrations has to be made. Apart from this, in samples with a longer storage interval and especially in postmortem specimens, higher levels can be measured due to GHB generation during the postmortem interval or storage time. Copyright © 2011 John Wiley & Sons, Ltd.

  20. Effect of Micro Electrical Discharge Machining Process Conditions on Tool Wear Characteristics: Results of an Analytic Study

    DEFF Research Database (Denmark)

    Puthumana, Govindan; P., Rajeev

    2016-01-01

Micro electrical discharge machining (micro-EDM) is one of the established techniques to manufacture high aspect ratio features on electrically conductive materials. This paper presents the results and inferences of an analytical study estimating the effect of process conditions on tool electrode wear characteristics in the micro-EDM process. A new approach with two novel factors anticipated to directly control the material removal mechanism from the tool electrode is proposed, using a discharge energy factor (DEf) and a dielectric flushing factor (DFf). The results showed that the correlation between the tool wear rate (TWR) and these factors is poor. Thus, the individual effects of each factor on TWR are analyzed. The factors selected for the study of individual effects are pulse on-time, discharge peak current, gap voltage and gap flushing pressure. The tool wear rate decreases linearly with an increase in the pulse on...

  1. Validation of analytical methods in GMP: the disposable Fast Read 102® device, an alternative practical approach for cell counting.

    Science.gov (United States)

    Gunetti, Monica; Castiglia, Sara; Rustichelli, Deborah; Mareschi, Katia; Sanavio, Fiorella; Muraro, Michela; Signorino, Elena; Castello, Laura; Ferrero, Ivana; Fagioli, Franca

    2012-05-31

The quality and safety of advanced therapy products must be maintained throughout their production and quality control cycle to ensure their final use in patients. We validated the cell count method according to the International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) and the European Pharmacopoeia, considering the tests' accuracy, precision, repeatability, linearity and range. As the cell count is a potency test, we checked accuracy, precision, and linearity according to ICH Q2. Briefly, our experimental approach was first to evaluate the accuracy of the Fast Read 102® compared to the Bürker chamber. Once the accuracy of the alternative method was demonstrated, we checked the precision and linearity tests using only the Fast Read 102®. The data were statistically analyzed by average, standard deviation and inter- and intra-operator coefficient of variation percentages. All the tests performed met the established acceptance criterion of a coefficient of variation of less than ten percent. For the cell count, the precision reached by each operator had a coefficient of variation of less than ten percent (total cells) and under five percent (viable cells). The best range of dilution, to obtain a slope line value very similar to 1, was between 1:8 and 1:128. Our data demonstrated that the Fast Read 102® count method is accurate, precise and ensures the linearity of the results obtained in a range of cell dilution. Under our standard method procedures, this assay may thus be considered a good quality control method for the cell count as a batch release quality control test. Moreover, the Fast Read 102® chamber is a plastic, disposable device that allows a number of samples to be counted in the same chamber. Last but not least, it overcomes the problem of chamber washing after use and so allows a cell count in a clean environment such as that in a Cell Factory. In a good manufacturing practice setting the disposable
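The acceptance criterion above (coefficient of variation below ten percent for total cells, below five percent for viable cells) reduces to a one-line computation. A sketch with invented replicate counts, not the study's data:

```python
from statistics import mean, stdev

def cv_percent(counts):
    """Coefficient of variation of replicate counts, in percent."""
    return stdev(counts) / mean(counts) * 100.0

# Hypothetical replicate total-cell counts (cells/mL) by one operator.
total_counts = [98e4, 102e4, 100e4, 99e4]
cv = cv_percent(total_counts)
passes_total = cv < 10.0  # acceptance criterion for total cells
```

With these replicates the CV is roughly 1.7%, comfortably inside the ten percent criterion; the viable-cell check is identical with the threshold set to 5.0.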

  2. Validation of the analytical method to quantify bacitracin

    Directory of Open Access Journals (Sweden)

    Carolina Velandia-Castellanos

    2011-06-01

An analytical method was developed and validated for the quantitative determination of 15% zinc bacitracin and 11% bacitracin methylene disalicylate by the cylinder-plate method (agar diffusion), for use in the quality control of raw materials and pharmaceutical products. Specificity, selectivity, system and method linearity, accuracy, limit of quantification and precision were assessed. Through the experimental design and the statistical evaluation of the results, it was demonstrated that the analytical method is specific, selective, linear, precise (CV < 5%) and accurate (bias < 3%, Gexp < Gtab, texp < ttab) over the concentrations studied. The quantification and detection limits were 0.02 and 0.005 IU/mL, respectively. The analytical performance characteristics fulfill the requirements for the proposed analytical application.

  3. Method validation in plasma source optical emission spectroscopy (ICP-OES) - From samples to results

    International Nuclear Information System (INIS)

    Pilon, Fabien; Vielle, Karine; Birolleau, Jean-Claude; Vigneau, Olivier; Labet, Alexandre; Arnal, Nadege; Adam, Christelle; Camilleri, Virginie; Amiel, Jeanine; Granier, Guy; Faure, Joel; Arnaud, Regine; Beres, Andre; Blanchard, Jean-Marc; Boyer-Deslys, Valerie; Broudic, Veronique; Marques, Caroline; Augeray, Celine; Bellefleur, Alexandre; Bienvenu, Philippe; Delteil, Nicole; Boulet, Beatrice; Bourgarit, David; Brennetot, Rene; Fichet, Pascal; Celier, Magali; Chevillotte, Rene; Klelifa, Aline; Fuchs, Gilbert; Le Coq, Gilles; Mermet, Jean-Michel

    2017-01-01

Even though ICP-OES (Inductively Coupled Plasma - Optical Emission Spectroscopy) is now a routine analysis technique, the requirements on measurement processes impose complete control and mastery of the operating process and of the associated quality management system. The aim of this collective book is to guide the analyst through the whole measurement validation procedure and to help guarantee the mastery of its different steps: administrative and physical management of samples in the laboratory, preparation and treatment of the samples before measuring, qualification and monitoring of the apparatus, instrument setting and calibration strategy, and exploitation of results in terms of accuracy, reliability and data covariance (with the practical determination of the accuracy profile). The most recent terminology is used in the book, and numerous examples and illustrations are given for better understanding and to help in drafting method validation documents.

  4. Validation of an analytical method for the determination of polycyclic aromatic hydrocarbons by high-performance liquid chromatography in PM10 and PM2.5 particles

    International Nuclear Information System (INIS)

    Herrera Murillo, Jorge; Chaves Villalobos, Maria del Carmen

    2012-01-01

An analytical method for polycyclic aromatic hydrocarbons (PAHs) in PM10 and PM2.5 particles collected from air was validated using high-performance liquid chromatography (HPLC). The PAHs analyzed include: naphthalene, acenaphthylene, fluorene, acenaphthene, phenanthrene, anthracene, fluoranthene, pyrene, benzo(a)anthracene, chrysene, benzo(b)fluoranthene, benzo(k)fluoranthene, benzo(a)pyrene, dibenzo(a,h)anthracene, benzo(g,h,i)perylene and indeno(1,2,3-cd)pyrene. For these compounds, the detection and quantification limits were between 0.02 and 0.1 mg/L. A DIONEX ICS 3000 system was used, with two detectors in series: an ultraviolet detector (model VWD-1) and a fluorescence detector (model RF-2000), separating the different absorption and emission signals for proper identification of the individual compounds. For all the compounds analyzed, the recovery factors were found not to be significantly different from each other, and the repeatability and reproducibility were found suitable for an analytical method, especially for the lighter PAHs. (author) [es]
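Detection and quantification limits such as those quoted above are often estimated from a calibration line using the common ICH Q2 convention, LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation and S the slope. Whether the authors used exactly this convention is an assumption, and the calibration data below are invented:

```python
def fit_line(x, y):
    """Ordinary least squares: returns slope and residual standard deviation."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    residuals = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
    sigma = (sum(r * r for r in residuals) / (n - 2)) ** 0.5
    return slope, sigma

conc = [0.1, 0.5, 1.0, 2.0, 5.0]          # standard concentrations, mg/L (invented)
area = [10.2, 50.8, 101.5, 203.0, 506.0]  # detector response, arbitrary units (invented)
slope, sigma = fit_line(conc, area)
lod = 3.3 * sigma / slope   # limit of detection, mg/L
loq = 10.0 * sigma / slope  # limit of quantification, mg/L
```

With these made-up standards the LOQ comes out around 0.026 mg/L, the same order as the 0.02-0.1 mg/L range reported in the abstract.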

  5. Validation of an analytical method for simultaneous high-precision measurements of greenhouse gas emissions from wastewater treatment plants using a gas chromatography-barrier discharge detector system.

    Science.gov (United States)

    Pascale, Raffaella; Caivano, Marianna; Buchicchio, Alessandro; Mancini, Ignazio M; Bianco, Giuliana; Caniani, Donatella

    2017-01-13

Wastewater treatment plants (WWTPs) emit CO2 and N2O, which may contribute to climate change and global warming. Over the last few years, awareness of greenhouse gas (GHG) emissions from WWTPs has increased. Moreover, the development of valid, reliable, and high-throughput analytical methods for simultaneous gas analysis is an essential requirement for environmental applications. In the present study, an analytical method based on a gas chromatograph (GC) equipped with a barrier ionization discharge (BID) detector was developed for the first time. This new method simultaneously analyses CO2 and N2O and has a precision, measured in terms of relative standard deviation (RSD%), equal to or less than 6.6% and 5.1%, respectively. The method's detection limits are 5.3 ppmv for CO2 and 62.0 ppbv for N2O. The method's selectivity, linearity, accuracy, repeatability, intermediate precision, limit of detection and limit of quantification were good at trace concentration levels. After validation, the method was applied to a real case of N2O and CO2 emissions from a WWTP, confirming its suitability as a standard procedure for simultaneous GHG analysis in environmental samples containing CO2 levels of less than 12,000 mg/L. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Sewage-based epidemiology in monitoring the use of new psychoactive substances: Validation and application of an analytical method using LC-MS/MS.

    Science.gov (United States)

    Kinyua, Juliet; Covaci, Adrian; Maho, Walid; McCall, Ann-Kathrin; Neels, Hugo; van Nuijs, Alexander L N

    2015-09-01

Sewage-based epidemiology (SBE) employs the analysis of sewage to detect and quantify drug use within a community. While SBE has been applied repeatedly for the estimation of classical illicit drugs, only a few studies have investigated new psychoactive substances (NPS). These compounds mimic the effects of illicit drugs through slight modifications to the chemical structures of controlled illicit drugs. We describe the optimization, validation, and application of an analytical method using liquid chromatography coupled to positive electrospray tandem mass spectrometry (LC-ESI-MS/MS) for the determination of seven NPS in sewage: methoxetamine (MXE), butylone, ethylone, methylone, methiopropamine (MPA), 4-methoxymethamphetamine (PMMA), and 4-methoxyamphetamine (PMA). Sample preparation was performed using solid-phase extraction (SPE) with Oasis MCX cartridges. The LC separation was done with a HILIC column (150 x 3 mm, 5 µm), which ensured good resolution of the analytes with a total run time of 19 min. The lower limit of quantification (LLOQ) was between 0.5 and 5 ng/L for all compounds. The method was validated by evaluating the following parameters: sensitivity, selectivity, linearity, accuracy, precision, recoveries and matrix effects. The method was applied to sewage samples collected from sewage treatment plants in Belgium and Switzerland, in which all investigated compounds were detected except MPA and PMA. Furthermore, a consistent presence of MXE was observed in most of the sewage samples at levels higher than the LLOQ. Copyright © 2015 John Wiley & Sons, Ltd.

  7. Pre-analytical and analytical validations and clinical applications of a miniaturized, simple and cost-effective solid phase extraction combined with LC-MS/MS for the simultaneous determination of catecholamines and metanephrines in spot urine samples.

    Science.gov (United States)

    Li, Xiaoguang Sunny; Li, Shu; Kellermann, Gottfried

    2016-10-01

It remains a challenge to simultaneously quantify catecholamines and metanephrines in a simple, sensitive and cost-effective manner due to pre-analytical and analytical constraints. Herein, we describe such a method, consisting of a miniaturized sample preparation and selective LC-MS/MS detection, using second morning spot urine samples. Ten microliters of second morning urine sample were subjected to solid phase extraction on an Oasis HLB microplate upon complexation with phenylboronic acid. The analytes were well resolved on a Luna PFP column followed by tandem mass spectrometric detection. Full validation, the suitability of spot urine sampling and biological variation were investigated. The extraction recovery and matrix effect are 74.1-97.3% and 84.1-119.0%, respectively. The linearity range is 2.5-500, 0.5-500, 2.5-1250, 2.5-1250 and 0.5-1250 ng/mL for norepinephrine, epinephrine, dopamine, normetanephrine and metanephrine, respectively. The intra- and inter-assay imprecisions are ≤9.4% for spiked quality control samples, and the respective recoveries are 97.2-112.5% and 95.9-104.0%. The Deming regression slope is 0.90-1.08, and the mean Bland-Altman percentage difference is from -3.29 to 11.85 between a published and the proposed method (n=50). The correlation observed between the spot and 24-h urine collections is significant (n=20, p<0.0001, r: 0.84-0.95, slope: 0.61-0.98). No statistical differences are found in day-to-day biological variability (n=20). Reference intervals were established for an apparently healthy population (n=88). The developed method, being practical, sensitive, reliable and cost-effective, is expected to set a new stage for routine testing, basic research and clinical applications. Copyright © 2016 Elsevier B.V. All rights reserved.
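The mean Bland-Altman percentage difference used above for method comparison is the average of per-pair differences expressed as a percentage of the pair mean. A sketch with invented paired results, not the study's data:

```python
def bland_altman_percent(reference, candidate):
    """Mean percentage difference: 100*(candidate - reference)/pair mean."""
    diffs = [100.0 * (c - r) / ((c + r) / 2.0)
             for r, c in zip(reference, candidate)]
    return sum(diffs) / len(diffs)

ref = [10.0, 50.0, 200.0, 400.0]  # published-method results, ng/mL (invented)
new = [10.5, 49.0, 205.0, 390.0]  # proposed-method results, ng/mL (invented)
mean_pct_diff = bland_altman_percent(ref, new)
```

A mean percentage difference near zero, as in the -3.29 to 11.85 range reported, indicates that the two methods agree on average across the measuring range.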

  8. Validation results of satellite mock-up capturing experiment using nets

    Science.gov (United States)

    Medina, Alberto; Cercós, Lorenzo; Stefanescu, Raluca M.; Benvenuto, Riccardo; Pesce, Vincenzo; Marcon, Marco; Lavagna, Michèle; González, Iván; Rodríguez López, Nuria; Wormnes, Kjetil

    2017-05-01

    The PATENDER activity (Net parametric characterization and parabolic flight), funded by the European Space Agency (ESA) via its Clean Space initiative, aimed to validate a simulation tool for designing nets for capturing space debris. This validation has been performed through a set of experiments under microgravity conditions in which a net was launched to capture and wrap a satellite mock-up. This paper presents the architecture of the thrown-net dynamics simulator together with the set-up of the deployment experiment and its trajectory reconstruction results on a parabolic flight (Novespace A-310, June 2015). The simulator has been implemented within the Blender framework in order to provide a highly configurable tool, able to reproduce different scenarios for Active Debris Removal missions. The experiment has been performed over thirty parabolas, each offering around 22 s of zero-g conditions. A flexible meshed fabric structure (the net), ejected from a container and propelled by corner masses (the bullets) arranged around its circumference, has been launched at different initial velocities and launching angles using a dedicated pneumatic mechanism (representing the chaser satellite) against a target mock-up (the target satellite). High-speed motion cameras recorded the experiment, allowing 3D reconstruction of the net motion. The net knots have been coloured to allow post-processing of the images using colour segmentation, stereo matching and iterative closest point (ICP) for knot tracking. The final objective of the activity was the validation of the net deployment and wrapping simulator using images recorded during the parabolic flight. The high-resolution images acquired have been post-processed to determine the initial conditions accurately and to generate the reference data (position and velocity of all knots of the net along its deployment and wrapping of the target mock-up) for the simulator validation. The simulator has been properly validated.

  9. Nature and strength of bonding in a crystal of semiconducting nanotubes: van der Waals density functional calculations and analytical results

    DEFF Research Database (Denmark)

    Kleis, Jesper; Schröder, Elsebeth; Hyldgaard, Per

    2008-01-01

    The dispersive interaction between nanotubes is investigated through ab initio theory calculations and in an analytical approximation. A van der Waals density functional (vdW-DF) [M. Dion et al., Phys. Rev. Lett. 92, 246401 (2004)] is used to determine and compare the binding of a pair of nanotubes ... calculations, the vdW-DF study predicts an intertube vdW bonding with a strength that is consistent with recent observations for the interlayer binding in graphitics. It also produces a nanotube wall-to-wall separation which is in very good agreement with experiments. Moreover, we find that the vdW-DF result for the nanotube-crystal binding energy can be approximated by a sum of nanotube-pair interactions when these are calculated in vdW-DF. This observation suggests a framework for an efficient implementation of quantum-physical modeling of the carbon nanotube bundling in more general nanotube bundles, including ...

  10. Analytic two-loop results for self-energy- and vertex-type diagrams with one non-zero mass

    International Nuclear Information System (INIS)

    Fleischer, J.; Kotikov, A.V.; Veretin, O.L.

    1999-01-01

    For a large class of two-loop self-energy- and vertex-type diagrams with only one non-zero mass (m) and the vertices also with only one non-zero external momentum squared (q²), the first few expansion coefficients are calculated by the large mass expansion. This allows us to 'guess' the general structure of these coefficients and to verify them in terms of certain classes of 'basis elements', which are essentially harmonic sums. Since for this case with only one non-zero mass the large mass expansion and the Taylor series in terms of q² are identical, this approach yields analytic expressions for the Taylor coefficients, from which the diagram can be easily evaluated numerically in a large domain of the complex q²-plane by well known methods. It is also possible to sum the Taylor series and present the results in terms of polylogarithms.

  11. FAPIG's activities for public acceptance of nuclear energy. Analytical results of questionnaire executed at organized visits to nuclear power stations

    International Nuclear Information System (INIS)

    Yoneda, Masaaki

    2010-01-01

    The First Atomic Power Industry Group (FAPIG) organized the eighteenth visit of women employees to nuclear power stations. Such participants would otherwise have little chance of making such a visit and tend to be unfamiliar with the mechanism of nuclear power generation, as well as with radiation and radioactivity. Participants first attended a lecture on energy in general and on a basic understanding of nuclear energy, and then visited nuclear power stations to learn correct knowledge about nuclear energy. They also filled out the same questionnaire before the lecture and after the visit to express their ideas and comments on nuclear energy. This paper describes the analytical results of the questionnaire and the significance of the organized visit for public acceptance of nuclear energy. (T. Tanaka)

  12. Validation and results of a questionnaire for functional bowel disease in out-patients

    Directory of Open Access Journals (Sweden)

    Skordilis Panagiotis

    2002-05-01

    Full Text Available Abstract Background The aim was to evaluate and validate a bowel disease questionnaire in patients attending an out-patient gastroenterology clinic in Greece. Methods This was a prospective study. Diagnosis was based on detailed clinical and laboratory evaluation. The questionnaire was tested on a pilot group of patients. An interviewer-administration technique was used. One hundred and forty consecutive patients attending the out-patient clinic for the first time and fifty randomly selected healthy controls participated in the study. Reliability (kappa statistics) and validity of the questionnaire were tested. We used logistic regression models and binary recursive partitioning to assess the ability to distinguish among irritable bowel syndrome (IBS), functional dyspepsia and organic disease patients. Results Mean time for questionnaire completion was 18 min. In the test-retest procedure a good agreement was obtained (kappa statistic 0.82). There were 55 patients diagnosed as having IBS, 18 with functional dyspepsia (Rome I criteria) and 38 with organic disease. Location of pain was a significant distinguishing factor, patients with functional dyspepsia having no lower abdominal pain (p Conclusions This questionnaire for functional bowel disease is a valid and reliable instrument that can distinguish satisfactorily between organic and functional disease in an out-patient setting.
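    The test-retest agreement quoted above (kappa 0.82) is Cohen's kappa, which corrects observed agreement for the agreement expected by chance. A minimal sketch for paired categorical responses (the ratings in the test are hypothetical):

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two paired lists of categorical ratings (e.g. test/retest)."""
    n = len(a)
    cats = set(a) | set(b)
    po = sum(x == y for x, y in zip(a, b)) / n           # observed agreement
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)  # chance agreement
    return (po - pe) / (1 - pe)
```

    Kappa is 1 for perfect agreement and 0 when agreement is no better than chance, which is why values above ~0.8 are usually read as "good agreement".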

  13. Cultural adaptation and validation of an instrument on barriers for the use of research results.

    Science.gov (United States)

    Ferreira, Maria Beatriz Guimarães; Haas, Vanderlei José; Dantas, Rosana Aparecida Spadoti; Felix, Márcia Marques Dos Santos; Galvão, Cristina Maria

    2017-03-02

    To culturally adapt The Barriers to Research Utilization Scale and to analyze the metric validity and reliability properties of its Brazilian Portuguese version. Methodological research conducted by means of the cultural adaptation process (translation and back-translation), face and content validity, construct validity (dimensionality and known groups) and reliability analysis (internal consistency and test-retest). The sample consisted of 335 nurses, of whom 43 participated in the retest phase. The validity of the adapted version of the instrument was confirmed. The scale investigates the barriers to the use of research results in clinical practice. Confirmatory factor analysis demonstrated that the Brazilian Portuguese version of the instrument adjusts adequately to the dimensional structure the scale authors originally proposed. Statistically significant differences were observed among the nurses holding a Master's or Doctoral degree, with characteristics favorable to Evidence-Based Practice, and working at an institution with an organizational culture directed toward this approach. The reliability showed a strong correlation (r ranging between 0.77 and 0.84, p<0.001) and the internal consistency was adequate (Cronbach's alpha ranging between 0.77 and 0.82). The Brazilian Portuguese version of the instrument The Barriers Scale proved valid and reliable in the group studied.
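    The internal-consistency figure above is Cronbach's alpha, α = k/(k−1) · (1 − Σσᵢ²/σ_total²), where k is the number of items. A compact sketch (the item scores in the test are hypothetical):

```python
def cronbach_alpha(items):
    """Cronbach's alpha; items is a list of per-item score lists, respondents aligned by index."""
    k = len(items)
    n = len(items[0])

    def var(v):  # sample variance
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / (len(v) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

    Alpha approaches 1 when items vary together across respondents; the 0.77-0.82 range reported above is conventionally read as adequate.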

  14. Analytic validation and comparison of three commercial immunoassays for measurement of plasma atrial/A-type natriuretic peptide concentration in horses

    DEFF Research Database (Denmark)

    Trachsel, D S; Schwarzwald, C C; Grenacher, B

    2014-01-01

    Measurement of atrial/A-type natriuretic peptide (ANP) concentrations may be of use for assessment of cardiac disease, and reliable data on the analytic performance of available assays are needed. To assess the suitability for clinical use of commercially available ANP assays, intra-assay and inter-assay ... Bland-Altman analyses. For all assays, precision was moderate but acceptable and dilution parallelism was good. All assays showed analytic performance similar to other immunoassays used in veterinary medicine. However, the results from the three assays were poorly comparable. Our study highlights the need ...

  15. Proposal and experimental validation of analytical models for seismic and vibration isolation devices in nuclear and non-nuclear facilities

    International Nuclear Information System (INIS)

    Serino, G.; Bonacina, G.; Bettinali, F.

    1993-01-01

    Two analytical-experimental models of HDLRBs having different levels of approximations are presented. Comparison with available experimental data shows that a non-linear hysteretic model, defined by three rubber parameters only, allows a very good complete simulation of the dynamic behavior of the isolation devices. A simpler equivalent linear viscous model reproduces less exactly the experimental behavior, but permits a good prediction of peak response values in the earthquake analysis of an isolated structure, if bearing stiffness and damping parameters are properly selected. The models have been used in preliminary design and subsequent check of the isolation system of two different types of Gas-Insulated Electric Substations (GIS), in view of possible future installation of isolated GISes in areas of high seismic risk. (author)

  16. SU-E-T-631: Preliminary Results for Analytical Investigation Into Effects of ArcCHECK Setup Errors

    International Nuclear Information System (INIS)

    Kar, S; Tien, C

    2015-01-01

    Purpose: As three-dimensional diode arrays increase in popularity for patient-specific quality assurance for intensity-modulated radiation therapy (IMRT), it is important to evaluate an array’s susceptibility to setup errors. The ArcCHECK phantom is set up by manually aligning its outside marks with the linear accelerator’s lasers and light-field. If done correctly, this aligns the ArcCHECK cylinder’s central axis (CAX) with the linear accelerator’s axis of rotation. However, this process is prone to error. This project has developed an analytical expression including a perturbation factor to quantify the effect of shifts. Methods: The ArcCHECK is set up by aligning its machine marks with either the sagittal room lasers or the light-field of the linear accelerator at gantry zero (IEC). ArcCHECK has sixty-six evenly-spaced SunPoint diodes aligned radially in a ring 14.4 cm from CAX. The detector response function (DRF) was measured and combined with inverse-square correction to develop an analytical expression for output. The output was calculated using shifts of 0 (perfect alignment), +/−1, +/−2 and +/−5 mm. The effect on a series of simple inputs was determined: unity, 1-D ramp, steps, and hat-function to represent uniform field, wedge, evenly-spaced modulation, and single sharp modulation, respectively. Results: Geometric expressions were developed with perturbation factor included to represent shifts. DRF was modeled using sixth-degree polynomials with correlation coefficient 0.9997. The output was calculated using simple inputs such as unity, 1-D ramp, steps, and hat-function, with perturbation factors of: 0, +/−1, +/−2 and +/−5 mm. Discrepancies have been observed, but large fluctuations have been somewhat mitigated by aliasing arising from discrete diode placement. Conclusion: An analytical expression with perturbation factors was developed to estimate the impact of setup errors on an ArcCHECK phantom. Presently, this has been applied to
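    The abstract combines a measured detector response function with an inverse-square correction. As an illustration of the inverse-square part alone, the following hypothetical sketch estimates the relative response change at the entrance diode when the phantom is shifted toward the source; the SAD and ring radius are assumed values taken from the abstract, and this is not the authors' full analytical expression:

```python
def inverse_square_perturbation(shift_mm, sad_mm=1000.0, ring_radius_mm=144.0):
    """Relative dose change at the entrance diode from the inverse-square term only,
    for a phantom shifted shift_mm toward the source (illustrative geometry)."""
    d0 = sad_mm - ring_radius_mm   # nominal source-to-diode distance
    d = d0 - shift_mm              # shifted source-to-diode distance
    return (d0 / d) ** 2
```

    Even a 5 mm shift changes the entrance-diode fluence by about 1% through this term alone, which is why sub-millimetre alignment matters for patient-specific QA.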

  17. Results and validity of renal blood flow measurements using Xenon 133

    International Nuclear Information System (INIS)

    Serres, P.; Danet, B.; Guiraud, R.; Durand, D.; Ader, J.L.

    1975-01-01

    The renal blood flow was measured by external recording of the xenon-133 excretion curve. The study involved 45 patients with permanent high blood pressure and 7 transplant patients. The validity of the method was checked on 10 dogs. The results suggest that the cortical blood flow, its fraction and the mean flow rate are the renal haemodynamic parameters most representative of the repercussions of blood pressure on kidney vascularisation. Experiments are in progress on animals to test the compartment concept by comparing injections into the renal artery with injections into various kidney tissues in situ [fr

  18. Volatile composition of Merlot red wine and its contribution to the aroma: optimization and validation of analytical method.

    Science.gov (United States)

    Arcari, Stefany Grützmann; Caliari, Vinicius; Sganzerla, Marla; Godoy, Helena Teixeira

    2017-11-01

    A methodology for the determination of volatile compounds in red wine using headspace solid phase microextraction (HS-SPME) combined with gas chromatography-ion trap/mass spectrometry (GC-IT/MS) and flame ionization detection (GC-FID) was developed, validated and applied to a sample of Brazilian red wine. The optimization strategy was conducted using the Plackett-Burman design for variable selection and a central composite rotational design (CCRD). The response surface methodology showed that the performance of the extraction of the volatile compounds using divinylbenzene/carboxen/polydimethylsiloxane (DVB/CAR/PDMS) fiber is improved with no sample dilution, the addition of 30% NaCl, an extraction temperature of 56°C and an extraction time of 55 min. The qualitative method allowed the extraction and identification of 60 volatile compounds in the sample studied, notably the classes of esters, alcohols and fatty acids. Furthermore, the method was successfully validated for the quantification of 55 volatile compounds of importance in wines and applied to twelve samples of Merlot red wine from the south of Brazil. The calculation of the odor activity value (OAV) identified the most important components of the samples' aroma. Ethyl isovalerate, ethyl hexanoate, 1-hexanol, octanoic acid and ethyl cinnamate had the greatest contribution to the aroma of the wines analyzed, which is predominantly fruity with the presence of herbal and fatty odors. Copyright © 2017 Elsevier B.V. All rights reserved.
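    The odor activity value used above is simply the ratio of a compound's concentration to its odor threshold; compounds with OAV > 1 are taken to contribute perceptibly to the aroma. A sketch with hypothetical numbers (real thresholds are matrix-dependent):

```python
def odor_activity_values(conc, threshold):
    """OAV = concentration / odour threshold, per compound; units must match (e.g. ug/L)."""
    return {name: conc[name] / threshold[name] for name in conc}
```

    Ranking compounds by OAV, rather than by raw concentration, is what lets a trace ester outrank an abundant alcohol in the aroma profile.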

  19. Analytical method validation of GC-FID for the simultaneous measurement of hydrocarbons (C2-C4) in their gas mixture

    Directory of Open Access Journals (Sweden)

    Oman Zuas

    2016-09-01

    Full Text Available An accurate gas chromatography coupled to flame ionization detection (GC-FID) method was validated for the simultaneous analysis of light hydrocarbons (C2-C4) in their gas mixture. The validation parameters were evaluated based on the ISO/IEC 17025 definition, including method selectivity, repeatability, accuracy, linearity, limit of detection (LOD), limit of quantitation (LOQ) and ruggedness. Under the optimum analytical conditions, the analysis of the gas mixture revealed that each target component was well separated with high selectivity. The method was also found to be precise and accurate. The method linearity was found to be high, with good correlation coefficient values (R² ≥ 0.999) for all target components. It can be concluded that the developed GC-FID method is reliable and suitable for the determination of light C2-C4 hydrocarbons (including ethylene, propane, propylene, isobutane and n-butane) in their gas mixture. The validated method has successfully been applied to the estimation of light C2-C4 hydrocarbons in natural gas samples, showing highly repeatable performance with a relative standard deviation (RSD) of less than 1.0% and good selectivity, with no interference from other possible components observed.
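    The linearity and detection-limit figures quoted above are typically derived from a calibration line. A minimal sketch using ordinary least squares and one common convention (ICH Q2: LOD = 3.3σ/S, LOQ = 10σ/S, with σ the response noise and S the calibration slope); the abstract itself cites ISO/IEC 17025, so take this as an illustration rather than the authors' exact procedure, and the numbers in the test are hypothetical:

```python
def linfit(x, y):
    """Least-squares calibration line; returns (slope, intercept, R^2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    inter = my - slope * mx
    ss_res = sum((b - (slope * a + inter)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return slope, inter, 1 - ss_res / ss_tot

def lod_loq(sigma, slope):
    """ICH Q2 convention: LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    return 3.3 * sigma / slope, 10 * sigma / slope
```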

  20. Analytical method (HPLC) validation used for identification and assay of the pharmaceutical active ingredient Tylosin tartrate for veterinary use and its finite product Tilodem 50, hydrosoluble powder

    Directory of Open Access Journals (Sweden)

    Maria Neagu

    2010-12-01

    Full Text Available At SC DELOS IMPEX '96 SRL, the quality of the active pharmaceutical ingredient (API) for the finite product Tilodem 50 - hydrosoluble powder was assessed in accordance with the latest European Pharmacopoeia. The method of analysis used for this purpose was the compendial method "Tylosin tartrate for veterinary use" in the current edition of the Eur. Ph., and represents a variant developed and validated in-house. The parameters included in the validation methodology for the chromatographic method are the following: selectivity, linearity, linearity range, detection and quantification limits, precision, repeatability (intra-day), inter-day reproducibility, accuracy, robustness, stability of solutions and system suitability. According to the European Pharmacopoeia, the active pharmaceutical ingredient is of suitable quality if it contains a minimum of 80% Tylosin A and the sum of Tylosins A, B, C and D is a minimum of 95%. Identification and determination of each component separately (Tylosin A, B, C, D) is possible by chromatographic separation (HPLC). Validation of the analytical method is presented below.
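    The acceptance criterion stated above (Tylosin A ≥ 80% and Tylosins A+B+C+D ≥ 95%) reduces to a simple check on the HPLC-determined percentages. A sketch, with the component percentages in the test being hypothetical:

```python
def meets_eurph_spec(pct):
    """pct maps tylosin component ('A'..'D') to percent content from HPLC assay.
    Spec (per the abstract): Tylosin A >= 80% and A+B+C+D >= 95%."""
    total = sum(pct.get(k, 0.0) for k in ("A", "B", "C", "D"))
    return pct.get("A", 0.0) >= 80.0 and total >= 95.0
```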

  1. Site characterization and validation - Tracer migration experiment in the validation drift, report 2, part 1: performed experiments, results and evaluation

    International Nuclear Information System (INIS)

    Birgersson, L.; Widen, H.; Aagren, T.; Neretnieks, I.; Moreno, L.

    1992-01-01

    This report is the second of two reports describing the tracer migration experiment, in which water and tracer flows were monitored in a drift at the 385 m level in the Stripa experimental mine. The tracer migration experiment is one of a large number of experiments performed within the Site Characterization and Validation (SCV) project. The upper part of the 50 m long validation drift was covered with approximately 150 plastic sheets, in which the emerging water was collected. The water emerging into the lower part of the drift was collected in short boreholes, sumpholes. Six different tracer mixtures were injected at distances between 10 and 25 m from the drift. The flowrate and tracer monitoring continued for ten months. Tracer breakthrough curves and flowrate distributions were used to study flow paths, velocities, hydraulic conductivities, dispersivities, interaction with the rock matrix and channelling effects within the rock. The present report describes the structure of the observations, the flowrate measurements and estimated hydraulic conductivities. The main part of this report addresses the interpretation of the tracer movement in fractured rock. The tracer movement, as measured by the more than 150 individual tracer curves, has been analysed with the traditional advection-dispersion model, and a subset of the curves with the advection-dispersion-diffusion model. The tracer experiments have permitted the flow porosity, dispersion and interaction with the rock matrix to be studied. (57 refs.)
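    The advection-dispersion analysis mentioned above fits tracer breakthrough curves. A sketch of the classic one-dimensional solution for a continuous step input (the first term of the Ogata-Banks solution), offered as an illustration of the model family rather than the report's exact fitting procedure; parameters in the test are hypothetical:

```python
import math

def breakthrough(x, t, v, D):
    """Relative concentration C/C0 at distance x and time t for a step input,
    with advection velocity v and dispersion coefficient D (Ogata-Banks, first term):
    C/C0 = 0.5 * erfc((x - v*t) / (2*sqrt(D*t)))."""
    return 0.5 * math.erfc((x - v * t) / (2.0 * math.sqrt(D * t)))
```

    The fitted v and D per curve give the velocities and dispersivities the report discusses; matrix diffusion adds a further retardation term in the advection-dispersion-diffusion variant.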

  2. Noninvasive assessment of mitral inertness [correction of inertance]: clinical results with numerical model validation.

    Science.gov (United States)

    Firstenberg, M S; Greenberg, N L; Smedira, N G; McCarthy, P M; Garcia, M J; Thomas, J D

    2001-01-01

    Inertial forces (Mdv/dt) are a significant component of transmitral flow, but cannot be measured with Doppler echo. We validated a method of estimating Mdv/dt. Ten patients had a dual-sensor transmitral (TM) catheter placed during cardiac surgery. Doppler and 2D echo were performed while acquiring LA and LV pressures. Mdv/dt was determined from the Bernoulli equation using Doppler velocities and TM gradients. Results were compared with numerical modeling. TM gradients (range: 1.04-14.24 mmHg) consisted of 74.0 +/- 11.0% inertial forces (range: 0.6-12.9 mmHg). Multivariate analysis predicted Mdv/dt = -4.171(S/D ratio) + 0.063(LAvolume-max) + 5. Using this equation, a strong relationship was obtained for the clinical dataset (y=0.98x - 0.045, r=0.90) and the results of numerical modeling (y=0.96x - 0.16, r=0.84). TM gradients are mainly inertial and, as validated by modeling, can be estimated with echocardiography.
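    The decomposition above treats the measured transmitral gradient as a convective Bernoulli term plus the inertial term Mdv/dt. Assuming the standard simplified Bernoulli convention Δp ≈ 4v² (Δp in mmHg, v in m/s), the inertial component is the remainder; this is a hedged sketch of that arithmetic, not the authors' full multivariate model, and the numbers in the test are hypothetical:

```python
def inertial_component(delta_p_mmhg, v_m_s):
    """Inertial part of a measured transmitral gradient, assuming the simplified
    Bernoulli convective term 4*v^2 (mmHg with v in m/s): Mdv/dt = dp - 4*v^2."""
    convective = 4.0 * v_m_s ** 2
    return delta_p_mmhg - convective
```

    With a 10 mmHg measured gradient and a 1 m/s velocity, only 4 mmHg is convective, consistent with the abstract's finding that the gradient is mostly inertial.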

  3. Is the Market Eroding Moral Norms? A Micro-analytical Validation of Some Ideas of Anomie Theory

    Directory of Open Access Journals (Sweden)

    Eckhard Burkatzki

    2008-11-01

    Full Text Available Anomie theorists have been reporting the suppression of shared welfare orientations by the overwhelming dominance of economic values within capitalist societies since before the outset of the neoliberalism debate. Obligations concerning common welfare are more and more often subordinated to the overarching aim of realizing economic success goals. This should be especially true for social life in contemporary market societies. This empirical investigation examines the extent to which market imperatives and values of the societal community are anchored within the normative orientations of market actors. Special attention is paid to whether the shape of these normative orientations varies with respect to the degree of market inclusion. Empirical analyses, based on the data of a standardized written survey of the German working population carried out in 2002, show that different types of normative orientation can be distinguished among market actors. These types are quite similar to the well-known types of anomic adaptation developed by Robert K. Merton in "Social Structure and Anomie" and are externally valid with respect to the prediction of different forms of economic crime. Further analyses show that the type of normative orientation actors adopt within everyday life depends on the degree of market inclusion. Confirming anomie theory, it is shown that the individual willingness to subordinate matters of common welfare to the aim of economic success (radical market activism) gets stronger the more actors are included in the market sphere. Finally, the relevance of the reported findings for the explanation of violent behavior, especially with a view to varieties of corporate violence, is discussed.

  4. Furthering our Understanding of Land Surface Interactions using SVAT modelling: Results from SimSphere's Validation

    Science.gov (United States)

    North, Matt; Petropoulos, George; Ireland, Gareth; Rendal, Daisy; Carlson, Toby

    2015-04-01

    With currently predicted climate change, there is an increased requirement to gain knowledge of the terrestrial biosphere for numerous agricultural, hydrological and meteorological applications. To this end, Soil Vegetation Atmospheric Transfer (SVAT) models are quickly becoming the preferred scientific tool to monitor, at fine temporal and spatial resolutions, detailed information on numerous parameters associated with Earth system interactions. Validation of any model is critical to assess its accuracy, generality and realism in distinctive ecosystems, and subsequently acts as an important step before its operational distribution. In this study, the SimSphere SVAT model has been validated at fifteen different sites of the FLUXNET network, where model performance was statistically evaluated by directly comparing the model predictions against in situ data for cloud-free days with a high energy balance closure. Specific focus is given to the model's ability to simulate parameters associated with the energy balance, namely Shortwave Incoming Solar Radiation (Rg), Net Radiation (Rnet), Latent Heat (LE), Sensible Heat (H), Air Temperature at 1.3 m (Tair 1.3m) and Air Temperature at 50 m (Tair 50m). Comparisons were performed for a number of distinctive ecosystem types and for 150 days in total, using in-situ data from ground observational networks acquired from the year 2011 alone. The model's coherence to reality was assessed on the basis of a series of statistical parameters including RMSD, R2, scatter, bias, MAE, the Nash index, slope and intercept. Results showed good to very good agreement between predicted and observed datasets, particularly so for LE, H, Tair 1.3m and Tair 50m, where mean error distribution values indicated excellent model performance. Due to systematic underestimation, poorer simulation accuracies were exhibited for Rg and Rnet, yet all values reported are still analogous to other validatory studies of its kind. Overall, the model
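    Among the agreement statistics listed above, the Nash index (Nash-Sutcliffe efficiency) may be the least familiar: NSE = 1 − Σ(obs−sim)²/Σ(obs−mean(obs))². A minimal sketch (the data in the test are hypothetical):

```python
def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 = perfect fit, 0 = no better than the observed mean."""
    mo = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mo) ** 2 for o in obs)
    return 1 - num / den
```

    An NSE of 1 means the simulation matches observations exactly, 0 means it predicts no better than the observed mean, and negative values mean it does worse.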

  5. Sources of Traffic and Visitors’ Preferences Regarding Online Public Reports of Quality: Web Analytics and Online Survey Results

    Science.gov (United States)

    Hibbard, Judith H; Greaves, Felix; Dudley, R Adams

    2015-01-01

    Background In the context of the Affordable Care Act, there is extensive emphasis on making provider quality transparent and publicly available. Online public reports of quality exist, but little is known about how visitors find reports or about their purpose in visiting. Objective To address this gap, we gathered website analytics data from a national group of online public reports of hospital or physician quality and surveyed real-time visitors to those websites. Methods Websites were recruited from a national group of online public reports of hospital or physician quality. Analytics data were gathered from each website: number of unique visitors, method of arrival for each unique visitor, and search terms resulting in visits. Depending on the website, a survey invitation was launched for unique visitors on landing pages or on pages with quality information. Survey topics included type of respondent (eg, consumer, health care professional), purpose of visit, areas of interest, website experience, and demographics. Results There were 116,657 unique visitors to the 18 participating websites (1440 unique visitors/month per website), with most unique visitors arriving through search (63.95%, 74,606/116,657). Websites with a higher percent of traffic from search engines garnered more unique visitors (P=.001). The most common search terms were for individual hospitals (23.25%, 27,122/74,606) and website names (19.43%, 22,672/74,606); medical condition terms were uncommon (0.81%, 605/74,606). Survey view rate was 42.48% (49,560/116,657 invited) resulting in 1755 respondents (participation rate=3.6%). There were substantial proportions of consumer (48.43%, 850/1755) and health care professional respondents (31.39%, 551/1755). Across websites, proportions of consumer (21%-71%) and health care professional respondents (16%-48%) varied. Consumers were frequently interested in using the information to choose providers or assess the quality of their provider (52.7%, 225

  6. Assessing the Validity of Single-item Life Satisfaction Measures: Results from Three Large Samples

    Science.gov (United States)

    Cheung, Felix; Lucas, Richard E.

    2014-01-01

    Purpose The present paper assessed the validity of single-item life satisfaction measures by comparing single-item measures to the Satisfaction with Life Scale (SWLS) - a more psychometrically established measure. Methods Two large samples from Washington (N=13,064) and Oregon (N=2,277) recruited by the Behavioral Risk Factor Surveillance System (BRFSS) and a representative German sample (N=1,312) recruited by the German Socio-Economic Panel (GSOEP) were included in the present analyses. Single-item life satisfaction measures and the SWLS were correlated with theoretically relevant variables, such as demographics, subjective health, domain satisfaction, and affect. The correlations between the two life satisfaction measures and these variables were examined to assess the construct validity of single-item life satisfaction measures. Results Consistent across the three samples, single-item life satisfaction measures demonstrated a substantial degree of criterion validity with the SWLS (zero-order r = 0.62-0.64; disattenuated r = 0.78-0.80). Patterns of statistical significance for correlations with theoretically relevant variables were the same across single-item measures and the SWLS. Single-item measures did not produce systematically different correlations compared to the SWLS (average difference = 0.001-0.005). The average absolute difference in the magnitudes of the correlations produced by single-item measures and the SWLS was very small (average absolute difference = 0.015-0.042). Conclusions Single-item life satisfaction measures performed very similarly to the multiple-item SWLS. Social scientists would get virtually identical answers to substantive questions regardless of which measure they use. PMID:24890827
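    The disattenuated correlations above follow the classical correction for attenuation, r_true = r_observed / sqrt(r_xx · r_yy), where r_xx and r_yy are the reliabilities of the two measures. A sketch with hypothetical reliabilities (the abstract does not report the values used):

```python
import math

def disattenuate(r_xy, rel_x, rel_y):
    """Correction for attenuation: estimated true-score correlation given the
    observed correlation r_xy and the reliabilities of the two measures."""
    return r_xy / math.sqrt(rel_x * rel_y)
```

    Because measurement error shrinks observed correlations, the corrected values (0.78-0.80 here) estimate how strongly the two constructs would correlate if both were measured without error.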

  7. FAPIG's activities for public acceptance of nuclear energy. Analytical results of questionnaire executed at organized visits to nuclear power stations

    International Nuclear Information System (INIS)

    Mizoguchi, Tadao

    1999-01-01

    FAPIG organizes a visit to a nuclear power station every November. The objective is for visitors to acquire correct knowledge of nuclear power by seeing the various facilities at the nuclear power stations. This paper presents the analytical results of a questionnaire administered during an organized visit to the Kashiwazaki-Kariwa nuclear power station. The visitors were 18 women. The same questionnaire was administered before and after a seminar and a guided tour, and the visitors' impressions, opinions and the changes in their answers are analyzed. The speakers used plain language, video, OHP, pamphlets and experimental equipment, and these means worked very well for the visitors. The seminar had a very large effect on correct recognition of the safety of nuclear power and the need for it: the number of answers affirming its need rose from 3 to 6, and those affirming its safety from 0 to 7. Nine members indicated good understanding of the seminar content. The items of interest in the seminar were the measurement of radiation, the effects of radiation, the reason for the decrease in average life expectancy, the Chernobyl accident, the difference between nuclear power and the atomic bomb, the fact that nuclear power does not generate carbon dioxide, and the recycling of plutonium after nuclear fission of uranium. (S.Y.)

  8. Sources of traffic and visitors' preferences regarding online public reports of quality: web analytics and online survey results.

    Science.gov (United States)

    Bardach, Naomi S; Hibbard, Judith H; Greaves, Felix; Dudley, R Adams

    2015-05-01

    In the context of the Affordable Care Act, there is extensive emphasis on making provider quality transparent and publicly available. Online public reports of quality exist, but little is known about how visitors find reports or about their purpose in visiting. To address this gap, we gathered website analytics data from a national group of online public reports of hospital or physician quality and surveyed real-time visitors to those websites. Websites were recruited from a national group of online public reports of hospital or physician quality. Analytics data were gathered from each website: number of unique visitors, method of arrival for each unique visitor, and search terms resulting in visits. Depending on the website, a survey invitation was launched for unique visitors on landing pages or on pages with quality information. Survey topics included type of respondent (eg, consumer, health care professional), purpose of visit, areas of interest, website experience, and demographics. There were 116,657 unique visitors to the 18 participating websites (1440 unique visitors/month per website), with most unique visitors arriving through search (63.95%, 74,606/116,657). Websites with a higher percent of traffic from search engines garnered more unique visitors (P=.001). The most common search terms were for individual hospitals (23.25%, 27,122/74,606) and website names (19.43%, 22,672/74,606); medical condition terms were uncommon (0.81%, 605/74,606). Survey view rate was 42.48% (49,560/116,657 invited) resulting in 1755 respondents (participation rate=3.6%). There were substantial proportions of consumer (48.43%, 850/1755) and health care professional respondents (31.39%, 551/1755). Across websites, proportions of consumer (21%-71%) and health care professional respondents (16%-48%) varied. Consumers were frequently interested in using the information to choose providers or assess the quality of their provider (52.7%, 225/427); the majority of those choosing a

  9. The impact of pre-analytical variables on the stability of neurofilament proteins in CSF, determined by a novel validated SinglePlex Luminex assay and ELISA.

    Science.gov (United States)

    Koel-Simmelink, Marleen J A; Vennegoor, Anke; Killestein, Joep; Blankenstein, Marinus A; Norgren, Niklas; Korth, Carsten; Teunissen, Charlotte E

    2014-01-15

    Neurofilament (Nf) proteins have been shown to be promising biomarkers for monitoring and predicting disease progression in various neurological diseases. The aim of this study was to evaluate the effects of pre-analytical variables on the concentrations of neurofilament heavy (NfH) and neurofilament light (NfL) proteins. For NfH, a newly developed and validated in-house SinglePlex Luminex assay was used; ELISA was used to analyze NfL. For the NfL ELISA assay, the intra- and inter-assay variation was 1.5% and 16.7%, respectively. The analytical performance of the NfH SinglePlex Luminex assay in terms of sensitivity (6.6 pg/mL), recovery in cerebrospinal fluid (CSF) (between 90 and 104%), linearity (from 6.6 to 1250 pg/mL), and inter- and intra-assay variation (<8%) was good. Concentrations of both NfL and NfH appeared unaffected by blood contamination, repeated freeze-thaw cycles (up to 4), delayed processing (up to 24 hours) and long-term storage at -20°C, 4°C, and room temperature. A decrease in concentration of both neurofilament proteins was observed during storage for up to 21 days at 37°C, becoming significant by day 5. The newly developed NfH SinglePlex Luminex assay has good sensitivity and is robust. Moreover, both NfH and NfL are stable under the most prevalent pre-analytical variations. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. The Validation of an Analytical Method for Sulfentrazone Residue Determination in Soil Using Liquid Chromatography and a Comparison of Chromatographic Sensitivity to Millet as a Bioindicator Species

    Directory of Open Access Journals (Sweden)

    Marcelo Antonio de Oliveira

    2014-07-01

    Full Text Available Commonly used herbicides, such as sulfentrazone, pose a risk of soil contamination due to their persistence, bioaccumulation and toxicity. Phytoremediation by green manure species has been tested using biomarkers, but analytical data are now required to confirm the extraction of sulfentrazone from soil. Thus, the present work was carried out to analyze sulfentrazone residues in soil by liquid chromatography and to compare these values with the sensitivity of the bioindicator Pennisetum glaucum. The soil samples were obtained after cultivation of Crotalaria juncea and Canavalia ensiformis at four seeding densities and with three doses of sulfentrazone. Samples were collected from the pots, at two different depths, 75 days after phytoremediator sowing and were used to determine the herbicide persistence in the soil. A bioassay with P. glaucum was carried out in the same pots. High-performance liquid chromatography (HPLC) with UV-diode array detection (HPLC/UV-DAD) was used to determine the herbicide residues. The HPLC determination was optimized and validated according to the parameters of precision, accuracy, linearity, limits of detection and quantification, robustness and specificity. The bioindicator P. glaucum was more sensitive to sulfentrazone than residue determination by HPLC. Changes in sulfentrazone concentration caused by green manure phytoremediation were accurately identified by the bioindicator. However, a true correlation between the size of the species and the analyte content was not identified.

  11. Experimental and analytical studies for the validation of HTR-VGD and primary cell passive decay heat removal. Supplement. Calculations

    International Nuclear Information System (INIS)

    Geiss, M.; Giannikos, A.; Hejzlar, P.; Kneer, A.

    1993-04-01

    The alternative concept for a modular HTR reactor design by Siempelkamp, Krefeld, using a prestressed cast-iron vessel (VGD) combined with a cast-iron/concrete module for the primary cell with an integrated passive decay heat removal system, was fully qualified with respect to operational and accidental thermal loads. The main emphasis was to confirm and validate the passive decay heat removal capability. An experimental facility (INWA) was designed, instrumented and operated with an appropriate electrical heating system simulating steady-state operational and transient accidental thermal loads. The experiments were accompanied by extensive computations concerning the combination of conductive, radiative and convective energy transport mechanisms in the different components of the VGD/primary cell structures, as well as elastic-plastic stress analyses of the VGD. In addition, a spectrum of potential alternatives for passive energy removal has been parametrically examined. The experimental data clearly demonstrate that the proposed Siempelkamp design is able to passively and safely remove the decay heat under operational and accidental conditions without violating technologically important thermal limits. This also holds in case of failures of both the natural convection system and the ultimate heat sink by outside concrete water-film cooling. (orig./HP) [de

  12. Theoretical validation of potential habitability via analytical and boosted tree methods: An optimistic study on recently discovered exoplanets

    Science.gov (United States)

    Saha, S.; Basak, S.; Safonova, M.; Bora, K.; Agrawal, S.; Sarkar, P.; Murthy, J.

    2018-04-01

    Seven Earth-sized planets, known as the TRAPPIST-1 system, were discovered with great fanfare in the last week of February 2017. Three of these planets are in the habitable zone of their star, making them potentially habitable planets (PHPs) a mere 40 light years away. The discovery of the closest potentially habitable planet to us just a year before, Proxima b, and the realization that Earth-type planets in circumstellar habitable zones are a common occurrence provide further impetus to the pursuit of life outside the Solar System. The search for life has essentially two goals: looking for planets with Earth-like conditions (Earth similarity) and looking for the possibility of life in some form (habitability). An index was recently developed, the Cobb-Douglas Habitability Score (CDHS), based on the Cobb-Douglas habitability production function (CD-HPF), which computes a habitability score from measured and estimated planetary parameters. As an initial set, the radius, density, escape velocity and surface temperature of a planet were used. The proposed metric, with exponents accounting for metric elasticity, is endowed with analytical properties that ensure global optima and can be scaled to accommodate a finite number of input parameters. We show here that the model is elastic, and that the conditions on elasticity ensuring global maxima scale as the number of predictor parameters increases. A K-NN (K-Nearest Neighbor) classification algorithm, embellished with probabilistic herding and thresholding restriction, utilizes CDHS scores and labels exoplanets into appropriate classes via feature-learning methods, yielding granular clusters of habitability. The algorithm works on top of a decision-theoretical model using the power of convex optimization and machine learning. The goal is to characterize the recently discovered exoplanets into an "Earth League" and several other classes based on their CDHS values. A second approach, based on a novel feature-learning and
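
The CD-HPF described above is a Cobb-Douglas form: a product of planetary parameters raised to elasticity exponents, which is well behaved (a global maximum exists under constant or decreasing returns to scale, i.e. positive exponents summing to at most one). A hedged sketch, with exponent values chosen purely for illustration rather than taken from the paper:

```python
def cdhs(radius, density, v_escape, temperature,
         alpha, beta, gamma, delta):
    """Cobb-Douglas habitability production function (sketch):
    a product of planetary parameters, expressed in Earth units,
    each raised to an elasticity exponent."""
    return (radius ** alpha) * (density ** beta) \
         * (v_escape ** gamma) * (temperature ** delta)

# Earth itself (all parameters equal to 1 in Earth units) scores
# exactly 1 for any choice of exponents.
earth_score = cdhs(1.0, 1.0, 1.0, 1.0, 0.8, 0.1, 0.05, 0.05)
print(earth_score)  # → 1.0
```

Because the exponents here sum to 1.0, the function exhibits constant returns to scale, one of the conditions the paper's elasticity analysis relies on for global optima.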

  13. Analytical results on the periodically driven damped pendulum. Application to sliding charge-density waves and Josephson junctions

    International Nuclear Information System (INIS)

    Azbel, M.Y.; Bak, P.

    1984-01-01

    The differential equation εφ̈ + φ̇ − (1/2)α sin(2φ) = I + Σ_{n=−∞}^{+∞} A_n δ(t − t_n), describing the periodically driven damped pendulum, is analyzed in the strong damping limit ε ≪ 1 using first-order perturbation theory. The equation may represent the motion of a sliding charge-density wave (CDW) in ac plus dc electric fields, and the resistively shunted Josephson junction driven by dc and microwave currents. When the torque I exceeds a critical value the pendulum rotates with a frequency ω. For infinite damping, or zero mass (ε = 0), the equation can be transformed to the Schroedinger equation of the Kronig-Penney model. When A_n is random the pendulum exhibits chaotic motion. In the regular case A_n = A the frequency ω is a smooth function of the parameters, so there are no phase-locked subharmonic plateaus in the ω(I) curve, or the I-V characteristics for the CDW or Josephson-junction systems. For small nonzero ε the return map expressing the phase φ(t_{n+1}) as a function of the phase φ(t_n) is a one-dimensional circle map. Applying known analytical results for the circle map, one finds narrow subharmonic plateaus at all rational frequencies, in agreement with experiments on CDW systems
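
In the overdamped limit ε = 0 the dynamics between kicks reduces to the first-order flow φ̇ = I + (α/2) sin(2φ), with each delta kick advancing the phase instantaneously by A_n. A rough numerical sketch of this limit (the parameter values are illustrative and the simple Euler stepping is only a toy integrator, not the paper's perturbative analysis):

```python
import math

def simulate(I=1.2, alpha=1.0, A=0.3, period=1.0, kicks=200, steps=1000):
    """Overdamped (epsilon = 0) kicked pendulum: integrate
    phi' = I + (alpha/2) * sin(2*phi) between kicks, and add a phase
    jump A at each kick time t_n = n * period.  Returns the mean
    rotation rate omega = phi / t over the whole run."""
    phi = 0.0
    dt = period / steps
    for _ in range(kicks):
        for _ in range(steps):        # drift between two kicks
            phi += (I + 0.5 * alpha * math.sin(2.0 * phi)) * dt
        phi += A                      # delta kick: instantaneous jump
    return phi / (kicks * period)

omega = simulate()
```

With I = 1.2 above the depinning torque α/2 = 0.5, the pendulum rotates and omega is a smooth positive rate, consistent with the absence of plateaus at ε = 0.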

  14. Results of a survey on accident and safety analysis codes, benchmarks, verification and validation methods

    International Nuclear Information System (INIS)

    Lee, A.G.; Wilkin, G.B.

    1996-03-01

    During the 'Workshop on R and D needs' at the 3rd Meeting of the International Group on Research Reactors (IGORR-III), the participants agreed that it would be useful to compile a survey of the computer codes and nuclear data libraries used in accident and safety analyses for research reactors and the methods various organizations use to verify and validate their codes and libraries. Five organizations, Atomic Energy of Canada Limited (AECL, Canada), China Institute of Atomic Energy (CIAE, People's Republic of China), Japan Atomic Energy Research Institute (JAERI, Japan), Oak Ridge National Laboratories (ORNL, USA), and Siemens (Germany) responded to the survey. The results of the survey are compiled in this report. (author) 36 refs., 3 tabs

  15. Exit probability of the one-dimensional q-voter model: Analytical results and simulations for large networks

    Science.gov (United States)

    Timpanaro, André M.; Prado, Carmen P. C.

    2014-05-01

    We discuss the exit probability of the one-dimensional q-voter model and present tools to obtain estimates about this probability, both through simulations in large networks (around 10^7 sites) and analytically in the limit where the network is infinitely large. We argue that the result E(ρ) = ρ^q / (ρ^q + (1 − ρ)^q), that was found in three previous works [F. Slanina, K. Sznajd-Weron, and P. Przybyła, Europhys. Lett. 82, 18006 (2008), 10.1209/0295-5075/82/18006; R. Lambiotte and S. Redner, Europhys. Lett. 82, 18007 (2008), 10.1209/0295-5075/82/18007, for the case q = 2; and P. Przybyła, K. Sznajd-Weron, and M. Tabiszewski, Phys. Rev. E 84, 031117 (2011), 10.1103/PhysRevE.84.031117, for q > 2] using small networks (around 10^3 sites), is a good approximation, but there are noticeable deviations that appear even for small systems and that do not disappear when the system size is increased (with the notable exception of the case q = 2). We also show that, under some simple and intuitive hypotheses, the exit probability must obey the inequality ρ^q / (ρ^q + (1 − ρ)) ≤ E(ρ) ≤ ρ / (ρ + (1 − ρ)^q) in the infinite size limit. We believe this settles in the negative the suggestion made [S. Galam and A. C. R. Martins, Europhys. Lett. 95, 48005 (2011), 10.1209/0295-5075/95/48005] that this result would be a finite size effect, with the exit probability actually being a step function. We also show how the result that the exit probability cannot be a step function can be reconciled with the Galam unified frame, which was also a source of controversy.
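
The approximate exit probability and the bounds stated above are straightforward to evaluate; the following sketch simply checks, for sample values of ρ and q, that the small-network formula falls inside the infinite-size bounds:

```python
def exit_probability(rho: float, q: int) -> float:
    """E(rho) = rho^q / (rho^q + (1 - rho)^q), the exit probability
    reported in the earlier small-network studies cited above."""
    return rho**q / (rho**q + (1.0 - rho)**q)

def bounds(rho: float, q: int):
    """Lower and upper bounds on E(rho) in the infinite-size limit."""
    lower = rho**q / (rho**q + (1.0 - rho))
    upper = rho / (rho + (1.0 - rho)**q)
    return lower, upper

rho, q = 0.3, 3
lo, hi = bounds(rho, q)
assert lo <= exit_probability(rho, q) <= hi  # approximation respects the bounds
```

Note that E(ρ) is symmetric about ρ = 0.5 (E(0.5) = 0.5 for any q), and that neither bound is a step function, which is the crux of the argument against the step-function suggestion.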

  16. Development and validation of simple RP-HPLC-PDA analytical protocol for zileuton assisted with Design of Experiments for robustness determination

    Directory of Open Access Journals (Sweden)

    Saurabh B. Ganorkar

    2017-02-01

    Full Text Available A simple, rapid, sensitive, robust, stability-indicating RP-HPLC-PDA analytical protocol was developed and validated for the analysis of zileuton racemate in bulk and in tablet formulation. Method development and resolution of degradation products from forced hydrolytic (acidic, basic, neutral), oxidative, photolytic (acidic, basic, neutral, solid state) and thermal (dry heat) degradation were achieved on an LC-GC Qualisil BDS C18 column (250 mm × 4.6 mm, 5 μm) in isocratic mode at ambient temperature, employing a mobile phase of methanol and orthophosphoric acid (0.2%, v/v) in a ratio of 80:20 (v/v) at a flow rate of 1.0 mL min−1 with detection at 260 nm. 'Design of Experiments' (DOE) employing 'Central Composite Design' (CCD) and 'Response Surface Methodology' (RSM) was applied as an advancement over the traditional 'One Variable at a Time' (OVAT) approach to evaluate the effects of variations in selected factors (methanol content, flow rate, concentration of orthophosphoric acid); robustness was assessed by graphical interpretation, and statistical interpretation was achieved with Multiple Linear Regression (MLR) and ANOVA. The method met the validation parameters: linearity, precision, accuracy, limit of detection, limit of quantitation, and robustness. The method was applied effectively for the analysis of in-house zileuton tablets.
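
A central composite design of the kind used for the robustness study combines a two-level factorial core, axial (star) points, and a centre point. A minimal sketch of the coded design matrix for three factors; the factor names and the α value are illustrative assumptions, not the paper's actual settings:

```python
from itertools import product

def central_composite(k: int, alpha: float = 1.414):
    """Coded-unit CCD layout for k factors: 2^k factorial points at ±1,
    2k axial (star) points at ±alpha on each axis, and one centre point."""
    factorial = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s
            axial.append(pt)
    centre = [[0.0] * k]
    return factorial + axial + centre

# Three factors, e.g. methanol content, flow rate, acid concentration
# (coded units): 8 factorial + 6 axial + 1 centre = 15 runs.
design = central_composite(3)
print(len(design))  # → 15
```

Each run of the design would then be carried out at the corresponding decoded chromatographic conditions, and the responses fed to the MLR/ANOVA analysis.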

  17. Analytic turnaround time study for integrated reporting of pathology results on electronic medical records using the Illuminate system

    Directory of Open Access Journals (Sweden)

    Tawfik O

    2016-09-01

    Full Text Available Timely pathology results are critical for appropriate diagnosis and management of patients. Yet workflows in laboratories remain ad hoc and involve accessing multiple systems with no direct linkage between patient history and prior or pending pathology records for the case being analyzed. A major hindrance to timely reporting of pathology results is the need to incorporate/interface with multiple electronic health records (EHRs). We evaluated the integration of the Illuminate PatientView software (Illuminate) into pathologists' workflow. Illuminate is a search engine architecture that has a repository of textual information from many hospital systems. Our goal was to develop a comprehensive, user-friendly patient summary display to integrate the current fragmented subspecialty-specific systems. An analytical time study noting changes in turnaround time (TAT) before and after Illuminate implementation was recorded for reviewers, including pathologists, residents and fellows. Reviewers' TAT for 359 cases was recorded (200 cases before and 159 after implementation). The impact of implementing Illuminate on transcriptionists' workflow was also studied. The average TAT to retrieve EHRs prior to Illuminate was 5:32 min (range 1:35-10:50). That time was significantly reduced to 35 seconds (range 10 sec-1:10 min) using Illuminate. Reviewers were very pleased with the ease of accessing information and with the elimination of draft paper documents of the pathology reports, saving transcriptionists up to 65 min/day (25-65 min) previously spent matching requisitions with paperwork. Utilizing Illuminate improved workflow, decreased TAT and minimized cost. Patient care can be improved through a comprehensive patient management system that facilitates communication between isolated information systems.

  18. Quality of life and hormone use: new validation results of MRS scale

    Directory of Open Access Journals (Sweden)

    Heinemann Lothar AJ

    2006-05-01

    Full Text Available Abstract Background The Menopause Rating Scale (MRS) is a health-related quality-of-life scale developed in the early 1990s and validated step by step since then. Recently the MRS scale was validated as an outcomes measure for hormone therapy. The suspicion was expressed, however, that those data were too optimistic due to methodological problems of the study. A new study became available to check how well founded this suspicion was. Method An open post-marketing study of 3282 women with pre- and post-treatment data from the self-administered version of the MRS scale was analyzed to evaluate the capacity of the scale to detect hormone-treatment-related effects. The main results were then compared with the old study, in which the interview-based version of the MRS scale was used. Results The hormone-therapy-related improvement of complaints relative to the baseline score was about or less than 30% in total or domain scores, whereas it exceeded 30% in the old study. Similarly, the relative improvement after therapy, stratified by the degree of severity at baseline, was lower in the new than in the old study, but had the same slope. Although we cannot exclude different treatment effects with the study method used, this supports our hypothesis that the individual MRS interviews performed by the physician biased the results towards over-estimation of the treatment effects. This hypothesis is underlined by the degree of concordance of the physician's assessment and the patient's perception of treatment success (MRS results): sensitivity (correct prediction of a positive assessment by the treating physician) and specificity (correct prediction of a negative assessment by the physician) were lower than the results obtained with the interview-based MRS scale in the previous publication.
Conclusion The study confirmed evidence for the capacity of the MRS scale to measure treatment effects on quality of life across the full range of severity of

  19. Results from the radiometric validation of Sentinel-3 optical sensors using natural targets

    Science.gov (United States)

    Fougnie, Bertrand; Desjardins, Camille; Besson, Bruno; Bruniquel, Véronique; Meskini, Naceur; Nieke, Jens; Bouvet, Marc

    2016-09-01

    The recently launched SENTINEL-3 mission measures sea surface topography, sea/land surface temperature, and ocean/land surface colour with high accuracy. The mission provides data continuity with the ENVISAT mission through acquisitions by multiple sensing instruments. Two of them, OLCI (Ocean and Land Colour Imager) and SLSTR (Sea and Land Surface Temperature Radiometer), are optical sensors designed to provide continuity with Envisat's MERIS and AATSR instruments. During commissioning, in-orbit calibration and validation activities are conducted. The instruments are calibrated and characterized in flight primarily using on-board devices, which include diffusers and a black body. Afterwards, vicarious calibration methods are used to validate the OLCI and SLSTR radiometry for the reflective bands. The calibration can be checked over dedicated natural targets such as Rayleigh scattering, sunglint, desert sites, Antarctica, and, tentatively, deep convective clouds. Tools have been developed and/or adapted (S3ETRAC, MUSCLE) to extract and process Sentinel-3 data. Based on these matchups, it is possible to provide an accurate check of many radiometric aspects, such as absolute and interband calibration, trending correction, and calibration consistency within the field of view; more generally, this will provide an evaluation of the radiometric consistency for various types of targets. Another important aspect is the checking of cross-calibration with other instruments such as MERIS and AATSR (a bridge between ENVISAT and Sentinel-3), MODIS (a bridge to the GSICS radiometric standard), and Sentinel-2 (a bridge between Sentinel missions). Early results, based on the available OLCI and SLSTR data, will be presented and discussed.

  20. Validating a dance-specific screening test for balance: preliminary results from multisite testing.

    Science.gov (United States)

    Batson, Glenna

    2010-09-01

    Few dance-specific screening tools adequately capture balance. The aim of this study was to administer and modify the Star Excursion Balance Test (oSEBT) to examine its utility as a balance screen for dancers. The oSEBT involves standing on one leg while lightly targeting with the opposite foot to the farthest distance along eight spokes of a star-shaped grid. This task simulates dance in the spatial pattern and movement quality of the gesturing limb. The oSEBT was validated for distance on athletes with a history of ankle sprain. Thirty-three dancers (age 20.1 +/- 1.4 yrs) participated from two contemporary dance conservatories (UK and US), with or without a history of lower extremity injury. Dancers were verbally instructed (without physical demonstration) to execute the oSEBT and four modifications (mSEBT): timed (speed), timed with cognitive interference (answering questions aloud), and sensory disadvantaging (foam mat). Stepping strategies were tracked and performance strategies video-recorded. Unlike the original oSEBT findings, reach distances did not differ statistically (p = 0.05) or descriptively (i.e., were not shorter) between groups. Performance styles varied widely, despite sample homogeneity and instructions to control for strategy. Descriptive analysis of the mSEBT showed an increased number of near-falls and decreased timing on the injured limb. Dancers appeared to employ variable strategies to keep balance during this test. Quantitative analysis is warranted to define balance strategies for further validation of SEBT modifications to determine its utility as a balance screening tool.

  1. Validation of Code ASTEC with LIVE-L1 Experimental Results

    International Nuclear Information System (INIS)

    Bachrata, Andrea

    2008-01-01

    Severe accidents with core melting are considered at the design stage of Generation 3+ Nuclear Power Plants (NPPs). Moreover, there is an effort to apply severe accident management to operating NPPs. One of the main goals of severe accident mitigation is corium localization and stabilization. Two strategies fulfil this requirement: in-vessel retention (e.g. AP-600, AP-1000) and ex-vessel retention (e.g. EPR). To study the scenario of in-vessel retention, a large experimental program and integrated codes have been developed. The LIVE-L1 experimental facility studied the formation of melt pools and the melt accumulation in the lower head under different cooling conditions. A new European computer code, ASTEC, is being developed jointly in France and Germany. One of the important steps in ASTEC development in the area of in-vessel retention of corium is its validation against LIVE-L1 experimental results. Details of the experiment are reported. Results of the application of ASTEC (module DIVA) to the analysis of the test are presented. (author)

  2. Utilization of paleoclimate results to validate projections of a future greenhouse warming

    International Nuclear Information System (INIS)

    Crowley, T.J.

    1990-01-01

    Paleoclimate data provide a rich source of information for testing projections of future greenhouse trends. This paper summarizes the present state of the art in assessments of two important climate problems. (1) Validation of climate models: The same climate models that have been used to make greenhouse forecasts have also been used for paleoclimate simulations. Comparisons of model results and observations indicate some impressive successes but also some cases where there are significant divergences between models and observations. However, special conditions associated with the impressive successes could lead to false confidence in the models; the disagreements are a topic of greater concern. It remains to be determined whether the disagreements are due to model limitations or to uncertainties in the geologic data. (2) Role of CO2 as a significant climate feedback: Paleoclimate studies indicate that the climate system is generally more sensitive than our ability to model it. Addition or subtraction of CO2 leads to closer agreement between models and observations. In this respect, paleoclimate results generally support the conclusion that CO2 is an important climate feedback, with the magnitude of the feedback approximately comparable to the sensitivity of present climate models. If the CO2 projections are correct, comparison of the future warming with past warm periods indicates that there may be no geologic analogs for a future warming; the future greenhouse climate may represent a unique climate realization in Earth history

  3. A Comparison of Result Reliability for Investigation of Milk Composition by Alternative Analytical Methods in Czech Republic

    Directory of Open Access Journals (Sweden)

    Oto Hanuš

    2014-01-01

    Full Text Available The reliability of milk analysis results is important for quality assurance in the foodstuff chain. There are direct and indirect methods for measuring milk composition (fat (F), protein (P), lactose (L) and solids-non-fat (SNF) content). The goal was to evaluate some reference and routine milk analytical procedures on the basis of their results. The direct reference analyses were: F, fat content (Röse-Gottlieb method); P, crude protein content (Kjeldahl method); L, lactose (monohydrate; polarimetric method); and SNF, solids-non-fat (gravimetric method). F, P, L and SNF were also determined by various indirect methods: MIR (infrared (IR) technology with optical filters; 7 instruments in 4 labs); MIR-FT (IR spectroscopy with Fourier transformation; 10 in 6); the ultrasonic method (UM; 3 in 1); and analysis by the blue and red box (BRB; 1 in 1). Ten reference milk samples were used. The coefficient of determination (R2), the correlation coefficient (r) and the standard deviation of the mean of individual differences (MDsd, for n) were evaluated. All correlations (r; for all indirect and alternative methods and all milk components) were significant (P ≤ 0.001). MIR and MIR-FT (conventional methods) explained a considerably higher proportion of the variability in the reference results than the UM and BRB (alternative) methods. All r average values (x̄ minus 1.64 × sd, for a 95% confidence interval) can be used as standards for calibration quality evaluation (MIR, MIR-FT, UM and BRB): for F 0.997, 0.997, 0.99 and 0.995; for P 0.986, 0.981, 0.828 and 0.864; for L 0.968, 0.871, 0.705 and 0.761; for SNF 0.992, 0.993, 0.911 and 0.872. Similarly for MDsd (x̄ plus 1.64 × sd): for F 0.071, 0.068, 0.132 and 0.101%; for P 0.051, 0.054, 0.202 and 0.14%; for L 0.037, 0.074, 0.113 and 0.11%; for SNF 0.052, 0.068, 0.141 and 0.204%.

  4. Atom-number squeezing and bipartite entanglement of two-component Bose-Einstein condensates: analytical results

    Energy Technology Data Exchange (ETDEWEB)

    Jin, G R; Wang, X W; Li, D; Lu, Y W, E-mail: grjin@bjtu.edu.c [Department of Physics, Beijing Jiaotong University, Beijing 100044 (China)

    2010-02-28

    We investigate spin dynamics of a two-component Bose-Einstein condensate with weak Josephson coupling. Analytical expressions of atom-number squeezing and bipartite entanglement are presented for atom-atom repulsive interactions. For attractive interactions, there is no number squeezing; however, the squeezing parameter is still useful to recognize the appearance of Schroedinger's cat state.

  5. Validation of the WIMSD4M cross-section generation code with benchmark results

    International Nuclear Information System (INIS)

    Deen, J.R.; Woodruff, W.L.; Leal, L.E.

    1995-01-01

    The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh water moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D2O-moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented

  6. Validation of the WIMSD4M cross-section generation code with benchmark results

    Energy Technology Data Exchange (ETDEWEB)

    Deen, J.R.; Woodruff, W.L. [Argonne National Lab., IL (United States); Leal, L.E. [Oak Ridge National Lab., TN (United States)

    1995-01-01

    The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh water moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D2O-moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.

  7. A closed-form analytical model for predicting 3D boundary layer displacement thickness for the validation of viscous flow solvers

    Science.gov (United States)

    Kumar, V. R. Sanal; Sankar, Vigneshwaran; Chandrasekaran, Nichith; Saravanan, Vignesh; Natarajan, Vishnu; Padmanabhan, Sathyan; Sukumaran, Ajith; Mani, Sivabalan; Rameshkumar, Tharikaa; Nagaraju Doddi, Hema Sai; Vysaprasad, Krithika; Sharan, Sharad; Murugesh, Pavithra; Shankar, S. Ganesh; Nejaamtheen, Mohammed Niyasdeen; Baskaran, Roshan Vignesh; Rahman Mohamed Rafic, Sulthan Ariff; Harisrinivasan, Ukeshkumar; Srinivasan, Vivek

    2018-02-01

    A closed-form analytical model is developed for estimating the 3D boundary-layer-displacement thickness of an internal flow system at the Sanal flow choking condition for adiabatic flows obeying the physics of compressible viscous fluids. At this unique condition the boundary-layer blockage induced fluid-throat choking and the adiabatic wall-friction persuaded flow choking occur at a single sonic-fluid-throat location. The beauty and novelty of this model is that without missing the flow physics we could predict the exact boundary-layer blockage of both 2D and 3D cases at the sonic-fluid-throat from the known values of the inlet Mach number, the adiabatic index of the gas and the inlet port diameter of the internal flow system. We found that the 3D blockage factor is 47.33 % lower than the 2D blockage factor with air as the working fluid. We concluded that the exact prediction of the boundary-layer-displacement thickness at the sonic-fluid-throat provides a means to correctly pinpoint the causes of errors of the viscous flow solvers. The methodology presented herein with state-of-the-art will play pivotal roles in future physical and biological sciences for a credible verification, calibration and validation of various viscous flow solvers for high-fidelity 2D/3D numerical simulations of real-world flows. Furthermore, our closed-form analytical model will be useful for the solid and hybrid rocket designers for the grain-port-geometry optimization of new generation single-stage-to-orbit dual-thrust-motors with the highest promising propellant loading density within the given envelope without manifestation of the Sanal flow choking leading to possible shock waves causing catastrophic failures.
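
The closed-form blockage model itself is not reproduced in this record, so the sketch below is not the authors' model. It only illustrates the standard isentropic area-Mach relation for a calorically perfect gas, the textbook relation underlying any sonic-fluid-throat (choked-flow) analysis of an internal flow system:

```python
def area_ratio(M: float, gamma: float = 1.4) -> float:
    """Isentropic area ratio A/A* for Mach number M
    (calorically perfect gas with adiabatic index gamma)."""
    t = (2.0 / (gamma + 1.0)) * (1.0 + 0.5 * (gamma - 1.0) * M * M)
    return (1.0 / M) * t ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))

# Air entering a duct at M = 0.2 chokes once the area is reduced by a
# factor of about 2.96; at the sonic throat the ratio is exactly 1.
ratio_inlet = area_ratio(0.2)   # ~2.9635
ratio_throat = area_ratio(1.0)  # 1.0
```

A real blockage estimate would additionally account for wall friction and the boundary-layer displacement thickness, which is precisely what the paper's closed-form model adds on top of this inviscid relation.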

  8. A closed-form analytical model for predicting 3D boundary layer displacement thickness for the validation of viscous flow solvers

    Directory of Open Access Journals (Sweden)

    V. R. Sanal Kumar

    2018-02-01

    Full Text Available A closed-form analytical model is developed for estimating the 3D boundary-layer-displacement thickness of an internal flow system at the Sanal flow choking condition for adiabatic flows obeying the physics of compressible viscous fluids. At this unique condition the boundary-layer blockage induced fluid-throat choking and the adiabatic wall-friction persuaded flow choking occur at a single sonic-fluid-throat location. The beauty and novelty of this model is that without missing the flow physics we could predict the exact boundary-layer blockage of both 2D and 3D cases at the sonic-fluid-throat from the known values of the inlet Mach number, the adiabatic index of the gas and the inlet port diameter of the internal flow system. We found that the 3D blockage factor is 47.33 % lower than the 2D blockage factor with air as the working fluid. We concluded that the exact prediction of the boundary-layer-displacement thickness at the sonic-fluid-throat provides a means to correctly pinpoint the causes of errors of the viscous flow solvers. The methodology presented herein with state-of-the-art will play pivotal roles in future physical and biological sciences for a credible verification, calibration and validation of various viscous flow solvers for high-fidelity 2D/3D numerical simulations of real-world flows. Furthermore, our closed-form analytical model will be useful for the solid and hybrid rocket designers for the grain-port-geometry optimization of new generation single-stage-to-orbit dual-thrust-motors with the highest promising propellant loading density within the given envelope without manifestation of the Sanal flow choking leading to possible shock waves causing catastrophic failures.

  9. Analytical Validation of Quantitative Real-Time PCR Methods for Quantification of Trypanosoma cruzi DNA in Blood Samples from Chagas Disease Patients.

    Science.gov (United States)

    Ramírez, Juan Carlos; Cura, Carolina Inés; da Cruz Moreira, Otacilio; Lages-Silva, Eliane; Juiz, Natalia; Velázquez, Elsa; Ramírez, Juan David; Alberti, Anahí; Pavia, Paula; Flores-Chávez, María Delmans; Muñoz-Calderón, Arturo; Pérez-Morales, Deyanira; Santalla, José; Marcos da Matta Guedes, Paulo; Peneau, Julie; Marcet, Paula; Padilla, Carlos; Cruz-Robles, David; Valencia, Edward; Crisante, Gladys Elena; Greif, Gonzalo; Zulantay, Inés; Costales, Jaime Alfredo; Alvarez-Martínez, Miriam; Martínez, Norma Edith; Villarroel, Rodrigo; Villarroel, Sandro; Sánchez, Zunilda; Bisio, Margarita; Parrado, Rudy; Maria da Cunha Galvão, Lúcia; Jácome da Câmara, Antonia Cláudia; Espinoza, Bertha; Alarcón de Noya, Belkisyole; Puerta, Concepción; Riarte, Adelina; Diosque, Patricio; Sosa-Estani, Sergio; Guhl, Felipe; Ribeiro, Isabela; Aznar, Christine; Britto, Constança; Yadón, Zaida Estela; Schijman, Alejandro G

    2015-09-01

    An international study was performed by 26 experienced PCR laboratories from 14 countries to assess the performance of duplex quantitative real-time PCR (qPCR) strategies on the basis of TaqMan probes for detection and quantification of parasitic loads in peripheral blood samples from Chagas disease patients. Two methods were studied: Satellite DNA (SatDNA) qPCR and kinetoplastid DNA (kDNA) qPCR. Both methods included an internal amplification control. Reportable range, analytical sensitivity, limits of detection and quantification, and precision were estimated according to international guidelines. In addition, inclusivity and exclusivity were estimated with DNA from stocks representing the different Trypanosoma cruzi discrete typing units and Trypanosoma rangeli and Leishmania spp. Both methods were challenged against 156 blood samples provided by the participant laboratories, including samples from acute and chronic patients with varied clinical findings, infected by oral route or vectorial transmission. kDNA qPCR showed better analytical sensitivity than SatDNA qPCR with limits of detection of 0.23 and 0.70 parasite equivalents/mL, respectively. Analyses of clinical samples revealed a high concordance in terms of sensitivity and parasitic loads determined by both SatDNA and kDNA qPCRs. This effort is a major step toward international validation of qPCR methods for the quantification of T. cruzi DNA in human blood samples, aiming to provide an accurate surrogate biomarker for diagnosis and treatment monitoring for patients with Chagas disease. Copyright © 2015 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
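
As a hedged illustration of how qPCR quantification of this kind works (the actual SatDNA/kDNA standard curves are not given in the record, so the dilution series and Cq values below are hypothetical), a standard curve is fitted to log-transformed standards and inverted to report parasite equivalents/mL:

```python
def fit_standard_curve(log10_conc, cq):
    """Least-squares line Cq = slope * log10(conc) + intercept."""
    n = len(cq)
    mx = sum(log10_conc) / n
    my = sum(cq) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_conc, cq))
    sxx = sum((x - mx) ** 2 for x in log10_conc)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical 10-fold dilution series of a T. cruzi DNA standard
# (parasite equivalents/mL) and the observed quantification cycles.
log10_std = [0.0, 1.0, 2.0, 3.0]   # 1, 10, 100, 1000 par. eq./mL
cq_std = [33.2, 29.9, 26.6, 23.3]  # slope -3.3 ~ 100% efficiency
slope, intercept = fit_standard_curve(log10_std, cq_std)
efficiency = 10 ** (-1.0 / slope) - 1.0  # amplification efficiency

def quantify(cq):
    """Invert the curve: sample Cq -> parasite equivalents/mL."""
    return 10 ** ((cq - intercept) / slope)
```

The limit of quantification reported in such validations is the lowest standard at which this back-calculation stays within predefined precision and accuracy bounds.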

  10. Validation of an online risk calculator for the prediction of anastomotic leak after colon cancer surgery and preliminary exploration of artificial intelligence-based analytics.

    Science.gov (United States)

    Sammour, T; Cohen, L; Karunatillake, A I; Lewis, M; Lawrence, M J; Hunter, A; Moore, J W; Thomas, M L

    2017-11-01

    Recently published data support the use of a web-based risk calculator ( www.anastomoticleak.com ) for the prediction of anastomotic leak after colectomy. The aim of this study was to externally validate this calculator on a larger dataset. Consecutive adult patients undergoing elective or emergency colectomy for colon cancer at a single institution over a 9-year period were identified using the Binational Colorectal Cancer Audit database. Patients with a rectosigmoid cancer, an R2 resection, or a diverting ostomy were excluded. The primary outcome was anastomotic leak within 90 days as defined by previously published criteria. Area under receiver operating characteristic curve (AUROC) was derived and compared with that of the American College of Surgeons National Surgical Quality Improvement Program ® (ACS NSQIP) calculator and the colon leakage score (CLS) calculator for left colectomy. Commercially available artificial intelligence-based analytics software was used to further interrogate the prediction algorithm. A total of 626 patients were identified. Four hundred and fifty-six patients met the inclusion criteria, and 402 had complete data available for all the calculator variables (126 had a left colectomy). Laparoscopic surgery was performed in 39.6% and emergency surgery in 14.7%. The anastomotic leak rate was 7.2%, with 31.0% requiring reoperation. The anastomoticleak.com calculator was significantly predictive of leak and performed better than the ACS NSQIP calculator (AUROC 0.73 vs 0.58) and the CLS calculator (AUROC 0.96 vs 0.80) for left colectomy. Artificial intelligence-predictive analysis supported these findings and identified an improved prediction model. The anastomotic leak risk calculator is significantly predictive of anastomotic leak after colon cancer resection. Wider investigation of artificial intelligence-based analytics for risk prediction is warranted.
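
AUROC, the figure of merit used to compare the calculators above, can be computed directly from its rank interpretation: the probability that a randomly chosen leak case receives a higher predicted risk than a randomly chosen non-leak case. A minimal sketch with hypothetical data:

```python
def auroc(labels, scores):
    """AUROC via the Mann-Whitney U statistic: probability that a
    random positive outscores a random negative (ties count 1/2)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: label 1 = anastomotic leak, score = predicted risk.
labels = [1, 1, 0, 0, 0]
scores = [0.40, 0.15, 0.20, 0.10, 0.05]
result = auroc(labels, scores)  # 5/6: one positive is outranked once
```

An AUROC of 0.5 corresponds to chance-level discrimination, which is why the 0.58 reported for the ACS NSQIP calculator is a weak result.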

  11. Optimisation and validation of analytical methods for the simultaneous extraction of antioxidants: application to the analysis of tomato sauces.

    Science.gov (United States)

    Motilva, Maria-José; Macià, Alba; Romero, Maria-Paz; Labrador, Agustín; Domínguez, Alba; Peiró, Lluís

    2014-11-15

    In the present study, simultaneous extraction of natural antioxidants (phenols and carotenoids) in complex matrices, such as tomato sauces, is presented. The tomato sauce antioxidant compounds studied were the phenolics hydroxytyrosol, from virgin olive oil, quercetin and its derivatives, from onions, and quercetin-rutinoside as well as the carotenoid, lycopene (cis and trans), from tomatoes. These antioxidant compounds were extracted simultaneously with n-hexane/acetone/ethanol (50/25/25, v/v/v). The phenolics were analysed by ultra-performance liquid chromatography coupled with tandem mass spectrometry (UPLC-MS/MS), and lycopene (cis- and trans-forms) was analysed using high-performance liquid chromatography coupled to a diode array detector (HPLC-DAD). After studying the parameters of these methods, they were applied to the analysis of virgin olive oil, fresh onion, tomato concentrate and tomato powder, and five commercial tomato sauces. Subsequently, the results obtained in our laboratory were compared with those from the Gallina Blanca Star Group laboratory. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. A meta-analytic review of the MMPI validity scales and indexes to detect defensiveness in custody evaluations

    Directory of Open Access Journals (Sweden)

    Francisca Fariña

    2017-01-01

    Full Text Available Background/Objective: In child custody disputes, the forensic psychologist's tasks include assessing parental competencies as well as suspecting defensiveness (dissimulation). For this dual task, the instrument of reference is the MMPI. Method: To establish the state of the art, a meta-analysis was carried out on 32 primary studies, from which 256 effect sizes were obtained. Effect sizes were corrected for sampling error and criterion unreliability. Results: The results showed a positive, significant, large, and generalizable true mean effect size for the L, K, S, and MP scales and the L + K and L + K-F indexes. For Wsd it was also positive, significant, and large, but not generalizable. For F and the F-K index it was negative and significant, but not generalizable for F and generalizable for F-K. The effect sizes of the L, K, S, and MP scales and the L + K-F and L + K indexes were found to be equal. Parent gender (father vs. mother) and evaluation context (parents disputing child custody vs. assessment of parental capacity) were studied as moderators. Conclusions: The usefulness of these results for forensic practice is discussed.
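
The corrections named in the Method section (sampling error and criterion unreliability) follow standard psychometric meta-analysis practice: a sample-size-weighted mean effect size and disattenuation for criterion reliability. A minimal sketch with hypothetical primary-study values, not the study's data:

```python
import math

def correct_for_criterion_unreliability(r, ryy):
    """Disattenuate an observed correlation for unreliability of
    the criterion: r_c = r / sqrt(r_yy)."""
    return r / math.sqrt(ryy)

def weighted_mean_r(rs, ns):
    """Sample-size-weighted mean effect size (bare-bones meta-analysis)."""
    return sum(r * n for r, n in zip(rs, ns)) / sum(ns)

# Hypothetical primary-study correlations and sample sizes.
rs = [0.45, 0.60, 0.52]
ns = [120, 300, 210]
r_bar = weighted_mean_r(rs, ns)                              # ~0.545
r_true = correct_for_criterion_unreliability(r_bar, ryy=0.81)
```

Generalizability is then typically judged from how much of the observed variance across studies remains after these artifact corrections.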

  13. Thermodynamic properties of 1-naphthol: Mutual validation of experimental and computational results

    International Nuclear Information System (INIS)

    Chirico, Robert D.; Steele, William V.; Kazakov, Andrei F.

    2015-01-01

    Highlights: • Heat capacities were measured for the temperature range 5 K to 445 K. • Vapor pressures were measured for the temperature range 370 K to 570 K. • Computed and derived properties for ideal gas entropies are in excellent accord. • The enthalpy of combustion was measured and shown to be consistent with reliable literature values. • Thermodynamic consistency analysis revealed anomalous literature data. - Abstract: Thermodynamic properties for 1-naphthol (Chemical Abstracts registry number [90-15-3]) in the ideal-gas state are reported based on both experimental and computational methods. Measured properties included the triple-point temperature, enthalpy of fusion, and heat capacities for the crystal and liquid phases by adiabatic calorimetry; vapor pressures by inclined-piston manometry and comparative ebulliometry; and the enthalpy of combustion of the crystal phase by oxygen bomb calorimetry. Critical properties were estimated. Entropies for the ideal-gas state were derived from the experimental studies for the temperature range 298.15 ⩽ T/K ⩽ 600, and independent statistical calculations were performed based on molecular geometry optimization and vibrational frequencies calculated at the B3LYP/6-31+G(d,p) level of theory. The mutual validation of the independent experimental and computed results is achieved with a scaling factor of 0.975 applied to the calculated vibrational frequencies. This same scaling factor was successfully applied in the analysis of results for other polycyclic molecules, as described in a series of recent articles by this research group. This article reports the first extension of this approach to a hydroxy-aromatic compound. All experimental results are compared with property values reported in the literature. Thermodynamic consistency between properties is used to show that several studies in the literature are erroneous. The enthalpy of combustion for 1-naphthol was also measured in this research, and excellent

  14. THE GLOBAL TANDEM-X DEM: PRODUCTION STATUS AND FIRST VALIDATION RESULTS

    Directory of Open Access Journals (Sweden)

    M. Huber

    2012-07-01

    Full Text Available The TanDEM-X mission will derive a global digital elevation model (DEM) with satellite SAR interferometry. Two radar satellites (TerraSAR-X and TanDEM-X) will map the Earth at a resolution and accuracy corresponding to an absolute height error of 10 m and a relative height error of 2 m for 90% of the data. In order to fulfill the height requirements, in general two global coverages are acquired and processed. Besides the final TanDEM-X DEM, an intermediate DEM with reduced accuracy is produced after the first coverage is completed. The last step in the whole workflow for generating the TanDEM-X DEM is the calibration of remaining systematic height errors and the merging of single acquisitions into 1°x1° DEM tiles. In this paper the current status of generating the intermediate DEM and first validation results based on GPS tracks, laser scanning DEMs, SRTM data and ICESat points are shown for different test sites.
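
Validation against GPS tracks or ICESat points typically reduces to order statistics of the height differences at check points; the "90% of the data" requirement corresponds to the 90th-percentile absolute height error (LE90). A minimal sketch with hypothetical check-point heights:

```python
def le90(dem_heights, ref_heights):
    """90th-percentile absolute height error (LE90) of DEM heights
    against reference heights (e.g. GPS tracks or ICESat points)."""
    errs = sorted(abs(d - r) for d, r in zip(dem_heights, ref_heights))
    idx = max(0, int(round(0.9 * len(errs))) - 1)
    return errs[idx]

# Hypothetical heights (m) at ten check points.
dem = [102.1, 98.7, 250.3, 301.0, 55.2, 60.1, 75.5, 80.0, 90.9, 120.4]
ref = [101.0, 99.0, 249.0, 300.2, 55.0, 61.5, 75.0, 81.2, 90.0, 121.0]
error_90 = le90(dem, ref)  # 1.3 m for this toy data
```

The absolute requirement would be checked against absolute references such as GPS, while the relative requirement compares neighbouring DEM posts within a tile.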

  15. Apar-T: code, validation, and physical interpretation of particle-in-cell results

    Science.gov (United States)

    Melzani, Mickaël; Winisdoerffer, Christophe; Walder, Rolf; Folini, Doris; Favre, Jean M.; Krastanov, Stefan; Messmer, Peter

    2013-10-01

    simulations. The other is that the level of electric field fluctuations scales as 1/ΛPIC ∝ p. We provide a corresponding exact expression, taking into account the finite superparticle size. We confirm both expectations with simulations. Fourth, we compare the Vlasov-Maxwell theory, often used for code benchmarking, to the PIC model. The former describes a phase-space fluid with Λ = + ∞ and no correlations, while the PIC plasma features a small Λ and a high level of correlations when compared to a real plasma. These differences have to be kept in mind when interpreting and validating PIC results against the Vlasov-Maxwell theory and when modeling real physical plasmas.

  16. A Mathematical Model for Reactions During Top-Blowing in the AOD Process: Validation and Results

    Science.gov (United States)

    Visuri, Ville-Valtteri; Järvinen, Mika; Kärnä, Aki; Sulasalmi, Petri; Heikkinen, Eetu-Pekka; Kupari, Pentti; Fabritius, Timo

    2017-06-01

    In earlier work, a fundamental mathematical model was proposed for side-blowing operation in the argon oxygen decarburization (AOD) process. In the preceding part "Derivation of the Model," a new mathematical model was proposed for reactions during top-blowing in the AOD process. In this model it was assumed that reactions occur simultaneously at the surface of the cavity caused by the gas jet and at the surface of the metal droplets ejected from the metal bath. This paper presents validation and preliminary results with twelve industrial heats. In the studied heats, the last combined-blowing stage was altered so that oxygen was introduced from the top lance only. Four heats were conducted using an oxygen-nitrogen mixture (1:1), while eight heats were conducted with pure oxygen. Simultaneously, nitrogen or argon gas was blown via tuyères in order to provide mixing that is comparable to regular practice. The measured carbon content varied from 0.4 to 0.5 wt pct before the studied stage to 0.1 to 0.2 wt pct after the studied stage. The results suggest that the model is capable of predicting changes in metal bath composition and temperature with a reasonably high degree of accuracy. The calculations indicate that the top slag may supply oxygen for decarburization during top-blowing. Furthermore, it is postulated that the metal droplets generated by the shear stress of top-blowing create a large mass exchange area, which plays an important role in enabling the high decarburization rates observed during top-blowing in the AOD process. The overall rate of decarburization attributable to top-blowing in the last combined-blowing stage was found to be limited by the mass transfer of dissolved carbon.

  17. Validation of analytical method to calculate the concentration of conjugated monoclonal antibody; Validacao de metodo analitico para calculo de concentracao de anticorpo monoclonal conjugado

    Energy Technology Data Exchange (ETDEWEB)

    Alcarde, Lais F.; Massicano, Adriana V.F.; Oliveira, Ricardo S.; Araujo, Elaine B. de, E-mail: lais_alcarde@hotmail.com [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

    The objective of this study was to develop a quantitative analytical method using high-performance liquid chromatography (HPLC) to determine the concentration of antibody conjugated to a bifunctional chelator. Assays were performed on a high-performance liquid chromatograph under the following conditions: flow rate of 1 mL/min, 15 min run time, 0.2 M sodium phosphate buffer pH 7.0 as the mobile phase, and a BioSep SEC S-3000 molecular-exclusion column (300 x 7.8 mm, 5 μm - Phenomenex). The calibration curve was obtained with the monoclonal antibody (AcM) serially diluted in 0.2 M sodium phosphate buffer pH 7.0, yielding concentrations of 400 μg/mL, 200 μg/mL, 100 μg/mL, 50 μg/mL, 25 μg/mL and 12.5 μg/mL. From the calibration curve, the equation of the line was calculated and used to determine the concentration of the immunoconjugate. To ensure the validity of the method, accuracy and precision studies were conducted. The accuracy test consisted of evaluating three samples of known concentration, at low (50 μg/mL), medium (100 μg/mL) and high (200 μg/mL) concentrations. The precision test consisted of three consecutive measurements of one sample of known concentration, under the conditions set forth above. The correlation coefficient of the standard curve was greater than 97%, and both accuracy and precision were satisfactory, including at low concentrations. The method was thus validated as accurate and precise for determining the concentration of the immunoconjugate. Furthermore, this assay proved extremely important: using the correct mass of the protein, the radiochemical purity of the radioimmunoconjugate was above 95% in all studies.
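
The quantification step described here (serial-dilution calibration curve, equation of the line, back-calculation of concentration) can be sketched as follows; the peak areas are hypothetical, since the record reports only the concentrations of the standards:

```python
def linear_fit(x, y):
    """Ordinary least squares y = a*x + b for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

# Serial-dilution standards (ug/mL) with hypothetical peak areas.
conc = [12.5, 25.0, 50.0, 100.0, 200.0, 400.0]
area = [61.0, 125.0, 252.0, 498.0, 1001.0, 1999.0]
a, b = linear_fit(conc, area)

def concentration(peak_area):
    """Invert the calibration line to recover the immunoconjugate
    concentration of an unknown sample."""
    return (peak_area - b) / a

# Accuracy check at the 100 ug/mL standard: percent recovery.
recovery = 100.0 * concentration(498.0) / 100.0
```

An accuracy study then compares such recoveries at low, medium and high levels against acceptance limits, while precision is the spread of repeated back-calculated concentrations.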

  18. The influence of bilirubin, haemolysis and turbidity on 20 analytical tests performed on automatic analysers. Results of an interlaboratory study.

    Science.gov (United States)

    Grafmeyer, D; Bondon, M; Manchon, M; Levillain, P

    1995-01-01

    A laboratory director must be able to report reliable results for routine tests on automatic analysers regardless of the clinical context. In practice, however, hyperbilirubinaemia may be present in some circumstances, parenteral nutrition may cause turbidity in others, and haemolysis may occur when sampling is difficult. For this reason, the Commission for Instrumentation of the Société Française de Biologie Clinique (SFBC) (president Alain Feuillu) decided to investigate "visible" interferences--bilirubin, haemolysis and turbidity--and their effect on 20 major tests: 13 substrates/chemistries (albumin, calcium, cholesterol, creatinine, glucose, iron, magnesium, phosphorus, total bilirubin, total proteins, triacylglycerols, uric acid and urea) and 7 enzymatic activities (alkaline phosphatase, alanine aminotransferase, alpha-amylase, aspartate aminotransferase, creatine kinase, gamma-glutamyl transferase and lactate dehydrogenase), measured on 15 automatic analysers representative of those found on the French market (Astra 8, AU 510, AU 5010, AU 5000, Chem 1, CX 7, Dax 72, Dimension, Ektachem, Hitachi 717, Hitachi 737, Hitachi 747, Monarch, Open 30, Paramax, Wako 30 R), and to determine how much they affect the accuracy of results under routine laboratory conditions. The study was carried out following the SFBC protocol for the validation of techniques, using plasma pools spiked with bilirubin, ditauro-bilirubin, haemoglobin (from haemolysate) and Intralipid (turbidity). Overall, the following results were obtained: haemolysis affects tests the most often (34.5% of cases); total bilirubin interferes in 21.7% of cases; direct bilirubin and turbidity seem to interfere less, at around 17%. The different tests are not affected to the same extent: enzyme activities are hardly affected at all, whereas certain major tests are extremely sensitive, increasingly so through the following: creatinine (interference of bilirubin), triacylglycerols (interference of bilirubin and
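
Interference in such studies is usually expressed as the relative bias of the spiked pool against the native pool. A minimal sketch with hypothetical creatinine values (the 10% acceptance limit is illustrative, not the SFBC criterion):

```python
def interference_pct(spiked, baseline):
    """Relative bias introduced by an interferent, as a percentage
    of the baseline (unspiked pool) result."""
    return 100.0 * (spiked - baseline) / baseline

# Hypothetical creatinine results (umol/L) on one analyser:
baseline = 80.0        # native plasma pool
with_bilirubin = 68.0  # same pool spiked with bilirubin
bias = interference_pct(with_bilirubin, baseline)  # -15.0%
significant = abs(bias) > 10.0  # illustrative acceptance limit
```

Repeating this per analyte, per interferent concentration, and per analyser yields the kind of frequency table (e.g. "haemolysis interferes in 34.5% of cases") summarized in the abstract.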

  19. New experimental and analytical results for diffusion and swelling of resins used in graphite/epoxy composite materials

    Science.gov (United States)

    Hiel, C. C.; Adamson, M. J.

    1986-01-01

    The epoxy resins currently in use can slowly absorb moisture from the atmosphere over a long period. This reduces those mechanical properties of composites which depend strongly on the matrix, such as compressive strength and buckling instabilities. The effect becomes greater at elevated temperatures. The paper will discuss new phenomena which occur under simultaneous temperature and moisture variations. An analytical model will also be discussed and documented.

  20. Validity of the Framingham point scores in the elderly: results from the Rotterdam study.

    Science.gov (United States)

    Koller, Michael T; Steyerberg, Ewout W; Wolbers, Marcel; Stijnen, Theo; Bucher, Heiner C; Hunink, M G Myriam; Witteman, Jacqueline C M

    2007-07-01

    The National Cholesterol Education Program recommends assessing 10-year risk of coronary heart disease (CHD) in individuals free of established CHD with the Framingham Point Scores (FPS). Individuals with a risk >20% are classified as high risk and are candidates for preventive intervention. We aimed to validate the FPS in a European population of elderly subjects. Subjects free of established CHD at baseline were selected from the Rotterdam study, a population-based cohort of subjects 55 years or older in The Netherlands. We studied calibration, discrimination (c-index), and the accuracy of high-risk classifications. Events consisted of fatal CHD and nonfatal myocardial infarction. Among 6795 subjects, 463 died because of CHD and 336 had nonfatal myocardial infarction. Predicted 10-year risk of CHD was on average well calibrated for women (9.9% observed vs 10.1% predicted) but showed substantial overestimation in men (14.3% observed vs 19.8% predicted), particularly with increasing age. This resulted in a substantial number of false-positive classifications (specificity 70%) in men. In women, discrimination of the FPS was better than that in men (c-index 0.73 vs 0.63, respectively). However, because of the low baseline risk of CHD and limited discriminatory power, only 33% of all CHD events occurred in women classified as high risk. The FPS need recalibration for elderly men with better incorporation of the effect of age. In elderly women, FPS perform reasonably well. However, maintaining the rationale of the high-risk threshold requires better-performing models for a population with low incidence of CHD.
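
The clinical classification behind these numbers is the >20% high-risk rule; sensitivity and specificity at that threshold can be computed as below (hypothetical predicted risks and outcomes, not Rotterdam data):

```python
def sens_spec(risks, events, threshold=0.20):
    """Sensitivity/specificity of the 'high risk' rule risk > threshold."""
    tp = sum(1 for r, e in zip(risks, events) if r > threshold and e)
    fn = sum(1 for r, e in zip(risks, events) if r <= threshold and e)
    tn = sum(1 for r, e in zip(risks, events) if r <= threshold and not e)
    fp = sum(1 for r, e in zip(risks, events) if r > threshold and not e)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical 10-year predicted CHD risks and observed events.
risks = [0.05, 0.25, 0.30, 0.10, 0.22, 0.08, 0.15, 0.35]
events = [False, True, False, False, True, False, True, True]
sens, spec = sens_spec(risks, events)  # 0.75, 0.75 for this toy data
```

Overestimated risks push more event-free men above the threshold (more false positives, lower specificity), which is exactly the miscalibration pattern the study reports.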

  1. Validity testing and neuropsychology practice in the VA healthcare system: results from a recent practitioner survey.

    Science.gov (United States)

    Young, J Christopher; Roper, Brad L; Arentsen, Timothy J

    2016-05-01

    A survey of neuropsychologists in the Veterans Health Administration examined symptom/performance validity test (SPVT) practices and estimated base rates for patient response bias. Invitations were emailed to 387 psychologists employed within the Veterans Affairs (VA), identified as likely practicing neuropsychologists, resulting in 172 respondents (44.4% response rate). Practice areas varied, with 72% at least partially practicing in general neuropsychology clinics and 43% conducting VA disability exams. Mean estimated failure rates were 23.0% for clinical outpatient, 12.9% for inpatient, and 39.4% for disability exams. Failure rates were the highest for mTBI and PTSD referrals. Failure rates were positively correlated with the number of cases seen and frequency and number of SPVT use. Respondents disagreed regarding whether one (45%) or two (47%) failures are required to establish patient response bias, with those administering more measures employing the more stringent criterion. Frequency of the use of specific SPVTs is reported. Base rate estimates for SPVT failure in VA disability exams are comparable to those in other medicolegal settings. However, failure in routine clinical exams is much higher in the VA than in other settings, possibly reflecting the hybrid nature of the VA's role in both healthcare and disability determination. Generally speaking, VA neuropsychologists use SPVTs frequently and eschew pejorative terms to describe their failure. Practitioners who require only one SPVT failure to establish response bias may overclassify patients. Those who use few or no SPVTs may fail to identify response bias. Additional clinical and theoretical implications are discussed.

  2. [Biological markers for the status of vitamins B12 and D: the importance of some analytical aspects in relation to clinical interpretation of results].

    Science.gov (United States)

    Boulat, O; Rey, F; Mooser, V

    2012-10-31

    When vitamin B12 deficiency is expressed clinically, the diagnostic performance of total cobalamin is identical to that of holotranscobalamin II. In subclinical B12 deficiency, the two aforementioned markers perform less well, and additional analysis of a second, functional marker (methylmalonate or homocysteine) is recommended. The different analytical approaches for quantifying 25-hydroxyvitamin D, the marker of vitamin D deficiency, are not yet standardized: measurement biases of up to +/- 20% compared with the original method used to establish threshold values are still observed.

  3. Challenges of forest landscape modeling - simulating large landscapes and validating results

    Science.gov (United States)

    Hong S. He; Jian Yang; Stephen R. Shifley; Frank R. Thompson

    2011-01-01

    Over the last 20 years, we have seen a rapid development in the field of forest landscape modeling, fueled by both technological and theoretical advances. Two fundamental challenges have persisted since the inception of FLMs: (1) balancing realistic simulation of ecological processes at broad spatial and temporal scales with computing capacity, and (2) validating...

  4. The Arabic Scale of Death Anxiety (ASDA): Its Development, Validation, and Results in Three Arab Countries

    Science.gov (United States)

    Abdel-Khalek, Ahmed M.

    2004-01-01

    The Arabic Scale of Death Anxiety (ASDA) was constructed and validated in a sample of undergraduates (17-33 yrs) in 3 Arab countries, Egypt (n = 418), Kuwait (n = 509), and Syria (n = 709). In its final form, the ASDA consists of 20 statements. Each item is answered on a 5-point intensity scale anchored by 1: No, and 5: Very much. Alpha…

  5. Double-contained receiver tank 244-TX, grab samples, 244TX-97-1 through 244TX-97-3 analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final report for the double-contained receiver tank (DCRT) 244-TX grab samples. Three grab samples were collected from riser 8 on May 29, 1997. Analyses were performed in accordance with the Compatibility Grab Sampling and Analysis Plan (TSAP) and the Data Quality Objectives for Tank Farms Waste Compatibility Program (DQO). The analytical results are presented in a table.

  6. Analytical Validation of a Highly Quantitative, Sensitive, Accurate, and Reproducible Assay (HERmark®) for the Measurement of HER2 Total Protein and HER2 Homodimers in FFPE Breast Cancer Tumor Specimens

    Directory of Open Access Journals (Sweden)

    Jeffrey S. Larson

    2010-01-01

    Full Text Available We report here the results of the analytical validation of assays that measure HER2 total protein (H2T) and HER2 homodimer (H2D) expression in Formalin Fixed Paraffin Embedded (FFPE) breast cancer tumors as well as cell line controls. The assays are based on the VeraTag technology platform and are commercially available through a central CAP-accredited clinical reference laboratory. The accuracy of H2T measurements spans a broad dynamic range (2-3 logs) as evaluated by comparison with cross-validating technologies. The measurement of H2T expression demonstrates a sensitivity that is approximately 7–10 times greater than conventional immunohistochemistry (IHC; HercepTest). The HERmark assay is a quantitative assay that sensitively and reproducibly measures continuous H2T and H2D protein expression levels and therefore may have the potential to stratify patients more accurately with respect to response to HER2-targeted therapies than current methods, which rely on semiquantitative protein measurements (IHC) or on indirect assessments of gene amplification (FISH).

  7. Precise orbit determination for quad-constellation satellites at Wuhan University: strategy, result validation, and comparison

    Science.gov (United States)

    Guo, Jing; Xu, Xiaolong; Zhao, Qile; Liu, Jingnan

    2016-02-01

    This contribution summarizes the strategy used by Wuhan University (WHU) to determine precise orbit and clock products for the Multi-GNSS Experiment (MGEX) of the International GNSS Service (IGS). In particular, the satellite attitude, phase center corrections, and solar radiation pressure model developed and used for BDS satellites are addressed. In addition, this contribution analyzes the orbit and clock quality of the quad-constellation products from MGEX Analysis Centers (ACs) for a common time period of 1 year (2014). With IGS final GPS and GLONASS products as the reference, Multi-GNSS products of WHU (indicated by WUM) show the best agreement among these products from all MGEX ACs in both accuracy and stability. 3D Day Boundary Discontinuities (DBDs) range from 8 to 27 cm for Galileo-IOV satellites among all ACs' products, with the WUM ones the largest (about 26.2 cm). Among the three types of BDS satellites, MEOs show the smallest DBDs, from 10 to 27 cm, whereas the DBDs for all ACs' products are at the decimeter-to-meter level for GEOs and the one-to-three-decimeter level for IGSOs, respectively. As to the satellite laser ranging (SLR) validation for Galileo-IOV satellites, the accuracy evaluated by SLR residuals is at the one-decimeter level, with the well-known systematic bias of about -5 cm for all ACs. For BDS satellites, the accuracy reaches the decimeter, one-decimeter, and centimeter level for GEOs, IGSOs, and MEOs, respectively. However, there is a noticeable bias in the GEO SLR residuals. In addition, systematic errors dependent on orbit angle, related to mismodeled solar radiation pressure (SRP), are present for BDS GEOs and IGSOs. The results of Multi-GNSS combined kinematic PPP demonstrate that the best position accuracy and fastest convergence are achieved using WUM products, particularly in the Up direction. Furthermore, the accuracy of static BDS-only PPP degrades when the BDS IGSO and MEO satellites switch to orbit-normal orientation.

  8. A meta-analytic investigation of conscientiousness in the prediction of job performance: examining the intercorrelations and the incremental validity of narrow traits.

    Science.gov (United States)

    Dudley, Nicole M; Orvis, Karin A; Lebiecki, Justin E; Cortina, José M

    2006-01-01

    Researchers of broad and narrow traits have debated whether narrow traits are important to consider in the prediction of job performance. Because personality-performance relationship meta-analyses have focused almost exclusively on the Big Five, the predictive power of narrow traits has not been adequately examined. In this study, the authors address this question by meta-analytically examining the degree to which the narrow traits of conscientiousness predict above and beyond global conscientiousness. Results suggest that narrow traits do incrementally predict performance above and beyond global conscientiousness, yet the degree to which they contribute depends on the particular performance criterion and occupation in question. Overall, the results of this study suggest that there are benefits to considering the narrow traits of conscientiousness in the prediction of performance. (c) 2006 APA, all rights reserved.
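    The incremental-validity question above is typically answered by hierarchical regression: fit performance on the global trait alone, then add the narrow facets and compare the change in R². A minimal sketch on synthetic data (variable names, effect sizes, and sample size are invented for illustration, not taken from the meta-analysis):

```python
import numpy as np

def r_squared(X, y):
    """R^2 from an ordinary least-squares fit of y on X (intercept included)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / (((y - y.mean()) ** 2).sum())

rng = np.random.default_rng(0)
n = 500
# Hypothetical data: a global trait plus two narrow facets carrying unique variance.
global_c = rng.normal(size=n)
facet1 = 0.7 * global_c + 0.7 * rng.normal(size=n)
facet2 = 0.7 * global_c + 0.7 * rng.normal(size=n)
performance = 0.3 * global_c + 0.2 * facet1 + 0.2 * facet2 + rng.normal(size=n)

r2_global = r_squared(global_c.reshape(-1, 1), performance)
r2_full = r_squared(np.column_stack([global_c, facet1, facet2]), performance)
delta_r2 = r2_full - r2_global  # incremental validity of the narrow facets
```

    A positive delta_r2 is the "predict above and beyond" criterion the authors test; in practice the increment is evaluated per performance criterion and occupation.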

  9. Validity of proposed DSM-5 diagnostic criteria for nicotine use disorder: results from 734 Israeli lifetime smokers

    Science.gov (United States)

    Shmulewitz, D.; Wall, M.M.; Aharonovich, E.; Spivak, B.; Weizman, A.; Frisch, A.; Grant, B. F.; Hasin, D.

    2013-01-01

    Background The fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) proposes aligning nicotine use disorder (NUD) criteria with those for other substances, by including the current DSM fourth edition (DSM-IV) nicotine dependence (ND) criteria, three abuse criteria (neglect roles, hazardous use, interpersonal problems) and craving. Although NUD criteria indicate one latent trait, evidence is lacking on: (1) validity of each criterion; (2) validity of the criteria as a set; (3) comparative validity between DSM-5 NUD and DSM-IV ND criterion sets; and (4) NUD prevalence. Method Nicotine criteria (DSM-IV ND, abuse and craving) and external validators (e.g. smoking soon after awakening, number of cigarettes per day) were assessed with a structured interview in 734 lifetime smokers from an Israeli household sample. Regression analysis evaluated the association between validators and each criterion. Receiver operating characteristic analysis assessed the association of the validators with the DSM-5 NUD set (number of criteria endorsed) and tested whether DSM-5 or DSM-IV provided the most discriminating criterion set. Changes in prevalence were examined. Results Each DSM-5 NUD criterion was significantly associated with the validators, with strength of associations similar across the criteria. As a set, DSM-5 criteria were significantly associated with the validators, were significantly more discriminating than DSM-IV ND criteria, and led to increased prevalence of binary NUD (two or more criteria) over ND. Conclusions All findings address previous concerns about the DSM-IV nicotine diagnosis and its criteria and support the proposed changes for DSM-5 NUD, which should result in improved diagnosis of nicotine disorders. PMID:23312475
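    The receiver operating characteristic analysis used above to compare criterion sets can be sketched with the Mann-Whitney formulation of the ROC area. The data below are simulated purely for illustration; the validator prevalence and criterion-count distributions are assumptions, not the study's data:

```python
import numpy as np

def auc(scores, labels):
    """ROC area via the Mann-Whitney statistic: the probability that a randomly
    chosen positive case outscores a randomly chosen negative one (ties = 1/2)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical data: number of DSM-5 criteria endorsed (0-11) and a binary
# external validator such as smoking within 30 minutes of waking.
rng = np.random.default_rng(7)
n = 734
validator = rng.random(n) < 0.4
dsm5_count = np.clip(rng.poisson(2 + 4 * validator), 0, 11)
auc_dsm5 = auc(dsm5_count, validator)
```

    Comparing such AUCs for the DSM-5 and DSM-IV criterion counts against the same validator is the "most discriminating criterion set" test described above.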

  10. Development of a validation test for self-reported abstinence from smokeless tobacco products: preliminary results

    International Nuclear Information System (INIS)

    Robertson, J.B.; Bray, J.T.

    1988-01-01

    Using X-ray fluorescence spectrometry, 11 heavy elements at concentrations that are easily detectable have been identified in smokeless tobacco products. These concentrations were found to increase in cheek epithelium samples of the user after exposure to smokeless tobacco. This feasibility study suggests that the level of strontium in the cheek epithelium could be a valid measure of recent smokeless tobacco use. It also demonstrates that strontium levels become undetectable within several days of smokeless tobacco cessation. This absence of strontium could validate a self-report of abstinence from smokeless tobacco. Finally, the X-ray spectrum of heavy metal content of cheek epithelium from smokeless tobacco users could itself provide a visual stimulus to further motivate the user to terminate the use of smokeless tobacco products.

  11. Satisfaction with information provided to Danish cancer patients: validation and survey results.

    Science.gov (United States)

    Ross, Lone; Petersen, Morten Aagaard; Johnsen, Anna Thit; Lundstrøm, Louise Hyldborg; Groenvold, Mogens

    2013-11-01

    To validate five items (CPWQ-inf) regarding satisfaction with information provided to cancer patients from health care staff, assess the prevalence of dissatisfaction with this information, and identify factors predicting dissatisfaction. The questionnaire was validated by patient-observer agreement and cognitive interviews. The prevalence of dissatisfaction was assessed in a cross-sectional sample of all cancer patients in contact with hospitals during the past year in three Danish counties. The validation showed that the CPWQ performed well. Between 3 and 23% of the 1490 participating patients were dissatisfied with each of the measured aspects of information. The highest level of dissatisfaction was reported regarding the guidance, support and help provided when the diagnosis was given. Younger patients were consistently more dissatisfied than older patients. The brief CPWQ performs well for survey purposes. The survey depicts the heterogeneous patient population encountered by hospital staff and showed that younger patients probably had higher expectations or a higher need for information and that those with more severe diagnoses/prognoses require extra care in providing information. Four brief questions can efficiently assess information needs. With increasing demands for information, a wide range of innovative initiatives is needed. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  12. Large-scale Validation of AMIP II Land-surface Simulations: Preliminary Results for Ten Models

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, T J; Henderson-Sellers, A; Irannejad, P; McGuffie, K; Zhang, H

    2005-12-01

    This report summarizes initial findings of a large-scale validation of the land-surface simulations of ten atmospheric general circulation models that are entries in phase II of the Atmospheric Model Intercomparison Project (AMIP II). This validation is conducted by AMIP Diagnostic Subproject 12 on Land-surface Processes and Parameterizations, which is focusing on putative relationships between the continental climate simulations and the associated models' land-surface schemes. The selected models typify the diversity of representations of land-surface climate that are currently implemented by the global modeling community. The current dearth of global-scale terrestrial observations makes exacting validation of AMIP II continental simulations impractical. Thus, selected land-surface processes of the models are compared with several alternative validation data sets, which include merged in-situ/satellite products, climate reanalyses, and off-line simulations of land-surface schemes that are driven by observed forcings. The aggregated spatio-temporal differences between each simulated process and a chosen reference data set then are quantified by means of root-mean-square error statistics; the differences among alternative validation data sets are similarly quantified as an estimate of the current observational uncertainty in the selected land-surface process. Examples of these metrics are displayed for land-surface air temperature, precipitation, and the latent and sensible heat fluxes. It is found that the simulations of surface air temperature, when aggregated over all land and seasons, agree most closely with the chosen reference data, while the simulations of precipitation agree least. In the latter case, there also is considerable inter-model scatter in the error statistics, with the reanalysis estimates of precipitation resembling the AMIP II simulations more than the chosen reference data. In aggregate, the simulations of land-surface latent and
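    The root-mean-square error statistic used above to quantify simulation-reference differences can be sketched as follows; the toy grids and the cosine-latitude area weighting are illustrative assumptions, not the subproject's actual data:

```python
import numpy as np

def rmse(sim, ref, weights=None):
    """Root-mean-square error between a simulated and a reference field,
    optionally area-weighted (e.g. by cos(latitude) on a regular grid)."""
    sim = np.asarray(sim, dtype=float)
    ref = np.asarray(ref, dtype=float)
    sq = (sim - ref) ** 2
    if weights is None:
        return np.sqrt(sq.mean())
    w = np.asarray(weights, dtype=float)
    return np.sqrt((w * sq).sum() / w.sum())

# Toy 3x4 lat-lon surface air temperature grids (invented numbers, K):
ref = np.array([[280.0, 281, 282, 283],
                [290, 291, 292, 293],
                [285, 286, 287, 288]])
sim = ref + np.array([[1.0, -1, 2, 0],
                      [0, 1, -2, 1],
                      [2, 0, 1, -1]])
lats = np.deg2rad([60.0, 0.0, -60.0])
w = np.repeat(np.cos(lats)[:, None], 4, axis=1)  # area weights per row
err = rmse(sim, ref, weights=w)
```

    Applying the same statistic between two alternative validation data sets gives the observational-uncertainty estimate mentioned above.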

  13. Development and validation of analytical methods for quality control of Micocilen powder

    Directory of Open Access Journals (Sweden)

    Yania Suárez Pérez

    2011-06-01

    Micocilen is a drug presented as a powder containing two active pharmaceutical ingredients: undecylenic acid and zinc undecylenate. Because of its fungistatic action it has become a high-demand product in Cuba, where the warm, humid climate favors mycoses. Two analytical methods for quality control were developed and validated, based on the quantification of each analyte present in the formulation. Volumetric techniques were selected: aqueous neutralization titration and complexometry. The results were satisfactory, since in both cases adequate specificity, linearity, accuracy, precision and robustness were obtained. The proposed methods were compared with those previously applied and, according to the statistical analysis performed, gave considerably more reliable results, with no significant differences between replicates of the same batch.

  14. The ASCAT soil moisture product. A Review of its specifications, validation results, and emerging applications

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, Wolfgang; Hahn, Sebastian; Kidd, Richard [Vienna Univ. of Technology (Austria). Dept. of Geodesy and Geoinformation]; and others

    2013-02-15

    To provide a comprehensive overview of the major characteristics and caveats of the ASCAT soil moisture product, this paper describes the ASCAT instrument and the soil moisture processor and near-real-time distribution service implemented by the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). A review of the most recent validation studies shows that the quality of the ASCAT soil moisture product is - with the exception of arid environments - comparable to, and over some regions (e.g. Europe) even better than, currently available soil moisture data derived from passive microwave sensors. Further, a review of application studies shows that the use of the ASCAT soil moisture product is particularly advanced in the fields of numerical weather prediction and hydrologic modelling, while first progress can also be noted in other application areas such as yield monitoring, epidemiologic modelling, and societal risk assessment. Considering the generally positive evaluation results, it is expected that the ASCAT soil moisture product will increasingly be used by a growing number of rather diverse land applications. (orig.)

  15. The ASCAT Soil Moisture Product: A Review of its Specifications, Validation Results, and Emerging Applications

    Directory of Open Access Journals (Sweden)

    Wolfgang Wagner

    2013-02-01

    To provide a comprehensive overview of the major characteristics and caveats of the ASCAT soil moisture product, this paper describes the ASCAT instrument and the soil moisture processor and near-real-time distribution service implemented by the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). A review of the most recent validation studies shows that the quality of the ASCAT soil moisture product is - with the exception of arid environments - comparable to, and over some regions (e.g. Europe) even better than, currently available soil moisture data derived from passive microwave sensors. Further, a review of application studies shows that the use of the ASCAT soil moisture product is particularly advanced in the fields of numerical weather prediction and hydrologic modelling, while first progress can also be noted in other application areas such as yield monitoring, epidemiologic modelling, and societal risk assessment. Considering the generally positive evaluation results, it is expected that the ASCAT soil moisture product will increasingly be used by a growing number of rather diverse land applications.

  16. Hospital blood bank information systems accurately reflect patient transfusion: results of a validation study.

    Science.gov (United States)

    McQuilten, Zoe K; Schembri, Nikita; Polizzotto, Mark N; Akers, Christine; Wills, Melissa; Cole-Sinclair, Merrole F; Whitehead, Susan; Wood, Erica M; Phillips, Louise E

    2011-05-01

    Hospital transfusion laboratories collect information regarding blood transfusion and some registries gather clinical outcomes data without transfusion information, providing an opportunity to integrate these two sources to explore effects of transfusion on clinical outcomes. However, the use of laboratory information system (LIS) data for this purpose has not been validated previously. Validation of LIS data against individual patient records was undertaken at two major centers. Data regarding all transfusion episodes were analyzed over seven 24-hour periods. Data regarding 596 units were captured including 399 red blood cell (RBC), 95 platelet (PLT), 72 plasma, and 30 cryoprecipitate units. They were issued to: inpatient 221 (37.1%), intensive care 109 (18.3%), outpatient 95 (15.9%), operating theater 45 (7.6%), emergency department 27 (4.5%), and unrecorded 99 (16.6%). All products recorded by LIS as issued were documented as transfused to intended patients. Median time from issue to transfusion initiation could be calculated for 535 (89.8%) components: RBCs 16 minutes (95% confidence interval [CI], 15-18 min; interquartile range [IQR], 7-30 min), PLTs 20 minutes (95% CI, 15-22 min; IQR, 10-37 min), fresh-frozen plasma 33 minutes (95% CI, 14-83 min; IQR, 11-134 min), and cryoprecipitate 3 minutes (95% CI, -10 to 42 min; IQR, -15 to 116 min). Across a range of blood component types and destinations comparison of LIS data with clinical records demonstrated concordance. The difference between LIS timing data and patient clinical records reflects expected time to transport, check, and prepare transfusion but does not affect the validity of linkage for most research purposes. Linkage of clinical registries with LIS data can therefore provide robust information regarding individual patient transfusion. This enables analysis of joint data sets to determine the impact of transfusion on clinical outcomes. © 2010 American Association of Blood Banks.

  17. Assessing the validity of single-item life satisfaction measures: results from three large samples.

    Science.gov (United States)

    Cheung, Felix; Lucas, Richard E

    2014-12-01

    The present paper assessed the validity of single-item life satisfaction measures by comparing single-item measures to the Satisfaction with Life Scale (SWLS), a more psychometrically established measure. Two large samples from Washington (N = 13,064) and Oregon (N = 2,277) recruited by the Behavioral Risk Factor Surveillance System and a representative German sample (N = 1,312) recruited by the German Socio-Economic Panel were included in the present analyses. Single-item life satisfaction measures and the SWLS were correlated with theoretically relevant variables, such as demographics, subjective health, domain satisfaction, and affect. The correlations between the two life satisfaction measures and these variables were examined to assess the construct validity of single-item life satisfaction measures. Consistent across the three samples, single-item life satisfaction measures demonstrated a substantial degree of criterion validity with the SWLS (zero-order r = 0.62-0.64; disattenuated r = 0.78-0.80). Patterns of statistical significance for correlations with theoretically relevant variables were the same across single-item measures and the SWLS. Single-item measures did not produce systematically different correlations compared to the SWLS (average difference = 0.001-0.005). The average absolute difference in the magnitudes of the correlations produced by single-item measures and the SWLS was very small (average absolute difference = 0.015-0.042). Single-item life satisfaction measures performed very similarly to the multiple-item SWLS. Social scientists would get virtually identical answers to substantive questions regardless of which measure they use.
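    The disattenuated correlations quoted above follow Spearman's correction for attenuation, r_true = r_xy / sqrt(rel_x * rel_y). A minimal sketch, with reliability values that are assumed for illustration (the abstract does not report them):

```python
import math

def disattenuate(r_xy, rel_x, rel_y):
    """Correct an observed correlation for measurement unreliability
    (Spearman's classic attenuation formula)."""
    return r_xy / math.sqrt(rel_x * rel_y)

# Hypothetical inputs chosen for illustration, not taken from the study:
r_obs = 0.63        # observed correlation between the two measures
rel_single = 0.70   # assumed reliability of the single-item measure
rel_swls = 0.90     # assumed internal consistency of the SWLS
r_true = disattenuate(r_obs, rel_single, rel_swls)
```

    With these assumed reliabilities, an observed r of 0.63 disattenuates to about 0.79, which happens to land in the 0.78-0.80 range reported above.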

  18. Results of a survey on accident and safety analysis codes, benchmarks, verification and validation methods

    International Nuclear Information System (INIS)

    Lee, A.G.; Wilkin, G.B.

    1995-01-01

    This report is a compilation of the information submitted by AECL, CIAE, JAERI, ORNL and Siemens in response to a need identified at the 'Workshop on R and D Needs' at the IGORR-3 meeting. The survey compiled information on the national standards applied to the Safety Quality Assurance (SQA) programs undertaken by the participants. Information was assembled for the computer codes and nuclear data libraries used in accident and safety analyses for research reactors and the methods used to verify and validate the codes and libraries. Although the survey was not comprehensive, it provides a basis for exchanging information of common interest to the research reactor community.

  19. Effect of Changes in Prolactin RIA Reactants on the Validity of the Results

    International Nuclear Information System (INIS)

    Ahmed, A.M.; Megahed, Y.M.; El Mosallamy, M.A.F.; El-Khoshnia, R.A.M.

    1998-01-01

    Human prolactin plays an essential role in the secretion of milk and can suppress gonadal function. This study is a trial addressing technical problems introduced by the operator in the RIA technique, with the aim of selecting optimized, reliable and valid parameters for the measurement of prolactin concentration in human sera. Prolactin concentration was measured in a normal control group and a chronic renal failure group using the optimized technique. The optimized technique proved well suited to the measurement of prolactin.

  20. Micro-homogeneity of candidate reference materials: Results from an intercomparison study for the Analytical Quality Control Services (AQCS) of the IAEA

    International Nuclear Information System (INIS)

    Rossbach, M.; Kniewald, G.

    2002-01-01

    The IAEA Analytical Quality Control Services (AQCS) has made available two single-cell algae materials, IAEA-392 and IAEA-393, as well as an urban dust, IAEA-396, to study their use for analytical sample sizes in the milligram range and below. Micro-analytical techniques such as PIXE and μ-PIXE, solid sampling AAS, scanning electron microprobe X-ray analysis and INAA were applied to the determination of trace elements on the basis of μg to mg amounts of the selected materials. The comparability of the mean values as well as the reproducibility of successive measurements are evaluated in order to compare relative homogeneity factors for many elements in the investigated materials. From the reported results, the algae materials IAEA-392 and IAEA-393 appear to be extremely homogeneous biological materials for a number of elements, with an extraordinarily sharp particle size distribution below 10 μm. A similar situation holds for the urban dust material IAEA-396, which had been air-jet milled to a particle size distribution around 4 μm. The introduction of these materials as CRMs, for which only very small amounts are needed to determine the certified concentrations, will help to meet the needs of micro-analytical techniques for natural matrix reference materials. (author)

  1. Experimental results and validation of a method to reconstruct forces on the ITER test blanket modules

    International Nuclear Information System (INIS)

    Zeile, Christian; Maione, Ivan A.

    2015-01-01

    Highlights: • An in operation force measurement system for the ITER EU HCPB TBM has been developed. • The force reconstruction methods are based on strain measurements on the attachment system. • An experimental setup and a corresponding mock-up have been built. • A set of test cases representing ITER relevant excitations has been used for validation. • The influence of modeling errors on the force reconstruction has been investigated. - Abstract: In order to reconstruct forces on the test blanket modules in ITER, two force reconstruction methods, the augmented Kalman filter and a model predictive controller, have been selected and developed to estimate the forces based on strain measurements on the attachment system. A dedicated experimental setup with a corresponding mock-up has been designed and built to validate these methods. A set of test cases has been defined to represent possible excitation of the system. It has been shown that the errors in the estimated forces mainly depend on the accuracy of the identified model used by the algorithms. Furthermore, it has been found that a minimum of 10 strain gauges is necessary to allow for a low error in the reconstructed forces.
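    A force-from-strain Kalman filter of the kind selected above can be sketched as follows. The calibration matrix, noise covariances, and random-walk force model are invented for illustration; the actual TBM attachment-system model is identified experimentally:

```python
import numpy as np

# Minimal sketch: estimate a single force from 10 strain gauges with a
# Kalman filter, modeling the force as a slowly varying random walk.
rng = np.random.default_rng(1)
n_gauges, n_steps = 10, 200
H = rng.normal(size=(n_gauges, 1))   # assumed strain response per unit force
R = 1e-2 * np.eye(n_gauges)          # strain measurement noise covariance
Q = np.array([[1e-1]])               # process noise for the random-walk force
x = np.zeros((1, 1))                 # force estimate
P = np.eye(1)                        # estimate covariance

true_force = 5.0
estimates = []
for _ in range(n_steps):
    # Simulated strain measurement (noise std matches R)
    z = H @ np.array([[true_force]]) + 0.1 * rng.normal(size=(n_gauges, 1))
    # Predict
    P = P + Q
    # Update with the strain measurement
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(1) - K @ H) @ P
    estimates.append(float(x[0, 0]))
```

    As the abstract notes, the estimation error in practice is dominated by how accurately the identified model (here H) matches the real structure, not by the filter itself.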

  2. ASTER Global Digital Elevation Model Version 2 - summary of validation results

    Science.gov (United States)

    Tachikawa, Tetushi; Kaku, Manabu; Iwasaki, Akira; Gesch, Dean B.; Oimoen, Michael J.; Zhang, Z.; Danielson, Jeffrey J.; Krieger, Tabatha; Curtis, Bill; Haase, Jeff; Abrams, Michael; Carabajal, C.; Meyer, Dave

    2011-01-01

    On June 29, 2009, NASA and the Ministry of Economy, Trade and Industry (METI) of Japan released a Global Digital Elevation Model (GDEM) to users worldwide at no charge as a contribution to the Global Earth Observing System of Systems (GEOSS). This “version 1” ASTER GDEM (GDEM1) was compiled from over 1.2 million scene-based DEMs covering land surfaces between 83°N and 83°S latitudes. A joint U.S.-Japan validation team assessed the accuracy of the GDEM1, augmented by a team of 20 cooperators. The GDEM1 was found to have an overall accuracy of around 20 meters at the 95% confidence level. The team also noted several artifacts associated with poor stereo coverage at high latitudes, cloud contamination, water masking issues and the stacking process used to produce the GDEM1 from individual scene-based DEMs (ASTER GDEM Validation Team, 2009). Two independent horizontal resolution studies estimated the effective spatial resolution of the GDEM1 to be on the order of 120 meters.
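    Vertical accuracy at the 95% confidence level is conventionally reported as LE95 ≈ 1.96 × RMSE for zero-mean, normally distributed elevation errors. A sketch on synthetic errors (the error distribution is an assumption, with its scale chosen so LE95 comes out near the 20 m figure quoted above):

```python
import numpy as np

def linear_error_95(dem_errors):
    """Vertical accuracy at 95% confidence.
    For zero-mean Gaussian errors LE95 ~ 1.96 * RMSE; the empirical 95th
    percentile of |error| serves as a distribution-free cross-check."""
    e = np.asarray(dem_errors, dtype=float)
    rmse = np.sqrt((e ** 2).mean())
    return 1.96 * rmse, np.percentile(np.abs(e), 95)

# Synthetic elevation errors (meters), invented for illustration:
rng = np.random.default_rng(42)
errors = rng.normal(loc=0.0, scale=10.2, size=100_000)
le95_gauss, le95_empirical = linear_error_95(errors)
```

    When the two numbers diverge, the error distribution is non-Gaussian (e.g. cloud or water-mask artifacts) and the empirical percentile is the safer figure.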

  3. Analytical results for a conditional phase shift between single-photon pulses in a nonlocal nonlinear medium

    Science.gov (United States)

    Viswanathan, Balakrishnan; Gea-Banacloche, Julio

    2018-03-01

    It has been suggested that second-order nonlinearities could be used for quantum logic at the single-photon level. Specifically, successive two-photon processes could in principle accomplish the phase shift (conditioned on the presence of two photons in the low-frequency modes) |011⟩ → i|100⟩ → -|011⟩. We have analyzed a recent scheme proposed by Xia et al. [Phys. Rev. Lett. 116, 023601 (2016)], 10.1103/PhysRevLett.116.023601, to induce such a conditional phase shift between two single-photon pulses propagating at different speeds through a nonlinear medium with a nonlocal response. We present here an analytical solution for the most general case, i.e., for an arbitrary response function, initial state, and pulse velocity, which supports their numerical observation that a π phase shift with unit fidelity is possible, in principle, in an appropriate limit. We also discuss why this is possible in this system, despite the theoretical objections to the possibility of conditional phase shifts on single photons raised some time ago by Shapiro [Phys. Rev. A 73, 062305 (2006)], 10.1103/PhysRevA.73.062305, and by one of us, Gea-Banacloche [Phys. Rev. A 81, 043823 (2010)], 10.1103/PhysRevA.81.043823.

  4. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  5. Fast and simultaneous monitoring of organic pollutants in a drinking water treatment plant by a multi-analyte biosensor followed by LC-MS validation.

    Science.gov (United States)

    Rodriguez-Mozaz, Sara; de Alda, Maria J López; Barceló, Damià

    2006-04-15

    This work describes the application of an optical biosensor (RIver ANALyser, RIANA) to the simultaneous analysis of three relevant environmental organic pollutants, namely, the pesticides atrazine and isoproturon and the estrogen estrone, in real water samples. This biosensor is based on an indirect inhibition immunoassay which takes place at a chemically modified optical transducer chip. The spatially resolved modification of the transducer surface allows the simultaneous determination of selected target analytes by means of "total internal reflection fluorescence" (TIRF). The performance of the immunosensor method developed was evaluated against a well accepted traditional method based on solid-phase extraction followed by liquid chromatography-mass spectrometry (LC-MS). The chromatographic method was superior in terms of linearity, sensitivity and accuracy, and the biosensor method in terms of repeatability, speed, cost and automation. The application of both methods in parallel to determine the occurrence and removal of atrazine, isoproturon and estrone throughout the treatment process (sand filtration, ozonation, activated carbon filtration and chlorination) in a waterworks showed an overestimation of results in the case of the biosensor, which was partially attributed to matrix and cross-reactivity effects, in spite of the addition of ovalbumin to the sample to minimize matrix interferences. Based on the comparative performance of both techniques, the biosensor emerges as a suitable tool for fast, simple and automated screening of water pollutants without sample pretreatment. To the author's knowledge, this is the first description of the application of the biosensor RIANA in the multi-analyte configuration to the regular monitoring of pollutants in a waterworks.

  6. Interlaboratory programs for improving the quality of analytical results

    Directory of Open Access Journals (Sweden)

    Queenie Siu Hang Chui

    2004-12-01

    Interlaboratory programs are conducted for a number of purposes: to identify problems related to the calibration of instruments, to assess the degree of equivalence of analytical results among several laboratories, to assign quantity values and their uncertainties in the development of a certified reference material, and to verify the performance of laboratories, as in proficiency testing, a key quality assurance technique that is sometimes used in conjunction with accreditation. Several statistical tools are employed to assess the analytical results of laboratories participating in an intercomparison program, among them the z-score technique, the ellipse of confidence, and the Grubbs and Cochran tests. This work presents the experience of coordinating an intercomparison exercise to determine Ca, Al, Fe, Ti and Mn as impurities in samples of chemical-grade silicon metal prepared as a candidate reference material.
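    The z-score technique mentioned above scores each laboratory as z = (x − X)/σ_p, where X is the assigned value and σ_p the standard deviation for proficiency assessment, with the conventional bands |z| ≤ 2 satisfactory, 2 < |z| < 3 questionable, and |z| ≥ 3 unsatisfactory. A minimal sketch with invented laboratory results:

```python
def z_score(result, assigned_value, sigma_p):
    """Proficiency-testing z-score: (x - X) / sigma_p."""
    return (result - assigned_value) / sigma_p

def rating(z):
    """Conventional interpretation bands for proficiency-testing z-scores."""
    az = abs(z)
    if az <= 2.0:
        return "satisfactory"
    if az < 3.0:
        return "questionable"
    return "unsatisfactory"

# Hypothetical Fe results (mg/kg) from four labs; assigned value and sigma_p
# are invented for illustration:
labs = {"A": 122.0, "B": 131.0, "C": 138.0, "D": 118.5}
scores = {lab: z_score(x, 120.0, 5.0) for lab, x in labs.items()}
ratings = {lab: rating(z) for lab, z in scores.items()}
```

    In a real round, X and σ_p come from the scheme design (e.g. a robust consensus of participants or a reference-material value), not from assumptions as here.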

  7. The development, validation and initial results of an integrated model for determining the environmental sustainability of biogas production pathways

    NARCIS (Netherlands)

    Pierie, Frank; van Someren, Christian; Benders, René M.J.; Bekkering, Jan; van Gemert, Wim; Moll, Henri C.

    2016-01-01

    Biogas produced through Anaerobic Digestion can be seen as a flexible and storable energy carrier. However, the environmental sustainability and efficiency of biogas production is not fully understood. Within this article the use, operation, structure, validation, and results of a model for the

  8. Pooled results from five validation studies of dietary self-report instruments using recovery biomarkers for potassium and sodium intake

    Science.gov (United States)

    We have pooled data from five large validation studies of dietary self-report instruments that used recovery biomarkers as referents to assess food frequency questionnaires (FFQs) and 24-hour recalls. We reported on total potassium and sodium intakes, their densities, and their ratio. Results were...

  9. [Critical reading of articles about diagnostic tests (part I): Are the results of the study valid?].

    Science.gov (United States)

    Arana, E

    2015-01-01

    In the era of evidence-based medicine, one of the most important skills a radiologist should have is the ability to analyze the diagnostic literature critically. This tutorial aims to present guidelines for determining whether primary diagnostic articles are valid for clinical practice. The following elements should be evaluated: whether the study can be applied to clinical practice, whether the technique was compared to the reference test, whether an appropriate spectrum of patients was included, whether expectation bias and verification bias were limited, the reproducibility of the study, the practical consequences of the study, the confidence intervals for the parameters analyzed, the normal range for continuous variables, and the placement of the test in the context of other diagnostic tests. We use elementary practical examples to illustrate how to select and interpret the literature on diagnostic imaging and specific references to provide more details. Copyright © 2014 SERAM. Published by Elsevier España, S.L.U. All rights reserved.

  10. Analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jae Seong

    1993-02-15

    This book comprises nineteen chapters, covering: an introduction to analytical chemistry; experimental error and statistics; chemical equilibrium and solubility; gravimetric analysis and the mechanism of precipitation; range and calculation of results; general principles of volumetric analysis; precipitation titrations and their titration curves; acid-base equilibria; acid-base titration curves; complexation and redox reactions; an introduction to electroanalytical chemistry; electrodes and potentiometry; electrolysis and conductometry; voltammetry and polarography; spectrophotometry; atomic spectrometry; solvent extraction; chromatography; and experiments.

  11. Analytical chemistry

    International Nuclear Information System (INIS)

    Choi, Jae Seong

    1993-02-01

    This book comprises nineteen chapters, covering: an introduction to analytical chemistry; experimental error and statistics; chemical equilibrium and solubility; gravimetric analysis and the mechanism of precipitation; range and calculation of results; general principles of volumetric analysis; precipitation titrations and their titration curves; acid-base equilibria; acid-base titration curves; complexation and redox reactions; an introduction to electroanalytical chemistry; electrodes and potentiometry; electrolysis and conductometry; voltammetry and polarography; spectrophotometry; atomic spectrometry; solvent extraction; chromatography; and experiments.

  12. Carbon and nitrogen determination in Zr by photon or proton activation analysis. Comparison between the results obtained by this method and other analytical techniques

    International Nuclear Information System (INIS)

    Petit, J.; Gosset, J.; Engelmann, C.

    1977-01-01

    Carbon and nitrogen are determined by the following nuclear reactions: ¹²C(γ,n)¹¹C and ¹⁴N(p,α)¹¹C. The performance of the method and the main interferences are considered. The process developed for the separation of carbon-11 from zirconium is described and its efficiency evaluated. The results obtained are compared with those given by different laboratories using various analytical techniques [fr

  13. Exact Analytic Result of Contact Value for the Density in a Modified Poisson-Boltzmann Theory of an Electrical Double Layer.

    Science.gov (United States)

    Lou, Ping; Lee, Jin Yong

    2009-04-14

    For a simple modified Poisson-Boltzmann (SMPB) theory, taking into account the finite ionic size, we have derived the exact analytic expression for the contact values of the difference profile of the counterion and co-ion, as well as of the sum (density) and product profiles, near a charged planar electrode that is immersed in a binary symmetric electrolyte. In the zero-ionic-size or dilute limit, these contact values reduce to the contact values of the Poisson-Boltzmann (PB) theory. The analytic results of the SMPB theory for the difference, sum, and product profiles were compared with the results of Monte Carlo (MC) simulations [Bhuiyan, L. B.; Outhwaite, C. W.; Henderson, D. J. Electroanal. Chem. 2007, 607, 54; Bhuiyan, L. B.; Henderson, D. J. Chem. Phys. 2008, 128, 117101], as well as of the PB theory. In general, the analytic expression of the SMPB theory gives better agreement with the MC data than the PB theory does. For the difference profile, as the electrode charge increases, the result of the PB theory departs from the MC data, but the SMPB theory still reproduces the MC data quite well, which indicates the importance of including steric effects in modeling diffuse-layer properties. As for the product profile, (i) it drops to zero as the electrode charge approaches infinity, and (ii) the speed of the drop increases with the ionic size; these behaviors are in contrast with the predictions of the PB theory, where the product is identically 1.
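    Two of the PB-limit statements in the abstract (the product profile is identically 1 in the PB theory, and the contact densities satisfy the contact-value theorem) can be checked numerically in reduced units using the standard Gouy-Chapman and Grahame relations for a 1:1 electrolyte. This sketch is illustrative only and does not reproduce the paper's SMPB expressions:

```python
import math

# Reduced units: potential phi in units of kT/e, densities in units of the
# bulk density n_b. PB profiles for a 1:1 electrolyte: g_pm(x) = exp(-/+ phi(x)),
# so the product g_+ * g_- is identically 1 everywhere, including at contact.

def pb_contact_sum(phi0):
    """Reduced contact density sum g_+(0) + g_-(0) from the PB profiles."""
    return math.exp(-phi0) + math.exp(phi0)  # = 2*cosh(phi0)

def contact_theorem_sum(phi0):
    """Same quantity from the contact-value theorem, 2 + sigma*^2, where the
    reduced surface charge sigma* follows the Grahame equation."""
    sigma_star = 2.0 * math.sinh(phi0 / 2.0)  # Grahame: sigma* = 2 sinh(phi0/2)
    return 2.0 + sigma_star**2

phi0 = 3.0  # reduced surface potential (assumed example value)
lhs = pb_contact_sum(phi0)
rhs = contact_theorem_sum(phi0)
product = math.exp(-phi0) * math.exp(phi0)  # PB product profile at contact
```

The identity 2·cosh(φ₀) = 2 + 4·sinh²(φ₀/2) guarantees the two routes agree, which is the PB special case that the paper's SMPB contact values reduce to in the dilute limit.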

  14. Analytical method development and validation for quantification of uranium in compounds of the nuclear fuel cycle by Fourier Transform Infrared (FTIR) Spectroscopy

    International Nuclear Information System (INIS)

    Pereira, Elaine

    2016-01-01

    This work presents a new, simple, low-cost methodology for direct quantification of uranium in compounds of the nuclear fuel cycle, based on Fourier Transform Infrared (FTIR) spectroscopy using the KBr pressed-disc technique. Uranium in two different matrices was used for development and validation: the UO₂(NO₃)₂·2TBP complex (TBP uranyl nitrate complex) in organic phase and uranyl nitrate (UO₂(NO₃)₂) in aqueous phase. The parameters used in the validation process were: linearity, selectivity, accuracy, limits of detection (LD) and quantitation (LQ), precision (repeatability and intermediate precision) and robustness. The method for uranium in organic phase (UO₂(NO₃)₂·2TBP complex in hexane, embedded in KBr) was linear (r = 0.9980) over the range of 0.20% to 2.85% U/KBr disc, with LD 0.02% and LQ 0.03%, accurate (recoveries were over 101.0%), robust and precise (RSD < 1.6%). The method for uranium in aqueous phase (UO₂(NO₃)₂ embedded in KBr) was linear (r = 0.9900) over the range of 0.14% to 1.29% U/KBr disc, with LD 0.01% and LQ 0.02%, accurate (recoveries were over 99.4%), robust and precise (RSD < 1.6%). Some process samples were analyzed by FTIR and compared with gravimetric and X-ray fluorescence (XRF) analyses, showing similar results for all three methods. Statistical tests (Student's t and Fisher's F) showed that the techniques are equivalent. The validated method can be successfully employed for routine quality control analysis of nuclear compounds. (author)
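    Validation figures like those reported above can be reproduced in form (not in value) from a calibration line. The sketch below uses hypothetical absorbance data and the common ICH-style estimates LD = 3.3σ/S and LQ = 10σ/S from the residual standard deviation of the regression, which may differ from the authors' exact procedure:

```python
import numpy as np

# Hypothetical calibration data: % U per KBr disc vs. FTIR absorbance.
conc = np.array([0.25, 0.50, 1.00, 1.50, 2.00, 2.85])
absorbance = np.array([0.051, 0.099, 0.202, 0.297, 0.405, 0.571])

# Least-squares calibration line and its residual standard deviation.
slope, intercept = np.polyfit(conc, absorbance, 1)
predicted = slope * conc + intercept
residual_sd = np.sqrt(np.sum((absorbance - predicted) ** 2) / (len(conc) - 2))

# ICH Q2(R1)-style detection and quantitation limits from the calibration line.
lod = 3.3 * residual_sd / slope
loq = 10.0 * residual_sd / slope

# Correlation coefficient used to report linearity (the paper's "r").
r = np.corrcoef(conc, absorbance)[0, 1]
```

By construction LQ/LD is always 10/3.3 ≈ 3, which matches the roughly 1.5- to 2-fold gap between the reported LD and LQ values only approximately; a laboratory may instead estimate LD/LQ from blank replicates.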

  15. Influence of centrifugation conditions on the results of 77 routine clinical chemistry analytes using standard vacuum blood collection tubes and the new BD-Barricor tubes.

    Science.gov (United States)

    Cadamuro, Janne; Mrazek, Cornelia; Leichtle, Alexander B; Kipman, Ulrike; Felder, Thomas K; Wiedemann, Helmut; Oberkofler, Hannes; Fiedler, Georg M; Haschke-Becher, Elisabeth

    2018-02-15

    Although centrifugation is performed on almost every blood sample, recommendations on duration and g-force are heterogeneous and mostly based on expert opinions. In order to unify this step in a fully automated laboratory, we aimed to evaluate different centrifugation settings and their influence on the results of routine clinical chemistry analytes. We collected blood from 41 healthy volunteers into BD Vacutainer PST II heparin-gel (LiHepGel), BD Vacutainer SST II serum, and BD Vacutainer Barricor heparin tubes with a mechanical separator (LiHepBar). Tubes were centrifuged at 2000×g for 10 minutes and at 3000×g for 7 and 5 minutes, respectively. Subsequently, 60 and 21 clinical chemistry analytes were measured in plasma and serum samples, respectively, using a Roche COBAS instrument. High-sensitivity troponin T, pregnancy-associated plasma protein A, β-human chorionic gonadotropin and rheumatoid factor had to be excluded from statistical evaluation, as many of the respective results were below the measuring range. Except for free haemoglobin (fHb) measurements, no analyte result was altered by the use of shorter centrifugation times at higher g-forces. Comparing LiHepBar to LiHepGel tubes at different centrifugation settings, we found higher lactate dehydrogenase (LD) values in the former (P = 0.003). We conclude that blood tubes may be centrifuged at higher speed (3000×g) for a shorter time (5 minutes) without alteration of the analytes tested in this study. When using LiHepBar tubes for blood collection, a separate LD reference value might be needed.

  16. Simulation analysis of impact tests of steel plate reinforced concrete and reinforced concrete slabs against aircraft impact and its validation with experimental results

    International Nuclear Information System (INIS)

    Sadiq, Muhammad; Xiu Yun, Zhu; Rong, Pan

    2014-01-01

    Highlights: • Simulation analysis is carried out with two constitutive concrete models. • The Winfrith model can better simulate the nonlinear response of concrete than the CSCM model. • The performance of steel plate reinforced concrete is better than reinforced concrete. • The thickness of safety-related structures can be reduced by adopting steel plates. • Analysis results, mainly the concrete material models, should be validated. - Abstract: Steel plate reinforced concrete and reinforced concrete structures are used in nuclear power plants for protection against aircraft impact. In order to compare the impact resistance performance of steel plate reinforced concrete and reinforced concrete slab panels, simulation analysis of 1/7.5 scale model impact tests is carried out using the finite element code ANSYS/LS-DYNA. The damage modes of all finite element models, the velocity time history curves of the aircraft engine and the damage to the aircraft model are compared with the impact test results for steel plate reinforced concrete and reinforced concrete slab panels. The results indicate that the finite element simulation results correlate well with the experimental results, especially for the Winfrith constitutive concrete model. Also, the impact resistance performance of steel plate reinforced concrete slab panels is better than that of reinforced concrete slab panels; in particular, the rear-face steel plate is more effective in preventing perforation and scabbing of concrete than conventional reinforced concrete construction. In this way, the thickness of steel plate reinforced concrete structures can be reduced in important structures like nuclear power plants subject to aircraft impact. The study also demonstrates the methodology to validate the analysis procedure with experimental and analytical studies, and it may be effectively employed to predict the precise response of safety-related structures against aircraft impact

  17. New colorimetric chemosensors for Cu²⁺ and Cd²⁺ ion detection: Application in environmental water samples and analytical method validation

    Energy Technology Data Exchange (ETDEWEB)

    Tekuri, Venkatadri; Trivedi, Darshak R., E-mail: darshak_rtrivedi@yahoo.co.in

    2017-06-15

    A new series of heterocyclic thiophene-2-carboxylic acid hydrazide based chemosensors R1 to R4 were designed, synthesized and characterized by various spectroscopic techniques such as FT-IR, UV-Vis, ¹H NMR, ¹³C NMR, mass spectrometry and SC-XRD. Chemosensor R3 showed a significant color change from colorless to yellow in the presence of Cu²⁺ ions, and chemosensor R4 showed a significant color change from colorless to yellow in the presence of Cd²⁺ ions, over the other tested cations such as Cr³⁺, Mn²⁺, Fe²⁺, Fe³⁺, Co²⁺, Ni²⁺, Zn²⁺, Ag²⁺, Al³⁺, Pb²⁺, Hg²⁺, K⁺, Ca²⁺ and Mg²⁺. The high selectivity and sensitivity of R3 towards Cu²⁺ and of R4 towards Cd²⁺ ions were confirmed by UV-Vis spectroscopic study. R3 showed a red shift in the presence of Cu²⁺ ions of Δλmax = 67 nm, and R4 showed a red shift in the presence of Cd²⁺ ions of Δλmax = 105 nm in the absorption spectrum. The binding stoichiometry of the R3-Cu²⁺ and R4-Cd²⁺ complexes was found to be 1:1 using the Benesi-Hildebrand (B-H) plot. Under optimized experimental conditions, R3 and R4 exhibit dynamic linear absorption response ranges from 0 to 50 μM for Cu²⁺ ions and from 0 to 30 μM for Cd²⁺ ions, with detection limits of 2.8 × 10⁻⁶ M for Cu²⁺ and 2.0 × 10⁻⁷ M for Cd²⁺ ions. The proposed analytical method for the quantitative determination of Cu²⁺ and Cd²⁺ ions was validated and successfully applied to environmental samples with good precision and accuracy. - Highlights: • Detection of Cu²⁺ and Cd²⁺ ions has gained significance by virtue of their key role in biological and environmental science. • R3 and R4 showed an instantaneous color change from colorless to yellow in the presence of Cu²⁺ and Cd²⁺ ions respectively. • The proposed detection methods were validated and
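    The 1:1 stoichiometry quoted above is typically extracted from a Benesi-Hildebrand (B-H) double-reciprocal plot: for 1:1 binding, 1/ΔA is linear in 1/[M], with the binding constant given by intercept/slope. A minimal sketch with synthetic data; the K_a and ΔA_max values are assumed, not taken from the paper:

```python
import numpy as np

# Synthetic 1:1 binding data (hypothetical K_a and dA_max values).
K_a, dA_max = 5.0e4, 0.60                        # M^-1, absorbance units
conc = np.array([5e-6, 1e-5, 2e-5, 3e-5, 5e-5])  # metal-ion concentration, M
dA = dA_max * K_a * conc / (1 + K_a * conc)      # 1:1 binding isotherm

# B-H linearisation: 1/dA = [1/(K_a*dA_max)] * (1/[M]) + 1/dA_max
slope, intercept = np.polyfit(1 / conc, 1 / dA, 1)
K_fit = intercept / slope   # binding constant from intercept/slope
dA_max_fit = 1 / intercept  # limiting absorbance change
```

Because the synthetic data follow the 1:1 isotherm exactly, the fit recovers the input parameters; with real absorbance data, curvature in the B-H plot is a common diagnostic that the 1:1 model does not hold.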

  18. Radionuclide migration in forest ecosystems - results of a model validation study

    International Nuclear Information System (INIS)

    Shaw, G.; Venter, A.; Avila, R.; Bergman, R.; Bulgakov, A.; Calmon, P.; Fesenko, S.; Frissel, M.; Goor, F.; Konoplev, A.; Linkov, I.; Mamikhin, S.; Moberg, L.; Orlov, A.; Rantavaara, A.; Spiridonov, S.; Thiry, Y.

    2005-01-01

    The primary objective of the IAEA's BIOMASS Forest Working Group (FWG) was to bring together experimental radioecologists and modellers to facilitate the exchange of information which could be used to improve our ability to understand and forecast radionuclide transfers within forests. This paper describes a blind model validation exercise which was conducted by the FWG to test nine models which members of the group had developed in response to the need to predict the fate of radiocaesium in forests in Europe after the Chernobyl accident. The outcomes and conclusions of this exercise are summarised. It was concluded that, as a group, the models are capable of providing an envelope of predictions which can be expected to enclose experimental data for radiocaesium contamination in forests over the time scale tested. However, the models are subject to varying degrees of conceptual uncertainty which gives rise to a very high degree of divergence between individual model predictions, particularly when forecasting edible mushroom contamination. Furthermore, the forecasting capability of the models over future decades currently remains untested

  19. Determination of polychlorinated dibenzodioxins and polychlorinated dibenzofurans (PCDDs/PCDFs) in food and feed using a bioassay. Result of a validation study

    Energy Technology Data Exchange (ETDEWEB)

    Gizzi, G.; Holst, C. von; Anklam, E. [Commission of the European Communities, Geel (Belgium). Joint Research Centre, Inst. for Reference Materials and Measurement, Food Safety and Quality Unit; Hoogenboom, R. [RIKILT-Intitute of Food Safety, Wageningen (Netherlands); Rose, M. [Defra Central Science Laboratory, Sand Hutton, York (United Kingdom)

    2004-09-15

    It is estimated that more than 90% of the dioxins consumed by humans come from foods derived from animals. The European Commission, through a Council Regulation (No 2375/2001) and a Directive (2001/102/EC), both revised by Commission Recommendation 2002/201/EC, has set maximum levels for dioxins in food and feedstuffs. To implement the regulation, dioxin-monitoring programs for food and feedstuffs will be undertaken by the Member States, requiring the analysis of large numbers of samples. Food and feed companies will have to control their products before putting them on the market. Monitoring for the presence of dioxins in food and feeds needs fast and cheap screening methods in order to select samples with potentially high levels of dioxins, which are then analysed by a confirmatory method like HRGC/HRMS. Bioassays like the DR CALUX® assay have been claimed to provide a suitable alternative for the screening of large numbers of samples, reducing costs and the required time of analysis. These methods have to comply with the specific characteristics set out in two Commission Directives (2002/69/EC; 2002/70/EC) establishing the requirements for the determination of dioxins and dioxin-like PCBs for the official control of food and feedstuffs. The European Commission's Joint Research Centre is pursuing validation of alternative techniques in food and feed materials. In order to evaluate the applicability of the DR CALUX® technique as a screening method in compliance with the Commission Directives, a validation study was organised in collaboration with CSL and RIKILT. The aim of validating an analytical method is first to determine its performance characteristics (e.g. variability, bias, rate of false positive and false negative results), and secondly to evaluate if the method is fit for the purpose. Two approaches are commonly used: an in-house validation is preferentially performed first in order to establish whether the method is

  20. Improvement of the decision efficiency of the accuracy profile by means of a desirability function for analytical methods validation. Application to a diacetyl-monoxime colorimetric assay used for the determination of urea in transdermal iontophoretic extracts.

    Science.gov (United States)

    Rozet, E; Wascotte, V; Lecouturier, N; Préat, V; Dewé, W; Boulanger, B; Hubert, Ph

    2007-05-22

    Validation is a widely used and regulated step for each analytical method. However, the classical approaches used to demonstrate a method's ability to quantify do not necessarily fulfill this objective. For this reason, an innovative methodology was recently introduced, based on the tolerance interval and the accuracy profile, which guarantees that a pre-defined proportion of future measurements obtained with the method will be included within the acceptance limits. The accuracy profile is an effective decision tool for assessing the validity of analytical methods, and the methodology to build such a profile is detailed here. However, like any visual tool it retains a degree of subjectivity. It was therefore necessary to make the decision process objective, in order to quantify the degree of adequacy of an accuracy profile and to allow a thorough comparison between such profiles. To achieve this, we developed a global desirability index based on the three most important validation criteria: trueness, precision and range. The global index allows the classification of the different accuracy profiles obtained according to their respective response functions. A diacetyl-monoxime colorimetric assay for the determination of urea in transdermal iontophoretic extracts is used to illustrate these improvements.
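    A global desirability index of the kind described above is commonly built in the Derringer-Suich style: each criterion is mapped to a [0, 1] desirability and the results are combined by a geometric mean. The sketch below is a generic illustration with hypothetical scores and linear desirability functions, not the authors' exact index:

```python
def desirability(value, worst, best):
    """Linear Derringer-Suich desirability: 0 at 'worst', 1 at 'best'.
    Works in either direction (best may be below worst, e.g. for RSD)."""
    d = (value - worst) / (best - worst)
    return min(1.0, max(0.0, d))

def global_desirability(d_trueness, d_precision, d_range):
    """Geometric mean of the three validation-criterion desirabilities; any
    single criterion at 0 drives the global index to 0."""
    return (d_trueness * d_precision * d_range) ** (1.0 / 3.0)

# Hypothetical scores for one accuracy profile:
d_t = desirability(98.5, worst=90.0, best=100.0)  # trueness, % recovery
d_p = desirability(2.0, worst=5.0, best=0.0)      # precision, % RSD (lower is better)
d_r = desirability(0.8, worst=0.0, best=1.0)      # fraction of claimed range valid
D = global_desirability(d_t, d_p, d_r)
```

The geometric mean (rather than an arithmetic mean) is the usual choice because it prevents a profile that fails badly on one criterion from being rescued by the others.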

  1. Validation of administrative and clinical case definitions for gestational diabetes mellitus against laboratory results.

    Science.gov (United States)

    Bowker, S L; Savu, A; Donovan, L E; Johnson, J A; Kaul, P

    2017-06-01

    To examine the validity of International Classification of Disease, version 10 (ICD-10) codes for gestational diabetes mellitus in administrative databases (outpatient and inpatient), and in a clinical perinatal database (Alberta Perinatal Health Program), using laboratory data as the 'gold standard'. Women aged 12-54 years with in-hospital, singleton deliveries between 1 October 2008 and 31 March 2010 in Alberta, Canada were included in the study. A gestational diabetes diagnosis was defined in the laboratory data as ≥2 abnormal values on a 75-g oral glucose tolerance test or a 50-g glucose screen ≥10.3 mmol/l. Of 58 338 pregnancies, 2085 (3.6%) met gestational diabetes criteria based on laboratory data. The gestational diabetes rates in outpatient only, inpatient only, outpatient or inpatient combined, and Alberta Perinatal Health Program databases were 5.2% (3051), 4.8% (2791), 5.8% (3367) and 4.8% (2825), respectively. Although the outpatient or inpatient combined data achieved the highest sensitivity (92%) and specificity (97%), it was associated with a positive predictive value of only 57%. The majority of the false-positives (78%), however, had one abnormal value on oral glucose tolerance test, corresponding to a diagnosis of impaired glucose tolerance in pregnancy. The ICD-10 codes for gestational diabetes in administrative databases, especially when outpatient and inpatient databases are combined, can be used to reliably estimate the burden of the disease at the population level. Because impaired glucose tolerance in pregnancy and gestational diabetes may be managed similarly in clinical practice, impaired glucose tolerance in pregnancy is often coded as gestational diabetes. © 2016 Diabetes UK.
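    The pattern reported above (high sensitivity and specificity but a modest positive predictive value) is a direct consequence of the low prevalence of gestational diabetes. A sketch with illustrative counts, not the study's raw 2×2 table:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and positive predictive value from a 2x2 table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    return sensitivity, specificity, ppv

# Illustrative counts: a rare condition (prevalence ~2%) keeps the PPV modest
# even when sensitivity (0.92) and specificity (0.97) are both high.
sens, spec, ppv = diagnostic_metrics(tp=920, fp=1440, fn=80, tn=46560)
```

With these counts the PPV is below 50% despite near-ideal sensitivity and specificity, which mirrors the study's observation that many "false positives" were actually the clinically similar impaired-glucose-tolerance cases.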

  2. The effects of corona on current surges induced on conducting lines by EMP: A comparison of experiment data with results of analytic corona models

    Science.gov (United States)

    Blanchard, J. P.; Tesche, F. M.; McConnell, B. W.

    1987-09-01

    An experiment to determine the interaction of an intense electromagnetic pulse (EMP), such as that produced by a nuclear detonation above the Earth's atmosphere, with conducting lines was performed in March 1986 at Kirtland Air Force Base near Albuquerque, New Mexico. The results of that experiment have been published without analysis. Following an introduction to the corona phenomenon, the reason for interest in it, and a review of the experiment, this paper discusses five different analytic corona models that may be used to model corona formation on a conducting line subjected to EMP. The results predicted by these models are compared with measured data acquired during the experiment to determine the strengths and weaknesses of each model.

  3. Design description and validation results for the IFMIF High Flux Test Module as outcome of the EVEDA phase

    Directory of Open Access Journals (Sweden)

    F. Arbeiter

    2016-12-01

    During the Engineering Validation and Engineering Design Activities (EVEDA) phase (2007-2014) of the International Fusion Materials Irradiation Facility (IFMIF), an advanced engineering design of the High Flux Test Module (HFTM) has been developed with the objective of facilitating the controlled irradiation of steel samples in the high-flux area directly behind the IFMIF neutron source. The development process included manufacturing techniques, CAD, and neutronic, thermal-hydraulic and mechanical analyses, complemented by a series of validation activities. Validation included the manufacturing of 1:1 parts and mockups, tests of prototypes in the FLEX and HELOKA-LP helium loops of KIT for verification of the thermal and mechanical properties, and irradiation of specimen-filled capsule prototypes in the BR2 test reactor. The prototyping activities were backed by several R&D studies addressing focused issues like the handling of liquid NaK (as filling medium) and the insertion of Small Specimen Test Technique (SSTT) specimens into the irradiation capsules. This paper provides an up-to-date design description of the HFTM irradiation device, and reports on the achieved performance criteria related to the requirements. Results of the validation activities are accounted for and the most important issues for further development are identified.

  4. Validation of thermohydraulic codes by comparison of experimental results with computer simulations

    International Nuclear Information System (INIS)

    Madeira, A.A.; Galetti, M.R.S.; Pontedeiro, A.C.

    1989-01-01

    The results obtained by simulation of three cases from the CANON depressurization experiment, using the TRAC-PF1 computer code, version 7.6, installed on the VAX-11/750 computer of the Brazilian CNEN, are presented. The CANON experiment was chosen as the first standard thermal-hydraulics problem to be discussed at ENFIR, for comparing results from different computer codes with results obtained experimentally. The ability of the TRAC-PF1 code to predict the depressurization phase of a loss-of-primary-coolant accident in pressurized water reactors is evaluated. (M.C.K.) [pt

  5. Some further analytical results on the solid angle subtended at a point by a circular disk using elliptic integrals

    International Nuclear Information System (INIS)

    Timus, D.M.; Prata, M.J.; Kalla, S.L.; Abbas, M.I.; Oner, F.; Galiano, E.

    2007-01-01

    A series formulation involving complete elliptic integrals of the first and second kinds for the solid angle subtended at a point by a circular disk is presented. Results from the present model were tested against data sets obtained with previous treatments of the solid angle in order to assess the simplicity and speed of our calculations. 3-D graphs are presented
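    The general off-axis case requires the elliptic integrals treated in the paper, but for a point on the disk's axis the solid angle has an elementary closed form that is useful as a sanity check against any series or elliptic-integral formulation. This sketch compares the closed form with direct numerical integration; the geometry (disk of radius R viewed from height h on its axis) is an assumed example, not the paper's test data:

```python
import math

def solid_angle_on_axis(h, R):
    """Solid angle subtended by a disk of radius R at an axial point a distance
    h above its centre: Omega = 2*pi*(1 - h/sqrt(h^2 + R^2))."""
    return 2.0 * math.pi * (1.0 - h / math.sqrt(h * h + R * R))

def solid_angle_numeric(h, R, n=100_000):
    """Same quantity via midpoint-rule integration of the flux integrand
    Omega = integral_0^R 2*pi*rho*h / (rho^2 + h^2)^(3/2) d rho."""
    drho = R / n
    total = 0.0
    for i in range(n):
        rho = (i + 0.5) * drho
        total += 2.0 * math.pi * rho * h / (rho * rho + h * h) ** 1.5 * drho
    return total

omega_exact = solid_angle_on_axis(1.0, 1.0)
omega_num = solid_angle_numeric(1.0, 1.0)
```

As h approaches 0 the closed form tends to 2π (the half-space limit), another quick consistency check for off-axis formulations evaluated near the disk face.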

  6. SAMAC Analytical Notes II: preliminary results of x-ray fluorescence analysis of archeological materials from southeastern Utah

    International Nuclear Information System (INIS)

    Snow, D.H.; Fullbright, H.J.

    1977-02-01

    A series of prehistoric potsherds, local clay samples, and possible tempering materials from archeological excavations in southeastern Utah have been examined by x-ray fluorescence spectroscopy. The results obtained for this small sampling demonstrate the usefulness of the technique in characterizing the clays, the potsherd pastes, and the decorative pigments

  7. Validation of natural language processing to extract breast cancer pathology procedures and results

    Directory of Open Access Journals (Sweden)

    Arika E Wieneke

    2015-01-01

    Background: Pathology reports typically require manual review to abstract research data. We developed a natural language processing (NLP) system to automatically interpret free-text breast pathology reports with limited assistance from manual abstraction. Methods: We used an iterative approach of machine learning algorithms and constructed groups of related findings to identify breast-related procedures and results from free-text pathology reports. We evaluated the NLP system using an all-or-nothing approach to determine which reports could be processed entirely using NLP and which reports needed manual review beyond NLP. We divided 3234 reports into development (2910, 90%) and evaluation (324, 10%) sets, using manually reviewed pathology data as our gold standard. Results: NLP correctly coded 12.7% of the evaluation set, flagged 49.1% of reports for manual review, incorrectly coded 30.8%, and correctly omitted 7.4% from the evaluation set due to irrelevancy (i.e. not breast-related). Common procedures and results were identified correctly (e.g. invasive ductal, with 95.5% precision and 94.0% sensitivity), but entire reports were flagged for manual review because of rare findings and substantial variation in pathology report text. Conclusions: The NLP system we developed did not perform sufficiently well for abstracting entire breast pathology reports. The all-or-nothing approach resulted in too broad a scope of work and limited our flexibility to identify breast pathology procedures and results. Our NLP system was also limited by the lack of gold standard data on rare findings and wide variation in pathology text. Focusing on individual, common elements and improving pathology text report standardization may improve performance.

  8. Analytic manifolds in uniform algebras

    International Nuclear Information System (INIS)

    Tonev, T.V.

    1988-12-01

    Here we extend Bear and Hile's result concerning the version of the famous Bishop theorem for one-dimensional analytic structures in two directions: to n-dimensional complex analytic manifolds, n > 1, and to generalized analytic manifolds. 14 refs

  9. Tank 241-SY-102 January 2000 Compatibility Grab Samples Analytical Results for the Final Report [SEC 1 and 2]

    Energy Technology Data Exchange (ETDEWEB)

    BELL, K.E.

    2000-05-11

    This document is the Format IV, final report for the tank 241-SY-102 (SY-102) grab samples taken in January 2000 to address waste compatibility concerns. Chemical, radiochemical, and physical analyses on the tank SY-102 samples were performed as directed in Compatibility Grab Sampling and Analysis Plan for Fiscal Year 2000 (Sasaki 1999). No notification limits were exceeded. Preliminary data on samples 2SY-99-5, -6, and -7 were reported in ''Format II Report on Tank 241-SY-102 Waste Compatibility Grab Samples Taken in January 2000'' (Lockrem 2000). The data presented here represent the final results.

  10. Eikonal Scattering in the sdg Interacting Boson Model: Analytical Results in the SUsdg(3) Limit and Their Generalizations

    Science.gov (United States)

    Kota, V. K. B.

    A general expression for the representation matrix elements in the SUsdg(3) limit of the sdg interacting boson model (sdgIBM) is derived that determines the scattering amplitude in the eikonal approximation for medium-energy proton-nucleus scattering when the target nucleus is deformed and described by the SUsdg(3) limit. The SUsdg(3) result is generalized to two important situations: (i) when the target nucleus ground-band states are described as states arising from angular momentum projection of a general single Kπ = 0+ intrinsic state in the sdg space; (ii) for rotational bands built on one-phonon excitations in the sdgIBM.

  11. Technicians or patient advocates?--still a valid question (results of focus group discussions with pharmacists)

    DEFF Research Database (Denmark)

    Almarsdóttir, Anna Birna; Morgall, Janine Marie

    1999-01-01

    discussions with community pharmacists in the capital area Reykjavík and rural areas were employed to answer the research question: How has the pharmacists' societal role evolved after the legislation and what are the implications for pharmacy practice? The results showed firstly that the public image...... and the self-image of the pharmacist has changed in the short time since the legislative change. The pharmacists generally said that their patient contact is deteriorating due to the discount wars, the rural pharmacists being more optimistic, and believing in a future competition based on quality. Secondly......, the results showed that the pharmacists have difficulties reconciling their technical paradigm with a legislative and professional will specifying customer and patient focus. This study describes the challenges of a new legislation with a market focus for community pharmacists whose education emphasized...

  12. Army Synthetic Validity Project Report of Phase 2 Results. Volume 2. Appendixes

    Science.gov (United States)

    1990-10-01

    [OCR residue from the report documentation page; only fragments are recoverable. Authors: Wise, Lauress L. (AIR); Peterson, Norman G.; Houston, Janis (PDRI); Hoffman, R. Gene; Campbell, John. Recoverable task-list fragments refer to food-service and field-sanitation duties (kitchen equipment, field preparation of foods, personal hygiene and preventive medicine, handling KIA), with parenthetical counts of participants who identified each task.]

  13. [Analytical procedure of variable number of tandem repeats (VNTR) analysis and effective use of analysis results for tuberculosis control].

    Science.gov (United States)

    Hachisu, Yushi; Hashimoto, Ruiko; Kishida, Kazunori; Yokoyama, Eiji

    2013-12-01

    Variable number of tandem repeats (VNTR) analysis is one of the methods used for molecular epidemiological studies of Mycobacterium tuberculosis. VNTR analysis is a PCR-based method that provides rapid, highly reproducible results and higher strain discrimination power than the restriction fragment length polymorphism (RFLP) analysis widely used in molecular epidemiological studies of Mycobacterium tuberculosis. The genetic lineage compositions of Mycobacterium tuberculosis clinical isolates differ among the regions where they are isolated, and the allelic diversity at each locus also differs among the genetic lineages of Mycobacterium tuberculosis. Therefore, no single combination of VNTR loci provides high discrimination capacity in every region. The Japan Anti-Tuberculosis Association (JATA) 12 (15) set was reported as a standard combination of VNTR loci for analysis in Japan, and a combination adding hypervariable (HV) loci to JATA12 (15), which has very high discrimination capacity, was also reported. From these reports, it is expected that data sharing between institutions and the construction of a nationwide database will now progress. With such database construction of VNTR profiles, VNTR analysis has become an effective tool to trace the route of tuberculosis infection, and it also helps in decision-making during the treatment course. However, in order to utilize the results of VNTR analysis effectively, it is important that the related organizations cooperate closely, and the analysis should be applied within a system in which accurate control and protection of private information are ensured.
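    A VNTR profile is just an ordered tuple of repeat counts, one per locus, so comparing isolates for cluster detection reduces to a simple categorical distance. A minimal sketch; the 12-locus profiles below are hypothetical, not real surveillance data:

```python
def vntr_distance(profile_a, profile_b):
    """Number of VNTR loci at which two isolates' repeat counts differ.
    Profiles are tuples of repeat counts in a fixed locus order."""
    if len(profile_a) != len(profile_b):
        raise ValueError("profiles must cover the same loci")
    return sum(a != b for a, b in zip(profile_a, profile_b))

# Hypothetical 12-locus profiles (e.g. a JATA12-style locus set):
isolate_1 = (2, 3, 5, 3, 2, 4, 3, 3, 5, 4, 2, 3)
isolate_2 = (2, 3, 5, 3, 2, 4, 3, 3, 5, 4, 2, 3)  # identical profile
isolate_3 = (2, 3, 4, 3, 2, 4, 3, 2, 5, 4, 2, 3)  # differs at two loci
```

In practice, identical (distance 0) profiles suggest a possible transmission cluster to be investigated with conventional contact tracing, while larger distances argue against recent transmission.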

  14. Tank 241-U-103, grab samples 3U-99-1, 3U-99-2 and 3U-99-3 analytical results for the final report

    International Nuclear Information System (INIS)

    STEEN, F.H.

    1999-01-01

    This document is the final report for tank 241-U-103 grab samples. Three grab samples were collected from riser 13 on March 12, 1999 and received by the 222-S Laboratory on March 15, 1999. Analyses were performed in accordance with the Compatibility Grab Sampling and Analysis Plan for Fiscal Year 1999 (TSAP) (Sasaki, 1999) and the Data Quality Objectives for Tank Farms Waste Compatibility Program (DQO). The analytical results are presented in the data summary report. None of the subsamples submitted for differential scanning calorimetry (DSC), total organic carbon (TOC) and plutonium-239 (Pu-239) analyses exceeded the notification limits stated in the TSAP.

  15. Labtracker+, a medical smartphone app for the interpretation of consecutive laboratory results: an external validation study.

    Science.gov (United States)

    Hilderink, Judith M; Rennenberg, Roger J M W; Vanmolkot, Floris H M; Bekers, Otto; Koopmans, Richard P; Meex, Steven J R

    2017-09-01

    When monitoring patients over time, clinicians may struggle to distinguish 'real changes' in consecutive blood parameters from natural fluctuations. In practice, they have to do so by relying on their clinical experience and intuition. We developed Labtracker+, a medical app that calculates the probability that an increase or decrease over time in a specific blood parameter is real, given the time between measurements. We presented patient cases to 135 participants to examine whether there is a difference between medical students, residents and experienced clinicians when it comes to interpreting changes between consecutive laboratory results. Participants were asked to interpret whether changes in consecutive laboratory values were likely to be 'real' or rather due to natural fluctuations. The answers of the study participants were compared with the probabilities calculated by the app Labtracker+ and concordance rates were assessed. Participants were medical students (n=92), medical residents from the department of internal medicine (n=19) and internists (n=24) at a Dutch university medical centre. Concordance rates between the study participants and the probabilities calculated by the app Labtracker+ were compared, and we tested whether clinically experienced physicians scored better concordance rates with the app Labtracker+ than inexperienced clinicians. Medical residents and internists showed significantly better concordance rates with the probabilities calculated by the app Labtracker+ than medical students regarding their interpretation of differences between consecutive laboratory results (p=0.009 and p<0.001, respectively). The app Labtracker+ could serve as a clinical decision tool in the interpretation of consecutive laboratory test results and could contribute to rapid recognition of parameter changes by physicians.
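
    The statistic conventionally behind such a tool is the reference change value (RCV): the smallest relative difference between two consecutive results that cannot be explained by analytical plus within-subject biological variation alone. The sketch below is a minimal illustration of that idea; the CV values are assumptions for serum creatinine, not Labtracker+'s internal parameters.

```python
import math

def reference_change_value(cv_analytical, cv_biological, z=1.96):
    """Two-sided reference change value (%) for consecutive results.

    A measured change larger than the RCV is unlikely (at the chosen
    z level) to be explained by analytical plus within-subject
    biological variation alone.
    """
    return z * math.sqrt(2.0) * math.sqrt(cv_analytical**2 + cv_biological**2)

def change_is_significant(first, second, cv_a, cv_i, z=1.96):
    """True if the relative change between two results exceeds the RCV."""
    pct_change = abs(second - first) / first * 100.0
    return pct_change > reference_change_value(cv_a, cv_i, z)

# Illustrative CVs (assumptions): analytical CV 2.3 %, within-subject
# biological CV 5.9 %, roughly typical for serum creatinine.
rcv = reference_change_value(2.3, 5.9)
print(round(rcv, 1))                                 # about 17.6 %
print(change_is_significant(80.0, 100.0, 2.3, 5.9))  # 25 % rise exceeds the RCV
```

    A real app would additionally account for the time between measurements; this sketch treats the two results as exchangeable.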

  16. Validation Techniques of network harmonic models based on switching of a series linear component and measuring resultant harmonic increments

    DEFF Research Database (Denmark)

    Wiechowski, Wojciech Tomasz; Lykkegaard, Jan; Bak, Claus Leth

    2007-01-01

    In this paper two methods for the validation of transmission network harmonic models are introduced. The methods were developed as a result of the work presented in [1]. The first method allows calculating the transfer harmonic impedance between two nodes of a network: a linear, series network element, such as a transmission line, is switched, and the resulting harmonic increments are used for the calculation of the transfer harmonic impedance between the nodes. The determined transfer harmonic impedance can then be used to validate a computer model of the network. The second method is an extension of the first one; it allows switching a series element that contains a shunt branch. Both methods require that the harmonic measurements performed at the two ends of the disconnected element are precisely synchronized.
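
    The quantity both methods estimate can be written per harmonic order h as Z_h = ΔU_h / ΔI_h, using the synchronized increments in harmonic voltage and current phasors caused by the switching event. A minimal sketch with assumed phasor values (not measurement data from the paper):

```python
# Harmonic voltage (V) and current (A) phasors before/after switching,
# keyed by harmonic order.  All values are illustrative assumptions.
before_U = {5: complex(1200, 300), 7: complex(800, -150)}
after_U  = {5: complex(1150, 420), 7: complex(760, -90)}
before_I = {5: complex(40, 10),    7: complex(25, -5)}
after_I  = {5: complex(35, 18),    7: complex(22, 1)}

def transfer_harmonic_impedance(u0, u1, i0, i1):
    """Z_h = dU_h / dI_h for every harmonic order h in the data.

    The increments must come from precisely synchronized measurements
    at the two ends of the switched element, as the paper requires.
    """
    return {h: (u1[h] - u0[h]) / (i1[h] - i0[h]) for h in u0}

Z = transfer_harmonic_impedance(before_U, after_U, before_I, after_I)
for h, z in sorted(Z.items()):
    print(h, round(abs(z), 1))  # impedance magnitude per harmonic order
```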

  17. Preliminary Assessment of ATR-C Capabilities to Provide Integral Benchmark Data for Key Structural/Matrix Materials that May be Used for Nuclear Data Testing and Analytical Methods Validation

    Energy Technology Data Exchange (ETDEWEB)

    John D. Bess

    2009-03-01

    The purpose of this research is to provide a fundamental computational investigation into the possible integration of experimental activities with the Advanced Test Reactor Critical (ATR-C) facility with the development of benchmark experiments. Criticality benchmarks performed in the ATR-C could provide integral data for key matrix and structural materials used in nuclear systems. Results would then be utilized in the improvement of nuclear data libraries and as a means for analytical methods validation. It is proposed that experiments consisting of well-characterized quantities of materials be placed in the Northwest flux trap position of the ATR-C. The reactivity worth of the material could be determined and computationally analyzed through comprehensive benchmark activities including uncertainty analyses. Experiments were modeled in the available benchmark model of the ATR using MCNP5 with the ENDF/B-VII.0 cross section library. A single bar (9.5 cm long, 0.5 cm wide, and 121.92 cm high) of each material could provide sufficient reactivity difference in the core geometry for computational modeling and analysis. However, to provide increased opportunity for the validation of computational models, additional bars of material placed in the flux trap would increase the effective reactivity up to a limit of 1$ insertion. For simplicity in assembly manufacture, approximately four bars of material could provide a means for additional experimental benchmark configurations, except in the case of strong neutron absorbers and many materials providing positive reactivity. Future tasks include the cost analysis and development of the experimental assemblies, including means for the characterization of the neutron flux and spectral indices. Oscillation techniques may also serve to provide additional means for experimentation and validation of computational methods and acquisition of integral data for improving neutron cross sections. Further assessment of oscillation

  18. Preliminary Assessment of ATR-C Capabilities to Provide Integral Benchmark Data for Key Structural/Matrix Materials that May be Used for Nuclear Data Testing and Analytical Methods Validation

    Energy Technology Data Exchange (ETDEWEB)

    John D. Bess

    2009-07-01

    The purpose of this document is to identify some suggested types of experiments that can be performed in the Advanced Test Reactor Critical (ATR-C) facility. A fundamental computational investigation is provided to demonstrate possible integration of experimental activities in the ATR-C with the development of benchmark experiments. Criticality benchmarks performed in the ATR-C could provide integral data for key matrix and structural materials used in nuclear systems. Results would then be utilized in the improvement of nuclear data libraries and as a means for analytical methods validation. It is proposed that experiments consisting of well-characterized quantities of materials be placed in the Northwest flux trap position of the ATR-C. The reactivity worth of the material could be determined and computationally analyzed through comprehensive benchmark activities including uncertainty analyses. Experiments were modeled in the available benchmark model of the ATR using MCNP5 with the ENDF/B-VII.0 cross section library. A single bar (9.5 cm long, 0.5 cm wide, and 121.92 cm high) of each material could provide sufficient reactivity difference in the core geometry for computational modeling and analysis. However, to provide increased opportunity for the validation of computational models, additional bars of material placed in the flux trap would increase the effective reactivity up to a limit of 1$ insertion. For simplicity in assembly manufacture, approximately four bars of material could provide a means for additional experimental benchmark configurations, except in the case of strong neutron absorbers and many materials providing positive reactivity. Future tasks include the cost analysis and development of the experimental assemblies, including means for the characterization of the neutron flux and spectral indices. Oscillation techniques may also serve to provide additional means for experimentation and validation of computational methods and acquisition of

  19. Automated Cancer Registry Notifications: Validation of a Medical Text Analytics System for Identifying Patients with Cancer from a State-Wide Pathology Repository.

    Science.gov (United States)

    Nguyen, Anthony N; Moore, Julie; O'Dwyer, John; Philpot, Shoni

    2016-01-01

    The paper assesses the utility of Medtex for automating Cancer Registry notifications from narrative histology and cytology reports from the Queensland state-wide pathology information system. A corpus of 45.3 million pathology HL7 messages (including 119,581 histology and cytology reports) from a Queensland pathology repository for the year 2009 was analysed by Medtex for cancer notification. Reports analysed by Medtex were consolidated at a patient level and compared against patients with notifiable cancers from the Queensland Oncology Repository (QOR). A stratified random sample of 1,000 patients was manually reviewed by a cancer clinical coder to analyse agreements and discrepancies. Sensitivity of 96.5% (95% confidence interval: 94.5-97.8%), specificity of 96.5% (95.3-97.4%) and positive predictive value of 83.7% (79.6-86.8%) were achieved for identifying cancer-notifiable patients. Medtex achieved high sensitivity and specificity across the breadth of cancers, report types, pathology laboratories and pathologists throughout the State of Queensland. The high sensitivity also resulted in the identification of cancer patients that were not found in the QOR. High sensitivity was at the expense of positive predictive value; however, these cases may be considered lower priority to Cancer Registries as they can be quickly reviewed. Error analysis revealed that system errors tended to be tumour-stream dependent. Medtex is proving to be a promising medical text analytics system. High-value cancer information can be generated through intelligent data classification and extraction on large volumes of unstructured pathology reports.
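
    The reported figures are standard screening metrics with binomial confidence intervals. The sketch below shows how sensitivity, specificity and PPV with 95% Wilson score intervals follow from a reviewed sample; the confusion-matrix counts are illustrative assumptions, not the study's actual numbers.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95 % Wilson score interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and PPV, each with a 95 % Wilson interval."""
    return {
        "sensitivity": (tp / (tp + fn), wilson_ci(tp, tp + fn)),
        "specificity": (tn / (tn + fp), wilson_ci(tn, tn + fp)),
        "ppv":         (tp / (tp + fp), wilson_ci(tp, tp + fp)),
    }

# Assumed counts for a manually reviewed sample of 1,000 patients.
m = screening_metrics(tp=251, fp=49, fn=9, tn=691)
for name, (point, (lo, hi)) in m.items():
    print(f"{name}: {point:.3f} ({lo:.3f}-{hi:.3f})")
```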

  20. Translation and validation of the Convergence Insufficiency Symptom Survey (CISS) to Portuguese - psychometric results

    Directory of Open Access Journals (Sweden)

    Catarina Tavares

    2014-01-01

    Purpose: To translate and adapt the Convergence Insufficiency Symptom Survey (CISS) questionnaire to the Portuguese language and culture, and to assess the psychometric properties of the translated questionnaire (CISSvp). Methods: The CISS questionnaire was adapted according to the methodology recommended in the literature. The process involved two translations and back-translations performed by independent evaluators, evaluation of these versions, preparation of a synthesis version and its pre-test. The final version (CISSvp) was applied to 70 patients (21.79 ± 2.42 years), students in higher education, at two different times and by two observers, to assess its reliability. Results: The results showed good internal consistency of the CISSvp (Cronbach's alpha α = 0.893). The test-retest revealed an average difference between the first and second evaluations of 0.75 points (SD ± 3.53), which indicates minimal bias between the two administrations. The inter-rater reliability, assessed by the intraclass correlation coefficient, ranged from 0.880 to 0.952, showing that the CISSvp is an appropriate tool for measuring the visual discomfort associated with near-vision tasks, with a high level of reproducibility. Conclusions: The Portuguese version of the CISS showed good psychometric properties and has been shown to be applicable to the Portuguese population to quantify the visual discomfort associated with near vision in higher-education students.
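
    Internal-consistency figures such as the Cronbach's alpha reported above are computed directly from the item-by-respondent score matrix: alpha = k/(k-1) * (1 - sum of item variances / variance of totals). A minimal sketch with made-up scores (not the study's data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha; items[i][j] = score of respondent j on item i."""
    k = len(items)

    def variance(xs):  # population variance, used consistently throughout
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(variance(it) for it in items)
    totals = [sum(col) for col in zip(*items)]  # per-respondent total score
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

# Three items scored by five respondents (illustrative values):
scores = [
    [1, 2, 3, 4, 5],
    [2, 2, 3, 4, 4],
    [1, 3, 3, 3, 5],
]
alpha = cronbach_alpha(scores)
print(round(alpha, 3))
```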

  1. Performance results of a mobile high-resolution MR-TOF mass spectrometer for in-situ analytical mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lippert, Wayne; Lang, Johannes [Justus-Liebig-Universitaet Giessen (Germany); Ayet San Andres, Samuel [GSI, Darmstadt (Germany); Dickel, Timo; Geissel, Hans; Plass, Wolfgang; Scheidenberger, Christoph [Justus-Liebig-Universitaet Giessen (Germany); GSI, Darmstadt (Germany); Yavor, Mikhail [RAS St. Petersburg (Russian Federation)

    2014-07-01

    A mobile multiple-reflection time-of-flight mass spectrometer (MR-TOF-MS) has been developed which provides a mass resolving power exceeding 250,000 and sub-ppm mass accuracy in a transportable format. It thus allows resolving isobars and enables accurate determination of the composition and structure of biomolecules. Furthermore, the device offers high-mass-resolving MS/MS capability via selective ion re-trapping and collision-induced dissociation (CID). An atmospheric pressure interface (API) provides for routine measurements with various atmospheric ion sources. All supply electronics, DAQ and control systems are mounted with the spectrometer into a single frame with a total volume of only 0.8 m{sup 3}. With the current system, many applications are possible, such as waste-water monitoring at hot spots, mass-based classification of biomolecules and breath analysis. In addition, the mass spectrometer is readily scalable and can be adapted and simplified for even more specific uses, for instance in space science. A characterization and first performance results are shown, and the implementation of MS/MS in combination with CID is discussed.

  2. MLFMA-accelerated Nyström method for ultrasonic scattering - Numerical results and experimental validation

    Science.gov (United States)

    Gurrala, Praveen; Downs, Andrew; Chen, Kun; Song, Jiming; Roberts, Ron

    2018-04-01

    Full-wave scattering models for ultrasonic waves are necessary for the accurate prediction of voltage signals received from complex defects/flaws in practical nondestructive evaluation (NDE) measurements. We propose the high-order Nyström method accelerated by the multilevel fast multipole algorithm (MLFMA) as an improvement to state-of-the-art full-wave scattering models based on boundary integral equations. We present numerical results demonstrating improvements in simulation time and memory requirements. In particular, we demonstrate the need for higher-order geometry and field approximation in modeling NDE measurements. We also illustrate the importance of full-wave scattering models using experimental pulse-echo data from a spherical inclusion in a solid, which cannot be modeled accurately by approximation-based scattering models such as the Kirchhoff approximation.

  3. Validation of Spectral Unmixing Results from Informed Non-Negative Matrix Factorization (INMF) of Hyperspectral Imagery

    Science.gov (United States)

    Wright, L.; Coddington, O.; Pilewskie, P.

    2017-12-01

    Hyperspectral instruments are a growing class of Earth observing sensors designed to improve remote sensing capabilities beyond discrete multi-band sensors by providing tens to hundreds of continuous spectral channels. Improved spectral resolution, range and radiometric accuracy allow the collection of large amounts of spectral data, facilitating thorough characterization of both atmospheric and surface properties. We describe the development of an Informed Non-Negative Matrix Factorization (INMF) spectral unmixing method to exploit this spectral information and separate atmospheric and surface signals based on their physical sources. INMF offers marked benefits over other commonly employed techniques including non-negativity, which avoids physically impossible results; and adaptability, which tailors the method to hyperspectral source separation. The INMF algorithm is adapted to separate contributions from physically distinct sources using constraints on spectral and spatial variability, and library spectra to improve the initial guess. Using this INMF algorithm we decompose hyperspectral imagery from the NASA Hyperspectral Imager for the Coastal Ocean (HICO), with a focus on separating surface and atmospheric signal contributions. HICO's coastal ocean focus provides a dataset with a wide range of atmospheric and surface conditions. These include atmospheres with varying aerosol optical thicknesses and cloud cover. HICO images also provide a range of surface conditions including deep ocean regions, with only minor contributions from the ocean surfaces; and more complex shallow coastal regions with contributions from the seafloor or suspended sediments. We provide extensive comparison of INMF decomposition results against independent measurements of physical properties. 
These include comparison against traditional model-based retrievals of water-leaving, aerosol, and molecular scattering radiances and other satellite products, such as aerosol optical thickness from
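
    At its core, NMF approximates a non-negative data matrix V as the product W H of two non-negative factors; INMF adds physically informed constraints and initialization (e.g. library spectra) on top of this. The sketch below is the basic Lee-Seung multiplicative-update NMF, not the authors' INMF algorithm; non-negativity is preserved because each update multiplies a non-negative entry by a ratio of non-negative terms.

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def nmf(V, r, iters=500):
    """Factor V (m x n, non-negative) into W (m x r) and H (r x n)
    under a Frobenius loss, using Lee-Seung multiplicative updates."""
    m, n = len(V), len(V[0])
    # Deterministic, strictly positive initialization.  An INMF-style
    # implementation would instead seed the factors with library spectra.
    W = [[1.0 + 0.1 * ((i + j) % 3) for j in range(r)] for i in range(m)]
    H = [[1.0 + 0.1 * ((i + j) % 2) for j in range(n)] for i in range(r)]
    eps = 1e-12  # guards against division by zero
    for _ in range(iters):
        Wt = transpose(W)
        num, den = matmul(Wt, V), matmul(Wt, matmul(W, H))
        H = [[H[a][b] * num[a][b] / (den[a][b] + eps) for b in range(n)]
             for a in range(r)]
        Ht = transpose(H)
        num, den = matmul(V, Ht), matmul(matmul(W, H), Ht)
        W = [[W[a][b] * num[a][b] / (den[a][b] + eps) for b in range(r)]
             for a in range(m)]
    return W, H

# A small rank-2 example: the factorization should reproduce V closely.
V = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [5.0, 7.0, 9.0]]
W, H = nmf(V, r=2)
recon = matmul(W, H)
err = max(abs(recon[i][j] - V[i][j]) for i in range(3) for j in range(3))
```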

  4. Thermodynamic properties of 9-fluorenone: Mutual validation of experimental and computational results

    International Nuclear Information System (INIS)

    Chirico, Robert D.; Kazakov, Andrei F.; Steele, William V.

    2012-01-01

    Highlights: ► Heat capacities were measured for the temperature range 5 K to 520 K. ► Vapor pressures were measured for the temperature range 368 K to 668 K. ► The enthalpy of combustion was measured and the enthalpy of formation was derived. ► Calculated and derived properties for the ideal gas are in excellent accord. ► Thermodynamic consistency analysis revealed anomalous literature data. - Abstract: Measurements leading to the calculation of thermodynamic properties for 9-fluorenone (IUPAC name 9H-fluoren-9-one and Chemical Abstracts registry number [486-25-9]) in the ideal-gas state are reported. Experimental methods were adiabatic heat-capacity calorimetry, inclined-piston manometry, comparative ebulliometry, and combustion calorimetry. Critical properties were estimated. Molar entropies for the ideal-gas state were derived from the experimental studies at selected temperatures between T = 298.15 K and T = 600 K, and independent statistical calculations were performed based on molecular geometry optimization and vibrational frequencies calculated at the B3LYP/6-31+G(d,p) level of theory. Values derived with the independent methods are shown to be in excellent accord with a scaling factor of 0.975 applied to the calculated frequencies. The same scaling factor was successfully applied in the analysis of results for other polycyclic molecules, as described in recent articles by this research group. All experimental results are compared with property values reported in the literature. Thermodynamic consistency between properties is used to show that several studies in the literature are erroneous.

  5. Results of a monitoring programme in the environs of Berkeley aimed at collecting Chernobyl data for foodchain model validation

    International Nuclear Information System (INIS)

    Nair, S.; Darley, P.J.; Shaer, J.

    1989-03-01

    The results of a fallout measurement programme which was carried out in the environs of Berkeley Nuclear Laboratory in the United Kingdom following the Chernobyl reactor accident in April 1986 are presented in this report. The programme was aimed at establishing a time-dependent data base of concentrations of Chernobyl fallout radionuclides in selected agricultural products. Results were obtained for milk, grass, silage, soil and wheat over an eighteen month period from May 1986. It is intended to use the data to validate the CEGB's dynamic foodchain model, which is incorporated in the FOODWEB module of the NECTAR environmental code. (author)

  6. Tracing and analytical results of the dioxin contamination incident in 2008 originating from the Republic of Ireland.

    Science.gov (United States)

    Heres, L; Hoogenboom, R; Herbes, R; Traag, W; Urlings, B

    2010-12-01

    High levels of dioxins (PCDD/Fs) in pork were discovered in France and the Netherlands at the end of 2008. The contamination was rapidly traced back to a feed stock in the Republic of Ireland (RoI). Burning oil, used for the drying of bakery waste, appeared to be contaminated with PCBs. Consequently, very high levels up to 500 pg TEQ g⁻¹ fat were found in pork. The congener pattern clearly pointed to PCB oil as a source, but the ratio between the non-dioxin-like indicator PCBs (PCBs 28, 52, 101, 138, 153 and 180) and PCDD/Fs was much lower than observed during the Belgian incident, thereby limiting the suitability of indicator PCBs as a marker for the presence of dioxins and dioxin-like PCBs. This paper describes the tracking and tracing of the incident, the public-private cooperation, and the surveillance activities and their results. A major lesson to be learned from this incident is the importance of good private food safety systems. In this incident, it was the private surveillance systems that identified the origin of contamination within 10 days after the first signal of increased dioxin levels in a product. On the other hand, retrospective analyses showed that signals were missed that could have led to an earlier detection of the incident and its source. Above all, the incident would not have occurred if food safety assurance systems had been effectively implemented in the feed chain involved. It is discussed that, besides primary responsibility for effective private food safety systems, the competent authorities have to supervise whether the food safety procedures are capable of coping with these kinds of complex food safety issues: private food companies need to implement the law, and public authorities should supervise and enforce it. Finally, it is discussed whether the health risks derived from consumption of the contaminated batches of meat may have been underestimated during the incident due to the unusually high intake of dioxins.

  7. Measuring adult mortality using sibling survival: a new analytical method and new results for 44 countries, 1974-2006.

    Directory of Open Access Journals (Sweden)

    Ziad Obermeyer

    2010-04-01

    probability of a 15-y-old dying before his or her 60th birthday - for 44 countries with DHS sibling survival data. Our findings suggest that levels of adult mortality prevailing in many developing countries are substantially higher than previously suggested by other analyses of sibling history data. Generally, our estimates show the risk of adult death between ages 15 and 60 y to be about 20%-35% for females and 25%-45% for males in sub-Saharan African populations largely unaffected by HIV. In countries of Southern Africa, where the HIV epidemic has been most pronounced, as many as eight out of ten men alive at age 15 y will be dead by age 60, as will six out of ten women. Adult mortality levels in populations of Asia and Latin America are generally lower than in Africa, particularly for women. The exceptions are Haiti and Cambodia, where mortality risks are comparable to many countries in Africa. In all other countries with data, the probability of dying between ages 15 and 60 y was typically around 10% for women and 20% for men, not much higher than the levels prevailing in several more developed countries. Our results represent an expansion of direct knowledge of levels and trends in adult mortality in the developing world. The CSS method provides grounds for renewed optimism in collecting sibling survival data. We suggest that all nationally representative survey programs with adequate sample size ought to implement this critical module for tracking adult mortality in order to more reliably understand the levels and patterns of adult mortality, and how they are changing. Please see later in the article for the Editors' Summary.
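
    The quantity reported here, conventionally written 45q15, is the probability that a 15-year-old dies before age 60. Given age-specific mortality rates for the nine 5-year groups from 15-19 to 55-59, it follows from multiplying group survival probabilities under a constant-hazard-within-group assumption. A minimal sketch (the rates below are illustrative, not estimates from the study):

```python
import math

def prob_death_15_to_60(rates_5yr):
    """45q15 from nine 5-year age-group mortality rates (15-19 ... 55-59).

    Each rate m (deaths per person-year) implies a 5-year survival
    probability exp(-5*m) under a constant hazard within the group;
    45q15 is one minus the product of the nine group survivals.
    """
    assert len(rates_5yr) == 9
    survival = 1.0
    for m in rates_5yr:
        survival *= math.exp(-5.0 * m)
    return 1.0 - survival

# Assumed male mortality rates per person-year, rising with age:
rates = [0.002, 0.003, 0.004, 0.005, 0.006, 0.008, 0.011, 0.015, 0.021]
print(round(prob_death_15_to_60(rates), 3))  # about 0.313, i.e. 31 %
```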

  8. Medical biomodelling in surgical applications: results of a multicentric European validation of 466 cases.

    Science.gov (United States)

    Wulf, J; Vitt, K D; Erben, C M; Bill, J S; Busch, L C

    2003-01-01

    The study started in September 1999 and ended in April 2002. It is based on a questionnaire [www.phidias.org] with case-related questions on the application of stereolithographic models. Each questionnaire contains over 50 items. These variables take into account the diagnosis, indications and benefits of stereolithographic models with a view to the different steps of the surgical procedure: preoperative planning, intraoperative application and overall outcome after surgical intervention. The questionnaires were completed by the surgeons who performed the operations. Over the time course of our multicentric study (30 months), we evaluated 466 cases. The study population consists of n=231 male and n=235 female patients. 54 surgeons from 9 European countries were involved. There are main groups of diagnoses that relate to the use of a model. Most models were used in maxillofacial surgery. Preoperative planning may help to determine the resection line of a tumor and optimize reconstructive procedures. Correction of large calvarial defects can be simulated and implants can be produced preoperatively. Overall, a time-saving effect was reported in 58% of all cases. The study strongly suggests that medical modelling has utility in surgical specialities, especially in the craniofacial and maxillofacial area, and increasingly in the orthopedic field. According to our results, medical modelling optimizes preoperative surgical planning. Surgeons are enabled to perform realistic and interactive simulations. The fabrication of implants, their design and fit on the model, allows reducing operation time and, in consequence, the risk and cost of the operation. In addition, the understanding of volumetric data is improved, especially if medical models are combined with standard imaging modalities. Finally, surgeons are able to improve communication with their patients and colleagues.

  9. Validation of an analytical method for the determination of total mercury in urine samples using cold vapor atomic absorption spectrometry (CV-AAS)

    International Nuclear Information System (INIS)

    Guilhen, Sabine Neusatz

    2009-01-01

    Mercury (Hg) is a toxic metal applied in a variety of products and processes, representing a risk to the health of occupationally or accidentally exposed subjects. Dental amalgam is a restorative material containing metallic mercury, whose use has been widely debated in recent decades. Due to the ambiguity of the studies concerning dental amalgam, many efforts concerning this issue have been conducted. The Tropical Medicine Foundation (Tocantins, Brazil) has recently initiated a study to evaluate the environmental and occupational levels of exposure to mercury in dentistry attendants at public consulting rooms in the city of Araguaina (TO). In collaboration with this study, the laboratory of analysis at IPEN's Chemistry and Environment Center is undertaking the analysis of mercury levels in exposed subjects' urine samples using cold vapor atomic absorption spectrometry. This analysis requires the definition of a methodology capable of generating reliable results, and such a methodology can only be implemented after a rigorous validation procedure. As part of this work, a series of tests was conducted in order to confirm the suitability of the selected methodology and to assert that the laboratory meets all requirements needed for its successful implementation. The following performance parameters were considered: detection and quantitation limits, selectivity, sensitivity, linearity, accuracy and precision. The assays were carried out with certified reference material, which assures the traceability of the results. Taking the estimated parameters into account, the method can be considered suitable for the aforementioned purpose. The mercury concentration found for the reference material was (95.12 ± 11.70) µg/L, with a recovery rate of 97%. The method was also applied to 39 urine samples, six of which (15%) showed urinary mercury levels above the normal limit of 10 µg/L. The obtained results fall into a
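
    The accuracy figure quoted for the certified reference material follows from a simple recovery calculation, commonly accompanied by a normalized-deviation (z-score) check against the certified uncertainty. A sketch; the certified value of 98.0 µg/L is an assumption chosen only to be consistent with the reported 97% recovery.

```python
def recovery_percent(measured, certified):
    """Recovery rate (%) of a measured value relative to a certified value."""
    return measured / certified * 100.0

def z_score(measured, certified, uncertainty):
    """Normalized deviation from the certified value; |z| <= 2 is
    conventionally taken as satisfactory performance."""
    return (measured - certified) / uncertainty

# Measured (95.12 +/- 11.70) ug/L, as in the abstract; certified value
# below is a hypothetical figure, not from the reference certificate.
measured, sd, certified = 95.12, 11.70, 98.0
print(round(recovery_percent(measured, certified)))   # about 97 %
print(abs(z_score(measured, certified, sd)) < 2)      # within tolerance
```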

  10. Validation of an analytical methodology for the determination of diethylene glycol and ethylene glycol as impurities in glycerin and propylene glycol

    International Nuclear Information System (INIS)

    Rosabal Cordovi, Ursula M; Fonseca Gola, Antonio; Cordovi Velazquez, Juan M; Morales Torres, Galina

    2014-01-01

    A methodology for the quantification of diethylene glycol (DEG) and ethylene glycol (EG) impurities by gas chromatography with flame ionization detection in glycerol and propylene glycol samples was developed and validated. Dimethyl sulphoxide was selected as the internal standard, and hydrogen was used as carrier and auxiliary gas. The temperature program held 100 °C for one minute, then ramped at 7.5 °C/min up to 200 °C. A Restek 624 column was used, with a column flow of 4.20 mL/min. Injector and detector temperatures were set at 220 °C and 250 °C, respectively. Linearity was determined over the 25-75 µg/mL concentration interval for both impurities, with correlation coefficients larger than 0.999. Detection limits were established at 0.0350 µg/mL for diethylene glycol and 0.0572 µg/mL for ethylene glycol, while the quantitation limits were 0.1160 µg/mL for DEG and 0.1897 µg/mL for EG. The recoveries were 99.98% and 100.00%, respectively, with RSDs of 1.18% for DEG and 0.60% for EG. The results obtained demonstrate that the methodology is sufficiently linear, accurate, robust, sensitive and selective to be used for the determination of both impurities in the quality control of glycerol and propylene glycol as raw materials.
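
    Several of the reported validation parameters follow from the calibration line: the common ICH convention takes LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the standard deviation of the response and S the calibration slope. A sketch with assumed internal-standard calibration data (the response ratios and σ below are illustrative, not the study's raw data):

```python
def slope_intercept(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sxy / sxx
    return a, my - a * mx

def lod_loq(sigma, slope):
    """ICH-style detection and quantitation limits from the standard
    deviation of the response (sigma) and the calibration slope."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Assumed calibration: response ratio (analyte area / internal-standard
# area) versus concentration in ug/mL over the validated 25-75 interval.
conc  = [25.0, 37.5, 50.0, 62.5, 75.0]
ratio = [0.50, 0.76, 1.01, 1.24, 1.51]
slope, intercept = slope_intercept(conc, ratio)
lod, loq = lod_loq(sigma=0.00021, slope=slope)  # sigma is an assumption
print(round(slope, 3))  # slope of the calibration line
```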

  11. SMOS near-real-time soil moisture product: processor overview and first validation results

    Directory of Open Access Journals (Sweden)

    N. J. Rodríguez-Fernández

    2017-10-01

    Measurements of the surface soil moisture (SM) content are important for a wide range of applications. Among them, operational hydrology and numerical weather prediction, for instance, need SM information in near-real-time (NRT), typically not later than 3 h after sensing. The European Space Agency (ESA) Soil Moisture and Ocean Salinity (SMOS) satellite is the first mission specifically designed to measure SM from space. The ESA Level 2 SM retrieval algorithm is based on detailed geophysical modelling and cannot provide SM in NRT. This paper presents the new ESA SMOS NRT SM product, which uses a neural network (NN) to provide SM in NRT. The NN inputs are SMOS brightness temperatures for horizontal and vertical polarizations at incidence angles from 30 to 45°. In addition, the NN uses surface soil temperature from the European Centre for Medium-Range Weather Forecasts (ECMWF) Integrated Forecast System (IFS). The NN was trained on SMOS Level 2 (L2) SM. The swath of the NRT SM retrieval is somewhat narrower (∼915 km) than that of the L2 SM dataset (∼1150 km), which implies a slightly lower revisit time. The new SMOS NRT SM product was compared to the SMOS Level 2 SM product. The NRT SM data show a standard deviation of the difference with respect to the L2 data of <0.05 m³ m⁻³ over most of the Earth and a Pearson correlation coefficient higher than 0.7 in large regions of the globe. The NRT SM dataset does not show a global bias with respect to the L2 dataset but can show local biases of up to 0.05 m³ m⁻³ in absolute value. The two SMOS SM products were evaluated against in situ measurements of SM from more than 120 sites of the SCAN (Soil Climate Analysis Network) and USCRN (US Climate Reference Network) networks in North America. The NRT dataset obtains similar but slightly better results than the L2 data. In summary, the NN SMOS NRT SM product exhibits performances similar to those of the Level 2 SM product.
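
    The comparison statistics quoted (bias, standard deviation of the difference, Pearson correlation) are straightforward to compute from two collocated SM series. A minimal sketch with assumed values in m³/m³ (not SMOS data):

```python
import math

def validation_stats(retrieved, reference):
    """Bias, standard deviation of the difference (sample, n-1), and
    Pearson correlation between two collocated soil-moisture series."""
    n = len(retrieved)
    diffs = [a - b for a, b in zip(retrieved, reference)]
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    mr, mf = sum(retrieved) / n, sum(reference) / n
    num = sum((a - mr) * (b - mf) for a, b in zip(retrieved, reference))
    den = math.sqrt(sum((a - mr) ** 2 for a in retrieved) *
                    sum((b - mf) ** 2 for b in reference))
    return bias, sd, num / den

# Assumed example series (m3/m3), standing in for NRT and L2 retrievals:
nrt = [0.10, 0.15, 0.22, 0.28, 0.35, 0.31]
l2  = [0.12, 0.14, 0.20, 0.30, 0.33, 0.34]
bias, sd, r = validation_stats(nrt, l2)
print(round(bias, 4), round(sd, 4), round(r, 3))
```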

  12. Evaluation of instrumental parameters for obtaining acceptable analytical results of the Dosimetry Laboratory of Chemistry of the Regional Center of Nuclear Sciences, CNEN-NE, Recife, Brazil

    International Nuclear Information System (INIS)

    Souza, V.L.B.; Figueiredo, M.D.C.; Cunha, M.S.

    2008-01-01

    Instrumental parameters need to be evaluated to obtain acceptable analytical results from a specific instrument. The performance of a UV-VIS spectrophotometer can be verified for wavelengths and absorbances with appropriate materials (solutions of different concentrations of K₂CrO₄, for example). The aim of this work was to present the results of the procedures used to control the quality of the measurements carried out in the laboratory over the last four years. The samples were analyzed in the spectrophotometer and control charts were obtained for K₂CrO₄ and Fe³⁺ absorbance values. The variation in the results obtained for the stability of the spectrophotometer and for the control of its calibration did not exceed 2%. (author)
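The 2% criterion above amounts to tracking check-solution readings on a control chart and verifying that no reading deviates from the nominal value by more than the limit. A sketch with invented absorbance readings and a hypothetical nominal value (not the laboratory's actual data):

```python
import statistics

# Hypothetical control-chart check for a K2CrO4 absorbance standard.
# "target" is an assumed nominal absorbance; readings are invented.
target = 0.500
readings = [0.498, 0.502, 0.499, 0.503, 0.497, 0.501]

mean = statistics.mean(readings)
max_dev_pct = max(abs(r - target) / target * 100 for r in readings)

print(f"mean absorbance = {mean:.4f}")
print(f"max deviation   = {max_dev_pct:.2f} %")
assert max_dev_pct <= 2.0, "instrument drifted beyond the 2 % control limit"
```

In routine use each new reading would be appended and the same check re-run, flagging calibration drift as soon as the limit is exceeded.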

  13. Tank 241-AP-106, Grab samples, 6AP-98-1, 6AP-98-2 and 6AP-98-3 Analytical results for the final report

    International Nuclear Information System (INIS)

    FULLER, R.K.

    1999-01-01

    This document is the final report for tank 241-AP-106 grab samples. Three grab samples, 6AP-98-1, 6AP-98-2 and 6AP-98-3, were taken from riser 1 of tank 241-AP-106 on May 28, 1998 and received by the 222-S Laboratory the same day. Analyses were performed in accordance with the ''Compatibility Grab Sampling and Analysis Plan'' (TSAP) (Sasaki, 1998) and the ''Data Quality Objectives for Tank Farms Waste Compatibility Program'' (DQO). The analytical results are presented in the data summary report. No notification limits were exceeded. The request for sample analysis received for AP-106 indicated that the samples were polychlorinated biphenyl (PCB) suspects. The results of this analysis indicated that no PCBs were present at the Toxic Substance Control Act (TSCA) regulated limit of 50 ppm. The results and raw data for the PCB analysis are included in this document

  14. Analytical evaluation of atomic form factors: Application to Rayleigh scattering

    Energy Technology Data Exchange (ETDEWEB)

    Safari, L., E-mail: laleh.safari@ist.ac.at [IST Austria (Institute of Science and Technology Austria), Am Campus 1, 3400 Klosterneuburg (Austria); Department of Physics, University of Oulu, Box 3000, FI-90014 Oulu (Finland); Santos, J. P. [Laboratório de Instrumentação, Engenharia Biomédica e Física da Radiação (LIBPhys-UNL), Departamento de Física, Faculdade de Ciências e Tecnologia, FCT, Universidade Nova de Lisboa, 2829-516 Caparica (Portugal); Amaro, P. [Laboratório de Instrumentação, Engenharia Biomédica e Física da Radiação (LIBPhys-UNL), Departamento de Física, Faculdade de Ciências e Tecnologia, FCT, Universidade Nova de Lisboa, 2829-516 Caparica (Portugal); Physikalisches Institut, Universität Heidelberg, D-69120 Heidelberg (Germany); Jänkälä, K. [Department of Physics, University of Oulu, Box 3000, FI-90014 Oulu (Finland); Fratini, F. [Department of Physics, University of Oulu, Box 3000, FI-90014 Oulu (Finland); Institute of Atomic and Subatomic Physics, TU Wien, Stadionallee 2, 1020 Wien (Austria); Departamento de Física, Instituto de Ciências Exatas, Universidade Federal de Minas Gerais, 31270-901 Belo Horizonte, MG (Brazil)

    2015-05-15

    Atomic form factors are widely used for the characterization of targets and specimens, from crystallography to biology. By using recent mathematical results, here we derive an analytical expression for the atomic form factor within the independent particle model constructed from nonrelativistic screened hydrogenic wave functions. The range of validity of this analytical expression is checked by comparing the analytically obtained form factors with the ones obtained within the Hartree-Fock method. As an example, we apply our analytical expression for the atomic form factor to evaluate the differential cross section for Rayleigh scattering off neutral atoms.

  15. Follow-up and control of analytical results from environmental monitoring program of the Radioactive Waste Disposal Facility - Abadia de Goias

    International Nuclear Information System (INIS)

    Peixoto, Claudia Marques; Jacomino, Vanusa Maria Feliciano

    2000-01-01

    The analytical results for the 12-month period (August 1997 to July 1998) of the operational phase of the Environmental Monitoring Program of the radioactive waste disposal facility 'Abadia de Goias' (DIGOI), located in the District of Goiania, are summarized in this report. A statistical treatment of the data using control charts is also presented. The use of these charts allows the arrangement of the data in a way that facilitates process control and the visualization of data trends and periodicity according to temporal variation. These results are compared with those obtained during the pre-operational phase. Moreover, the effective equivalent dose received by individuals of the public through different critical pathways is estimated. (author)

  16. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    Science.gov (United States)

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One result of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software, which delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analysis in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. The objective of this study was to report the features of IBMWA and to discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered predictions similar to those of several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix.
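The metrics the abstract notes as missing from the IBMWA output (confusion matrix, sensitivity, specificity, odds ratio) can all be derived from a classifier's predictions. A sketch with invented labels, not data from the study:

```python
# Toy binary-classification results; 1 = positive class. Values are invented
# purely to illustrate the derived metrics.
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

sensitivity = tp / (tp + fn)        # true positive rate
specificity = tn / (tn + fp)        # true negative rate
odds_ratio = (tp * tn) / (fp * fn)  # diagnostic odds ratio

print(f"confusion matrix: TP={tp} FP={fp} FN={fn} TN={tn}")
print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} OR={odds_ratio:.1f}")
```

Confidence intervals for the odds ratio would additionally need the log-OR standard error, which the four cell counts also provide.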

  17. Out-of-plane buckling of pantographic fabrics in displacement-controlled shear tests: experimental results and model validation

    Science.gov (United States)

    Barchiesi, Emilio; Ganzosch, Gregor; Liebold, Christian; Placidi, Luca; Grygoruk, Roman; Müller, Wolfgang H.

    2018-01-01

    Due to the latest advancements in 3D printing technology and rapid prototyping techniques, the production of materials with complex geometries has become more affordable than ever. Pantographic structures, because of their attractive features, both in dynamics and statics and both in elastic and inelastic deformation regimes, deserve to be thoroughly investigated with experimental and theoretical tools. Herein, experimental results relative to displacement-controlled large deformation shear loading tests of pantographic structures are reported. In particular, five differently sized samples are analyzed up to first rupture. Results show that the deformation behavior is strongly nonlinear, and the structures are capable of undergoing large elastic deformations without reaching complete failure. Finally, a cutting edge model is validated by means of these experimental results.

  18. The greek translation of the symptoms rating scale for depression and anxiety: preliminary results of the validation study

    Directory of Open Access Journals (Sweden)

    Gougoulias Kyriakos

    2003-12-01

    Full Text Available Abstract Background: The aim of the current study was to assess the reliability, validity and psychometric properties of the Greek translation of the Symptoms Rating Scale for Depression and Anxiety (SRSDA). The scale consists of 42 items and permits the calculation of the scores of the Beck Depression Inventory (BDI-21), the BDI-13, the Melancholia Subscale, the Asthenia Subscale, the Anxiety Subscale and the Mania Subscale. Methods: 29 depressed patients (30.48 ± 9.83 years old) and 120 normal controls (27.45 ± 10.85 years old) entered the study. In 20 of them (8 patients and 12 controls) the instrument was re-applied 1–2 days later. Translation and back-translation were made. Clinical diagnosis was reached by consensus of two examiners with the use of the SCAN v.2.0 and the IPDE. The CES-D and ZDRS were used for cross-validation purposes. The statistical analysis included ANOVA, the Spearman correlation coefficient, principal components analysis and the calculation of Cronbach's alpha. Results: The optimal cut-off points were: BDI-21: 14/15, BDI-13: 7/8, Melancholia: 8/9, Asthenia: 9/10, Anxiety: 10/11. Cronbach's alpha ranged between 0.86 and 0.92 for the individual scales. Only the Mania subscale had a very low alpha (0.12). The test-retest reliability was excellent for all scales, with Spearman's rho between 0.79 and 0.91. Conclusions: The Greek translation of the SRSDA and the scales that comprise it are both reliable and valid and are suitable for clinical and research use, with satisfactory properties close to those reported in the international literature. However, one should always bear in mind the limitations inherent in the use of self-report scales.
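The internal-consistency statistic reported here, Cronbach's alpha, is computed as k/(k-1) · (1 - Σ item variances / variance of the total score). A minimal sketch on made-up item responses (not the study's data):

```python
import statistics

# Hypothetical subscale responses: rows = respondents, columns = items.
items = [
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [1, 2, 1, 1],
    [3, 3, 4, 3],
]
k = len(items[0])

# Population variances of each item and of the respondents' total scores.
item_vars = [statistics.pvariance([row[j] for row in items]) for j in range(k)]
total_var = statistics.pvariance([sum(row) for row in items])

alpha = k / (k - 1) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```

With these correlated toy items alpha comes out high (above 0.9), the same regime as the 0.86-0.92 range reported for the validated subscales; a near-random subscale, like the Mania subscale here, would instead yield an alpha near zero.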

  19. Analytic trigonometry

    CERN Document Server

    Bruce, William J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Analytic Trigonometry details the fundamental concepts and underlying principle of analytic geometry. The title aims to address the shortcomings in the instruction of trigonometry by considering basic theories of learning and pedagogy. The text first covers the essential elements from elementary algebra, plane geometry, and analytic geometry. Next, the selection tackles the trigonometric functions of angles in general, basic identities, and solutions of equations. The text also deals with the trigonometric functions of real numbers. The fifth chapter details the inverse trigonometric functions

  20. Analytical Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Analytical Lab specializes in Oil and Hydraulic Fluid Analysis, Identification of Unknown Materials, Engineering Investigations, Qualification Testing (to support...

  1. Sixteen-item Anxiety Sensitivity Index: Confirmatory factor analytic evidence, internal consistency, and construct validity in a young adult sample from the Netherlands

    NARCIS (Netherlands)

    Vujanovic, Anka A.; Arrindell, Willem A.; Bernstein, Amit; Norton, Peter J.; Zvolensky, Michael J.

    The present investigation examined the factor structure, internal consistency, and construct validity of the 16-item Anxiety Sensitivity Index (ASI; Reiss, Peterson, Gursky, & McNally, 1986) in a young adult sample (n = 420) from the Netherlands. Confirmatory factor analysis was used to comparatively

  2. Tank 241-U-102, Grab Samples 2U-99-1, 2U-99-2 and 2U-99-3 Analytical Results for the Final Report

    International Nuclear Information System (INIS)

    STEEN, F.H.

    1999-01-01

    This document is the final report for tank 241-U-102 grab samples. Five grab samples were collected from riser 13 on May 26, 1999 and received by the 222-S Laboratory on May 26 and May 27, 1999. Samples 2U-99-3 and 2U-99-4 were submitted to the Process Chemistry Laboratory for special studies. Samples 2U-99-1, 2U-99-2 and 2U-99-5 were submitted to the laboratory for analyses. Analyses were performed in accordance with the Compatibility Grab Sampling and Analysis Plan for Fiscal Year 1999 (TSAP) (Sasaki, 1999) and the Data Quality Objectives for Tank Farms Waste Compatibility Program (DQO) (Fowler 1995; Mulkey and Miller 1998). The analytical results are presented in the data summary report. None of the subsamples submitted for differential scanning calorimetry (DSC), total organic carbon (TOC) and plutonium-239 (Pu-239) analyses exceeded the notification limits stated in the TSAP

  3. Analytical Method Development and Validation for the Quantification of Acetone and Isopropyl Alcohol in the Tartaric Acid Base Pellets of Dipyridamole Modified Release Capsules by Using Headspace Gas Chromatographic Technique

    Directory of Open Access Journals (Sweden)

    Sriram Valavala

    2018-01-01

    Full Text Available A simple, sensitive, accurate and robust headspace gas chromatographic method was developed for the quantitative determination of acetone and isopropyl alcohol in tartaric acid-based pellets of dipyridamole modified release capsules. The residual solvents acetone and isopropyl alcohol are used in the manufacturing process of the tartaric acid-based pellets, a choice reflecting the solubility of dipyridamole and the excipients at the different manufacturing stages. The method was developed and optimized using a fused-silica DB-624 (30 m × 0.32 mm i.d. × 1.8 µm film thickness) column with flame ionization detection. Method validation was carried out in accordance with the guideline Q2 on validation of analytical procedures of the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH). All validation characteristics met the acceptance criteria; hence, the developed and validated method can be applied for the intended routine analysis.
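Residual-solvent assays of this kind are quantified against a calibration curve of detector response versus standard concentration. A hedged sketch of a least-squares external calibration with invented numbers (the study's actual calibration data are not given in the abstract):

```python
import statistics

# Hypothetical acetone calibration: standard concentrations (ppm) and the
# corresponding GC-FID peak areas (arbitrary units). All values invented.
conc = [50.0, 100.0, 200.0, 400.0, 800.0]
area = [10500, 21000, 41800, 84100, 167900]

# Ordinary least-squares fit of area = slope * conc + intercept.
mean_x, mean_y = statistics.mean(conc), statistics.mean(area)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, area)) / \
        sum((x - mean_x) ** 2 for x in conc)
intercept = mean_y - slope * mean_x

# Back-calculate an unknown sample from its measured peak area.
unknown_area = 62500
unknown_conc = (unknown_area - intercept) / slope
print(f"slope={slope:.1f}, intercept={intercept:.0f}, unknown ≈ {unknown_conc:.0f} ppm")
```

An internal-standard variant would fit the ratio of analyte to internal-standard areas instead of the raw area, which compensates for injection-to-injection variability.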

  4. Non-invasive transcranial ultrasound therapy based on a 3D CT scan: protocol validation and in vitro results

    International Nuclear Information System (INIS)

    Marquet, F; Pernot, M; Aubry, J-F; Montaldo, G; Tanter, M; Fink, M; Marsac, L

    2009-01-01

    A non-invasive protocol for transcranial brain tissue ablation with ultrasound is studied and validated in vitro. The skull induces strong aberrations both in phase and in amplitude, resulting in a severe degradation of the beam shape. Adaptive corrections of the distortions induced by the skull bone are performed using a prior 3D computed tomography (CT) scan acquisition of the skull bone structure. These CT scan data are used as entry parameters in a FDTD (finite differences time domain) simulation of the full wave propagation equation. A numerical computation is used to deduce the impulse response relating the targeted location and the ultrasound therapeutic array, thus providing a virtual time-reversal mirror. This impulse response is then time-reversed and transmitted experimentally by a therapeutic array positioned exactly in the same referential frame as the one used during the CT scan acquisition. In vitro experiments are conducted on monkey and human skull specimens using an array of 300 transmit elements working at a central frequency of 1 MHz. These experiments show a precise refocusing of the ultrasonic beam at the targeted location with a positioning error lower than 0.7 mm. The complete validation of this transcranial adaptive focusing procedure paves the way to in vivo animal and human transcranial HIFU investigations.
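The time-reversal principle behind the virtual mirror can be illustrated with a toy delay model: signals received from a point source are time-reversed and re-emitted, so each path's delay is traversed twice and the contributions arrive in phase at the original source location. This is only a conceptual 1-D sketch, not the FDTD simulation used in the study:

```python
import numpy as np

# A unit pulse emitted from a "target" location, recorded by 4 array
# elements with different (invented) propagation delays, in samples.
n = 512
pulse = np.zeros(n)
pulse[10] = 1.0

delays = [37, 52, 80, 95]
received = [np.roll(pulse, d) for d in delays]     # element recordings

# Time-reverse each recording and propagate it back through the same delay:
# all four contributions then coincide, adding coherently at one sample.
refocused = sum(np.roll(trace[::-1], d) for d, trace in zip(delays, received))

peak = int(np.argmax(refocused))
print(f"refocused peak at sample {peak} (amplitude {refocused[peak]:.0f})")
```

The coherent sum reaches amplitude 4 (one per element) at a single sample, while an uncorrected emission would spread the energy across the four mismatched delays.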

  5. Non-invasive transcranial ultrasound therapy based on a 3D CT scan: protocol validation and in vitro results

    Energy Technology Data Exchange (ETDEWEB)

    Marquet, F; Pernot, M; Aubry, J-F; Montaldo, G; Tanter, M; Fink, M [Laboratoire Ondes et Acoustique, ESPCI, Universite Paris VII, UMR CNRS 7587, 10 rue Vauquelin, 75005 Paris (France); Marsac, L [Supersonic Imagine, Les Jardins de la Duranne, 510 rue Rene Descartes, 13857 Aix-en-Provence (France)], E-mail: fabrice.marquet@espci.org

    2009-05-07

    A non-invasive protocol for transcranial brain tissue ablation with ultrasound is studied and validated in vitro. The skull induces strong aberrations both in phase and in amplitude, resulting in a severe degradation of the beam shape. Adaptive corrections of the distortions induced by the skull bone are performed using a prior 3D computed tomography (CT) scan acquisition of the skull bone structure. These CT scan data are used as entry parameters in a FDTD (finite differences time domain) simulation of the full wave propagation equation. A numerical computation is used to deduce the impulse response relating the targeted location and the ultrasound therapeutic array, thus providing a virtual time-reversal mirror. This impulse response is then time-reversed and transmitted experimentally by a therapeutic array positioned exactly in the same referential frame as the one used during the CT scan acquisition. In vitro experiments are conducted on monkey and human skull specimens using an array of 300 transmit elements working at a central frequency of 1 MHz. These experiments show a precise refocusing of the ultrasonic beam at the targeted location with a positioning error lower than 0.7 mm. The complete validation of this transcranial adaptive focusing procedure paves the way to in vivo animal and human transcranial HIFU investigations.

  6. [Validity of axis III "Conflicts" of Operationalized Psychodynamic Diagnostics (OPD-1)--empirical results and conclusions for OPD-2].

    Science.gov (United States)

    Schneider, Gudrun; Mendler, Till; Heuft, Gereon; Burgmer, Markus

    2008-01-01

    Using specific psychometric instruments, we investigated the criterion-related validity of axis III ("Conflicts") of OPD-1 on the basis of a priori formulated hypotheses concerning the relations to the main conflict/mode. A consecutive sample of 105 psychotherapy inpatients was examined using self-assessment scales (Inventory of Interpersonal Problems; Rosenberg Self-Esteem Scale; Test of Self-Conscious Affect; Toronto Alexithymia Scale; Frankfurt Self-Concept Scales) and videotaped OPD research interviews in the first week after admission to the hospital. Two OPD-certified raters first rated the interviews independently, then in a consensus rating. Owing to the differing frequencies of the main conflicts and modes, only 4 of the 7 conflicts could be evaluated. The a priori hypotheses were confirmed for the conflicts Dependence versus Autonomy (both modes), Submission versus Control (active mode), Desire for Care versus Autarchy (active mode), and Self-Value (passive mode). Confirmation of the a priori hypotheses indicates the validity of axis III (Conflicts) of the OPD. We discuss the small numbers of some conflicts, the comparison of the OPD expert rating with self-assessment, and the meaning of the results for OPD-2.

  7. An audit of the contribution to post-mortem examination diagnosis of individual analyte results obtained from biochemical analysis of the vitreous.

    Science.gov (United States)

    Mitchell, Rebecca; Charlwood, Cheryl; Thomas, Sunethra Devika; Bellis, Maria; Langlois, Neil E I

    2013-12-01

    Biochemical analysis of the vitreous humor from the eye is an accepted accessory test in the post-mortem investigation of cause of death. Modern biochemical analyzers allow testing of a range of analytes from a single sample. However, it is not clear which analytes should be requested in order to prevent unnecessary testing (and expense). The means and standard deviations of the values obtained from analysis of the vitreous humor for sodium, potassium, chloride, osmolality, glucose, ketones (β-hydroxybutyrate), creatinine, urea, calcium, lactate, and ammonia were calculated, from which the contribution of each analyte was reviewed in the context of the post-mortem findings and final cause of death. For sodium, 32 cases were regarded as high (more than one standard deviation above the mean), of which 9 contributed to the post-mortem diagnosis [drowning (4), heat-related death (2), diabetic hyperglycemia (2), and dehydration (1)], but 25 low values (more than one standard deviation below the mean) made no contribution. For chloride, 29 high values contributed to 4 cases (3 drowning and 1 heat-related), but these had all previously been identified by a high sodium level. There were 29 high and 35 low potassium values, none of which contributed to determining the final cause of death. Of 22 high values of creatinine, 12 contributed to a diagnosis of renal failure. Of 32 high values of urea, 18 contributed to 16 cases of renal failure (2 associated with diabetic hyperglycemia), 1 heat-related death, and 1 case of dehydration. Osmolality contributed to 12 cases (5 heat-related, 4 diabetes, 2 renal failure, and 1 dehydration) out of 36 high values. There was no contribution from the 32 high and 19 low values of calcium, nor from the 4 high and 2 low values of ammonia. There were 11 high values of glucose, which contributed to the diagnosis of 6 cases of diabetic hyperglycemia, and 21 high ketone levels contributed to 8 cases: 4 diabetic ketosis, 3 hypothermia, 3
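The audit's flagging rule (values more than one standard deviation above or below the analyte mean) is simple to express in code. A minimal sketch with invented sodium values, not the study's data:

```python
import statistics

# Hypothetical vitreous sodium results (mmol/L); values invented so that
# one result sits clearly above, and one clearly below, the 1-SD band.
sodium = [135, 140, 138, 142, 137, 160, 139, 118, 141, 136]

mean = statistics.mean(sodium)
sd = statistics.stdev(sodium)   # sample standard deviation

high = [v for v in sodium if v > mean + sd]   # "high" per the audit's rule
low = [v for v in sodium if v < mean - sd]    # "low" per the audit's rule

print(f"mean={mean:.1f}, sd={sd:.1f}, high={high}, low={low}")
```

In the audit, each flagged value would then be reviewed against the post-mortem findings to decide whether it actually contributed to the diagnosis.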

  8. Validated analytical methodology for the simultaneous determination of a wide range of pesticides in human blood using GC-MS/MS and LC-ESI/MS/MS and its application in two poisoning cases.

    Science.gov (United States)

    Luzardo, Octavio P; Almeida-González, Maira; Ruiz-Suárez, Norberto; Zumbado, Manuel; Henríquez-Hernández, Luis A; Meilán, María José; Camacho, María; Boada, Luis D

    2015-09-01

    Pesticides are frequently responsible for human poisoning, and often information on the substance involved is lacking. The great variety of pesticides that could be responsible for an intoxication makes it necessary to develop powerful and versatile analytical methodologies that allow identification of the unknown toxic substance. Here we developed a methodology for simultaneous identification and quantification in human blood of 109 highly toxic pesticides. The application of this analytical scheme would help minimize the cost of this type of chemical identification while maximizing the chances of identifying the pesticide involved. In the methodology presented here, we use liquid-liquid extraction, followed by a single purification step, and quantitation of analytes by a combination of liquid and gas chromatography, both coupled to triple quadrupole mass spectrometry operated in multiple reaction monitoring mode. The methodology has been fully validated, and its applicability has been demonstrated in two recent cases involving one self-poisoning fatality and one non-fatal homicidal attempt. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.

  9. Development and validation of analytical methodology for determination of polycyclic aromatic hydrocarbons (PAHs) in sediments. Assessment of Pedroso Park dam, Santo Andre, SP; Desenvolvimento e validacao de metodologia analitica para determinacao de hidrocarbonetos policiclicos aromaticos (HPAS) em sedimentos. Avaliacao da represa do Parque Pedroso, Santo Andre, SP

    Energy Technology Data Exchange (ETDEWEB)

    Brito, Carlos Fernando de

    2009-07-01

    The polycyclic aromatic hydrocarbons (PAHs), because they are considered persistent contaminants, because of their ubiquity in the environment, and because of their recognized genotoxicity, have stimulated research activities aimed at determining and evaluating their sources, transport, processing, biological effects and accumulation in compartments of aquatic and terrestrial ecosystems. In this work, the matrix studied was sediment collected at the Pedroso Park dam in Santo Andre, SP. The analytical technique employed was reversed-phase liquid chromatography with UV/Vis detection. Statistical treatment of the data was carried out during the development of the methodology to ensure reliable results. The steps involved were evaluated using the concept of validation of chemical testing. The parameters selected for the analytical validation were selectivity, linearity, working range, sensitivity, accuracy, precision, limit of detection, limit of quantification and robustness. These parameters showed satisfactory results, allowing the application of the methodology; the method is simple and minimizes contamination and loss of compounds through over-handling. For the PAHs tested, no positive results above the limit of detection were found in any of the samples collected in the first phase. In the second collection, however, small amounts were found, mainly of acenaphthylene, fluorene and benzo[a]anthracene. Although the area is preserved, slight signs of contamination can already be perceived. (author)
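Two of the validation parameters listed, limit of detection and limit of quantification, are commonly estimated from the calibration curve as LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation and S the slope. A sketch with invented figures (the study's actual calibration statistics are not given in the abstract):

```python
# Hypothetical calibration statistics for one PAH: residual standard
# deviation of the fit (signal units) and slope (signal per µg/L).
sigma = 0.8
slope = 21.0

# ICH-style estimates from the calibration curve.
lod = 3.3 * sigma / slope   # limit of detection
loq = 10.0 * sigma / slope  # limit of quantification

print(f"LOD ≈ {lod:.3f} µg/L, LOQ ≈ {loq:.3f} µg/L")
```

The ratio LOQ/LOD is fixed at about 3 by construction, which is why validated methods typically report both values together.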

  10. A simple validated multi-analyte method for detecting drugs in oral fluid by ultra-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS).

    Science.gov (United States)

    Zheng, Yufang; Sparve, Erik; Bergström, Mats

    2018-06-01

    A UPLC-MS/MS method was developed to identify and quantitate 37 commonly abused drugs in oral fluid. Drugs of interest included amphetamines, benzodiazepines, cocaine, opiates, opioids, phencyclidine and tetrahydrocannabinol. Sample preparation and extraction are simple, and analysis times are short. Validation showed satisfactory performance at relevant concentrations. The possibility of contaminated samples, as well as the interpretation in relation to well-known matrices such as urine, will demand further study. Copyright © 2017 John Wiley & Sons, Ltd.

  11. Life cycle management of analytical methods.

    Science.gov (United States)

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept is gaining more and more importance. It focuses on the total costs of the process, from investment through operation to final retirement. In recent years, increasing interest in this concept has also emerged for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification and method transfer) and finally retirement of the method. Regulatory bodies, too, appear to have increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, are discussing the adoption of new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Furthermore, growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design based method development results in increased method robustness, and thereby in decreased effort for method performance verification and post-approval changes, as well as a minimized risk of method-related out-of-specification results. This strongly contributes to reduced costs of the method over its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Tank 241-AP-106, Grab samples, 6AP-98-1, 6AP-98-2 and 6AP-98-3 Analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    FULLER, R.K.

    1999-02-23

    This document is the final report for tank 241-AP-106 grab samples. Three grab samples, 6AP-98-1, 6AP-98-2 and 6AP-98-3, were taken from riser 1 of tank 241-AP-106 on May 28, 1998 and received by the 222-S Laboratory the same day. Analyses were performed in accordance with the ''Compatibility Grab Sampling and Analysis Plan'' (TSAP) (Sasaki, 1998) and the ''Data Quality Objectives for Tank Farms Waste Compatibility Program'' (DQO). The analytical results are presented in the data summary report. No notification limits were exceeded. The request for sample analysis received for AP-106 indicated that the samples were polychlorinated biphenyl (PCB) suspects. The results of this analysis indicated that no PCBs were present at the Toxic Substance Control Act (TSCA) regulated limit of 50 ppm. The results and raw data for the PCB analysis are included in this document.

  13. Tank 241-AP-107, grab samples 7AP-97-1, 7AP-97-2 and 7AP-97-3 analytical results for the final report

    International Nuclear Information System (INIS)

    Steen, F.H.

    1997-01-01

    This document is the final report for tank 241-AP-107 grab samples. Three grab samples were collected from riser 1 on September 11, 1997. Analyses were performed on samples 7AP-97-1, 7AP-97-2 and 7AP-97-3 in accordance with the Compatibility Grab Sampling and Analysis Plan (TSAP) (Sasaki, 1997) and the Data Quality Objectives for Tank Farms Waste Compatibility Program (DQO) (Rev. 1: Fowler, 1995; Rev. 2: Mulkey and Nuier, 1997). The analytical results are presented in the data summary report (Table 1). A notification was made to East Tank Farms Operations concerning low hydroxide in the tank and a hydroxide (caustic) demand analysis was requested. The request for sample analysis (RSA) (Attachment 2) received for AP-107 indicated that the samples were polychlorinated biphenyl (PCB) suspects. Therefore, prior to performing the requested analyses, aliquots were made to perform PCB analysis in accordance with the 222-S Laboratory administrative procedure, LAP-101-100. The results of this analysis indicated that no PCBs were present at 50 ppm and analysis proceeded as non-PCB samples. The results and raw data for the PCB analysis will be included in a revision to this document. The sample breakdown diagrams (Attachment 1) are provided as a cross-reference for relating the tank farm customer identification numbers with the 222-S Laboratory sample numbers and the portion of sample analyzed

  14. A New Enzyme-Linked Immunosorbent Assay for a Total Anti-T Lymphocyte Globulin Determination: Development, Analytical Validation, and Clinical Applications.

    Science.gov (United States)

    Montagna, Michela; La Nasa, Giorgio; Bernardo, Maria E; Piras, Eugenia; Avanzini, Maria A; Regazzi, Mario; Locatelli, Franco

    2017-06-01

    Anti-T lymphocyte globulin (ATLG) modulates the alloreactivity of T lymphocytes, reducing the risk of immunological posttransplant complications, in particular rejection and graft-versus-host disease, after allogeneic hematopoietic stem cell transplantation (HSCT). We developed and validated a new enzyme-linked immunosorbent assay (ELISA) to measure serum levels of total ATLG and to evaluate the pharmacokinetics (PK) of the drug in children with β-thalassemia receiving allogeneic HSCT. Diluted serum samples were incubated with a goat anti-rabbit IgG antibody coated on a microtiter plate and then with a goat anti-human IgG labeled with horseradish peroxidase. After incubation and washings, substrate solution was added and absorbance was read at 492 nm. ATLG concentrations in samples were determined by interpolation from a standard curve (range: 200-0.095 ng/mL), prepared by diluting a known amount of ATLG in phosphate-buffered saline (PBS). The low-, medium-, and high-quality control concentrations were 1.56, 6.25, and 25 ng/mL, respectively. The method was developed and validated within the acceptance criteria, in compliance with the guidelines for biological method validation; the sensitivity of the method was 0.095 ng/mL. We analyzed serum samples from 14 children with β-thalassemia who received ATLG (Grafalon) at a dose of 10 mg/kg administered as an intravenous (IV) infusion on days -5, -4, and -3 before HSCT (day 0). Blood sampling for PK evaluation was performed on days -5, -4, and -3 before and after drug infusion, and then from day -2 to day +56. The median total ATLG levels pre-IV and post-IV were 0 and 118 mcg/mL on day -5; 85.9 and 199.2 mcg/mL on day -4; and 153 and 270.9 mcg/mL on day -3, respectively. The median PK values were: CL 0.0029 (range: 0.0028-0.0057) L/kg per day, Vd 0.088 (range: 0.025-0.448) L/kg, and t1/2 20.2 (range: 5.8-50.2) days. These data suggest that, given the marked interindividual variability of total ATLG disposition, the development of
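The three PK medians reported are internally consistent: for a one-compartment model, t1/2 = ln(2) · Vd / CL. A quick check using the abstract's median clearance and volume of distribution:

```python
import math

# Median PK values from the abstract.
cl = 0.0029   # clearance, L/kg per day
vd = 0.088    # volume of distribution, L/kg

# One-compartment elimination half-life.
t_half = math.log(2) * vd / cl
print(f"t1/2 ≈ {t_half:.1f} days")
```

This yields about 21 days, close to the reported median of 20.2 days; the small gap is expected because the median of t1/2 across patients need not equal the value computed from the medians of CL and Vd.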

  15. Validation of dose-response calibration curve for X-Ray field of CRCN-NE/CNEN: preliminary results

    International Nuclear Information System (INIS)

    Silva, Laís Melo; Mendonça, Julyanne Conceição de Goes; Andrade, Aida Mayra Guedes de; Hwang, Suy F.; Mendes, Mariana Esposito; Lima, Fabiana F.; Melo, Ana Maria M.A.

    2017-01-01

    Accurate estimation of absorbed dose is very important in accident investigations, since it contributes to medical decisions and to the overall assessment of long-term health consequences. Analysis of chromosome aberrations is the most developed method for biological monitoring: frequencies of dicentric chromosomes in human peripheral blood lymphocytes are related to absorbed dose using calibration curves. The International Atomic Energy Agency (IAEA) recommends that each biodosimetry laboratory establish its own calibration curves, given that intrinsic differences in protocols and dose interpretations when using calibration curves produced in other laboratories could add further uncertainties to dose estimations. The Laboratory for Biological Dosimetry at CRCN-NE recently completed dose-response calibration curves for an X-ray field. Curves for dicentric chromosomes and for dicentrics plus rings were produced using the Dose Estimate software. This study aimed to validate these dose-response calibration curves for X rays with three irradiated samples. Blood was obtained by venipuncture from a healthy volunteer, and three samples were irradiated with 250 kVp X rays at different absorbed doses (0.5 Gy, 1 Gy, and 2 Gy). The irradiation was performed at the CRCN-NE/CNEN Metrology Service with a PANTAK X-ray machine, model HF 320. The frequencies of dicentric chromosomes and centric rings were determined in 500 metaphases per sample after cultivation of lymphocytes and staining with 5% Giemsa. Results showed that the estimated absorbed doses fall within the 95% confidence interval of the real absorbed doses. These dose-response calibration curves (dicentrics and dicentrics plus rings) therefore seem valid; further tests will be done with different volunteers. (author)
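In biological dosimetry, dicentric yields are conventionally fitted to a linear-quadratic curve Y = C + αD + βD², and dose estimation inverts that curve. The sketch below uses hypothetical coefficients (not the CRCN-NE laboratory's actual curve) to illustrate the inversion:

```python
import math

# Hypothetical linear-quadratic coefficients (NOT the CRCN-NE curve):
# Y = c + alpha*D + beta*D^2, with Y in dicentrics per cell and D in Gy
c, alpha, beta = 0.001, 0.04, 0.06

def expected_yield(dose):
    return c + alpha * dose + beta * dose ** 2

def estimate_dose(dicentrics, cells):
    """Invert the calibration curve for an observed dicentric frequency."""
    y = dicentrics / cells
    # Positive root of beta*D^2 + alpha*D + (c - y) = 0
    return (-alpha + math.sqrt(alpha ** 2 + 4 * beta * (y - c))) / (2 * beta)

# Round trip at 2 Gy with 500 scored metaphases (as in the study design)
cells = 500
dic = expected_yield(2.0) * cells
print(f"estimated dose: {estimate_dose(dic, cells):.2f} Gy")  # -> 2.00 Gy
```

In practice the uncertainty on the estimate comes from Poisson counting statistics on the scored dicentrics and from the curve-fit covariance, which is why validation against samples of known dose is done.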

  16. Validation of dose-response calibration curve for X-Ray field of CRCN-NE/CNEN: preliminary results

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Laís Melo; Mendonça, Julyanne Conceição de Goes; Andrade, Aida Mayra Guedes de; Hwang, Suy F.; Mendes, Mariana Esposito; Lima, Fabiana F., E-mail: falima@cnen.gov.br, E-mail: mendes_sb@hotmail.com [Centro Regional de Ciências Nucleares, (CRCN-NE/CNEN-PE), Recife, PE (Brazil); Melo, Ana Maria M.A., E-mail: july_cgm@yahoo.com.br [Universidade Federal de Pernambuco (UFPE), Vitória de Santo Antão, PE (Brazil). Centro Acadêmico de Vitória

    2017-07-01

    Accurate estimation of absorbed dose is very important in accident investigations, since it contributes to medical decisions and to the overall assessment of long-term health consequences. Analysis of chromosome aberrations is the most developed method for biological monitoring: frequencies of dicentric chromosomes in human peripheral blood lymphocytes are related to absorbed dose using calibration curves. The International Atomic Energy Agency (IAEA) recommends that each biodosimetry laboratory establish its own calibration curves, given that intrinsic differences in protocols and dose interpretations when using calibration curves produced in other laboratories could add further uncertainties to dose estimations. The Laboratory for Biological Dosimetry at CRCN-NE recently completed dose-response calibration curves for an X-ray field. Curves for dicentric chromosomes and for dicentrics plus rings were produced using the Dose Estimate software. This study aimed to validate these dose-response calibration curves for X rays with three irradiated samples. Blood was obtained by venipuncture from a healthy volunteer, and three samples were irradiated with 250 kVp X rays at different absorbed doses (0.5 Gy, 1 Gy, and 2 Gy). The irradiation was performed at the CRCN-NE/CNEN Metrology Service with a PANTAK X-ray machine, model HF 320. The frequencies of dicentric chromosomes and centric rings were determined in 500 metaphases per sample after cultivation of lymphocytes and staining with 5% Giemsa. Results showed that the estimated absorbed doses fall within the 95% confidence interval of the real absorbed doses. These dose-response calibration curves (dicentrics and dicentrics plus rings) therefore seem valid; further tests will be done with different volunteers. (author)

  17. Validation of model-based brain shift correction in neurosurgery via intraoperative magnetic resonance imaging: preliminary results

    Science.gov (United States)

    Luo, Ma; Frisken, Sarah F.; Weis, Jared A.; Clements, Logan W.; Unadkat, Prashin; Thompson, Reid C.; Golby, Alexandra J.; Miga, Michael I.

    2017-03-01

    The quality of brain tumor resection surgery depends on the spatial agreement between the preoperative image and intraoperative anatomy. However, brain shift compromises this alignment. Currently, the clinical standard for monitoring brain shift is intraoperative magnetic resonance (iMR). While iMR provides a better understanding of brain shift, its cost and encumbrance are considerations for medical centers. Hence, we are developing a model-based method that can be a complementary technology to address brain shift in standard resections, with resource-intensive cases as referrals for iMR facilities. Our strategy constructs a deformation `atlas' containing potential deformation solutions derived from a biomechanical model that accounts for variables such as cerebrospinal fluid drainage and mannitol effects. Volumetric deformation is estimated with an inverse approach that determines the optimal combination of `atlas' solutions to best match the measured surface deformation. Accordingly, the preoperative image is updated based on the computed deformation field. This study is the latest development in validating our methodology against iMR. Briefly, preoperative and intraoperative MR images of 2 patients were acquired. Homologous surface points were selected on preoperative and intraoperative scans as measurements of surface deformation and used to drive the inverse problem. To assess model accuracy, the subsurface shift of targets between preoperative and intraoperative states was measured and compared to the model prediction. Considering subsurface shifts above 3 mm, the proposed strategy provides an average shift correction of 59% across the 2 cases. While further improvements in both the model and the ability to validate with iMR are desired, the results reported are encouraging.
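The inverse step described above, finding the combination of atlas solutions that best matches measured surface shift, can be sketched as a nonnegative least-squares problem. The atlas matrix, weights, and noise level below are synthetic toy values, not the authors' biomechanical model:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

# Toy deformation atlas: each column is one precomputed displacement
# solution, sampled at the m measured surface points.
m, n_modes = 30, 5
atlas = rng.normal(size=(m, n_modes))

# Synthetic "measured" surface shift: a nonnegative mix of two modes plus noise
w_true = np.array([0.7, 0.0, 0.3, 0.0, 0.0])
measured = atlas @ w_true + rng.normal(scale=0.01, size=m)

# Inverse step: nonnegative least squares for the combination weights
w_fit, residual = nnls(atlas, measured)

# Reconstructed deformation at the measured points (would drive the image update)
reconstruction = atlas @ w_fit
print("recovered weights:", np.round(w_fit, 2))
```

The nonnegativity constraint is one common way to keep the combined solution physically plausible; the actual optimization used by the authors may differ.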

  18. Analytic geometry

    CERN Document Server

    Burdette, A C

    1971-01-01

    Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus. This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concepts of polar and rectangular coordinates, surfaces and curves, and planes. This book will prove useful to undergraduate trigonometric st

  19. A Simulation Tool for Geometrical Analysis and Optimization of Fuel Cell Bipolar Plates: Development, Validation and Results

    Directory of Open Access Journals (Sweden)

    Javier Pino

    2009-07-01

    Full Text Available Bipolar plates (BPs) are one of the most important components in Proton Exchange Membrane Fuel Cells (PEMFC) due to the numerous functions they perform. The objective of the research work described in this paper was to develop a simplified and validated method based on Computational Fluid Dynamics (CFD), aimed at the analysis and study of the influence of geometrical parameters of BPs on the operation of a cell. A complete sensitivity analysis of the influence of the dimensions and shape of the BP can be obtained through a simplified CFD model without including the complexity of other components of the PEMFC. This model is compared with the PEM Fuel Cell Module of the FLUENT software, which includes the physical and chemical phenomena relevant in PEMFCs. Results with both models regarding the flow field inside the channels and local current densities are obtained and compared. The results show that it is possible to use the simple model as a standard tool for geometrical analysis of BPs, and results of a sensitivity analysis using the simplified model are presented and discussed.

  20. A validation of direct grey Dancoff factors results for cylindrical cells in cluster geometry by the Monte Carlo method

    International Nuclear Information System (INIS)

    Rodrigues, Leticia Jenisch; Bogado, Sergio; Vilhena, Marco T.

    2008-01-01

    The WIMS code is well known and one of the most used codes for nuclear core physics calculations. Recently, the PIJM module of the WIMS code was modified to allow the calculation of Grey Dancoff factors for partially absorbing materials, using the alternative definition in terms of escape and collision probabilities. Grey Dancoff factors for the Canadian CANDU-37 and CANFLEX assemblies were calculated with PIJM at five symmetrically distinct fuel pin positions. The results, obtained via the Direct Method, i.e., by direct calculation of escape and collision probabilities, were satisfactory when compared with those in the literature. The PIJMC module, on the other hand, was developed to calculate escape and collision probabilities using the Monte Carlo method. Modifications in this module were performed to determine Black Dancoff factors, considering perfectly absorbing fuel rods. In this work, we proceed further in the task of validating the Direct Method by the Monte Carlo approach. To this end, the PIJMC routine is modified to compute Grey Dancoff factors using the cited alternative definition. Results are reported for the mentioned CANDU-37 and CANFLEX assemblies obtained with PIJMC, at the same fuel pin positions as with PIJM. A good agreement is observed between the results from the Monte Carlo and Direct methods.
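As a much-simplified illustration of estimating an escape probability by Monte Carlo (a toy 1D rod in a purely absorbing medium, far simpler than the PIJMC cluster geometry), the estimate can be checked against the analytic result P_esc = (1 − e^−τ)/τ for a uniform isotropic source:

```python
import math
import random

def escape_probability_mc(tau, n=200_000, seed=1):
    """Monte Carlo estimate of the first-flight escape probability for a
    1D 'rod' of optical thickness tau with a spatially uniform source
    emitting in both directions. Each history scores exp(-d), the chance
    of reaching a boundary without colliding in the absorbing medium."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.uniform(0.0, tau)                    # emission point
        d = (tau - x) if rng.random() < 0.5 else x   # optical path to the exit
        total += math.exp(-d)
    return total / n

tau = 2.0
analytic = (1.0 - math.exp(-tau)) / tau   # exact result for this toy geometry
mc = escape_probability_mc(tau)
print(f"Monte Carlo: {mc:.4f}   analytic: {analytic:.4f}")
```

Real lattice codes do the same kind of tally over chords through cylindrical fuel pins in cluster geometry, where no simple closed form exists.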

  1. Analytical Method Development and Validation for the Simultaneous Estimation of Abacavir and Lamivudine by Reversed-phase High-performance Liquid Chromatography in Bulk and Tablet Dosage Forms.

    Science.gov (United States)

    Raees Ahmad, Sufiyan Ahmad; Patil, Lalit; Mohammed Usman, Mohammed Rageeb; Imran, Mohammad; Akhtar, Rashid

    2018-01-01

    A simple, rapid, accurate, precise, and reproducible validated reversed-phase high-performance liquid chromatography (HPLC) method was developed for the determination of Abacavir (ABAC) and Lamivudine (LAMI) in bulk and tablet dosage forms. The quantification was carried out on a Symmetry Premsil C18 (250 mm × 4.6 mm, 5 μm) column run in isocratic mode with a mobile phase comprising methanol:water (0.05% orthophosphoric acid, pH 3) 83:17 v/v, a detection wavelength of 245 nm, an injection volume of 20 μl, and a flow rate of 1 ml/min. In the developed method, the retention times of ABAC and LAMI were found to be 3.5 min and 7.4 min, respectively. The method was validated in terms of linearity, precision, accuracy, limits of detection, limits of quantitation, and robustness in accordance with the International Conference on Harmonization guidelines. The assay of the proposed method was found to be 99%-101%. Recovery studies were also carried out, and the mean % recovery was found to be 99%-101%. The % relative standard deviation for reproducibility was also evaluated. Abbreviations: HPLC: High-performance liquid chromatography, UV: Ultraviolet, ICH: International Conference on Harmonization, ABAC: Abacavir, LAMI: Lamivudine, HIV: Human immunodeficiency virus, AIDS: Acquired immunodeficiency syndrome, NRTI: Nucleoside reverse transcriptase inhibitor, ARV: Antiretroviral, RSD: Relative standard deviation, RT: Retention time, SD: Standard deviation.
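Quantification by external calibration with an internal standard reduces to fitting the analyte/internal-standard peak-area ratio against concentration and inverting the fit for samples. The calibration points below are hypothetical, for illustration only:

```python
import numpy as np

# Hypothetical calibration data: concentration (ug/mL) vs. peak-area
# ratio (analyte / internal standard) -- illustrative values only
conc = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])
ratio = np.array([0.11, 0.21, 0.40, 0.61, 0.79, 1.01])

# Least-squares calibration line: ratio = slope * conc + intercept
slope, intercept = np.polyfit(conc, ratio, 1)

def quantify(sample_ratio):
    """Interpolate a sample's concentration from its measured area ratio."""
    return (sample_ratio - intercept) / slope

r = np.corrcoef(conc, ratio)[0, 1]
print(f"slope={slope:.4f}, correlation r={r:.4f}")
print(f"sample with ratio 0.50 -> {quantify(0.50):.2f} ug/mL")
```

Using the ratio rather than the raw analyte area compensates for injection-to-injection variability, which is the point of adding an internal standard.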

  2. Analytical Validation and Clinical Qualification of a New Immunohistochemical Assay for Androgen Receptor Splice Variant-7 Protein Expression in Metastatic Castration-resistant Prostate Cancer.

    Science.gov (United States)

    Welti, Jonathan; Rodrigues, Daniel Nava; Sharp, Adam; Sun, Shihua; Lorente, David; Riisnaes, Ruth; Figueiredo, Ines; Zafeiriou, Zafeiris; Rescigno, Pasquale; de Bono, Johann S; Plymate, Stephen R

    2016-10-01

    The androgen receptor splice variant-7 (AR-V7) has been implicated in the development of castration-resistant prostate cancer (CRPC) and resistance to abiraterone and enzalutamide. To develop a validated assay for detection of AR-V7 protein in tumour tissue and determine its expression and clinical significance as patients progress from hormone-sensitive prostate cancer (HSPC) to CRPC. Following monoclonal antibody generation and validation, we retrospectively identified patients who had HSPC and CRPC tissue available for AR-V7 immunohistochemical (IHC) analysis. Nuclear AR-V7 expression was determined using IHC H score (HS) data. The change in nuclear AR-V7 expression from HSPC to CRPC and the association between nuclear AR-V7 expression and overall survival (OS) were determined. Nuclear AR-V7 expression was significantly lower in HSPC (median HS 50, interquartile range [IQR] 17.5-90) compared to CRPC (HS 135, IQR 80-157.5). A higher level of AR-V7 identifies a group of patients who respond less well to certain prostate cancer treatments and live for a shorter period of time. Copyright © 2016 European Association of Urology. Published by Elsevier B.V. All rights reserved.
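The IHC H score referenced above weights the percentage of nuclei in each staining-intensity category, giving a value between 0 and 300:

```python
def h_score(pct_weak, pct_moderate, pct_strong):
    """Immunohistochemical H score (0-300) from the percentage of nuclei
    staining weakly (1+), moderately (2+), and strongly (3+)."""
    assert pct_weak + pct_moderate + pct_strong <= 100
    return 1 * pct_weak + 2 * pct_moderate + 3 * pct_strong

# Example: 20% weak, 20% moderate, 25% strong -> 20 + 40 + 75 = 135,
# matching the median CRPC score reported in the abstract
print(h_score(20, 20, 25))  # -> 135
```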

  3. Validation of satellite SAR offshore wind speed maps against in-situ data, microscale and mesoscale model results

    Energy Technology Data Exchange (ETDEWEB)

    Hasager, C B; Astrup, P; Barthelmie, R; Dellwik, E; Hoffmann Joergensen, B; Gylling Mortensen, N; Nielsen, M; Pryor, S; Rathmann, O

    2002-05-01

    A validation study has been performed in order to investigate the precision and accuracy of the satellite-derived ERS-2 SAR wind products in offshore regions. The overall project goal is to develop a method for utilizing the satellite wind speed maps for offshore wind resource estimation, e.g. in future planning of offshore wind farms. The report describes the validation analysis in detail for three sites in Denmark, Italy and Egypt. The site in Norway is analyzed by the Nansen Environmental and Remote Sensing Centre (NERSC). Wind speed maps and wind direction maps from Earth Observation data recorded by the ERS-2 SAR satellite have been obtained from the NERSC. For the Danish site the wind speed and wind direction maps have been compared to in-situ observations from a met-mast at Horns Rev in the North Sea located 14 km offshore. The SAR wind speeds have been area-averaged by simple and advanced footprint modelling, i.e. the upwind conditions to the meteorological mast are explicitly averaged in the SAR wind speed maps before comparison. The comparison results are very promising, with a standard error of ±0.61 m s⁻¹, a bias of ~2 m s⁻¹ and R² ~0.88 between in-situ wind speed observations and SAR footprint-averaged values at the 10 m level. Wind speeds predicted by the local-scale model LINCOM and the mesoscale model KAMM2 have been compared to the spatial variations in the SAR wind speed maps, and a good correspondence between SAR observations and model results is found. Near the coast there is an 800 m wide band in which the SAR wind speed observations have a strong negative bias. The bathymetry of Horns Rev combined with tidal currents gives rise to bias in the SAR wind speed maps near areas of shallow, complex bottom topography in some cases. A total of 16 cases were analyzed for Horns Rev. For Maddalena in Italy five cases were analyzed. At the Italian site the SAR wind speed maps were compared to WAsP and KAMM2 model results. The WAsP model
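The comparison statistics quoted above (bias, standard error, and R² between mast and SAR wind speeds) can be computed as in this sketch; the wind-speed pairs are illustrative numbers, not the study's data:

```python
import numpy as np

def validation_stats(observed, predicted):
    """Bias, standard error of the differences, and squared correlation,
    as used when comparing SAR-retrieved wind speeds with mast data."""
    observed, predicted = np.asarray(observed), np.asarray(predicted)
    diff = predicted - observed
    bias = diff.mean()
    std_err = diff.std(ddof=1)
    r2 = np.corrcoef(observed, predicted)[0, 1] ** 2
    return bias, std_err, r2

# Illustrative 10 m wind speeds (m/s): mast observations vs SAR footprint averages
mast = [5.1, 6.8, 7.9, 9.2, 10.5, 12.0]
sar  = [5.4, 6.6, 8.3, 9.0, 10.9, 12.3]
bias, se, r2 = validation_stats(mast, sar)
print(f"bias={bias:.2f} m/s, std err={se:.2f} m/s, R^2={r2:.3f}")
```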

  4. Analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-15

    This book explains analytical chemistry in ten chapters, which deal with the development of analytical chemistry; the theory of error, with definitions and classifications; samples and their treatment; gravimetry, covering the general process of gravimetry in aqueous and non-aqueous solution; precipitation titration, covering precipitation reactions and their types; complexometry, including complex compounds; oxidation-reduction equilibria, covering electrode potential and potentiometric titration; solvent extraction; chromatography; and experiments, with basic operations for the chemical laboratory.

  5. Analytical chemistry

    International Nuclear Information System (INIS)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-01

    This book explains analytical chemistry in ten chapters, which deal with the development of analytical chemistry; the theory of error, with definitions and classifications; samples and their treatment; gravimetry, covering the general process of gravimetry in aqueous and non-aqueous solution; precipitation titration, covering precipitation reactions and their types; complexometry, including complex compounds; oxidation-reduction equilibria, covering electrode potential and potentiometric titration; solvent extraction; chromatography; and experiments, with basic operations for the chemical laboratory.

  6. Validation and verification of MCNP6 against intermediate and high-energy experimental data and results by other codes

    International Nuclear Information System (INIS)

    Mashnik, Stepan G.

    2011-01-01

    MCNP6, the latest and most advanced LANL transport code, representing a recent merger of MCNP5 and MCNPX, has been validated and verified (V and V) against a variety of intermediate- and high-energy experimental data and against results from different versions of MCNPX and other codes. In the present work, we perform V and V of MCNP6 using mainly the latest modifications of the Cascade-Exciton Model (CEM) and of the Los Alamos version of the Quark-Gluon String Model (LAQGSM) event generators, CEM03.02 and LAQGSM03.03. We found that MCNP6 describes reasonably well various reactions induced by particles and nuclei at incident energies from 18 MeV to about 1 TeV per nucleon, measured on thin and thick targets, and agrees very well with similar results obtained with MCNPX and with calculations by CEM03.02, LAQGSM03.01 (03.03), INCL4 + ABLA, Bertini INC + Dresner evaporation, EPAX, ABRABLA, HIPSE, and AMD used as stand-alone codes. Most of the computational bugs and more serious physics problems observed in MCNP6/X during our V and V have been fixed; we continue our work to solve all the known problems before MCNP6 is distributed to the public. (author)

  7. Theoretically Guided Analytical Method Development and Validation for the Estimation of Rifampicin in a Mixture of Isoniazid and Pyrazinamide by UV Spectrophotometer.

    Science.gov (United States)

    Khan, Mohammad F; Rita, Shamima A; Kayser, Md Shahidulla; Islam, Md Shariful; Asad, Sharmeen; Bin Rashid, Ridwan; Bari, Md Abdul; Rahman, Muhammed M; Al Aman, D A Anwar; Setu, Nurul I; Banoo, Rebecca; Rashid, Mohammad A

    2017-01-01

    A simple, rapid, economic, accurate, and precise method for the estimation of rifampicin in a mixture of isoniazid and pyrazinamide by a UV spectrophotometric technique (guided by theoretical investigation of physicochemical properties) was developed and validated. Theoretical investigations revealed that isoniazid and pyrazinamide were both freely soluble in water and slightly soluble in ethyl acetate, whereas rifampicin was practically insoluble in water but freely soluble in ethyl acetate. This indicates that ethyl acetate is an effective solvent for the extraction of rifampicin from an aqueous mixture of isoniazid and pyrazinamide. A computational study indicated that a pH range of 6.0-8.0 would favor the extraction of rifampicin. Rifampicin is separated from isoniazid and pyrazinamide at pH 7.4 ± 0.1 by extraction with ethyl acetate. The ethyl acetate layer was then analyzed at a λmax of 344.0 nm. The developed method was validated for linearity, accuracy, and precision according to ICH guidelines. The proposed method exhibited good linearity over the concentration range of 2.5-35.0 μg/mL. The intraday and inter-day precision in terms of % RSD ranged from 1.09 to 1.70% and 1.63 to 2.99%, respectively. The accuracy (in terms of recovery) of the method varied from 96.7 ± 0.9 to 101.1 ± 0.4%. The LOD and LOQ were found to be 0.83 and 2.52 μg/mL, respectively. In addition, the developed method was successfully applied to determine rifampicin in combination brands (with isoniazid and pyrazinamide) available in Bangladesh.
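The reported LOD and LOQ follow the standard ICH Q2 formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the calibration line and S its slope. The σ and slope below are hypothetical values chosen only to reproduce the reported limits, not the paper's raw data:

```python
def lod_loq(sigma, slope):
    """ICH Q2(R1) detection and quantitation limits from the residual
    standard deviation (sigma) of the calibration line and its slope."""
    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope
    return lod, loq

# Hypothetical calibration statistics chosen to reproduce the reported
# limits (LOD 0.83, LOQ 2.52 ug/mL)
sigma, slope = 0.00755, 0.030
lod, loq = lod_loq(sigma, slope)
print(f"LOD={lod:.2f} ug/mL, LOQ={loq:.2f} ug/mL")  # -> LOD=0.83 ug/mL, LOQ=2.52 ug/mL
```

Note the fixed ratio LOQ/LOD = 10/3.3 ≈ 3.03, which the reported pair (2.52/0.83 ≈ 3.04) respects.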

  8. Confirmatory factor analytic investigation of variance composition, gender invariance, and validity of the Male Role Norms Inventory-Adolescent-revised (MRNI-A-r).

    Science.gov (United States)

    Levant, Ronald F; McDermott, Ryon C; Hewitt, Amber A; Alto, Kathleen M; Harris, Kyle T

    2016-10-01

    Confirmatory factor analysis of responses to the Male Role Norms Inventory-Adolescent-revised (MRNI-A-r) from 384 middle school students (163 boys, 221 girls) indicated that the best fit to the data was a bifactor model incorporating the hypothesized 3-factor structure while explicitly modeling an additional, general factor. Specifically, each item-level indicator loaded simultaneously on 2 factors: a general traditional masculinity ideology factor and a specific factor corresponding to 1 of the 3 hypothesized masculine norms for adolescents: Emotionally Detached Dominance, Toughness, and Avoidance of Femininity. Invariance testing across gender supported metric invariance for the general factor only. Although item loadings on the general factor were similar across boys and girls, the specific factor loadings varied substantially, with many becoming nonsignificant in the presence of the general factor for girls. A structural regression analysis predicting latent variables of the Meanings of Adolescent Masculinity Scale (MAMS), the Rosenberg Self-esteem Scale, and the Discipline, School Difficulties, and Positive Behavior Scale (DSDPBS) indicated that the general factor was a strong predictor of MAMS for both genders and DSDPBS for girls. Findings indicate that the MRNI-A-r general factor is a valid and reliable indicator of overall internalization of traditional masculinity ideology in adolescents; however, the specific factors may have different meanings for boys as compared with girls and lack validity in the presence of the general factor. These findings are consistent with a developmental perspective of gender ideology that views adolescence as a time when a differentiated cognitive schema of masculine norms is beginning to develop. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  9. Assessing generalized anxiety disorder in elderly people using the GAD-7 and GAD-2 scales: results of a validation study.

    Science.gov (United States)

    Wild, Beate; Eckl, Anne; Herzog, Wolfgang; Niehoff, Dorothea; Lechner, Sabine; Maatouk, Imad; Schellberg, Dieter; Brenner, Hermann; Müller, Heiko; Löwe, Bernd

    2014-10-01

    The aim of this study was to evaluate the validity of the seven-item Generalized Anxiety Disorder scale (GAD-7) and its two core items (GAD-2) for detecting GAD in elderly people. A criterion-standard study was performed between May and December of 2010 on a general elderly population living at home. A subsample of 438 elderly persons (ages 58-82) of the large population-based German ESTHER study was included in the study. The GAD-7 was administered to participants as part of a home visit. A telephone-administered structured clinical interview was subsequently conducted by a blinded interviewer. The structured clinical interview (SCID) diagnosis of GAD constituted the criterion standard to determine sensitivity and specificity of the GAD-7 and the GAD-2 scales. Twenty-seven participants met the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition criteria for current GAD according to the SCID interview (6.2%; 95% confidence interval [CI]: 3.9%-8.2%). For the GAD-7, a cut point of five or greater appeared to be optimal for detecting GAD. At this cut point the sensitivity of the GAD-7 was 0.63 and the specificity was 0.90. Correspondingly, the optimal cut point for the GAD-2 was two or greater, with a sensitivity of 0.67 and a specificity of 0.90. The areas under the curve were 0.88 (95% CI: 0.83-0.93) for the GAD-7 and 0.87 (95% CI: 0.80-0.94) for the GAD-2. Increased scores on both GAD scales were strongly associated with mental-health-related quality of life (p < 0.0001). Our results establish the validity of both the GAD-7 and the GAD-2 in elderly persons. Results of this study show that the recommended cut points of the GAD-7 and the GAD-2 for detecting GAD should be lowered for the elderly general population. Copyright © 2014 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.
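Sensitivity and specificity at a given cut point, and the choice of an optimal cut point (here by Youden's J), can be sketched as follows; the scores and diagnoses are toy data, not the ESTHER sample:

```python
def sens_spec(scores, has_gad, cut):
    """Sensitivity and specificity of a questionnaire at a given cut
    point, against a structured-interview (SCID) reference diagnosis."""
    tp = sum(1 for s, d in zip(scores, has_gad) if d and s >= cut)
    fn = sum(1 for s, d in zip(scores, has_gad) if d and s < cut)
    tn = sum(1 for s, d in zip(scores, has_gad) if not d and s < cut)
    fp = sum(1 for s, d in zip(scores, has_gad) if not d and s >= cut)
    return tp / (tp + fn), tn / (tn + fp)

# Toy data: GAD-2 scores (0-6) and SCID diagnoses -- illustrative only
scores  = [0, 1, 1, 2, 3, 0, 2, 4, 5, 1]
has_gad = [False, False, False, True, True, False, False, True, True, False]

# Choose the cut point maximizing Youden's J = sensitivity + specificity - 1
best = max(range(7), key=lambda c: sum(sens_spec(scores, has_gad, c)))
sens, spec = sens_spec(scores, has_gad, best)
print(f"optimal cut >= {best}: sensitivity {sens:.2f}, specificity {spec:.2f}")
```

In the actual study the cut point was read off a receiver operating characteristic analysis; Youden's J is one common criterion for that choice.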

  10. Non-Rhabdomyosarcoma Soft Tissue Sarcomas in Children: A Surveillance, Epidemiology, and End Results Analysis Validating COG Risk Stratifications

    Energy Technology Data Exchange (ETDEWEB)

    Waxweiler, Timothy V., E-mail: timothy.waxweiler@ucdenver.edu [Department of Radiation Oncology, University of Colorado School of Medicine, Aurora, Colorado (United States); Rusthoven, Chad G. [Department of Radiation Oncology, University of Colorado School of Medicine, Aurora, Colorado (United States); Proper, Michelle S. [Department of Radiation Oncology, Billings Clinic, Billings, Montana (United States); Cost, Carrye R. [Division of Hematology and Oncology, Department of Pediatrics, University of Colorado Denver School of Medicine, Aurora, Colorado (United States); Cost, Nicholas G. [Division of Urology, Department of Surgery, University of Colorado Denver School of Medicine, Aurora, Colorado (United States); Donaldson, Nathan [Department of Orthopedics, University of Colorado Denver School of Medicine, Aurora, Colorado (United States); Garrington, Timothy; Greffe, Brian S. [Division of Hematology and Oncology, Department of Pediatrics, University of Colorado Denver School of Medicine, Aurora, Colorado (United States); Heare, Travis [Department of Orthopedics, University of Colorado Denver School of Medicine, Aurora, Colorado (United States); Macy, Margaret E. [Division of Hematology and Oncology, Department of Pediatrics, University of Colorado Denver School of Medicine, Aurora, Colorado (United States); Liu, Arthur K. [Department of Radiation Oncology, University of Colorado School of Medicine, Aurora, Colorado (United States)

    2015-06-01

    Purpose: Non-rhabdomyosarcoma soft tissue sarcomas (NRSTS) are a heterogeneous group of sarcomas that encompass over 35 histologies. With an incidence of ∼500 cases per year in the United States in those <20 years of age, NRSTS are rare and therefore difficult to study in pediatric populations. We used the large Surveillance, Epidemiology, and End Results (SEER) database to validate the prognostic ability of the Children's Oncology Group (COG) risk classification system and to define patient, tumor, and treatment characteristics. Methods and Materials: From SEER data from 1988 to 2007, we identified patients ≤18 years of age with NRSTS. Data for age, sex, year of diagnosis, race, registry, histology, grade, primary size, primary site, stage, radiation therapy, and survival outcomes were analyzed. Patients with nonmetastatic grossly resected low-grade tumors of any size or high-grade tumors ≤5 cm were considered low risk. Cases of nonmetastatic tumors that were high grade, >5 cm, or unresectable were considered intermediate risk. Patients with nodal or distant metastases were considered high risk. Results: A total of 941 patients met the review criteria. On univariate analysis, black race, malignant peripheral nerve sheath (MPNST) histology, tumors >5 cm, nonextremity primary, lymph node involvement, radiation therapy, and higher risk group were associated with significantly worse overall survival (OS) and cancer-specific survival (CSS). On multivariate analysis, MPNST histology, chemotherapy-resistant histology, and higher risk group were significantly poor prognostic factors for OS and CSS. Compared to low-risk patients, intermediate patients showed poorer OS (hazard ratio [HR]: 6.08, 95% confidence interval [CI]: 3.53-10.47, P<.001) and CSS (HR: 6.27; 95% CI: 3.44-11.43, P<.001), and high-risk patients had the worst OS (HR: 13.35, 95% CI: 8.18-21.76, P<.001) and CSS (HR: 14.65, 95% CI: 8.49-25.28, P<.001). Conclusions: The current COG risk group
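The risk rules summarized in the Methods can be written down directly. This is a simplified sketch of the stratification as described in the abstract, for illustration only, not clinical software:

```python
def cog_risk_group(metastatic, high_grade, size_cm, grossly_resected):
    """Simplified COG risk stratification for pediatric NRSTS as
    summarized in the abstract (illustrative only)."""
    if metastatic:                                   # nodal or distant metastases
        return "high"
    if grossly_resected and (not high_grade or size_cm <= 5):
        return "low"                                 # low grade any size, or high grade <= 5 cm
    return "intermediate"                            # high grade > 5 cm, or unresectable

print(cog_risk_group(False, False, 12.0, True))   # resected low grade -> low
print(cog_risk_group(False, True, 8.0, True))     # high grade > 5 cm -> intermediate
print(cog_risk_group(True, True, 3.0, True))      # metastatic -> high
```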

  11. P185-M Protein Identification and Validation of Results in Workflows that Integrate over Various Instruments, Datasets, Search Engines

    Science.gov (United States)

    Hufnagel, P.; Glandorf, J.; Körting, G.; Jabs, W.; Schweiger-Hufnagel, U.; Hahner, S.; Lubeck, M.; Suckau, D.

    2007-01-01

    Analysis of complex proteomes often results in long protein lists, but falls short in measuring the validity of identification and quantification results on a greater number of proteins. Biological and technical replicates are mandatory, as is the combination of the MS data from various workflows (gels, 1D-LC, 2D-LC), instruments (TOF/TOF, trap, qTOF or FTMS), and search engines. We describe a database-driven study that combines two workflows, two mass spectrometers, and four search engines with protein identification following a decoy database strategy. The sample was a tryptic
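A decoy-database strategy estimates the false discovery rate from the number of decoy matches passing a score threshold, since decoys can only be false positives. A minimal sketch with toy scores (one common simple estimator; published variants differ in detail):

```python
def decoy_fdr(target_scores, decoy_scores, threshold):
    """Target-decoy FDR estimate: decoys passing the score threshold
    approximate the number of false target identifications."""
    t = sum(1 for s in target_scores if s >= threshold)
    d = sum(1 for s in decoy_scores if s >= threshold)
    return d / t if t else 0.0

# Toy search-engine scores -- illustrative only
targets = [72, 65, 58, 50, 44, 40, 33, 30, 28, 25]
decoys  = [41, 29, 22, 18, 15, 12, 10, 9, 7, 5]
print(f"FDR at score >= 30: {decoy_fdr(targets, decoys, 30):.3f}")
```

In practice the threshold is swept until the estimated FDR falls below a chosen level (often 1%), and results from several search engines are combined at that matched confidence.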