WorldWideScience

Sample records for previously validated method

  1. 40 CFR 152.93 - Citation of a previously submitted valid study.

    Science.gov (United States)

    2010-07-01

    ... Data Submitters' Rights § 152.93 Citation of a previously submitted valid study. An applicant may demonstrate compliance for a data requirement by citing a valid study previously submitted to the Agency. The... the original data submitter, the applicant may cite the study only in accordance with paragraphs (b...

  2. Validation of the Online version of the Previous Day Food Questionnaire for schoolchildren

    Directory of Open Access Journals (Sweden)

    Raquel ENGEL

    ABSTRACT Objective To evaluate the validity of the web-based version of the Previous Day Food Questionnaire Online for schoolchildren from the 2nd to 5th grades of elementary school. Methods Participants were 312 schoolchildren aged 7 to 12 years of a public school from the city of Florianópolis, Santa Catarina, Brazil. Validity was assessed by sensitivity and specificity, as well as by agreement rates (match, omission, and intrusion rates) of food items reported by children on the Previous Day Food Questionnaire Online, using direct observation of foods/beverages eaten during school meals (mid-morning snack or afternoon snack) on the previous day as the reference. Multivariate multinomial logistic regression analysis was used to evaluate the influence of participants' characteristics on omission and intrusion rates. Results The results showed adequate sensitivity (67.7%) and specificity (95.2%). There were low omission and intrusion rates of 22.8% and 29.5%, respectively, when all food items were analyzed. Pizza/hamburger showed the highest omission rate, whereas milk and milk products showed the highest intrusion rate. The participants who attended school in the afternoon shift presented a higher probability of intrusion compared to their peers who attended school in the morning. Conclusion The Previous Day Food Questionnaire Online possessed satisfactory validity for the assessment of food intake at the group level in schoolchildren from the 2nd to 5th grades of public school.
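
    A minimal sketch (not the authors' code) of how the agreement measures named above are typically computed from a matched pair of item lists, assuming omission = eaten but not reported and intrusion = reported but not eaten; the food items and values are hypothetical.

        # Hypothetical illustration of sensitivity, specificity, and match/omission/intrusion rates.
        def questionnaire_agreement(observed, reported, all_items):
            """observed, reported: sets of food items for one child; all_items: full questionnaire item list."""
            tp = len(observed & reported)                      # eaten and reported (matches)
            fn = len(observed - reported)                      # eaten but not reported (omissions)
            fp = len(reported - observed)                      # reported but not eaten (intrusions)
            tn = len(set(all_items) - observed - reported)     # neither eaten nor reported
            return {
                "sensitivity": tp / (tp + fn) if tp + fn else None,
                "specificity": tn / (tn + fp) if tn + fp else None,
                "match_rate": tp / len(observed) if observed else None,
                "omission_rate": fn / len(observed) if observed else None,
                "intrusion_rate": fp / len(reported) if reported else None,
            }

        # One child observed eating rice, beans and milk, but reporting rice, beans and pizza.
        print(questionnaire_agreement({"rice", "beans", "milk"},
                                      {"rice", "beans", "pizza"},
                                      ["rice", "beans", "milk", "pizza", "fruit"]))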

  3. Valid methods: the quality assurance of test method development, validation, approval, and transfer for veterinary testing laboratories.

    Science.gov (United States)

    Wiegers, Ann L

    2003-07-01

    Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required and their use agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.

  4. Analytical difficulties facing today's regulatory laboratories: issues in method validation.

    Science.gov (United States)

    MacNeil, James D

    2012-08-01

    The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as development and maintenance of expertise, maintenance and updating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context which includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation of the method or on the design of a validation scheme for a complex multi-residue method require a well-considered strategy, based on a current knowledge of international guidance documents and regulatory requirements, as well as the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of change or modification of a method, whether that change or modification may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameters will require re-validation. Some typical situations involving change in methods are discussed and a decision process proposed for selection of appropriate validation measures. © 2012 John Wiley & Sons, Ltd.

  5. Validity in Mixed Methods Research in Education: The Application of Habermas' Critical Theory

    Science.gov (United States)

    Long, Haiying

    2017-01-01

    Mixed methods approach has developed into the third methodological movement in educational research. Validity in mixed methods research as an important issue, however, has not been examined as extensively as that of quantitative and qualitative research. Additionally, the previous discussions of validity in mixed methods research focus on research…

  6. Developing and validating a nutrition knowledge questionnaire: key methods and considerations.

    Science.gov (United States)

    Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina

    2017-10-01

    To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire has been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
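
    As a rough illustration of step (vi), the sketch below computes classical item difficulty (proportion answering correctly) and discrimination (corrected item-total correlation) for a scored questionnaire; the data and cut-off values are hypothetical, not taken from the paper. It assumes NumPy is available.

        import numpy as np

        def item_analysis(responses):
            """responses: respondents x items array of 0/1 scores."""
            responses = np.asarray(responses, dtype=float)
            difficulty = responses.mean(axis=0)                  # proportion correct per item
            totals = responses.sum(axis=1)
            discrimination = []
            for j in range(responses.shape[1]):
                rest = totals - responses[:, j]                  # total score excluding the item itself
                discrimination.append(np.corrcoef(responses[:, j], rest)[0, 1])
            return difficulty, np.array(discrimination)

        scores = np.random.default_rng(0).integers(0, 2, size=(200, 10))   # invented response data
        difficulty, discrimination = item_analysis(scores)
        print(difficulty.round(2), discrimination.round(2))
        # Items that nearly everyone gets right or wrong (e.g. difficulty > 0.9 or < 0.2), or that
        # correlate poorly with the rest of the scale (e.g. < 0.2), would be candidates for revision.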

  7. Validation of commercially available automated canine-specific immunoturbidimetric method for measuring canine C-reactive protein

    DEFF Research Database (Denmark)

    Hillström, Anna; Hagman, Ragnvi; Tvedten, Harold

    2014-01-01

    BACKGROUND: Measurement of C-reactive protein (CRP) is used for diagnosing and monitoring systemic inflammatory disease in canine patients. An automated human immunoturbidimetric assay has been validated for measuring canine CRP, but cross-reactivity with canine CRP is unpredictable. OBJECTIVE: The purpose of the study was to validate a new automated canine-specific immunoturbidimetric CRP method (Gentian cCRP). METHODS: Studies of imprecision, accuracy, prozone effect, interference, limit of quantification, and stability under different storage conditions were performed. The new method was compared with a human CRP assay previously validated for canine CRP determination. Samples from 40 healthy dogs were analyzed to establish a reference interval. RESULTS: Total imprecision was
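
    A simplified sketch of two of the checks mentioned (total imprecision as a coefficient of variation, and mean bias against the previously validated comparison method); it is not the Gentian cCRP protocol, and all values are invented.

        import statistics

        def cv_percent(replicates):
            # Imprecision of a control material expressed as %CV.
            return 100 * statistics.stdev(replicates) / statistics.mean(replicates)

        def mean_bias(new_method, comparison_method):
            # Average difference on paired samples measured by both assays.
            return statistics.mean(n - c for n, c in zip(new_method, comparison_method))

        control = [42.1, 40.8, 43.0, 41.5, 42.4]         # hypothetical CRP control replicates (mg/L)
        new = [12.0, 55.3, 101.7, 20.4]                  # hypothetical paired patient results
        old = [11.5, 57.0, 98.9, 21.0]
        print(f"total imprecision CV = {cv_percent(control):.1f}%")
        print(f"mean bias vs comparison method = {mean_bias(new, old):+.2f} mg/L")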

  8. Space Suit Joint Torque Measurement Method Validation

    Science.gov (United States)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design met the requirements. However, because the original test was set up and conducted by a single test operator there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data was compared using graphical and statistical analysis; the results indicated a significant variance in values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether or not the proposed joint torque methodology can be applied to future space suit development contracts.

  9. Validation of DRAGON side-step method for Bruce-A restart Phase-B physics tests

    International Nuclear Information System (INIS)

    Shen, W.; Ngo-Trong, C.; Davis, R.S.

    2004-01-01

    The DRAGON side-step method, developed at AECL, has a number of advantages over the all-DRAGON method that was used before. It is now the qualified method for reactivity-device calculations. Although the side-step-method-generated incremental cross sections have been validated against those previously calculated with the all-DRAGON method, it is highly desirable to validate the side-step method against device-worth measurements in power reactors directly. In this paper, the DRAGON side-step method was validated by comparison with the device-calibration measurements made in Bruce-A NGS Unit 4 restart Phase-B commissioning in 2003. The validation exercise showed excellent results, with the DRAGON code overestimating the measured ZCR worth by ∼5%. A sensitivity study was also performed in this paper to assess the effect of various DRAGON modelling techniques on the incremental cross sections. The assessment shows that the refinement of meshes in 3-D and the use of the side-step method are two major reasons contributing to the improved agreement between the calculated ZCR worths and the measurements. Use of different DRAGON versions, DRAGON libraries, local-parameter core conditions, and weighting techniques for the homogenization of tube clusters inside the ZCR have a very small effect on the ZCR incremental thermal absorption cross section and ZCR reactivity worth. (author)

  10. Potential of accuracy profile for method validation in inductively coupled plasma spectrochemistry

    International Nuclear Information System (INIS)

    Mermet, J.M.; Granier, G.

    2012-01-01

    Method validation is usually performed over a range of concentrations for which analytical criteria must be verified. One important criterion in quantitative analysis is accuracy, i.e. the contribution of both trueness and precision. The study of accuracy over this range is called an accuracy profile and provides experimental tolerance intervals. Comparison with acceptability limits fixed by the end user defines a validity domain. This work describes the computation involved in the building of the tolerance intervals, particularly for the intermediate precision with within-laboratory experiments and for the reproducibility with interlaboratory studies. Computation is based on ISO 5725‐4 and on previously published work. Moreover, the bias uncertainty is also computed to verify the bias contribution to accuracy. The various types of accuracy profile behavior are exemplified with results obtained by using ICP-MS and ICP-AES. This procedure allows the analyst to define unambiguously a validity domain for a given accuracy. However, because the experiments are time-consuming, the accuracy profile method is mainly dedicated to method validation. - Highlights: ► An analytical method is defined by its accuracy, i.e. both trueness and precision. ► The accuracy as a function of an analyte concentration is an accuracy profile. ► Profile basic concepts are explained for trueness and intermediate precision. ► Profile-based tolerance intervals have to be compared with acceptability limits. ► Typical accuracy profiles are given for both ICP-AES and ICP-MS techniques.
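
    A deliberately simplified sketch of the accuracy-profile logic at a single concentration level, assuming p series of n replicates and NumPy/SciPy: intermediate-precision variance is estimated from a one-way ANOVA decomposition and an approximate beta-expectation tolerance interval (mean bias ± t·s_IP) is compared with the acceptability limits. The exact coverage factor in ISO 5725-4 and the accuracy-profile literature is more elaborate; this only illustrates the structure of the computation, and the data are invented.

        import numpy as np
        from scipy import stats

        def tolerance_interval(results, reference, beta=0.80):
            """results: p x n array of measured concentrations at one level (p series, n replicates)."""
            results = np.asarray(results, dtype=float)
            p, n = results.shape
            s_within2 = results.var(axis=1, ddof=1).mean()                 # repeatability variance
            s_between2 = max(results.mean(axis=1).var(ddof=1) - s_within2 / n, 0.0)
            s_ip = np.sqrt(s_within2 + s_between2)                         # intermediate precision
            bias = results.mean() - reference
            t = stats.t.ppf((1 + beta) / 2, df=p * n - 1)                  # simplified choice of df
            return bias - t * s_ip, bias + t * s_ip

        level = [[9.8, 10.1, 10.0], [10.4, 10.2, 10.5], [9.7, 9.9, 10.0]]  # hypothetical data
        low, high = tolerance_interval(level, reference=10.0)
        acceptability = 0.15 * 10.0                                        # e.g. +/-15% acceptability limits
        print("level inside validity domain:", -acceptability <= low and high <= acceptability)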

  11. Validation of a rapid, non-radioactive method to quantify internalisation of G-protein coupled receptors

    NARCIS (Netherlands)

    Jongsma, Maikel; Florczyk, Urszula M.; Hendriks-Balk, Marieelle C.; Michel, Martin C.; Peters, Stephan L. M.; Alewijnse, Astrid E.

    2007-01-01

    Agonist exposure can cause internalisation of G-protein coupled receptors (GPCRs), which may be a part of desensitisation but also of cellular signaling. Previous methods to study internalisation have been tedious or only poorly quantitative. Therefore, we have developed and validated a quantitative

  12. Validation for chromatographic and electrophoretic methods

    OpenAIRE

    Ribani, Marcelo; Bottoli, Carla Beatriz Grespan; Collins, Carol H.; Jardim, Isabel Cristina Sales Fontes; Melo, Lúcio Flávio Costa

    2004-01-01

    The validation of an analytical method is fundamental to implementing a quality control system in any analytical laboratory. As the separation techniques, GC, HPLC and CE, are often the principal tools used in such determinations, procedure validation is a necessity. The objective of this review is to describe the main aspects of validation in chromatographic and electrophoretic analysis, showing, in a general way, the similarities and differences between the guidelines established by the dif...

  13. Convergent validity test, construct validity test and external validity test of the David Liberman algorithm

    Directory of Open Access Journals (Sweden)

    David Maldavsky

    2013-08-01

    The author first presents a complement to a previous test of convergent validity, then a construct validity test, and finally an external validity test of the David Liberman algorithm (DLA). The first part of the paper focuses on a complementary aspect, the differential sensitivity of the DLA (1) in an external comparison (to other methods) and (2) in an internal comparison (between two ways of using the same method, the DLA). The construct validity test presents the concepts underlying the DLA, their operationalization, and some corrections emerging from several empirical studies we carried out. The external validity test examines the possibility of using the investigation of a single case and its relation with the investigation of a more extended sample.

  14. Development and Validation of a Dissolution Test Method for ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a dissolution test method for dissolution release of artemether and lumefantrine from tablets. Methods: A single dissolution method for evaluating the in vitro release of artemether and lumefantrine from tablets was developed and validated. The method comprised of a dissolution medium of ...

  15. Practical procedure for method validation in INAA- A tutorial

    International Nuclear Information System (INIS)

    Petroni, Robson; Moreira, Edson G.

    2015-01-01

    This paper describes the procedure employed by the Neutron Activation Laboratory at the Nuclear and Energy Research Institute (LAN, IPEN - CNEN/SP) for validation of Instrumental Neutron Activation Analysis (INAA) methods. According to the recommendations of ISO/IEC 17025, the method performance characteristics (limit of detection, limit of quantification, trueness, repeatability, intermediate precision, reproducibility, selectivity, linearity and uncertainty budget) were outlined in an easy, fast and convenient way. The paper presents step by step how to calculate the required method performance characteristics in a process of method validation, and what the procedures, adopted strategies and acceptance criteria for the results are; that is, how to carry out a method validation in INAA. In order to exemplify the methodology applied, the results obtained for the validation of the method for mass fraction determination of Co, Cr, Fe, Rb, Se and Zn in biological matrix samples, using an internal reference material of mussel tissue, are presented. It was concluded that the methodology applied for validation of INAA methods is suitable, meeting all the requirements of ISO/IEC 17025, and thereby generating satisfactory results for the studies carried out at LAN, IPEN - CNEN/SP. (author)
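
    As a hedged illustration (not the LAN/IPEN procedure itself), the sketch below shows typical acceptance checks against a reference material: relative bias for trueness, relative standard deviation for repeatability, and a zeta-score that also accounts for the measurement and certificate uncertainties. All numbers are invented.

        import math

        def relative_bias(measured_mean, certified):
            return 100 * (measured_mean - certified) / certified

        def rsd(values):
            mean = sum(values) / len(values)
            s = math.sqrt(sum((v - mean) ** 2 for v in values) / (len(values) - 1))
            return 100 * s / mean

        def zeta_score(measured_mean, u_measured, certified, u_certified):
            return (measured_mean - certified) / math.sqrt(u_measured ** 2 + u_certified ** 2)

        zn = [123.0, 119.5, 125.2, 121.8]                # hypothetical Zn mass fractions, mg/kg
        mean_zn = sum(zn) / len(zn)
        print(f"relative bias = {relative_bias(mean_zn, certified=122.0):.1f}%")
        print(f"repeatability RSD = {rsd(zn):.1f}%")
        print(f"zeta = {zeta_score(mean_zn, 2.0, 122.0, 1.5):.2f}")   # |zeta| <= 2 is a common criterion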

  16. Practical procedure for method validation in INAA- A tutorial

    Energy Technology Data Exchange (ETDEWEB)

    Petroni, Robson; Moreira, Edson G., E-mail: robsonpetroni@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    This paper describes the procedure employed by the Neutron Activation Laboratory at the Nuclear and Energy Research Institute (LAN, IPEN - CNEN/SP) for validation of Instrumental Neutron Activation Analysis (INAA) methods. According to the recommendations of ISO/IEC 17025, the method performance characteristics (limit of detection, limit of quantification, trueness, repeatability, intermediate precision, reproducibility, selectivity, linearity and uncertainty budget) were outlined in an easy, fast and convenient way. The paper presents step by step how to calculate the required method performance characteristics in a process of method validation, and what the procedures, adopted strategies and acceptance criteria for the results are; that is, how to carry out a method validation in INAA. In order to exemplify the methodology applied, the results obtained for the validation of the method for mass fraction determination of Co, Cr, Fe, Rb, Se and Zn in biological matrix samples, using an internal reference material of mussel tissue, are presented. It was concluded that the methodology applied for validation of INAA methods is suitable, meeting all the requirements of ISO/IEC 17025, and thereby generating satisfactory results for the studies carried out at LAN, IPEN - CNEN/SP. (author)

  17. Validation of the Rotation Ratios Method

    International Nuclear Information System (INIS)

    Foss, O.A.; Klaksvik, J.; Benum, P.; Anda, S.

    2007-01-01

    Background: The rotation ratios method describes rotations between pairs of sequential pelvic radiographs. The method seems promising but has not been validated. Purpose: To validate the accuracy of the rotation ratios method. Material and Methods: Known pelvic rotations between 165 radiographs obtained from five skeletal pelvises in an experimental material were compared with the corresponding calculated rotations to describe the accuracy of the method. The results from a clinical material of 262 pelvic radiographs from 46 patients defined the ranges of rotational differences compared. Repeated analyses, both on the experimental and the clinical material, were performed using the selected reference points to describe the robustness and the repeatability of the method. Results: The reference points were easy to identify and barely influenced by pelvic rotations. The mean differences between calculated and real pelvic rotations were 0.0 deg (SD 0.6) for vertical rotations and 0.1 deg (SD 0.7) for transversal rotations in the experimental material. The intra- and interobserver repeatability of the method was good. Conclusion: The accuracy of the method was reasonably high, and the method may prove to be clinically useful

  18. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.; Adams, M.L.; McClarren, R.G.; Mallick, B.K.

    2011-01-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework

  19. Method validation in pharmaceutical analysis: from theory to practical optimization

    Directory of Open Access Journals (Sweden)

    Jaqueline Kaleian Eserian

    2015-01-01

    The validation of analytical methods is required to obtain high-quality data. For the pharmaceutical industry, method validation is crucial to ensure the product quality as regards both therapeutic efficacy and patient safety. The most critical step in validating a method is to establish a protocol containing well-defined procedures and criteria. A well planned and organized protocol, such as the one proposed in this paper, results in a rapid and concise method validation procedure for quantitative high performance liquid chromatography (HPLC) analysis. Type: Commentary

  20. Validation of NAA Method for Urban Particulate Matter

    International Nuclear Information System (INIS)

    Woro Yatu Niken Syahfitri; Muhayatun; Diah Dwiana Lestiani; Natalia Adventini

    2009-01-01

    Nuclear analytical techniques have been applied in many countries for the determination of environmental pollutants. Neutron activation analysis (NAA) is a nuclear analytical technique that has low detection limits, high specificity, high precision and accuracy for the large majority of naturally occurring elements, the ability to determine multiple elements simultaneously and non-destructively, and can handle small sample sizes (< 1 mg). To ensure the quality and reliability of the method, validation needs to be done. A standard reference material, SRM NIST 1648 Urban Particulate Matter, has been used to validate the NAA method. Accuracy and precision tests were used as validation parameters. Particulate matter was validated for 18 elements: Ti, I, V, Br, Mn, Na, K, Cl, Cu, Al, As, Fe, Co, Zn, Ag, La, Cr, and Sm. The results showed that the percent relative standard deviation of the measured elemental concentrations ranged from 2 to 14.8% for most of the elements analyzed, whereas the HorRat values were in the range 0.3-1.3. Accuracy test results showed that the relative bias ranged from -11.1 to 3.6%. Based on the validation results, it can be stated that the NAA method is reliable for the characterization of particulate matter and other samples of similar matrix to support air quality monitoring. (author)
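
    The HorRat values quoted above are, in the usual definition, the observed RSD divided by the RSD predicted by the Horwitz function; a small sketch of that check (assuming the common form PRSD = 2·C^(-0.15), with C the mass fraction as a dimensionless ratio) is given below with an invented example.

        def horrat(rsd_observed_percent, mass_fraction_ratio):
            prsd = 2.0 * mass_fraction_ratio ** -0.15      # Horwitz predicted RSD, in percent
            return rsd_observed_percent / prsd

        # Hypothetical element at 150 mg/kg (C = 1.5e-4) measured with an RSD of 6%:
        print(round(horrat(6.0, 1.5e-4), 2))               # ~0.8, within the 0.3-1.3 range reported above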

  1. Validation of method in instrumental NAA for food products sample

    International Nuclear Information System (INIS)

    Alfian; Siti Suprapti; Setyo Purwanto

    2010-01-01

    NAA is a testing method that has not been standardized. To confirm that the method is valid, it must be validated with various standard reference materials. In this work, the validation was carried out for food product samples using NIST SRM 1567a (wheat flour) and NIST SRM 1568a (rice flour). The results show that the method for testing nine elements (Al, K, Mg, Mn, Na, Ca, Fe, Se and Zn) in SRM 1567a and eight elements (Al, K, Mg, Mn, Na, Ca, Se and Zn) in SRM 1568a passes the tests of accuracy and precision. It can be concluded that this method is able to give valid results in the determination of elements in food product samples. (author)

  2. Human Factors methods concerning integrated validation of nuclear power plant control rooms; Metodutveckling foer integrerad validering [Method development for integrated validation]

    Energy Technology Data Exchange (ETDEWEB)

    Oskarsson, Per-Anders; Johansson, Bjoern J.E.; Gonzalez, Natalia (Swedish Defence Research Agency, Information Systems, Linkoeping (Sweden))

    2010-02-15

    The frame of reference for this work was existing recommendations and instructions from the NPP area, experiences from the review of the Turbic Validation, and experiences from system validations performed at the Swedish Armed Forces, e.g. concerning military control rooms and fighter pilots. These enterprises are characterized by complex systems in extreme environments, often with high risks, where human error can lead to serious consequences. A focus group was held with representatives responsible for Human Factors issues from all Swedish NPPs. The questions discussed included, among other things, for whom an integrated validation (IV) is performed and its purpose, what should be included in an IV, the comparison with baseline measures, the design process, the role of SSM, which methods of measurement should be used, and how the methods are affected by changes in the control room. The report raises various questions for discussion concerning the validation process. Supplementary methods of measurement for integrated validation are discussed, e.g. dynamic, psychophysiological, and qualitative methods for identification of problems. Supplementary methods for statistical analysis are presented. The study points out a number of deficiencies in the validation process, e.g. the need for common guidelines for validation and design, criteria for different types of measurements, clarification of the role of SSM, and recommendations for the responsibility of external participants in the validation process. The authors propose 12 measures for addressing the identified problems.

  3. A Method for Ship Collision Damage and Energy Absorption Analysis and its Validation

    DEFF Research Database (Denmark)

    Zhang, Shengming; Pedersen, Preben Terndrup

    2016-01-01

    For design evaluation there is a need for a method which is fast, practical and yet accurate enough to determine the absorbed energy and collision damage extent in ship collision analysis. The most well-known simplified empirical approach to collision analysis was made probably by Minorsky and its limitation is also well recognized. The authors have previously developed simple expressions for the relation between the absorbed energy and the damaged material volume which take into account the structural arrangements, the material properties and the damage modes. The purpose of the present paper is to re-examine this method's validity and accuracy for ship collision damage analysis in ship design assessments by comprehensive validations with the experimental results from the public domain. Twenty experimental tests have been selected, analysed and compared with the results calculated using the proposed method. It can...

  4. ASTM Validates Air Pollution Test Methods

    Science.gov (United States)

    Chemical and Engineering News, 1973

    1973-01-01

    The American Society for Testing and Materials (ASTM) has validated six basic methods for measuring pollutants in ambient air as the first part of its Project Threshold. Aim of the project is to establish nationwide consistency in measuring pollutants; determining precision, accuracy and reproducibility of 35 standard measuring methods. (BL)

  5. The Validation of NAA Method Used as Test Method in Serpong NAA Laboratory

    International Nuclear Information System (INIS)

    Rina-Mulyaningsih, Th.

    2004-01-01

    The Validation of the NAA Method Used as a Test Method in the Serpong NAA Laboratory. NAA is a non-standard testing method. The testing laboratory shall validate the methods it uses to ensure and confirm that they are suitable for the application. The NAA methods have been validated with the parameters of accuracy, precision, repeatability and selectivity. The NIST 1573a Tomato Leaves, NIES 10C Rice Flour Unpolished and standard elements were used in this testing program. The results of testing with NIST 1573a showed that the elements Na, Zn, Al and Mn met the acceptance criteria for accuracy and precision, whereas Co was rejected. The results of testing with NIES 10C showed that the Na and Zn elements met the acceptance criteria for accuracy and precision, but the Mn element was rejected. The result of the selectivity test showed that the value of quantity is between 0.1-2.5 μg, depending on the element. (author)

  6. Toward a Unified Validation Framework in Mixed Methods Research

    Science.gov (United States)

    Dellinger, Amy B.; Leech, Nancy L.

    2007-01-01

    The primary purpose of this article is to further discussions of validity in mixed methods research by introducing a validation framework to guide thinking about validity in this area. To justify the use of this framework, the authors discuss traditional terminology and validity criteria for quantitative and qualitative research, as well as…

  7. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; this would estimate the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples taken from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern, which illustrate its use: to estimate whether an approximation over- or under-fits the original model; to invalidate an approximation; and to rank possible approximations by their quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer scale models. Copyright © 2010 Elsevier Inc. All rights reserved.
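
    A generic sketch of the proposed test (the structure is mine, not the authors' code): an approximation is scored, for each parameter set, by its distance from the stochastic model's mean in units of that model's own standard deviation, so that large scores over the parameter range invalidate the approximation.

        import random
        import statistics

        def approximation_scores(parameter_sets, run_stochastic, approximate, n_runs=200):
            scores = {}
            for theta in parameter_sets:
                runs = [run_stochastic(theta) for _ in range(n_runs)]
                mu, sigma = statistics.mean(runs), statistics.stdev(runs)
                scores[theta] = (approximate(theta) - mu) / sigma if sigma > 0 else float("inf")
            return scores   # |score| well above 1 across parameter sets flags a poor approximation

        # Toy example: a noisy process with mean theta, approximated by f(theta) = theta.
        print(approximation_scores([1.0, 5.0], lambda t: random.gauss(t, 0.3), lambda t: t))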

  8. Softcopy quality ruler method: implementation and validation

    Science.gov (United States)

    Jin, Elaine W.; Keelan, Brian W.; Chen, Junqing; Phillips, Jonathan B.; Chen, Ying

    2009-01-01

    A softcopy quality ruler method was implemented for the International Imaging Industry Association (I3A) Camera Phone Image Quality (CPIQ) Initiative. This work extends ISO 20462 Part 3 by virtue of creating reference digital images of known subjective image quality, complementing the hardcopy Standard Reference Stimuli (SRS). The softcopy ruler method was developed using images from a Canon EOS 1Ds Mark II D-SLR digital still camera (DSC) and a Kodak P880 point-and-shoot DSC. Images were viewed on an Apple 30in Cinema Display at a viewing distance of 34 inches. Ruler images were made for 16 scenes. Thirty ruler images were generated for each scene, representing ISO 20462 Standard Quality Scale (SQS) values of approximately 2 to 31 at an increment of one just noticeable difference (JND) by adjusting the system modulation transfer function (MTF). A Matlab GUI was developed to display the ruler and test images side-by-side with a user-adjustable ruler level controlled by a slider. A validation study was performed at Kodak, Vista Point Technology, and Aptina Imaging in which all three companies set up a similar viewing lab to run the softcopy ruler method. The results show that the three sets of data are in reasonable agreement with each other, with the differences within the range expected from observer variability. Compared to previous implementations of the quality ruler, the slider-based user interface allows approximately 2x faster assessments with 21.6% better precision.

  9. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered to be part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Therefore, these methods can only predict the most probable faulty sensors, which are subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any), which can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundancy relations (ARRs).
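
    A minimal illustration of the analytical-redundancy idea described above (not NASA's algorithm): each redundancy relation should evaluate to roughly zero when the sensors it involves are healthy, so a sensor that appears only in violated relations can be logically inferred to be faulty. Sensor names, relations and the tolerance are hypothetical.

        def suspect_sensors(readings, relations, tol=0.05):
            """relations: list of (residual_function, names_of_involved_sensors)."""
            suspects = set(readings)
            for residual, sensors in relations:
                if abs(residual(readings)) <= tol:
                    suspects -= set(sensors)     # sensors in a satisfied relation are exonerated
            return suspects

        # Three temperature sensors that should agree pairwise; T3 has drifted.
        readings = {"T1": 20.00, "T2": 20.02, "T3": 25.10}
        relations = [
            (lambda r: r["T1"] - r["T2"], ("T1", "T2")),
            (lambda r: r["T2"] - r["T3"], ("T2", "T3")),
            (lambda r: r["T1"] - r["T3"], ("T1", "T3")),
        ]
        print(suspect_sensors(readings, relations))   # -> {'T3'}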

  10. Validated modified Lycopodium spore method development for ...

    African Journals Online (AJOL)

    Validated modified lycopodium spore method has been developed for simple and rapid quantification of herbal powdered drugs. Lycopodium spore method was performed on ingredients of Shatavaryadi churna, an ayurvedic formulation used as immunomodulator, galactagogue, aphrodisiac and rejuvenator. Estimation of ...

  11. Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods.

    Science.gov (United States)

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J Sunil

    2014-08-01

    We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called "Patient Recursive Survival Peeling" is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called "combined" cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication.
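
    A rough sketch of the "combined" cross-validation idea as described above, with the peeling and survival-estimation details abstracted away: rather than averaging a statistic over folds, the held-out samples from all folds are pooled and the statistic is computed once on that combined out-of-sample set. It assumes scikit-learn and NumPy and uses a trivial fixed "box" purely for illustration.

        import numpy as np
        from sklearn.model_selection import KFold

        def combined_cv(X, y, fit_box, in_box, statistic, n_splits=5):
            pooled_in_box, pooled_y = [], []
            for train, test in KFold(n_splits=n_splits, shuffle=True, random_state=0).split(X):
                box = fit_box(X[train], y[train])           # e.g. recursive peeling on the training folds
                pooled_in_box.append(in_box(box, X[test]))  # box membership of the held-out samples
                pooled_y.append(y[test])
            return statistic(np.concatenate(pooled_y), np.concatenate(pooled_in_box))

        rng = np.random.default_rng(1)
        X = rng.normal(size=(100, 2))
        y = X[:, 0] + rng.normal(size=100)                  # stand-in outcome (not censored survival times)
        print(combined_cv(X, y,
                          fit_box=lambda X, y: None,                 # placeholder "model"
                          in_box=lambda box, X: X[:, 0] > 0,         # fixed box: first covariate positive
                          statistic=lambda y, m: y[m].mean() - y[~m].mean()))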

  12. Method Validation Procedure in Gamma Spectroscopy Laboratory

    International Nuclear Information System (INIS)

    El Samad, O.; Baydoun, R.

    2008-01-01

    The present work describes the methodology followed for the application of the ISO 17025 standard in the gamma spectroscopy laboratory at the Lebanese Atomic Energy Commission, including the management and technical requirements. A set of documents, written procedures and records was prepared to achieve the management part. For the technical requirements, internal method validation was applied through the estimation of trueness, repeatability, minimum detectable activity and combined uncertainty; participation in IAEA proficiency tests assures the external method validation, especially as the gamma spectroscopy laboratory is a member of the ALMERA network (Analytical Laboratories for the Measurement of Environmental Radioactivity). Some of these results are presented in this paper. (author)

  13. Method validation for strobilurin fungicides in cereals and fruit

    DEFF Research Database (Denmark)

    Christensen, Hanne Bjerre; Granby, Kit

    2001-01-01

    Strobilurins are a new class of fungicides that are active against a broad spectrum of fungi. In the present work a GC method for analysis of strobilurin fungicides was validated. The method was based on extraction with ethyl acetate/cyclohexane, clean-up by gel permeation chromatography (GPC) and determination of the content by gas chromatography (GC) with electron capture (EC-), nitrogen/phosphorus (NP-), and mass spectrometric (MS-) detection. Three strobilurins, azoxystrobin, kresoxim-methyl and trifloxystrobin, were validated on three matrices, wheat, apple and grapes. The validation was based...

  14. 76 FR 28664 - Method 301-Field Validation of Pollutant Measurement Methods From Various Waste Media

    Science.gov (United States)

    2011-05-18

    ... d_m = the mean of the paired sample differences. n = total number of paired samples. 7.4.2 t Test... being compared to a validated test method as part of the Method 301 validation and an audit sample for... tighten the acceptance criteria for the precision of candidate alternative test methods. One commenter...
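
    The excerpt above refers to the paired-difference bias test used in Method 301; a small sketch of that calculation (d_m as the mean of the paired differences between the candidate and the validated method, and the corresponding t statistic) is shown below with invented measurements. The acceptance criteria themselves are given in the method.

        import math

        def paired_bias_t(candidate, validated):
            diffs = [c - v for c, v in zip(candidate, validated)]
            n = len(diffs)
            d_m = sum(diffs) / n                                      # mean of the paired differences
            s_d = math.sqrt(sum((d - d_m) ** 2 for d in diffs) / (n - 1))
            return d_m, d_m / (s_d / math.sqrt(n))                    # bias estimate and t statistic

        candidate = [10.2, 11.0, 9.8, 10.5, 10.9, 10.1]               # hypothetical paired results
        validated = [10.0, 10.8, 10.1, 10.3, 11.2, 10.0]
        print(paired_bias_t(candidate, validated))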

  15. A Method for Ship Collision Damage and Energy Absorption Analysis and its Validation

    DEFF Research Database (Denmark)

    Zhang, Shengming; Pedersen, Preben Terndrup

    2017-01-01

    For design evaluation, there is a need for a method which is fast, practical and yet accurate enough to determine the absorbed energy and collision damage extent in ship collision analysis. The most well-known simplified empirical approach to collision analysis was made probably by Minorsky, and its limitation is also well-recognised. The authors have previously developed simple expressions for the relation between the absorbed energy and the damaged material volume which take into account the structural arrangements, the material properties and the damage modes. The purpose of the present paper is to re-examine this method's validity and accuracy for ship collision damage analysis in ship design assessments by comprehensive validations with experimental results from the public domain. In total, 20 experimental tests have been selected, analysed and compared with the results calculated using...

  16. The truncated Wigner method for Bose-condensed gases: limits of validity and applications

    International Nuclear Information System (INIS)

    Sinatra, Alice; Lobo, Carlos; Castin, Yvan

    2002-01-01

    We study the truncated Wigner method applied to a weakly interacting spinless Bose-condensed gas which is perturbed away from thermal equilibrium by a time-dependent external potential. The principle of the method is to generate an ensemble of classical fields ψ(r) which samples the Wigner quasi-distribution function of the initial thermal equilibrium density operator of the gas, and then to evolve each classical field with the Gross-Pitaevskii equation. In the first part of the paper we improve the sampling technique over our previous work (Sinatra et al 2000 J. Mod. Opt. 47 2629-44) and we test its accuracy against the exactly solvable model of the ideal Bose gas. In the second part of the paper we investigate the conditions of validity of the truncated Wigner method. For short evolution times it is known that the time-dependent Bogoliubov approximation is valid for almost pure condensates. The requirement that the truncated Wigner method reproduces the Bogoliubov prediction leads to the constraint that the number of field modes in the Wigner simulation must be smaller than the number of particles in the gas. For longer evolution times the nonlinear dynamics of the noncondensed modes of the field plays an important role. To demonstrate this we analyse the case of a three-dimensional spatially homogeneous Bose-condensed gas and we test the ability of the truncated Wigner method to correctly reproduce the Beliaev-Landau damping of an excitation of the condensate. We have identified the mechanism which limits the validity of the truncated Wigner method: the initial ensemble of classical fields, driven by the time-dependent Gross-Pitaevskii equation, thermalizes to a classical field distribution at a temperature T class which is larger than the initial temperature T of the quantum gas. When T class significantly exceeds T a spurious damping is observed in the Wigner simulation. This leads to the second validity condition for the truncated Wigner method, T class - T

  17. The Value of Qualitative Methods in Social Validity Research

    Science.gov (United States)

    Leko, Melinda M.

    2014-01-01

    One quality indicator of intervention research is the extent to which the intervention has a high degree of social validity, or practicality. In this study, I drew on Wolf's framework for social validity and used qualitative methods to ascertain five middle schoolteachers' perceptions of the social validity of System 44®--a phonics-based reading…

  18. The Dynamic Similitude Design Method of Thin Walled Structures and Experimental Validation

    Directory of Open Access Journals (Sweden)

    Zhong Luo

    2016-01-01

    For the applicability of dynamic similitude models of thin walled structures, such as engine blades, turbine discs, and cylindrical shells, the dynamic similitude design of typical thin walled structures is investigated. The governing equation of typical thin walled structures is first unified, which guides the establishment of the dynamic scaling laws of typical thin walled structures. Based on the governing equation, the geometrically complete scaling law of the typical thin walled structure is derived. In order to determine accurate distorted scaling laws of typical thin walled structures, three principles are proposed and theoretically proved by combining the sensitivity analysis and the governing equation. Taking the thin walled annular plate as an example, geometrically complete and distorted scaling laws can be obtained based on the principles for determining dynamic scaling laws. Furthermore, accurate distorted scaling laws for the first five orders of thin walled annular plates are presented and numerically validated. Finally, the effectiveness of the similitude design method is validated by experimental annular plates.

  19. Validation of Land Cover Products Using Reliability Evaluation Methods

    Directory of Open Access Journals (Sweden)

    Wenzhong Shi

    2015-06-01

    Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. The reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation schema to evaluate the reliability of land cover products, including two methods, namely, result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved in the reliability analysis of a data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through the analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning, compared with traditional accuracy assessment methods. Process reliability evaluation without the need for reference data can facilitate the validation and reflect the change trends of reliabilities to some extent.

  20. A Rapid, Simple, and Validated RP-HPLC Method for Quantitative Analysis of Levofloxacin in Human Plasma

    Directory of Open Access Journals (Sweden)

    Dion Notario

    2017-04-01

    To conduct a bioequivalence study for a copy product of levofloxacin (LEV), a simple and validated analytical method was needed, but the previously developed methods were still too complicated. For this reason, a simple and rapid high performance liquid chromatography method was developed and validated for LEV quantification in human plasma. Chromatographic separation was performed under isocratic elution on a Luna Phenomenex® C18 (150 × 4.6 mm, 5 µm) column. The mobile phase comprised acetonitrile, methanol, and 25 mM phosphate buffer adjusted to pH 3.0 (13:7:80, v/v/v) and was pumped at a flow rate of 1.5 mL/min. Detection was performed with a UV detector at a wavelength of 280 nm. Samples were prepared by adding acetonitrile followed by centrifugation to precipitate plasma protein, and then successively by evaporation and reconstitution steps. The optimized method meets the requirements of the validation parameters, which included linearity (r = 0.995), sensitivity (LLOQ and LOD of 1.77 and 0.57 µg/mL, respectively), accuracy (%error ≤ 12% above the LLOQ and ≤ 20% at the LLOQ), precision (RSD ≤ 9%), and robustness over the range of 1.77-28.83 µg/mL. Therefore, the method can be used for routine analysis of LEV in human plasma as well as in bioequivalence studies of LEV.

  1. Global approach for the validation of an in-line Raman spectroscopic method to determine the API content in real-time during a hot-melt extrusion process.

    Science.gov (United States)

    Netchacovitch, L; Thiry, J; De Bleye, C; Dumont, E; Cailletaud, J; Sacré, P-Y; Evrard, B; Hubert, Ph; Ziemons, E

    2017-08-15

    Since the Food and Drug Administration (FDA) published a guidance based on the Process Analytical Technology (PAT) approach, real-time analyses during manufacturing processes have been expanding rapidly. In this study, in-line Raman spectroscopic analyses were performed during a Hot-Melt Extrusion (HME) process to determine the Active Pharmaceutical Ingredient (API) content in real-time. The method was validated based on a univariate and a multivariate approach, and the analytical performances of the obtained models were compared. Moreover, on the one hand, in-line data were correlated with the real API concentration present in the sample quantified by a previously validated off-line confocal Raman microspectroscopic method. On the other hand, in-line data were also treated as a function of the concentration based on the weighing of the components in the prepared mixture. The importance of developing quantitative methods based on the use of a reference method was thus highlighted. The method was validated according to the total error approach, fixing the acceptance limits at ±15% and the α risk at 5%. This method meets the requirements of the European Pharmacopoeia for the uniformity of content of single-dose preparations. The validation proves that future results will be within the acceptance limits with a previously defined probability. Finally, the in-line validated method was compared with the off-line one to demonstrate its ability to be used in routine analyses. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    Energy Technology Data Exchange (ETDEWEB)

    Bravenec, Ronald [Fourth State Research, Austin, TX (United States)

    2017-11-14

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  3. International Harmonization and Cooperation in the Validation of Alternative Methods.

    Science.gov (United States)

    Barroso, João; Ahn, Il Young; Caldeira, Cristiane; Carmichael, Paul L; Casey, Warren; Coecke, Sandra; Curren, Rodger; Desprez, Bertrand; Eskes, Chantra; Griesinger, Claudius; Guo, Jiabin; Hill, Erin; Roi, Annett Janusch; Kojima, Hajime; Li, Jin; Lim, Chae Hyung; Moura, Wlamir; Nishikawa, Akiyoshi; Park, HyeKyung; Peng, Shuangqing; Presgrave, Octavio; Singer, Tim; Sohn, Soo Jung; Westmoreland, Carl; Whelan, Maurice; Yang, Xingfen; Yang, Ying; Zuang, Valérie

    The development and validation of scientific alternatives to animal testing is important not only from an ethical perspective (implementation of the 3Rs), but also to improve safety assessment decision making with the use of mechanistic information of higher relevance to humans. To be effective in these efforts, it is however imperative that validation centres, industry, regulatory bodies, academia and other interested parties ensure a strong international cooperation, cross-sector collaboration and intense communication in the design, execution, and peer review of validation studies. Such an approach is critical to achieve harmonized and more transparent approaches to method validation, peer review and recommendation, which will ultimately expedite the international acceptance of valid alternative methods or strategies by regulatory authorities and their implementation and use by stakeholders. It also allows greater efficiency and effectiveness to be achieved by avoiding duplication of effort and leveraging limited resources. In view of achieving these goals, the International Cooperation on Alternative Test Methods (ICATM) was established in 2009 by validation centres from Europe, the USA, Canada and Japan. ICATM was later joined by Korea in 2011 and currently also counts Brazil and China as observers. This chapter describes the existing differences across world regions and the major efforts carried out to achieve consistent international cooperation and harmonization in the validation and adoption of alternative approaches to animal testing.

  4. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.

    2011-09-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which experimental data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented in this paper manufactures a particle-transport universe, models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new experiments within the manufactured reality. The results of this preliminary study indicate that, even in a simple problem, the improper application of a specific UQ method or unrealized effects of a modeling assumption may produce inaccurate predictions. We conclude that the validation framework presented in this paper is a powerful and flexible tool for the investigation and understanding of UQ methodologies. © 2011 Elsevier Ltd. All rights reserved.
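
    A toy version of the manufactured-universe loop described above, with every modeling choice invented for illustration (the paper's universe is particle transport modeled with diffusion theory, not this exponential example): a "true" universe generates noisy experiments, a structurally imperfect model with an uncertain input is calibrated to them, and the accuracy of the resulting predictive intervals is then assessed against fresh manufactured observations. It assumes NumPy is available.

        import numpy as np

        rng = np.random.default_rng(0)
        truth = lambda x: np.exp(-1.3 * x) * (1 + 0.05 * np.sin(3 * x))   # manufactured reality
        x_obs = np.linspace(0.1, 2.0, 15)
        y_obs = truth(x_obs) + rng.normal(0.0, 0.02, x_obs.size)          # "experiments" with known error

        # Imperfect model y = exp(-k x) with uncertain k, calibrated by simple importance sampling.
        k_prior = rng.uniform(0.5, 2.5, 5000)
        resid = y_obs - np.exp(-np.outer(k_prior, x_obs))
        w = np.exp(-0.5 * (resid ** 2).sum(axis=1) / 0.02 ** 2)
        k_post = rng.choice(k_prior, size=2000, p=w / w.sum())

        # UQ assessment: how often does the 95% predictive band cover new manufactured data?
        x_new = np.linspace(0.2, 1.8, 50)
        lo, hi = np.quantile(np.exp(-np.outer(k_post, x_new)), [0.025, 0.975], axis=0)
        y_new = truth(x_new) + rng.normal(0.0, 0.02, x_new.size)
        print("empirical coverage of the 95% band:", np.mean((y_new >= lo) & (y_new <= hi)))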

  5. Development and Validation of Analytical Method for Losartan ...

    African Journals Online (AJOL)

    Development and Validation of Analytical Method for Losartan-Copper Complex Using UV-Vis Spectrophotometry. ... Tropical Journal of Pharmaceutical Research ... Purpose: To develop a new spectrophotometric method for the analysis of losartan potassium in pharmaceutical formulations by making its complex with ...

  6. OWL-based reasoning methods for validating archetypes.

    Science.gov (United States)

    Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2013-04-01

    Some modern Electronic Healthcare Record (EHR) architectures and standards are based on the dual model-based architecture, which defines two conceptual levels: reference model and archetype model. Such architectures represent EHR domain knowledge by means of archetypes, which are considered by many researchers to play a fundamental role for the achievement of semantic interoperability in healthcare. Consequently, formal methods for validating archetypes are necessary. In recent years, there has been an increasing interest in exploring how semantic web technologies in general, and ontologies in particular, can facilitate the representation and management of archetypes, including binding to terminologies, but no solution based on such technologies has been provided to date to validate archetypes. Our approach represents archetypes by means of OWL ontologies. This permits the two levels of the dual model-based architecture to be combined in one modeling framework which can also integrate terminologies available in OWL format. The validation method consists of reasoning on those ontologies to find modeling errors in archetypes: incorrect restrictions over the reference model, non-conformant archetype specializations and inconsistent terminological bindings. The archetypes available in the repositories supported by the openEHR Foundation and the NHS Connecting for Health Program, which are the two largest publicly available ones, have been analyzed with our validation method. For this purpose, we have implemented a software tool called Archeck. Our results show that around 1/5 of archetype specializations contain modeling errors, the most common mistakes being related to coded terms and terminological bindings. The analysis of each repository reveals that different patterns of errors are found in both repositories. This result reinforces the need for making serious efforts in improving archetype design processes. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. HPLC method validation for modernization of the tetracycline hydrochloride capsule USP monograph

    Directory of Open Access Journals (Sweden)

    Emad M. Hussien

    2014-12-01

    This paper is a continuation of our previous work aiming at the development and validation of a reversed-phase HPLC method for modernization of tetracycline-related USP monographs and the USP general chapter. Previous results showed that the method is accurate and precise for the assay of tetracycline hydrochloride and the limit of 4-epianhydrotetracycline impurity in the drug substance and oral suspension monographs. The aim of the current paper is to examine the feasibility of the method for modernization of the USP tetracycline hydrochloride capsule monograph. Specificity, linearity, accuracy and precision were examined for the tetracycline hydrochloride assay and the 4-epianhydrotetracycline limit. The method was linear in the concentration range from 80% to 160% (r > 0.9998) of the assay concentration (0.1 mg/mL) for tetracycline hydrochloride and from 50% to 150% (r > 0.997) of the acceptance criteria specified in the tetracycline hydrochloride capsule monograph for 4-epianhydrotetracycline (NMT 3.0%). The recovery at three concentration levels for the tetracycline hydrochloride assay was between 99% and 101%, and the RSD from six preparations at the concentration of 0.1 mg/mL was less than 0.6%. The recovery for the 4-epianhydrotetracycline limit procedure over the concentration range from 50% to 150% was between 96% and 102% with RSD less than 5%. The results met the specified acceptance criteria.

  8. Validation of a Previously Developed Geospatial Model That Predicts the Prevalence of Listeria monocytogenes in New York State Produce Fields

    Science.gov (United States)

    Weller, Daniel; Shiwakoti, Suvash; Bergholz, Peter; Grohn, Yrjo; Wiedmann, Martin

    2015-01-01

    Technological advancements, particularly in the field of geographic information systems (GIS), have made it possible to predict the likelihood of foodborne pathogen contamination in produce production environments using geospatial models. Yet, few studies have examined the validity and robustness of such models. This study was performed to test and refine the rules associated with a previously developed geospatial model that predicts the prevalence of Listeria monocytogenes in produce farms in New York State (NYS). Produce fields for each of four enrolled produce farms were categorized into areas of high or low predicted L. monocytogenes prevalence using rules based on a field's available water storage (AWS) and its proximity to water, impervious cover, and pastures. Drag swabs (n = 1,056) were collected from plots assigned to each risk category. Logistic regression, which tested the ability of each rule to accurately predict the prevalence of L. monocytogenes, validated the rules based on water and pasture. Samples collected near water (odds ratio [OR], 3.0) and pasture (OR, 2.9) showed a significantly increased likelihood of L. monocytogenes isolation compared to that for samples collected far from water and pasture. Generalized linear mixed models identified additional land cover factors associated with an increased likelihood of L. monocytogenes isolation, such as proximity to wetlands. These findings validated a subset of previously developed rules that predict L. monocytogenes prevalence in produce production environments. This suggests that GIS and geospatial models can be used to accurately predict L. monocytogenes prevalence on farms and can be used prospectively to minimize the risk of preharvest contamination of produce. PMID:26590280
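
    The odds ratios quoted above come from logistic regression of the isolation outcome (positive/negative swab) on the model's rule variables, with the OR obtained by exponentiating the fitted coefficient. The sketch below only illustrates that calculation on invented data using statsmodels; it does not reproduce the study's analysis.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
near_water = rng.integers(0, 2, n)       # 1 = plot close to surface water
# Simulated isolation outcomes with higher odds near water (illustrative only).
p = 1 / (1 + np.exp(-(-1.5 + 1.1 * near_water)))
positive = rng.binomial(1, p)

X = sm.add_constant(near_water.astype(float))
fit = sm.Logit(positive, X).fit(disp=False)

odds_ratio = np.exp(fit.params[1])       # exponentiated slope coefficient
ci_low, ci_high = np.exp(fit.conf_int()[1])
print(f"OR for proximity to water: {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```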

  9. Development and Validation of Improved Method for Fingerprint ...

    African Journals Online (AJOL)

    Purpose: To develop and validate an improved method by capillary zone electrophoresis with photodiode array detection for the fingerprint analysis of Ligusticum chuanxiong Hort. (Rhizoma Chuanxiong). Methods: The optimum high performance capillary electrophoresis (HPCE) conditions were 30 mM borax containing 5 ...

  10. Method validation for chemical composition determination by electron microprobe with wavelength dispersive spectrometer

    Science.gov (United States)

    Herrera-Basurto, R.; Mercader-Trejo, F.; Muñoz-Madrigal, N.; Juárez-García, J. M.; Rodriguez-López, A.; Manzano-Ramírez, A.

    2016-07-01

    The main goal of method validation is to demonstrate that the method is suitable for its intended purpose. One advantage of analytical method validation is the level of confidence it provides in the measurement results reported to satisfy a specific objective. Elemental composition determination by wavelength dispersive spectrometer (WDS) microanalysis has been applied over an extremely wide range of areas, mainly in the field of materials science and in impurity determinations in geological, biological and food samples. However, little information is reported about the validation of the applied methods. Herein, results of the in-house method validation for elemental composition determination by WDS are shown. SRM 482, a set of binary Cu-Au alloys of different compositions, was used during the validation protocol, following the recommendations for method validation proposed by Eurachem. This paper can be taken as a reference for the evaluation of the validation parameters most frequently requested to obtain accreditation under the requirements of the ISO/IEC 17025 standard: selectivity, limit of detection, linear interval, sensitivity, precision, trueness and uncertainty. A model for uncertainty estimation was proposed that includes systematic and random errors. In addition, parameters evaluated during the validation process were also considered as part of the uncertainty model.
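
    The abstract does not spell out the uncertainty model; a generic form consistent with Eurachem/GUM practice combines the random and systematic components in quadrature and expands the result with a coverage factor. The formula below is that generic form, not necessarily the authors' exact budget.

```latex
% Generic combined and expanded uncertainty (assumed form, not the paper's exact budget):
% u_rand   = random (repeatability) component
% u_sys,i  = systematic components (e.g. reference-material certificate, calibration)
% k        = coverage factor (about 2 for 95 % confidence)
u_c(w) = \sqrt{u_{\mathrm{rand}}^{2} + \sum_i u_{\mathrm{sys},i}^{2}},
\qquad
U = k \, u_c(w)
```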

  11. Development and validation of a spectroscopic method for the ...

    African Journals Online (AJOL)

    Development and validation of a spectroscopic method for the simultaneous analysis of ... advanced analytical methods such as high pressure liquid ..... equipment. DECLARATIONS ... high-performance liquid chromatography. J Chromatogr.

  12. Bayesian risk-based decision method for model validation under uncertainty

    International Nuclear Information System (INIS)

    Jiang Xiaomo; Mahadevan, Sankaran

    2007-01-01

    This paper develops a decision-making methodology for computational model validation, considering the risk of using the current model, data support for the current model, and cost of acquiring new information to improve the model. A Bayesian decision theory-based method is developed for this purpose, using a likelihood ratio as the validation metric for model assessment. An expected risk or cost function is defined as a function of the decision costs, and the likelihood and prior of each hypothesis. The risk is minimized through correctly assigning experimental data to two decision regions based on the comparison of the likelihood ratio with a decision threshold. A Bayesian validation metric is derived based on the risk minimization criterion. Two types of validation tests are considered: pass/fail tests and system response value measurement tests. The methodology is illustrated for the validation of reliability prediction models in a tension bar and an engine blade subjected to high cycle fatigue. The proposed method can effectively integrate optimal experimental design into model validation to simultaneously reduce the cost and improve the accuracy of reliability model assessment
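
    In standard Bayesian decision theory, the threshold against which the likelihood ratio is compared follows from the decision costs and the hypothesis priors. The sketch below illustrates only that textbook cost-based rule with invented costs and priors; it is not the paper's exact metric or case study.

```python
# Hedged illustration of a cost-based likelihood-ratio decision rule
# (textbook Bayes decision theory, not the paper's exact formulation).

def lr_threshold(cost_false_accept, cost_false_reject, prior_h0, prior_h1):
    """Accept the model (H1) when LR = p(data|H1)/p(data|H0) exceeds this value."""
    return (cost_false_accept * prior_h0) / (cost_false_reject * prior_h1)

def decide(likelihood_ratio, cost_false_accept=10.0, cost_false_reject=1.0,
           prior_h0=0.5, prior_h1=0.5):
    threshold = lr_threshold(cost_false_accept, cost_false_reject, prior_h0, prior_h1)
    return "accept model (H1)" if likelihood_ratio > threshold else "reject model (H0)"

# Example: an observed LR of 12 against a threshold of 10 leads to acceptance.
print(decide(likelihood_ratio=12.0))
```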

  13. Spacecraft early design validation using formal methods

    International Nuclear Information System (INIS)

    Bozzano, Marco; Cimatti, Alessandro; Katoen, Joost-Pieter; Katsaros, Panagiotis; Mokos, Konstantinos; Nguyen, Viet Yen; Noll, Thomas; Postma, Bart; Roveri, Marco

    2014-01-01

    The size and complexity of software in spacecraft are increasing exponentially, and this trend complicates its validation within the context of the overall spacecraft system. Current validation methods are labor-intensive as they rely on manual analysis, review and inspection. For future space missions, we developed – with challenging requirements from the European space industry – a novel modeling language and toolset for a (semi-)automated validation approach. Our modeling language is a dialect of AADL and enables engineers to express the system, the software, and their reliability aspects. The COMPASS toolset utilizes state-of-the-art model checking techniques, both qualitative and probabilistic, for the analysis of requirements related to functional correctness, safety, dependability and performance. Several pilot projects have been performed by industry, with two of them having focused on the system-level of a satellite platform in development. Our efforts resulted in a significant advancement of validating spacecraft designs from several perspectives, using a single integrated system model. The associated technology readiness level increased from level 1 (basic concepts and ideas) to early level 4 (laboratory-tested)

  14. Method validation for uranium content analysis using a potentiometer T-90

    International Nuclear Information System (INIS)

    Torowati; Ngatijo; Rahmiati

    2016-01-01

    An experimental method validation has been conducted for uranium content analysis using a Potentiometer T-90. The method validation experiment was performed in the quality control laboratory of the Experimental Fuel Element Installation, PTBBN - BATAN. The objective is to determine the level of precision and accuracy of analytical results for uranium analysis referring to the latest American Standard Test Method (ASTM) C1267-11, which is a modified reference method obtained by reducing reagent consumption by 10% of the amount used by the original method. The ASTM C1267-11 reference is a new ASTM standard that replaces the older ASTM C799, Vol. 12.01, 2003. It is, therefore, necessary to validate the renewed method. The tool used for the analysis of uranium was a Potentiometer T-90 and the material used was standard uranium oxide powder CRM (Certified Reference Material). Validation of the method was done by analyzing the standard uranium powder with 7 weighings and 7 analyses. The analysis results were used to determine the level of accuracy, precision, Relative Standard Deviation (RSD), the Horwitz coefficient of variation, and the limits of detection and quantitation. The average uranium content obtained in this method validation is 84.36% with a Standard Deviation (SD) of 0.12%, a Relative Standard Deviation (RSD) of 0.14% and a 2/3 Horwitz coefficient of variation (CV Horwitz) of 2.05%. The results show that the RSD value is smaller than the (2/3) CV Horwitz value, which means that this method has high precision. The accuracy value obtained is 0.48%; since the acceptance limit for a high level of accuracy is an accuracy value of <2.00%, this method is regarded as having a high degree of accuracy [1]. The limit of detection (LOD) and the limit of quantitation (LOQ) are 0.0145 g/L and 0.0446 g/L, respectively. It is concluded that the ASTM C1267-11 reference method is valid for use. (author)
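
    The precision criterion above, RSD compared against two thirds of the Horwitz coefficient of variation, can be reproduced from the well-known Horwitz function. The sketch below uses invented replicate values; the exact constants and the concentration basis used by the authors are assumptions here.

```python
import numpy as np

# Hypothetical replicate uranium contents (%) from 7 weighings and analyses.
u_content = np.array([84.25, 84.40, 84.31, 84.48, 84.37, 84.29, 84.42])

mean = u_content.mean()
rsd = u_content.std(ddof=1) / mean * 100      # relative standard deviation, %

# Horwitz predicted reproducibility RSD (%), with C expressed as a mass fraction.
c = mean / 100.0
cv_horwitz = 2 ** (1 - 0.5 * np.log10(c))
limit = 2 / 3 * cv_horwitz                    # within-laboratory criterion

print(f"RSD = {rsd:.2f}%  vs  (2/3) CV_Horwitz = {limit:.2f}%")
print("precision acceptable" if rsd < limit else "precision not acceptable")
```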

  15. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method

    Directory of Open Access Journals (Sweden)

    Daniel Ramos

    2017-02-01

    Full Text Available The data to which the authors refer throughout this article are likelihood ratios (LR) computed from the comparison of 5–12 minutiae fingermarks with fingerprints. These LR data are used for the validation of a likelihood ratio (LR) method in forensic evidence evaluation. These data are a necessary asset for conducting validation experiments when validating LR methods used in forensic evidence evaluation and for setting up validation reports. These data can also be used as a baseline for comparing fingermark evidence in the same minutiae configuration as presented in (D. Meuwly, D. Ramos, R. Haraksim [1]), although the reader should keep in mind that different feature extraction algorithms and different AFIS systems may produce different LR values. Moreover, these data may serve as a reproducibility exercise, in order to train the generation of validation reports of forensic methods, according to [1]. Alongside the data, a justification and motivation for the use of the methods is given. These methods calculate LRs from the fingerprint/mark data and are subject to a validation procedure. The choice of using real forensic fingerprints in the validation and simulated data in the development is described and justified. Validation criteria are set for the purpose of validation of the LR methods, which are used to calculate the LR values from the data and the validation report. For privacy and data protection reasons, the original fingerprint/mark images cannot be shared. But these images do not constitute the core data for the validation, unlike the LRs, which are shared.

  16. Testing and Validation of Computational Methods for Mass Spectrometry.

    Science.gov (United States)

    Gatto, Laurent; Hansen, Kasper D; Hoopmann, Michael R; Hermjakob, Henning; Kohlbacher, Oliver; Beyer, Andreas

    2016-03-04

    High-throughput methods based on mass spectrometry (proteomics, metabolomics, lipidomics, etc.) produce a wealth of data that cannot be analyzed without computational methods. The impact of the choice of method on the overall result of a biological study is often underappreciated, but different methods can result in very different biological findings. It is thus essential to evaluate and compare the correctness and relative performance of computational methods. The volume of the data as well as the complexity of the algorithms render unbiased comparisons challenging. This paper discusses some problems and challenges in testing and validation of computational methods. We discuss the different types of data (simulated and experimental validation data) as well as different metrics to compare methods. We also introduce a new public repository for mass spectrometric reference data sets ( http://compms.org/RefData ) that contains a collection of publicly available data sets for performance evaluation for a wide range of different methods.

  17. Constraining the cross section of 82Se(n, γ)83Se to validate the β-Oslo method

    Science.gov (United States)

    Childers, K.; Liddick, S. N.; Crider, B. P.; Dombos, A. C.; Lewis, R.; Spyrou, A.; Couture, A.; Mosby, S.; Prokop, C. J.; Naqvi, F.; Larsen, A. C.; Guttormsen, M.; Campo, L. C.; Renstrom, T.; Siem, S.; Bleuel, D. L.; Perdikakis, G.; Quinn, S.

    2017-09-01

    Neutron capture cross sections of short-lived nuclei are important for a variety of basic and applied nuclear science problems. However, because of the short half-lives of the nuclei involved and the nonexistence of a neutron target, indirect measurement methods are required. One such method is the β-Oslo method. The nuclear level density and γ strength function of a nucleus are extracted after β-decay and used in a statistical reaction model to constrain the neutron capture cross section. This method has been used previously, but must be validated against a directly measured neutron capture cross section. The neutron capture cross section of 82Se has been measured previously, and 83Se can be accessed by the β-decay of 83As. The β-decay of 83As to 83Se was studied using the SuN detector at the NSCL and the β-Oslo method was utilized to constrain the neutron capture cross section of 82Se, which is compared to the directly measured value.

  18. Human Factors methods concerning integrated validation of nuclear power plant control rooms

    International Nuclear Information System (INIS)

    Oskarsson, Per-Anders; Johansson, Bjoern J.E.; Gonzalez, Natalia

    2010-02-01

    The frame of reference for this work was existing recommendations and instructions from the NPP area, experiences from the review of the Turbic Validation and experiences from system validations performed at the Swedish Armed Forces, e.g. concerning military control rooms and fighter pilots. These enterprises are characterized by complex systems in extreme environments, often with high risks, where human error can lead to serious consequences. A focus group was held with representatives responsible for Human Factors issues from all Swedish NPPs. The questions that were discussed were, among other things, for whom an integrated validation (IV) is performed and its purpose, what should be included in an IV, the comparison with baseline measures, the design process, the role of SSM, which methods of measurement should be used, and how the methods are affected by changes in the control room. The report brings up several questions for discussion concerning the validation process. Supplementary methods of measurement for integrated validation are discussed, e.g. dynamic, psychophysiological, and qualitative methods for identification of problems. Supplementary methods for statistical analysis are presented. The study points out a number of deficiencies in the validation process, e.g. the need for common guidelines for validation and design, criteria for different types of measurements, clarification of the role of SSM, and recommendations concerning the responsibility of external participants in the validation process. The authors propose 12 measures for addressing the identified problems

  19. Validation of a Previously Developed Geospatial Model That Predicts the Prevalence of Listeria monocytogenes in New York State Produce Fields.

    Science.gov (United States)

    Weller, Daniel; Shiwakoti, Suvash; Bergholz, Peter; Grohn, Yrjo; Wiedmann, Martin; Strawn, Laura K

    2016-02-01

    Technological advancements, particularly in the field of geographic information systems (GIS), have made it possible to predict the likelihood of foodborne pathogen contamination in produce production environments using geospatial models. Yet, few studies have examined the validity and robustness of such models. This study was performed to test and refine the rules associated with a previously developed geospatial model that predicts the prevalence of Listeria monocytogenes in produce farms in New York State (NYS). Produce fields for each of four enrolled produce farms were categorized into areas of high or low predicted L. monocytogenes prevalence using rules based on a field's available water storage (AWS) and its proximity to water, impervious cover, and pastures. Drag swabs (n = 1,056) were collected from plots assigned to each risk category. Logistic regression, which tested the ability of each rule to accurately predict the prevalence of L. monocytogenes, validated the rules based on water and pasture. Samples collected near water (odds ratio [OR], 3.0) and pasture (OR, 2.9) showed a significantly increased likelihood of L. monocytogenes isolation compared to that for samples collected far from water and pasture. Generalized linear mixed models identified additional land cover factors associated with an increased likelihood of L. monocytogenes isolation, such as proximity to wetlands. These findings validated a subset of previously developed rules that predict L. monocytogenes prevalence in produce production environments. This suggests that GIS and geospatial models can be used to accurately predict L. monocytogenes prevalence on farms and can be used prospectively to minimize the risk of preharvest contamination of produce. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  20. Validation of calculational methods for nuclear criticality safety - approved 1975

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    The American National Standard for Nuclear Criticality Safety in Operations with Fissionable Materials Outside Reactors, N16.1-1975, states in 4.2.5: In the absence of directly applicable experimental measurements, the limits may be derived from calculations made by a method shown to be valid by comparison with experimental data, provided sufficient allowances are made for uncertainties in the data and in the calculations. There are many methods of calculation which vary widely in basis and form. Each has its place in the broad spectrum of problems encountered in the nuclear criticality safety field; however, the general procedure to be followed in establishing validity is common to all. The standard states the requirements for establishing the validity and area(s) of applicability of any calculational method used in assessing nuclear criticality safety

  1. Moving beyond Traditional Methods of Survey Validation

    Science.gov (United States)

    Maul, Andrew

    2017-01-01

    In his focus article, "Rethinking Traditional Methods of Survey Validation," published in this issue of "Measurement: Interdisciplinary Research and Perspectives," Andrew Maul wrote that it is commonly believed that self-report, survey-based instruments can be used to measure a wide range of psychological attributes, such as…

  2. Validation Process Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, John E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); English, Christine M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gesick, Joshua C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mukkamala, Saikrishna [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2018-01-04

    This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

  3. Development and validation of a method for the determination of regulated fragrance allergens by High-Performance Liquid Chromatography and Parallel Factor Analysis 2.

    Science.gov (United States)

    Pérez-Outeiral, Jessica; Elcoroaristizabal, Saioa; Amigo, Jose Manuel; Vidal, Maider

    2017-12-01

    This work presents the development and validation of a multivariate method for the quantitation of 6 potentially allergenic substances (PAS) related to fragrances by ultrasound-assisted emulsification microextraction coupled with HPLC-DAD and PARAFAC2, in the presence of 18 other PAS. The objective is to extend a previously proposed univariate method so that the 24 PAS currently considered allergens can be determined. The suitability of the multivariate approach for the qualitative and quantitative analysis of the analytes is discussed through datasets of increasing complexity, comprising the assessment and validation of the method performance. PARAFAC2 was shown to model the data adequately, coping with different instrumental and chemical issues such as co-elution profiles, overlapping spectra, unknown interfering compounds, retention time shifts and baseline drifts. Satisfactory quality parameters of the model performance were obtained (R²≥0.94), as well as meaningful chromatographic and spectral profiles (r≥0.97). Moreover, low errors of prediction in external validation standards (below 15% in most cases) as well as acceptable quantification errors in real spiked samples (recoveries from 82 to 119%) confirmed the suitability of PARAFAC2 for resolution and quantification of the PAS. The combination of the previously proposed univariate approach, for the well-resolved peaks, with the developed multivariate method allows the determination of the 24 regulated PAS. Copyright © 2017 Elsevier B.V. All rights reserved.
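
    The PARAFAC2 modeling of a set of chromatographic runs (elution time × wavelength slices, one per sample) can be sketched with the open-source tensorly library. This is only an illustration of the kind of decomposition described above, not the authors' implementation, and the data shapes are invented.

```python
import numpy as np
from tensorly.decomposition import parafac2

rng = np.random.default_rng(1)
# Hypothetical HPLC-DAD data: 10 runs, a run-dependent elution-time axis, 50 wavelengths.
slices = [rng.random((120 + 5 * i, 50)) for i in range(10)]

# Fit a 4-component PARAFAC2 model; the per-slice first mode tolerates
# retention-time shifts between runs, which ordinary PARAFAC does not.
weights, factors, projections = parafac2(slices, rank=4, n_iter_max=500, random_state=0)
A, B, C = factors   # A: per-run component weights, C: spectral loadings

print("per-run loadings:", A.shape)
print("spectral loadings:", C.shape)
```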

  4. A validated RP-HPLC method for the determination of Irinotecan hydrochloride residues for cleaning validation in production area

    Directory of Open Access Journals (Sweden)

    Sunil Reddy

    2013-03-01

    Full Text Available Introduction: cleaning validation is an integral part of current good manufacturing practices in the pharmaceutical industry. The main purpose of cleaning validation is to prove the effectiveness and consistency of cleaning of a given piece of pharmaceutical production equipment, to prevent cross contamination and adulteration of the drug product with other active ingredients. Objective: a rapid, sensitive and specific reverse phase HPLC method was developed and validated for the quantitative determination of irinotecan hydrochloride in cleaning validation swab samples. Method: the method was validated using a Waters Symmetry Shield RP-18 (250 mm x 4.6 mm, 5 µm) column with an isocratic mobile phase containing a mixture of 0.02 M potassium di-hydrogen ortho-phosphate (pH adjusted to 3.5 with ortho-phosphoric acid), methanol and acetonitrile (60:20:20 v/v/v). The flow rate of the mobile phase was 1.0 mL/min with a column temperature of 25°C and a detection wavelength of 220 nm. The sample injection volume was 100 µL. Results: the calibration curve was linear over a concentration range from 0.024 to 0.143 µg/mL with a correlation coefficient of 0.997. The intra-day and inter-day precision expressed as relative standard deviation were below 3.2%. The recoveries obtained from stainless steel, PCGI, epoxy, glass and Dacron cloth surfaces were more than 85% and there was no interference from the cotton swab. The detection limit (DL) and quantitation limit (QL) were 0.008 and 0.023 µg/mL, respectively. Conclusion: the developed method was validated with respect to specificity, linearity, limit of detection and quantification, accuracy, precision and solution stability. The overall procedure can be used as part of a cleaning validation program in the pharmaceutical manufacture of irinotecan hydrochloride.

  5. Development and validation of analytical methods for dietary supplements

    International Nuclear Information System (INIS)

    Sullivan, Darryl; Crowley, Richard

    2006-01-01

    The expanding use of innovative botanical ingredients in dietary supplements and foods has resulted in a flurry of research aimed at the development and validation of analytical methods for accurate measurement of active ingredients. The pressing need for these methods is being met through an expansive collaborative initiative involving industry, government, and analytical organizations. This effort has resulted in the validation of several important assays as well as important advances in the method engineering procedures, which have improved the efficiency of the process. The initiative has also allowed researchers to hurdle many of the barricades that have hindered accurate analysis, such as the lack of reference standards and comparative data. As the availability of nutraceutical products continues to increase, these methods will provide consumers and regulators with the scientific information needed to assure safety and dependable labeling

  6. Validating the JobFit system functional assessment method

    Energy Technology Data Exchange (ETDEWEB)

    Jenny Legge; Robin Burgess-Limerick

    2007-05-15

    Workplace injuries are costing the Australian coal mining industry and its communities $410 Million a year. This ACARP study aims to meet these challenges by developing a safe, reliable and valid pre-employment functional assessment tool. All JobFit System Pre-Employment Functional Assessments (PEFAs) consist of a musculoskeletal screen, balance test, aerobic fitness test and job-specific postural tolerances and material handling tasks. The results of each component are compared to the applicant's job demands and an overall PEFA score between 1 and 4 is given, with 1 being the best score. The reliability study and validity study were conducted concurrently. The reliability study examined test-retest, intra-tester and inter-tester reliability of the JobFit System Functional Assessment Method. Overall, good to excellent reliability was found, which was sufficient for comparison with injury data in determining the validity of the assessment. The overall assessment score and material handling tasks had the greatest reliability. The validity study compared the assessment results of 336 records from a Queensland underground and open cut coal mine with their injury records. A predictive relationship was found between PEFA score and the risk of a back/trunk/shoulder injury from manual handling. An association was also found between a PEFA score of 1 and increased length of employment. Lower aerobic fitness test results had an inverse relationship with injury rates. The study found that underground workers, regardless of PEFA score, were more likely to have an injury when compared to workers in other departments. No relationship was found between age and risk of injury. These results confirm the validity of the JobFit System Functional Assessment method.

  7. Verification and Validation of a Coordinate Transformation Method in Axisymmetric Transient Magnetics.

    Energy Technology Data Exchange (ETDEWEB)

    Ashcraft, C. Chace [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Niederhaus, John Henry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Robinson, Allen C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-29

    We present a verification and validation analysis of a coordinate-transformation-based numerical solution method for the two-dimensional axisymmetric magnetic diffusion equation, implemented in the finite-element simulation code ALEGRA. The transformation, suggested by Melissen and Simkin, yields an equation set perfectly suited for linear finite elements and for problems with large jumps in material conductivity near the axis. The verification analysis examines transient magnetic diffusion in a rod or wire in a very low conductivity background by first deriving an approximate analytic solution using perturbation theory. This approach for generating a reference solution is shown to be not fully satisfactory. A specialized approach for manufacturing an exact solution is then used to demonstrate second-order convergence under spatial refinement and temporal refinement. For this new implementation, a significant improvement relative to previously available formulations is observed. Benefits in accuracy for computed current density and Joule heating are also demonstrated. The validation analysis examines the circuit-driven explosion of a copper wire using resistive magnetohydrodynamics modeling, in comparison to experimental tests. The new implementation matches the accuracy of the existing formulation, with both formulations capturing the experimental burst time and action to within approximately 2%.

  8. Content Validity of National Post Marriage Educational Program Using Mixed Methods

    Science.gov (United States)

    MOHAJER RAHBARI, Masoumeh; SHARIATI, Mohammad; KERAMAT, Afsaneh; YUNESIAN, Masoud; ESLAMI, Mohammad; MOUSAVI, Seyed Abbas; MONTAZERI, Ali

    2015-01-01

    Background: Although the content validity of programs is mostly assessed with qualitative methods, this study used both qualitative and quantitative methods to validate the content of the post-marriage training program provided for newly married couples. Content validity is a preliminary step in obtaining the authorization required to install the program in the country's health care system. Methods: This mixed-methods content validation study was carried out in four steps, forming three expert panels. Altogether 24 expert panelists were involved in 3 qualitative and quantitative panels: 6 in the first, item-development panel; 12 in the item-reduction panel, 4 of whom were shared with the first panel; and 10 executive experts in the last panel, organized to evaluate the psychometric properties of CVR and CVI and the face validity of 57 educational objectives. Results: The raw data of the post-marriage program had been written by professional experts of the Ministry of Health; using the qualitative expert panel, the content was further developed by generating 3 topics and refining one topic and its respective content. In the second panel, a total of six further objectives were deleted, three for falling below the agreement cut-off point and three by experts' consensus. The validity of all items was above 0.8 and their content validity indices (0.8–1) were completely appropriate in the quantitative assessment. Conclusion: This study provided good evidence for the validation and accreditation of the national post-marriage program planned for newly married couples in health centers of the country in the near future. PMID:26056672
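
    The CVR and CVI values referred to above are conventionally computed with Lawshe's content validity ratio and the item-level content validity index; the exact settings used by the panels are not given, so only the generic formulas are shown, as an assumption.

```latex
% Generic content-validity formulas (assumed, not quoted from the paper):
% n_e     = number of panelists rating an item "essential", N = panel size,
% n_{3,4} = number of panelists rating the item 3 or 4 on a 4-point relevance scale.
\mathrm{CVR} = \frac{n_e - N/2}{N/2},
\qquad
\mathrm{CVI} = \frac{n_{3,4}}{N}
```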

  9. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    Science.gov (United States)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not been previously identified. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010) to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction of 64 articles measuring a variety of constructs that have been published throughout the peer-reviewed medical education literature indicates significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, including reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. Finally, editors and reviewers are encouraged to recognize

  10. Validation of Alternative In Vitro Methods to Animal Testing: Concepts, Challenges, Processes and Tools.

    Science.gov (United States)

    Griesinger, Claudius; Desprez, Bertrand; Coecke, Sandra; Casey, Warren; Zuang, Valérie

    This chapter explores the concepts, processes, tools and challenges relating to the validation of alternative methods for toxicity and safety testing. In general terms, validation is the process of assessing the appropriateness and usefulness of a tool for its intended purpose. Validation is routinely used in various contexts in science, technology, the manufacturing and services sectors. It serves to assess the fitness-for-purpose of devices, systems, software up to entire methodologies. In the area of toxicity testing, validation plays an indispensable role: "alternative approaches" are increasingly replacing animal models as predictive tools and it needs to be demonstrated that these novel methods are fit for purpose. Alternative approaches include in vitro test methods, non-testing approaches such as predictive computer models up to entire testing and assessment strategies composed of method suites, data sources and decision-aiding tools. Data generated with alternative approaches are ultimately used for decision-making on public health and the protection of the environment. It is therefore essential that the underlying methods and methodologies are thoroughly characterised, assessed and transparently documented through validation studies involving impartial actors. Importantly, validation serves as a filter to ensure that only test methods able to produce data that help to address legislative requirements (e.g. EU's REACH legislation) are accepted as official testing tools and, owing to the globalisation of markets, recognised on international level (e.g. through inclusion in OECD test guidelines). Since validation creates a credible and transparent evidence base on test methods, it provides a quality stamp, supporting companies developing and marketing alternative methods and creating considerable business opportunities. Validation of alternative methods is conducted through scientific studies assessing two key hypotheses, reliability and relevance of the

  11. Absolute quantification method and validation of airborne snow crab allergen tropomyosin using tandem mass spectrometry

    International Nuclear Information System (INIS)

    Rahman, Anas M. Abdel; Lopata, Andreas L.; Randell, Edward W.; Helleur, Robert J.

    2010-01-01

    Measuring the levels of the major airborne allergens of snow crab in the workplace is very important in studying the prevalence of crab asthma in workers. Previously, snow crab tropomyosin (SCTM) was identified as the major aeroallergen in crab plants and a unique signature peptide was identified for this protein. The present study advances our knowledge of aeroallergens by developing a method for quantification of airborne SCTM using isotope dilution mass spectrometry. Liquid chromatography tandem mass spectrometry was developed for separation and analysis of the signature peptides. The tryptic digestion conditions were optimized to accomplish complete digestion. The validity of the method was studied using the International Conference on Harmonization protocol, with CVs (precision) of 2-9% and accuracy of 101-110% at three different levels of quality control. Recovery of the spiked protein from PTFE and TopTip filters was measured to be 99% and 96%, respectively. To further demonstrate the applicability and validity of the method for real samples, 45 kg of whole snow crab were processed in an enclosed (simulated) crab processing line and air samples were collected. The levels of SCTM ranged between 0.36-3.92 μg m⁻³ and 1.70-2.31 μg m⁻³ for the butchering and cooking stations, respectively.

  12. Absolute quantification method and validation of airborne snow crab allergen tropomyosin using tandem mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, Anas M. Abdel, E-mail: anasar@mun.ca [Department of Chemistry, Memorial University of Newfoundland, St. John's, Newfoundland A1B 3X7 (Canada); Lopata, Andreas L. [School of Applied Science, Marine Biomedical Sciences and Health Research Group, RMIT University, Bundoora, 3083 Victoria (Australia); Randell, Edward W. [Department of Laboratory Medicine, Memorial University of Newfoundland, Eastern Health, St. John's, Newfoundland and Labrador A1B 3V6 (Canada); Helleur, Robert J. [Department of Chemistry, Memorial University of Newfoundland, St. John's, Newfoundland A1B 3X7 (Canada)

    2010-11-29

    Measuring the levels of the major airborne allergens of snow crab in the workplace is very important in studying the prevalence of crab asthma in workers. Previously, snow crab tropomyosin (SCTM) was identified as the major aeroallergen in crab plants and a unique signature peptide was identified for this protein. The present study advances our knowledge of aeroallergens by developing a method for quantification of airborne SCTM using isotope dilution mass spectrometry. Liquid chromatography tandem mass spectrometry was developed for separation and analysis of the signature peptides. The tryptic digestion conditions were optimized to accomplish complete digestion. The validity of the method was studied using the International Conference on Harmonization protocol, with CVs (precision) of 2-9% and accuracy of 101-110% at three different levels of quality control. Recovery of the spiked protein from PTFE and TopTip filters was measured to be 99% and 96%, respectively. To further demonstrate the applicability and validity of the method for real samples, 45 kg of whole snow crab were processed in an enclosed (simulated) crab processing line and air samples were collected. The levels of SCTM ranged between 0.36-3.92 µg m⁻³ and 1.70-2.31 µg m⁻³ for the butchering and cooking stations, respectively.

  13. Determination of Modafinil in Tablet Formulation Using Three New Validated Spectrophotometric Methods

    International Nuclear Information System (INIS)

    Basniwal, P.K.; Jain, D.; Basniwal, P.K.

    2014-01-01

    In this study, three new UV spectrophotometric methods, viz. the linear regression equation (LRE), standard absorptivity (SA) and first order derivative (FOD) methods, were developed and validated for the determination of modafinil in tablet form. Beer-Lambert's law was obeyed (linear) in the range of 10-50 μg/mL and all the methods were validated for linearity, accuracy, precision and robustness. These methods were successfully applied for the assay of modafinil drug content in tablets in the ranges of 100.20-100.42%, 100.11-100.58% and 100.25-100.34%, respectively, with acceptable standard deviation (less than two) for all the methods. The validated spectrophotometric methods may be successfully applied for assay, dissolution studies and bio-equivalence studies, as well as routine analysis in pharmaceutical industries. (author)
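
    The linear regression equation (LRE) procedure named above is essentially an absorbance-versus-concentration calibration followed by inverse prediction, in line with Beer-Lambert behaviour. The sketch below uses invented standards and absorbances, not the paper's data:

```python
import numpy as np

# Hypothetical calibration standards (µg/mL) and absorbances at the analytical
# wavelength; the values are illustrative only.
conc = np.array([10, 20, 30, 40, 50], dtype=float)
absorbance = np.array([0.112, 0.221, 0.335, 0.448, 0.557])

slope, intercept = np.polyfit(conc, absorbance, 1)   # A = slope*C + intercept
r = np.corrcoef(conc, absorbance)[0, 1]

# Inverse prediction for a tablet extract of unknown concentration.
sample_abs = 0.390
sample_conc = (sample_abs - intercept) / slope
print(f"calibration: A = {slope:.4f}*C + {intercept:.4f}  (r = {r:.4f})")
print(f"sample concentration ~ {sample_conc:.1f} µg/mL")
```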

  14. Reliability and Validity of a New Method for Isometric Back Extensor Strength Evaluation Using A Hand-Held Dynamometer.

    Science.gov (United States)

    Park, Hee-Won; Baek, Sora; Kim, Hong Young; Park, Jung-Gyoo; Kang, Eun Kyoung

    2017-10-01

    To investigate the reliability and validity of a new method for isometric back extensor strength measurement using a portable dynamometer. A chair equipped with a small portable dynamometer was designed (Power Track II Commander Muscle Tester). A total of 15 men (mean age, 34.8±7.5 years) and 15 women (mean age, 33.1±5.5 years) with no current back problems or previous history of back surgery were recruited. Subjects were asked to push the back of the chair while seated, and their isometric back extensor strength was measured by the portable dynamometer. Test-retest reliability was assessed with intraclass correlation coefficient (ICC). For the validity assessment, isometric back extensor strength of all subjects was measured by a widely used physical performance evaluation instrument, BTE PrimusRS system. The limit of agreement (LoA) from the Bland-Altman plot was evaluated between two methods. The test-retest reliability was excellent (ICC=0.82; 95% confidence interval, 0.65-0.91). The Bland-Altman plots demonstrated acceptable agreement between the two methods: the lower 95% LoA was -63.1 N and the upper 95% LoA was 61.1 N. This study shows that isometric back extensor strength measurement using a portable dynamometer has good reliability and validity.
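
    Limits of agreement such as those reported above are derived from the paired differences between the two instruments (bias ± 1.96 standard deviations). The snippet below is a generic Bland-Altman sketch with invented paired readings, not the study data:

```python
import numpy as np

# Hypothetical paired strength readings (N): portable dynamometer vs. reference system.
portable = np.array([410.0, 520.0, 365.0, 480.0, 455.0, 390.0, 505.0, 440.0])
reference = np.array([425.0, 505.0, 380.0, 500.0, 470.0, 375.0, 520.0, 455.0])

diff = portable - reference
bias = diff.mean()
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd   # 95% limits of agreement

print(f"bias = {bias:.1f} N, 95% LoA = [{loa_low:.1f}, {loa_high:.1f}] N")
```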

  15. An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle

    Science.gov (United States)

    Wang, Yue; Gao, Dan; Mao, Xuming

    2018-03-01

    A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method will not affect the conventional design elements, and can effectively connect the requirements with the design. It realizes the modern civil jet development concept that “requirement is the origin, design is the basis”. So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and the validation method for the requirements are introduced in detail in the study, with the hope of providing this experience to other civil jet product design efforts.

  16. Development and Validation of a Stability-Indicating LC-UV Method ...

    African Journals Online (AJOL)

    Keywords: Ketotifen, Cetirizine, Stability indicating method, Stressed conditions, Validation. Tropical ... in biological fluids [13] are also reported. Stability indicating HPLC method is reported for ketotifen where drug is ..... paracetamol, cetirizine.

  17. Validation of SWAT+ at field level and comparison with previous SWAT models in simulating hydrologic quantity

    Science.gov (United States)

    GAO, J.; White, M. J.; Bieger, K.; Yen, H.; Arnold, J. G.

    2017-12-01

    Over the past 20 years, the Soil and Water Assessment Tool (SWAT) has been adopted by many researchers to assess water quantity and quality in watersheds around the world. As the demand for model support, maintenance, and future development increases, the SWAT source code and data have undergone major modifications over the past few years. To make the model more flexible in terms of the interactions of spatial units and processes occurring in watersheds, a completely revised version of SWAT (SWAT+) was developed to improve SWAT's ability in water resource modelling and management. There are only a few applications of SWAT+ in large watersheds, however, and no study has yet validated the new model at the field level or assessed its performance. To test the basic hydrologic function of SWAT+, it was implemented in five field cases across five states in the U.S., and the SWAT+ results were compared with those from the previous models for the same fields. Additionally, an automatic calibration tool was used to test which model is easier to calibrate well within a limited number of parameter adjustments. The goal of the study was to evaluate the performance of SWAT+ in simulating stream flow at the field level at different geographical locations. The results demonstrate that SWAT+ performed similarly to the previous SWAT model, but the flexibility offered by SWAT+ via the connection of different spatial objects can result in a more accurate spatial simulation of hydrological processes, especially for watersheds with artificial facilities. Autocalibration shows that a satisfactory result is much easier to obtain with SWAT+ than with the previous SWAT. Although many capabilities have already been enhanced in SWAT+, inaccuracies remain in the simulations. These shortcomings will be addressed as scientific knowledge of hydrologic processes in specific watersheds advances. Currently, SWAT+ is prerelease, and any errors are being addressed.

  18. Fuzzy-logic based strategy for validation of multiplex methods: example with qualitative GMO assays.

    Science.gov (United States)

    Bellocchi, Gianni; Bertholet, Vincent; Hamels, Sandrine; Moens, W; Remacle, José; Van den Eede, Guy

    2010-02-01

    This paper illustrates the advantages that a fuzzy-based aggregation method could bring to the validation of a multiplex method for GMO detection (DualChip GMO kit, Eppendorf). Guidelines for the validation of chemical, bio-chemical, pharmaceutical and genetic methods have been developed, and ad hoc validation statistics are available and routinely used for in-house and inter-laboratory testing and decision-making. Fuzzy logic allows the information obtained by independent validation statistics to be summarised into one synthetic indicator of overall method performance. The microarray technology, introduced for simultaneous identification of multiple GMOs, poses specific validation issues (patterns of performance for a variety of GMOs at different concentrations). A fuzzy-based indicator for overall evaluation is illustrated in this paper, and applied to validation data for different genetically modified elements. Remarks are drawn on the analytical results. The fuzzy-logic based rules were shown to be applicable for improving the interpretation of results and facilitating the overall evaluation of the multiplex method.

  19. Content Validity of Temporal Bone Models Printed Via Inexpensive Methods and Materials.

    Science.gov (United States)

    Bone, T Michael; Mowry, Sarah E

    2016-09-01

    Computed tomographic (CT) scans of the 3-D printed temporal bone models were hypothesized to be within 15% accuracy of the CT scans of the cadaveric temporal bones. Previous studies have evaluated the face validity of 3-D-printed temporal bone models designed to train otolaryngology residents. The purpose of this study was to determine the content validity of temporal bone models printed using inexpensive printers and materials. Four cadaveric temporal bones were randomly selected and clinical temporal bone CT scans were obtained. Models were generated using previously described methods in acrylonitrile butadiene styrene (ABS) plastic using the Makerbot Replicator 2× and Hyrel printers. Models were radiographically scanned using the same protocol as the cadaveric bones. Four images from each cadaveric CT series and four corresponding images from the model CT series were selected, and voxel values were normalized to black or white. Scan slices were compared using PixelDiff software. Gross anatomic structures were evaluated in the model scans by four board-certified otolaryngologists on a 4-point scale. The mean pixel difference between the cadaver and model scans was 14.25 ± 2.30% at the four selected CT slices. The mean cortical bone width difference and mean external auditory canal width difference were 0.58 ± 0.66 mm and 0.55 ± 0.46 mm, respectively. Expert raters felt the mastoid air cells were well represented (2.5 ± 0.5), while middle ear and otic capsule structures were not accurately rendered. The printed models thus appear suitable for training residents in cortical mastoidectomies, but less effective for middle ear procedures.
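
    The mean pixel difference metric above amounts to binarizing corresponding CT slices and counting disagreeing pixels. The study used PixelDiff software; the following is only an equivalent back-of-the-envelope sketch in NumPy with hypothetical image arrays:

```python
import numpy as np

def percent_pixel_difference(slice_a, slice_b, threshold=0.5):
    """Binarize two co-registered CT slices (intensities scaled to 0-1) and
    report the percentage of pixels whose black/white classification differs."""
    return np.mean((slice_a > threshold) != (slice_b > threshold)) * 100

# Hypothetical 512x512 slices standing in for the cadaver and printed-model scans.
rng = np.random.default_rng(7)
cadaver = rng.random((512, 512))
model = np.clip(cadaver + rng.normal(0, 0.15, cadaver.shape), 0, 1)

print(f"pixel difference: {percent_pixel_difference(cadaver, model):.1f}%")
```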

  20. Method development and validation of potent pyrimidine derivative by UV-VIS spectrophotometer.

    Science.gov (United States)

    Chaudhary, Anshu; Singh, Anoop; Verma, Prabhakar Kumar

    2014-12-01

    A rapid and sensitive ultraviolet-visible (UV-VIS) spectroscopic method was developed for the estimation of the pyrimidine derivative 6-Bromo-3-(6-(2,6-dichlorophenyl)-2-(morpolinomethylamino)pyrimidin-4-yl)-2H-chromen-2-one (BT10M) in bulk form. The pyrimidine derivative was monitored at 275 nm with UV detection, and there is no interference from diluents at 275 nm. The method was found to be linear in the range of 50 to 150 μg/mL. The accuracy and precision were determined and validated statistically. The method was validated in accordance with guideline requirements. The results showed that the proposed method is suitable for the accurate, precise, and rapid determination of the pyrimidine derivative. Graphical Abstract: Method development and validation of a potent pyrimidine derivative by UV spectroscopy.

  1. A long-term validation of the modernised DC-ARC-OES solid-sample method.

    Science.gov (United States)

    Flórián, K; Hassler, J; Förster, O

    2001-12-01

    The validation procedure based on the ISO 17025 standard has been used to study and illustrate both the long-term stability of the calibration process of the DC-ARC solid sample spectrometric method and the main validation criteria of the method. In calculating the validation characteristics that depend on linearity (calibration), the fulfilment of prerequisite criteria such as normality and homoscedasticity was also checked. In order to decide whether there are any trends in the time-variation of the analytical signal, the Neumann trend test was also applied and evaluated. Finally, a comparison with similar validation data of the ETV-ICP-OES method was carried out.
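
    The Neumann trend test mentioned above is usually taken to be the von Neumann ratio of the mean square successive difference to the variance, with values well below 2 suggesting a drift. Since the paper does not give its exact computation, the following is a generic sketch with invented signal values:

```python
import numpy as np

def von_neumann_ratio(signal):
    """Ratio of the mean square successive difference to the sample variance.
    Values near 2 indicate no trend; markedly smaller values suggest drift."""
    x = np.asarray(signal, dtype=float)
    mssd = np.mean(np.diff(x) ** 2)
    return mssd / np.var(x, ddof=1)

# Hypothetical calibration-signal intensities recorded on successive days.
intensities = [101.2, 100.8, 101.5, 100.9, 101.1, 100.7, 101.3, 101.0]
print(f"von Neumann ratio: {von_neumann_ratio(intensities):.2f}")
```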

  2. Comparing the Validity of Non-Invasive Methods in Measuring Thoracic Kyphosis and Lumbar Lordosis

    Directory of Open Access Journals (Sweden)

    Mohammad Yousefi

    2012-04-01

    Full Text Available Background: the purpose of this article is to study the validity of each of the non-invasive methods (flexible ruler, spinal mouse, and image processing) against X-ray radiography (the reference method) and to compare them with each other. Materials and Methods: to evaluate the validity of each of these non-invasive methods, the thoracic kyphosis and lumbar lordosis angles of 20 students of Birjand University (mean age and standard deviation: 26±2 years, weight: 72±2.5 kg, height: 169±5.5 cm) were measured through four methods: flexible ruler, spinal mouse, image processing and X-ray. Results: the results indicated that the validity of the flexible ruler, spinal mouse, and image processing methods in measuring the thoracic kyphosis and lumbar lordosis angles showed agreement of 0.81, 0.87, 0.73, 0.76, 0.83 and 0.89, respectively (p>0.05). As a result, given the validity obtained against the gold-standard X-ray method, it can be stated that the three non-invasive methods have adequate validity. In addition, the one-way analysis of variance test indicated that there was a significant relationship between the three methods of measuring thoracic kyphosis and lumbar lordosis, and, with respect to the Tukey test result, the image processing method is the most precise one. Conclusion: as a result, this method could be used along with the other non-invasive methods as a valid measurement method.

  3. Reliability and validity of the AutoCAD software method in lumbar lordosis measurement.

    Science.gov (United States)

    Letafatkar, Amir; Amirsasan, Ramin; Abdolvahabi, Zahra; Hadadnezhad, Malihe

    2011-12-01

    The aim of this study was to determine the reliability and validity of the AutoCAD software method in lumbar lordosis measurement. Fifty healthy volunteers with a mean age of 23 ± 1.80 years were enrolled. A lumbar lateral radiograph was taken of all participants, and the lordosis was measured according to the Cobb method. Afterward, the lumbar lordosis degree was measured via the AutoCAD software and flexible ruler methods. The current study was accomplished in 2 parts: intratester and intertester evaluations of reliability, as well as the validity of the flexible ruler and software methods. Based on the intraclass correlation coefficient, AutoCAD's reliability and validity in measuring lumbar lordosis were 0.984 and 0.962, respectively. AutoCAD was shown to be a reliable and valid method to measure lordosis. It is suggested that this method may replace those that are costly and involve health risks, such as radiography, in evaluating lumbar lordosis.

  4. Increased efficacy for in-house validation of real-time PCR GMO detection methods.

    Science.gov (United States)

    Scholtens, I M J; Kok, E J; Hougs, L; Molenaar, B; Thissen, J T N M; van der Voet, H

    2010-03-01

    To improve the efficacy of the in-house validation of GMO detection methods (DNA isolation and real-time PCR, polymerase chain reaction), a study was performed to gain insight into the contribution of the different steps of the GMO detection method to the repeatability and in-house reproducibility. In the present study, 19 methods for (GM) soy, maize, canola and potato were validated in-house, of which 14 on the basis of an 8-day validation scheme using eight different samples and five on the basis of a more concise validation protocol. In this way, data were obtained with respect to the detection limit, accuracy and precision. Also, decision limits were calculated for declaring non-conformance (>0.9%) with 95% reliability. In order to estimate the contribution of the different steps in the GMO analysis to the total variation, variance components were estimated using REML (residual maximum likelihood method). From these components, relative standard deviations for repeatability and reproducibility (RSD(r) and RSD(R)) were calculated. The results showed that not only the PCR reaction but also the factors 'DNA isolation' and 'PCR day' contribute substantially to the total variance and should therefore be included in the in-house validation. It is proposed to use a statistical model to estimate these factors from a large dataset of initial validations so that, for similar GMO methods in the future, only the PCR step needs to be validated. The resulting data are discussed in the light of agreed European criteria for qualified GMO detection methods.
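
    Once the variance components for the PCR step, DNA isolation and PCR day have been estimated (for example by REML), the repeatability and in-house reproducibility RSDs follow by combining them. The sketch below shows only that final combination with invented component values, not the study's estimates:

```python
import math

# Hypothetical variance components (on the measured GM-percentage scale),
# as they might come out of a REML fit; the values are illustrative only.
var_pcr = 0.004            # residual (PCR replicate) variance
var_dna_isolation = 0.003  # between-DNA-isolation variance
var_pcr_day = 0.002        # between-PCR-day variance
mean_gm_pct = 0.9          # mean measured GM content, %

sd_r = math.sqrt(var_pcr)                                    # repeatability
sd_R = math.sqrt(var_pcr + var_dna_isolation + var_pcr_day)  # in-house reproducibility

print(f"RSD_r = {sd_r / mean_gm_pct * 100:.1f}%")
print(f"RSD_R = {sd_R / mean_gm_pct * 100:.1f}%")
```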

  5. Validity and Interrater Reliability of the Visual Quarter-Waste Method for Assessing Food Waste in Middle School and High School Cafeteria Settings.

    Science.gov (United States)

    Getts, Katherine M; Quinn, Emilee L; Johnson, Donna B; Otten, Jennifer J

    2017-11-01

    Measuring food waste (ie, plate waste) in school cafeterias is an important tool to evaluate the effectiveness of school nutrition policies and interventions aimed at increasing consumption of healthier meals. Visual assessment methods are frequently applied in plate waste studies because they are more convenient than weighing. The visual quarter-waste method has become a common tool in studies of school meal waste and consumption, but previous studies of its validity and reliability have used correlation coefficients, which measure association but not necessarily agreement. The aims of this study were to determine, using a statistic measuring interrater agreement, whether the visual quarter-waste method is valid and reliable for assessing food waste in a school cafeteria setting when compared with the gold standard of weighed plate waste. To evaluate validity, researchers used the visual quarter-waste method and weighed food waste from 748 trays at four middle schools and five high schools in one school district in Washington State during May 2014. To assess interrater reliability, researcher pairs independently assessed 59 of the same trays using the visual quarter-waste method. Both validity and reliability were assessed using a weighted κ coefficient. For validity, as compared with the measured weight, 45% of foods assessed using the visual quarter-waste method were in almost perfect agreement, 42% of foods were in substantial agreement, 10% were in moderate agreement, and 3% were in slight agreement. For interrater reliability between pairs of visual assessors, 46% of foods were in perfect agreement, 31% were in almost perfect agreement, 15% were in substantial agreement, and 8% were in moderate agreement. These results suggest that the visual quarter-waste method is a valid and reliable tool for measuring plate waste in school cafeteria settings. Copyright © 2017 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
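
    Weighted kappa for agreement between ordinal quarter-waste categories can be computed directly with scikit-learn; the example below uses invented category codes rather than the study's tray data, and linear weights are an assumption.

```python
from sklearn.metrics import cohen_kappa_score

# Quarter-waste categories coded 0-4 (0 = none wasted ... 4 = all wasted).
visual_rater = [0, 1, 2, 4, 3, 0, 1, 2, 2, 4, 0, 3]
weighed_ref  = [0, 1, 2, 4, 4, 0, 1, 1, 2, 4, 0, 3]

# Linear weights penalize disagreements in proportion to their ordinal distance.
kappa = cohen_kappa_score(visual_rater, weighed_ref, weights="linear")
print(f"weighted kappa: {kappa:.2f}")
```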

  6. Session-RPE Method for Training Load Monitoring: Validity, Ecological Usefulness, and Influencing Factors

    Directory of Open Access Journals (Sweden)

    Monoem Haddad

    2017-11-01

    Purpose: The aim of this review is to (1) retrieve all data validating the session rating of perceived exertion (RPE) method using various criteria, (2) highlight the rationale of this method and its ecological usefulness, and (3) describe factors that can alter RPE and that users of this method should take into consideration. Method: SPORTDiscus, PubMed, and Google Scholar databases were searched for English-language publications from 2001 to 2016 on the validity and usefulness of the session-RPE method. Studies were considered for further analysis when they used the session-RPE method proposed by Foster et al. in 2001. Participants were athletes of any gender, age, or level of competition. Studies in languages other than English were excluded from the analysis of the validity and reliability of the session-RPE method. Other studies were examined to explain the rationale of the session-RPE method and the origin of RPE. Results: A total of 950 studies cited the Foster et al. study that proposed the session-RPE method. Thirty-six studies have examined the validity and reliability of this proposed method using the modified CR-10 scale. Conclusion: These studies confirmed the validity, good reliability and internal consistency of the session-RPE method in several sports and physical activities, with men and women of different age categories (children, adolescents, and adults) and various expertise levels. This method could be used as a stand-alone method for training load (TL) monitoring, though some authors recommend combining it with other physiological parameters such as heart rate.
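
    For context, the training load quantified by the session-RPE method is simply the CR-10 rating multiplied by the session duration in minutes; the sketch below computes session and weekly loads for hypothetical training sessions.

```python
# Session-RPE training load (Foster et al., 2001):
# TL (arbitrary units) = session RPE on the CR-10 scale x duration (minutes).
sessions = [
    {"day": "Mon", "rpe": 6, "duration_min": 60},
    {"day": "Wed", "rpe": 4, "duration_min": 45},
    {"day": "Fri", "rpe": 8, "duration_min": 75},
]

loads = [s["rpe"] * s["duration_min"] for s in sessions]
for s, load in zip(sessions, loads):
    print(f"{s['day']}: TL = {load} AU")
print("weekly training load =", sum(loads), "AU")
```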

  7. MCNP HPGe detector benchmark with previously validated Cyltran model.

    Science.gov (United States)

    Hau, I D; Russ, W R; Bronson, F

    2009-05-01

    An exact copy of the detector model generated for Cyltran was reproduced as an MCNP input file, and the detection efficiency was calculated following the methodology used in previous experimental measurements and simulation of a 280 cm³ HPGe detector. Below 1000 keV the MCNP data agreed with the Cyltran results within 0.5%, while above this energy the difference between MCNP and Cyltran increased to about 6% at 4800 keV, depending on the electron cut-off energy.

  8. Laboratory diagnostic methods, system of quality and validation

    Directory of Open Access Journals (Sweden)

    Ašanin Ružica

    2005-01-01

    Full Text Available It is known that laboratory investigations secure safe and reliable results that provide a final confirmation of the quality of work. Ideas, planning, knowledge, skills, experience, and environment, along with good laboratory practice, quality control and reliability of quality, make the area of biological investigations very complex. In recent years, quality control, including the control of work in the laboratory, is based on international standards and is used at that level. The implementation of widely recognized international standards, such as the International Standard ISO/IEC 17025 (1 and the implementing of the quality system series ISO/IEC 9000 (2 have become the imperative on the grounds of which laboratories have a formal, visible and corresponding system of quality. The diagnostic methods that are used must constantly yield results which identify the animal as positive or negative, and the precise status of the animal is determined with a predefined degree of statistical significance. Methods applied on a selected population reduce the risk of obtaining falsely positive or falsely negative results. A condition for this are well conceived and documented methods, with the application of the corresponding reagents, and work with professional and skilled staff. This process requires also a consistent implementation of the most rigorous experimental plans, epidemiological and statistical data and estimations, with constant monitoring of the validity of the applied methods. Such an approach is necessary in order to cut down the number of misconceptions and accidental mistakes, for a referent population of animals on which the validity of a method is tested. Once a valid method is included in daily routine investigations, it is necessary to apply constant monitoring for the purpose of internal quality control, in order adequately to evaluate its reproducibility and reliability. Consequently, it is necessary at least twice yearly to conduct

  9. Validated RP-HPLC Method for Quantification of Phenolic ...

    African Journals Online (AJOL)

    Purpose: To evaluate the total phenolic content and antioxidant potential of the methanol extracts of aerial parts and roots of Thymus sipyleus Boiss and also to determine some phenolic compounds using a newly developed and validated reversed phase high performance liquid chromatography (RP-HPLC) method.

  10. Development and Validation of a Bioanalytical Method for Direct ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a user-friendly spiked plasma method for the extraction of diclofenac potassium that reduces the number of treatments of the plasma sample, in order to minimize human error. Method: Instead of the solvent evaporation technique, the spiked plasma sample was modified with H2SO4 and NaCl, ...

  11. A website evaluation model by integration of previous evaluation models using a quantitative approach

    Directory of Open Access Journals (Sweden)

    Ali Moeini

    2015-01-01

    With the growth of e-commerce, websites play an essential role in business success, and many authors have proposed website evaluation models since 1995. However, the multiplicity and diversity of evaluation models make it difficult to integrate them into a single comprehensive model. In this paper, a quantitative method is used to integrate previous models into a comprehensive model that is compatible with them. In this approach the researcher's judgment plays no role in the integration of the models, and the new model derives its validity from 93 previous models and a systematic quantitative approach.

  12. Rationale and methods of the European Food Consumption Validation (EFCOVAL) Project

    NARCIS (Netherlands)

    Boer, de E.J.; Slimani, N.; Boeing, H.; Feinberg, M.; Leclerq, C.; Trolle, E.; Amiano, P.; Andersen, L.F.; Freisling, H.; Geelen, A.; Harttig, U.; Huybrechts, I.; Kaic-Rak, A.; Lafay, L.; Lillegaard, I.T.L.; Ruprich, J.; Vries, de J.H.M.; Ocke, M.C.

    2011-01-01

    Background/Objectives: The overall objective of the European Food Consumption Validation (EFCOVAL) Project was to further develop and validate a trans-European food consumption method to be used for the evaluation of the intake of foods, nutrients and potentially hazardous chemicals within the

  13. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    Science.gov (United States)

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics such as accuracy and precision are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternate "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow for assessing method validity, taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority of the methods are not designed to protect the risk of accepting unsuitable methods, thus having the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, selection of either the generalized pivotal quantity or β-content (0.9) method for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current
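
    As a concrete point of reference for the "fit for purpose" criteria discussed here, the sketch below checks whether a two-sided beta-content tolerance interval (Howe's textbook approximation) falls within acceptance limits. The recovery data, the 90%/90% levels and the limits are hypothetical, and this approximation is offered only for illustration, not as the authors' generalized pivotal quantity procedure.

```python
# Rough sketch of a beta-content tolerance-interval acceptance check
# (Howe's approximation for a normal sample).  All numbers are hypothetical.
import numpy as np
from scipy import stats

recoveries = np.array([98.1, 101.3, 99.4, 100.8, 97.6, 102.0, 99.9, 100.4])  # % recovery
beta, gamma = 0.90, 0.90                  # content proportion and confidence level
lower_limit, upper_limit = 90.0, 110.0    # hypothetical acceptance limits (%)

n = recoveries.size
mean, sd = recoveries.mean(), recoveries.std(ddof=1)
z = stats.norm.ppf((1 + beta) / 2)
chi2_low = stats.chi2.ppf(1 - gamma, n - 1)
k = np.sqrt((n - 1) * (1 + 1 / n) * z**2 / chi2_low)   # Howe's two-sided factor

low, high = mean - k * sd, mean + k * sd
print(f"beta-content interval: [{low:.1f}, {high:.1f}] %")
print("method accepted" if (low >= lower_limit and high <= upper_limit) else "method rejected")
```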

  14. A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation.

    Science.gov (United States)

    Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf

    2017-07-01

    This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes' inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of the evidence for a trace specimen, e.g. a fingermark, and a reference specimen, e.g. a fingerprint, to originate from common or different sources. Some theoretical aspects of probabilities necessary for this Guideline were discussed prior to its elaboration, which started after a workshop of forensic researchers and practitioners involved in this topic. In the workshop, the following questions were addressed: "which aspects of a forensic evaluation scenario need to be validated?", "what is the role of the LR as part of a decision process?" and "how to deal with uncertainty in the LR calculation?". The question "what to validate?" focuses on the validation methods and criteria, and "how to validate?" deals with the implementation of the validation protocol. Answers to these questions were deemed necessary with several objectives. First, concepts typical for validation standards [1], such as performance characteristics, performance metrics and validation criteria, will be adapted or applied by analogy to the LR framework. Second, a validation strategy will be defined. Third, validation methods will be described. Finally, a validation protocol and an example of a validation report will be proposed, which can be applied to the forensic fields developing and validating LR methods for the evaluation of the strength of evidence at source level under the following propositions. Copyright © 2016. Published by Elsevier B.V.
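
    One performance metric often reported when validating LR methods of this kind is the log-likelihood-ratio cost (Cllr); the snippet below computes it for hypothetical same-source and different-source LR values. The metric choice and the numbers are illustrative and are not prescribed by the Guideline itself.

```python
# Illustrative computation of the log-likelihood-ratio cost (Cllr),
# a common performance metric for validated LR methods.  Hypothetical LRs.
import numpy as np

lr_same_source = np.array([120.0, 35.0, 8.0, 400.0, 15.0])   # trace and reference from the same source
lr_diff_source = np.array([0.02, 0.5, 0.08, 1.2, 0.01])      # trace and reference from different sources

cllr = 0.5 * (np.mean(np.log2(1 + 1 / lr_same_source)) +
              np.mean(np.log2(1 + lr_diff_source)))
print(f"Cllr = {cllr:.3f}  (lower is better; about 1.0 is uninformative)")
```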

  15. Methods for Geometric Data Validation of 3d City Models

    Science.gov (United States)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data have a far wider range of aspects which influence their quality, and the idea of quality itself is application-dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point would be to have correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closure of the bounding linear ring and planarity. On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
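
    As a concrete example of one polygon-level check mentioned above (planarity within a tolerance), the sketch below fits the best plane through a polygon's vertices by singular value decomposition and flags the polygon when any vertex deviates by more than the tolerance. The tolerance value and vertex coordinates are assumptions, and this is not the CityDoctor implementation.

```python
# Sketch of a polygon planarity check: fit the best plane through the
# vertices (SVD) and require every vertex to lie within a tolerance of it.
import numpy as np

def is_planar(vertices: np.ndarray, tol: float = 0.01) -> bool:
    """vertices: (n, 3) array of polygon corner points; tol in metres."""
    centered = vertices - vertices.mean(axis=0)
    # The right-singular vector for the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    distances = np.abs(centered @ normal)
    return bool(distances.max() <= tol)

# Hypothetical roof polygon with one vertex pushed 5 cm out of plane.
poly = np.array([[0.0, 0.0, 10.00],
                 [5.0, 0.0, 10.00],
                 [5.0, 4.0, 10.05],
                 [0.0, 4.0, 10.00]])
print("planar within 1 cm:", is_planar(poly, tol=0.01))
```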

  16. Content validity of methods to assess malnutrition in cancer patients: a systematic review

    NARCIS (Netherlands)

    Sealy, Martine; Nijholt, Willemke; Stuiver, M.M.; van der Berg, M.M.; Ottery, Faith D.; van der Schans, Cees; Roodenburg, Jan L N; Jager-Wittenaar, Harriët

    Rationale: Inadequate operationalisation of the multidimensional concept of malnutrition may result in inadequate evaluation of nutritional status. In this review we aimed to assess content validity of methods

  17. Development and Validation of a Liquid Chromatographic Method ...

    African Journals Online (AJOL)

    A liquid chromatographic method for the simultaneous determination of six human immunodeficiency virus (HIV) protease inhibitors, indinavir, saquinavir, ritonavir, amprenavir, nelfinavir and lopinavir, was developed and validated. Optimal separation was achieved on a PLRP-S 100 Å, 250 x 4.6 mm I.D. column maintained ...

  18. Inter-laboratory validation of an inexpensive streamlined method to measure inorganic arsenic in rice grain.

    Science.gov (United States)

    Chaney, Rufus L; Green, Carrie E; Lehotay, Steven J

    2018-05-04

    With the establishment by CODEX of a 200 ng/g limit of inorganic arsenic (iAs) in polished rice grain, more analyses of iAs will be necessary to ensure compliance in regulatory and trade applications, to assess quality control in commercial rice production, and to conduct research involving iAs in rice crops. Although analytical methods using high-performance liquid chromatography-inductively coupled plasma-mass spectrometry (HPLC-ICP-MS) have been demonstrated for full speciation of As, this expensive and time-consuming approach is excessive when regulations are based only on iAs. We report a streamlined sample preparation and analysis of iAs in powdered rice based on heated extraction with 0.28 M HNO3 followed by hydride generation (HG) under control of acidity and other simple conditions. Analysis of iAs is then conducted using flow-injection HG and inexpensive ICP-atomic emission spectroscopy (AES) or other detection means. A key innovation compared with previous methods was to increase the acidity of the reagent solution with 4 M HCl (prior to reduction of As5+ to As3+), which minimized interferences from dimethylarsinic acid. An inter-laboratory method validation was conducted among 12 laboratories worldwide in the analysis of six shared blind duplicates and a NIST Standard Reference Material involving different types of rice and iAs levels. Also, four laboratories used the standard HPLC-ICP-MS method to analyze the samples. The results between the methods were not significantly different, and the Horwitz ratio averaged 0.52 for the new method, which meets official method validation criteria. Thus, the simpler, more versatile, and less expensive method may be used by laboratories for several purposes to accurately determine iAs in rice grain. Graphical abstract: Comparison of iAs results from new and FDA methods.
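
    For orientation, the Horwitz ratio (HorRat) cited in this record compares the observed reproducibility RSD with the RSD predicted by the Horwitz equation; a minimal sketch follows, with the concentration and observed RSD chosen only to illustrate how a value near 0.52 can arise, not taken from the study.

```python
# Minimal sketch of the Horwitz ratio (HorRat):
# HorRat = observed RSD_R / predicted RSD_R, where the Horwitz equation
# predicts RSD_R(%) = 2 * C**(-0.1505) for C expressed as a mass fraction.
def horrat(observed_rsd_percent: float, conc_ng_per_g: float) -> float:
    c_mass_fraction = conc_ng_per_g * 1e-9          # ng/g -> g/g
    predicted_rsd = 2.0 * c_mass_fraction ** -0.1505
    return observed_rsd_percent / predicted_rsd

# Hypothetical example: iAs at 150 ng/g with an observed reproducibility RSD of 11%.
print(f"HorRat = {horrat(11.0, 150.0):.2f}")   # roughly 0.5; values of ~0.5-2 are usually acceptable
```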

  19. Empirical Performance of Cross-Validation With Oracle Methods in a Genomics Context.

    Science.gov (United States)

    Martinez, Josue G; Carroll, Raymond J; Müller, Samuel; Sampson, Joshua N; Chatterjee, Nilanjan

    2011-11-01

    When employing model selection methods with oracle properties such as the smoothly clipped absolute deviation (SCAD) and the Adaptive Lasso, it is typical to estimate the smoothing parameter by m-fold cross-validation, for example, m = 10. In problems where the true regression function is sparse and the signals large, such cross-validation typically works well. However, in regression modeling of genomic studies involving Single Nucleotide Polymorphisms (SNP), the true regression functions, while thought to be sparse, do not have large signals. We demonstrate empirically that in such problems, the number of selected variables using SCAD and the Adaptive Lasso, with 10-fold cross-validation, is a random variable that has considerable and surprising variation. Similar remarks apply to non-oracle methods such as the Lasso. Our study strongly questions the suitability of performing only a single run of m-fold cross-validation with any oracle method, and not just the SCAD and Adaptive Lasso.
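
    The variability described in this record is easy to reproduce: repeating 10-fold cross-validated Lasso fits on the same weak-signal data with different fold assignments yields noticeably different numbers of selected variables. The simulated data, dimensions and effect sizes below are illustrative only.

```python
# Illustration of the instability described above: the number of variables
# selected by 10-fold cross-validated Lasso changes with the CV fold split,
# even though the data stay fixed (sparse, weak, SNP-like signals).
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
n, p = 200, 500
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:10] = 0.2                      # 10 weak true signals
y = X @ beta + rng.standard_normal(n)

for seed in range(5):
    folds = KFold(n_splits=10, shuffle=True, random_state=seed)
    model = LassoCV(cv=folds).fit(X, y)
    print(f"fold split {seed}: {int((model.coef_ != 0).sum())} variables selected")
```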

  20. Development and validation of a HPLC method for standardization of herbal and commercial extracts of Myrcia uniflora

    Directory of Open Access Journals (Sweden)

    Andrea N. de L. Batista

    2011-06-01

    Myrcia uniflora Barb. Rodr., Myrtaceae, popularly known as "pedra-hume-caá" in Brazil, is sold as dry extracts in capsules or as tinctures for the treatment of diabetes mellitus. Previous phytochemical studies on this species described the occurrence of the flavonoids mearnsitrin and myricitrin. In the present study, the chromatographic profiles of M. uniflora leaves and commercial extracts were determined using HPLC-PAD. Myricitrin was used as an external standard in the development and validation of the HPLC method. The proposed method is simple, rapid and reliable and can be successfully applied in industry for standardization of herbs and phytomedicines commercialised in Brazil as "pedra-hume-caá".

  1. Development and validation of a HPLC method for standardization of herbal and commercial extracts of Myrcia uniflora

    Directory of Open Access Journals (Sweden)

    Andrea N. de L. Batista

    2011-04-01

    Myrcia uniflora Barb. Rodr., Myrtaceae, popularly known as "pedra-hume-caá" in Brazil, is sold as dry extracts in capsules or as tinctures for the treatment of diabetes mellitus. Previous phytochemical studies on this species described the occurrence of the flavonoids mearnsitrin and myricitrin. In the present study, the chromatographic profiles of M. uniflora leaves and commercial extracts were determined using HPLC-PAD. Myricitrin was used as an external standard in the development and validation of the HPLC method. The proposed method is simple, rapid and reliable and can be successfully applied in industry for standardization of herbs and phytomedicines commercialised in Brazil as "pedra-hume-caá".

  2. New clinical validation method for automated sphygmomanometer: a proposal by Japan ISO-WG for sphygmomanometer standard.

    Science.gov (United States)

    Shirasaki, Osamu; Asou, Yosuke; Takahashi, Yukio

    2007-12-01

    Owing to fast or stepwise cuff deflation, or measuring at places other than the upper arm, the clinical accuracy of most recent automated sphygmomanometers (auto-BPMs) cannot be validated by one-arm simultaneous comparison, which would be the only accurate validation method based on auscultation. Two main alternative methods are provided by current standards, that is, two-arm simultaneous comparison (method 1) and one-arm sequential comparison (method 2); however, the accuracy of these validation methods might not be sufficient to compensate for the suspicious accuracy in lateral blood pressure (BP) differences (LD) and/or BP variations (BPV) between the device and reference readings. Thus, the Japan ISO-WG for sphygmomanometer standards has been studying a new method that might improve validation accuracy (method 3). The purpose of this study is to determine the appropriateness of method 3 by comparing immunity to LD and BPV with those of the current validation methods (methods 1 and 2). The validation accuracy of the above three methods was assessed in human participants [N=120, 45+/-15.3 years (mean+/-SD)]. An oscillometric automated monitor, Omron HEM-762, was used as the tested device. When compared with the others, methods 1 and 3 showed a smaller intra-individual standard deviation of device error (SD1), suggesting their higher reproducibility of validation. The SD1 by method 2 (P=0.004) significantly correlated with the participant's BP, supporting our hypothesis that the increased SD of device error by method 2 is at least partially caused by essential BPV. Method 3 showed a significantly (P=0.0044) smaller interparticipant SD of device error (SD2), suggesting its higher interparticipant consistency of validation. Among the methods of validation of the clinical accuracy of auto-BPMs, method 3, which showed the highest reproducibility and highest interparticipant consistency, can be proposed as being the most appropriate.

  3. Determination of vitamin C in foods: current state of method validation.

    Science.gov (United States)

    Spínola, Vítor; Llorent-Martínez, Eulogio J; Castilho, Paula C

    2014-11-21

    Vitamin C is one of the most important vitamins, so reliable information about its content in foodstuffs is a concern to both consumers and quality control agencies. However, the heterogeneity of food matrixes and the potential degradation of this vitamin during its analysis create enormous challenges. This review addresses the development and validation of high-performance liquid chromatography methods for vitamin C analysis in food commodities, during the period 2000-2014. The main characteristics of vitamin C are mentioned, along with the strategies adopted by most authors during sample preparation (freezing and acidification) to avoid vitamin oxidation. After that, the advantages and drawbacks of different analytical methods are discussed. Finally, the main aspects concerning method validation for vitamin C analysis are critically discussed. Parameters such as selectivity, linearity, limit of quantification, and accuracy were studied by most authors. Recovery experiments during accuracy evaluation were in general satisfactory, with usual values between 81 and 109%. However, few methods considered vitamin C stability during the analytical process, and the study of precision was not always clear or complete. Potential future improvements regarding proper method validation are indicated to conclude this review. Copyright © 2014. Published by Elsevier B.V.

  4. Modeling, implementation, and validation of arterial travel time reliability.

    Science.gov (United States)

    2013-11-01

    Previous research funded by Florida Department of Transportation (FDOT) developed a method for estimating travel time reliability for arterials. This method was not initially implemented or validated using field data. This project evaluated and r...

  5. [Data validation methods and discussion on Chinese materia medica resource survey].

    Science.gov (United States)

    Zhang, Yue; Ma, Wei-Feng; Zhang, Xiao-Bo; Zhu, Shou-Dong; Guo, Lan-Ping; Wang, Xing-Xing

    2013-07-01

    From the beginning of the fourth national survey of the Chinese materia medica resources, pilots have been conducted in 22 provinces. The survey teams have reported an immense amount of data, which places very high demands on the construction of the database system. In order to ensure quality, it is necessary to check and validate the data in the database system. Data validation is an important method to ensure the validity, integrity and accuracy of census data. This paper comprehensively introduces the data validation system of the fourth national survey of the Chinese materia medica resources database system, and further improves the design idea and programs of data validation. The purpose of this study is to promote the smooth progress of the survey work.

  6. VALUE - Validating and Integrating Downscaling Methods for Climate Change Research

    Science.gov (United States)

    Maraun, Douglas; Widmann, Martin; Benestad, Rasmus; Kotlarski, Sven; Huth, Radan; Hertig, Elke; Wibig, Joanna; Gutierrez, Jose

    2013-04-01

    Our understanding of global climate change is mainly based on General Circulation Models (GCMs) with a relatively coarse resolution. Since climate change impacts are mainly experienced on regional scales, high-resolution climate change scenarios need to be derived from GCM simulations by downscaling. Several projects have been carried out over the last years to validate the performance of statistical and dynamical downscaling, yet several aspects have not been systematically addressed: variability on sub-daily, decadal and longer time-scales, extreme events, spatial variability and inter-variable relationships. Different downscaling approaches such as dynamical downscaling, statistical downscaling and bias correction approaches have not been systematically compared. Furthermore, collaboration between different communities, in particular regional climate modellers, statistical downscalers and statisticians has been limited. To address these gaps, the EU Cooperation in Science and Technology (COST) action VALUE (www.value-cost.eu) has been brought into life. VALUE is a research network with participants from currently 23 European countries running from 2012 to 2015. Its main aim is to systematically validate and develop downscaling methods for climate change research in order to improve regional climate change scenarios for use in climate impact studies. Inspired by the co-design idea of the international research initiative "future earth", stakeholders of climate change information have been involved in the definition of research questions to be addressed and are actively participating in the network. The key idea of VALUE is to identify the relevant weather and climate characteristics required as input for a wide range of impact models and to define an open framework to systematically validate these characteristics. Based on a range of benchmark data sets, in principle every downscaling method can be validated and compared with competing methods. The results of

  7. Pre-validation methods for developing a patient reported outcome instrument

    Directory of Open Access Journals (Sweden)

    Castillo Mayret M

    2011-08-01

    Background: Measures that reflect patients' assessment of their health are of increasing importance as outcome measures in randomised controlled trials. The methodological approach used in the pre-validation development of new instruments (item generation, item reduction and question formatting) should be robust and transparent. The totality of the content of existing PRO instruments for a specific condition provides a valuable resource (pool of items) that can be utilised to develop new instruments. Such 'top down' approaches are common, but the explicit pre-validation methods are often poorly reported. This paper presents a systematic and generalisable 5-step pre-validation PRO instrument methodology. Methods: The method is illustrated using the example of the Aberdeen Glaucoma Questionnaire (AGQ). The five steps are: 1) generation of a pool of items; 2) item de-duplication (three phases); 3) item reduction (two phases); 4) assessment of the remaining items' content coverage against a pre-existing theoretical framework appropriate to the objectives of the instrument and the target population (e.g. ICF); and 5) qualitative exploration of the target population's views of the new instrument and the items it contains. Results: The AGQ 'item pool' contained 725 items. Three de-duplication phases resulted in reductions of 91, 225 and 48 items respectively. The item reduction phases discarded 70 items and 208 items respectively. The draft AGQ contained 83 items with good content coverage. The qualitative exploration ('think aloud' study) resulted in removal of a further 15 items and refinement of the wording of others. The resultant draft AGQ contained 68 items. Conclusions: This study presents a novel methodology for developing a PRO instrument, based on three sources: literature reporting what is important to patients; a theoretically coherent framework; and patients' experience of completing the instrument. By systematically accounting for all items dropped

  8. Impurities in biogas - validation of analytical methods for siloxanes; Foeroreningar i biogas - validering av analysmetodik foer siloxaner

    Energy Technology Data Exchange (ETDEWEB)

    Arrhenius, Karine; Magnusson, Bertil; Sahlin, Eskil [SP Technical Research Institute of Sweden, Boraas (Sweden)

    2011-11-15

    Biogas produced from digesters or landfills contains impurities which can be harmful for components that will be in contact with the biogas during its utilization. Among these, the siloxanes are often mentioned. During combustion, siloxanes are converted to silicon dioxide, which accumulates on the heated surfaces of combustion equipment. Silicon dioxide is a solid compound and will remain in the engine and cause damage. Consequently, it is necessary to develop methods for the accurate determination of these compounds in biogases. In the first part of this report, a method for analysis of siloxanes in biogases was validated. The sampling was performed directly at the plant by drawing a small volume of biogas onto an adsorbent tube over a short period of time. These tubes were subsequently sent to the laboratory for analysis. The purpose of method validation is to demonstrate that the established method is fit for the purpose. This means that the method, as used by the laboratory generating the data, will provide data that meet a set of criteria concerning precision and accuracy. At the end, the uncertainty of the method was calculated. In the second part of this report, the validated method was applied to real samples collected in waste water treatment plants, co-digestion plants and plants digesting other wastes (agricultural waste). Results are presented at the end of this report. As expected, the biogases from waste water treatment plants contained considerably higher concentrations of siloxanes than biogases from co-digestion plants and plants digesting agricultural wastes. The concentration of siloxanes in upgraded biogas was low, regardless of which feedstock was digested and which upgrading technique was used.

  9. Validation of MIMGO: a method to identify differentially expressed GO terms in a microarray dataset

    Directory of Open Access Journals (Sweden)

    Yamada Yoichi

    2012-12-01

    Background: We previously proposed an algorithm for the identification of GO terms that commonly annotate genes whose expression is upregulated or downregulated in some microarray data compared with other microarray data. We call these "differentially expressed GO terms" and have named the algorithm "matrix-assisted identification method of differentially expressed GO terms" (MIMGO). MIMGO can also identify microarray data in which genes annotated with a differentially expressed GO term are upregulated or downregulated. However, MIMGO has not yet been validated on a real microarray dataset using all available GO terms. Findings: We combined Gene Set Enrichment Analysis (GSEA) with MIMGO to identify differentially expressed GO terms in a yeast cell cycle microarray dataset. GSEA followed by MIMGO (GSEA + MIMGO) correctly identified differentially expressed GO terms (p < ...). Conclusions: MIMGO is a reliable method to identify differentially expressed GO terms comprehensively.

  10. A new method to model electroconvulsive therapy in rats with increased construct validity and enhanced translational value.

    Science.gov (United States)

    Theilmann, Wiebke; Löscher, Wolfgang; Socala, Katarzyna; Frieling, Helge; Bleich, Stefan; Brandt, Claudia

    2014-06-01

    Electroconvulsive therapy is the most effective therapy for major depressive disorder (MDD). The remission rate is above 50% in previously pharmacoresistant patients, but the mechanisms of action are not fully understood. Electroconvulsive stimulation (ECS) in rodents mimics antidepressant electroconvulsive therapy (ECT) in humans and is widely used to investigate the underlying mechanisms of ECT. For the translational value of findings in animal models, it is essential to establish models with the highest construct, face and predictive validity possible. The commonly used model for ECT in rodents does not meet the demand for high construct validity. For ECT, cortical surface electrodes are used to induce therapeutic seizures, whereas ECS in rodents is exclusively performed with auricular or corneal electrodes. However, the stimulation site has a major impact on the type and spread of the induced seizure activity and its antidepressant effect. We propose a method in which ECS is performed with screw electrodes placed above the motor cortex of rats to closely simulate the clinical situation and thereby increase the construct validity of the model. Cortical ECS in rats reliably induced seizures comparable to human ECT. Cortical ECS was more effective than auricular ECS in reducing immobility in the forced swim test. Importantly, auricular stimulation had a negative influence on the general health condition of the rats, with signs of fear during the stimulation sessions. These results suggest that auricular ECS in rats is not a suitable ECT model. Cortical ECS in rats promises to be a valid method to mimic ECT. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. A chromatographic method validation to quantify tablets Mephenesine of national production

    International Nuclear Information System (INIS)

    Suarez Perez, Yania; Izquierdo Castro, Adalberto; Milian Sanchez, Jana Daria

    2009-01-01

    The authors validated an analytical method based on high-performance liquid chromatography (HPLC) for the quantification of Mephenesine in recently reformulated 500 mg tablets. With regard to its application to quality control, validation included the following parameters: linearity, accuracy, precision, and selectivity. Results were satisfactory within the 50-150% range. With a view to its use in subsequent chemical stability studies, the stability-indicating selectivity and the sensitivity were also assessed. The estimated detection and quantification limits were appropriate, and the method was selective with respect to the possible degradation products. (Author)

  12. Development and validation of a reversed phase liquid chromatographic method for analysis of oxytetracycline and related impurities.

    Science.gov (United States)

    Kahsay, Getu; Shraim, Fairouz; Villatte, Philippe; Rotger, Jacques; Cassus-Coussère, Céline; Van Schepdael, Ann; Hoogmartens, Jos; Adams, Erwin

    2013-03-05

    A simple, robust and fast high-performance liquid chromatographic method is described for the analysis of oxytetracycline and its related impurities. The principal peak and impurities are all baseline separated in 20 min using an Inertsil C₈ (150 mm × 4.6 mm, 5 μm) column kept at 50 °C. The mobile phase consists of a gradient mixture of mobile phases A (0.05% trifluoroacetic acid in water) and B (acetonitrile-methanol-tetrahydrofuran, 80:15:5, v/v/v) pumped at a flow rate of 1.3 ml/min. UV detection was performed at 254 nm. The developed method was validated for its robustness, sensitivity, precision and linearity in the range from limit of quantification (LOQ) to 120%. The limits of detection (LOD) and LOQ were found to be 0.08 μg/ml and 0.32 μg/ml, respectively. This method allows the separation of oxytetracycline from all known and 5 unknown impurities, which is better than previously reported in the literature. Moreover, the simple mobile phase composition devoid of non-volatile buffers made the method suitable to interface with mass spectrometry for further characterization of unknown impurities. The developed method has been applied for determination of related substances in oxytetracycline bulk samples available from four manufacturers. The validation results demonstrate that the method is reliable for quantification of oxytetracycline and its impurities. Copyright © 2012 Elsevier B.V. All rights reserved.

  13. A method for acetylcholinesterase staining of brain sections previously processed for receptor autoradiography.

    Science.gov (United States)

    Lim, M M; Hammock, E A D; Young, L J

    2004-02-01

    Receptor autoradiography using selective radiolabeled ligands allows visualization of brain receptor distribution and density on film. The resolution of specific brain regions on the film often can be difficult to discern owing to the general spread of the radioactive label and the lack of neuroanatomical landmarks on film. Receptor binding is a chemically harsh protocol that can render the tissue virtually unstainable by Nissl and other conventional stains used to delineate neuroanatomical boundaries of brain regions. We describe a method for acetylcholinesterase (AChE) staining of slides previously processed for receptor binding. AChE staining is a useful tool for delineating major brain nuclei and tracts. AChE staining on sections that have been processed for receptor autoradiography provides a direct comparison of brain regions for more precise neuroanatomical description. We report a detailed thiocholine protocol that is a modification of the Koelle-Friedenwald method to amplify the AChE signal in brain sections previously processed for autoradiography. We also describe several temporal and experimental factors that can affect the density and clarity of the AChE signal when using this protocol.

  14. Reliability and validity of non-radiographic methods of thoracic kyphosis measurement: a systematic review.

    Science.gov (United States)

    Barrett, Eva; McCreesh, Karen; Lewis, Jeremy

    2014-02-01

    A wide array of instruments are available for non-invasive thoracic kyphosis measurement. Guidelines for selecting outcome measures for use in clinical and research practice recommend that properties such as validity and reliability are considered. This systematic review reports on the reliability and validity of non-invasive methods for measuring thoracic kyphosis. A systematic search of 11 electronic databases located studies assessing reliability and/or validity of non-invasive thoracic kyphosis measurement techniques. Two independent reviewers used a critical appraisal tool to assess the quality of retrieved studies. Data was extracted by the primary reviewer. The results were synthesized qualitatively using a level of evidence approach. 27 studies satisfied the eligibility criteria and were included in the review. The reliability, validity and both reliability and validity were investigated by sixteen, two and nine studies respectively. 17/27 studies were deemed to be of high quality. In total, 15 methods of thoracic kyphosis were evaluated in retrieved studies. All investigated methods showed high (ICC ≥ .7) to very high (ICC ≥ .9) levels of reliability. The validity of the methods ranged from low to very high. The strongest levels of evidence for reliability exists in support of the Debrunner kyphometer, Spinal Mouse and Flexicurve index, and for validity supports the arcometer and Flexicurve index. Further reliability and validity studies are required to strengthen the level of evidence for the remaining methods of measurement. This should be addressed by future research. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. List of new names and new combinations previously effectively, but not validly, published.

    Science.gov (United States)

    2008-09-01

    The purpose of this announcement is to effect the valid publication of the following effectively published new names and new combinations under the procedure described in the Bacteriological Code (1990 Revision). Authors and other individuals wishing to have new names and/or combinations included in future lists should send three copies of the pertinent reprint or photocopies thereof, or an electronic copy of the published paper, to the IJSEM Editorial Office for confirmation that all of the other requirements for valid publication have been met. It is also a requirement of IJSEM and the ICSP that authors of new species, new subspecies and new combinations provide evidence that types are deposited in two recognized culture collections in two different countries (i.e. documents certifying deposition and availability of type strains). It should be noted that the date of valid publication of these new names and combinations is the date of publication of this list, not the date of the original publication of the names and combinations. The authors of the new names and combinations are as given below, and these authors' names will be included in the author index of the present issue and in the volume author index. Inclusion of a name on these lists validates the publication of the name and thereby makes it available in bacteriological nomenclature. The inclusion of a name on this list is not to be construed as taxonomic acceptance of the taxon to which the name is applied. Indeed, some of these names may, in time, be shown to be synonyms, or the organisms may be transferred to another genus, thus necessitating the creation of a new combination.

  16. Specification and Preliminary Validation of IAT (Integrated Analysis Techniques) Methods: Executive Summary.

    Science.gov (United States)

    1985-03-01

    conceptual framework, and preliminary validation of IAT concepts. Planned work for FY85, including more extensive validation, is also described. Approach: 1) Identify needs and requirements for IAT. 2) Develop the IAT conceptual framework. 3) Validate IAT methods. 4) Develop applications materials.

  17. Robustness study in SSNTD method validation: indoor radon quality

    Energy Technology Data Exchange (ETDEWEB)

    Dias, D.C.S.; Silva, N.C.; Bonifácio, R.L., E-mail: danilacdias@gmail.com [Comissao Nacional de Energia Nuclear (LAPOC/CNEN), Pocos de Caldas, MG (Brazil). Laboratorio de Pocos de Caldas

    2017-07-01

    Quality control practices are indispensable to organizations aiming to reach analytical excellence. Method validation is an essential component to quality systems in laboratories, serving as a powerful tool for standardization and reliability of outcomes. This paper presents a study of robustness conducted over a SSNTD technique validation process, with the goal of developing indoor radon measurements at the highest level of quality. This quality parameter indicates how well a technique is able to provide reliable results in face of unexpected variations along the measurement. In this robustness study, based on the Youden method, 7 analytical conditions pertaining to different phases of the SSNTD technique (with focus on detector etching) were selected. Based on the ideal values for each condition as reference, extreme levels regarded as high and low were prescribed to each condition. A partial factorial design of 8 unique etching procedures was defined, where each presented their own set of high and low condition values. The Youden test provided 8 indoor radon concentration results, which allowed percentage estimations that indicate the potential influence of each analytical condition on the SSNTD technique. As expected, detector etching factors such as etching solution concentration, temperature and immersion time were identified as the most critical parameters to the technique. Detector etching is a critical step in the SSNTD method – one that must be carefully designed during validation and meticulously controlled throughout the entire process. (author)
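
    A sketch of the kind of fractional-factorial layout used in a Youden robustness study: seven conditions are varied over eight runs (a 2**(7-4) design), and each condition's effect is the difference between the mean results at its high and low levels. The factor names, levels and responses below are hypothetical, not the laboratory's actual etching conditions or radon results.

```python
# Youden-type robustness sketch: 7 conditions in 8 runs from a 2**(7-4)
# fractional factorial (D=AB, E=AC, F=BC, G=ABC); the effect of each
# condition is mean(result at high level) - mean(result at low level).
import itertools
import numpy as np

factors = ["etch conc", "etch temp", "etch time", "pre-soak",
           "rinse", "reader gain", "operator"]

runs = []
for a, b, c in itertools.product((1, -1), repeat=3):
    runs.append([a, b, c, a * b, a * c, b * c, a * b * c])
design = np.array(runs)       # (8, 7) matrix of +1/-1 levels with orthogonal columns

# Hypothetical indoor radon results (Bq/m3) from the 8 etching procedures.
results = np.array([102.0, 98.5, 99.8, 105.2, 97.1, 101.4, 100.3, 103.6])

for name, column in zip(factors, design.T):
    effect = results[column == 1].mean() - results[column == -1].mean()
    print(f"{name:12s} effect = {effect:+.2f} Bq/m3")
```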

  18. Robustness study in SSNTD method validation: indoor radon quality

    International Nuclear Information System (INIS)

    Dias, D.C.S.; Silva, N.C.; Bonifácio, R.L.

    2017-01-01

    Quality control practices are indispensable to organizations aiming to reach analytical excellence. Method validation is an essential component to quality systems in laboratories, serving as a powerful tool for standardization and reliability of outcomes. This paper presents a study of robustness conducted over a SSNTD technique validation process, with the goal of developing indoor radon measurements at the highest level of quality. This quality parameter indicates how well a technique is able to provide reliable results in face of unexpected variations along the measurement. In this robustness study, based on the Youden method, 7 analytical conditions pertaining to different phases of the SSNTD technique (with focus on detector etching) were selected. Based on the ideal values for each condition as reference, extreme levels regarded as high and low were prescribed to each condition. A partial factorial design of 8 unique etching procedures was defined, where each presented their own set of high and low condition values. The Youden test provided 8 indoor radon concentration results, which allowed percentage estimations that indicate the potential influence of each analytical condition on the SSNTD technique. As expected, detector etching factors such as etching solution concentration, temperature and immersion time were identified as the most critical parameters to the technique. Detector etching is a critical step in the SSNTD method – one that must be carefully designed during validation and meticulously controlled throughout the entire process. (author)

  19. Estimating misclassification error: a closer look at cross-validation based methods

    Directory of Open Access Journals (Sweden)

    Ounpraseuth Songthip

    2012-11-01

    Background: To estimate a classifier's error in predicting future observations, bootstrap methods have been proposed as reduced-variation alternatives to traditional cross-validation (CV) methods based on sampling without replacement. Monte Carlo (MC) simulation studies aimed at estimating the true misclassification error conditional on the training set are commonly used to compare CV methods. We conducted an MC simulation study to compare a new method of bootstrap CV (BCV) to k-fold CV for estimating classification error. Findings: For the low-dimensional conditions simulated, the modest positive bias of k-fold CV contrasted sharply with the substantial negative bias of the new BCV method. This behavior was corroborated using a real-world dataset of prognostic gene-expression profiles in breast cancer patients. Our simulation results demonstrate some extreme characteristics of variance and bias that can occur due to a fault in the design of CV exercises aimed at estimating the true conditional error of a classifier, and that appear not to have been fully appreciated in previous studies. Although CV is a sound practice for estimating a classifier's generalization error, using CV to estimate the fixed misclassification error of a trained classifier conditional on the training set is problematic. While MC simulation of this estimation exercise can correctly represent the average bias of a classifier, it will overstate the between-run variance of the bias. Conclusions: We recommend k-fold CV over the new BCV method for estimating a classifier's generalization error. The extreme negative bias of BCV is too high a price to pay for its reduced variance.

  20. Validation of a rapid, non-radioactive method to quantify internalisation of G-protein coupled receptors.

    Science.gov (United States)

    Jongsma, Maikel; Florczyk, Urszula M; Hendriks-Balk, Mariëlle C; Michel, Martin C; Peters, Stephan L M; Alewijnse, Astrid E

    2007-07-01

    Agonist exposure can cause internalisation of G-protein coupled receptors (GPCRs), which may be a part of desensitisation but also of cellular signaling. Previous methods to study internalisation have been tedious or only poorly quantitative. Therefore, we have developed and validated a quantitative method using a sphingosine-1-phosphate (S1P) receptor as a model. Because of a lack of suitable binding studies, it has been difficult to study S1P receptor internalisation. Using a N-terminal HisG-tag, S1P(1) receptors on the cell membrane can be visualised via immunocytochemistry with a specific anti-HisG antibody. S1P-induced internalisation was concentration dependent and was quantified using a microplate reader, detecting either absorbance, a fluorescent or luminescent signal, depending on the antibodies used. Among those, the fluorescence detection method was the most convenient to use. The relative ease of this method makes it suitable to measure a large number of data points, e.g. to compare the potency and efficacy of receptor ligands.

  1. A method to validate quantitative high-frequency power doppler ultrasound with fluorescence in vivo video microscopy.

    Science.gov (United States)

    Pinter, Stephen Z; Kim, Dae-Ro; Hague, M Nicole; Chambers, Ann F; MacDonald, Ian C; Lacefield, James C

    2014-08-01

    Flow quantification with high-frequency (>20 MHz) power Doppler ultrasound can be performed objectively using the wall-filter selection curve (WFSC) method to select the cutoff velocity that yields a best-estimate color pixel density (CPD). An in vivo video microscopy system (IVVM) is combined with high-frequency power Doppler ultrasound to provide a method for validation of CPD measurements based on WFSCs in mouse testicular vessels. The ultrasound and IVVM systems are instrumented so that the mouse remains on the same imaging platform when switching between the two modalities. In vivo video microscopy provides gold-standard measurements of vascular diameter to validate power Doppler CPD estimates. Measurements in four image planes from three mice exhibit wide variation in the optimal cutoff velocity and indicate that a predetermined cutoff velocity setting can introduce significant errors in studies intended to quantify vascularity. Consistent with previously published flow-phantom data, in vivo WFSCs exhibited three characteristic regions and detectable plateaus. Selection of a cutoff velocity at the right end of the plateau yielded a CPD close to the gold-standard vascular volume fraction estimated using IVVM. An investigator can implement the WFSC method to help adapt cutoff velocity to current blood flow conditions and thereby improve the accuracy of power Doppler for quantitative microvascular imaging. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  2. Validity and reliability of a method for assessment of cervical vertebral maturation.

    Science.gov (United States)

    Zhao, Xiao-Guang; Lin, Jiuxiang; Jiang, Jiu-Hui; Wang, Qingzhu; Ng, Sut Hong

    2012-03-01

    To evaluate the validity and reliability of the cervical vertebral maturation (CVM) method with a longitudinal sample. Eighty-six cephalograms from 18 subjects (5 males and 13 females) were selected from the longitudinal database. Total mandibular length was measured on each film; an increased rate served as the gold standard in examination of the validity of the CVM method. Eleven orthodontists, after receiving intensive training in the CVM method, evaluated all films twice. Kendall's W and the weighted kappa statistic were employed. Kendall's W values were higher than 0.8 at both times, indicating strong interobserver reproducibility, but interobserver agreement was documented twice at less than 50%. A wide range of intraobserver agreement was noted (40.7%-79.1%), and substantial intraobserver reproducibility was proved by kappa values (0.53-0.86). With regard to validity, moderate agreement was reported between the gold standard and observer staging at the initial time (kappa values 0.44-0.61). However, agreement seemed to be unacceptable for clinical use, especially in cervical stage 3 (26.8%). Even though the validity and reliability of the CVM method proved statistically acceptable, we suggest that many other growth indicators should be taken into consideration in evaluating adolescent skeletal maturation.

  3. Update of Standard Practices for New Method Validation in Forensic Toxicology.

    Science.gov (United States)

    Wille, Sarah M R; Coucke, Wim; De Baere, Thierry; Peters, Frank T

    2017-01-01

    International agreement concerning validation guidelines is important to obtain quality forensic bioanalytical research and routine applications, as it all starts with the reporting of reliable analytical data. Standards for fundamental validation parameters are provided in guidelines such as those from the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), the German-speaking Gesellschaft fur Toxikologie und Forensische Chemie (GTFCH) and the Scientific Working Group of Forensic Toxicology (SWGTOX). These validation parameters include selectivity, matrix effects, method limits, calibration, accuracy and stability, as well as other parameters such as carryover, dilution integrity and incurred sample reanalysis. It is, however, not easy for laboratories to implement these guidelines in practice, as these international guidelines remain nonbinding protocols that depend on the applied analytical technique and need to be updated according to the analyst's method requirements and the application type. In this manuscript, a review of the current guidelines and literature concerning bioanalytical validation parameters in a forensic context is given and discussed. In addition, suggestions for the experimental set-up, the pros and cons of statistical approaches and adequate acceptance criteria for the validation of bioanalytical applications are given. Copyright © Bentham Science Publishers.

  4. Triangulation, Respondent Validation, and Democratic Participation in Mixed Methods Research

    Science.gov (United States)

    Torrance, Harry

    2012-01-01

    Over the past 10 years or so the "Field" of "Mixed Methods Research" (MMR) has increasingly been exerting itself as something separate, novel, and significant, with some advocates claiming paradigmatic status. Triangulation is an important component of mixed methods designs. Triangulation has its origins in attempts to validate research findings…

  5. Validity of a Manual Soft Tissue Profile Prediction Method Following Mandibular Setback Osteotomy

    OpenAIRE

    Kolokitha, Olga-Elpis

    2007-01-01

    Objectives The aim of this study was to determine the validity of a manual cephalometric method used for predicting the post-operative soft tissue profiles of patients who underwent mandibular setback surgery and compare it to a computerized cephalometric prediction method (Dentofacial Planner). Lateral cephalograms of 18 adults with mandibular prognathism taken at the end of pre-surgical orthodontics and approximately one year after surgery were used. Methods To test the validity of the manu...

  6. The Validity of Dimensional Regularization Method on Fractal Spacetime

    Directory of Open Access Journals (Sweden)

    Yong Tao

    2013-01-01

    Svozil developed a regularization method for quantum field theory on fractal spacetime (1987). Such a method can be applied to the low-order perturbative renormalization of quantum electrodynamics but will depend on a conjectural integral formula on non-integer-dimensional topological spaces. The main purpose of this paper is to construct a fractal measure so as to guarantee the validity of the conjectural integral formula.

  7. Validating MEDIQUAL Constructs

    Science.gov (United States)

    Lee, Sang-Gun; Min, Jae H.

    In this paper, we validate MEDIQUAL constructs across different media users in a help desk service. In previous research, only two end-user constructs were used: assurance and responsiveness. In this paper, we extend the MEDIQUAL constructs to include reliability, empathy, assurance, tangibles, and responsiveness, which are based on the SERVQUAL theory. The results suggest that: 1) the five MEDIQUAL constructs are validated through factor analysis, that is, measures of the same construct obtained with different methods show relatively high correlations, while measures of constructs that are expected to differ show low correlations; and 2) regression analysis shows that the five MEDIQUAL constructs have statistically significant effects on media users' satisfaction with the help desk service.

  8. Validation of analytical method to quality control and the stability study of 0.025 % eyedrops Ketotiphen

    International Nuclear Information System (INIS)

    Troche Concepcion, Yenilen; Romero Diaz, Jacqueline Aylema; Garcia Penna, Caridad M

    2010-01-01

    The Ketotiphen eyedrop is prescribed to relieve the signs and symptoms of allergic conjunctivitis due to its potent H1 antihistaminic effect, showing some ability to inhibit the release of histamine and other mediators in cases of mastocytosis. The aim of the present paper was to develop and validate a high-performance liquid chromatography analytical method for the quality control and stability studies of 0.025 % Ketotiphen eyedrops. The method was based on separation of the active principle on a Lichrosorb RP-18 column (5 μm, 250 x 4 mm) with UV detection at 296 nm, using a mobile phase consisting of a degassed methanol:phosphate-buffer mixture (75:25; pH 8.5) with 1 mL of isopropanol added per 1 000 mL of that mixture, at a flow rate of 1.2 mL/min. The analytical method was linear, precise, specific and accurate over the concentrations studied

  9. Development and Validation of a RP-HPLC Method for the ...

    African Journals Online (AJOL)

    Development and Validation of a RP-HPLC Method for the Simultaneous Determination of Rifampicin and a Flavonoid Glycoside - A Novel ... range, accuracy, precision, limit of detection, limit of quantification, robustness and specificity.

  10. Validation parameters of instrumental method for determination of total bacterial count in milk

    Directory of Open Access Journals (Sweden)

    Nataša Mikulec

    2004-10-01

    Flow cytometry, as a rapid, instrumental and routine microbiological method, is used for determination of the total bacterial count in milk. The results of flow cytometry are expressed as an individual bacterial cell count. Problems regarding the interpretation of total bacterial count results can be avoided by transforming the results of the flow cytometry method onto the scale of the reference method (HRN ISO 6610:2001). The method of flow cytometry, like any analytical method, requires validation and verification according to the HRN EN ISO/IEC 17025:2000 standard. This paper describes the validation parameters accuracy, precision, specificity, range, robustness and measurement uncertainty for the flow cytometry method.
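
    The transformation of flow cytometry results onto the reference-method scale is usually a regression fitted to paired calibration data. The sketch below is a minimal illustration of that idea in Python, using hypothetical paired counts (not data from the study) and the common log-log linear fit.

```python
import numpy as np

# Hypothetical paired calibration data: individual bacterial counts (IBC/mL)
# from flow cytometry and colony-forming units (CFU/mL) from the reference
# plate-count method for the same raw-milk samples.
ibc = np.array([2.1e4, 8.5e4, 3.0e5, 9.7e5, 4.2e6, 1.1e7])
cfu = np.array([6.0e3, 2.4e4, 9.1e4, 2.8e5, 1.3e6, 3.5e6])

# Fit the usual log10-log10 conversion line: log10(CFU) = a * log10(IBC) + b
a, b = np.polyfit(np.log10(ibc), np.log10(cfu), deg=1)

def ibc_to_cfu(ibc_value):
    """Transform a flow-cytometry result onto the reference-method scale."""
    return 10 ** (a * np.log10(ibc_value) + b)

print(f"slope = {a:.3f}, intercept = {b:.3f}")
print(f"IBC 5.0e5/mL -> approx. {ibc_to_cfu(5.0e5):.3e} CFU/mL")
```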

  11. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software.

    Science.gov (United States)

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2015-05-01

    Several sophisticated methods of footprint analysis currently exist. However, it is sometimes useful to apply standard measurement methods of recognized evidence with an easy and quick application. We sought to assess the reliability and validity of a new method of footprint assessment in a healthy population using Photoshop CS5 software (Adobe Systems Inc, San Jose, California). Forty-two footprints, corresponding to 21 healthy individuals (11 men with a mean ± SD age of 20.45 ± 2.16 years and 10 women with a mean ± SD age of 20.00 ± 1.70 years) were analyzed. Footprints were recorded in static bipedal standing position using optical podography and digital photography. Three trials for each participant were performed. The Hernández-Corvo, Chippaux-Smirak, and Staheli indices and the Clarke angle were calculated by manual method and by computerized method using Photoshop CS5 software. Test-retest was used to determine reliability. Validity was obtained by intraclass correlation coefficient (ICC). The reliability test for all of the indices showed high values (ICC, 0.98-0.99). Moreover, the validity test clearly showed no difference between techniques (ICC, 0.99-1). The reliability and validity of a method to measure, assess, and record the podometric indices using Photoshop CS5 software has been demonstrated. This provides a quick and accurate tool useful for the digital recording of morphostatic foot study parameters and their control.

  12. Method Development and Validation for the Determination of Caffeine: An Alkaloid from Coffea arabica by High-performance Liquid Chromatography Method.

    Science.gov (United States)

    Naveen, P; Lingaraju, H B; Deepak, M; Medhini, B; Prasad, K Shyam

    2018-01-01

    The present study was conducted to develop and validate a reversed-phase high-performance liquid chromatography method for the determination of caffeine from bean material of Coffea arabica. The separation was achieved on a reversed-phase C18 column using a mobile phase composed of water:methanol (50:50) at a flow rate of 1.0 mL min-1. The detection was carried out on a UV detector at 272 nm. The developed method was validated according to the requirements of the International Conference on Harmonisation (ICH) guidelines, which include specificity, linearity, precision, accuracy, limit of detection and limit of quantitation. The developed method shows good linearity with an excellent correlation coefficient (R2 > 0.999). In repeatability and intermediate precision, the percentage relative standard deviation (% RSD) of the peak area was less than 1%, showing high precision of the method. The recovery rate for caffeine was within 98.78% - 101.28%, indicating high accuracy of the method. The low limit of detection and limit of quantitation of caffeine enable the detection and quantitation of caffeine from C. arabica at low concentrations. The developed HPLC method is simple, rapid, precise, accurate and widely applicable, and it is recommended for efficient assays in routine work. A simple, accurate, and sensitive high-performance liquid chromatography (HPLC) method for caffeine from Coffea arabica has been developed and validated. The developed HPLC method was validated for linearity, specificity, precision, recovery, limits of detection, and limits of quantification according to the International Conference on Harmonization guidelines. The results revealed that the proposed method is highly reliable. This method could be successfully applied for routine quality control work. Abbreviations used: C. arabica: Coffea arabica, ICH: International Conference on Harmonisation, % RSD: Percentage Relative Standard Deviation, R2: Correlation Coefficient, ppm: Parts per million, LOD: Limits
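
    The ICH-style figures of merit reported here (linearity, %RSD, recovery) are simple to compute from raw calibration and quality-control data. The sketch below shows those calculations in Python on hypothetical values; the concentrations and peak areas are invented for illustration, not taken from the study.

```python
import numpy as np

# Hypothetical calibration and QC data for an HPLC assay (ICH-style checks).
conc = np.array([5, 10, 20, 40, 80, 160])          # standard concentrations, ppm
area = np.array([61, 119, 243, 482, 975, 1936])    # peak areas (arbitrary units)

# Linearity: least-squares fit and coefficient of determination R^2
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

# Precision: %RSD of replicate injections of one standard
replicates = np.array([242.1, 243.5, 241.8, 244.0, 242.9, 243.3])
rsd = 100 * replicates.std(ddof=1) / replicates.mean()

# Accuracy: recovery of spiked samples (found vs. added amount)
added = np.array([20.0, 40.0, 80.0])
found = np.array([19.8, 40.5, 79.4])
recovery = 100 * found / added

print(f"R^2 = {r2:.4f}, %RSD = {rsd:.2f}%, recovery = {np.round(recovery, 2)}%")
```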

  13. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    Science.gov (United States)

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
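
    The POD model treats the probability of a positive result as a continuous function of concentration. As a rough illustration of that idea (not the published POD software or its exact estimator), the sketch below fits a simple logistic curve to hypothetical qualitative data; the spike levels and positive counts are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical qualitative-method data: spike level (CFU/test portion) and
# number of positive results out of n replicate test portions per level.
levels    = np.array([0.5, 1, 2, 4, 8, 16])
positives = np.array([1, 3, 6, 9, 11, 12])
n_reps    = 12

# Expand to one row per replicate so an ordinary logistic fit applies.
X = np.log10(np.repeat(levels, n_reps)).reshape(-1, 1)
y = np.concatenate([[1] * p + [0] * (n_reps - p) for p in positives])

model = LogisticRegression().fit(X, y)

def pod(level):
    """Estimated probability of detection at a given spike level."""
    return model.predict_proba(np.log10([[level]]))[0, 1]

for lvl in levels:
    print(f"POD({lvl:>4} CFU) = {pod(lvl):.2f}")
```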

  14. Validation of ultraviolet method to determine serum phosphorus level

    International Nuclear Information System (INIS)

    Garcia Borges, Lisandra; Perez Prieto, Teresa Maria; Valdes Diez, Lilliam

    2009-01-01

    Validation of a spectrophotometric method applicable in clinical laboratories was proposed for the analytical assessment of serum phosphate using a domestically produced UV-phosphorus kit from the 'Carlos J. Finlay' Biologics Production Enterprise (Havana, Cuba). The analytical method was based on the reaction of phosphorus with ammonium molybdate at acid pH to form a complex measurable at 340 nm. Specificity and precision were evaluated, along with the method's robustness, linearity, accuracy and sensitivity. The analytical method was linear up to 4.8 mmol/L and precise over the concentration interval of clinical interest, where no matrix interferences were found. The detection limit was 0.037 mmol/L and the quantification limit 0.13 mmol/L, both satisfactory for the intended use of the product
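
    Detection and quantification limits such as those reported here are commonly estimated from the calibration curve as 3.3σ/S and 10σ/S (the ICH approach). The sketch below illustrates that calculation on hypothetical calibration data; it is not the study's own data or software.

```python
import numpy as np

# Hypothetical calibration data for a UV method (absorbance vs. phosphate, mmol/L).
conc = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 4.8])
absorbance = np.array([0.102, 0.205, 0.410, 0.612, 0.818, 0.983])

slope, intercept = np.polyfit(conc, absorbance, 1)
residuals = absorbance - (slope * conc + intercept)
sigma = residuals.std(ddof=2)   # residual standard deviation of the regression

# ICH-style estimates based on the calibration curve
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD = {lod:.3f} mmol/L, LOQ = {loq:.3f} mmol/L")
```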

  15. An extended validation of the last generation of particle finite element method for free surface flows

    Science.gov (United States)

    Gimenez, Juan M.; González, Leo M.

    2015-03-01

    In this paper, a new generation of the particle method known as the Particle Finite Element Method (PFEM), which combines convective particle movement and a fixed mesh resolution, is applied to free surface flows. This interesting variant, previously described in the literature as PFEM-2, is able to use larger time steps when compared to other similar numerical tools, which implies shorter computational times while maintaining the accuracy of the computation. PFEM-2 has already been extended to free surface problems, and the main topic of this paper is a thorough validation of this methodology for a wider range of flows. To accomplish this task, different improved versions of discontinuous and continuous enriched basis functions for the pressure field have been developed to capture the free surface dynamics without artificial diffusion or undesired numerical effects when different density ratios are involved. A collection of problems has been carefully selected such that a wide variety of Froude numbers, density ratios and dominant dissipative cases are reported, with the intention of presenting a general methodology, not restricted to a particular range of parameters, and capable of using large time steps. The results of the different free-surface problems solved, which include the Rayleigh-Taylor instability, sloshing problems, viscous standing waves and the dam break problem, are compared to well-validated numerical alternatives or experimental measurements, obtaining accurate approximations for such complex flows.

  16. A Method of Retrospective Computerized System Validation for Drug Manufacturing Software Considering Modifications

    Science.gov (United States)

    Takahashi, Masakazu; Fukue, Yoshinori

    This paper proposes a Retrospective Computerized System Validation (RCSV) method for Drug Manufacturing Software (DMSW) that takes software modification into account. Because DMSW used for quality management and facility control has a large impact on drug quality, regulatory agencies require proof of the adequacy of DMSW functions and performance based on development documents and test results. In particular, the work that demonstrates the adequacy of previously developed DMSW based on existing documents and operational records is called RCSV. When modifying DMSW that has already undergone RCSV, it was difficult to secure consistency between the development documents and test results for the modified DMSW parts and the existing documents and operational records for the non-modified parts. This made conducting RCSV difficult. In this paper, we propose (a) a definition of the document architecture, (b) a definition of the descriptive items and levels in the documents, (c) management of design information using a database, (d) exhaustive testing, and (e) an integrated RCSV procedure. As a result, we could conduct adequate RCSV while securing consistency.

  17. Critical Values for Lawshe's Content Validity Ratio: Revisiting the Original Methods of Calculation

    Science.gov (United States)

    Ayre, Colin; Scally, Andrew John

    2014-01-01

    The content validity ratio originally proposed by Lawshe is widely used to quantify content validity and yet methods used to calculate the original critical values were never reported. Methods for original calculation of critical values are suggested along with tables of exact binomial probabilities.
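
    The critical values discussed by Ayre and Scally follow from an exact one-tailed binomial test with p = 0.5 on the number of panelists rating an item "essential". As a hedged illustration of that calculation (values should be checked against the published tables), the sketch below computes the smallest statistically significant CVR for a few panel sizes.

```python
from scipy.stats import binom

def cvr_critical(n_panelists, alpha=0.05):
    """
    Smallest content validity ratio that is significant for a one-tailed
    exact binomial test at the given alpha, in the spirit of Ayre and
    Scally's revisited critical values for Lawshe's CVR.
    """
    for n_essential in range(n_panelists + 1):
        # Probability of at least n_essential "essential" ratings by chance (p = 0.5)
        if binom.sf(n_essential - 1, n_panelists, 0.5) <= alpha:
            return (2 * n_essential - n_panelists) / n_panelists
    return None

for n in (5, 8, 10, 15, 20, 40):
    print(f"N = {n:>2}: critical CVR = {cvr_critical(n):.3f}")
```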

  18. Microscale validation of 4-aminoantipyrine test method for quantifying phenolic compounds in microbial culture

    International Nuclear Information System (INIS)

    Justiz Mendoza, Ibrahin; Aguilera Rodriguez, Isabel; Perez Portuondo, Irasema

    2014-01-01

    Validation of microscale test methods is currently of great importance due to their economic and environmental advantages, and it constitutes a prerequisite for performing services and assuring the quality of the results provided to customers. This paper addresses the microscale validation of the 4-aminoantipyrine spectrophotometric method for the quantification of phenolic compounds in culture medium. Linearity, precision, regression, accuracy, detection limit, quantification limit and robustness were evaluated, in addition to a comparison test with a non-standardized method for determining polyphenols (Folin-Ciocalteu). The results showed that both methods are feasible for determining phenols

  19. Validation of the method for investigation of radiopharmaceuticals for in vitro use

    International Nuclear Information System (INIS)

    Vranjes, S; Jovanovic, M.; Orlic, M.; Lazic, E. . E-mail address of corresponding author: sanjav@vin.bg.ac.yu

    2005-01-01

    The aim of this study was to validate an analytical method for determination of the total radioactivity and radioactive concentration of 125 I-triiodothyronine, a radiopharmaceutical for in vitro use. The analytical parameters selectivity, accuracy, linearity and range of this method were determined. The values obtained for all parameters are reasonable for analytical methods; therefore this method could be used for further investigation. (author)

  20. Validation method training: nurses' experiences and ratings of work climate.

    Science.gov (United States)

    Söderlund, Mona; Norberg, Astrid; Hansebo, Görel

    2014-03-01

    Training nursing staff in communication skills can impact on the quality of care for residents with dementia and contributes to nurses' job satisfaction. Changing attitudes and practices takes time and energy and can affect the entire nursing staff, not just the nurses directly involved in a training programme. Therefore, it seems important to study nurses' experiences of a training programme and any influence of the programme on work climate among the entire nursing staff. To explore nurses' experiences of a 1-year validation method training programme conducted in a nursing home for residents with dementia and to describe ratings of work climate before and after the programme. A mixed-methods approach. Twelve nurses participated in the training and were interviewed afterwards. These individual interviews were tape-recorded and transcribed, then analysed using qualitative content analysis. The Creative Climate Questionnaire was administered before (n = 53) and after (n = 56) the programme to the entire nursing staff in the participating nursing home wards and analysed with descriptive statistics. Analysis of the interviews resulted in four categories: being under extra strain, sharing experiences, improving confidence in care situations and feeling uncertain about continuing the validation method. The results of the questionnaire on work climate showed higher mean values in the assessment after the programme had ended. The training strengthened the participating nurses in caring for residents with dementia, but posed an extra strain on them. These nurses also described an extra strain on the entire nursing staff that was not reflected in the results from the questionnaire. The work climate at the nursing home wards might have made it easier to conduct this extensive training programme. Training in the validation method could develop nurses' communication skills and improve their handling of complex care situations. © 2013 Blackwell Publishing Ltd.

  1. Validation of single-sample doubly labeled water method

    International Nuclear Information System (INIS)

    Webster, M.D.; Weathers, W.W.

    1989-01-01

    We have experimentally validated a single-sample variant of the doubly labeled water method for measuring metabolic rate and water turnover in a very small passerine bird, the verdin (Auriparus flaviceps). We measured CO2 production using the Haldane gravimetric technique and compared these values with estimates derived from isotopic data. Doubly labeled water results based on the one-sample calculations differed from Haldane values by less than 0.5% on average (range -8.3 to 11.2%, n = 9). Water flux computed by the single-sample method differed by -1.5% on average from results for the same birds based on the standard, two-sample technique (range -13.7 to 2.0%, n = 9)

  2. JaCVAM-organized international validation study of the in vivo rodent alkaline comet assay for the detection of genotoxic carcinogens: I. Summary of pre-validation study results.

    Science.gov (United States)

    Uno, Yoshifumi; Kojima, Hajime; Omori, Takashi; Corvi, Raffaella; Honma, Masamistu; Schechtman, Leonard M; Tice, Raymond R; Burlinson, Brian; Escobar, Patricia A; Kraynak, Andrew R; Nakagawa, Yuzuki; Nakajima, Madoka; Pant, Kamala; Asano, Norihide; Lovell, David; Morita, Takeshi; Ohno, Yasuo; Hayashi, Makoto

    2015-07-01

    The in vivo rodent alkaline comet assay (comet assay) is used internationally to investigate the in vivo genotoxic potential of test chemicals. This assay, however, has not previously been formally validated. The Japanese Center for the Validation of Alternative Methods (JaCVAM), with the cooperation of the U.S. NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM)/the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), the European Centre for the Validation of Alternative Methods (ECVAM), and the Japanese Environmental Mutagen Society/Mammalian Mutagenesis Study Group (JEMS/MMS), organized an international validation study to evaluate the reliability and relevance of the assay for identifying genotoxic carcinogens, using liver and stomach as target organs. The ultimate goal of this validation effort was to establish an Organisation for Economic Co-operation and Development (OECD) test guideline. The purpose of the pre-validation studies (i.e., Phase 1 through 3), conducted in four or five laboratories with extensive comet assay experience, was to optimize the protocol to be used during the definitive validation study. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Experimental validation of calculation methods for structures having shock non-linearity

    International Nuclear Information System (INIS)

    Brochard, D.; Buland, P.

    1987-01-01

    For the seismic analysis of non-linear structures, numerical methods have been developed which need to be validated against experimental results. The aim of this paper is to present the design method of a test program whose results will be used for this purpose. Some applications to nuclear components illustrate this presentation [fr

  4. Validation of the Gatortail method for accurate sizing of pulmonary vessels from 3D medical images.

    Science.gov (United States)

    O'Dell, Walter G; Gormaley, Anne K; Prida, David A

    2017-12-01

    Detailed characterization of changes in vessel size is crucial for the diagnosis and management of a variety of vascular diseases. Because clinical measurement of vessel size is typically dependent on the radiologist's subjective interpretation of the vessel borders, it is often prone to high inter- and intra-user variability. Automatic methods of vessel sizing have been developed for two-dimensional images but a fully three-dimensional (3D) method suitable for vessel sizing from volumetric X-ray computed tomography (CT) or magnetic resonance imaging has heretofore not been demonstrated and validated robustly. In this paper, we refined and objectively validated Gatortail, a method that creates a mathematical geometric 3D model of each branch in a vascular tree, simulates the appearance of the virtual vascular tree in a 3D CT image, and uses the similarity of the simulated image to a patient's CT scan to drive the optimization of the model parameters, including vessel size, to match that of the patient. The method was validated with a 2-dimensional virtual tree structure under deformation, and with a realistic 3D-printed vascular phantom in which the diameters of 64 branches were manually measured 3 times each. The phantom was then scanned on a conventional clinical CT imaging system and the images processed with the in-house software to automatically segment and mathematically model the vascular tree, label each branch, and perform the Gatortail optimization of branch size and trajectory. Previously proposed methods of vessel sizing using matched Gaussian filters and tubularity metrics were also tested. The Gatortail method was then demonstrated on the pulmonary arterial tree segmented from a human volunteer's CT scan. The standard deviation of the difference between the manually measured and Gatortail-based radii in the 3D physical phantom was 0.074 mm (0.087 in-plane pixel units for image voxels of dimension 0.85 × 0.85 × 1.0 mm) over the 64 branches

  5. Validation of innovative technologies and strategies for regulatory safety assessment methods: challenges and opportunities.

    Science.gov (United States)

    Stokes, William S; Wind, Marilyn

    2010-01-01

    Advances in science and innovative technologies are providing new opportunities to develop test methods and strategies that may improve safety assessments and reduce animal use for safety testing. These include high throughput screening and other approaches that can rapidly measure or predict various molecular, genetic, and cellular perturbations caused by test substances. Integrated testing and decision strategies that consider multiple types of information and data are also being developed. Prior to their use for regulatory decision-making, new methods and strategies must undergo appropriate validation studies to determine the extent that their use can provide equivalent or improved protection compared to existing methods and to determine the extent that reproducible results can be obtained in different laboratories. Comprehensive and optimal validation study designs are expected to expedite the validation and regulatory acceptance of new test methods and strategies that will support improved safety assessments and reduced animal use for regulatory testing.

  6. When Educational Material Is Delivered: A Mixed Methods Content Validation Study of the Information Assessment Method.

    Science.gov (United States)

    Badran, Hani; Pluye, Pierre; Grad, Roland

    2017-03-14

    The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n

  7. Method for Pre-Conditioning a Measured Surface Height Map for Model Validation

    Science.gov (United States)

    Sidick, Erkin

    2012-01-01

    This software allows one to up-sample or down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also eliminating the existing measurement noise and measurement errors. Because the re-sampling of a surface map is accomplished based on the analytical expressions of Zernike-polynomials and a power spectral density model, such re-sampling does not introduce any aliasing and interpolation errors as is done by the conventional interpolation and FFT-based (fast-Fourier-transform-based) spatial-filtering method. Also, this new method automatically eliminates the measurement noise and other measurement errors such as artificial discontinuity. The developmental cycle of an optical system, such as a space telescope, includes, but is not limited to, the following two steps: (1) deriving requirements or specs on the optical quality of individual optics before they are fabricated through optical modeling and simulations, and (2) validating the optical model using the measured surface height maps after all optics are fabricated. There are a number of computational issues related to model validation, one of which is the "pre-conditioning" or pre-processing of the measured surface maps before using them in a model validation software tool. This software addresses the following issues: (1) up- or down-sampling a measured surface map to match it with the gridded data format of a model validation tool, and (2) eliminating the surface measurement noise or measurement errors such that the resulted surface height map is continuous or smoothly-varying. So far, the preferred method used for re-sampling a surface map is two-dimensional interpolation. The main problem of this method is that the same pixel can take different values when the method of interpolation is changed among the different methods such as the "nearest," "linear," "cubic," and "spline" fitting in Matlab. The conventional, FFT-based spatial filtering method used to
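
    The core idea is to fit the measured map with smooth analytical basis functions (Zernike polynomials plus a PSD model in the described software) and then evaluate the fit on whatever grid the model-validation tool expects, which avoids interpolation artifacts and drops pixel-level noise. The sketch below is a simplified, hypothetical illustration using only a handful of low-order Zernike-like terms; it is not the software described in the record.

```python
import numpy as np

def basis(x, y):
    """A few low-order Zernike-like terms in Cartesian form (unit-square coordinates)."""
    r2 = x**2 + y**2
    return np.stack([np.ones_like(x), x, y, 2*r2 - 1, x**2 - y**2, 2*x*y], axis=-1)

def resample(surface, n_out):
    """Fit the measured map with smooth basis functions, then evaluate on a new grid.
    The least-squares fit also discards pixel-level measurement noise."""
    n_in = surface.shape[0]
    xi, yi = np.meshgrid(np.linspace(-1, 1, n_in), np.linspace(-1, 1, n_in))
    A = basis(xi.ravel(), yi.ravel())
    coeffs, *_ = np.linalg.lstsq(A, surface.ravel(), rcond=None)
    xo, yo = np.meshgrid(np.linspace(-1, 1, n_out), np.linspace(-1, 1, n_out))
    return basis(xo.ravel(), yo.ravel()) @ coeffs

# Example: a noisy 64x64 "measured" map up-sampled to 256x256 without interpolation artifacts
rng = np.random.default_rng(0)
x, y = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
measured = 0.5*(2*(x**2 + y**2) - 1) + 0.1*(x**2 - y**2) + 0.01*rng.standard_normal(x.shape)
resampled = resample(measured, 256).reshape(256, 256)
print(resampled.shape)
```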

  8. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    Science.gov (United States)

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    2017-01-01

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  9. A Comparison of Three Methods for the Analysis of Skin Flap Viability: Reliability and Validity.

    Science.gov (United States)

    Tim, Carla Roberta; Martignago, Cintia Cristina Santi; da Silva, Viviane Ribeiro; Dos Santos, Estefany Camila Bonfim; Vieira, Fabiana Nascimento; Parizotto, Nivaldo Antonio; Liebano, Richard Eloin

    2018-05-01

    Objective: Technological advances have provided new alternatives to the analysis of skin flap viability in animal models; however, the interrater validity and reliability of these techniques have yet to be analyzed. The present study aimed to evaluate the interrater validity and reliability of three different methods: weight of paper template (WPT), paper template area (PTA), and photographic analysis. Approach: Sixteen male Wistar rats had their cranially based dorsal skin flap elevated. On the seventh postoperative day, the viable tissue area and the necrotic area of the skin flap were recorded using the paper template method and photo image. The evaluation of the percentage of viable tissue was performed using three methods, simultaneously and independently by two raters. The analysis of interrater reliability and validity was performed using the intraclass correlation coefficient, and Bland Altman Plot Analysis was used to visualize the presence or absence of systematic bias in the evaluations of data validity. Results: The results showed that interrater reliability for WPT, measurement of PTA, and photographic analysis were 0.995, 0.990, and 0.982, respectively. For data validity, a correlation >0.90 was observed for all comparisons made between the three methods. In addition, Bland Altman Plot Analysis showed agreement between the comparisons of the methods and the presence of systematic bias was not observed. Innovation: Digital methods are an excellent choice for assessing skin flap viability; moreover, they make data use and storage easier. Conclusion: Independently from the method used, the interrater reliability and validity proved to be excellent for the analysis of skin flaps' viability.
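
    Interrater reliability and agreement of this kind are typically quantified with an intraclass correlation coefficient and Bland-Altman statistics. The sketch below shows one common variant (a one-way random-effects ICC) on hypothetical two-rater viability scores; the data are invented and the study itself may have used a different ICC form.

```python
import numpy as np

# Hypothetical percentage-of-viable-tissue scores from two raters for the same flaps.
rater1 = np.array([62.1, 55.3, 70.4, 48.9, 66.0, 59.7, 73.2, 51.5])
rater2 = np.array([61.4, 56.0, 69.8, 50.1, 65.2, 60.3, 72.5, 52.0])
scores = np.stack([rater1, rater2], axis=1)        # subjects x raters
n, k = scores.shape

# One-way random-effects intraclass correlation, ICC(1,1)
grand_mean = scores.mean()
ms_between = k * ((scores.mean(axis=1) - grand_mean) ** 2).sum() / (n - 1)
ms_within = ((scores - scores.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Bland-Altman agreement statistics
diff = rater1 - rater2
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))

print(f"ICC(1,1) = {icc:.3f}, bias = {bias:.2f}, LoA = ({loa[0]:.2f}, {loa[1]:.2f})")
```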

  10. Kidnapping Detection and Recognition in Previous Unknown Environment

    Directory of Open Access Journals (Sweden)

    Yang Tian

    2017-01-01

    An unperceived event referred to as kidnapping makes the localization estimate incorrect. In a previously unknown environment, an incorrect localization result causes an incorrect mapping result in Simultaneous Localization and Mapping (SLAM) under kidnapping. In this situation, the explored and unexplored areas become divided, which makes kidnapping recovery difficult. To provide sufficient information on kidnapping, a framework to judge whether kidnapping has occurred and to identify the type of kidnapping with filter-based SLAM is proposed. The framework is called double kidnapping detection and recognition (DKDR) and performs two checks, before and after the "update" process, with different metrics in real time. To explain one of the principles of DKDR, we describe a property of filter-based SLAM that corrects the mapping result of the environment using the current observations after the "update" process. Two classical filter-based SLAM algorithms, Extended Kalman Filter (EKF) SLAM and Particle Filter (PF) SLAM, are modified to show that DKDR can be simply and widely applied in existing filter-based SLAM algorithms. Furthermore, a technique to determine adaptive thresholds for the metrics in real time without previous data is presented. Both simulated and experimental results demonstrate the validity and accuracy of the proposed method.
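
    As a rough illustration of the double-check idea (not the authors' exact DKDR metrics), the sketch below gates a measurement residual both before and after a Kalman-style update using chi-square thresholds; all quantities and thresholds are invented for the example.

```python
import numpy as np

def mahalanobis_sq(residual, cov):
    """Squared Mahalanobis distance of a residual given its covariance."""
    return float(residual.T @ np.linalg.inv(cov) @ residual)

def kidnapping_checks(z, z_pred_prior, S_prior, z_pred_post, R, thr_prior, thr_post):
    """
    Two gating checks in the spirit of a double detection scheme: one on the
    innovation before the filter update, one on the measurement residual after
    the update. Both flags raised -> kidnapping is suspected.
    """
    d_prior = mahalanobis_sq(z - z_pred_prior, S_prior)   # innovation gate
    d_post = mahalanobis_sq(z - z_pred_post, R)           # post-update residual gate
    return d_prior > thr_prior, d_post > thr_post

# Toy example with a 2-D range-bearing style measurement
z = np.array([[10.2], [0.45]])
flag_prior, flag_post = kidnapping_checks(
    z,
    z_pred_prior=np.array([[4.0], [1.20]]), S_prior=np.diag([0.5, 0.01]),
    z_pred_post=np.array([[5.1], [1.05]]),  R=np.diag([0.2, 0.005]),
    thr_prior=9.21, thr_post=9.21,          # chi-square gates, 2 dof, ~1% tail
)
print("kidnapping suspected:", flag_prior and flag_post)
```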

  11. ReSOLV: Applying Cryptocurrency Blockchain Methods to Enable Global Cross-Platform Software License Validation

    Directory of Open Access Journals (Sweden)

    Alan Litchfield

    2018-05-01

    This paper presents a method for a decentralised peer-to-peer software license validation system using cryptocurrency blockchain technology to ameliorate software piracy, and to provide a mechanism for software developers to protect copyrighted works. Protecting software copyright has been an issue since the late 1970s and software license validation has been a primary method employed in an attempt to minimise software piracy and protect software copyright. The method described creates an ecosystem in which the rights and privileges of participants are observed.
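
    At its simplest, blockchain-based license validation amounts to publishing a tamper-evident record of each issued license and letting any peer verify a presented license against that ledger. The sketch below is a hypothetical, greatly simplified illustration of that idea (a Python set stands in for the distributed ledger; the product name, licensee and key are invented), not the ReSOLV protocol itself.

```python
import hashlib
import json

def license_record(product_id, licensee, pub_key_hex):
    """Build a license entry the way one might store it in a blockchain transaction."""
    payload = {"product": product_id, "licensee": licensee, "pubkey": pub_key_hex}
    digest = hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()
    return {**payload, "license_id": digest}

def validate(record, ledger):
    """A license is valid if its identifier is present in the (append-only) ledger."""
    return record["license_id"] in ledger

ledger = set()
rec = license_record("app-1.4.2", "alice@example.org", "02ab...f3")
ledger.add(rec["license_id"])          # issuance: developer publishes the hash
print(validate(rec, ledger))           # True: any peer holding the ledger can check
```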

  12. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code is an accurate representation of experimental test data. Imbedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10^13-10^14 neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  13. Validation of methods for the detection and quantification of engineered nanoparticles in food

    DEFF Research Database (Denmark)

    Linsinger, T.P.J.; Chaudhry, Q.; Dehalu, V.

    2013-01-01

    … the methods apply equally well to particles from different suppliers. In trueness testing, information on whether the particle size distribution has changed during analysis is required. Results are largely expected to follow normal distributions due to the expected high number of particles. An approach for the validation of methods for detection and quantification of nanoparticles in food samples is proposed. It proposes validation of identity, selectivity, precision, working range, limit of detection and robustness, bearing in mind that each "result" must include information about the chemical identity…

  14. An Integrated Research Infrastructure for Validating Cyber-Physical Energy Systems

    DEFF Research Database (Denmark)

    Strasser, T. I.; Moyo, C.; Bründlinger, R.

    2017-01-01

    … quality and ensure security of supply. At the same time, the increased availability of advanced automation and communication technologies provides new opportunities for deriving intelligent solutions to tackle these challenges. Previous work has shown various new methods of operating highly interconnected power grids, and their corresponding components, in a more effective way. As a consequence of these developments, the traditional power system is being transformed into a cyber-physical energy system, a smart grid. Previous and ongoing research has tended to focus mainly on how specific aspects of smart grids can be validated, but until now there exists no integrated approach for the analysis and evaluation of complex cyber-physical system configurations. This paper introduces an integrated research infrastructure that provides methods and tools for validating smart grid systems in a holistic, cyber…

  15. Development and Validation of HPLC-PDA Assay method of Frangula emodin

    Directory of Open Access Journals (Sweden)

    Deborah Duca

    2016-03-01

    Frangula emodin (1,3,8-trihydroxy-6-methyl-anthraquinone) is one of the anthraquinone derivatives found abundantly in the roots and bark of a number of plant families traditionally used to treat constipation and haemorrhoids. The present study describes the development and subsequent validation of a specific HPLC assay method for emodin. The separation was achieved on a Waters Symmetry C18, 4.6 × 250 mm, 5 μm particle size, column at a temperature of 35 °C, with UV detection at 287 and 436 nm. An isocratic elution mode was used, with an aqueous mobile phase consisting of 0.1% formic acid and 0.01% trifluoroacetic acid, and methanol. The method was successfully and statistically validated for linearity, range, precision, accuracy, specificity and solution stability.

  16. Discriminant Validity Assessment: Use of Fornell & Larcker criterion versus HTMT Criterion

    Science.gov (United States)

    Hamid, M. R. Ab; Sami, W.; Mohmad Sidek, M. H.

    2017-09-01

    Assessment of discriminant validity is a must in any research that involves latent variables, for the prevention of multicollinearity issues. The Fornell and Larcker criterion is the most widely used method for this purpose. However, a new method has emerged for establishing discriminant validity: the heterotrait-monotrait (HTMT) ratio of correlations. Therefore, this article presents the results of discriminant validity assessment using both methods. Data from a previous study, involving 429 respondents, were used for empirical validation of a value-based excellence model in higher education institutions (HEI) in Malaysia. From the analysis, convergent, divergent and discriminant validity were established and admissible using the Fornell and Larcker criterion. However, discriminant validity was an issue when employing the HTMT criterion. This shows that the latent variables under study face the issue of multicollinearity and should be examined in further detail. It also implies that the HTMT criterion is a stringent measure that can detect a possible lack of discriminant validity among the latent variables. In conclusion, the instrument, which consisted of six latent variables, was still lacking in terms of discriminant validity and should be explored further.
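
    For two constructs, the HTMT ratio compares the average correlation between their items (heterotrait-heteromethod) with the geometric mean of the average correlations among the items of each construct (monotrait-heteromethod). The sketch below illustrates that calculation on simulated item data; the data, item names and the 0.85-0.90 cutoff convention are illustrative, not taken from the study.

```python
import numpy as np
import pandas as pd

def htmt(data, items_a, items_b):
    """
    Heterotrait-monotrait ratio for two constructs measured by item sets
    items_a and items_b: mean between-construct item correlation divided by
    the geometric mean of the average within-construct item correlations.
    """
    corr = data.corr().abs()
    hetero = corr.loc[items_a, items_b].values.mean()

    def mono(items):
        block = corr.loc[items, items].values
        iu = np.triu_indices_from(block, k=1)
        return block[iu].mean()

    return hetero / np.sqrt(mono(items_a) * mono(items_b))

# Hypothetical item-level responses for two latent variables (3 indicators each)
rng = np.random.default_rng(1)
f1, f2 = rng.standard_normal(200), rng.standard_normal(200)
df = pd.DataFrame({
    "a1": f1 + 0.4*rng.standard_normal(200), "a2": f1 + 0.4*rng.standard_normal(200),
    "a3": f1 + 0.4*rng.standard_normal(200),
    "b1": f2 + 0.4*rng.standard_normal(200), "b2": f2 + 0.4*rng.standard_normal(200),
    "b3": f2 + 0.4*rng.standard_normal(200),
})
value = htmt(df, ["a1", "a2", "a3"], ["b1", "b2", "b3"])
print(f"HTMT = {value:.2f}  (values above ~0.85-0.90 flag discriminant-validity problems)")
```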

  17. Developmental and internal validation of a novel 13 loci STR multiplex method for Cannabis sativa DNA profiling.

    Science.gov (United States)

    Houston, Rachel; Birck, Matthew; Hughes-Stamm, Sheree; Gangitano, David

    2017-05-01

    Marijuana (Cannabis sativa L.) is a plant cultivated and trafficked worldwide as a source of fiber (hemp), medicine, and intoxicant. The development of a validated method using molecular techniques such as short tandem repeats (STRs) could serve as an intelligence tool to link multiple cases by means of genetic individualization or association of cannabis samples. For this purpose, a 13 loci STR multiplex method was developed, optimized, and validated according to relevant ISFG and SWGDAM guidelines. The STR multiplex consists of 13 previously described C. sativa STR loci: ANUCS501, 9269, 4910, 5159, ANUCS305, 9043, B05, 1528, 3735, CS1, D02, C11, and H06. A sequenced allelic ladder consisting of 56 alleles was designed to accurately genotype 101 C. sativa samples from three seizures provided by a U.S. Customs and Border Protection crime lab. Using an optimal range of DNA (0.5-1.0ng), validation studies revealed well-balanced electropherograms (inter-locus balance range: 0.500-1.296), relatively balanced heterozygous peaks (mean peak height ratio of 0.83 across all loci) with minimal artifacts and stutter ratio (mean stutter of 0.021 across all loci). This multi-locus system is relatively sensitive (0.13ng of template DNA) with a combined power of discrimination of 1 in 55 million. The 13 STR panel was found to be species specific for C. sativa; however, non-specific peaks were produced with Humulus lupulus. The results of this research demonstrate the robustness and applicability of this 13 loci STR system for forensic DNA profiling of marijuana samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Concurrent fNIRS-fMRI measurement to validate a method for separating deep and shallow fNIRS signals by using multidistance optodes

    Science.gov (United States)

    Funane, Tsukasa; Sato, Hiroki; Yahata, Noriaki; Takizawa, Ryu; Nishimura, Yukika; Kinoshita, Akihide; Katura, Takusige; Atsumori, Hirokazu; Fukuda, Masato; Kasai, Kiyoto; Koizumi, Hideaki; Kiguchi, Masashi

    2015-01-01

    It has been reported that a functional near-infrared spectroscopy (fNIRS) signal can be contaminated by extracerebral contributions. Many algorithms using multidistance separations have been proposed to address this issue, but their spatial separation performance has rarely been validated with simultaneous measurements of fNIRS and functional magnetic resonance imaging (fMRI). We previously proposed a method for discriminating between deep and shallow contributions in fNIRS signals, referred to as the multidistance independent component analysis (MD-ICA) method. In this study, to validate the MD-ICA method from the spatial aspect, multidistance fNIRS, fMRI, and laser-Doppler-flowmetry signals were simultaneously obtained for 12 healthy adult males during three tasks. The fNIRS signal was separated into deep and shallow signals using the MD-ICA method, and the correlation between the waveforms of the separated fNIRS signals and the gray matter blood oxygenation level-dependent signals was analyzed. A three-way analysis of variance (signal depth × Hb kind × task) indicated that the main effect of fNIRS signal depth on the correlation is significant [F(1,1286)=5.34], indicating that the MD-ICA method separates deep and shallow signals, and that the accuracy and reliability of the fNIRS signal will be improved with the method. PMID:26157983

  19. Validation of Experimental whole-body SAR Assessment Method in a Complex Indoor Environment

    DEFF Research Database (Denmark)

    Bamba, Aliou; Joseph, Wout; Vermeeren, Gunter

    2012-01-01

    Assessing the whole-body specific absorption rate (SARwb) experimentally in a complex indoor environment is very challenging. An experimental method based on room electromagnetics theory (accounting for only the Line-Of-Sight component as a specular path) to assess the whole-body SAR is validated by numerical … An advantage of the proposed method is that it avoids the computational burden because it does not use any discretizations. Results show good agreement between measurement and computation at 2.8 GHz, as long as the plane wave assumption is valid, i.e., for large distances from the transmitter. Relative deviations 0…

  20. A simple method for validation and verification of pipettes mounted on automated liquid handlers

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Frøslev, Tobias Guldberg

    We have implemented a simple method for validation and verification of the performance of pipettes mounted on automated liquid handlers, as necessary for laboratories accredited under ISO 17025. An 8-step serial dilution of Orange G was prepared in quadruplicate in a flat-bottom 96-well microtiter … available. In conclusion, we have set up a simple solution for the continuous validation of automated liquid handlers used for accredited work. The method is cheap, simple and easy to use for aqueous solutions, but requires a spectrophotometer that can read microtiter plates.
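
    In this kind of check, each two-fold dilution step should roughly halve the absorbance, so step-to-step ratios and replicate CVs reveal inaccurate or imprecise pipetting channels. The sketch below runs those calculations on hypothetical plate-reader values; the numbers are invented, and acceptance limits would come from the laboratory's own ISO 17025 procedure.

```python
import numpy as np

# Hypothetical absorbance readings for an 8-step two-fold serial dilution of Orange G,
# four replicate wells per step (rows = dilution steps, columns = replicates).
absorbance = np.array([
    [1.602, 1.598, 1.610, 1.605],
    [0.801, 0.795, 0.810, 0.803],
    [0.402, 0.398, 0.405, 0.400],
    [0.201, 0.199, 0.203, 0.200],
    [0.100, 0.101, 0.099, 0.102],
    [0.051, 0.050, 0.049, 0.052],
    [0.025, 0.026, 0.024, 0.025],
    [0.013, 0.012, 0.013, 0.012],
])

means = absorbance.mean(axis=1)
cv = 100 * absorbance.std(axis=1, ddof=1) / means          # imprecision per step
ratios = means[:-1] / means[1:]                            # should be close to 2.0

for i, (c, r) in enumerate(zip(cv, np.append(ratios, np.nan))):
    print(f"step {i+1}: CV = {c:4.1f}%  step-to-step ratio = {r:.2f}")
```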

  1. Validation of ascorbic acid tablets of national production by high-performance liquid chromatography method

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Izquierdo Castro, Idalberto

    2009-01-01

    An analytical method based on high-performance liquid chromatography was validated to determine the ascorbic acid content in vitamin C tablets. It was designed as an alternative method for quality control and for following the chemical stability of the active principle, since the official techniques for quality control of ascorbic acid in tablets are not selective in the presence of degradation products. The method was modified from that reported in USP 28, 2005 for the analysis of the injectable product. We used an RP-18 column of 250 x 4.6 mm, 5 μm, with UV detection at 245 nm. Validation was necessary for both objectives, considering the parameters required for category I and II methods. The method was sufficiently linear, accurate, and precise in the range of 100-300 μg/mL. It was also selective with respect to the remaining matrix components and the possible degradation products obtained under stress conditions. Detection and quantification limits were estimated. Once validated, the method was applied to the quantification of ascorbic acid in two batches of expired tablets, and a marked influence of the container on degradation of the active principle after 12 months at room temperature was detected. (Author)

  2. USFDA-GUIDELINE BASED VALIDATION OF TESTING METHOD FOR RIFAMPICIN IN INDONESIAN SERUM SPECIMEN

    Directory of Open Access Journals (Sweden)

    Tri Joko Raharjo

    2010-06-01

    Under a new regulation from the Indonesian FDA (Badan POM-RI), all new off-patent drugs must show bioequivalence with the originator drug prior to registration. Bioequivalence testing (BE testing) has to be performed on people representative of the population to which the drug is to be administered. BE testing needs a valid bioanalytical method for the specific drug and population group. This research reports the specific validation of a bioanalytical method for rifampicin in Indonesian serum specimens for use in BE testing. The extraction was performed using acetonitrile, while the chromatographic separation was accomplished on an RP-18 column (250 × 4.6 mm i.d., 5 µm) with a mobile phase composed of KH2PO4 10 mM-acetonitrile (40:60, v/v), and UV detection was set at 333 nm. The method showed specificity compared to blank serum specimens, with a rifampicin retention time of 2.1 min. The lower limit of quantification (LLOQ) was 0.06 µg/mL, with a dynamic range up to 20 µg/mL (R>0.990). Precision of the method was very good, with coefficients of variation (CV) of 0.58, 7.40 and 5.56% at concentrations of 0.06, 5 and 15 µg/mL, respectively. Accuracies of the method were 3.22, 1.94 and 1.90% at concentrations of 0.06, 5 and 15 µg/mL, respectively. The average recoveries were 97.82, 95.50 and 97.31% for rifampicin concentrations of 1, 5 and 5 µg/mL, respectively. The method also showed reliable results in freeze-thaw, short-term, long-term and post-preparation stability tests. The validation results showed that the method is ready to be used for rifampicin BE testing with Indonesian subjects.   Keywords: Rifampicin, Validation, USFDA-Guideline

  3. Validate or falsify: Lessons learned from a microscopy method claimed to be useful for detecting Borrelia and Babesia organisms in human blood.

    Science.gov (United States)

    Aase, Audun; Hajdusek, Ondrej; Øines, Øivind; Quarsten, Hanne; Wilhelmsson, Peter; Herstad, Tove K; Kjelland, Vivian; Sima, Radek; Jalovecka, Marie; Lindgren, Per-Eric; Aaberge, Ingeborg S

    2016-01-01

    A modified microscopy protocol (the LM-method) was used to demonstrate what was interpreted as Borrelia spirochetes and later also Babesia sp., in peripheral blood from patients. The method gained much publicity, but was not validated prior to publication, which became the purpose of this study using appropriate scientific methodology, including a control group. Blood from 21 patients previously interpreted as positive for Borrelia and/or Babesia infection by the LM-method and 41 healthy controls without known history of tick bite were collected, blinded and analysed for these pathogens by microscopy in two laboratories by the LM-method and conventional method, respectively, by PCR methods in five laboratories and by serology in one laboratory. Microscopy by the LM-method identified structures claimed to be Borrelia- and/or Babesia in 66% of the blood samples of the patient group and in 85% in the healthy control group. Microscopy by the conventional method for Babesia only did not identify Babesia in any samples. PCR analysis detected Borrelia DNA in one sample of the patient group and in eight samples of the control group; whereas Babesia DNA was not detected in any of the blood samples using molecular methods. The structures interpreted as Borrelia and Babesia by the LM-method could not be verified by PCR. The method was, thus, falsified. This study underlines the importance of doing proper test validation before new or modified assays are introduced.

  4. Time since discharge of 9mm cartridges by headspace analysis, part 1: Comprehensive optimisation and validation of a headspace sorptive extraction (HSSE) method.

    Science.gov (United States)

    Gallidabino, M; Romolo, F S; Weyermann, C

    2017-03-01

    Estimating the time since discharge of spent cartridges can be a valuable tool in the forensic investigation of firearm-related crimes. To reach this aim, it was previously proposed that the decrease of volatile organic compounds released during discharge is monitored over time using non-destructive headspace extraction techniques. While promising results were obtained for large-calibre cartridges (e.g., shotgun shells), handgun calibres yielded unsatisfying results. In addition to the natural complexity of the specimen itself, these can also be attributed to some selective choices in the methods development. Thus, the present series of paper aimed to more systematically evaluate the potential of headspace analysis to estimate the time since discharge of cartridges through the use of more comprehensive analytical and interpretative techniques. Specifically, in this first part, a method based on headspace sorptive extraction (HSSE) was comprehensively optimised and validated, as the latter recently proved to be a more efficient alternative than previous approaches. For this purpose, 29 volatile organic compounds were preliminary selected on the basis of previous works. A multivariate statistical approach based on design of experiments (DOE) was used to optimise variables potentially involved in interaction effects. Introduction of deuterated analogues in sampling vials was also investigated as strategy to account for analytical variations. Analysis was carried out by selected ion mode, gas chromatography coupled to mass spectrometry (GC-MS). Results showed good chromatographic resolution as well as detection limits and peak area repeatability. Application to 9mm spent cartridges confirmed that the use of co-extracted internal standards allowed for improved reproducibility of the measured signals. The validated method will be applied in the second part of this work to estimate the time since discharge of 9mm spent cartridges using multivariate models. Copyright

  5. A New Statistical Method to Determine the Degree of Validity of Health Economic Model Outcomes against Empirical Data.

    Science.gov (United States)

    Corro Ramos, Isaac; van Voorn, George A K; Vemer, Pepijn; Feenstra, Talitha L; Al, Maiwenn J

    2017-09-01

    The validation of health economic (HE) model outcomes against empirical data is of key importance. Although statistical testing seems applicable, guidelines for the validation of HE models lack guidance on statistical validation, and actual validation efforts often present subjective judgment of graphs and point estimates. To discuss the applicability of existing validation techniques and to present a new method for quantifying the degrees of validity statistically, which is useful for decision makers. A new Bayesian method is proposed to determine how well HE model outcomes compare with empirical data. Validity is based on a pre-established accuracy interval in which the model outcomes should fall. The method uses the outcomes of a probabilistic sensitivity analysis and results in a posterior distribution around the probability that HE model outcomes can be regarded as valid. We use a published diabetes model (Modelling Integrated Care for Diabetes based on Observational data) to validate the outcome "number of patients who are on dialysis or with end-stage renal disease." Results indicate that a high probability of a valid outcome is associated with relatively wide accuracy intervals. In particular, 25% deviation from the observed outcome implied approximately 60% expected validity. Current practice in HE model validation can be improved by using an alternative method based on assessing whether the model outcomes fit to empirical data at a predefined level of accuracy. This method has the advantage of assessing both model bias and parameter uncertainty and resulting in a quantitative measure of the degree of validity that penalizes models predicting the mean of an outcome correctly but with overly wide credible intervals. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
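
    One way to read the proposed approach: run the probabilistic sensitivity analysis, count how many model outcomes fall inside the pre-established accuracy interval around the empirical value, and summarise the probability of a valid outcome with a posterior distribution. The sketch below is a simplified, hypothetical version of that idea (simulated PSA outcomes, a uniform Beta prior), not the authors' exact implementation.

```python
import numpy as np
from scipy.stats import beta

# Hypothetical probabilistic-sensitivity-analysis (PSA) outcomes from an HE model
rng = np.random.default_rng(42)
psa_outcomes = rng.normal(loc=104.0, scale=12.0, size=1000)   # e.g., patients on dialysis

observed = 100.0                 # empirical value from the validation data set
accuracy = 0.25                  # pre-established accuracy interval: +/- 25%
low, high = observed * (1 - accuracy), observed * (1 + accuracy)

k = np.sum((psa_outcomes >= low) & (psa_outcomes <= high))    # "valid" PSA runs
n = psa_outcomes.size

# Posterior for the probability of a valid outcome (uniform Beta(1,1) prior)
posterior = beta(k + 1, n - k + 1)
print(f"{k}/{n} PSA outcomes within the accuracy interval [{low:.1f}, {high:.1f}]")
print(f"posterior mean validity = {posterior.mean():.2f}, "
      f"95% CrI = {np.round(posterior.interval(0.95), 2)}")
```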

  6. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A.

    2008-12-17

    This document proposes to provide a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  7. GC Method Validation for the Analysis of Menthol in Suppository Pharmaceutical Dosage Form

    Directory of Open Access Journals (Sweden)

    Murad N. Abualhasan

    2017-01-01

    Menthol is widely used as a fragrance and flavor in the food and cosmetic industries. It is also used in the medical and pharmaceutical fields for its various biological effects. Gas chromatography (GC) is considered to be a sensitive method for the analysis of menthol. The GC separation was developed using a capillary column (VF-624) and a flame ionization detector (FID). The method was validated as per ICH guidelines for various parameters such as precision, linearity, accuracy, solution stability, robustness, limit of detection, and limit of quantification. The tested validation parameters were found to be within acceptable limits. The method was successfully applied for the quantification of menthol in suppository formulations. Quality control departments and official pharmacopeias can use the developed method for the analysis of menthol in pharmaceutical dosage formulations and raw materials.

  8. New valid spectrofluorimetric method for determination of selected cephalosporins in different pharmaceutical formulations using safranin as fluorophore

    Science.gov (United States)

    Derayea, Sayed M.; Ahmed, Hytham M.; Abdelmageed, Osama H.; Haredy, Ahmed M.

    2016-01-01

    A new validated spectrofluorimetric method has been developed for the determination of some cephalosporins, namely cefepime, cefaclor, cefadroxil, cefpodoxime and cefixime. The method was based on the reaction of these drugs with safranin in slightly alkaline medium (pH 8.0) to form ion-association complexes. The fluorescent products were extracted into chloroform and their fluorescence intensities were measured at 544-565 nm after excitation at 518-524 nm. The reaction conditions influencing product formation and stability were investigated and optimized. The relative fluorescence intensity was proportional to the drug concentration in the linear ranges of 0.15-1.35, 0.35-1.25, 0.35-1.25, 0.20-1.44 and 0.20-1.25 μg/mL for cefepime, cefaclor, cefadroxil, cefpodoxime proxetil and cefixime, respectively. The detection limits were 40, 100, 100, 60 and 70 ng/mL, respectively. The performance of the developed method was evaluated in terms of Student's t-test and the variance-ratio F-test to establish the significance of the proposed method relative to the reference spectrophotometric method. Various pharmaceutical formulations were successfully analyzed using the proposed method, and the results were in good agreement with those of previously reported methods.
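
    The Student's t-test and variance-ratio F-test used for this kind of method comparison are straightforward to reproduce. The sketch below applies both to hypothetical assay results (% label claim) obtained by a proposed and a reference method; the numbers are invented for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical assay results (% label claim) for the same batch analysed by the
# proposed spectrofluorimetric method and a reference spectrophotometric method.
proposed = np.array([99.2, 100.1, 98.7, 99.8, 100.4, 99.5])
reference = np.array([99.6, 100.3, 99.1, 99.9, 100.0, 99.4])

# Student's t-test for equality of means (two-sided, equal variances assumed)
t_stat, t_p = stats.ttest_ind(proposed, reference)

# Variance-ratio F-test for equality of precisions
f_stat = proposed.var(ddof=1) / reference.var(ddof=1)
dfn = dfd = len(proposed) - 1
f_p = 2 * min(stats.f.cdf(f_stat, dfn, dfd), stats.f.sf(f_stat, dfn, dfd))

print(f"t = {t_stat:.3f} (p = {t_p:.3f}),  F = {f_stat:.3f} (p = {f_p:.3f})")
```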

  9. Validation of Likelihood Ratio Methods Used for Forensic Evidence Evaluation: Application in Forensic Fingerprints

    NARCIS (Netherlands)

    Haraksim, Rudolf

    2014-01-01

    In this chapter the Likelihood Ratio (LR) inference model will be introduced, the theoretical aspects of probabilities will be discussed and the validation framework for LR methods used for forensic evidence evaluation will be presented. Prior to introducing the validation framework, following

  10. Validation of the Analytical Method for the Determination of Flavonoids in Broccoli

    Directory of Open Access Journals (Sweden)

    Tuszyńska Magdalena

    2014-09-01

    Full Text Available A simple, accurate and selective HPLC method was developed and validated for determination of quercetin and kaempferol, which are the main flavonols in broccoli. The separation was achieved on a reversed-phase C18 column using a mobile phase composed of methanol/water (60/40) and phosphoric acid 0.2% at a flow rate of 1.0 ml min⁻¹. The detection was carried out on a DAD detector at 370 nm. This method was validated according to the requirements for new methods, which include selectivity, linearity, precision, accuracy, limit of detection and limit of quantitation. The current method demonstrates good linearity, with R² > 0.99. The recovery is within 98.07-102.15% and 97.92-101.83% for quercetin and kaempferol, respectively. The method is selective, in that quercetin and kaempferol are well separated from other compounds of broccoli with good resolution. The low limit of detection and limit of quantitation of quercetin and kaempferol enable the detection and quantitation of these flavonoids in broccoli at low concentrations.
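
    As an illustration of how the reported accuracy (recovery) and precision figures are typically obtained, the sketch below computes spike recovery and relative standard deviation. The concentrations are hypothetical and are not taken from the study.

      # Sketch: spike recovery (accuracy) and relative standard deviation
      # (precision) for a flavonol assay. All numbers are hypothetical.
      import numpy as np

      spiked_found = np.array([4.92, 5.08, 4.99, 5.11, 4.95])  # µg/mL measured after spiking
      native       = 2.00                                      # µg/mL measured before spiking
      added        = 3.00                                      # µg/mL spiked in

      recovery = (spiked_found - native) / added * 100.0
      rsd = spiked_found.std(ddof=1) / spiked_found.mean() * 100.0

      print(f"mean recovery = {recovery.mean():.1f}% "
            f"(range {recovery.min():.1f}-{recovery.max():.1f}%), RSD = {rsd:.2f}%")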

  11. Development and validation of a dissolution method using HPLC for diclofenac potassium in oral suspension

    Directory of Open Access Journals (Sweden)

    Alexandre Machado Rubim

    2014-04-01

    Full Text Available The present study describes the development and validation of an in vitro dissolution method for evaluating the release of diclofenac potassium from oral suspensions. The dissolution test was developed and validated according to international guidelines. Parameters like linearity, specificity, precision and accuracy were evaluated, as well as the influence of rotation speed and surfactant concentration on the medium. After selecting the best conditions, the method was validated using apparatus 2 (paddle), a 50-rpm rotation speed, and 900 mL of water with 0.3% sodium lauryl sulfate (SLS) as dissolution medium at 37.0 ± 0.5°C. Samples were analyzed using the HPLC-UV (PDA) method. The results obtained were satisfactory for the parameters evaluated. The method developed may be useful in routine quality control for pharmaceutical industries that produce oral suspensions containing diclofenac potassium.

  12. NordVal: A Nordic system for validation of alternative microbiological methods

    DEFF Research Database (Denmark)

    Qvist, Sven

    2007-01-01

    NordVal was created in 1999 by the Nordic Committee of Senior Officials for Food Issues under the Nordic Council of Ministers. The Committee adopted the following objective for NordVal: NordVal evaluates the performance and field of application of alternative microbiological methods. This includes analyses of food, water, feed, animal faeces and food environmental samples in the Nordic countries. NordVal is managed by a steering group, which is appointed by the National Food Administrations in Denmark, Finland, Iceland, Norway and Sweden. The background for creation of NordVal was a Danish validation system (DanVal) established in 1995 to cope with a need to validate alternative methods to be used in the Danish Salmonella Action Program. The program attracted considerable attention in the other Nordic countries. NordVal has elaborated a number of documents, which describe the requirements...

  13. Validation needs of seismic probabilistic risk assessment (PRA) methods applied to nuclear power plants

    International Nuclear Information System (INIS)

    Kot, C.A.; Srinivasan, M.G.; Hsieh, B.J.

    1985-01-01

    An effort to validate seismic PRA methods is in progress. The work concentrates on the validation of plant response and fragility estimates through the use of test data and information from actual earthquake experience. Validation needs have been identified in the areas of soil-structure interaction, structural response and capacity, and equipment fragility. Of particular concern is the adequacy of linear methodology to predict nonlinear behavior. While many questions can be resolved through the judicious use of dynamic test data, other aspects can only be validated by means of input and response measurements during actual earthquakes. A number of past, ongoing, and planned testing programs which can provide useful validation data have been identified, and validation approaches for specific problems are being formulated

  14. Development and validation of an alternative titration method for the determination of sulfate ion in indinavir sulfate

    Directory of Open Access Journals (Sweden)

    Breno de Carvalho e Silva

    2005-02-01

    Full Text Available A simple and rapid precipitation titration method was developed and validated to determine sulfate ion content in indinavir sulfate raw material. A 0.1 mol L⁻¹ lead nitrate volumetric solution was used as titrant, employing potentiometric endpoint determination with a lead-specific electrode. The United States Pharmacopoeia Forum indicates a potentiometric method for sulfate ion quantitation using 0.1 mol L⁻¹ lead perchlorate as titrant. Both methods were validated concerning linearity, precision and accuracy, yielding good results. The sulfate ion content found by the two validated methods was compared by Student's t-test, indicating that there was no statistically significant difference between the methods.

  15. Comprehensive validation scheme for in situ fiber optics dissolution method for pharmaceutical drug product testing.

    Science.gov (United States)

    Mirza, Tahseen; Liu, Qian Julie; Vivilecchia, Richard; Joshi, Yatindra

    2009-03-01

    There has been a growing interest during the past decade in the use of fiber optics dissolution testing. Use of this novel technology is mainly confined to research and development laboratories. It has not yet emerged as a tool for end product release testing despite its ability to generate in situ results and efficiency improvement. One potential reason may be the lack of clear validation guidelines that can be applied for the assessment of suitability of fiber optics. This article describes a comprehensive validation scheme and development of a reliable, robust, reproducible and cost-effective dissolution test using fiber optics technology. The test was successfully applied for characterizing the dissolution behavior of a 40-mg immediate-release tablet dosage form that is under development at Novartis Pharmaceuticals, East Hanover, New Jersey. The method was validated for the following parameters: linearity, precision, accuracy, specificity, and robustness. In particular, robustness was evaluated in terms of probe sampling depth and probe orientation. The in situ fiber optic method was found to be comparable to the existing manual sampling dissolution method. Finally, the fiber optic dissolution test was successfully performed by different operators on different days, to further enhance the validity of the method. The results demonstrate that the fiber optics technology can be successfully validated for end product dissolution/release testing. (c) 2008 Wiley-Liss, Inc. and the American Pharmacists Association

  16. Validation of a same-day real-time PCR method for screening of meat and carcass swabs for Salmonella

    DEFF Research Database (Denmark)

    Löfström, Charlotta; Krause, Michael; Josefsen, Mathilde Hartmann

    2009-01-01

    of the published PCR methods for Salmonella have been validated in collaborative studies. This study describes a validation including comparative and collaborative trials, based on the recommendations from the Nordic organization for validation of alternative microbiological methods (NordVal) of a same-day, non.... Partly based on results obtained in this study, the method has obtained NordVal approval for analysis of Salmonella in meat and carcass swabs. The PCR method was transferred to a production laboratory and the performance was compared with the BAX Salmonella test on 39 pork samples artificially contaminated with Salmonella. There was no significant difference in the results obtained by the two methods. Conclusion: The real-time PCR method for detection of Salmonella in meat and carcass swabs was validated in comparative and collaborative trials according to NordVal recommendations. The PCR method...

  17. Oxcarbazepine: validation and application of an analytical method

    Directory of Open Access Journals (Sweden)

    Paula Cristina Rezende Enéas

    2010-06-01

    Full Text Available Oxcarbazepine (OXC) is an important anticonvulsant and mood stabilizing drug. A pharmacopoeial monograph for OXC is not yet available and therefore the development and validation of a new analytical method for quantification of this drug is essential. In the present study, a UV spectrophotometric method for the determination of OXC was developed. The various parameters, such as linearity, precision, accuracy and specificity, were studied according to International Conference on Harmonization guidelines. Batches of 150 mg OXC capsules were prepared and analyzed using the validated UV method. The formulations were also evaluated for parameters including drug-excipient compatibility, flowability, uniformity of weight, disintegration time, assay, uniformity of content and the amount of drug dissolved during the first hour.

  18. Dependability validation by means of fault injection: method, implementation, application

    International Nuclear Information System (INIS)

    Arlat, Jean

    1990-01-01

    This dissertation presents theoretical and practical results concerning the use of fault injection as a means for testing fault tolerance in the framework of the experimental dependability validation of computer systems. The dissertation first presents the state of the art of published work on fault injection, encompassing both hardware (fault simulation, physical fault injection) and software (mutation testing) issues. Next, the major attributes of fault injection (faults and their activation, experimental readouts and measures) are characterized taking into account: i) the abstraction levels used to represent the system during the various phases of its development (analytical, empirical and physical models), and ii) the validation objectives (verification and evaluation). An evaluation method is subsequently proposed that combines the analytical modeling approaches (Monte Carlo simulations, closed-form expressions, Markov chains) used for the representation of the fault occurrence process with the experimental fault injection approaches (fault simulation and physical injection) characterizing the error processing and fault treatment provided by the fault tolerance mechanisms. An experimental tool - MESSALINE - is then defined and presented. This tool enables physical faults to be injected in a hardware and software prototype of the system to be validated. Finally, the application of MESSALINE for testing two fault-tolerant systems possessing very dissimilar features and the utilization of the experimental results obtained - both as design feedback and for the evaluation of dependability measures - are used to illustrate the relevance of the method. (author) [fr

  19. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology.

    Directory of Open Access Journals (Sweden)

    Shankarjee Krishnamoorthi

    Full Text Available We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations.

  20. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology.

    Science.gov (United States)

    Krishnamoorthi, Shankarjee; Perotti, Luigi E; Borgstrom, Nils P; Ajijola, Olujimi A; Frid, Anna; Ponnaluri, Aditya V; Weiss, James N; Qu, Zhilin; Klug, William S; Ennis, Daniel B; Garfinkel, Alan

    2014-01-01

    We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations.
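
    As a heavily simplified illustration of the kind of governing equations being solved in the two records above, the sketch below integrates a one-dimensional monodomain "cable" with FitzHugh-Nagumo kinetics using explicit finite differences. The study itself uses a three-dimensional finite-element rabbit model with physiological ionic models, so every parameter and the geometry here are illustrative only.

      # Sketch: 1D monodomain cable with FitzHugh-Nagumo kinetics,
      # explicit finite differences. Purely illustrative parameters.
      import numpy as np

      nx, dx, dt, steps = 100, 0.1, 0.01, 12000
      D, a, eps, beta, gamma = 0.1, 0.13, 0.01, 0.5, 1.0

      v = np.zeros(nx)               # transmembrane potential (dimensionless)
      w = np.zeros(nx)               # recovery variable
      v[:5] = 1.0                    # stimulus at the left end
      activated = np.zeros(nx, bool)

      for _ in range(steps):
          lap = np.zeros(nx)
          # second difference on interior nodes; end nodes get no diffusion term
          lap[1:-1] = (v[2:] - 2 * v[1:-1] + v[:-2]) / dx**2
          v = v + dt * (D * lap + v * (v - a) * (1 - v) - w)
          w = w + dt * (eps * (beta * v - gamma * w))
          activated |= v > 0.5       # record which nodes have been excited

      print("fraction of the cable activated:", activated.mean())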

  1. Assessing Discriminative Performance at External Validation of Clinical Prediction Models.

    Directory of Open Access Journals (Sweden)

    Daan Nieboer

    Full Text Available External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures to judge any changes in c-statistic from development to external validation setting. We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in the validation set. We concentrated on two scenarios: 1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and 2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. The permutation test indicated that the validation and development set were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.
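
    For readers unfamiliar with the c-statistic discussed above, the sketch below fits a model on a simulated development set and reports its c-statistic (area under the ROC curve) on an external validation set with a different case-mix. It illustrates only the quantity being compared, not the permutation test or benchmark framework evaluated in the study; the data are simulated.

      # Sketch: c-statistic (AUC) at model development and at external validation.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)

      def simulate(n, effect=1.0, spread=1.0):
          # simulated predictors and binary outcome; 'spread' controls case-mix heterogeneity
          X = rng.normal(scale=spread, size=(n, 3))
          logit = effect * X @ np.array([0.8, -0.5, 0.3])
          y = rng.random(n) < 1 / (1 + np.exp(-logit))
          return X, y.astype(int)

      X_dev, y_dev = simulate(2000)                          # development set
      X_val, y_val = simulate(1000, effect=0.7, spread=1.2)  # different case-mix

      model = LogisticRegression().fit(X_dev, y_dev)
      c_dev = roc_auc_score(y_dev, model.predict_proba(X_dev)[:, 1])
      c_val = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
      print(f"c-statistic: development = {c_dev:.3f}, external validation = {c_val:.3f}")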

  2. Development and Validation of High Performance Liquid Chromatography Method for Determination Atorvastatin in Tablet

    Science.gov (United States)

    Yugatama, A.; Rohmani, S.; Dewangga, A.

    2018-03-01

    Atorvastatin is the primary choice for dyslipidemia treatment. Due to the patent expiration of atorvastatin, the pharmaceutical industry makes copies of the drug. Therefore, methods for tablet quality testing that determine the atorvastatin content of tablets need to be developed. The purpose of this research was to develop and validate a simple analytical method for atorvastatin tablets by HPLC. The HPLC system used in this experiment consisted of a Cosmosil C18 column (150 x 4.6 mm, 5 µm) as the stationary phase for reversed-phase chromatography, a mixture of methanol-water at pH 3 (80:20 v/v) as the mobile phase, a flow rate of 1 mL/min, and UV detection at a wavelength of 245 nm. The validation parameters included selectivity, linearity, accuracy, precision, limit of detection (LOD), and limit of quantitation (LOQ). The results of this study indicate that the developed method showed good selectivity, linearity, accuracy, precision, LOD, and LOQ for analysis of atorvastatin tablet content. The LOD and LOQ were 0.2 and 0.7 ng/mL, and the linearity range was 20 - 120 ng/mL.

  3. Validation study of core analysis methods for full MOX BWR

    International Nuclear Information System (INIS)

    2013-01-01

    JNES has been developing a technical database used in reviewing validation of core analysis methods of LWRs on the following occasions: (1) confirming the core safety parameters of the initial core (one-third MOX core) through a full MOX core in Oma Nuclear Power Plant, which is under construction, (2) licensing high-burnup MOX cores in the future and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on the technical database, JNES will issue a guide for reviewing the core analysis methods used for safety design and evaluation of LWRs. The database will also be used for validation and improvement of core analysis codes developed by JNES. JNES has progressed with the projects: (1) improving a Doppler reactivity analysis model in the Monte Carlo calculation code MVP, (2) a sensitivity study of nuclear cross section data on reactivity calculations for experimental cores composed of UO2 and MOX fuel rods, (3) analysis of isotopic composition data for UO2 and MOX fuels and (4) the guide for reviewing the core analysis codes and others. (author)

  4. Validation study of core analysis methods for full MOX BWR

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    JNES has been developing a technical database used in reviewing validation of core analysis methods of LWRs on the following occasions: (1) confirming the core safety parameters of the initial core (one-third MOX core) through a full MOX core in Oma Nuclear Power Plant, which is under construction, (2) licensing high-burnup MOX cores in the future and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on the technical database, JNES will issue a guide for reviewing the core analysis methods used for safety design and evaluation of LWRs. The database will also be used for validation and improvement of core analysis codes developed by JNES. JNES has progressed with the projects: (1) improving a Doppler reactivity analysis model in the Monte Carlo calculation code MVP, (2) a sensitivity study of nuclear cross section data on reactivity calculations for experimental cores composed of UO{sub 2} and MOX fuel rods, (3) analysis of isotopic composition data for UO{sub 2} and MOX fuels and (4) the guide for reviewing the core analysis codes and others. (author)

  5. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    Science.gov (United States)

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…

  6. Statistical methods for mechanistic model validation: Salt Repository Project

    International Nuclear Information System (INIS)

    Eggett, D.L.

    1988-07-01

    As part of the Department of Energy's Salt Repository Program, Pacific Northwest Laboratory (PNL) is studying the emplacement of nuclear waste containers in a salt repository. One objective of the SRP program is to develop an overall waste package component model which adequately describes such phenomena as container corrosion, waste form leaching, spent fuel degradation, etc., which are possible in the salt repository environment. The form of this model will be proposed, based on scientific principles and relevant salt repository conditions with supporting data. The model will be used to predict the future characteristics of the near field environment. This involves several different submodels such as the amount of time it takes a brine solution to contact a canister in the repository, how long it takes a canister to corrode and expose its contents to the brine, the leach rate of the contents of the canister, etc. These submodels are often tested in a laboratory and should be statistically validated (in this context, validate means to demonstrate that the model adequately describes the data) before they can be incorporated into the waste package component model. This report describes statistical methods for validating these models. 13 refs., 1 fig., 3 tabs

  7. The systematic profiling of false identity documents: method validation and performance evaluation using seizures known to originate from common and different sources.

    Science.gov (United States)

    Baechler, Simon; Terrasse, Vincent; Pujol, Jean-Philippe; Fritz, Thibaud; Ribaux, Olivier; Margot, Pierre

    2013-10-10

    False identity documents constitute a potentially powerful source of forensic intelligence because they are essential elements of transnational crime and provide cover for organized crime. In previous work, a systematic profiling method using false documents' visual features has been built within a forensic intelligence model. In the current study, the comparison process and metrics lying at the heart of this profiling method are described and evaluated. This evaluation takes advantage of 347 false identity documents of four different types seized in two countries whose sources were known to be common or different (following police investigations and dismantling of counterfeit factories). Intra-source and inter-source variations were evaluated through the computation of more than 7500 similarity scores. The profiling method could thus be validated and its performance assessed using two complementary approaches to measuring type I and type II error rates: a binary classification and the computation of likelihood ratios. Very low error rates were measured across the four document types, demonstrating the validity and robustness of the method to link documents to a common source or to differentiate them. These results pave the way for an operational implementation of a systematic profiling process integrated in a developed forensic intelligence model. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
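
    The binary-classification view of performance described above can be sketched as follows: a similarity-score threshold is used to declare a common source, and the type I and type II error rates are the fractions of different-source and same-source pairs falling on the wrong side of that threshold. The score distributions and threshold below are hypothetical, not the study's profile comparison metrics.

      # Sketch: type I / type II error rates from a similarity-score threshold.
      # Scores are hypothetical stand-ins for the document profile comparisons.
      import numpy as np

      rng = np.random.default_rng(1)
      intra_source = rng.normal(0.80, 0.08, 400)   # scores for same-source pairs
      inter_source = rng.normal(0.45, 0.10, 7000)  # scores for different-source pairs

      threshold = 0.65  # pairs scoring above are declared "common source"

      type_ii = np.mean(intra_source <= threshold)  # missed links (false negatives)
      type_i  = np.mean(inter_source > threshold)   # false links (false positives)

      print(f"type I error = {type_i:.3%}, type II error = {type_ii:.3%}")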

  8. A method for the statistical interpretation of friction ridge skin impression evidence: Method development and validation.

    Science.gov (United States)

    Swofford, H J; Koertner, A J; Zemp, F; Ausdemore, M; Liu, A; Salyards, M J

    2018-04-03

    The forensic fingerprint community has faced increasing amounts of criticism by scientific and legal commentators, challenging the validity and reliability of fingerprint evidence due to the lack of an empirically demonstrable basis to evaluate and report the strength of the evidence in a given case. This paper presents a method, developed as a stand-alone software application, FRStat, which provides a statistical assessment of the strength of fingerprint evidence. The performance was evaluated using a variety of mated and non-mated datasets. The results show strong performance characteristics, often with values supporting specificity rates greater than 99%. This method provides fingerprint experts the capability to demonstrate the validity and reliability of fingerprint evidence in a given case and report the findings in a more transparent and standardized fashion with clearly defined criteria for conclusions and known error rate information thereby responding to concerns raised by the scientific and legal communities. Published by Elsevier B.V.

  9. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A

    2009-05-27

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  10. Development and validation of a GC method for the determination of D004 in emulsions

    International Nuclear Information System (INIS)

    Sierra Perez, Roxana; Rodriguez Leyes, Eduardo; Gonzalez Canavaciolo, Victor; Marrero Delange, David; Vicente Murillo, Roxanay; Velazquez Gomez Caridad

    2008-01-01

    D004 is a new active ingredient, composed of a mixture of fatty acids of between 8 and 18 carbon atoms, with proven efficacy in experimental models of benign prostatic hyperplasia. With the aim of carrying out the quality control of the emulsions used in pharmacological and toxicological studies (containing 20 - 300 mg/mL of active ingredient), a capillary gas chromatographic analytical method was developed and validated. The method was based on the extraction of the D004 active ingredient with n-hexane; prior to the chromatographic analysis, a methylation step with 10% acetyl chloride in methanol was carried out. The quantitative analysis, using tridecanoic acid as internal standard, was based on the determination of lauric acid. This is one of the major acids in D004 and it was not affected by interferences from the vehicle employed in emulsion preparation, as demonstrated in the specificity assay. Good linearity (r = 0.999) and accuracy were demonstrated over a range of 10 - 500 mg/mL, with mean recoveries between 98 and 103% that were not significantly different from 100% at p = 0.05. The coefficient of variation in the precision study (CV = 0.71%) was < 2%. According to these results, the method is suitable for the quality control of D004 emulsions.

  11. Development and Validation of a Stability-Indicating LC-UV Method ...

    African Journals Online (AJOL)

    Development and Validation of a Stability-Indicating LC-UV Method for Simultaneous Determination of Ketotifen and Cetirizine in Pharmaceutical Dosage Forms. ... 5 μm) using an isocratic mobile phase that consisted of acetonitrile and 10 mM disodium hydrogen phosphate buffer (pH 6.5) in a ratio of 45:55 % v/v at a flow ...

  12. Comparison of the performances and validation of three methods for ...

    African Journals Online (AJOL)

    SARAH

    2014-02-28


  13. Validation of a new background discrimination method for the TACTIC TeV γ-ray telescope with Markarian 421 data

    International Nuclear Information System (INIS)

    Sharma, Mradul; Nayak, J.; Koul, M.K.; Bose, S.; Mitra, Abhas; Dhar, V.K.; Tickoo, A.K.; Koul, R.

    2015-01-01

    This paper describes the validation of a new background discrimination method based on Random Forest technique by re-analysing the Markarian 421 (Mrk 421) observations performed by the TACTIC (TeV Atmospheric Cherenkov Telescope with Imaging Camera) γ-ray telescope. The Random Forest technique is a flexible multivariate method which combines Bagging and Random Split Selection to construct a large collection of decision trees and then combines them to construct a common classifier. Markarian 421 in a high state was observed by TACTIC during December 07, 2005–April 30, 2006 for 202 h. Previous analysis of this data led to a detection of flaring activity from the source at energies >1 TeV. Within this data set, a spell of 97 h revealed strong detection of a γ-ray signal with daily flux of >1 Crab unit on several days. Here we re-analyze this spell as well as the data from the entire observation period with the Random Forest method. Application of this method led to an improvement in the signal detection strength by ∼26% along with a ∼18% increase in detected γ rays compared to the conventional Dynamic Supercuts method. The resultant differential spectrum obtained is represented by a power law with an exponential cut off, Γ = −2.51±0.10 and E0 = 4.71±2.20 TeV. Such a spectrum is consistent with previously reported results and justifies the use of Random Forest method for analyzing data from atmospheric Cherenkov telescopes

  14. Validation of a new background discrimination method for the TACTIC TeV γ-ray telescope with Markarian 421 data

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Mradul, E-mail: mradul@barc.gov.in [Astrophysical Sciences Division, Bhabha Atomic Research Centre, Mumbai (India); Nayak, J. [The Bayesian and Interdisciplinary Research Unit, Indian Statistical Institute, Kolkata (India); Koul, M.K. [Astrophysical Sciences Division, Bhabha Atomic Research Centre, Mumbai (India); Bose, S. [The Bayesian and Interdisciplinary Research Unit, Indian Statistical Institute, Kolkata (India); Mitra, Abhas; Dhar, V.K.; Tickoo, A.K.; Koul, R. [Astrophysical Sciences Division, Bhabha Atomic Research Centre, Mumbai (India)

    2015-01-11

    This paper describes the validation of a new background discrimination method based on Random Forest technique by re-analysing the Markarian 421 (Mrk 421) observations performed by the TACTIC (TeV Atmospheric Cherenkov Telescope with Imaging Camera) γ-ray telescope. The Random Forest technique is a flexible multivariate method which combines Bagging and Random Split Selection to construct a large collection of decision trees and then combines them to construct a common classifier. Markarian 421 in a high state was observed by TACTIC during December 07, 2005–April 30, 2006 for 202 h. Previous analysis of this data led to a detection of flaring activity from the source at Energy >1TeV. Within this data set, a spell of 97 h revealed strong detection of a γ-ray signal with daily flux of >1 Crab unit on several days. Here we re-analyze this spell as well as the data from the entire observation period with the Random Forest method. Application of this method led to an improvement in the signal detection strength by ∼26% along with a ∼18% increase in detected γ rays compared to the conventional Dynamic Supercuts method. The resultant differential spectrum obtained is represented by a power law with an exponential cut off Γ=−2.51±0.10 and E{sub 0}=4.71±2.20TeV. Such a spectrum is consistent with previously reported results and justifies the use of Random Forest method for analyzing data from atmospheric Cherenkov telescopes.
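
    As an illustration of the classification technique named in the two records above, the sketch below trains a Random Forest on toy image-parameter data to separate simulated γ-ray events from a hadronic background. The features, distributions and training data are placeholders, not TACTIC data or the cuts used in the study.

      # Sketch: Random Forest gamma/hadron separation on toy image parameters.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(2)
      n = 5000
      # toy Hillas-style image parameters: length, width, size, alpha (all invented)
      gammas  = np.column_stack([rng.normal(0.25, 0.05, n), rng.normal(0.12, 0.03, n),
                                 rng.lognormal(6.0, 0.5, n), rng.uniform(0, 15, n)])
      hadrons = np.column_stack([rng.normal(0.40, 0.10, n), rng.normal(0.22, 0.06, n),
                                 rng.lognormal(5.5, 0.7, n), rng.uniform(0, 90, n)])
      X = np.vstack([gammas, hadrons])
      y = np.r_[np.ones(n), np.zeros(n)]          # 1 = gamma, 0 = hadron

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
      auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
      print("gamma/hadron separation AUC on the toy data:", round(auc, 3))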

  15. Validation of Cyanoacrylate Method for Collection of Stratum Corneum in Human Skin for Lipid Analysis

    DEFF Research Database (Denmark)

    Jungersted, JM; Hellgren, Lars; Drachmann, Tue

    2010-01-01

    Background and Objective: Lipids in the stratum corneum (SC) are of major importance for the skin barrier function. Many different methods have been used for the collection of SC for the analysis of SC lipids. The objective of the present study was to validate the cyanoacrylate method for the collection of SC for lipid analysis...

  16. Validation of a Russian Language Oswestry Disability Index Questionnaire.

    Science.gov (United States)

    Yu, Elizabeth M; Nosova, Emily V; Falkenstein, Yuri; Prasad, Priya; Leasure, Jeremi M; Kondrashov, Dimitriy G

    2016-11-01

    Study Design  Retrospective reliability and validity study. Objective  To validate a recently translated Russian language version of the Oswestry Disability Index (R-ODI) using standardized methods detailed from previous validations in other languages. Methods  We included all subjects who were seen in our spine surgery clinic, over the age of 18, and fluent in the Russian language. R-ODI was translated by six bilingual people and combined into a consensus version. R-ODI and visual analog scale (VAS) questionnaires for leg and back pain were distributed to subjects during both their initial and follow-up visits. Test validity, stability, and internal consistency were measured using standardized psychometric methods. Results Ninety-seven subjects participated in the study. No change in the meaning of the questions on R-ODI was noted with translation from English to Russian. There was a significant positive correlation between R-ODI and VAS scores for both the leg and back during both the initial and follow-up visits (p < 0.05). Conclusion The R-ODI is a valid instrument for use in the Russian-speaking population in the United States.

  17. Stepwise Procedure for Development and Validation of a Multipesticide Method

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    The stepwise procedure for the development and validation of so-called multi-pesticide methods is described. Principles, preliminary actions, and criteria for the selection of chromatographic separation, detection and performance verification of multi-pesticide methods are outlined. The long-term repeatability and reproducibility, as well as the necessity of documenting laboratory work, are also highlighted. Appendix I hereof describes in detail the calculation of calibration parameters, whereas Appendix II focuses on the calculation of the significance of differences between concentrations obtained on two different separation columns. (author)

  18. A validated solid-liquid extraction method for the HPLC determination of polyphenols in apple tissues Comparison with pressurised liquid extraction.

    Science.gov (United States)

    Alonso-Salces, Rosa M; Barranco, Alejandro; Corta, Edurne; Berrueta, Luis A; Gallo, Blanca; Vicente, Francisca

    2005-02-15

    A solid-liquid extraction procedure followed by reversed-phase high-performance liquid chromatography (RP-HPLC) coupled with a photodiode array detector (DAD) for the determination of polyphenols in freeze-dried apple peel and pulp is reported. The extraction step consists of sonicating 0.5 g of freeze-dried apple tissue with 30 mL of methanol-water-acetic acid (30:69:1, v/v/v) containing 2 g of ascorbic acid/L for 10 min in an ultrasonic bath. The whole method was validated, concluding that it is a robust method that presents high extraction efficiencies (peel: >91%, pulp: >95%) and appropriate precision (within day: R.S.D. (n = 5) <5%, and between days: R.S.D. (n = 5) <7%) at the different concentration levels of polyphenols that can be found in apple samples. The method was compared with one previously published, consisting of pressurized liquid extraction (PLE) followed by RP-HPLC-DAD determination. The advantages and disadvantages of both methods are discussed.

  19. Validation of verbal autopsy methods using hospital medical records: a case study in Vietnam.

    Science.gov (United States)

    Tran, Hong Thi; Nguyen, Hoa Phuong; Walker, Sue M; Hill, Peter S; Rao, Chalapati

    2018-05-18

    Information on causes of death (COD) is crucial for measuring the health outcomes of populations and progress towards the Sustainable Development Goals. In many countries such as Vietnam where the civil registration and vital statistics (CRVS) system is dysfunctional, information on vital events will continue to rely on verbal autopsy (VA) methods. This study assesses the validity of VA methods used in Vietnam, and provides recommendations on methods for implementing VA validation studies in Vietnam. This validation study was conducted on a sample of 670 deaths from a recent VA study in Quang Ninh province. The study covered 116 cases from this sample, which met three inclusion criteria: a) the death occurred within 30 days of discharge after last hospitalisation, and b) medical records (MRs) for the deceased were available from respective hospitals, and c) the medical record mentioned that the patient was terminally ill at discharge. For each death, the underlying cause of death (UCOD) identified from MRs was compared to the UCOD from VA. The validity of VA diagnoses for major causes of death was measured using sensitivity, specificity and positive predictive value (PPV). The sensitivity of VA was at least 75% in identifying some leading CODs such as stroke, road traffic accidents and several site-specific cancers. However, sensitivity was less than 50% for other important causes including ischemic heart disease, chronic obstructive pulmonary diseases, and diabetes. Overall, there was 57% agreement between UCOD from VA and MR, which increased to 76% when multiple causes from VA were compared to UCOD from MR. Our findings suggest that VA is a valid method to ascertain UCOD in contexts such as Vietnam. Furthermore, within cultural contexts in which patients prefer to die at home instead of a healthcare facility, using the available MRs as the gold standard may be meaningful to the extent that recall bias from the interval between last hospital discharge and death
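
    The validity measures quoted above follow directly from a two-by-two comparison of the verbal autopsy diagnosis against the medical-record cause of death for each cause category, as in the sketch below; the counts shown are hypothetical, not the study's data.

      # Sketch: sensitivity, specificity and positive predictive value of a
      # verbal autopsy (VA) diagnosis against the medical-record (MR) cause of
      # death for one cause category. The 2x2 counts below are hypothetical.
      tp = 30   # VA says stroke, MR says stroke
      fp = 10   # VA says stroke, MR says another cause
      fn = 8    # VA misses a stroke recorded in the MR
      tn = 68   # both agree the cause is not stroke

      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)
      ppv         = tp / (tp + fp)
      print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}, PPV = {ppv:.1%}")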

  20. MR-based water content estimation in cartilage: design and validation of a method

    DEFF Research Database (Denmark)

    Shiguetomi Medina, Juan Manuel; Kristiansen, Maja Sophie; Ringgaard, Steffen

    Purpose: Design and validation of an MR-based method that allows the calculation of the water content in cartilage tissue. Methods and Materials: Cartilage tissue T1 map based water content MR sequences were used on a temperature-stable system at 37 degrees Celsius. The T1 map intensity signal was analyzed on 6 cartilage samples from living animals (pig) and on 8 gelatin samples whose water content was already known. For the data analysis a T1 intensity signal map software analyzer was used. Finally, the method was validated after measuring and comparing 3 more cartilage samples in a living animal (pig). The obtained ... map based water content sequences can provide information that, after being analyzed using a T1-map analysis software, can be interpreted as the water contained inside a cartilage tissue. The amount of water estimated using this method was similar to the one obtained at the dry-freeze procedure...

  1. Development and validation of stability indicating UPLC assay method for ziprasidone active pharma ingredient

    Directory of Open Access Journals (Sweden)

    Sonam Mittal

    2012-01-01

    Full Text Available Background: Ziprasidone, a novel antipsychotic, exhibits a potent, highly selective antagonistic activity on D2 and 5HT2A receptors. A literature survey for ziprasidone revealed several analytical methods based on different techniques, but no UPLC method has been reported so far. Aim: The aim of this paper is to present a simple and rapid stability-indicating isocratic ultra performance liquid chromatographic (UPLC) method which was developed and validated for the determination of ziprasidone active pharmaceutical ingredient. Forced degradation studies of ziprasidone were carried out under acid, base, oxidative hydrolysis, thermal stress and photo stress conditions. Materials and Methods: The quantitative determination of ziprasidone was performed on a Supelco analytical column (100×2.1 mm i.d., 2.7 µm) with 10 mM ammonium acetate buffer (pH 6.7) and acetonitrile (ACN) as mobile phase in the ratio 55:45 (buffer:ACN) at a flow rate of 0.35 mL/min. For the UPLC method, UV detection was made at 318 nm and the run time was 3 min. The developed UPLC method was validated as per ICH guidelines. Results and Conclusion: Mild degradation of the drug substance was observed during oxidative hydrolysis and considerable degradation was observed during basic hydrolysis. During method validation, parameters such as precision, linearity, ruggedness, stability, robustness, and specificity were evaluated and remained within acceptable limits. The developed UPLC method was successfully applied for evaluating the assay of ziprasidone active pharmaceutical ingredient.

  2. Clashing Validities in the Comparative Method? Balancing In-Depth Understanding and Generalizability in Small-N Policy Studies

    NARCIS (Netherlands)

    van der Heijden, J.

    2013-01-01

    The comparative method receives considerable attention in political science. To some a main advantage of the method is that it allows for both in-depth insights (internal validity), and generalizability beyond the cases studied (external validity). However, others consider internal and external

  3. Developing and validating the Youth Conduct Problems Scale-Rwanda: a mixed methods approach.

    Directory of Open Access Journals (Sweden)

    Lauren C Ng

    Full Text Available This study developed and validated the Youth Conduct Problems Scale-Rwanda (YCPS-R. Qualitative free listing (n = 74 and key informant interviews (n = 47 identified local conduct problems, which were compared to existing standardized conduct problem scales and used to develop the YCPS-R. The YCPS-R was cognitive tested by 12 youth and caregiver participants, and assessed for test-retest and inter-rater reliability in a sample of 64 youth. Finally, a purposive sample of 389 youth and their caregivers were enrolled in a validity study. Validity was assessed by comparing YCPS-R scores to conduct disorder, which was diagnosed with the Mini International Neuropsychiatric Interview for Children, and functional impairment scores on the World Health Organization Disability Assessment Schedule Child Version. ROC analyses assessed the YCPS-R's ability to discriminate between youth with and without conduct disorder. Qualitative data identified a local presentation of youth conduct problems that did not match previously standardized measures. Therefore, the YCPS-R was developed solely from local conduct problems. Cognitive testing indicated that the YCPS-R was understandable and required little modification. The YCPS-R demonstrated good reliability, construct, criterion, and discriminant validity, and fair classification accuracy. The YCPS-R is a locally-derived measure of Rwandan youth conduct problems that demonstrated good psychometric properties and could be used for further research.

  4. Principles of Single-Laboratory Validation of Analytical Methods for Testing the Chemical Composition of Pesticides

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Underlying theoretical and practical approaches towards pesticide formulation analysis are discussed, i.e. general principles, performance characteristics, applicability of validation data, verification of method performance, and adaptation of validated methods by other laboratories. The principles of single laboratory validation of analytical methods for testing the chemical composition of pesticides are outlined. Also the theoretical background is described for performing pesticide formulation analysis as outlined in ISO, CIPAC/AOAC and IUPAC guidelines, including methodological characteristics such as specificity, selectivity, linearity, accuracy, trueness, precision and bias. Appendices I–III hereof give practical and elaborated examples on how to use the Horwitz approach and formulae for estimating the target standard deviation towards acceptable analytical repeatability. The estimation of trueness and the establishment of typical within-laboratory reproducibility are treated in greater detail by means of worked-out examples. (author)
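
    The Horwitz approach referred to in the appendices can be illustrated with the widely used Horwitz equation, which predicts the among-laboratories relative standard deviation from the analyte mass fraction; a within-laboratory target is often taken as roughly one-half to two-thirds of that value, although the exact convention and worked examples in the guide may differ.

      # Sketch: Horwitz equation, PRSD_R(%) = 2**(1 - 0.5*log10(C)),
      # with C the analyte mass fraction (g/g). The 2/3 factor for a
      # within-laboratory target RSD is a commonly used rule of thumb.
      import math

      def horwitz_prsd(mass_fraction):
          """Predicted reproducibility RSD (%) for a given analyte mass fraction."""
          return 2 ** (1 - 0.5 * math.log10(mass_fraction))

      for c in (0.10, 0.01, 1e-4, 1e-6):            # 10%, 1%, 100 mg/kg, 1 mg/kg
          print(f"C = {c:g} g/g -> PRSD_R = {horwitz_prsd(c):.1f}%, "
                f"target RSD_r ~ {2/3 * horwitz_prsd(c):.1f}%")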

  5. Right ventriculography as a valid method for the diagnosis of tricuspid insufficiency.

    Science.gov (United States)

    Ubago, J L; Figueroa, A; Colman, T; Ochoteco, A; Rodríguez, M; Durán, C M

    1981-01-01

    The value of right ventriculography in the diagnosis of tricuspid insufficiency (TI) is often questioned because of 1) the high incidence of premature ventricular contractions (PVCs) during injections and 2) interference of the catheter in the valve closure mechanism. In 168 patients a commercially available, non-preshaped, balloon-tipped catheter was used for right ventriculography. To avoid the induction of PVCs, the catheter tip was placed in the middle third of the diaphragmatic wall of the right ventricle, and the balloon was inflated, becoming trapped by the trabeculae. In this position the catheter's side holes should be located in the inflow chamber. To ensure this correct position, and therefore lack of ectopic beats during angiography, a saline test injection was performed previously in every case. With this technique the incidence of PVCs during ventriculography was only 7.7%. In all but one case, such beats were isolated. The 168 patients were divided into three groups according to their likelihood of experiencing tricuspid interference by the catheter: Group I included 41 patients with a normal heart or with coronary artery disease; no one from this group had TI. Group II comprised 28 patients with right ventricular pressure or volume overload or cardiomyopathy; only 2 had TI, both with a previous clinical diagnosis of regurgitation. Group III contained 99 patients with rheumatic heart disease. Thirty-five of them showed angiographic TI, and 24 of these had this diagnosis confirmed either clinically or at surgery. It is felt that this technique of right ventriculography, with its low incidence of PVCs and slight interference with tricuspid closure, is a valid method for the objective study of the tricuspid valve.

  6. High-performance liquid chromatography method validation for determination of tetracycline residues in poultry meat

    Directory of Open Access Journals (Sweden)

    Vikas Gupta

    2014-01-01

    Full Text Available Background: In this study, a method for the determination of tetracycline (TC) residues in poultry using the high-performance liquid chromatography technique was validated. Materials and Methods: The principal step involved ultrasonic-assisted extraction of TCs from poultry samples with 2 ml of 20% trichloroacetic acid and phosphate buffer (pH 4), which gave a clearer supernatant and high recovery, followed by centrifugation and purification using 0.22 μm filter paper. Results: The validity study of the method revealed that all obtained calibration curves showed good linearity (r² > 0.999) over the range of 40-4500 ng. Sensitivity was found to be 1.54 and 1.80 ng for oxytetracycline (OTC) and TC. Accuracy was in the range of 87.94-96.20% and 72.40-79.84% for meat. Precision was lower than 10% in all cases, indicating that the method can be used as a validated method. The limit of detection was found to be 4.8 and 5.10 ng for OTC and TC, respectively. The corresponding values of the limit of quantitation were 11 and 12 ng. Conclusion: The method reliably identifies and quantifies the selected TC and OTC in reconstituted poultry meat in the low and sub-nanogram range and can be applied in any laboratory.

  7. Reference Proteome Extracts for Mass Spec Instrument Performance Validation and Method Development

    Science.gov (United States)

    Rosenblatt, Mike; Urh, Marjeta; Saveliev, Sergei

    2014-01-01

    Biological samples of high complexity are required to test protein mass spec sample preparation procedures and validate mass spec instrument performance. Total cell protein extracts provide the needed sample complexity. However, to be compatible with mass spec applications, such extracts should meet a number of design requirements: compatibility with LC/MS (free of detergents, etc.); high protein integrity (minimal level of protein degradation and non-biological PTMs); compatibility with common sample preparation methods such as proteolysis, PTM enrichment and mass-tag labeling; and lot-to-lot reproducibility. Here we describe total protein extracts from yeast and human cells that meet the above criteria. Two extract formats have been developed: intact protein extracts, with primary use for sample preparation method development and optimization, and pre-digested extracts (peptides), with primary use for instrument validation and performance monitoring.

  8. Low-cost extrapolation method for maximal LTE radio base station exposure estimation: test and validation.

    Science.gov (United States)

    Verloock, Leen; Joseph, Wout; Gati, Azeddine; Varsier, Nadège; Flach, Björn; Wiart, Joe; Martens, Luc

    2013-06-01

    An experimental validation of a low-cost method for extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge on downlink band occupation or service characteristics is required for the low-cost method. The method is applicable in situ. It only requires a basic spectrum analyser with appropriate field probes without the need of expensive dedicated LTE decoders. The method is validated both in laboratory and in situ, for a single-input single-output antenna LTE system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders.

  9. Validation of three new methods for determination of metal emissions using a modified Environmental Protection Agency Method 301

    Energy Technology Data Exchange (ETDEWEB)

    Catherine A. Yanca; Douglas C. Barth; Krag A. Petterson; Michael P. Nakanishi; John A. Cooper; Bruce E. Johnsen; Richard H. Lambert; Daniel G. Bivins [Cooper Environmental Services, LLC, Portland, OR (United States)

    2006-12-15

    Three new methods applicable to the determination of hazardous metal concentrations in stationary source emissions were developed and evaluated for use in U.S. Environmental Protection Agency (EPA) compliance applications. Two of the three independent methods, a continuous emissions monitor-based method (Xact) and an X-ray-based filter method (XFM), are used to measure metal emissions. The third method involves a quantitative aerosol generator (QAG), which produces a reference aerosol used to evaluate the measurement methods. A modification of EPA Method 301 was used to validate the three methods for As, Cd, Cr, Pb, and Hg, representing three hazardous waste combustor Maximum Achievable Control Technology (MACT) metal categories (low volatile, semivolatile, and volatile). The measurement methods were evaluated at a hazardous waste combustor (HWC) by comparing measured with reference aerosol concentrations. The QAG, Xact, and XFM met the modified Method 301 validation criteria. All three of the methods demonstrated precisions and accuracies on the order of 5%. The measurement methods should be applicable to emissions from a wide range of sources, and the reference aerosol generator should be applicable to additional analytes. EPA recently approved an alternative monitoring petition for an HWC at Eli Lilly's Tippecanoe site in Lafayette, IN, in which the Xact is used for demonstrating compliance with the HWC MACT metal emissions (low volatile, semivolatile, and volatile). The QAG reference aerosol generator was approved as a method for providing a quantitative reference aerosol, which is required for certification and continuing quality assurance of the Xact. 30 refs., 5 figs., 11 tabs.

  10. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    Science.gov (United States)

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  11. AOAC Official MethodSM Matrix Extension Validation Study of Assurance GDSTM for the Detection of Salmonella in Selected Spices.

    Science.gov (United States)

    Feldsine, Philip; Kaur, Mandeep; Shah, Khyati; Immerman, Amy; Jucker, Markus; Lienau, Andrew

    2015-01-01

    Assurance GDSTM for Salmonella Tq has been validated according to the AOAC INTERNATIONAL Methods Committee Guidelines for Validation of Microbiological Methods for Food and Environmental Surfaces, for the detection of Salmonella in selected foods and on environmental surfaces (Official Method of AnalysisSM 2009.03, Performance Tested MethodSM No. 050602). The method has also completed AFNOR validation (following the ISO 16140 standard) against the reference method EN ISO 6579. For AFNOR, GDS was given a scope covering all human food, animal feed stuffs, and environmental surfaces (Certificate No. TRA02/12-01/09). Results showed that Assurance GDS for Salmonella (GDS) has high sensitivity and is equivalent to the reference culture methods for the detection of motile and non-motile Salmonella. As part of the aforementioned validations, inclusivity, exclusivity, stability, and ruggedness studies were also conducted. Assurance GDS showed 100% inclusivity and exclusivity among the 100 Salmonella serovars and 35 non-Salmonella organisms analyzed. To extend the scope of the Assurance GDS for Salmonella method, a matrix extension study was conducted, following the AOAC guidelines, to validate the application of the method to selected spices, specifically curry powder, cumin powder, and chili powder, for the detection of Salmonella.

  12. A Component-Based Modeling and Validation Method for PLC Systems

    Directory of Open Access Journals (Sweden)

    Rui Wang

    2014-05-01

    Programmable logic controllers (PLCs) are complex embedded systems that are widely used in industry. This paper presents a component-based modeling and validation method for PLC systems using the behavior-interaction-priority (BIP) framework. We designed a general system architecture and a component library for a type of device control system. The control software and the hardware of the environment were all modeled as BIP components. System requirements were formalized as monitors. Simulation was carried out to validate the system model. A realistic example from industry, a gates control system, was employed to illustrate our strategy. We found a couple of design errors during the simulation, which helped us to improve the dependability of the original systems. The results of the experiment demonstrated the effectiveness of our approach.

  13. Acceptability criteria for linear dependence in validating UV-spectrophotometric methods of quantitative determination in forensic and toxicological analysis

    Directory of Open Access Journals (Sweden)

    L. Yu. Klimenko

    2014-08-01

    Introduction. This article is the result of the authors' research in the field of development of approaches to the validation of quantitative determination methods for the purposes of forensic and toxicological analysis, and is devoted to the problem of forming acceptability criteria for the validation parameter «linearity/calibration model». The aim of research. The purpose of this paper is to analyse the present approaches to acceptability estimation of the calibration model chosen for method description according to the requirements of the international guidances, and to form our own approaches to acceptability estimation of the linear dependence when carrying out the validation of UV-spectrophotometric methods of quantitative determination for forensic and toxicological analysis. Materials and methods. UV-spectrophotometric method of doxylamine quantitative determination in blood. Results. The approaches to acceptability estimation of calibration models when carrying out the validation of bioanalytical methods stated in international papers, namely «Guidance for Industry: Bioanalytical Method Validation» (U.S. FDA, 2001), «Standard Practices for Method Validation in Forensic Toxicology» (SWGTOX, 2012), «Guidance for the Validation of Analytical Methodology and Calibration of Equipment used for Testing of Illicit Drugs in Seized Materials and Biological Specimens» (UNODC, 2009) and «Guideline on validation of bioanalytical methods» (ЕМА, 2011), have been analysed. It has been suggested to be guided by domestic developments in the field of validation of analysis methods for medicines and, particularly, by the approaches to validation of methods in the variant of the calibration curve method, for forming the acceptability criteria of the obtained linear dependences when carrying out the validation of UV-spectrophotometric methods of quantitative determination for forensic and toxicological analysis. The choice of the method of calibration curve is
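
    One concrete way of expressing an acceptability criterion for a linear calibration model, in the spirit of the bioanalytical guidances cited above, is to check the back-calculated concentration of every calibration standard against its nominal value (commonly ±15%, and ±20% at the lowest level); the sketch below assumes that criterion and uses hypothetical doxylamine-like data:

      import numpy as np

      def linearity_acceptable(conc, response, tol_pct=15.0, tol_low_pct=20.0):
          """Fit response = a*conc + b, back-calculate each standard and compare it
          with its nominal concentration against the assumed acceptance limits."""
          a, b = np.polyfit(conc, response, 1)
          back = (np.asarray(response, float) - b) / a
          dev_pct = (back - np.asarray(conc, float)) / np.asarray(conc, float) * 100.0
          limits = np.where(np.asarray(conc) == min(conc), tol_low_pct, tol_pct)
          return dev_pct.round(2), bool(np.all(np.abs(dev_pct) <= limits))

      # Hypothetical calibration levels (ug/mL) and absorbances
      levels = [1, 2, 5, 10, 20, 50]
      absorb = [0.052, 0.101, 0.248, 0.497, 1.005, 2.490]
      print(linearity_acceptable(levels, absorb))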

  14. Pressurised liquid extraction of flavonoids in onions. Method development and validation

    DEFF Research Database (Denmark)

    Søltoft, Malene; Christensen, J.H.; Nielsen, J.

    2009-01-01

    A rapid and reliable analytical method for quantification of flavonoids in onions was developed and validated. Five extraction methods were tested on freeze-dried onions, and subsequently high performance liquid chromatography (HPLC) with UV detection was used for quantification of seven flavonoids...... extraction methods. However, PLE was the preferred extraction method because it can be highly automated, uses only small amounts of solvents, provides the cleanest extracts, and allows the extraction of light- and oxygen-sensitive flavonoids to be carried out in an inert atmosphere protected from light......-step PLE method showed good selectivity, precision (RSDs = 3.1-11%) and recovery of the extractable flavonoids (98-99%). The method also appeared to be a multi-method, i.e. generally applicable to, e.g., phenolic acids in potatoes and carrots....

  15. Application of EU guidelines for the validation of screening methods for veterinary drugs

    NARCIS (Netherlands)

    Stolker, A.A.M.

    2012-01-01

    Commission Decision (CD) 2002/657/EC describes detailed rules for method validation within the framework of residue monitoring programmes. The approach described in this CD is based on criteria. For (qualitative) screening methods, the most important criterion is that the CCβ has to be below any

  16. Validity of the CT to attenuation coefficient map conversion methods

    International Nuclear Information System (INIS)

    Faghihi, R.; Ahangari Shahdehi, R.; Fazilat Moadeli, M.

    2004-01-01

    The most important commercialized methods of attenuation correction in SPECT are based on an attenuation coefficient map obtained from a transmission imaging method. The transmission imaging system can be a linear radionuclide source or an X-ray CT system. The image from the transmission imaging system is not useful unless the CT number is replaced with the attenuation coefficient at the SPECT energy. In this paper we attempt to evaluate the validity and estimate the error of the most widely used methods for this transformation. The final result shows that the methods which use a linear or multi-linear curve accept an error in their estimation. The value of mA is not important, but the patient thickness is very important and can introduce an error of more than 10 percent in the final result
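
    The transformation being evaluated here maps CT numbers to attenuation coefficients at the SPECT photon energy; below is a minimal sketch of a generic piecewise-linear ("bilinear") conversion, with anchor values that are only illustrative assumptions, not the curves assessed in this record:

      def hu_to_mu(hu, mu_water=0.154, mu_bone=0.30, hu_bone=1000.0):
          """CT number (HU) -> linear attenuation coefficient (cm^-1) at ~140 keV.
          Soft-tissue segment: air (-1000 HU, mu = 0) to water (0 HU, mu_water).
          Bone segment: water (0 HU) to a reference bone point (hu_bone, mu_bone)."""
          if hu <= 0.0:
              return max(0.0, mu_water * (hu + 1000.0) / 1000.0)
          return mu_water + (mu_bone - mu_water) * hu / hu_bone

      for hu in (-1000, -500, 0, 500, 1500):
          print(hu, round(hu_to_mu(hu), 3))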

  17. Validation of methods for the determination of radium in waters and soil

    International Nuclear Information System (INIS)

    Decaillon, J.-G.; Bickel, M.; Hill, C.; Altzitzoglou, T.

    2004-01-01

    This article describes the advantages and disadvantages of several analytical methods used to prepare the alpha-particle source. As a result of this study, a new method combining commercial extraction and ion chromatography prior to a final co-precipitation step is proposed. This method has been applied and validated on several matrices (soil, waters) in the framework of international intercomparisons. The integration of this method in a global procedure to analyze actinoids and radium from a single solution (or digested soil) is also described

  18. A systematic review and appraisal of methods of developing and validating lifestyle cardiovascular disease risk factors questionnaires.

    Science.gov (United States)

    Nse, Odunaiya; Quinette, Louw; Okechukwu, Ogah

    2015-09-01

    Well-developed and validated lifestyle cardiovascular disease (CVD) risk factors questionnaires are the key to obtaining accurate information to enable planning of CVD prevention programs, which is a necessity in developing countries. We conducted this review to assess the methods and processes used for the development and content validation of lifestyle CVD risk factors questionnaires and, possibly, to develop an evidence-based guideline for the development and content validation of such questionnaires. Relevant databases at the Stellenbosch University library were searched for studies conducted between 2008 and 2012, in the English language and among humans, using the following databases: PubMed, CINAHL, PsycINFO and ProQuest. Search terms used were CVD risk factors, questionnaires, smoking, alcohol, physical activity and diet. The methods identified for the development of lifestyle CVD risk factors questionnaires were: review of the literature, either systematic or traditional; involvement of experts and/or the target population using focus group discussions/interviews; clinical experience of the authors; and deductive reasoning of the authors. For validation, the methods used were: the involvement of an expert panel, the use of the target population, and factor analysis. A combination of methods produces questionnaires with good content validity and other psychometric properties, which we consider good.

  19. Validation of analytical methods for the stability studies of naproxen suppositories for infant and adult use

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar

    2011-01-01

    Analytical and validation studies were performed in this paper, with a view to using them in the stability studies of future formulations of naproxen suppositories for children and adults. The most influential factors in naproxen stability were determined; the major degradation occurred in acid medium, in oxidative medium and by the action of light. One high-performance liquid chromatography-based method was evaluated, which proved to be adequate to quantify naproxen in suppositories and was selective against degradation products. The quantification limit was 3,480 μg, so it was valid for these studies. Additionally, the parameters specificity for stability, detection limit and quantification limit were evaluated for the direct semi-aqueous acid-base method, which was formerly validated for quality control and showed satisfactory results. Nevertheless, the volumetric methods were not regarded as stability-indicating; therefore, this method will be used along with the chromatographic methods of choice, that is, thin-layer chromatography and high-performance liquid chromatography, to determine the degradation products

  20. Validation of a cartridge method for the quality control determination of 99Tcm-HMPAO

    International Nuclear Information System (INIS)

    Pandos, G.; Penglis, S.; Tsopelas, C.; Royal Adelaide Hospital, Adelaide, SA

    1999-01-01

    Full text: The manufacturer's method for assessing the radiochemical purity (RCP) of 99Tcm-HMPAO requires the use of three solvent types on two different stationary phases, and is time-consuming (∼15 min) in consideration of the short shelf-life (30 min). An impetus to develop a rapid quality control procedure for this product has led to the use of a single-strip Whatman 17 chromatography system using ethyl acetate as the developing solvent. This popular Whatman paper system was previously validated against the manufacturer's method. We have developed a new method to successfully determine the % RCP of 99Tcm-HMPAO, which employs a disposable, inexpensive and reusable Amprep C-18 cartridge with normal saline as a non-organic mobile phase. The Whatman paper system separates the primary lipophilic 99Tcm-HMPAO complex from 99TcmO2, 99TcmO4- and the secondary 99Tcm-HMPAO complex at the origin. By comparison, with the cartridge method the lipophilic portion was retained on the cartridge and the hydrophilic impurities were found in the saline eluent. Whatman 17 paper system results showed 95.1 ± 1.7% 99Tcm-HMPAO after 5 min and the cartridge method gave 95.5 ± 1.5% 99Tcm-HMPAO (n = 8) after 3 min. The % 99TcmO2 levels in 99Tcm-HMPAO were insignificant. When a failed kit was assessed for RCP at 2.5 h post-reconstitution, the Whatman paper system and the cartridge method correlated well, resulting in 63.1 ± 2.7% and 62.9 ± 2.1% 99Tcm-HMPAO (n = 3), respectively. Although the cartridge method may slightly overestimate the % RCP of 99Tcm-HMPAO, it was found to be simple, rapid and reliable for the quality control analysis of routine 99Tcm-HMPAO preparations
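
    The purity figure reported for both systems is simply the lipophilic fraction of the total recovered activity; a minimal sketch of that calculation (the counts are hypothetical, not data from the study):

      def percent_rcp(lipophilic_activity, hydrophilic_activity):
          """Radiochemical purity: lipophilic 99Tcm-HMPAO as a percentage of total recovered activity."""
          return 100.0 * lipophilic_activity / (lipophilic_activity + hydrophilic_activity)

      # Cartridge method: activity retained on the C-18 cartridge vs. activity in the saline eluent
      print(round(percent_rcp(955_000, 45_000), 1), "% RCP")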

  1. Determination of C-glucosidic ellagitannins in Lythri salicariae herba by ultra-high performance liquid chromatography coupled with charged aerosol detector: method development and validation.

    Science.gov (United States)

    Granica, Sebastian; Piwowarski, Jakub P; Kiss, Anna K

    2014-01-01

    Lythri salicariae herba is a pharmacopoeial plant material used by patients in the form of infusions in the treatment of acute diarrhoea. According to its pharmacopoeial monograph it is standardised for total tannin content, which should be not less than 5.0%, using pyrogallol as a standard. Previous studies have shown that aqueous extracts from Lythri herba contain mainly ellagitannins, among which vescalagin, castalagin and salicarinins A and B are the dominating constituents. To develop and validate an efficient UHPLC method coupled with a charged aerosol detector (CAD) for the quantification of four major ellagitannins in Lythri salicariae herba and in one commercial preparation. The extraction conditions for ellagitannins from the plant material were optimised. The relative response factors for vescalagin, castalagin and salicarinins A and B, using gallic acid as an external standard, were determined for the CAD detector. Then, a UHPLC method for quantification of ellagitannins was developed and validated. The four major ellagitannins were quantified in four samples of Lythri herba and in one commercial preparation. The sum of ellagitannins for each sample was determined, varying from 30.66 to 48.80 mg/g of raw material and 16.57 mg per capsule for the preparation investigated. The first validated UHPLC-CAD method for quantification of the four major ellagitannins was developed. The universality of the CAD response was evaluated and it is shown that, although all the compounds analysed have similar structures, their CAD responses differ significantly. Copyright © 2013 John Wiley & Sons, Ltd.

  2. Improvement of Simulation Method in Validation of Software of the Coordinate Measuring Systems

    Science.gov (United States)

    Nieciąg, Halina

    2015-10-01

    Software is used in order to accomplish various tasks at each stage of the functioning of modern measuring systems. Before metrological confirmation of measuring equipment, the system has to be validated. This paper discusses a method for conducting validation studies of a fragment of software used to calculate the values of measurands. Due to the number and nature of the variables affecting the coordinate measurement results and the complex character and multi-dimensionality of measurands, the study used the Monte Carlo method of numerical simulation. The article presents an attempt at a possible improvement of the results obtained by classic Monte Carlo tools. The LHS (Latin Hypercube Sampling) algorithm was implemented as an alternative to the simple sampling scheme of the classic algorithm.
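
    A minimal sketch of the difference between simple Monte Carlo sampling and one-dimensional Latin Hypercube Sampling, assuming numpy and a toy measurand model (the model and sample sizes are illustrative, not those of the study):

      import numpy as np

      rng = np.random.default_rng(1)

      def simple_sampling(n):
          """Plain Monte Carlo: n independent uniform draws on [0, 1)."""
          return rng.random(n)

      def latin_hypercube(n):
          """LHS in one dimension: exactly one draw inside each of n equal-probability
          strata, then shuffled so the samples arrive in random order."""
          strata_left_edges = np.arange(n) / n
          samples = strata_left_edges + rng.random(n) / n
          rng.shuffle(samples)
          return samples

      # Propagate both sample sets through a toy measurand model y = x**2
      for name, sampler in (("simple MC", simple_sampling), ("LHS", latin_hypercube)):
          x = sampler(200)
          print(name, round((x ** 2).mean(), 4))   # exact mean of x^2 on [0, 1) is 1/3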

  3. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    Science.gov (United States)

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  4. Comparison of validation methods for forming simulations

    Science.gov (United States)

    Schug, Alexander; Kapphan, Gabriel; Bardl, Georg; Hinterhölzl, Roland; Drechsler, Klaus

    2018-05-01

    The forming simulation of fibre reinforced thermoplastics could reduce the development time and improve the forming results. But to take advantage of the full potential of the simulations it has to be ensured that the predictions for material behaviour are correct. For that reason, a thorough validation of the material model has to be conducted after characterising the material. Relevant aspects for the validation of the simulation are for example the outer contour, the occurrence of defects and the fibre paths. To measure these features various methods are available. Most relevant and also most difficult to measure are the emerging fibre orientations. For that reason, the focus of this study was on measuring this feature. The aim was to give an overview of the properties of different measuring systems and select the most promising systems for a comparison survey. Selected were an optical, an eddy current and a computer-assisted tomography system with the focus on measuring the fibre orientations. Different formed 3D parts made of unidirectional glass fibre and carbon fibre reinforced thermoplastics were measured. Advantages and disadvantages of the tested systems were revealed. Optical measurement systems are easy to use, but are limited to the surface plies. With an eddy current system also lower plies can be measured, but it is only suitable for carbon fibres. Using a computer-assisted tomography system all plies can be measured, but the system is limited to small parts and challenging to evaluate.

  5. Unexpected but Most Welcome: Mixed Methods for the Validation and Revision of the Participatory Evaluation Measurement Instrument

    Science.gov (United States)

    Daigneault, Pierre-Marc; Jacob, Steve

    2014-01-01

    Although combining methods is nothing new, more contributions about why and how to mix methods for validation purposes are needed. This article presents a case of validating the inferences drawn from the Participatory Evaluation Measurement Instrument, an instrument that purports to measure stakeholder participation in evaluation. Although the…

  6. Reliability and validity in measurement of true humeral retroversion by a three-dimensional cylinder fitting method.

    Science.gov (United States)

    Saka, Masayuki; Yamauchi, Hiroki; Hoshi, Kenji; Yoshioka, Toru; Hamada, Hidetoshi; Gamada, Kazuyoshi

    2015-05-01

    Humeral retroversion is defined as the orientation of the humeral head relative to the distal humerus. Because none of the previous methods used to measure humeral retroversion strictly follow this definition, values obtained by these techniques vary and may be biased by morphologic variations of the humerus. The purpose of this study was 2-fold: to validate a method to define the axis of the distal humerus with a virtual cylinder and to establish the reliability of 3-dimensional (3D) measurement of humeral retroversion by this cylinder fitting method. Humeral retroversion in 14 baseball players (28 humeri) was measured by the 3D cylinder fitting method. The root mean square error was calculated to compare values obtained by a single tester and by 2 different testers using the embedded coordinate system. To establish the reliability, intraclass correlation coefficient (ICC) and precision (standard error of measurement [SEM]) were calculated. The root mean square errors for the humeral coordinate system were reliability and precision of the 3D measurement of retroversion yielded an intratester ICC of 0.99 (SEM, 1.0°) and intertester ICC of 0.96 (SEM, 2.8°). The error in measurements obtained by a distal humerus cylinder fitting method was small enough not to affect retroversion measurement. The 3D measurement of retroversion by this method provides excellent intratester and intertester reliability. Copyright © 2015 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  7. Validation of histamine determination Method in yoghurt using High Performance Liquid Chromatography

    Directory of Open Access Journals (Sweden)

    M Jahedinia

    2014-02-01

    Biogenic amines are organic, basic nitrogenous compounds of low molecular weight that are mainly generated by the enzymatic decarboxylation of amino acids by microorganisms. Dairy products are among the foods with the highest amine content. A wide variety of methods and procedures for the determination of histamine and biogenic amines have been established. Among them, HPLC is considered the reference method. The aim of this study was to validate a reversed-phase HPLC method for the determination of histamine in yoghurt. The mobile phase consisted of acetonitrile/water (18:88, v/v) and the flow rate was set at 0.5 ml/min using isocratic HPLC. Detection was carried out at 254 nm using a UV detector. The calibration curve constructed using the peak areas of the standards was linear, and the correlation coefficient (r2) was estimated at 0.998. Good recoveries were observed for histamine at all spiking levels, and the average recovery was 84%. The RSD% value from the repeatability test was found to be 4.4%. The limit of detection and limit of quantitation were 0.14 and 0.42 µg/ml, respectively. The results of the validation tests showed that the method is reliable and rapid for the quantification of histamine in yoghurt.
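
    The figures of merit quoted above (r2, recovery, repeatability RSD, LOD and LOQ) can all be computed from the calibration and spiking data; a minimal sketch with hypothetical numbers (the ICH-style 3.3·sigma/slope and 10·sigma/slope estimates are an assumption, not necessarily how the authors derived their limits):

      import numpy as np

      def validation_summary(conc, response, spiked_level, found):
          """Linearity (r2), LOD/LOQ from calibration residuals, mean recovery and RSD."""
          slope, intercept = np.polyfit(conc, response, 1)
          predicted = slope * np.asarray(conc, float) + intercept
          residuals = np.asarray(response, float) - predicted
          r2 = 1.0 - residuals.var() / np.var(response)
          sd_resid = np.sqrt(np.sum(residuals ** 2) / (len(conc) - 2))
          lod = 3.3 * sd_resid / slope
          loq = 10.0 * sd_resid / slope
          recovery_pct = 100.0 * np.mean(found) / spiked_level
          rsd_pct = 100.0 * np.std(found, ddof=1) / np.mean(found)
          return dict(r2=r2, LOD=lod, LOQ=loq, recovery_pct=recovery_pct, RSD_pct=rsd_pct)

      # Hypothetical histamine standards (ug/mL -> peak area) and five replicate 5 ug/mL spikes
      print(validation_summary([0.5, 1, 2, 5, 10], [11, 22, 45, 110, 221],
                               5.0, [4.3, 4.1, 4.2, 4.4, 4.0]))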

  8. Developmental validation of a Nextera XT mitogenome Illumina MiSeq sequencing method for high-quality samples.

    Science.gov (United States)

    Peck, Michelle A; Sturk-Andreaggi, Kimberly; Thomas, Jacqueline T; Oliver, Robert S; Barritt-Ross, Suzanne; Marshall, Charla

    2018-05-01

    Generating mitochondrial genome (mitogenome) data from reference samples in a rapid and efficient manner is critical to harnessing the greater power of discrimination of the entire mitochondrial DNA (mtDNA) marker. The method of long-range target enrichment, Nextera XT library preparation, and Illumina sequencing on the MiSeq is a well-established technique for generating mitogenome data from high-quality samples. To this end, a validation was conducted for this mitogenome method processing up to 24 samples simultaneously along with analysis in the CLC Genomics Workbench and utilizing the AQME (AFDIL-QIAGEN mtDNA Expert) tool to generate forensic profiles. This validation followed the Federal Bureau of Investigation's Quality Assurance Standards (QAS) for forensic DNA testing laboratories and the Scientific Working Group on DNA Analysis Methods (SWGDAM) validation guidelines. The evaluation of control DNA, non-probative samples, blank controls, mixtures, and nonhuman samples demonstrated the validity of this method. Specifically, the sensitivity was established at ≥25 pg of nuclear DNA input for accurate mitogenome profile generation. Unreproducible low-level variants were observed in samples with low amplicon yields. Further, variant quality was shown to be a useful metric for identifying sequencing error and crosstalk. Success of this method was demonstrated with a variety of reference sample substrates and extract types. These studies further demonstrate the advantages of using NGS techniques by highlighting the quantitative nature of heteroplasmy detection. The results presented herein from more than 175 samples processed in ten sequencing runs, show this mitogenome sequencing method and analysis strategy to be valid for the generation of reference data. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Performance Validity Testing in Neuropsychology: Methods for Measurement Development and Maximizing Diagnostic Accuracy.

    Science.gov (United States)

    Wodushek, Thomas R; Greher, Michael R

    2017-05-01

    In the first column in this 2-part series, Performance Validity Testing in Neuropsychology: Scientific Basis and Clinical Application-A Brief Review, the authors introduced performance validity tests (PVTs) and their function, provided a justification for why they are necessary, traced their ongoing endorsement by neuropsychological organizations, and described how they are used and interpreted by ever increasing numbers of clinical neuropsychologists. To enhance readers' understanding of these measures, this second column briefly describes common detection strategies used in PVTs as well as the typical methods used to validate new PVTs and determine cut scores for valid/invalid determinations. We provide a discussion of the latest research demonstrating how neuropsychologists can combine multiple PVTs in a single battery to improve sensitivity/specificity to invalid responding. Finally, we discuss future directions for the research and application of PVTs.
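
    As a rough illustration of why combining several PVTs can sharpen the valid/invalid decision, the sketch below computes battery-level sensitivity and specificity when a protocol flags a record only if at least k of n tests are failed, under an independence assumption that real PVT batteries only approximate (all per-test rates are hypothetical):

      from math import comb

      def prob_at_least_k(n, k, p):
          """P(at least k 'failures' out of n independent tests, each failing with probability p)."""
          return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

      n_tests, cutoff = 5, 2
      p_fail_if_valid = 0.10    # hypothetical per-test false-positive rate
      p_fail_if_invalid = 0.60  # hypothetical per-test detection rate

      print("battery specificity:", round(1 - prob_at_least_k(n_tests, cutoff, p_fail_if_valid), 3))
      print("battery sensitivity:", round(prob_at_least_k(n_tests, cutoff, p_fail_if_invalid), 3))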

  10. A newly validated high-performance liquid chromatography method with diode array ultraviolet detection for analysis of the antimalarial drug primaquine in the blood plasma

    Directory of Open Access Journals (Sweden)

    Ana Paula Barbosa do Carmo

    INTRODUCTION: Primaquine (PQ) diphosphate is an 8-aminoquinoline antimalarial drug with unique therapeutic properties. It is the only drug that prevents relapses of Plasmodium vivax or Plasmodium ovale infections. In this study, a fast, sensitive, cost-effective, and robust method for the extraction and high-performance liquid chromatography with diode array ultraviolet detection (HPLC-DAD-UV) analysis of PQ in the blood plasma was developed and validated. METHODS: After plasma protein precipitation, PQ was obtained by liquid-liquid extraction and analyzed by HPLC-DAD-UV with a modified-silica cyanopropyl column (250 mm × 4.6 mm i.d. × 5 μm) as the stationary phase and a mixture of acetonitrile and 10 mM ammonium acetate buffer (pH = 3.80) (45:55) as the mobile phase. The flow rate was 1.0 mL·min-1, the oven temperature was 50 °C, and absorbance was measured at 264 nm. The method was validated for linearity, intra-day and inter-day precision, accuracy, recovery, and robustness. The detection (LOD) and quantification (LOQ) limits were 1.0 and 3.5 ng·mL-1, respectively. The method was used to analyze the plasma of female DBA-2 mice treated with 20 mg·kg-1 (oral) PQ diphosphate. RESULTS: By combining a simple, low-cost extraction procedure with a sensitive, precise, accurate, and robust method, it was possible to analyze PQ in small volumes of plasma. The new method presents lower LOD and LOQ limits and requires a shorter analysis time and smaller plasma volumes than those of previously reported HPLC methods with DAD-UV detection. CONCLUSIONS: The new validated method is suitable for kinetic studies of PQ in small rodents, including mouse models for the study of malaria.

  11. Validation of a dissolution method with RP-HPLC analysis for Perindopril erbumine and Indapamide combination tablet

    Directory of Open Access Journals (Sweden)

    Jain P.S.

    2012-01-01

    A dissolution method with high performance liquid chromatography (HPLC) analysis was validated for perindopril erbumine and indapamide in a combination tablet formulation. The method was validated to meet the requirements for a global regulatory filing, and this validation included specificity, linearity, accuracy, precision, range, robustness and solution stability studies. The dissolution method uses USP apparatus 1 with the basket rotating at 100 rpm and 1000 ml of phosphate buffer pH 6.8 as the dissolution medium; reversed-phase HPLC was carried out at 50 °C on a 4.6 mm × 250 mm, 5 μm cyano column containing USP packing L1, with acetonitrile:buffer pH 2.8 (40:60, v/v) as the mobile phase. The UV detector was set at 225 nm. The method was found to be selective, linear, accurate and precise in the specified ranges. Intra-day and inter-day variability for the method was <2% RSD. This method was successfully used for the quantification of perindopril erbumine and indapamide combination tablet formulations.

  12. Evaluation and validation of a multi-residue method based on biochip technology for the simultaneous screening of six families of antibiotics in muscle and aquaculture products.

    Science.gov (United States)

    Gaudin, Valérie; Hedou, Celine; Soumet, Christophe; Verdon, Eric

    2016-01-01

    The Evidence Investigator™ system (Randox, UK) is a biochip-based, semi-automated system. The microarray kit II (AM II) is capable of detecting several compounds belonging to different families of antibiotics: quinolones, ceftiofur, thiamphenicol, streptomycin, tylosin and tetracyclines. The performance of this innovative system was evaluated for the detection of antibiotic residues in new matrices, in muscle of different animal species and in aquaculture products. The method was validated according to European Decision No. EC/2002/657 and the European guideline for the validation of screening methods, which represents a complete initial validation. The false-positive rate was equal to 0% in muscle and in aquaculture products. The detection capabilities CCβ for the 12 validated antibiotics (enrofloxacin, difloxacin, ceftiofur, desfuroyl ceftiofur cysteine disulfide, thiamphenicol, florfenicol, tylosin, tilmicosin, streptomycin, dihydrostreptomycin, tetracycline, doxycycline) were all lower than the respective maximum residue limits (MRLs) in muscle from different animal origins (bovine, ovine, porcine, poultry). No cross-reactions were observed with other antibiotics, either within the six detected families or from other families of antibiotics. The AM II kit could be applied to aquaculture products, but with higher detection capabilities than those in muscle. The detection capabilities CCβ in aquaculture products were, respectively, 0.25, 0.10 and 0.5 of the respective MRL for enrofloxacin, tylosin and oxytetracycline. The performance of the AM II kit was compared with other screening methods and with the performance characteristics previously determined in honey.
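
    For a qualitative screening method, the detection capability CCβ is essentially the lowest validated level at which the false-negative rate stays at or below the chosen β error (5% in the EU guideline); a minimal sketch with hypothetical screening outcomes:

      def detection_capability(outcomes_by_level, beta=0.05):
          """Lowest spiking level whose false-negative rate is <= beta.
          outcomes_by_level: {level: list of booleans, True = screened positive}."""
          for level in sorted(outcomes_by_level):
              outcomes = outcomes_by_level[level]
              false_negative_rate = outcomes.count(False) / len(outcomes)
              if false_negative_rate <= beta:
                  return level
          return None

      # Hypothetical enrofloxacin spikes, expressed as fractions of the MRL, 20 replicates each
      screening = {0.25: [True] * 18 + [False] * 2,   # 10% false negatives -> too high
                   0.50: [True] * 20,
                   1.00: [True] * 20}
      print("CCbeta (fraction of MRL):", detection_capability(screening))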

  13. Validation method for determination of cholesterol in human urine with electrochemical sensors using gold electrodes

    Science.gov (United States)

    Riyanto, Laksono, Tomy Agung

    2017-12-01

    Electrochemical sensors for the determination of cholesterol with gold (Au) as the working electrode, and their application to the analysis of urine, have been developed. The gold electrode was prepared from pure gold (99.99%), 1.0 mm in length and width, connected to a silver wire using silver conductive paint. Validation of the method has been investigated for the analysis of cholesterol in human urine using electrochemical sensing, i.e. the cyclic voltammetry (CV) method. The effects of electrolyte and uric acid concentration were determined to produce the optimum method. The validation parameters for cholesterol analysis in human urine using CV are precision, recovery, linearity, limit of detection (LOD) and limit of quantification (LOQ). The results showed that the correlation of cholesterol concentration with anodic peak current has a coefficient of determination of R2 = 0.916. The results of the method validation showed that the precision, recovery, linearity, LOD, and LOQ are 1.2539%, 144.33%, 0.916, 1.49 × 10-1 mM and 4.96 × 10-1 mM, respectively. In conclusion, the Au electrode is a good electrode for electrochemical sensors for the determination of cholesterol in human urine.

  14. Low-cost extrapolation method for maximal LTE radio base station exposure estimation: Test and validation

    International Nuclear Information System (INIS)

    Verloock, L.; Joseph, W.; Gati, A.; Varsier, N.; Flach, B.; Wiart, J.; Martens, L.

    2013-01-01

    An experimental validation of a low-cost method for extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge of down-link band occupation or service characteristics is required for the low-cost method. The method is applicable in situ. It requires only a basic spectrum analyser with appropriate field probes, without the need for expensive dedicated LTE decoders. The method is validated both in the laboratory and in situ, for a single-input single-output antenna LTE system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders. (authors)

  15. Review on Laryngeal Palpation Methods in Muscle Tension Dysphonia: Validity and Reliability Issues.

    Science.gov (United States)

    Khoddami, Seyyedeh Maryam; Ansari, Noureddin Nakhostin; Jalaie, Shohreh

    2015-07-01

    Laryngeal palpation is a common clinical method for the assessment of neck and laryngeal muscles in muscle tension dysphonia (MTD). To review the available laryngeal palpation methods used in patients with MTD for the assessment, diagnosis, or documentation of treatment outcomes. A systematic review of the literature concerning palpatory methods in MTD was conducted using the databases MEDLINE (PubMed), ScienceDirect, Scopus, Web of Science, Web of Knowledge and the Cochrane Library between July and October 2013. Relevant studies were identified by one reviewer based on screened titles/abstracts and full texts. Manual searching was also used to track the source literature. There were five main palpation methods, as well as miscellaneous ones, which differed according to the target anatomical structures, the judgment or grading system, and the tasks used. There were only a few scales available, and the majority of the palpatory methods were qualitative. Most of the palpatory methods evaluate tension during both static and dynamic tasks. There was little information about the validity and reliability of the available methods. The literature on the scientific evidence for muscle tension indicators perceived by laryngeal palpation in MTD is scarce. Future studies should be conducted to investigate the validity and reliability of palpation methods. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  16. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    Science.gov (United States)

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for the extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with the results confirmed by GC-MS. The employed method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness and repeatability precision were assessed. The spiking levels for the trueness and precision experiments were 0.1, 0.5 and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% and from 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and from 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method has been proven to be efficient for the extraction and determination of azoxystrobin residues in green beans and peas. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Method Points: towards a metric for method complexity

    Directory of Open Access Journals (Sweden)

    Graham McLeod

    1998-11-01

    A metric for method complexity is proposed as an aid to choosing between competing methods, as well as to validating the effects of method integration or the products of method engineering work. It is based upon a generic method representation model previously developed by the author and an adaptation of concepts used in the popular Function Point metric for system size. The proposed technique is illustrated by comparing two popular I.E. deliverables with counterparts in the object-oriented Unified Modeling Language (UML). The paper recommends ways to improve the practical adoption of new methods.
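
    In the spirit of the Function Point analogy described above, such a metric boils down to counting elements of a method's representation and weighting them by an assumed complexity; the categories, weights and counts below are purely illustrative, not McLeod's actual Method Points definition:

      def method_points(element_counts, weights=None):
          """Weighted count of method elements, Function-Point style."""
          weights = weights or {"deliverables": 4, "techniques": 3, "roles": 2, "concepts": 1}
          return sum(element_counts.get(category, 0) * w for category, w in weights.items())

      ie_deliverable  = {"deliverables": 12, "techniques": 9,  "roles": 4, "concepts": 30}
      uml_counterpart = {"deliverables": 9,  "techniques": 12, "roles": 3, "concepts": 45}
      print("I.E.:", method_points(ie_deliverable), "UML:", method_points(uml_counterpart))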

  18. Shielding design method for LMFBR validation on the Phenix reactor

    International Nuclear Information System (INIS)

    Cabrillat, J.C.; Crouzet, J.; Misrakis, J.; Salvatores, M.; Rado, V.; Palmiotti, G.

    1983-05-01

    Shielding design methods, developed at CEA for shielding calculations, find a global validation by means of measurements at the Phenix power reactor (250 MWe). In particular, the secondary sodium activation of a pool-type LMFBR such as Super Phenix (1200 MWe), which is subject to strict safety limitations, is well calculated by the adapted scheme, i.e. a two-dimensional transport calculation of the shielding coupled to a Monte-Carlo calculation of the secondary sodium activation

  19. The Validation of AAN Method Used by Rock Sample SRM 2780

    International Nuclear Information System (INIS)

    Rina Mulyaningsih, Th.

    2004-01-01

    The AAN method is a non-standard testing method. The testing laboratory must validate the methods it uses, to ensure and confirm that they are suitable for the application. The analysis of SRM 2780 (hard rock mine waste) with 9 replicates has been done to test the accuracy of the AAN method. The results showed that the elements As, Ba, Mn, V, Zn and Na have good accuracy when evaluated against the acceptance criteria for accuracy at a confidence level of 95%. The elements As, Co, Sc, Cr, Ba, Sb, Cs, Mn, V, Au, Zn and Na have a low relative bias between the analyst's value and the target value. Continued testing must be done to assess the accuracy of the other certified elements. (author)
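
    The accuracy check described here reduces to comparing the analyst's mean with the certificate value for each element; a minimal sketch of a relative-bias test (the tolerance, element and numbers are hypothetical):

      def relative_bias(measured_mean, certified_value, tolerance_pct=10.0):
          """Relative bias (%) against the certified value and a pass/fail flag
          for an assumed acceptance limit."""
          bias_pct = 100.0 * (measured_mean - certified_value) / certified_value
          return round(bias_pct, 2), abs(bias_pct) <= tolerance_pct

      # Hypothetical As result in SRM 2780: mean of 9 replicates vs. certificate value (mg/kg)
      print(relative_bias(measured_mean=47.5, certified_value=48.8))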

  20. A Validated, Rapid HPLC-ESI-MS/MS Method for the Determination of Lycopsamine.

    Science.gov (United States)

    Jedlinszki, Nikoletta; Csupor, Dezső

    2015-07-01

    The aim of the present work was to develop and validate an HPLC-MS/MS method for the determination of a major pyrrolizidine alkaloid of comfrey (lycopsamine) in aqueous samples as a basis for the development of a method for the determination of absorption of lycopsamine by human skin. A linear calibration curve was established in the range of 1.32-440 ng. The intraday precision during the 3-day validation period ranged between 0.57 and 2.48% while the interday precision was 1.70% and 1.95% for quality control samples. LOD was 0.014 ng and recovery was above 97%. The lycopsamine content of the samples stored for 9 and 25 days at 22 degrees C, 10 degrees C and -25 degrees C did not vary. These results underline the good repeatability and accuracy of our method and allow the analysis of samples with very low lycopsamine content.

  1. Validity of a Simulation Game as a Method for History Teaching

    Science.gov (United States)

    Corbeil, Pierre; Laveault, Dany

    2011-01-01

    The aim of this research is, first, to determine the validity of a simulation game as a method of teaching and an instrument for the development of reasoning and, second, to study the relationship between learning and students' behavior toward games. The participants were college students in a History of International Relations course, with two…

  2. Catch-up validation study of an in vitro skin irritation test method based on an open source reconstructed epidermis (phase II).

    Science.gov (United States)

    Groeber, F; Schober, L; Schmid, F F; Traube, A; Kolbus-Hernandez, S; Daton, K; Hoffmann, S; Petersohn, D; Schäfer-Korting, M; Walles, H; Mewes, K R

    2016-10-01

    To replace the Draize skin irritation assay (OECD guideline 404), several test methods based on reconstructed human epidermis (RHE) have been developed and were adopted in OECD test guideline 439. However, all validated test methods in the guideline are linked to RHE provided by only three companies. Thus, the availability of these test models is dependent on the commercial interest of the producer. To overcome this limitation, and thus increase the accessibility of in vitro skin irritation testing, an open source reconstructed epidermis (OS-REp) was introduced. To demonstrate the capacity of the OS-REp in regulatory risk assessment, a catch-up validation study was performed. The participating laboratories used in-house generated OS-REp to assess the set of 20 reference substances according to the performance standards amending OECD test guideline 439. Testing was performed under blinded conditions. The within-laboratory reproducibility of 87% and the inter-laboratory reproducibility of 85% prove the high reliability of irritancy testing using the OS-REp protocol. In addition, the prediction capacity, with an accuracy of 80%, was comparable to previously published RHE-based test protocols. Taken together, the results indicate that the OS-REp test method can be used as a standalone alternative skin irritation test replacing OECD test guideline 404. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. A newly validated high-performance liquid chromatography method with diode array ultraviolet detection for analysis of the antimalarial drug primaquine in the blood plasma.

    Science.gov (United States)

    Carmo, Ana Paula Barbosa do; Borborema, Manoella; Ribeiro, Stephan; De-Oliveira, Ana Cecilia Xavier; Paumgartten, Francisco Jose Roma; Moreira, Davyson de Lima

    2017-01-01

    Primaquine (PQ) diphosphate is an 8-aminoquinoline antimalarial drug with unique therapeutic properties. It is the only drug that prevents relapses of Plasmodium vivax or Plasmodium ovale infections. In this study, a fast, sensitive, cost-effective, and robust method for the extraction and high-performance liquid chromatography with diode array ultraviolet detection (HPLC-DAD-UV) analysis of PQ in the blood plasma was developed and validated. After plasma protein precipitation, PQ was obtained by liquid-liquid extraction and analyzed by HPLC-DAD-UV with a modified-silica cyanopropyl column (250 mm × 4.6 mm i.d. × 5 μm) as the stationary phase and a mixture of acetonitrile and 10 mM ammonium acetate buffer (pH = 3.80) (45:55) as the mobile phase. The flow rate was 1.0 mL·min-1, the oven temperature was 50 °C, and absorbance was measured at 264 nm. The method was validated for linearity, intra-day and inter-day precision, accuracy, recovery, and robustness. The detection (LOD) and quantification (LOQ) limits were 1.0 and 3.5 ng·mL-1, respectively. The method was used to analyze the plasma of female DBA-2 mice treated with 20 mg·kg-1 (oral) PQ diphosphate. By combining a simple, low-cost extraction procedure with a sensitive, precise, accurate, and robust method, it was possible to analyze PQ in small volumes of plasma. The new method presents lower LOD and LOQ limits and requires a shorter analysis time and smaller plasma volumes than those of previously reported HPLC methods with DAD-UV detection. The new validated method is suitable for kinetic studies of PQ in small rodents, including mouse models for the study of malaria.

  4. Validated spectrophotometric methods for simultaneous determination of troxerutin and carbazochrome in dosage form

    Science.gov (United States)

    Khattab, Fatma I.; Ramadan, Nesrin K.; Hegazy, Maha A.; Al-Ghobashy, Medhat A.; Ghoniem, Nermine S.

    2015-03-01

    Four simple, accurate, sensitive and precise spectrophotometric methods were developed and validated for the simultaneous determination of troxerutin (TXN) and carbazochrome (CZM) in their bulk powders, laboratory-prepared mixtures and pharmaceutical dosage forms. Method A is first-derivative spectrophotometry (D1), in which TXN and CZM were determined at 294 and 483.5 nm, respectively. Method B is the first derivative of the ratio spectra (DD1), in which the peak amplitudes at 248 nm for TXN and 439 nm for CZM were used for their determination. Method C is ratio subtraction (RS), in which TXN was determined at its λmax (352 nm) in the presence of CZM, which was determined by D1 at 483.5 nm. Method D is mean centering of the ratio spectra (MCR), in which the mean-centered values at 300 nm and 340.0 nm were used for the two drugs, respectively. The two compounds were simultaneously determined in the concentration ranges of 5.00-50.00 μg mL-1 and 0.5-10.0 μg mL-1 for TXN and CZM, respectively. The methods were validated according to the ICH guidelines and the results were statistically compared to those of the manufacturer's method.
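
    A minimal numerical sketch of the first two approaches (D1 and DD1), using synthetic overlapping bands standing in for TXN and CZM rather than real spectra; the band positions, widths and amplitudes are illustrative only:

      import numpy as np

      def first_derivative(spectrum, step_nm=1.0):
          """Method A style (D1): numerical first derivative of an absorption spectrum."""
          return np.gradient(spectrum, step_nm)

      def ratio_spectra_derivative(mixture, divisor, step_nm=1.0):
          """Method B style (DD1): divide the mixture spectrum by a standard spectrum of the
          interfering drug, then differentiate; the interferent's contribution becomes a
          constant in the ratio and vanishes on differentiation."""
          return np.gradient(np.asarray(mixture) / np.asarray(divisor), step_nm)

      wavelengths = np.arange(230.0, 500.0)                        # nm, 1 nm steps
      txn = 1.0 * np.exp(-((wavelengths - 352.0) / 30.0) ** 2)     # synthetic TXN band
      czm = 0.4 * np.exp(-((wavelengths - 483.5) / 25.0) ** 2)     # synthetic CZM band
      mixture = 0.8 * txn + 0.6 * czm

      d1 = first_derivative(mixture)
      dd1 = ratio_spectra_derivative(mixture, czm + 0.01)          # small constant avoids /0 in toy data
      print("D1 extremes at", wavelengths[np.argmax(d1)], "and", wavelengths[np.argmin(d1)], "nm")
      print("DD1 computed over", dd1.size, "points")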

  5. Improvement of Simulation Method in Validation of Software of the Coordinate Measuring Systems

    Directory of Open Access Journals (Sweden)

    Nieciąg Halina

    2015-10-01

    Software is used in order to accomplish various tasks at each stage of the functioning of modern measuring systems. Before metrological confirmation of measuring equipment, the system has to be validated. This paper discusses a method for conducting validation studies of a fragment of software used to calculate the values of measurands. Due to the number and nature of the variables affecting the coordinate measurement results and the complex character and multi-dimensionality of measurands, the study used the Monte Carlo method of numerical simulation. The article presents an attempt at a possible improvement of the results obtained by classic Monte Carlo tools. The LHS (Latin Hypercube Sampling) algorithm was implemented as an alternative to the simple sampling scheme of the classic algorithm.

  6. Survey and assessment of conventional software verification and validation methods

    International Nuclear Information System (INIS)

    Miller, L.A.; Groundwater, E.; Mirsky, S.M.

    1993-04-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 134 methods so identified were classified according to their appropriateness for various phases of a developmental lifecycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit metric and an Effectiveness metric. The Effectiveness metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the four identified components of knowledge-based and expert systems, as well as for the system as a whole

  7. Evaluating the Social Validity of the Early Start Denver Model: A Convergent Mixed Methods Study

    Science.gov (United States)

    Ogilvie, Emily; McCrudden, Matthew T.

    2017-01-01

    An intervention has social validity to the extent that it is socially acceptable to participants and stakeholders. This pilot convergent mixed methods study evaluated parents' perceptions of the social validity of the Early Start Denver Model (ESDM), a naturalistic behavioral intervention for children with autism. It focused on whether the parents…

  8. Validation of the quality control methods for active ingredients of Fungirex cream

    International Nuclear Information System (INIS)

    Perez Navarro, Maikel; Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania

    2014-01-01

    Fungirex cream is a two-drug product containing undecylenic acid and zinc undecylenate in a suitable base. Since this is a product not documented in the official monographs of the pharmacopoeia, simple analytical methods were proposed for the quantitation of the analytes of interest in the cream, which are useful for the release of newly prepared cream batches. The objective was to validate two volumetric methods for the quality control of the active ingredients in Fungirex cream

  9. General criteria for validation of dosimetry methods in the context of a quality system ISO / IEC 17025

    International Nuclear Information System (INIS)

    Martin Garcia, R.; Navarro Bravo, T.

    2011-01-01

    The accreditation of a testing laboratory in accordance with ISO/IEC 17025 recognizes the technical competence of the laboratory to perform certain tests. One of the requirements of that standard states that laboratories must demonstrate that the methods used are valid and appropriate for the intended use and customer needs. This demonstration is accomplished through the process of validation of methods, defined in the standard itself as confirmation, by examination and the provision of objective evidence, that the requirements for a particular purpose are fulfilled. The process of validating a test method should be well planned and documented, including the requirements under the applicable rules and the criteria established by the laboratory to comply with these requirements.

  10. Uncertainty estimates of purity measurements based on current information: toward a "live validation" of purity methods.

    Science.gov (United States)

    Apostol, Izydor; Kelner, Drew; Jiang, Xinzhao Grace; Huang, Gang; Wypych, Jette; Zhang, Xin; Gastwirt, Jessica; Chen, Kenneth; Fodor, Szilan; Hapuarachchi, Suminda; Meriage, Dave; Ye, Frank; Poppe, Leszek; Szpankowski, Wojciech

    2012-12-01

    To predict precision and other performance characteristics of chromatographic purity methods, which represent the most widely used form of analysis in the biopharmaceutical industry. We have conducted a comprehensive survey of purity methods, and show that all performance characteristics fall within narrow measurement ranges. This observation was used to develop a model called Uncertainty Based on Current Information (UBCI), which expresses these performance characteristics as a function of the signal and noise levels, hardware specifications, and software settings. We applied the UBCI model to assess the uncertainty of purity measurements, and compared the results to those from conventional qualification. We demonstrated that the UBCI model is suitable for dynamically assessing method performance characteristics, based on information extracted from individual chromatograms. The model provides an opportunity for streamlining qualification and validation studies by implementing a "live validation" of test results, utilizing UBCI as a concurrent assessment of measurement uncertainty. Therefore, UBCI can potentially mitigate the challenges associated with laborious conventional method validation and facilitates the introduction of more advanced analytical technologies during the method lifecycle.
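
    The record does not give the UBCI equations, but the idea of deriving an uncertainty for an area-percent purity value from the signal and noise of an individual chromatogram can be sketched as follows (the noise model, propagation and numbers are assumptions for illustration only, not the published model):

      import math

      def main_peak_purity_uncertainty(peak_areas, noise_sd, peak_widths_s, sampling_rate_hz=5.0):
          """Area-percent of the main peak and a first-order propagated uncertainty,
          taking each area's standard uncertainty as noise_sd * sqrt(width / sampling_rate)."""
          total = sum(peak_areas)
          u_areas = [noise_sd * math.sqrt(w / sampling_rate_hz) for w in peak_widths_s]
          main_pct = 100.0 * peak_areas[0] / total
          sens = [(total - peak_areas[0]) / total**2 if i == 0 else -peak_areas[0] / total**2
                  for i in range(len(peak_areas))]
          u_pct = 100.0 * math.sqrt(sum((s * u) ** 2 for s, u in zip(sens, u_areas)))
          return round(main_pct, 3), round(u_pct, 3)

      # Hypothetical chromatogram: main peak plus two impurity peaks (areas in detector units)
      print(main_peak_purity_uncertainty([9500.0, 300.0, 200.0], noise_sd=2.0,
                                         peak_widths_s=[30.0, 12.0, 10.0]))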

  11. The use of Geographic Information System (GIS) and non-GIS methods to assess the external validity of samples postcollection.

    Science.gov (United States)

    Richardson, Esther; Good, Margaret; McGrath, Guy; More, Simon J

    2009-09-01

    External validity is fundamental to veterinary diagnostic investigation, reflecting the accuracy with which sample results can be extrapolated to a broader population of interest. Probability sampling methods are routinely used during the collection of samples from populations, specifically to maximize external validity. Nonprobability sampling (e.g., of blood samples collected as part of routine surveillance programs or laboratory submissions) may provide useful data for further posthoc epidemiological analysis, adding value to the collection and submission of samples. As the sample has already been submitted, the analyst or investigator does not have any control over the sampling methodology, and hence over external validity, as routine probability sampling methods may not have been employed. The current study describes several Geographic Information System (GIS) and non-GIS methods, applied posthoc, to assess the external validity of samples collected using both probability and nonprobability sampling methods. These methods could equally be employed for inspecting other datasets. Mapping was conducted using ArcView 9.1. Based on this posthoc assessment, results from the random field sample could provide an externally valid, albeit relatively imprecise, estimate of national disease prevalence, of disease prevalence in 3 of the 4 provinces (all but Ulster, in the north and northwest, where the sample size was small), and in beef and dairy herds. This study provides practical methods for examining the external validity of samples postcollection.
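
    One simple non-GIS check of external validity after collection is to compare how the submitted samples distribute over strata (province, herd type, season) with the distribution of the underlying population; a minimal sketch with hypothetical counts, not the study's data:

      def stratum_proportions(sample_counts, population_counts):
          """Sample vs. population proportion for each stratum; large gaps flag strata
          (like a sparsely sampled province) where extrapolation would be weak."""
          n_sample = sum(sample_counts.values())
          n_population = sum(population_counts.values())
          return {stratum: (round(sample_counts.get(stratum, 0) / n_sample, 3),
                            round(count / n_population, 3))
                  for stratum, count in population_counts.items()}

      # Hypothetical laboratory submissions per province vs. the national herd register
      submissions = {"Leinster": 120, "Munster": 150, "Connacht": 80, "Ulster": 10}
      herd_register = {"Leinster": 30000, "Munster": 40000, "Connacht": 25000, "Ulster": 15000}
      print(stratum_proportions(submissions, herd_register))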

  12. Treatment response in psychotic patients classified according to social and clinical needs, drug side effects, and previous treatment; a method to identify functional remission.

    Science.gov (United States)

    Alenius, Malin; Hammarlund-Udenaes, Margareta; Hartvig, Per; Sundquist, Staffan; Lindström, Leif

    2009-01-01

    Various approaches have been made over the years to classify psychotic patients according to inadequate treatment response, using terms such as treatment resistant or treatment refractory. Existing classifications have been criticized for overestimating positive symptoms; underestimating residual symptoms, negative symptoms, and side effects; or being too open to individual interpretation. The aim of this study was to present and evaluate a new method of classification according to treatment response and, thus, to identify patients in functional remission. A naturalistic, cross-sectional study was performed using patient interviews and information from patient files. The new classification method CANSEPT, which combines the Camberwell Assessment of Need rating scale, the Udvalg for Kliniske Undersøgelser side effect rating scale (SE), and the patient's previous treatment history (PT), was used to group the patients according to treatment response. CANSEPT was evaluated by comparison of expected and observed results. In the patient population (n = 123), the patients in functional remission, as defined by CANSEPT, had higher quality of life, fewer hospitalizations, fewer psychotic symptoms, and a higher rate of workers than those with the worst treatment outcome. In the evaluation, CANSEPT showed validity in discriminating the patients of interest and was well tolerated by the patients.

  13. Method of administration of PROMIS scales did not significantly impact score level, reliability, or validity

    DEFF Research Database (Denmark)

    Bjorner, Jakob B; Rose, Matthias; Gandek, Barbara

    2014-01-01

    OBJECTIVES: To test the impact of the method of administration (MOA) on score level, reliability, and validity of scales developed in the Patient Reported Outcomes Measurement Information System (PROMIS). STUDY DESIGN AND SETTING: Two nonoverlapping parallel forms each containing eight items from......, no significant mode differences were found and all confidence intervals were within the prespecified minimal important difference of 0.2 standard deviation. Parallel-forms reliabilities were very high (ICC = 0.85-0.93). Only one across-mode ICC was significantly lower than the same-mode ICC. Tests of validity...... questionnaire (PQ), personal digital assistant (PDA), or personal computer (PC) and a second form by PC, in the same administration. Method equivalence was evaluated through analyses of difference scores, intraclass correlations (ICCs), and convergent/discriminant validity. RESULTS: In difference score analyses...

  14. Validation of an open-formula, diagnostic real-time PCR method for 20-hr detection of Salmonella in animal feeds

    DEFF Research Database (Denmark)

    Löfström, Charlotta; Hoorfar, Jeffrey

    2012-01-01

    A comparative study of a 20-hr, non-commercial, open-formula PCR method and the standard culture-based method NMKL 187, for detection of Salmonella, was performed according to the validation protocol from the Nordic organization for validation of alternative microbiological methods (NordVal) on 81...

  15. Validation on milk and sprouts of EN ISO 16654:2001 - Microbiology of food and animal feeding stuffs - Horizontal method for the detection of Escherichia coli O157.

    Science.gov (United States)

    Tozzoli, Rosangela; Maugliani, Antonella; Michelacci, Valeria; Minelli, Fabio; Caprioli, Alfredo; Morabito, Stefano

    2018-05-08

    In 2006, the European Committee for Standardisation (CEN)/Technical Committee 275 - Food analysis - Horizontal methods/Working Group 6 - Microbiology of the food chain (TC275/WG6), launched the project of validating the method ISO 16654:2001 for the detection of Escherichia coli O157 in foodstuff by the evaluation of its performance, in terms of sensitivity and specificity, through collaborative studies. Previously, a validation study had been conducted to assess the performance of Method No. 164 developed by the Nordic Committee for Food Analysis (NMKL), which also aims at detecting E. coli O157 in food and is based on a procedure equivalent to that of the ISO 16654:2001 standard. Therefore, CEN established that the validation data obtained for the NMKL Method 164 could be exploited for the ISO 16654:2001 validation project, integrated with new data obtained through two additional interlaboratory studies on milk and sprouts, run in the framework of the CEN mandate No. M381. The ISO 16654:2001 validation project was led by the European Union Reference Laboratory for Escherichia coli including VTEC (EURL-VTEC), which organized the collaborative validation study on milk in 2012, with 15 participating laboratories, and that on sprouts in 2014, with 14 participating laboratories. In both studies, a total of 24 samples were tested by each laboratory. Test materials were spiked with different concentrations of E. coli O157, and the 24 samples corresponded to eight replicates of three levels of contamination: zero, low and high spiking levels. The results submitted by the participating laboratories were analyzed to evaluate the sensitivity and specificity of the ISO 16654:2001 method when applied to milk and sprouts. The performance characteristics calculated on the data of the collaborative validation studies run under the CEN mandate No. M381 returned sensitivity and specificity of 100% and 94.4%, respectively, for the milk study. As for the sprouts matrix, the sensitivity
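
    Sensitivity and specificity in such collaborative trials reduce to simple proportions over the spiked and blank samples. The sketch below shows that arithmetic on invented counts; the numbers are hypothetical and are not the EURL-VTEC data.

```python
# Sensitivity/specificity arithmetic for a collaborative detection study.
# Counts are invented for illustration; they are not the EURL-VTEC data.
def sensitivity(true_pos: int, false_neg: int) -> float:
    """Proportion of spiked (truly positive) samples reported positive."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Proportion of blank (truly negative) samples reported negative."""
    return true_neg / (true_neg + false_pos)

# e.g. 15 laboratories x 16 spiked samples and 15 x 8 blanks (hypothetical)
print(f"sensitivity = {sensitivity(true_pos=240, false_neg=0):.1%}")
print(f"specificity = {specificity(true_neg=113, false_pos=7):.1%}")
```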

  16. Method validation and stability study of quercetin in topical emulsions

    Directory of Open Access Journals (Sweden)

    Rúbia Casagrande

    2009-01-01

    Full Text Available This study validated a high performance liquid chromatography (HPLC) method for the quantitative evaluation of quercetin in topical emulsions. The method was linear within the 0.05-200 μg/mL range with a correlation coefficient of 0.9997, and without interference in the quercetin peak. The detection and quantitation limits were 18 and 29 ng/mL, respectively. The intra- and inter-assay precisions presented R.S.D. values lower than 2%. An average of 93% and 94% of quercetin was recovered for non-ionic and anionic emulsions, respectively. The raw material and the anionic emulsion, but not the non-ionic emulsion, were stable under all storage conditions for one year. The method reported is a fast and reliable HPLC technique useful for quercetin determination in topical emulsions.
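
    Several of the figures reported above (linearity, correlation coefficient, detection and quantitation limits) are routinely derived from a calibration curve. The sketch below illustrates one common way of doing so, using ICH-style estimates LOD = 3.3·σ/S and LOQ = 10·σ/S; the calibration points are invented and do not reproduce the study's data.

```python
# Sketch of calibration-curve statistics commonly reported in HPLC method
# validation: slope, correlation coefficient, and ICH-style LOD/LOQ
# (3.3*sigma/S and 10*sigma/S). Data points are invented.
import numpy as np

conc = np.array([0.05, 0.5, 5, 50, 100, 200])                 # ug/mL (hypothetical)
area = np.array([0.9, 10.2, 101.5, 1004.0, 2010.0, 4018.0])   # peak areas (hypothetical)

slope, intercept = np.polyfit(conc, area, 1)
r = np.corrcoef(conc, area)[0, 1]

residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)          # residual standard deviation of the fit

lod = 3.3 * sigma / slope
loq = 10 * sigma / slope
print(f"slope={slope:.2f}, r={r:.4f}, LOD={lod:.3f} ug/mL, LOQ={loq:.3f} ug/mL")
```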

  17. A Validated RP-HPLC Method for the Determination of Atazanavir in Pharmaceutical Dosage Form

    Directory of Open Access Journals (Sweden)

    K. Srinivasu

    2011-01-01

    Full Text Available A validated RP-HPLC method was developed for the estimation of atazanavir in capsule dosage form on a YMC ODS 150 × 4.6 mm, 5 μm column, using a mobile phase of ammonium dihydrogen phosphate buffer (pH 2.5) and acetonitrile (55:45 v/v). The flow rate was maintained at 1.5 mL/min with UV detection at 288 nm. The retention time obtained for atazanavir was 4.7 min. The detector response was linear in the concentration range of 30-600 μg/mL. This method has been validated and shown to be specific, sensitive, precise, linear, accurate, rugged, robust and fast. Hence, this method can be applied for routine quality control of atazanavir in capsule dosage forms as well as in the bulk drug.

  18. Method validation for simultaneous counting of Total α, β in Drinking Water using Liquid Scintillation Counter

    International Nuclear Information System (INIS)

    Al-Masri, M. S.; Nashawati, A.

    2014-05-01

    In this work, a method using pulse shape analysis was validated for the determination of gross Alpha and Beta emitters in drinking water using the liquid scintillation counter WinSpectral 1414. Validation parameters included the Method Detection Limit, Method Quantitation Limit, Repeatability Limit, Intermediate Precision, Trueness (Bias), Recovery Coefficient, Linearity and Uncertainty Budget of the analysis. The results show that the Method Detection Limit and Method Quantitation Limit were 0.07 and 0.24 Bq/l for Alpha emitters, and 0.42 and 1.4 Bq/l for Beta emitters, respectively. The relative standard deviation of the Repeatability Limit reached 2.81% for Alpha emitters and 3.96% for Beta emitters. In addition, the relative standard deviation of the Intermediate Precision was 0.54% for Alpha emitters and 1.17% for Beta emitters. Moreover, the trueness (bias) was -7.7% for Alpha emitters and -4.5% for Beta emitters. The Recovery Coefficient ranged between 87-96% and 88-101% for Alpha and Beta emitters, respectively. Linearity reached 1 for both Alpha and Beta emitters. On the other hand, the Uncertainty Budget for all components was 96.65% and 83.14% for Alpha and Beta emitters, respectively (author).
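
    The validation parameters listed above are largely summary statistics of replicate measurements. The sketch below shows how repeatability (relative standard deviation), recovery and trueness (bias) might be computed from replicate results on a spiked sample; the activities are invented.

```python
# Sketch of basic validation statistics: repeatability (relative standard
# deviation), recovery coefficient and trueness (bias) computed from
# replicate measurements of a spiked reference. Values are invented.
import statistics

reference_activity = 10.0                          # Bq/L added (hypothetical)
measured = [9.3, 9.6, 9.1, 9.5, 9.4, 9.2]          # replicate results, Bq/L (hypothetical)

mean = statistics.mean(measured)
rsd_percent = 100 * statistics.stdev(measured) / mean                   # repeatability
recovery_percent = 100 * mean / reference_activity                      # recovery coefficient
bias_percent = 100 * (mean - reference_activity) / reference_activity   # trueness (bias)

print(f"RSD = {rsd_percent:.1f}%, recovery = {recovery_percent:.1f}%, bias = {bias_percent:.1f}%")
```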

  19. THE INFLUENCE OF THE ASSESSMENT MODEL AND METHOD TOWARD THE SCIENCE LEARNING ACHIEVEMENT BY CONTROLLING THE STUDENTS' PREVIOUS KNOWLEDGE OF MATHEMATICS.

    OpenAIRE

    Adam Rumbalifar; I. G. N. Agung; Burhanuddin Tola.

    2018-01-01

    This research aims to study the influence of the assessment model and method toward the science learning achievement by controlling the students' previous knowledge of mathematics. This study was conducted at SMP East Seram district with the population of 295 students. This study applied a quasi-experimental method with 2 X 2 factorial design using the ANCOVA model. The findings after controlling the students' previous knowledge of mathematics show that the science learning achievement of th...
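
    A 2 x 2 factorial ANCOVA of this kind can be expressed compactly with a model formula. The sketch below is a hypothetical illustration using simulated data and invented variable names, not the study's dataset or analysis code.

```python
# Minimal sketch of a 2 x 2 factorial ANCOVA with prior mathematics
# knowledge as covariate. Data are simulated; variable names are invented.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 80
df = pd.DataFrame({
    "assessment_model": rng.choice(["A", "B"], n),
    "assessment_method": rng.choice(["X", "Y"], n),
    "prior_math": rng.normal(60, 10, n),
})
df["science_score"] = (0.5 * df["prior_math"] + rng.normal(0, 5, n)
                       + np.where(df["assessment_model"] == "A", 3, 0))

model = smf.ols(
    "science_score ~ C(assessment_model) * C(assessment_method) + prior_math",
    data=df,
).fit()
print(sm.stats.anova_lm(model, typ=2))  # ANCOVA table: main and interaction effects
```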

  20. A Sensitive Validated Spectrophotometric Method for the Determination of Flucloxacillin Sodium

    Directory of Open Access Journals (Sweden)

    R. Singh Gujral

    2009-01-01

    Full Text Available A simple and sensitive spectrophotometric method has been proposed for the determination of flucloxacillin sodium. The determination method is based on a charge transfer complexation reaction of the drug with iodine in a methanol-dichloromethane medium. The absorbance was measured at 362 nm against the reagent blank. Under optimized experimental conditions, Beer's law is obeyed in the concentration range 1-9 μg/mL for flucloxacillin. The method was validated for specificity, linearity, precision and accuracy. The degree of linearity of the calibration curves, the percent recoveries, and the limits of detection and quantitation for the spectrophotometric method were determined. No interferences could be observed from the additives commonly present in pharmaceutical formulations. The method was successfully applied to the in vitro determination of flucloxacillin in human urine samples with a low RSD value. This is a simple, specific, accurate and sensitive spectrophotometric method.

  1. Validation of a Smartphone Image-Based Dietary Assessment Method for Pregnant Women

    Directory of Open Access Journals (Sweden)

    Amy M. Ashman

    2017-01-01

    Full Text Available Image-based dietary records could lower participant burden associated with traditional prospective methods of dietary assessment. They have been used in children, adolescents and adults, but have not been evaluated in pregnant women. The current study evaluated relative validity of the DietBytes image-based dietary assessment method for assessing energy and nutrient intakes. Pregnant women collected image-based dietary records (via a smartphone application) of all food, drinks and supplements consumed over three non-consecutive days. Intakes from the image-based method were compared to intakes collected from three 24-h recalls, taken on random days, once per week, in the weeks following the image-based record. Data were analyzed using nutrient analysis software. Agreement between methods was ascertained using Pearson correlations and Bland-Altman plots. Twenty-five women (27 recruited, one withdrew, one incomplete; median age 29 years; 15 primiparas; eight Aboriginal Australians) completed image-based records for analysis. Significant correlations between the two methods were observed for energy, macronutrients and fiber (r = 0.58–0.84, all p < 0.05), and for micronutrients both including (r = 0.47–0.94, all p < 0.05) and excluding (r = 0.40–0.85, all p < 0.05) supplements in the analysis. Bland-Altman plots confirmed acceptable agreement with no systematic bias. The DietBytes method demonstrated acceptable relative validity for assessment of nutrient intakes of pregnant women.
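
    Agreement analyses of the Bland-Altman type mentioned above can be reproduced with a few lines of code. The sketch below uses invented energy intakes for two methods; it is illustrative only and not the study's analysis.

```python
# Minimal Bland-Altman sketch comparing two dietary assessment methods
# (e.g. image-based records vs 24-h recalls). Energy intakes are invented.
import numpy as np
import matplotlib.pyplot as plt

image_based = np.array([8.2, 9.1, 7.5, 10.3, 8.8, 9.6, 7.9, 8.4])  # MJ/day, hypothetical
recall_24h = np.array([8.0, 9.4, 7.8, 10.0, 9.1, 9.2, 8.1, 8.6])   # MJ/day, hypothetical

mean_of_methods = (image_based + recall_24h) / 2
difference = image_based - recall_24h
bias = difference.mean()
loa = 1.96 * difference.std(ddof=1)     # half-width of the 95% limits of agreement

plt.scatter(mean_of_methods, difference)
for y in (bias, bias - loa, bias + loa):
    plt.axhline(y, linestyle="--")
plt.xlabel("Mean of methods (MJ/day)")
plt.ylabel("Difference (MJ/day)")
plt.title("Bland-Altman plot (hypothetical data)")
plt.show()
```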

  2. MR-based Water Content Estimation in Cartilage: Design and Validation of a Method

    DEFF Research Database (Denmark)

    Shiguetomi Medina, Juan Manuel; Kristiansen, Maja Sofie; Ringgaard, Steffen

    2012-01-01

    Objective Design and validation of an MR-based method that allows the calculation of the water content in cartilage tissue. Material and Methods We modified and adapted to cartilage tissue T1 map based water content MR sequences commonly used in the neurology field. Using a 37 Celsius degree stable...... was customized and programmed. Finally, we validated the method after measuring and comparing 3 more cartilage samples in a living animal (pig). The obtained data were analyzed and the water content calculated. Then, the same samples were freeze-dried (this technique removes all the water that a tissue...... contains) and we measured the water they contained. Results We could reproduce twice the 37 Celsius degree system and could perform the measurements in a similar way. We found that the MR T1 map based water content sequences can provide information that, after being analyzed with special software, can...

  3. Determination of methylmercury in marine biota samples with advanced mercury analyzer: method validation.

    Science.gov (United States)

    Azemard, Sabine; Vassileva, Emilia

    2015-06-01

    In this paper, we present a simple, fast and cost-effective method for determination of methyl mercury (MeHg) in marine samples. All important parameters influencing the sample preparation process were investigated and optimized. Full validation of the method was performed in accordance with the ISO-17025 (ISO/IEC, 2005) and Eurachem guidelines. Blanks, selectivity, working range (0.09-3.0ng), recovery (92-108%), intermediate precision (1.7-4.5%), traceability, limit of detection (0.009ng), limit of quantification (0.045ng) and expanded uncertainty (15%, k=2) were assessed. Estimation of the uncertainty contribution of each parameter and the demonstration of traceability of measurement results was provided as well. Furthermore, the selectivity of the method was studied by analyzing the same sample extracts by advanced mercury analyzer (AMA) and gas chromatography-atomic fluorescence spectrometry (GC-AFS). Additional validation of the proposed procedure was carried out through participation in the IAEA-461 worldwide inter-laboratory comparison exercises. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Using method triangulation to validate a new instrument (CPWQ-com) assessing cancer patients' satisfaction with communication

    DEFF Research Database (Denmark)

    Ross, Lone; Lundstrøm, Louise Hyldborg; Petersen, Morten Aagaard

    2012-01-01

    Patients' perceptions of care, including the communication with health care staff, are recognized as an important aspect of the quality of cancer care. Using mixed methods, we developed and validated a short instrument assessing this communication.

  5. Validation of an analytical method for nitrous oxide (N2O) laughing gas by headspace gas chromatography coupled to mass spectrometry (HS-GC-MS): forensic application to a lethal intoxication.

    Science.gov (United States)

    Giuliani, N; Beyer, J; Augsburger, M; Varlet, V

    2015-03-01

    Drug abuse is a widespread problem affecting both teenagers and adults. Nitrous oxide is becoming increasingly popular as an inhalation drug, causing harmful neurological and hematological effects. Some gas chromatography-mass spectrometry (GC-MS) methods for nitrous oxide measurement have been previously described. The main drawbacks of these methods are a lack of sensitivity for forensic applications and an inability to quantitatively determine the concentration of gas present. The following study provides a validated HS-GC-MS method which incorporates hydrogen sulfide as a suitable internal standard, allowing the quantification of nitrous oxide. Upon analysis, the sample and internal standard have similar retention times and are eluted quickly from the molecular sieve 5Å PLOT capillary column and the Porabond Q column, therefore providing rapid data collection whilst preserving well-defined peaks. After validation, the method was applied to a real case of N2O intoxication, indicating the concentrations found in a mono-intoxication. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. A gradient-based method for segmenting FDG-PET images: methodology and validation

    International Nuclear Information System (INIS)

    Geets, Xavier; Lee, John A.; Gregoire, Vincent; Bol, Anne; Lonneux, Max

    2007-01-01

    A new gradient-based method for segmenting FDG-PET images is described and validated. The proposed method relies on the watershed transform and hierarchical cluster analysis. To allow a better estimation of the gradient intensity, iteratively reconstructed images were first denoised and deblurred with an edge-preserving filter and a constrained iterative deconvolution algorithm. Validation was first performed on computer-generated 3D phantoms containing spheres, then on a real cylindrical Lucite phantom containing spheres of different volumes ranging from 2.1 to 92.9 ml. Moreover, laryngeal tumours from seven patients were segmented on PET images acquired before laryngectomy by the gradient-based method and the thresholding method based on the source-to-background ratio developed by Daisne (Radiother Oncol 2003;69:247-50). For the spheres, the calculated volumes and radii were compared with the known values; for laryngeal tumours, the volumes were compared with the macroscopic specimens. Volume mismatches were also analysed. On computer-generated phantoms, the deconvolution algorithm decreased the mis-estimate of volumes and radii. For the Lucite phantom, the gradient-based method led to a slight underestimation of sphere volumes (by 10-20%), corresponding to negligible radius differences (0.5-1.1 mm); for laryngeal tumours, the segmented volumes by the gradient-based method agreed with those delineated on the macroscopic specimens, whereas the threshold-based method overestimated the true volume by 68% (p = 0.014). Lastly, macroscopic laryngeal specimens were totally encompassed by neither the threshold-based nor the gradient-based volumes. The gradient-based segmentation method applied on denoised and deblurred images proved to be more accurate than the source-to-background ratio method. (orig.)
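
    A minimal 2D sketch of gradient-based watershed segmentation, in the spirit of the method described above, is shown below. It omits the deconvolution step and the hierarchical cluster analysis, runs on a synthetic image rather than PET data, and is not the authors' implementation.

```python
# 2D sketch of gradient-based segmentation with a watershed transform:
# smooth the image, compute a gradient magnitude image, and flood from
# background/foreground markers. Data are synthetic; this is illustrative only.
import numpy as np
from skimage.filters import gaussian, sobel
from skimage.segmentation import watershed

# Synthetic "uptake" image: a bright disc on a noisy background
yy, xx = np.mgrid[0:128, 0:128]
disc = ((xx - 64) ** 2 + (yy - 64) ** 2) < 20 ** 2
image = 5.0 * disc + np.random.default_rng(0).normal(0, 0.5, (128, 128))

smoothed = gaussian(image, sigma=2)        # an edge-preserving filter could be used instead
gradient = sobel(smoothed)                 # gradient magnitude image

# Markers: confident background and confident foreground pixels
markers = np.zeros_like(image, dtype=int)
markers[smoothed < 1.0] = 1
markers[smoothed > 4.0] = 2

labels = watershed(gradient, markers)      # segment along gradient crest lines
print(f"segmented object size: {int((labels == 2).sum())} pixels")
```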

  7. Validity of a structured method of selecting abstracts for a plastic surgical scientific meeting

    NARCIS (Netherlands)

    van der Steen, LPE; Hage, JJ; Kon, M; Monstrey, SJ

    In 1999, the European Association of Plastic Surgeons accepted a structured method to assess and select the abstracts that are submitted for its yearly scientific meeting. The two criteria used to evaluate whether such a selection method is accurate were reliability and validity. The authors

  8. A mixed methods inquiry into the validity of data

    Directory of Open Access Journals (Sweden)

    Vaarst Mette

    2008-07-01

    Full Text Available Abstract Background Research in herd health management solely using a quantitative approach may present major challenges to the interpretation of the results, because the humans involved may have responded to their observations based on previous experiences and own beliefs. This challenge can be met through increased awareness and dialogue between researchers and farmers or other stakeholders about the background for data collection related to management and changes in management. By integrating quantitative and qualitative research methods in a mixed methods research approach, the researchers will improve their understanding of this potential bias of the observed data and farms, which will enable them to obtain more useful results of quantitative analyses. Case description An example is used to illustrate the potentials of combining quantitative and qualitative approaches to herd health related data analyses. The example is based on two studies on bovine metritis. The first study was a quantitative observational study of risk factors for metritis in Danish dairy cows based on data from the Danish Cattle Database. The other study was a semi-structured interview study involving 20 practicing veterinarians with the aim to gain insight into veterinarians' decision making when collecting and processing data related to metritis. Discussion and Evaluation The relations between risk factors and metritis in the first project supported the findings in several other quantitative observational studies; however, the herd incidence risk was highly skewed. There may be simple practical reasons for this, e.g. underreporting and differences in the veterinarians' decision making. Additionally, the interviews in the second project identified several problems with correctness and validity of data regarding the occurrence of metritis because of differences regarding case definitions and thresholds for treatments between veterinarians. Conclusion Studies where

   9. Validated Ultraviolet Spectroscopy method for the Dissolution study of Mycophenolate mofetil immediate release 500 mg tablets

    OpenAIRE

    Surajpal P. Verma; Ozair Alam; Pooja Mullick; Nadeem Siddiqui; Suroor A. Khan

    2008-01-01

    A simple, selective and precise dissolution method was developed and validated for Mycophenolate mofetil immediate release tablets. The method employed 0.1 N HCl (pH 1.2) as the dissolution medium, a volume of 900 mL, and USP Apparatus II (paddle). Detection was made by measuring the absorbance by UV at the λmax of 250 nm. The method shows linearity in the concentration range of 5 μg/mL to 40 μg/mL with r² = 0.999. The method is also validated as per the International Conference on Harmonizatio...

  10. Fatty acid ethyl esters (FAEEs) as markers for alcohol in meconium: method validation and implementation of a screening program for prenatal drug exposure.

    Science.gov (United States)

    Hastedt, Martin; Krumbiegel, Franziska; Gapert, René; Tsokos, Michael; Hartwig, Sven

    2013-09-01

    Alcohol consumption during pregnancy is a widespread problem and can cause severe fetal damage. As the diagnosis of fetal alcohol syndrome is difficult, the implementation of a reliable marker for alcohol consumption during pregnancy into meconium drug screening programs would be invaluable. A previously published gas chromatography mass spectrometry method for the detection of fatty acid ethyl esters (FAEEs) as alcohol markers in meconium was optimized and newly validated for a sample size of 50 mg. This method was applied to 122 cases from a drug-using population. The meconium samples were also tested for common drugs of abuse. In 73 % of the cases, one or more drugs were found. Twenty percent of the samples tested positive for FAEEs at levels indicating significant alcohol exposure. Consequently, alcohol was found to be the third most frequently abused substance within the study group. This re-validated method provides an increase in testing sensitivity, is reliable and easily applicable as part of a drug screening program. It can be used as a non-invasive tool to detect high alcohol consumption in the last trimester of pregnancy. The introduction of FAEEs testing in meconium screening was found to be of particular use in a drug-using population.

  11. Fast CSF MRI for brain segmentation; Cross-validation by comparison with 3D T1-based brain segmentation methods

    DEFF Research Database (Denmark)

    van der Kleij, Lisa A.; de Bresser, Jeroen; Hendrikse, Jeroen

    2018-01-01

    Objective In previous work we have developed a fast sequence that focusses on cerebrospinal fluid (CSF) based on the long T2 of CSF. By processing the data obtained with this CSF MRI sequence, brain parenchymal volume (BPV) and intracranial volume (ICV) can be automatically obtained. The aim...... of this study was to assess the precision of the BPV and ICV measurements of the CSF MRI sequence and to validate the CSF MRI sequence by comparison with 3D T1-based brain segmentation methods. Materials and methods Ten healthy volunteers (2 females; median age 28 years) were scanned (3T MRI) twice......cc) and CSF HR (5 ± 5/4 ± 2 cc) were comparable to FSL HR (9 ± 11/19 ± 23 cc), FSL LR (7 ± 4, 6 ± 5 cc), FreeSurfer HR (5 ± 3/14 ± 8 cc), FreeSurfer LR (9 ± 8, 12 ± 10 cc), and SPM HR (5 ± 3/4 ± 7 cc), and SPM LR (5 ± 4, 5 ± 3 cc). The correlation between the measured volumes

  12. Using digital photography in a clinical setting: a valid, accurate, and applicable method to assess food intake.

    Science.gov (United States)

    Winzer, Eva; Luger, Maria; Schindler, Karin

    2018-06-01

    Regular monitoring of food intake is rarely integrated into clinical routine. Therefore, the aim was to examine the validity, accuracy, and applicability of an appropriate, quick and easy-to-use tool for recording food intake in a clinical setting. Two digital photography methods, the postMeal method (a picture after the meal) and the pre-postMeal method (a picture before and after the meal), and the visual estimation method (plate diagram; PD) were compared against the reference method (weighed food records; WFR). A total of 420 dishes from lunch (7 weeks) were estimated with both photography methods and the visual method. Validity, applicability, accuracy, and precision of the estimation methods, and additionally food waste, macronutrient composition, and energy content were examined. Tests of validity revealed stronger correlations for the photography methods (postMeal: r = 0.971, p < 0.001; pre-postMeal: r = 0.995, p < 0.001) compared to the visual estimation method (r = 0.810; p < 0.001). The pre-postMeal method showed smaller variability (bias < 1 g) and also smaller overestimation and underestimation. This method accurately and precisely estimated portion sizes in all food items. Furthermore, the total food waste was 22% for lunch over the study period. The highest food waste was observed in salads and the lowest in desserts. The pre-postMeal digital photography method is valid, accurate, and applicable for monitoring food intake in a clinical setting, which enables quantitative and qualitative dietary assessment. Thus, nutritional care might be initiated earlier. This method might also be advantageous for quantitative and qualitative evaluation of food waste, with a resultant reduction in costs.

  13. Validation of the Nuclear Design Method for MOX Fuel Loaded LWR Cores

    International Nuclear Information System (INIS)

    Saji, E.; Inoue, Y.; Mori, M.; Ushio, T.

    2001-01-01

    The actual batch loading of mixed-oxide (MOX) fuel in light water reactors (LWRs) is now ready to start in Japan. One of the efforts devoted to realizing this batch loading has been the validation of the nuclear design methods used to calculate the characteristics of MOX-fuel-loaded LWR cores. This paper summarizes the validation work for the applicability of the CASMO-4/SIMULATE-3 in-core fuel management code system to MOX-fuel-loaded LWR cores. This code system is widely used by a number of electric power companies for the core management of their commercial LWRs. The validation work was performed for both boiling water reactor (BWR) and pressurized water reactor (PWR) applications. Each validation consists of two parts: analyses of critical experiments and core tracking calculations of operating plants. For the critical experiments, we have chosen a series of experiments known as the VENUS International Program (VIP), which was performed at the SCK/CEN MOL laboratory in Belgium. VIP consists of both BWR and PWR fuel assembly configurations. As for the core tracking calculations, the operating data of MOX-fuel-loaded BWR and PWR cores in Europe have been utilized

  14. Development and validity of a method for the evaluation of printed education material.

    Directory of Open Access Journals (Sweden)

    Castro MS

    2007-06-01

    Full Text Available Objectives: To develop and study the validity of an instrument for evaluation of Printed Education Materials (PEM); to evaluate the use of acceptability indices; to identify possible influences of professional aspects. Methods: An instrument for PEM evaluation was developed which included three steps: domain identification, item generation and instrument design. An easy-to-read PEM was developed for the education of patients with systemic hypertension and its treatment with hydrochlorothiazide. Construct validity was measured based on previously established errors purposively introduced into the PEM, which served as extreme groups. An acceptability index was applied taking into account the rate of professionals who should approve each item. Participants were 10 physicians (9 men) and 5 nurses (all women). Results: Many professionals identified intentional errors of a crude character. Few participants identified errors that needed more careful evaluation, and no one detected the intentional error that required literature analysis. Physicians considered 95.8% of the items of the PEM acceptable, and nurses 29.2%. The differences between the scorings were statistically significant in 27% of the items. In the overall evaluation, 66.6% were considered acceptable. The analysis of each item revealed a behavioral pattern for each professional group. Conclusions: The use of instruments for evaluation of printed education materials is required and may improve the quality of the PEMs available for patients. The acceptability indices are not always totally correct, nor do they necessarily represent high quality of information. The professional experience, the practice pattern, and perhaps the gender of the reviewers may influence their evaluation. An analysis of the PEM by professionals in communication and drug information, and by patients, should be carried out to improve the quality of the proposed material.

  15. Validation and further development of a novel thermal analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Mathews, E.H.; Shuttleworth, A.G.; Rousseau, P.G. [Pretoria Univ. (South Africa). Dept. of Mechanical Engineering

    1994-12-31

    The design of thermal and energy efficient buildings requires inter alia the investigation of the passive performance, natural ventilation, mechanical ventilation as well as structural and evaporative cooling of the building. Only when these fail to achieve the desired thermal comfort should mechanical cooling systems be considered. Few computer programs have the ability to investigate all these comfort regulating methods at the design stage. The QUICK design program can simulate these options with the exception of mechanical cooling. In this paper, QUICK's applicability is extended to include the analysis of basic air-conditioning systems. Since the design of these systems is based on indoor loads, it was necessary to validate QUICK's load predictions before extending it. This article addresses validation in general and proposes a procedure to establish the efficiency of a program's load predictions. This proposed procedure is used to compare load predictions by the ASHRAE, CIBSE, CARRIER, CHEETAH, BSIMAC and QUICK methods for 46 case studies involving 36 buildings in various climatic conditions. Although significant differences in the results of the various methods were observed, it is concluded that QUICK can be used with the same confidence as the other methods. It was further shown that load prediction programs usually under-estimate the effect of building mass and therefore over-estimate the peak loads. The details for the 46 case studies are available to other researchers for further verification purposes. With the confidence gained in its load predictions, QUICK was extended to include air-conditioning system analysis. The program was then applied to different case studies. It is shown that system size and energy usage can be reduced by more than 60% by using a combination of passive and mechanical cooling systems as well as different control strategies. (author)

  16. Validation of cleaning method for various parts fabricated at a Beryllium facility

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Cynthia M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-12-15

    This study evaluated and documented a cleaning process that is used to clean parts that are fabricated at a beryllium facility at Los Alamos National Laboratory. The purpose of evaluating this cleaning process was to validate and approve it for future use to assure beryllium surface levels are below the Department of Energy’s release limits without the need to sample all parts leaving the facility. Inhaling or coming in contact with beryllium can cause an immune response that can result in an individual becoming sensitized to beryllium, which can then lead to a disease of the lungs called chronic beryllium disease, and possibly lung cancer. Thirty aluminum and thirty stainless steel parts were fabricated on a lathe in the beryllium facility, as well as thirty-two beryllium parts, for the purpose of testing a parts cleaning method that involved the use of ultrasonic cleaners. A cleaning method was created, documented, validated, and approved, to reduce beryllium contamination.

  17. Validation of internal dosimetry protocols based on stochastic method

    International Nuclear Information System (INIS)

    Mendes, Bruno M.; Fonseca, Telma C.F.; Almeida, Iassudara G.; Trindade, Bruno M.; Campos, Tarcisio P.R.

    2015-01-01

    Computational phantoms adapted to Monte Carlo codes have been applied successfully in radiation dosimetry fields. The NRI research group has been developing Internal Dosimetry Protocols - IDPs, addressing distinct methodologies, software and computational human-simulators, to perform internal dosimetry, especially for new radiopharmaceuticals. Validation of the IDPs is critical to ensure the reliability of the simulation results. Intercomparison of data from the literature with those produced by our IDPs is a suitable method for validation. The aim of this study was to validate the IDPs following such an intercomparison procedure. The Golem phantom has been reconfigured to run on MCNP5. The specific absorbed fractions (SAF) for photons at 30, 100 and 1000 keV were simulated based on the IDPs and compared with reference values (RV) published by Zankl and Petoussi-Henss, 1998. The average difference between the SAF obtained in the IDP simulations and the RV was 2.3%. The largest SAF differences were found in situations involving low-energy photons at 30 keV. The adrenals and thyroid, i.e. the lowest-mass organs, had the highest SAF discrepancies from the RV, at 7.2% and 3.8%, respectively. The statistical differences between the SAF obtained with our IDPs and the reference values were considered acceptable at 30, 100 and 1000 keV. We believe that the main reason for the discrepancies found in the lower-mass organs in the IDP runs was our source definition methodology. Improvements in the spatial distribution of the source within the voxels may provide outputs more consistent with the reference values for the lower-mass organs. (author)

  18. Validation of internal dosimetry protocols based on stochastic method

    Energy Technology Data Exchange (ETDEWEB)

    Mendes, Bruno M.; Fonseca, Telma C.F., E-mail: bmm@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Almeida, Iassudara G.; Trindade, Bruno M.; Campos, Tarcisio P.R., E-mail: tprcampos@yahoo.com.br [Universidade Federal de Minas Gerais (DEN/UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2015-07-01

    Computational phantoms adapted to Monte Carlo codes have been applied successfully in radiation dosimetry fields. The NRI research group has been developing Internal Dosimetry Protocols - IDPs, addressing distinct methodologies, software and computational human-simulators, to perform internal dosimetry, especially for new radiopharmaceuticals. Validation of the IDPs is critical to ensure the reliability of the simulation results. Intercomparison of data from the literature with those produced by our IDPs is a suitable method for validation. The aim of this study was to validate the IDPs following such an intercomparison procedure. The Golem phantom has been reconfigured to run on MCNP5. The specific absorbed fractions (SAF) for photons at 30, 100 and 1000 keV were simulated based on the IDPs and compared with reference values (RV) published by Zankl and Petoussi-Henss, 1998. The average difference between the SAF obtained in the IDP simulations and the RV was 2.3%. The largest SAF differences were found in situations involving low-energy photons at 30 keV. The adrenals and thyroid, i.e. the lowest-mass organs, had the highest SAF discrepancies from the RV, at 7.2% and 3.8%, respectively. The statistical differences between the SAF obtained with our IDPs and the reference values were considered acceptable at 30, 100 and 1000 keV. We believe that the main reason for the discrepancies found in the lower-mass organs in the IDP runs was our source definition methodology. Improvements in the spatial distribution of the source within the voxels may provide outputs more consistent with the reference values for the lower-mass organs. (author)

  19. Examination of packaging materials in bakery products: a validated method for detection and quantification

    NARCIS (Netherlands)

    Raamsdonk, van L.W.D.; Pinckaers, V.G.Z.; Vliege, J.J.M.; Egmond, van H.J.

    2012-01-01

    Methods for the detection and quantification of packaging materials are necessary for the control of the prohibition of these materials according to Regulation (EC)767/2009. A method has been developed and validated at RIKILT for bakery products, including sweet bread and raisin bread. This choice

  20. Chemical analysis of solid residue from liquid and solid fuel combustion: Method development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Trkmic, M. [University of Zagreb, Faculty of Mechanical Engineering and Naval Architecturek Zagreb (Croatia); Curkovic, L. [University of Zagreb, Faculty of Chemical Engineering and Technology, Zagreb (Croatia); Asperger, D. [HEP-Proizvodnja, Thermal Power Plant Department, Zagreb (Croatia)

    2012-06-15

    This paper deals with the development and validation of methods for identifying the composition of the solid residue left after liquid and solid fuel combustion in thermal power plant furnaces. The methods were developed for energy dispersive X-ray fluorescence (EDXRF) spectrometer analysis. Because of the different fuels used, the differing composition of the residue, and the location where the solid residue forms, it was necessary to develop two methods. The first method is used for identifying solid residue composition after fuel oil combustion (Method 1), while the second method is used for identifying solid residue composition after the combustion of solid fuels, i.e. coal (Method 2). Method calibration was performed on sets of 12 (Method 1) and 6 (Method 2) certified reference materials (CRM). CRMs and analysis test samples were prepared in pellet form using a hydraulic press. For the purpose of method validation, the linearity, accuracy, precision and specificity were determined, and the measurement uncertainty of the methods was assessed for each analyte separately. The methods were applied in the analysis of real furnace residue samples. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  1. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    Science.gov (United States)

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single-collaborator and multicollaborative study examples are given.
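
    The basic POI statistic is simply the proportion of replicates identified at a given level, usually reported with a binomial confidence interval. The sketch below illustrates this on invented counts; it is not taken from the report.

```python
# Sketch of the basic POI statistic: the proportion of replicates identified
# at a given target concentration, with a Wilson 95% confidence interval.
# Counts are invented, not from the report.
from statsmodels.stats.proportion import proportion_confint

replicates_identified = 27   # hypothetical: positive BIM results at this level
replicates_total = 30        # hypothetical: replicates tested at this level

poi = replicates_identified / replicates_total
low, high = proportion_confint(replicates_identified, replicates_total,
                               alpha=0.05, method="wilson")
print(f"POI = {poi:.2f} (95% CI {low:.2f}-{high:.2f})")
```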

  2. Modelling of the activity system - development of an evaluation method for integrated system validation

    International Nuclear Information System (INIS)

    Norros, Leena; Savioja, Paula

    2004-01-01

    In this paper we present our recent research, which focuses on creating an evaluation method for human-system interfaces of complex systems. The method is intended for use in the validation of modernised nuclear power plant (NPP) control rooms and other complex systems with high reliability requirements. The task in validation is to determine whether the combined human-system functions safely and effectively. This question can be operationalized as the selection of relevant operational features and their appropriate acceptance criteria. Thus, there is a need to ensure that the results of the evaluation can be generalized so that they serve the purpose of integrated system validation. The definition of the appropriate acceptance criteria provides the basis for judging the appropriateness of the performance of the system. We propose that the operational situations and the acceptance criteria should be defined based on modelling of the NPP operation, comprehended as an activity system. We developed a new core-tasks modelling framework. It is a formative modelling approach that combines causal, functional and understanding explanations of system performance. In this paper we reason how modelling can be used as a medium to determine the validity of the emerging control room system. (Author)

  3. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods

    Directory of Open Access Journals (Sweden)

    Alistair Currie

    2011-11-01

    Full Text Available In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches.

  4. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods.

    Science.gov (United States)

    Dozier, Samantha; Brown, Jeffrey; Currie, Alistair

    2011-11-29

    In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches.

  5. Methods and practices for verification and validation of programmable systems

    International Nuclear Information System (INIS)

    Heimbuerger, H.; Haapanen, P.; Pulkkinen, U.

    1993-01-01

    Programmable systems deviate in their properties and behaviour from conventional non-programmable systems to such an extent that their verification and validation for safety-critical applications require new methods and practices. The safety assessment cannot be based on conventional probabilistic methods due to the difficulties in quantifying the reliability of the software and hardware. The reliability estimate of the system must be based on qualitative arguments linked to a conservative claim limit. Due to the uncertainty of the quantitative reliability estimate, other means must be used to gain more assurance about the system safety. Methods and practices based on research done by VTT for STUK are discussed in the paper, as well as the methods applicable in the reliability analysis of software-based safety functions. The most essential concepts and models of quantitative reliability analysis are described. The application of software models in probabilistic safety analysis (PSA) is evaluated. (author). 18 refs

  6. Parallel Resolved Open Source CFD-DEM: Method, Validation and Application

    Directory of Open Access Journals (Sweden)

    A. Hager

    2014-03-01

    Full Text Available In the following paper the authors present a fully parallelized Open Source method for calculating the interaction of immersed bodies and the surrounding fluid. A combination of computational fluid dynamics (CFD) and a discrete element method (DEM) accounts for the physics of both the fluid and the particles. The objects considered are relatively big compared to the cells of the fluid mesh, i.e. they cover several cells each. Thus this fictitious domain method (FDM) is called resolved. The implementation is realized within the Open Source framework CFDEMcoupling (www.cfdem.com), which provides an interface between OpenFOAM® based CFD solvers and the DEM software LIGGGHTS (www.liggghts.com). While both LIGGGHTS and OpenFOAM® were already parallelized, only a recent improvement of the algorithm permits the fully parallel computation of resolved problems. Alongside a detailed description of the method, its implementation and recent improvements, a number of application and validation examples are presented in the scope of this paper.

  7. Reliability and validity of a brief method to assess nociceptive flexion reflex (NFR) threshold.

    Science.gov (United States)

    Rhudy, Jamie L; France, Christopher R

    2011-07-01

    The nociceptive flexion reflex (NFR) is a physiological tool to study spinal nociception. However, NFR assessment can take several minutes and expose participants to repeated suprathreshold stimulations. The 4 studies reported here assessed the reliability and validity of a brief method to assess NFR threshold that uses a single ascending series of stimulations (Peak 1 NFR), by comparing it to a well-validated method that uses 3 ascending/descending staircases of stimulations (Staircase NFR). Correlations between the NFR definitions were high, were on par with test-retest correlations of Staircase NFR, and were not affected by participant sex or chronic pain status. Results also indicated the test-retest reliabilities for the 2 definitions were similar. Using larger stimulus increments (4 mAs) to assess Peak 1 NFR tended to result in higher NFR threshold estimates than using the Staircase NFR definition, whereas smaller stimulus increments (2 mAs) tended to result in lower NFR threshold estimates than the Staircase NFR definition. Neither NFR definition was correlated with anxiety, pain catastrophizing, or anxiety sensitivity. In sum, a single ascending series of electrical stimulations results in a reliable and valid estimate of NFR threshold. However, caution may be warranted when comparing NFR thresholds across studies that differ in the ascending stimulus increments. This brief method to assess NFR threshold is reliable and valid; therefore, it should be useful to clinical pain researchers interested in quickly assessing inter- and intra-individual differences in spinal nociceptive processes. Copyright © 2011 American Pain Society. Published by Elsevier Inc. All rights reserved.

  8. Methods and procedures for the verification and validation of artificial neural networks

    CERN Document Server

    Taylor, Brian J

    2006-01-01

    Neural networks are members of a class of software that have the potential to enable intelligent computational systems capable of simulating characteristics of biological thinking and learning. This volume introduces some of the methods and techniques used for the verification and validation of neural networks and adaptive systems.

  9. Validity Argument for Assessing L2 Pragmatics in Interaction Using Mixed Methods

    Science.gov (United States)

    Youn, Soo Jung

    2015-01-01

    This study investigates the validity of assessing L2 pragmatics in interaction using mixed methods, focusing on the evaluation inference. Open role-plays that are meaningful and relevant to the stakeholders in an English for Academic Purposes context were developed for classroom assessment. For meaningful score interpretations and accurate…

  10. Validity of the remote food photography method against doubly labeled water among minority preschoolers

    Science.gov (United States)

    The aim of this study was to determine the validity of energy intake (EI) estimations made using the remote food photography method (RFPM) compared to the doubly labeled water (DLW) method in minority preschool children in a free-living environment. Seven days of food intake and spot urine samples...

  11. Validation of the Abdominal Pain Index using a revised scoring method.

    Science.gov (United States)

    Laird, Kelsey T; Sherman, Amanda L; Smith, Craig A; Walker, Lynn S

    2015-06-01

    Evaluate the psychometric properties of child- and parent-report versions of the four-item Abdominal Pain Index (API) in children with functional abdominal pain (FAP) and healthy controls, using a revised scoring method that facilitates comparisons of scores across samples and time. Pediatric patients aged 8-18 years with FAP and controls completed the API at baseline (N = 1,967); a subset of their parents (N = 290) completed the API regarding the child's pain. Subsets of patients completed follow-up assessments at 2 weeks (N = 231), 3 months (N = 330), and 6 months (N = 107). Subsets of both patients (N = 389) and healthy controls (N = 172) completed a long-term follow-up assessment (mean age at follow-up = 20.21 years, SD = 3.75). The API demonstrated good concurrent, discriminant, and construct validity, as well as good internal consistency. We conclude that the API, using the revised scoring method, is a useful, reliable, and valid measure of abdominal pain severity. © The Author 2015. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  12. Validation of a same-day real-time PCR method for screening of meat and carcass swabs for Salmonella

    Science.gov (United States)

    2009-01-01

    Background One of the major sources of human Salmonella infections is meat. Therefore, efficient and rapid monitoring of Salmonella in the meat production chain is necessary. Validation of alternative methods is needed to prove that the performance is equal to established methods. Very few of the published PCR methods for Salmonella have been validated in collaborative studies. This study describes a validation including comparative and collaborative trials, based on the recommendations from the Nordic organization for validation of alternative microbiological methods (NordVal) of a same-day, non-commercial real-time PCR method for detection of Salmonella in meat and carcass swabs. Results The comparative trial was performed against a reference method (NMKL-71:5, 1999) using artificially and naturally contaminated samples (60 minced veal and pork meat samples, 60 poultry neck-skins, and 120 pig carcass swabs). The relative accuracy was 99%, relative detection level 100%, relative sensitivity 103% and relative specificity 100%. The collaborative trial included six laboratories testing minced meat, poultry neck-skins, and carcass swabs as un-inoculated samples and samples artificially contaminated with 1–10 CFU/25 g, and 10–100 CFU/25 g. Valid results were obtained from five of the laboratories and used for the statistical analysis. Apart from one of the non-inoculated samples being false positive with PCR for one of the laboratories, no false positive or false negative results were reported. Partly based on results obtained in this study, the method has obtained NordVal approval for analysis of Salmonella in meat and carcass swabs. The PCR method was transferred to a production laboratory and the performance was compared with the BAX Salmonella test on 39 pork samples artificially contaminated with Salmonella. There was no significant difference in the results obtained by the two methods. Conclusion The real-time PCR method for detection of Salmonella in meat

  13. Validation approach for a fast and simple targeted screening method for 75 antibiotics in meat and aquaculture products using LC-MS/MS.

    Science.gov (United States)

    Dubreil, Estelle; Gautier, Sophie; Fourmond, Marie-Pierre; Bessiral, Mélaine; Gaugain, Murielle; Verdon, Eric; Pessel, Dominique

    2017-04-01

    An approach is described to validate a fast and simple targeted screening method for antibiotic analysis in meat and aquaculture products by LC-MS/MS. The strategy of validation was applied for a panel of 75 antibiotics belonging to different families, i.e., penicillins, cephalosporins, sulfonamides, macrolides, quinolones and phenicols. The samples were extracted once with acetonitrile, concentrated by evaporation and injected into the LC-MS/MS system. The approach chosen for the validation was based on the Community Reference Laboratory (CRL) guidelines for the validation of screening qualitative methods. The aim of the validation was to prove sufficient sensitivity of the method to detect all the targeted antibiotics at the level of interest, generally the maximum residue limit (MRL). A robustness study was also performed to test the influence of different factors. The validation showed that the method is valid to detect and identify 73 of the 75 antibiotics studied in meat and aquaculture products at the validation levels.

  14. CONSTRUCT VALIDITY AND SCORING METHODS OF THE WORLD HEALTH ORGANIZATION- HEALTH AND WORK PERFORMANCE QUESTIONNAIRE AMONG WORKERS WITH ARTHRITIS AND RHEUMATOLOGICAL CONDITIONS

    Science.gov (United States)

    AlHeresh, Rawan; LaValley, Michael P.; Coster, Wendy; Keysor, Julie J.

    2017-01-01

    Objective To evaluate construct validity and scoring methods of the World Health Organization Health and Work Performance Questionnaire (HPQ) for people with arthritis. Methods Construct validity was examined through hypothesis testing using the recommended guidelines of the Consensus-based Standards for the selection of health Measurement Instruments (COSMIN). Results The HPQ using the absolute scoring method showed moderate construct validity, as 4 of the 7 hypotheses were met. The HPQ using the relative scoring method had weak construct validity, as only one of the 7 hypotheses was met. Conclusion The absolute scoring method for the HPQ is superior in construct validity to the relative scoring method in assessing work performance among people with arthritis and related rheumatic conditions; however, more research is needed to further explore other psychometric properties of the HPQ. PMID:28598938

  15. Validation and Clinical Evaluation of a Novel Method To Measure Miltefosine in Leishmaniasis Patients Using Dried Blood Spot Sample Collection

    Science.gov (United States)

    Rosing, H.; Hillebrand, M. J. X.; Blesson, S.; Mengesha, B.; Diro, E.; Hailu, A.; Schellens, J. H. M.; Beijnen, J. H.

    2016-01-01

    To facilitate future pharmacokinetic studies of combination treatments against leishmaniasis in remote regions in which the disease is endemic, a simple cheap sampling method is required for miltefosine quantification. The aims of this study were to validate a liquid chromatography-tandem mass spectrometry method to quantify miltefosine in dried blood spot (DBS) samples and to validate its use with Ethiopian patients with visceral leishmaniasis (VL). Since hematocrit (Ht) levels are typically severely decreased in VL patients, returning to normal during treatment, the method was evaluated over a range of clinically relevant Ht values. Miltefosine was extracted from DBS samples using a simple method of pretreatment with methanol, resulting in >97% recovery. The method was validated over a calibration range of 10 to 2,000 ng/ml, and accuracy and precision were within ±11.2% and ≤7.0% (≤19.1% at the lower limit of quantification), respectively. The method was accurate and precise for blood spot volumes between 10 and 30 μl and for Ht levels of 20 to 35%, although a linear effect of Ht levels on miltefosine quantification was observed in the bioanalytical validation. DBS samples were stable for at least 162 days at 37°C. Clinical validation of the method using paired DBS and plasma samples from 16 VL patients showed a median observed DBS/plasma miltefosine concentration ratio of 0.99, with good correlation (Pearson's r = 0.946). Correcting for patient-specific Ht levels did not further improve the concordance between the sampling methods. This successfully validated method to quantify miltefosine in DBS samples was demonstrated to be a valid and practical alternative to venous blood sampling that can be applied in future miltefosine pharmacokinetic studies with leishmaniasis patients, without Ht correction. PMID:26787691

  16. Cross-validation pitfalls when selecting and assessing regression and classification models.

    Science.gov (United States)

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
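
    As a rough illustration of the repeated nested cross-validation idea described above, the Python sketch below nests a grid search inside an outer assessment loop and repeats the whole procedure over several random splits. It is not the authors' cloud-based implementation; the estimator, parameter grid, fold counts and synthetic data are all assumptions chosen for brevity.

    # Illustrative sketch of repeated nested cross-validation for model assessment.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=200, n_features=20, random_state=0)
    param_grid = {"C": [0.1, 1, 10], "gamma": ["scale", 0.01]}

    outer_scores = []
    for repeat in range(5):                                   # repeat the whole nested procedure
        inner = StratifiedKFold(n_splits=5, shuffle=True, random_state=repeat)
        outer = StratifiedKFold(n_splits=5, shuffle=True, random_state=100 + repeat)
        tuned = GridSearchCV(SVC(), param_grid, cv=inner)     # inner loop: parameter tuning
        outer_scores.extend(cross_val_score(tuned, X, y, cv=outer))  # outer loop: assessment

    print("accuracy: mean %.3f, SD %.3f across repeats" % (np.mean(outer_scores), np.std(outer_scores)))

    The spread of the outer-loop scores across repeats is exactly the split-to-split variation the authors argue should be reported alongside the point estimate.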

  17. Triacylglycerol secretion in rats: validation of a tracer method employing radioactive glycerol

    International Nuclear Information System (INIS)

    Bird, M.; Williams, M.A.; Baker, N.

    1984-01-01

    A two-compartment model was developed to analyze the temporal changes in plasma triacylglycerol (TG)-specific radioactivity after injection of [2-³H]glycerol into rats. The analysis, which yielded fractional rate constants of TG secretion, was tested in rats fed diets either adequate or deficient in essential fatty acids (EFA) and containing either glucose, fructose or sucrose as the dietary carbohydrate. The method of analysis appeared valid, first, because of a close agreement between experimental and computer-fitted TG-specific radioactivity curves, and second, because the fractional rate constants obtained were quite similar to fractional rate constants determined previously by the Triton WR-1339 technique in rats maintained on identical diets. The results show that EFA deficiency increased the fractional rate constant of TG secretion 1.7-, 1.8- and 3.3-fold and the rate of TG secretion 1.8-, 1.6- and 1.4-fold when the dietary carbohydrate was glucose, sucrose and fructose, respectively, in comparison with control rats fed diets supplying these same carbohydrates but adequate in EFA. In the latter groups, the rates of plasma TG secretion were in the range of 0.14-0.17 mg/min per 100 g body weight, and the rate of secretion in the fructose-fed rats was only 20% higher than in the glucose-fed rats.
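
    The compartmental analysis above amounts to fitting a rate-constant model to the measured specific-activity curve. The sketch below shows the general shape of such a fit with SciPy; the bi-exponential form, the rate constants and the data points are hypothetical stand-ins, not the two-compartment equations or measurements of the cited study.

    import numpy as np
    from scipy.optimize import curve_fit

    def specific_activity(t, a, k_in, k_out):
        # simple precursor-product shape: label enters the plasma TG pool (k_in)
        # and leaves it by secretion/clearance (k_out)
        return a * (np.exp(-k_out * t) - np.exp(-k_in * t))

    t_min = np.array([5, 10, 20, 30, 45, 60, 90, 120], dtype=float)   # minutes after injection
    sa = np.array([9.0, 14.5, 17.8, 17.2, 14.6, 11.9, 7.6, 4.8])      # dpm/mg TG (invented)

    (a_fit, k_in_fit, k_out_fit), _ = curve_fit(specific_activity, t_min, sa, p0=(30.0, 0.1, 0.02))
    print("fitted fractional rate constant of TG turnover: %.4f per min" % k_out_fit)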

  18. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing.

    Science.gov (United States)

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-02-01

    A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R 2 ), using R 2 as the primary metric of assay agreement. However, the use of R 2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing assays (NGS). NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
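
    A minimal numeric sketch of the Bland-Altman step described above is given below: paired values from a validated assay and an assay under validation are turned into a bias, 95% limits of agreement, and a difference-versus-mean slope that flags proportional error. The paired values are invented placeholders, not NGS data from the study.

    import numpy as np

    reference = np.array([0.05, 0.10, 0.22, 0.35, 0.48, 0.60, 0.75])  # validated method
    candidate = np.array([0.06, 0.11, 0.24, 0.33, 0.50, 0.63, 0.77])  # assay in validation

    diff = candidate - reference
    mean_pair = (candidate + reference) / 2.0

    bias = diff.mean()                                  # constant (systematic) error
    half_width = 1.96 * diff.std(ddof=1)                # 95% limits of agreement
    slope, intercept = np.polyfit(mean_pair, diff, 1)   # nonzero slope suggests proportional error

    print("bias %.4f, limits of agreement [%.4f, %.4f]" % (bias, bias - half_width, bias + half_width))
    print("difference-vs-mean slope %.4f" % slope)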

  19. Content validity across methods of malnutrition assessment in patients with cancer is limited

    NARCIS (Netherlands)

    Sealy, Martine J.; Nijholt, Willemke; Stuiver, Martijn M.; van der Berg, Marit M.; Roodenburg, Jan L. N.; Schans, van der Cees P.; Ottery, Faith D.; Jager-Wittenaar, Harriet

    Objective: To identify malnutrition assessment methods in cancer patients and assess their content validity based on internationally accepted definitions for malnutrition. Study Design and Setting: Systematic review of studies in cancer patients that operationalized malnutrition as a variable,

  20. Content validity across methods of malnutrition assessment in patients with cancer is limited

    NARCIS (Netherlands)

    Sealy, Martine; Nijholt, Willemke; Stuiver, M.M.; van der Berg, M.M.; Roodenburg, Jan; Ottery, Faith D.; van der Schans, Cees; Jager, Harriët

    2016-01-01

    Objective To identify malnutrition assessment methods in cancer patients and assess their content validity based on internationally accepted definitions for malnutrition. Study Design and Setting Systematic review of studies in cancer patients that operationalized malnutrition as a variable,

  1. Testing and Validation of the Dynamic Inertia Measurement Method

    Science.gov (United States)

    Chin, Alexander W.; Herrera, Claudia Y.; Spivey, Natalie D.; Fladung, William A.; Cloutier, David

    2015-01-01

    The Dynamic Inertia Measurement (DIM) method uses a ground vibration test setup to determine the mass properties of an object using information from frequency response functions. Most conventional mass properties testing involves using spin tables or pendulum-based swing tests, which for large aerospace vehicles becomes increasingly difficult and time-consuming, and therefore expensive, to perform. The DIM method has been validated on small test articles but has not been successfully proven on large aerospace vehicles. In response, the National Aeronautics and Space Administration Armstrong Flight Research Center (Edwards, California) conducted mass properties testing on an "iron bird" test article that is comparable in mass and scale to a fighter-type aircraft. The simple two-I-beam design of the "iron bird" was selected to ensure accurate analytical mass properties. Traditional swing testing was also performed to compare the level of effort, amount of resources, and quality of data with the DIM method. The DIM test showed favorable results for the center of gravity and moments of inertia; however, the products of inertia showed disagreement with analytical predictions.

  2. Gradient HPLC method development and validation for Simultaneous estimation of Rosiglitazone and Gliclazide.

    Directory of Open Access Journals (Sweden)

    Uttam Singh Baghel

    2012-10-01

    Full Text Available Objective: The aim of the present work was to develop a gradient RP-HPLC method for simultaneous analysis of rosiglitazone and gliclazide in a tablet dosage form. Method: The chromatographic system was optimized using a Hypersil C18 (250 mm x 4.6 mm, 5 μm) column with potassium dihydrogen phosphate (pH 7.0) and acetonitrile in the ratio of 60:40 as mobile phase, at a flow rate of 1.0 ml/min. Detection was carried out at 225 nm by a SPD-20A Prominence UV/Vis detector. Result: Rosiglitazone and gliclazide were eluted with retention times of 17.36 and 7.06 min, respectively. The Beer-Lambert law was obeyed over the concentration ranges of 5 to 70 μg/ml and 2 to 12 μg/ml for rosiglitazone and gliclazide, respectively. Conclusion: The high recovery and low coefficients of variation confirm the suitability of the method for simultaneous analysis of both drugs in a tablet dosage form. Statistical analysis proves that the method is sensitive and significant for the analysis of rosiglitazone and gliclazide in pure form and in pharmaceutical dosage form without any interference from the excipients. The method was validated in accordance with ICH guidelines. Validation revealed the method is specific, rapid, accurate, precise, reliable, and reproducible.

  3. Influence of Previous Knowledge, Language Skills and Domain-specific Interest on Observation Competency

    Science.gov (United States)

    Kohlhauf, Lucia; Rutke, Ulrike; Neuhaus, Birgit

    2011-10-01

    Many epoch-making biological discoveries (e.g. Darwinian Theory) were based upon observations. Nevertheless, observation is often regarded as `just looking' rather than a basic scientific skill. As observation is one of the main research methods in biological sciences, it must be considered as an independent research method and systematic practice of this method is necessary. Because observation skills form the basis of further scientific methods (e.g. experiments or comparisons) and children from the age of 4 years are able to independently generate questions and hypotheses, it seems possible to foster observation competency at a preschool level. To be able to provide development-adequate individual fostering of this competency, it is first necessary to assess each child's competency. Therefore, drawing on the recent literature, we developed in this study a competency model that was empirically evaluated within learners ( N = 110) from different age groups, from kindergarten to university. In addition, we collected data on language skills, domain-specific interest and previous knowledge to analyse coherence between these skills and observation competency. The study showed as expected that previous knowledge had a high impact on observation competency, whereas the influence of domain-specific interest was nonexistent. Language skills were shown to have a weak influence. By utilising the empirically validated model consisting of three dimensions (`Describing', `Scientific reasoning' and `Interpreting') and three skill levels, it was possible to assess each child's competency level and to develop and evaluate guided play activities to individually foster a child's observation competency.

  4. Numerical Validation of the Delaunay Normalization and the Krylov-Bogoliubov-Mitropolsky Method

    Directory of Open Access Journals (Sweden)

    David Ortigosa

    2014-01-01

    Full Text Available A scalable second-order analytical orbit propagator programme based on modern and classical perturbation methods is being developed. As a first step in the validation and verification of part of our orbit propagator programme, we only consider the perturbation produced by zonal harmonic coefficients in the Earth’s gravity potential, so that it is possible to analyze the behaviour of the mathematical expressions involved in Delaunay normalization and the Krylov-Bogoliubov-Mitropolsky method in depth and determine their limits.

  5. Validation of the quality control method for sodium dicloxacillin in Dicloxen capsules

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Perez Navarro, Maikel; Suarez Perez, Yania

    2014-01-01

    Sodium dicloxacillin is a semisynthetic derivative of the isoxazolyl penicillin group that is available in oral suspension form and in caplets. For the analysis of the raw material and the finished product, high performance liquid chromatography is recommended, but this method is unavailable at the Dicloxen capsule manufacturing laboratory for the routine analysis of the drug. The objective was therefore to develop and validate an ultraviolet spectrophotometric method useful for the quality control of sodium dicloxacillin in Dicloxen capsules

  6. A validation framework for microbial forensic methods based on statistical pattern recognition

    Energy Technology Data Exchange (ETDEWEB)

    Velsko, S P

    2007-11-12

    This report discusses a general approach to validating microbial forensic methods that attempt to simultaneously distinguish among many hypotheses concerning the manufacture of a questioned biological agent sample. It focuses on the concrete example of determining growth medium from chemical or molecular properties of a bacterial agent to illustrate the concepts involved.

  7. Validity of a manual soft tissue profile prediction method following mandibular setback osteotomy.

    Science.gov (United States)

    Kolokitha, Olga-Elpis

    2007-10-01

    The aim of this study was to determine the validity of a manual cephalometric method used for predicting the post-operative soft tissue profiles of patients who underwent mandibular setback surgery and compare it to a computerized cephalometric prediction method (Dentofacial Planner). Lateral cephalograms of 18 adults with mandibular prognathism taken at the end of pre-surgical orthodontics and approximately one year after surgery were used. To test the validity of the manual method the prediction tracings were compared to the actual post-operative tracings. The Dentofacial Planner software was used to develop the computerized post-surgical prediction tracings. Both manual and computerized prediction printouts were analyzed by using the cephalometric system PORDIOS. Statistical analysis was performed by means of t-test. Comparison between manual prediction tracings and the actual post-operative profile showed that the manual method results in more convex soft tissue profiles; the upper lip was found in a more prominent position, upper lip thickness was increased and, the mandible and lower lip were found in a less posterior position than that of the actual profiles. Comparison between computerized and manual prediction methods showed that in the manual method upper lip thickness was increased, the upper lip was found in a more anterior position and the lower anterior facial height was increased as compared to the computerized prediction method. Cephalometric simulation of post-operative soft tissue profile following orthodontic-surgical management of mandibular prognathism imposes certain limitations related to the methods implied. However, both manual and computerized prediction methods remain a useful tool for patient communication.

  8. Classification in hyperspectral images by independent component analysis, segmented cross-validation and uncertainty estimates

    Directory of Open Access Journals (Sweden)

    Beatriz Galindo-Prieto

    2018-02-01

    Full Text Available Independent component analysis combined with various strategies for cross-validation, uncertainty estimation by jack-knifing and estimation of critical Hotelling's T² limits, proposed in this paper, is used for classification purposes in hyperspectral images. To the best of our knowledge, the combined approach of methods used in this paper has not been previously applied to hyperspectral imaging analysis for interpretation and classification in the literature. The data analysis performed here aims to distinguish between four different types of plastics, some of them containing brominated flame retardants, from their near infrared hyperspectral images. The results showed that the combined approach used here can be successfully used for unsupervised classification. A comparison of validation approaches, especially leave-one-out cross-validation and regions-of-interest scheme validation, is also evaluated.
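
    To make the pipeline concrete, the sketch below unfolds a hyperspectral cube into a pixel-by-wavelength matrix and extracts independent components with scikit-learn's FastICA, then assigns each pixel to its dominant component as a crude unsupervised label. The cube is random placeholder data, and the jack-knife uncertainties and Hotelling's T² limits of the paper are not reproduced here.

    import numpy as np
    from sklearn.decomposition import FastICA

    rows, cols, bands = 50, 60, 120
    cube = np.random.default_rng(0).random((rows, cols, bands))      # stand-in for a NIR image

    X = cube.reshape(rows * cols, bands)                             # one spectrum per pixel
    ica = FastICA(n_components=4, max_iter=500, random_state=0)      # e.g. one component per plastic type
    scores = ica.fit_transform(X)                                    # pixel scores on each component

    labels = np.abs(scores).argmax(axis=1).reshape(rows, cols)       # dominant component per pixel
    print("label image:", labels.shape, "class counts:", np.bincount(labels.ravel()))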

  9. Development and validation of a novel, simple, and accurate spectrophotometric method for the determination of lead in human serum.

    Science.gov (United States)

    Shayesteh, Tavakol Heidari; Khajavi, Farzad; Khosroshahi, Abolfazl Ghafuri; Mahjub, Reza

    2016-01-01

    The determination of blood lead levels is the most useful indicator of the amount of lead that is absorbed by the human body. Various methods, like atomic absorption spectroscopy (AAS), have already been used for the detection of lead in biological fluids, but most of these methods rely on complicated, expensive, and highly specialized instruments. In this study, a simple and accurate spectroscopic method for the determination of lead has been developed and applied for the investigation of lead concentration in biological samples. A silica gel column was used to extract lead and eliminate interfering agents in human serum samples. The column was washed with deionized water. The pH was adjusted to the value of 8.2 using phosphate buffer, and then tartrate and cyanide solutions were added as masking agents. The lead content was extracted into the organic phase containing dithizone as a complexing reagent, and the dithizone-Pb(II) complex was formed and confirmed by visible spectrophotometry at 538 nm. The recovery was found to be 84.6%. In order to validate the method, a calibration curve involving the use of various concentration levels was constructed and proven to be linear in the range of 0.01-1.5 μg/ml, with an R² regression coefficient of 0.9968 by statistical analysis of linear model validation. The largest error % values were found to be -5.80 and +11.6% for intra-day and inter-day measurements, respectively. The largest RSD % values were calculated to be 6.54 and 12.32% for intra-day and inter-day measurements, respectively. Further, the limit of detection (LOD) was calculated to be 0.002 μg/ml. The developed method was applied to determine the lead content in the human serum of volunteer miners, and it has been proven that there is no statistically significant difference between the data provided by this novel method and the data obtained from the previously studied AAS.
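
    The calibration arithmetic behind a method like this can be sketched in a few lines: fit absorbance against concentration, back-calculate an unknown, and estimate detection and quantification limits from the residual standard deviation of the fit (the common 3.3·s/slope and 10·s/slope conventions). The concentrations and absorbances below are invented, not the study's data.

    import numpy as np

    conc = np.array([0.01, 0.10, 0.25, 0.50, 0.75, 1.00, 1.50])           # ug/ml
    absorb = np.array([0.012, 0.095, 0.239, 0.482, 0.718, 0.961, 1.440])  # A at 538 nm

    slope, intercept = np.polyfit(conc, absorb, 1)
    resid_sd = np.sqrt(np.sum((absorb - (slope * conc + intercept)) ** 2) / (len(conc) - 2))

    lod = 3.3 * resid_sd / slope             # limit of detection
    loq = 10.0 * resid_sd / slope            # limit of quantification
    unknown = (0.350 - intercept) / slope    # back-calculate a sample absorbance of 0.350

    print("slope %.3f, LOD %.4f ug/ml, LOQ %.4f ug/ml, unknown %.3f ug/ml" % (slope, lod, loq, unknown))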

  10. Development and Statistical Validation of Spectrophotometric Methods for the Estimation of Nabumetone in Tablet Dosage Form

    Directory of Open Access Journals (Sweden)

    A. R. Rote

    2010-01-01

    Full Text Available Three new simple, economic spectrophotometric methods were developed and validated for the estimation of nabumetone in bulk and tablet dosage form. The first method is based on determination of nabumetone at its absorption maximum of 330 nm, the second method uses the area under the curve in the wavelength range of 326-334 nm, and the third method uses the first-order derivative spectrum with a scaling factor of 4. Beer's law was obeyed in the concentration range of 10-30 μg/mL for all three methods. The correlation coefficients were found to be 0.9997, 0.9998 and 0.9998 for the absorption maximum, area under the curve and first-order derivative methods, respectively. Results of analysis were validated statistically and by performing recovery studies. The mean percent recoveries were found satisfactory for all three methods. The developed methods were also compared statistically using one-way ANOVA. The proposed methods have been successfully applied for the estimation of nabumetone in bulk and pharmaceutical tablet dosage form.
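
    The one-way ANOVA comparison mentioned above can be reproduced in miniature as follows; the recovery values for the three methods are hypothetical placeholders, used only to show the shape of the test.

    from scipy import stats

    recovery_amax  = [99.2, 100.1, 98.7, 99.8, 100.4]   # absorption-maximum method
    recovery_auc   = [99.5, 99.9, 100.2, 98.9, 99.6]    # area-under-curve method
    recovery_deriv = [99.0, 100.3, 99.4, 99.7, 100.0]   # first-derivative method

    f_stat, p_value = stats.f_oneway(recovery_amax, recovery_auc, recovery_deriv)
    print("F = %.3f, p = %.3f" % (f_stat, p_value))      # p > 0.05: no significant difference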

  11. Improved numerical algorithm and experimental validation of a system thermal-hydraulic/CFD coupling method for multi-scale transient simulations of pool-type reactors

    International Nuclear Information System (INIS)

    Toti, A.; Vierendeels, J.; Belloni, F.

    2017-01-01

    Highlights: • A system thermal-hydraulic/CFD coupling methodology is proposed for high-fidelity transient flow analyses. • The method is based on domain decomposition and implicit numerical scheme. • A novel interface Quasi-Newton algorithm is implemented to improve stability and convergence rate. • Preliminary validation analyses on the TALL-3D experiment. - Abstract: The paper describes the development and validation of a coupling methodology between the best-estimate system thermal-hydraulic code RELAP5-3D and the CFD code FLUENT, conceived for high fidelity plant-scale safety analyses of pool-type reactors. The computational tool is developed to assess the impact of three-dimensional phenomena occurring in accidental transients such as loss of flow (LOF) in the research reactor MYRRHA, currently in the design phase at the Belgian Nuclear Research Centre, SCK• CEN. A partitioned, implicit domain decomposition coupling algorithm is implemented, in which the coupled domains exchange thermal-hydraulics variables at coupling boundary interfaces. Numerical stability and interface convergence rates are improved by a novel interface Quasi-Newton algorithm, which is compared in this paper with previously tested numerical schemes. The developed computational method has been assessed for validation purposes against the experiment performed at the test facility TALL-3D, operated by the Royal Institute of Technology (KTH) in Sweden. This paper details the results of the simulation of a loss of forced convection test, showing the capability of the developed methodology to predict transients influenced by local three-dimensional phenomena.

  12. Construct Validity and Scoring Methods of the World Health Organization: Health and Work Performance Questionnaire Among Workers With Arthritis and Rheumatological Conditions.

    Science.gov (United States)

    AlHeresh, Rawan; LaValley, Michael P; Coster, Wendy; Keysor, Julie J

    2017-06-01

    To evaluate construct validity and scoring methods of the World Health Organization Health and Work Performance Questionnaire (HPQ) for people with arthritis. Construct validity was examined through hypothesis testing using the recommended guidelines of the Consensus-based Standards for the selection of health Measurement Instruments (COSMIN). The HPQ using the absolute scoring method showed moderate construct validity, as four of the seven hypotheses were met. The HPQ using the relative scoring method had weak construct validity, as only one of the seven hypotheses was met. The absolute scoring method for the HPQ is superior in construct validity to the relative scoring method in assessing work performance among people with arthritis and related rheumatic conditions; however, more research is needed to further explore other psychometric properties of the HPQ.

  13. A method of knowledge base verification and validation for nuclear power plants expert systems

    International Nuclear Information System (INIS)

    Kwon, Il Won

    1996-02-01

    The adoption of expert systems mainly as operator supporting systems is becoming increasingly popular as the control algorithms of system become more and more sophisticated and complicated. As a result of this popularity, a large number of expert systems are developed. The nature of expert systems, however, requires that they be verified and validated carefully and that detailed methodologies for their development be devised. Therefore, it is widely noted that assuring the reliability of expert systems is very important, especially in nuclear industry, and it is also recognized that the process of verification and validation is an essential part of reliability assurance for these systems. Research and practices have produced numerous methods for expert system verification and validation (V and V) that suggest traditional software and system approaches to V and V. However, many approaches and methods for expert system V and V are partial, unreliable, and not uniform. The purpose of this paper is to present a new approach to expert system V and V, based on Petri nets, providing a uniform model. We devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for checking incorrectness, inconsistency, and incompleteness in a knowledge base. We also suggest heuristic analysis for validation process to show that the reasoning path is correct

  14. A Validated Method for the Quality Control of Andrographis paniculata Preparations.

    Science.gov (United States)

    Karioti, Anastasia; Timoteo, Patricia; Bergonzi, Maria Camilla; Bilia, Anna Rita

    2017-10-01

    Andrographis paniculata is a herbal drug of Asian traditional medicine largely employed for the treatment of several diseases. Recently, it has been introduced in Europe for the prophylactic and symptomatic treatment of common cold and as an ingredient of dietary supplements. The active principles are diterpenes with andrographolide as the main representative. In the present study, an analytical protocol was developed for the determination of the main constituents in the herb and preparations of A. paniculata . Three different extraction protocols (methanol extraction using a modified Soxhlet procedure, maceration under ultrasonication, and decoction) were tested. Ultrasonication achieved the highest content of analytes. HPLC conditions were optimized in terms of solvent mixtures, time course, and temperature. A reversed phase C18 column eluted with a gradient system consisting of acetonitrile and acidified water and including an isocratic step at 30 °C was used. The HPLC method was validated for linearity, limits of quantitation and detection, repeatability, precision, and accuracy. The overall method was validated for precision and accuracy over at least three different concentration levels. Relative standard deviation was less than 1.13%, whereas recovery was between 95.50% and 97.19%. The method also proved to be suitable for the determination of a large number of commercial samples and was proposed to the European Pharmacopoeia for the quality control of Andrographidis herba. Georg Thieme Verlag KG Stuttgart · New York.

  15. Determination of methylmercury in marine sediment samples: Method validation and occurrence data

    Energy Technology Data Exchange (ETDEWEB)

    Carrasco, Luis; Vassileva, Emilia, E-mail: e.vasileva-veleva@iaea.org

    2015-01-01

    Highlights: • A method for MeHg determination at trace level in marine sediments is completely validated. • Validation is performed according to ISO-17025 and Eurachem guidelines. • The extraction efficiency of four sample preparation procedures is evaluated. • The uncertainty budget is used as a tool for evaluation of main uncertainty contributors. • Comparison with independent methods yields good agreement within stated uncertainty. - Abstract: The determination of methylmercury (MeHg) in sediment samples is a difficult task due to the extremely low MeHg/THg (total mercury) ratio and species interconversion. Here, we present the method validation of a cost-effective fit-for-purpose analytical procedure for the measurement of MeHg in sediments, which is based on aqueous phase ethylation, followed by purge and trap and hyphenated gas chromatography–pyrolysis–atomic fluorescence spectrometry (GC–Py–AFS) separation and detection. Four different extraction techniques, namely acid and alkaline leaching followed by solvent extraction and evaporation, microwave-assisted extraction with 2-mercaptoethanol, and acid leaching, solvent extraction and back extraction into sodium thiosulfate, were examined regarding their potential to selectively extract MeHg from estuarine sediment IAEA-405 certified reference material (CRM). The procedure based on acid leaching with HNO₃/CuSO₄, solvent extraction and back extraction into Na₂S₂O₃ yielded the highest extraction recovery, i.e., 94 ± 3%, and offered the possibility to perform the extraction of a large number of samples in a short time, by eliminating the evaporation step. The artifact formation of MeHg was evaluated by high performance liquid chromatography coupled to inductively coupled plasma mass spectrometry (HPLC–ICP–MS), using isotopically enriched Me²⁰¹Hg and ²⁰²Hg, and it was found to be nonexistent. A full validation approach in line with ISO 17025 and

  16. Determination of methylmercury in marine sediment samples: Method validation and occurrence data

    International Nuclear Information System (INIS)

    Carrasco, Luis; Vassileva, Emilia

    2015-01-01

    Highlights: • A method for MeHg determination at trace level in marine sediments is completely validated. • Validation is performed according to ISO-17025 and Eurachem guidelines. • The extraction efficiency of four sample preparation procedures is evaluated. • The uncertainty budget is used as a tool for evaluation of main uncertainty contributors. • Comparison with independent methods yields good agreement within stated uncertainty. - Abstract: The determination of methylmercury (MeHg) in sediment samples is a difficult task due to the extremely low MeHg/THg (total mercury) ratio and species interconversion. Here, we present the method validation of a cost-effective fit-for-purpose analytical procedure for the measurement of MeHg in sediments, which is based on aqueous phase ethylation, followed by purge and trap and hyphenated gas chromatography–pyrolysis–atomic fluorescence spectrometry (GC–Py–AFS) separation and detection. Four different extraction techniques, namely acid and alkaline leaching followed by solvent extraction and evaporation, microwave-assisted extraction with 2-mercaptoethanol, and acid leaching, solvent extraction and back extraction into sodium thiosulfate, were examined regarding their potential to selectively extract MeHg from estuarine sediment IAEA-405 certified reference material (CRM). The procedure based on acid leaching with HNO₃/CuSO₄, solvent extraction and back extraction into Na₂S₂O₃ yielded the highest extraction recovery, i.e., 94 ± 3%, and offered the possibility to perform the extraction of a large number of samples in a short time, by eliminating the evaporation step. The artifact formation of MeHg was evaluated by high performance liquid chromatography coupled to inductively coupled plasma mass spectrometry (HPLC–ICP–MS), using isotopically enriched Me²⁰¹Hg and ²⁰²Hg, and it was found to be nonexistent. A full validation approach in line with ISO 17025 and Eurachem guidelines was followed

  17. Estimating Rooftop Suitability for PV: A Review of Methods, Patents, and Validation Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Melius, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, R. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ong, S. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    A number of methods have been developed using remote sensing data to estimate rooftop area suitable for the installation of photovoltaics (PV) at various geospatial resolutions. This report reviews the literature and patents on methods for estimating rooftop-area appropriate for PV, including constant-value methods, manual selection methods, and GIS-based methods. This report also presents NREL's proposed method for estimating suitable rooftop area for PV using Light Detection and Ranging (LiDAR) data in conjunction with a GIS model to predict areas with appropriate slope, orientation, and sunlight. NREL's method is validated against solar installation data from New Jersey, Colorado, and California to compare modeled results to actual on-the-ground measurements.

  18. Lodenafil carbonate tablets: optimization and validation of a capillary zone electrophoresis method

    OpenAIRE

    Codevilla, Cristiane F; Ferreira, Pâmela Cristina L; Sangoi, Maximiliano S; Fröehlich, Pedro Eduardo; Bergold, Ana Maria

    2012-01-01

    A simple capillary zone electrophoresis (CZE) method was developed and validated for the analysis of lodenafil carbonate in tablets. Response surface methodology was used for optimization of the pH and concentration of the buffer, applied voltage and temperature. The method employed 50 mmol L-1 borate buffer at pH 10 as background electrolyte with an applied voltage of 15 kV. The separation was carried out in a fused-silica capillary maintained at 32.5 ºC and the detection wavelength was 214 ...

  19. Validation of high-performance liquid chromatography (HPLC method for quantitative analysis of histamine in fish and fishery products

    Directory of Open Access Journals (Sweden)

    B.K.K.K. Jinadasa

    2016-12-01

    Full Text Available A high-performance liquid chromatography method is described for the quantitative determination and validation of histamine in fish and fishery product samples. Histamine is extracted from fish/fishery products by homogenizing with trichloroacetic acid and separated with Amberlite CG-50 resin and a C18-ODS Hypersil reversed phase column at ambient temperature (25°C). Linear standard curves with high correlation coefficients were obtained. An isocratic elution program was used; the total elution time was 10 min. The method was validated by assessing the following aspects: specificity, repeatability, reproducibility, linearity, recovery, limit of detection, limit of quantification and uncertainty. The validation parameters were in good agreement with the method requirements, and the method is a useful tool for determining histamine in fish and fishery products.

  20. Validated method for the analysis of goji berry, a rich source of zeaxanthin dipalmitate.

    Science.gov (United States)

    Karioti, Anastasia; Bergonzi, Maria Camilla; Vincieri, Franco F; Bilia, Anna Rita

    2014-12-31

    In the present study an HPLC-DAD method was developed for the determination of the main carotenoid, zeaxanthin dipalmitate, in the fruits of Lycium barbarum. The aim was to develop and optimize an extraction protocol to allow fast, exhaustive, and repeatable extraction, suitable for labile carotenoid content. Use of liquid N2 allowed the grinding of the fruit. A step of ultrasonication with water removed efficiently the polysaccharides and enabled the exhaustive extraction of carotenoids by hexane/acetone 50:50. The assay was fast and simple and permitted the quality control of a large number of commercial samples including fruits, juices, and a jam. The HPLC method was validated according to ICH guidelines and satisfied the requirements. Finally, the overall method was validated for precision (% RSD ranging between 3.81 and 4.13) and accuracy at three concentration levels. The recovery was between 94 and 107% with RSD values <2%, within the acceptable limits, especially if the difficulty of the matrix is taken into consideration.

  1. A validated HPTLC method for estimation of moxifloxacin hydrochloride in tablets.

    Science.gov (United States)

    Dhillon, Vandana; Chaudhary, Alok Kumar

    2010-10-01

    A simple HPTLC method with high accuracy, precision and reproducibility was developed for the routine estimation of moxifloxacin hydrochloride in tablets available on the market and was validated for various parameters according to ICH guidelines. Moxifloxacin hydrochloride was estimated at 292 nm by densitometry using silica gel 60 F254 as the stationary phase and a premix of methylene chloride: methanol: strong ammonia solution and acetonitrile (10:10:5:10) as the mobile phase. The method was found to be linear in the range of 9-54 ng with a correlation coefficient >0.99. The regression equation was: AUC = 65.57 × (amount in nanograms) + 163 (r² = 0.9908).
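
    As a small worked example, the reported regression equation can be inverted to read an amount off a measured peak area; the AUC value used below is arbitrary.

    def amount_from_auc(auc, slope=65.57, intercept=163.0):
        # invert AUC = 65.57 * amount(ng) + 163
        return (auc - intercept) / slope

    print("%.1f ng" % amount_from_auc(2500.0))   # an AUC of 2500 corresponds to about 35.6 ng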

  2. Validation of the k0 standardization method in neutron activation analysis

    International Nuclear Information System (INIS)

    Kubesova, Marie

    2009-01-01

    The goal of this work was to validate the k0 standardization method in neutron activation analysis for use by the Nuclear Physics Institute's NAA Laboratory. The precision and accuracy of the method were examined by using two types of reference materials: the one type comprised a set of synthetic materials and served to check the implementation of k0 standardization, the other type consisted of matrix NIST SRMs comprising various different matrices. In general, a good agreement was obtained between the results of this work and the certified values, giving evidence of the accuracy of our results. In addition, the limits were evaluated for 61 elements

  3. Review of seismic tests for qualification of components and validation of methods

    International Nuclear Information System (INIS)

    Buland, P.; Gantenbein, F.; Gibert, R.J.; Hoffmann, A.; Queval, J.C.

    1988-01-01

    Seismic tests have been performed at CEA-DEMT for many years in order to demonstrate the qualification of components and to provide experimental validation of the calculation methods used for the seismic design of components. The paper presents examples of these two types of tests, a description of the existing facilities and details about the new facility TAMARIS under construction. (author)

  4. Review of seismic tests for qualification of components and validation of methods

    Energy Technology Data Exchange (ETDEWEB)

    Buland, P; Gantenbein, F; Gibert, R J; Hoffmann, A; Queval, J C [CEA-CEN SACLAY-DEMT, Gif sur Yvette-Cedex (France)

    1988-07-01

    Seismic tests have been performed at CEA-DEMT for many years in order to demonstrate the qualification of components and to provide experimental validation of the calculation methods used for the seismic design of components. The paper presents examples of these two types of tests, a description of the existing facilities and details about the new facility TAMARIS under construction. (author)

  5. Validation of analytical methods for the quality control of Naproxen suppositories

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar; Hernandez Contreras, Orestes Yuniel

    2011-01-01

    The analysis methods that will be used for the quality control of the future Cuban-made Naproxen suppositories for adults and children were developed for the first time in this paper. A method based on direct ultraviolet spectrophotometry was put forward, which proved to be specific, linear, accurate and precise for the quality control of Naproxen suppositories, taking into account the presence of chromophore groups in the structure. Likewise, the direct semi-aqueous acid-base volumetric method aimed at the quality control of the Naproxen raw material was modified and adapted to the quality control of suppositories. On the basis of the validation process, the adequate specificity of this method with respect to the formulation components was demonstrated, as well as its linearity, accuracy and precision in the 1-3 mg/ml range. The final results were compared and no statistically significant differences among the replicates for each dose were found for either method; therefore, both may be used in the quality control of Naproxen suppositories

  6. A statistical method (cross-validation) for bone loss region detection after spaceflight

    Science.gov (United States)

    Zhao, Qian; Li, Wenjun; Li, Caixia; Chu, Philip W.; Kornak, John; Lang, Thomas F.

    2010-01-01

    Astronauts experience bone loss after long spaceflight missions. Identifying specific regions that undergo the greatest losses (e.g. the proximal femur) could reveal information about the processes of bone loss in disuse and disease. Methods for detecting such regions, however, remain an open problem. This paper focuses on statistical methods to detect such regions. We perform statistical parametric mapping to get t-maps of changes in images, and propose a new cross-validation method to select an optimum suprathreshold for forming clusters of pixels. Once these candidate clusters are formed, we use permutation testing of longitudinal labels to derive significant changes. PMID:20632144
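
    A drastically simplified sketch of the permutation idea follows: the sign of each subject's longitudinal change is flipped at random to build a null distribution for the mean change. The real method operates on voxel-wise t-maps and suprathreshold cluster sizes; the change values here are invented.

    import numpy as np

    rng = np.random.default_rng(0)
    change = np.array([-3.1, -2.4, -4.0, -1.2, -2.8, -3.5, -0.9, -2.2])  # % change, pre vs post

    observed = change.mean()
    n_perm = 10000
    null = np.empty(n_perm)
    for i in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=change.size)   # permute the longitudinal labels
        null[i] = (signs * change).mean()

    p_value = np.mean(np.abs(null) >= abs(observed))
    print("observed mean change %.2f%%, permutation p = %.4f" % (observed, p_value))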

  7. Two Validated HPLC Methods for the Quantification of Alizarin and other Anthraquinones in Rubia tinctorum Cultivars

    NARCIS (Netherlands)

    Derksen, G.C.H.; Lelyveld, G.P.; Beek, van T.A.; Capelle, A.; Groot, de Æ.

    2004-01-01

    Direct and indirect HPLC-UV methods for the quantitative determination of anthraquinones in dried madder root have been developed, validated and compared. In the direct method, madder root was extracted twice with refluxing ethanol-water. This method allowed the determination of the two major native

  8. Ecological content validation of the Information Assessment Method for parents (IAM-parent): A mixed methods study.

    Science.gov (United States)

    Bujold, M; El Sherif, R; Bush, P L; Johnson-Lafleur, J; Doray, G; Pluye, P

    2018-02-01

    This mixed methods study content validated the Information Assessment Method for parents (IAM-parent) that allows users to systematically rate and comment on online parenting information. Quantitative data and results: 22,407 IAM ratings were collected; of the initial 32 items, descriptive statistics showed that 10 had low relevance. Qualitative data and results: IAM-based comments were collected, and 20 IAM users were interviewed (maximum variation sample); the qualitative data analysis assessed the representativeness of IAM items, and identified items with problematic wording. Researchers, the program director, and Web editors integrated quantitative and qualitative results, which led to a shorter and clearer IAM-parent. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. A systematic review of validated methods to capture acute bronchospasm using administrative or claims data.

    Science.gov (United States)

    Sharifi, Mona; Krishanswami, Shanthi; McPheeters, Melissa L

    2013-12-30

    To identify and assess billing, procedural, or diagnosis code, or pharmacy claim-based algorithms used to identify acute bronchospasm in administrative and claims databases. We searched the MEDLINE database from 1991 to September 2012 using controlled vocabulary and key terms related to bronchospasm, wheeze and acute asthma. We also searched the reference lists of included studies. Two investigators independently assessed the full text of studies against pre-determined inclusion criteria. Two reviewers independently extracted data regarding participant and algorithm characteristics. Our searches identified 677 citations of which 38 met our inclusion criteria. In these 38 studies, the most commonly used ICD-9 code was 493.x. Only 3 studies reported any validation methods for the identification of bronchospasm, wheeze or acute asthma in administrative and claims databases; all were among pediatric populations and only 2 offered any validation statistics. Some of the outcome definitions utilized were heterogeneous and included other disease based diagnoses, such as bronchiolitis and pneumonia, which are typically of an infectious etiology. One study offered the validation of algorithms utilizing Emergency Department triage chief complaint codes to diagnose acute asthma exacerbations with ICD-9 786.07 (wheezing) revealing the highest sensitivity (56%), specificity (97%), PPV (93.5%) and NPV (76%). There is a paucity of studies reporting rigorous methods to validate algorithms for the identification of bronchospasm in administrative data. The scant validated data available are limited in their generalizability to broad-based populations. Copyright © 2013 Elsevier Ltd. All rights reserved.
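
    For readers unfamiliar with the validation statistics quoted above, the sketch below derives sensitivity, specificity, PPV and NPV from a 2x2 table of algorithm results against a reference diagnosis. The counts are invented and merely chosen to land near the quoted figures; they are not from the cited study.

    def diagnostic_metrics(tp, fp, fn, tn):
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        ppv = tp / (tp + fp)          # positive predictive value
        npv = tn / (tn + fn)          # negative predictive value
        return sensitivity, specificity, ppv, npv

    sens, spec, ppv, npv = diagnostic_metrics(tp=56, fp=4, fn=44, tn=139)
    print("sensitivity %.0f%%, specificity %.0f%%, PPV %.1f%%, NPV %.0f%%"
          % (100 * sens, 100 * spec, 100 * ppv, 100 * npv))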

  10. Method validation for control determination of mercury in fresh fish and shrimp samples by solid sampling thermal decomposition/amalgamation atomic absorption spectrometry.

    Science.gov (United States)

    Torres, Daiane Placido; Martins-Teixeira, Maristela Braga; Cadore, Solange; Queiroz, Helena Müller

    2015-01-01

    A method for the determination of total mercury in fresh fish and shrimp samples by solid sampling thermal decomposition/amalgamation atomic absorption spectrometry (TDA AAS) has been validated following international foodstuff protocols in order to fulfill the Brazilian National Residue Control Plan. The experimental parameters had been previously studied and optimized according to specific legislation on validation and inorganic contaminants in foodstuff. Linearity, sensitivity, specificity, detection and quantification limits, precision (repeatability and within-laboratory reproducibility), robustness as well as accuracy of the method were evaluated. Linearity of response was satisfactory for the two concentration ranges available on the TDA AAS equipment, between approximately 25.0 and 200.0 μg kg(-1) (square regression) and 250.0 and 2000.0 μg kg(-1) (linear regression) of mercury. The residuals for both ranges were homoscedastic and independent, with normal distribution. Correlation coefficients obtained for these ranges were higher than 0.995. The limit of quantification (LOQ) and the limit of detection of the method (LDM), based on the signal standard deviation (SD) for a low-mercury sample, were 3.0 and 1.0 μg kg(-1), respectively. Repeatability of the method was better than 4%. Within-laboratory reproducibility achieved a relative SD better than 6%. Robustness of the current method was evaluated and indicated sample mass as a significant factor. Accuracy (assessed as analyte recovery) was calculated on the basis of the repeatability and ranged from 89% to 99%. The obtained results showed the suitability of the present method for direct mercury measurement in fresh fish and shrimp samples and the importance of monitoring the analysis conditions for food control purposes. Additionally, the competence of this method was recognized by accreditation under the standard ISO/IEC 17025.

  11. A simple, rapid and validated high-performance liquid chromatography method suitable for clinical measurements of human mercaptalbumin and non-mercaptalbumin.

    Science.gov (United States)

    Yasukawa, Keiko; Shimosawa, Tatsuo; Okubo, Shigeo; Yatomi, Yutaka

    2018-01-01

    Background Human mercaptalbumin and human non-mercaptalbumin have been reported as markers for various pathological conditions, such as kidney and liver diseases. These markers play important roles in redox regulations throughout the body. Despite the recognition of these markers in various pathophysiologic conditions, the measurements of human mercaptalbumin and non-mercaptalbumin have not been popular because of the technical complexity and long measurement time of conventional methods. Methods Based on previous reports, we explored the optimal analytical conditions for a high-performance liquid chromatography method using an anion-exchange column packed with a hydrophilic polyvinyl alcohol gel. The method was then validated using performance tests as well as measurements of various patients' serum samples. Results We successfully established a reliable high-performance liquid chromatography method with an analytical time of only 12 min per test. The repeatability (within-day variability) and reproducibility (day-to-day variability) were 0.30% and 0.27% (CV), respectively. A very good correlation was obtained with the results of the conventional method. Conclusions A practical method for the clinical measurement of human mercaptalbumin and non-mercaptalbumin was established. This high-performance liquid chromatography method is expected to be a powerful tool enabling the expansion of clinical usefulness and ensuring the elucidation of the roles of albumin in redox reactions throughout the human body.

  12. Development and Validation of Liquid Chromatographic Method for Estimation of Naringin in Nanoformulation

    Directory of Open Access Journals (Sweden)

    Kranti P. Musmade

    2014-01-01

    Full Text Available A simple, precise, accurate, rapid, and sensitive reverse phase high performance liquid chromatography (RP-HPLC) method with UV detection has been developed and validated for quantification of naringin (NAR) in a novel pharmaceutical formulation. NAR is a polyphenolic flavonoid present in most citrus plants and has a variety of pharmacological activities. Method optimization was carried out by considering various parameters such as the effect of pH and column. The analyte was separated by employing a C18 (250.0 × 4.6 mm, 5 μm) column at ambient temperature in isocratic conditions using phosphate buffer pH 3.5: acetonitrile (75:25% v/v) as mobile phase pumped at a flow rate of 1.0 mL/min. UV detection was carried out at 282 nm. The developed method was validated according to ICH guidelines Q2(R1). The method was found to be precise and accurate on statistical evaluation with a linearity range of 0.1 to 20.0 μg/mL for NAR. The intra- and interday precision studies showed good reproducibility with coefficients of variation (CV) less than 1.0%. The mean recovery of NAR was found to be 99.33 ± 0.16%. The proposed method was found to be highly accurate, sensitive, and robust. The proposed liquid chromatographic method was successfully employed for the routine analysis of said compound in the developed novel nanopharmaceuticals. The presence of excipients did not show any interference on the determination of NAR, indicating method specificity.

  13. Seeking a Valid Gold Standard for an Innovative, Dialect-Neutral Language Test

    Science.gov (United States)

    Pearson, Barbara Zurer; Jackson, Janice E.; Wu, Haotian

    2014-01-01

    Purpose: In this study, the authors explored alternative gold standards to validate an innovative, dialect-neutral language assessment. Method: Participants were 78 African American children, ages 5;0 (years;months) to 6;11. Twenty participants had previously been identified as having language impairment. The Diagnostic Evaluation of Language…

  14. Contribution to the validation of thermal ratchetting prevision methods in metallic structures; Contribution a la validation des methodes de prevision du rochet thermique dans les structures metalliques

    Energy Technology Data Exchange (ETDEWEB)

    Rakotovelo, A.M

    1998-03-01

    This work concerns the assessment of the steady state in metallic structures subjected to cyclic thermomechanical loadings in a biaxial stress state. The effect of short-time mechanical overloads is also investigated. The first chapter is devoted to a bibliographic survey of the behaviour of materials and structures in cyclic plasticity; works relating to both the experimental and the numerical aspects of steady state assessment for such structures are presented. The experimental part of the study is presented in the second chapter. An experimental device was built in order to apply tension and torsion forces combined with cyclic thermal loading, and a series of tests was then carried out, some of which included overloads in tension or torsion. The last chapter describes the numerical calculations using different models (linear isotropic hardening, linear kinematic hardening and the elasto-viscoplastic Chaboche model) and the application of simplified methods for ratchetting assessment in structures. Two categories of methods were considered: the first is based on an elastic analysis (Bree's diagram, the 3 Sm rule, the efficiency rule) and the second combines elastic analysis with an elastoplastic analysis of the first cycle (Gatt's and Taleb's methods). The results of this study have made it possible: to validate, in the biaxial stress state, an expression which takes into account the effect of short-time mechanical overloads; to test the ability of the considered models to describe the evolution of the structure during the first cycles and to take into account the effect of short-time overloads, among which the elastoplastic Chaboche model appears to be the most accurate; and to assess some simplified methods, certain of which, based only on elastic analysis (Bree's diagram and the efficiency rule), seem not

  15. In Vitro Dissolution Profile of Dapagliflozin: Development, Method Validation, and Analysis of Commercial Tablets

    Directory of Open Access Journals (Sweden)

    Rafaela Zielinski Cavalheiro de Meira

    2017-01-01

    Full Text Available Dapagliflozin was the first of its class (inhibitors of sodium-glucose cotransporter) to be approved in Europe, the USA, and Brazil. As the drug was recently approved, there is a need for research on analytical methods, including dissolution studies, for the quality evaluation and assurance of tablets. The dissolution methodology was developed with apparatus II (paddle) in 900 mL of medium (simulated gastric fluid, pH 1.2), temperature set at 37±0.5°C, and stirring speed of 50 rpm. For the quantification, a spectrophotometric (λ=224 nm) method was developed and validated. In validation studies, the method proved to be specific and linear in the range from 0.5 to 15 μg·mL−1 (r²=0.998). The precision showed results with RSD values lower than 2%. The recovery of 80.72, 98.47, and 119.41% proved the accuracy of the method. Through a systematic approach applying a 2³ factorial design, the robustness of the method was confirmed (p>0.05). The studies of commercial tablets containing 5 or 10 mg demonstrated that they could be considered similar through f1, f2, and dissolution efficiency analyses. Also, the developed method can be used for the quality evaluation of dapagliflozin tablets and can be considered as a scientific basis for future official pharmacopoeial methods.
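
    The f1 (difference) and f2 (similarity) factors mentioned above follow standard formulas, sketched below for two hypothetical percent-dissolved profiles; the dissolution values are invented, not those of the commercial tablets tested.

    import numpy as np

    def f1_f2(reference, test):
        r, t = np.asarray(reference, float), np.asarray(test, float)
        f1 = 100.0 * np.abs(r - t).sum() / r.sum()
        f2 = 50.0 * np.log10(100.0 / np.sqrt(1.0 + np.mean((r - t) ** 2)))
        return f1, f2

    ref_profile  = [35, 58, 74, 86, 93, 97]   # % dissolved at successive time points
    test_profile = [33, 55, 72, 88, 94, 96]

    f1, f2 = f1_f2(ref_profile, test_profile)
    print("f1 = %.1f (similar if <15), f2 = %.1f (similar if >50)" % (f1, f2))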

  16. Laparoscopy After Previous Laparotomy

    Directory of Open Access Journals (Sweden)

    Zulfo Godinjak

    2006-11-01

    Full Text Available Following abdominal surgery, extensive adhesions often occur and they can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered to be a contraindication for laparoscopy. The aim of this study is to show that insertion of a Veress needle in the region of the umbilicus is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we have performed 144 laparoscopic operations in patients who had previously undergone one or two laparotomies. Pathology of the digestive system, genital organs, Cesarean section or abdominal war injuries were the most common causes of previous laparotomy. During those operations, or while entering the abdominal cavity, we did not experience any complications, while in 7 patients we performed conversion to laparotomy following the diagnostic laparoscopy. In all patients, insertion of the Veress needle and trocar insertion in the umbilical region was performed, namely a technique of closed laparoscopy. In no patient were adhesions found in the region of the umbilicus, and no abdominal organs were injured.

  17. Evaluation of the confusion matrix method in the validation of an automated system for measuring feeding behaviour of cattle.

    Science.gov (United States)

    Ruuska, Salla; Hämäläinen, Wilhelmiina; Kajava, Sari; Mughal, Mikaela; Matilainen, Pekka; Mononen, Jaakko

    2018-03-01

    The aim of the present study was to evaluate empirically confusion matrices in device validation. We compared the confusion matrix method to linear regression and error indices in the validation of a device measuring feeding behaviour of dairy cattle. In addition, we studied how to extract additional information on classification errors with confusion probabilities. The data consisted of 12 h behaviour measurements from five dairy cows; feeding and other behaviour were detected simultaneously with a device and from video recordings. The resulting 216 000 pairs of classifications were used to construct confusion matrices and calculate performance measures. In addition, hourly durations of each behaviour were calculated and the accuracy of measurements was evaluated with linear regression and error indices. All three validation methods agreed when the behaviour was detected very accurately or inaccurately. Otherwise, in the intermediate cases, the confusion matrix method and error indices produced relatively concordant results, but the linear regression method often disagreed with them. Our study supports the use of confusion matrix analysis in validation since it is robust to any data distribution and type of relationship, it makes a stringent evaluation of validity, and it offers extra information on the type and sources of errors. Copyright © 2018 Elsevier B.V. All rights reserved.
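
    In outline, the confusion-matrix evaluation cross-tabulates the device classification against the video observation for every time unit and derives agreement measures from the four cell counts. The tiny label sequences below are invented stand-ins for the 216,000 real classification pairs.

    import numpy as np

    video  = np.array(["feed", "feed", "other", "feed", "other", "other", "feed", "other"])
    device = np.array(["feed", "other", "other", "feed", "other", "feed", "feed", "other"])

    tp = np.sum((video == "feed")  & (device == "feed"))
    fn = np.sum((video == "feed")  & (device == "other"))
    fp = np.sum((video == "other") & (device == "feed"))
    tn = np.sum((video == "other") & (device == "other"))

    accuracy    = (tp + tn) / video.size
    sensitivity = tp / (tp + fn)   # feeding detected when the cow was actually feeding
    specificity = tn / (tn + fp)   # non-feeding correctly left unflagged
    print("accuracy %.2f, sensitivity %.2f, specificity %.2f" % (accuracy, sensitivity, specificity))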

  18. Validation of a Novel 3-Dimensional Sonographic Method for Assessing Gastric Accommodation in Healthy Adults

    NARCIS (Netherlands)

    Buisman, Wijnand J; van Herwaarden-Lindeboom, MYA; Mauritz, Femke A; El Ouamari, Mourad; Hausken, Trygve; Olafsdottir, Edda J; van der Zee, David C; Gilja, Odd Helge

    OBJECTIVES: A novel automated 3-dimensional (3D) sonographic method has been developed for measuring gastric volumes. This study aimed to validate and assess the reliability of this novel 3D sonographic method compared to the reference standard in 3D gastric sonography: freehand magneto-based 3D

  19. Optimization, validation and application of UV-Vis spectrophotometric-colorimetric methods for determination of trimethoprim in different medicinal products

    Directory of Open Access Journals (Sweden)

    Goran Stojković

    2016-03-01

    Full Text Available Two simple, sensitive, selective, precise, and accurate methods for determination of trimethoprim in different sulfonamide formulations intended for use in human and veterinary medicine were optimized and validated. The methods are based on the reaction of trimethoprim with bromcresol green (BCG) and 2,4-dinitro-1-fluorobenzene (DNFB). As extraction solvents, 10% N,N-dimethylacetamide in methanol and acetone were used for the BCG and DNFB methods, respectively. The colored products are quantified applying visible spectrophotometry at their corresponding absorption maxima. The methods were validated for linearity, sensitivity, accuracy, and precision. We tested the method applicability on four different medicinal products in tablet and powder forms containing sulfametrole and sulfamethoxazole in combination with trimethoprim. The results revealed that both methods are equally accurate, with recoveries within the range 95-105%. The obtained between-day precision for both methods, when applied to the four different medicinal products, was within the range 1.08-3.20%. By applying the F-test (P<0.05), it was concluded that for three of the medicinal products tested, both methods are applicable with a statistically insignificant difference in precision. The optimized and validated BCG and DNFB methods could find application in routine quality control of trimethoprim in various formulation forms, at different concentration levels, and in combination with different sulfonamides.
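
    The F-test comparison of precision mentioned above can be reproduced along the following lines; the standard deviations and replicate counts shown are placeholders, not the study's data, and scipy is assumed to be available.

```python
from scipy import stats

def f_test_precision(sd1, n1, sd2, n2, alpha=0.05):
    """Two-sided F-test comparing the variances (precision) of two methods.
    sd1, sd2: standard deviations; n1, n2: numbers of replicates."""
    f = max(sd1, sd2) ** 2 / min(sd1, sd2) ** 2          # larger variance in numerator
    df_num = (n1 if sd1 >= sd2 else n2) - 1
    df_den = (n2 if sd1 >= sd2 else n1) - 1
    f_crit = stats.f.ppf(1 - alpha / 2, df_num, df_den)
    return f, f_crit, f < f_crit                          # True: no significant difference

# Hypothetical between-day standard deviations for the BCG and DNFB methods
print(f_test_precision(sd1=1.8, n1=6, sd2=2.4, n2=6))
```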

  20. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method

    NARCIS (Netherlands)

    Ramos, Daniel; Haraksim, Rudolf; Meuwly, Didier

    2017-01-01

    Data to which the authors refer to throughout this article are likelihood ratios (LR) computed from the comparison of 5–12 minutiae fingermarks with fingerprints. These LRs data are used for the validation of a likelihood ratio (LR) method in forensic evidence evaluation. These data present a

  1. Validation of a spectrophotometric method for quantification of carboxyhemoglobin.

    Science.gov (United States)

    Luchini, Paulo D; Leyton, Jaime F; Strombech, Maria de Lourdes C; Ponce, Julio C; Jesus, Maria das Graças S; Leyton, Vilma

    2009-10-01

    The measurement of carboxyhemoglobin (COHb) levels in blood is a valuable procedure to confirm exposure to carbon monoxide (CO), either for forensic or occupational matters. A previously described method using spectrophotometric readings at 420 and 432 nm after reduction of oxyhemoglobin (O(2)Hb) and methemoglobin with sodium hydrosulfite solution leads to an exponential curve. This curve, used with pre-established factors, serves well for low concentrations (1-7%) or for high concentrations (> 20%), but very rarely for both. The authors have observed that small variations in the previously described factors F1, F2, and F3, obtained from readings for 100% COHb and 100% O(2)Hb, translate into significant changes in COHb% results, and propose that these factors should be determined every time COHb is measured, by reading CO- and O(2)-saturated samples. This practice leads to an increase in accuracy and precision.

  2. FDIR Strategy Validation with the B Method

    Science.gov (United States)

    Sabatier, D.; Dellandrea, B.; Chemouil, D.

    2008-08-01

    In a formation flying satellite system, the FDIR strategy (Failure Detection, Isolation and Recovery) is paramount. When a failure occurs, satellites should be able to take appropriate reconfiguration actions to obtain the best possible results given the failure, ranging from avoiding satellite-to-satellite collision to continuing the mission without disturbance if possible. To achieve this goal, each satellite in the formation has an implemented FDIR strategy that governs how it detects failures (from tests or by deduction) and how it reacts (reconfiguration using redundant equipment, avoidance manoeuvres, etc.). The goal is to protect the satellites first and the mission as much as possible. In a project initiated by the CNES, ClearSy experimented with the B Method to validate the FDIR strategies, developed by Thales Alenia Space, of the inter-satellite positioning and communication devices that will be used for the SIMBOL-X (2-satellite configuration) and PEGASE (3-satellite configuration) missions, and potentially for other missions afterward. These radio-frequency metrology sensor devices provide satellite positioning and inter-satellite communication in formation flying. This article presents the results of this experiment.

  3. A proactive alarm reduction method and its human factors validation test for a main control room for SMART

    International Nuclear Information System (INIS)

    Jang, Gwi-sook; Suh, Sang-moon; Kim, Sa-kil; Suh, Yong-suk; Park, Je-yun

    2013-01-01

    Highlights: ► A proactive alarm reduction method improves the effectiveness of alarm reduction. ► The method suppresses alarms based on ECA rules and facts under an alarm flood situation. ► The alarm reduction logics are supplemented to achieve a high hit ratio during on-line operations. ► The method is validated by a human factors validation test based on regulatory requirements. -- Abstract: Conventional alarm systems tend to overwhelm operators during a transient because of a large number of nearly simultaneous annunciator activations with varying degrees of relevance to operator tasks. Thus alarm processing techniques have been developed to support operators in coping with the volume of alarms, to identify which alarms are significant, and to reduce the need for operators to infer the plant conditions. This paper proposes a proactive alarm reduction method for SMART (System-integrated Modular Advanced ReacTor), whereby alarm reduction during the next transient is carried out based on the contents of past operating experience. We designed and implemented the proactive alarm reduction system and constructed the environment for the human factors validation test. Also, eight subjects actually working in a nuclear power plant (NPP) tested the practical effectiveness of the proposed proactive alarm reduction method according to the procedure of the human factors validation test under a dynamic simulation of a partial scope of an NPP.
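
    A minimal sketch of event-condition-action (ECA) style alarm suppression is shown below to illustrate the general idea; the rule contents, alarm names and plant facts are invented for illustration and are not taken from the SMART design.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class EcaRule:
    event: str                          # triggering alarm/event
    condition: Callable[[dict], bool]   # predicate over current plant facts
    action: str                         # e.g. suppress a consequential alarm

# Illustrative rule: if the reactor trips and the turbine has already tripped,
# the turbine trip alarm is considered consequential and is suppressed.
rules = [
    EcaRule(event="REACTOR_TRIP",
            condition=lambda facts: facts.get("turbine_tripped", False),
            action="suppress:TURBINE_TRIP_ALARM"),
]

def process_alarms(alarms, facts):
    """Return the alarm list with consequential alarms removed according to the rules."""
    suppressed = set()
    for rule in rules:
        if rule.event in alarms and rule.condition(facts):
            suppressed.add(rule.action.split(":", 1)[1])
    return [a for a in alarms if a not in suppressed]

print(process_alarms(["REACTOR_TRIP", "TURBINE_TRIP_ALARM"],
                     {"turbine_tripped": True}))
```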

  4. Development and validation of an improved method for the determination of chloropropanols in paperboard food packaging by GC-MS.

    Science.gov (United States)

    Mezouari, S; Liu, W Yun; Pace, G; Hartman, T G

    2015-01-01

    The objective of this study was to develop an improved analytical method for the determination of 3-chloro-1,2-propanediol (3-MCPD) and 1,3-dichloropropanol (1,3-DCP) in paper-type food packaging. The established method includes aqueous extraction, matrix spiking of a deuterated surrogate internal standard (3-MCPD-d₅), clean-up using Extrelut solid-phase extraction, derivatisation using a silylation reagent, and GC-MS analysis of the chloropropanols as their corresponding trimethyl silyl ethers. The new method is applicable to food-grade packaging samples using European Commission standard aqueous extraction and aqueous food simulant migration tests. In this improved method, the derivatisation procedure was optimised; the cost and time of the analysis were reduced by using 10 times less sample, solvents and reagents than in previously described methods. Overall the validation data demonstrate that the method is precise and reliable. The limit of detection (LOD) of the aqueous extract was 0.010 mg kg(-1) (w/w) for both 3-MCPD and 1,3-DCP. Analytical precision had a relative standard deviation (RSD) of 3.36% for 3-MCPD and an RSD of 7.65% for 1,3-DCP. The new method was satisfactorily applied to the analysis of over 100 commercial paperboard packaging samples. The data are being used to guide the product development of a next generation of wet-strength resins with reduced chloropropanol content, and also for risk assessments to calculate the virtual safe dose (VSD).

  5. An analysis of normalization methods for Drosophila RNAi genomic screens and development of a robust validation scheme

    Science.gov (United States)

    Wiles, Amy M.; Ravi, Dashnamoorthy; Bhavani, Selvaraj; Bishop, Alexander J.R.

    2010-01-01

    Genome-wide RNAi screening is a powerful, yet relatively immature technology that allows investigation into the role of individual genes in a process of choice. Most RNAi screens identify a large number of genes with a continuous gradient in the assessed phenotype. Screeners must then decide whether to examine just those genes with the most robust phenotype or to examine the full gradient of genes that cause an effect and how to identify the candidate genes to be validated. We have used RNAi in Drosophila cells to examine viability in a 384-well plate format and compare two screens, untreated control and treatment. We compare multiple normalization methods, which take advantage of different features within the data, including quantile normalization, background subtraction, scaling, cellHTS2 1, and interquartile range measurement. Considering the false-positive potential that arises from RNAi technology, a robust validation method was designed for the purpose of gene selection for future investigations. In a retrospective analysis, we describe the use of validation data to evaluate each normalization method. While no normalization method worked ideally, we found that a combination of two methods, background subtraction followed by quantile normalization and cellHTS2, at different thresholds, captures the most dependable and diverse candidate genes. Thresholds are suggested depending on whether a few candidate genes are desired or a more extensive systems level analysis is sought. In summary, our normalization approaches and experimental design to perform validation experiments are likely to apply to those high-throughput screening systems attempting to identify genes for systems level analysis. PMID:18753689
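
    As an illustration of one of the normalization methods compared in this record, the sketch below applies quantile normalization to a small plates-by-wells matrix; the readout values are hypothetical and ties are handled crudely.

```python
import numpy as np

def quantile_normalize(matrix):
    """Quantile normalization of a wells-by-plates (rows-by-columns) matrix:
    each column is forced onto the same empirical distribution, taken as the
    row-wise mean of the column-sorted values."""
    x = np.asarray(matrix, float)
    ranks = x.argsort(axis=0).argsort(axis=0)    # rank of each value within its column
    target = np.sort(x, axis=0).mean(axis=1)     # mean distribution across columns
    return target[ranks]

# Hypothetical viability readouts for 4 wells measured on 3 plates
plates = [[1.0, 2.0, 3.0],
          [4.0, 1.5, 2.5],
          [2.0, 3.5, 1.0],
          [3.0, 2.5, 4.0]]
print(quantile_normalize(plates))
```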

  6. Cluster analysis of European Y-chromosomal STR haplotypes using the discrete Laplace method

    DEFF Research Database (Denmark)

    Andersen, Mikkel Meyer; Eriksen, Poul Svante; Morling, Niels

    2014-01-01

    The European Y-chromosomal short tandem repeat (STR) haplotype distribution has previously been analysed in various ways. Here, we introduce a new way of analysing population substructure using a new method based on clustering within the discrete Laplace exponential family that models the probability distribution of the Y-STR haplotypes. Creating a consistent statistical model of the haplotypes enables us to perform a wide range of analyses. Previously, haplotype frequency estimation using the discrete Laplace method has been validated. In this paper we investigate how the discrete Laplace method can be used for cluster analysis to further validate the discrete Laplace method. A very important practical fact is that the calculations can be performed on a normal computer. We identified two sub-clusters of the Eastern and Western European Y-STR haplotypes similar to results of previous...
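
    For orientation, the discrete Laplace distribution referred to in this record is commonly written with probability mass function P(X = k) = ((1 − p)/(1 + p))·p^|k| for integer k and 0 < p < 1; a minimal sketch follows, with an illustrative dispersion parameter rather than a value from the paper.

```python
def discrete_laplace_pmf(k, p):
    """P(X = k) for the discrete Laplace distribution with dispersion 0 < p < 1:
    P(X = k) = (1 - p) / (1 + p) * p**abs(k), k an integer."""
    return (1 - p) / (1 + p) * p ** abs(k)

# Probability of observing an allele 0, 1 or 2 repeat units away from a
# (hypothetical) central haplotype at one locus, with dispersion p = 0.3
for k in (0, 1, 2):
    print(k, discrete_laplace_pmf(k, 0.3))
```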

  7. Validation and Recommendation of Methods to Measure Biogas Production Potential of Animal Manure

    Directory of Open Access Journals (Sweden)

    C. H. Pham

    2013-06-01

    Full Text Available In developing countries, biogas energy production is seen as a technology that can provide clean energy in poor regions and reduce pollution caused by animal manure. Laboratories in these countries have little access to advanced gas measuring equipment, which may limit research aimed at improving locally adapted biogas production. They may also be unable to produce valid estimates of an international standard that can be used for articles published in international peer-reviewed science journals. This study tested and validated methods for measuring total biogas and methane (CH4) production using batch fermentation and for characterizing the biomass. The biochemical methane potential (BMP) (CH4 NL kg−1 VS) of pig manure, cow manure and cellulose determined with the Moller and VDI methods was not significantly different in this test (p>0.05). The biodegradability, expressed as the ratio of BMP to theoretical BMP (TBMP), was slightly higher using the Hansen method, but differences were not significant. Degradation rate assessed by methane formation rate showed wide variation within the batch methods tested. The first-order kinetics constant k for the cumulative methane production curve was highest when the two animal manures were fermented using the VDI 4630 method, indicating that this method was able to reach steady conditions in a shorter time, reducing fermentation duration. In precision tests, the repeatability relative standard deviation (RSDr) for all batch methods was very low (4.8 to 8.1%), while the reproducibility relative standard deviation (RSDR) varied widely, from 7.3 to 19.8%. In determination of biomethane concentration, the values obtained using the liquid replacement method (LRM) were comparable to those obtained using gas chromatography (GC). This indicates that the LRM method could be used to determine biomethane concentration in biogas in laboratories with limited access to GC.
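
    The first-order kinetics constant k mentioned above is typically obtained by fitting a curve of the form B(t) = B0·(1 − exp(−k·t)) to the cumulative methane yield; a minimal sketch with hypothetical data points follows.

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, B0, k):
    """Cumulative methane production B(t) = B0 * (1 - exp(-k * t))."""
    return B0 * (1 - np.exp(-k * t))

# Hypothetical cumulative CH4 yields (NL kg-1 VS) over fermentation days
days   = np.array([0, 5, 10, 15, 20, 30, 40], float)
yields = np.array([0, 110, 185, 230, 260, 290, 300], float)

(B0, k), _ = curve_fit(first_order, days, yields, p0=(300, 0.1))
print(f"ultimate yield B0 = {B0:.0f} NL kg-1 VS, rate constant k = {k:.3f} d-1")
```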

  8. Pesticide applicators questionnaire content validation: A fuzzy delphi method.

    Science.gov (United States)

    Manakandan, S K; Rosnah, I; Mohd Ridhuan, J; Priya, R

    2017-08-01

    The most crucial step in forming a survey questionnaire is deciding on the appropriate items in a construct. Retaining irrelevant items and removing important items will certainly mislead the direction of a particular study. This article demonstrates the Fuzzy Delphi method as a scientific analysis technique to consolidate consensus agreement within a panel of experts regarding each item's appropriateness. This method reduces the ambiguity, diversity, and discrepancy of the opinions among the experts and hence enhances the quality of the selected items. The main purpose of this study was to obtain experts' consensus on the suitability of the preselected items in the questionnaire. The panel consisted of sixteen experts from the Occupational and Environmental Health Unit of the Ministry of Health, the Vector-borne Disease Control Unit of the Ministry of Health, and Occupational and Safety Health Units of both public and private universities. A set of questionnaires related to noise and chemical exposure was compiled based on a literature search. There was a total of six constructs with 60 items: three constructs for knowledge, attitude, and practice of noise exposure and three constructs for knowledge, attitude, and practice of chemical exposure. The validation process replicated a recent Fuzzy Delphi method using the concepts of Triangular Fuzzy Numbers and a Defuzzification process. A 100% response rate was obtained from all sixteen experts, with average Likert scores of four to five. After the FDM analysis, the first prerequisite was fulfilled with a threshold value (d) ≤ 0.2, hence all six constructs were accepted. For the second prerequisite, three items (21%) from the noise-attitude construct and four items (40%) from the chemical-practice construct had expert consensus lower than 75%, corresponding to about 12% of the total items in the questionnaire. The third prerequisite was used to rank the items within the constructs by calculating the average
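
    A minimal sketch of the Fuzzy Delphi computations described above (Likert scores mapped to triangular fuzzy numbers, a distance threshold d ≤ 0.2, a 75% consensus requirement, and a defuzzified score for ranking) is given below; the Likert-to-fuzzy mapping and the ratings are assumptions for illustration, not the study's data.

```python
import numpy as np

# One common mapping of a 5-point Likert scale to triangular fuzzy numbers;
# the exact mapping used in the study is not stated in the abstract.
TFN = {1: (0.0, 0.0, 0.2), 2: (0.0, 0.2, 0.4), 3: (0.2, 0.4, 0.6),
       4: (0.4, 0.6, 0.8), 5: (0.6, 0.8, 1.0)}

def fuzzy_delphi(item_scores, d_max=0.2, consensus_min=0.75):
    """Evaluate one questionnaire item from a list of expert Likert scores."""
    fuzzy = np.array([TFN[s] for s in item_scores])       # experts x (m1, m2, m3)
    avg = fuzzy.mean(axis=0)
    d = np.sqrt(((fuzzy - avg) ** 2).sum(axis=1) / 3)     # distance of each expert to the mean
    threshold_ok = d.mean() <= d_max                       # prerequisite 1
    consensus_ok = (d <= d_max).mean() >= consensus_min    # prerequisite 2
    score = avg.mean()                                      # prerequisite 3: defuzzified ranking score
    return threshold_ok, consensus_ok, score

# Hypothetical ratings of one item by 16 experts
print(fuzzy_delphi([5, 4, 5, 4, 4, 5, 5, 4, 4, 5, 4, 4, 5, 4, 4, 5]))
```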

  9. Validated UV-Spectrophotometric Methods for Determination of Gemifloxacin Mesylate in Pharmaceutical Tablet Dosage Forms

    Directory of Open Access Journals (Sweden)

    R. Rote Ambadas

    2010-01-01

    Full Text Available Two simple, economic and accurate UV spectrophotometric methods have been developed for determination of gemifloxacin mesylate in pharmaceutical tablet formulation. The first UV-spectrophotometric method depends upon the measurement of absorption at the wavelength 263.8 nm. In the second, area-under-curve method, the wavelength range for detection was selected as 268.5-258.5 nm. Beer's law was obeyed in the range of 2 to 12 μg mL−1 for both methods. The proposed methods were validated statistically and applied successfully to the determination of gemifloxacin mesylate in pharmaceutical formulation.

  10. Validation of an analytical method for determining halothane in urine as an instrument for evaluating occupational exposure

    International Nuclear Information System (INIS)

    Gonzalez Chamorro, Rita Maria; Jaime Novas, Arelis; Diaz Padron, Heliodora

    2010-01-01

    Occupational exposure to harmful substances may cause significant changes in the normal physiology of the organism when adequate safety measures are not taken in a workplace where the risk is present. Among the chemical risks that may affect workers' health are the inhalable anesthetic agents. With the objective of taking the first steps towards the introduction of an epidemiological surveillance system for this personnel, an analytical method for determining this anesthetic in urine was validated under the instrumental conditions available in our laboratory. To carry out this validation, the following parameters were taken into account: specificity, linearity, precision, accuracy, detection limit and quantification limit, and the uncertainty of the method was calculated. In the validation procedure it was found that the technique is specific and precise; the detection limit was 0.118 μg/L and the quantification limit 0.354 μg/L. The combined uncertainty was 0.243, and the expanded uncertainty 0.486. The validated method, together with the subsequent introduction of biological exposure limits, will serve as an auxiliary diagnostic tool allowing periodic monitoring of personnel exposure

  11. Predicting Radiation Pneumonitis After Stereotactic Ablative Radiation Therapy in Patients Previously Treated With Conventional Thoracic Radiation Therapy

    International Nuclear Information System (INIS)

    Liu Hui; Zhang Xu; Vinogradskiy, Yevgeniy Y.; Swisher, Stephen G.; Komaki, Ritsuko; Chang, Joe Y.

    2012-01-01

    Purpose: To determine the incidence of and risk factors for radiation pneumonitis (RP) after stereotactic ablative radiation therapy (SABR) to the lung in patients who had previously undergone conventional thoracic radiation therapy. Methods and Materials: Seventy-two patients who had previously received conventionally fractionated radiation therapy to the thorax were treated with SABR (50 Gy in 4 fractions) for recurrent disease or secondary parenchymal lung cancer (T 10 and mean lung dose (MLD) of the previous plan and the V10-V40 and MLD of the composite plan were also related to RP. Multivariate analysis revealed that ECOG PS scores of 2-3 before SABR (P=.009), FEV1 ≤65% before SABR (P=.012), V20 ≥30% of the composite plan (P=.021), and an initial PTV in the bilateral mediastinum (P=.025) were all associated with RP. Conclusions: We found that severe RP was relatively common, occurring in 20.8% of patients, and could be predicted by an ECOG PS score of 2-3, an FEV1 ≤65%, a previous PTV spanning the bilateral mediastinum, and V20 ≥30% on composite (previous RT+SABR) plans. Prospective studies are needed to validate these predictors and the scoring system on which they are based.

  12. Validation of a method for assessing resident physicians' quality improvement proposals.

    Science.gov (United States)

    Leenstra, James L; Beckman, Thomas J; Reed, Darcy A; Mundell, William C; Thomas, Kris G; Krajicek, Bryan J; Cha, Stephen S; Kolars, Joseph C; McDonald, Furman S

    2007-09-01

    Residency programs involve trainees in quality improvement (QI) projects to evaluate competency in systems-based practice and practice-based learning and improvement. Valid approaches to assess QI proposals are lacking. We developed an instrument for assessing resident QI proposals, the Quality Improvement Proposal Assessment Tool (QIPAT-7), and determined its validity and reliability. QIPAT-7 content was initially obtained from a national panel of QI experts. Through an iterative process, the instrument was refined, pilot-tested, and revised. Seven raters used the instrument to assess 45 resident QI proposals. Principal factor analysis was used to explore the dimensionality of instrument scores. Cronbach's alpha and intraclass correlations were calculated to determine internal consistency and interrater reliability, respectively. QIPAT-7 items comprised a single factor (eigenvalue = 3.4) suggesting a single assessment dimension. Interrater reliability for each item (range 0.79 to 0.93) and internal consistency reliability among the items (Cronbach's alpha = 0.87) were high. This method for assessing resident physician QI proposals is supported by content and internal structure validity evidence. QIPAT-7 is a useful tool for assessing resident QI proposals. Future research should determine the reliability of QIPAT-7 scores in other residency and fellowship training programs. Correlations should also be made between assessment scores and criteria for QI proposal success such as implementation of QI proposals, resident scholarly productivity, and improved patient outcomes.
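
    As an illustration of the internal consistency analysis reported here, the sketch below computes Cronbach's alpha from a proposals-by-items score matrix; the scores are hypothetical.

```python
import numpy as np

def cronbach_alpha(x):
    """Cronbach's alpha for an observations-by-items score matrix
    (here: QI proposals as rows, instrument items as columns)."""
    x = np.asarray(x, float)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()      # sum of per-item variances
    total_var = x.sum(axis=1).var(ddof=1)       # variance of total scores
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical scores of 5 proposals on 7 items (1-5 scale)
scores = [[4, 5, 4, 4, 5, 4, 4],
          [3, 3, 2, 3, 3, 3, 2],
          [5, 5, 5, 4, 5, 5, 5],
          [2, 2, 3, 2, 2, 3, 2],
          [4, 4, 4, 5, 4, 4, 5]]
print(cronbach_alpha(scores))
```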

  13. Validation of an ultra-fast UPLC-UV method for the separation of antituberculosis tablets.

    Science.gov (United States)

    Nguyen, Dao T-T; Guillarme, Davy; Rudaz, Serge; Veuthey, Jean-Luc

    2008-04-01

    A simple method using ultra performance LC (UPLC) coupled with UV detection was developed and validated for the determination of antituberculosis drugs in combined dosage form, i.e. isoniazid (ISN), pyrazinamide (PYR) and rifampicin (RIF). Drugs were separated on a short column (2.1 mm x 50 mm) packed with 1.7 μm particles, using an elution gradient procedure. At 30 °C, less than 2 min was necessary for the complete separation of the three antituberculosis drugs, while the original USP method was performed in 15 min. Further improvements were obtained with the combination of UPLC and high temperature (up to 90 °C), namely HT-UPLC, which allows the application of higher mobile phase flow rates. Therefore, the separation of ISN, PYR and RIF was performed in less than 1 min. After validation (selectivity, trueness, precision and accuracy), both methods (UPLC and HT-UPLC) have proven suitable for the routine quality control analysis of antituberculosis drugs in combined dosage form. Additionally, a large number of samples per day can be analysed due to the short analysis times.

  14. Reoperative sentinel lymph node biopsy after previous mastectomy.

    Science.gov (United States)

    Karam, Amer; Stempel, Michelle; Cody, Hiram S; Port, Elisa R

    2008-10-01

    Sentinel lymph node (SLN) biopsy is the standard of care for axillary staging in breast cancer, but many clinical scenarios questioning the validity of SLN biopsy remain. Here we describe our experience with reoperative-SLN (re-SLN) biopsy after previous mastectomy. Review of the SLN database from September 1996 to December 2007 yielded 20 procedures done in the setting of previous mastectomy. SLN biopsy was performed using radioisotope with or without blue dye injection superior to the mastectomy incision, in the skin flap in all patients. In 17 of 20 patients (85%), re-SLN biopsy was performed for local or regional recurrence after mastectomy. Re-SLN biopsy was successful in 13 of 20 patients (65%) after previous mastectomy. Of the 13 patients, 2 had positive re-SLN, and completion axillary dissection was performed, with 1 having additional positive nodes. In the 11 patients with negative re-SLN, 2 patients underwent completion axillary dissection demonstrating additional negative nodes. One patient with a negative re-SLN experienced chest wall recurrence combined with axillary recurrence 11 months after re-SLN biopsy. All others remained free of local or axillary recurrence. Re-SLN biopsy was unsuccessful in 7 of 20 patients (35%). In three of seven patients, axillary dissection was performed, yielding positive nodes in two of the three. The remaining four of seven patients all had previous modified radical mastectomy, so underwent no additional axillary surgery. In this small series, re-SLN was successful after previous mastectomy, and this procedure may play some role when axillary staging is warranted after mastectomy.

  15. Proposal for Requirement Validation Criteria and Method Based on Actor Interaction

    Science.gov (United States)

    Hattori, Noboru; Yamamoto, Shuichiro; Ajisaka, Tsuneo; Kitani, Tsuyoshi

    We propose requirement validation criteria and a method based on the interaction between actors in an information system. We focus on the cyclical transitions of one actor's situation against another and clarify observable stimuli and responses based on these transitions. Both actors' situations can be listed in a state transition table, which describes the observable stimuli or responses they send or receive. Examination of the interaction between both actors in the state transition tables enables us to detect missing or defective observable stimuli or responses. Typically, this method can be applied to the examination of the interaction between a resource managed by the information system and its user. As a case study, we analyzed 332 requirement defect reports of an actual system development project in Japan. We found that there were a certain amount of defects regarding missing or defective stimuli and responses, which can be detected using our proposed method if this method is used in the requirement definition phase. This means that we can reach a more complete requirement definition with our proposed method.
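
    A minimal sketch of the kind of cross-check this method describes is given below: every observable response one actor can send should appear as a stimulus the other actor can receive, and vice versa. The actors, states and stimuli are invented for illustration and are not from the case study.

```python
# (current state, stimulus received) -> (next state, response sent)
user = {
    ("idle",       "prompt_login"):     ("logging_in", "credentials"),
    ("logging_in", "login_ok"):         ("working",    "request_resource"),
    ("working",    "resource_granted"): ("done",       "release_resource"),
}
system = {
    ("start",    "credentials"):      ("checking", "login_ok"),
    ("checking", "request_resource"): ("busy",     "resource_granted"),
    # "release_resource" is never handled -> a missing-stimulus defect candidate
}

def sends(table):
    """Responses an actor can emit."""
    return {resp for (_, resp) in table.values() if resp}

def receives(table):
    """Stimuli an actor can handle."""
    return {stim for (_, stim) in table}

print("user sends but system never receives:", sends(user) - receives(system))
print("system sends but user never receives:", sends(system) - receives(user))
```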

  16. Validity of the Remote Food Photography Method against Doubly Labeled Water among Minority Preschoolers

    OpenAIRE

    Nicklas, Theresa; Saab, Rabab; Islam, Noemi G.; Wong, William; Butte, Nancy; Schulin, Rebecca; Liu, Yan; Apolzan, John W.; Myers, Candice A.; Martin, Corby K.

    2017-01-01

    Objective To determine the validity of energy intake (EI) estimations made using the Remote Food Photography Method (RFPM) compared to the doubly-labeled water (DLW) method in minority preschool children in a free-living environment. Methods Seven days of food intake and spot urine samples excluding first void collections for DLW analysis were obtained on 39 3-to-5 year old Hispanic and African American children. Using an iPhone, caregivers captured before and after pictures of the child’s in...

  17. Measurement and data analysis methods for field-scale wind erosion studies and model validation

    NARCIS (Netherlands)

    Zobeck, T.M.; Sterk, G.; Funk, R.F.; Rajot, J.L.; Stout, J.E.; Scott Van Pelt, R.

    2003-01-01

    Accurate and reliable methods of measuring windblown sediment are needed to confirm, validate, and improve erosion models, assess the intensity of aeolian processes and related damage, determine the source of pollutants, and for other applications. This paper outlines important principles to

  18. Validation of a multi-residue method for the determination of several antibiotic groups in honey by LC-MS/MS.

    Science.gov (United States)

    Bohm, Detlef A; Stachel, Carolin S; Gowik, Petra

    2012-07-01

    The presented multi-method was developed for the confirmation of 37 antibiotic substances from six antibiotic groups: macrolides, lincosamides, quinolones, tetracyclines, pleuromutilines and diamino-pyrimidine derivatives. All substances were analysed simultaneously in a single analytical run with the same procedure, including an extraction with buffer, a clean-up by solid-phase extraction, and the measurement by liquid chromatography tandem mass spectrometry in ESI+ mode. The method was validated on the basis of an in-house validation concept with factorial design, combining seven factors to check the robustness in a concentration range of 5-50 μg kg(-1). The honeys used were of different types with regard to colour and origin. The values calculated for the validation parameters (decision limit CCα: 7.5-12.9 μg kg(-1); detection capability CCβ: 9.4-19.9 μg kg(-1); within-laboratory reproducibility RSD(wR): maximum 21.4%, for tylvalosin; repeatability RSD(r): maximum 21.1%, for tylvalosin; recovery: 92-106%) were acceptable and in agreement with the criteria of Commission Decision 2002/657/EC. The validation results showed that the method was applicable for the residue analysis of antibiotics in honey, for substances both with and without recommended concentrations, although some changes had been tested during validation to determine the robustness of the method.

  19. High-throughput screening assay used in pharmacognosy: Selection, optimization and validation of methods of enzymatic inhibition by UV-visible spectrophotometry

    Directory of Open Access Journals (Sweden)

    Graciela Granados-Guzmán

    2014-02-01

    Full Text Available In research laboratories devoted to both organic synthesis and the extraction of natural products, many products that may potentially present some biological activity are obtained every day. It is therefore necessary to have in vitro assays that provide reliable information for further evaluation in in vivo systems. From this point of view, the use of high-throughput screening assays has intensified in recent years. Such assays should be optimized and validated to give accurate and precise, i.e. reliable, results. The present review addresses the steps needed to develop and validate bioanalytical methods, emphasizing UV-Visible spectrophotometry as the detection system. It focuses particularly on the selection of the method, optimization to determine the best experimental conditions, validation, application of the optimized and validated method to real samples, and finally maintenance and possible transfer to a new laboratory.
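
    For enzymatic inhibition assays read by UV-Visible spectrophotometry, the raw readout is usually converted to a percent inhibition before any further analysis such as IC50 fitting; a minimal sketch with placeholder absorbance values follows.

```python
def percent_inhibition(a_control, a_sample, a_blank=0.0):
    """Enzyme inhibition (%) from UV-Vis absorbance readings:
    100 * (1 - (A_sample - A_blank) / (A_control - A_blank))."""
    return 100.0 * (1.0 - (a_sample - a_blank) / (a_control - a_blank))

# Hypothetical readings for one extract tested in a 96-well plate
print(percent_inhibition(a_control=0.820, a_sample=0.410, a_blank=0.050))  # ~53%
```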

  20. A simple method for validation and verification of pipettes mounted on automated liquid handlers

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Frøslev, Tobias G

    2011-01-01

    We have implemented a simple, inexpensive, and fast procedure for validation and verification of the performance of pipettes mounted on automated liquid handlers (ALHs), as necessary for laboratories accredited under ISO 17025. A six- or seven-step serial dilution of OrangeG was prepared in quadruplicate ... are freely available. In conclusion, we have set up a simple, inexpensive, and fast solution for the continuous validation of ALHs used for accredited work according to the ISO 17025 standard. The method is easy to use for aqueous solutions but requires a spectrophotometer that can read microtiter plates.

  1. [A brief history of resuscitation - the influence of previous experience on modern techniques and methods].

    Science.gov (United States)

    Kucmin, Tomasz; Płowaś-Goral, Małgorzata; Nogalski, Adam

    2015-02-01

    Cardiopulmonary resuscitation (CPR) is a relatively novel branch of medical science; however, the first descriptions of mouth-to-mouth ventilation are to be found in the Bible, and the literature is full of descriptions of different resuscitation methods - from flagellation and ventilation with bellows, through hanging the victims upside down and compressing the chest in order to stimulate ventilation, to rectal fumigation with tobacco smoke. The modern history of CPR starts with Kouwenhoven et al., who in 1960 published a paper regarding heart massage through chest compressions. Shortly after that, in 1961, Peter Safar presented a paradigm promoting opening the airway, performing rescue breaths and chest compressions. The first CPR guidelines were published in 1966. Since that time, the guidelines have been modified and improved numerous times by the two leading world expert organizations, the ERC (European Resuscitation Council) and the AHA (American Heart Association), and published in a new version every 5 years. At the time of writing, the 2010 guidelines are the ones to be followed. In this paper the authors attempt to present the history of the development of resuscitation techniques and methods and to assess the influence of previous lifesaving methods on today's technologies, equipment and guidelines, which make it possible to help those whose lives are in danger due to sudden cardiac arrest. © 2015 MEDPRESS.

  2. Method validation using weighted linear regression models for quantification of UV filters in water samples.

    Science.gov (United States)

    da Silva, Claudia Pereira; Emídio, Elissandro Soares; de Marchi, Mary Rosa Rodrigues

    2015-01-01

    This paper describes the validation of a method consisting of solid-phase extraction followed by gas chromatography-tandem mass spectrometry for the analysis of the ultraviolet (UV) filters benzophenone-3, ethylhexyl salicylate, ethylhexyl methoxycinnamate and octocrylene. The method validation criteria included evaluation of selectivity, analytical curve, trueness, precision, limits of detection and limits of quantification. The non-weighted linear regression model has traditionally been used for calibration, but it is not necessarily the optimal model in all cases. Because the assumption of homoscedasticity was not met for the analytical data in this work, a weighted least squares linear regression was used for the calibration method. The evaluated analytical parameters were satisfactory for the analytes and showed recoveries at four fortification levels between 62% and 107%, with relative standard deviations less than 14%. The detection limits ranged from 7.6 to 24.1 ng L(-1). The proposed method was used to determine the amount of UV filters in water samples from water treatment plants in Araraquara and Jau in São Paulo, Brazil. Copyright © 2014 Elsevier B.V. All rights reserved.
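
    A weighted least squares calibration of the kind described here can be sketched as follows; the 1/x² weighting and the concentration/area values are illustrative assumptions, not the study's choices or data.

```python
import numpy as np

def weighted_linear_fit(x, y, weights):
    """Weighted least squares calibration line y = a + b*x.
    A 1/x**2 weighting is a common choice when the standard deviation of the
    response grows with concentration (heteroscedastic data)."""
    # np.polyfit minimizes sum((w_i * residual_i)**2), so pass sqrt of the weights
    b, a = np.polyfit(x, y, 1, w=np.sqrt(weights))
    return a, b

# Hypothetical calibration of a UV filter (concentration in ng L-1 vs. peak area)
conc = np.array([10, 50, 100, 250, 500, 1000], float)
area = np.array([120, 610, 1190, 3050, 6020, 12100], float)
a, b = weighted_linear_fit(conc, area, weights=1 / conc**2)
print(f"intercept = {a:.1f}, slope = {b:.3f}")
```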

  3. Reference Values for Spirometry Derived Using Lambda, Mu, Sigma (LMS) Method in Korean Adults: in Comparison with Previous References.

    Science.gov (United States)

    Jo, Bum Seak; Myong, Jun Pyo; Rhee, Chin Kook; Yoon, Hyoung Kyu; Koo, Jung Wan; Kim, Hyoung Ryoul

    2018-01-15

    The present study aimed to update the prediction equations for spirometry and their lower limits of normal (LLN) by using the lambda, mu, sigma (LMS) method and to compare the outcomes with the values of previous spirometric reference equations. Spirometric data of 10,249 healthy non-smokers (8,776 females) were extracted from the fourth and fifth versions of the Korea National Health and Nutrition Examination Survey (KNHANES IV, 2007-2009; V, 2010-2012). Reference equations were derived using the LMS method which allows modeling skewness (lambda [L]), mean (mu [M]), and coefficient of variation (sigma [S]). The outcome equations were compared with previous reference values. Prediction equations were presented in the following form: predicted value = exp{a + b × ln(height) + c × ln(age) + Mspline}. The new predicted values for spirometry and their LLN derived using the LMS method were shown to more accurately reflect transitions in pulmonary function in young adults than previous prediction equations derived using conventional regression analysis in 2013. There were partial discrepancies between the new reference values and the reference values from the Global Lung Function Initiative in 2012. The results should be interpreted with caution for young adults and elderly males, particularly in terms of the LLN for forced expiratory volume in one second/forced vital capacity in elderly males. Serial spirometry follow-up, together with correlations with other clinical findings, should be emphasized in evaluating the pulmonary function of individuals. Future studies are needed to improve the accuracy of reference data and to develop continuous reference values for spirometry across all ages. © 2018 The Korean Academy of Medical Sciences.
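
    For reference, in the LMS framework a measured value y is converted to a z-score from the fitted L (skewness), M (median) and S (coefficient of variation) curves, and the LLN is usually taken at the 5th percentile; a sketch of the standard relations (not copied from the paper) is:

```latex
z = \frac{(y/M)^{L} - 1}{L\,S} \quad (L \neq 0), \qquad
\mathrm{LLN} = M\,\left(1 + L\,S\,z_{0.05}\right)^{1/L}, \quad z_{0.05} = -1.645
```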

  4. Adaptation, validation and application of the chemo-thermal oxidation method to quantify black carbon in soils

    International Nuclear Information System (INIS)

    Agarwal, Tripti; Bucheli, Thomas D.

    2011-01-01

    The chemo-thermal oxidation method at 375 °C (CTO-375) has been widely used to quantify black carbon (BC) in sediments. In the present study, CTO-375 was tested and adapted for application to soil, accounting for some matrix specific properties like high organic carbon (≤39%) and carbonate (≤37%) content. Average recoveries of standard reference material SRM-2975 ranged from 25 to 86% for nine representative Swiss and Indian samples, which is similar to literature data for sediments. The adapted method was applied to selected samples of the Swiss soil monitoring network (NABO). BC content exhibited different patterns in three soil profiles while contribution of BC to TOC was found maximum below the topsoil at all three sites, however at different depths (60-130 cm). Six different NABO sites exhibited largely constant BC concentrations over the last 25 years, with short-term (6 months) prevailing over long-term (5 years) temporal fluctuations. - Research highlights: → The CTO-375 method was adapted and validated for BC analysis in soils. → Method validation figures of merit proved satisfactory. → Application is shown with soil cores and topsoil temporal variability. → BC content can be elevated in subsurface soils. → BC contents in surface soils were largely constant over the last 25 years. - Although widely used also for soils, the chemo-thermal oxidation method at 375 °C to quantify black carbon has never been properly validated for this matrix before.

  5. Validation and uncertainty estimation of fast neutron activation analysis method for Cu, Fe, Al, Si elements in sediment samples

    International Nuclear Information System (INIS)

    Sunardi; Samin Prihatin

    2010-01-01

    Validation and uncertainty estimation of the Fast Neutron Activation Analysis (FNAA) method for the Cu, Fe, Al and Si elements in sediment samples has been conducted. The aim of the research is to confirm whether the FNAA method still conforms to the ISO/IEC 17025:2005 standard. The research covered verification, performance, validation of FNAA and uncertainty estimation. The SRM 8704 standard and sediment samples were weighed to defined weights, irradiated with 14 MeV fast neutrons and then counted using gamma spectrometry. The validation results for the Cu, Fe, Al and Si elements showed that the accuracy was in the range of 95.89-98.68%, while the precision was in the range 1.13-2.29%. The estimated uncertainties for Cu, Fe, Al, and Si were 2.67, 1.46, 1.71 and 1.20%, respectively. From these data, it can be concluded that the FNAA method is still reliable and valid for the analysis of element contents in samples, because the accuracy is above 95% and the precision is under 5%, while the uncertainties are relatively small and suitable for the 95% level of confidence, where the maximum acceptable uncertainty is 5%. (author)

  6. Model validation of solar PV plant with hybrid data dynamic simulation based on fast-responding generator method

    Directory of Open Access Journals (Sweden)

    Zhao Dawei

    2016-01-01

    Full Text Available In recent years, a significant number of large-scale solar photovoltaic (PV) plants have been put into operation or are being planned around the world. The accuracy of solar PV plant models is the key factor in investigating the mutual influences between solar PV plants and a power grid. However, this problem has not been well solved, especially regarding how to apply real measurements to validate the models of solar PV plants. Taking the fast-responding generator method as an example, this paper presents a model validation methodology for solar PV plants via hybrid data dynamic simulation. First, an implementation scheme of hybrid data dynamic simulation suitable for the DIgSILENT PowerFactory software is proposed, and then an analysis model of solar PV plant integration based on the IEEE 9 system is established. Finally, model validation of the solar PV plant is achieved by employing hybrid data dynamic simulation. The results illustrate the effectiveness of the proposed method in solar PV plant model validation.

  7. Internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis.

    NARCIS (Netherlands)

    Steultjens, M.P.M.; Dekker, J.; Baar, M.E. van; Oostendorp, R.A.B.; Bijlsma, J.W.J.

    1999-01-01

    Objective: To establish the internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis (OA). Methods: Data were obtained from 198 patients with OA of the hip or knee. Results of the observational method were compared with results

  8. Geostatistical validation and cross-validation of magnetometric measurements of soil pollution with Potentially Toxic Elements in problematic areas

    Science.gov (United States)

    Fabijańczyk, Piotr; Zawadzki, Jarosław

    2016-04-01

    Field magnetometry is a fast method that has previously been used effectively to assess potential soil pollution. One of the most popular devices used to measure soil magnetic susceptibility at the soil surface is the MS2D Bartington. A single reading of soil magnetic susceptibility with the MS2D device takes little time, but is often affected by considerable errors related to the instrument or to environmental and lithogenic factors. For this reason, measured values of soil magnetic susceptibility usually have to be validated using more precise, but also much more expensive, chemical measurements. The goal of this study was to analyze methods of validating magnetometric measurements using chemical analyses of element concentrations in soil. Additionally, validation of surface measurements of soil magnetic susceptibility was performed using selected parameters of the distribution of magnetic susceptibility in a soil profile. Validation was performed using selected geostatistical measures of cross-correlation, and the geostatistical approach was compared with validation performed using classic statistics. Measurements were performed at selected areas located in the Upper Silesian Industrial Area in Poland and in selected parts of Norway. In these areas soil magnetic susceptibility was measured on the soil surface using a MS2D Bartington device and in the soil profile using a MS2C Bartington device. Additionally, soil samples were taken in order to perform chemical measurements. Acknowledgment: The research leading to these results has received funding from the Polish-Norwegian Research Programme operated by the National Centre for Research and Development under the Norwegian Financial Mechanism 2009-2014 in the frame of Project IMPACT - Contract No Pol-Nor/199338/45/2013.

  9. A photographic method to measure food item intake. Validation in geriatric institutions.

    Science.gov (United States)

    Pouyet, Virginie; Cuvelier, Gérard; Benattar, Linda; Giboreau, Agnès

    2015-01-01

    From both a clinical and research perspective, measuring food intake is an important issue in geriatric institutions. However, weighing food in this context can be complex, particularly when the items remaining on a plate (side dish, meat or fish and sauce) need to be weighed separately following consumption. A method based on photography that involves taking photographs after a meal to determine food intake consequently seems to be a good alternative. This method enables the storage of raw data so that unhurried analyses can be performed to distinguish the food items present in the images. Therefore, the aim of this paper was to validate a photographic method to measure food intake in terms of differentiating food item intake in the context of a geriatric institution. Sixty-six elderly residents took part in this study, which was performed in four French nursing homes. Four dishes of standardized portions were offered to the residents during 16 different lunchtimes. Three non-trained assessors then independently estimated both the total and specific food item intakes of the participants using images of their plates taken after the meal (photographic method) and a reference image of one plate taken before the meal. Total food intakes were also recorded by weighing the food. To test the reliability of the photographic method, agreements between different assessors and agreements among various estimates made by the same assessor were evaluated. To test the accuracy and specificity of this method, food intake estimates for the four dishes were compared with the food intakes determined using the weighed food method. To illustrate the added value of the photographic method, food consumption differences between the dishes were explained by investigating the intakes of specific food items. Although they were not specifically trained for this purpose, the results demonstrated that the assessor estimates agreed between assessors and among various estimates made by the same

  10. Experimental validation of the buildings energy performance (PEC) assessment methods with reference to occupied spaces heating

    Directory of Open Access Journals (Sweden)

    Cristian PETCU

    2010-01-01

    Full Text Available This paper is part of a series of pre-standardization research aimed at analyzing the existing methods of calculating the Buildings Energy Performance (PEC) with a view to their correction or completion. The overall research activity aims to experimentally validate the PEC calculation algorithm, as well as to comparatively apply, on the basis of several case studies focused on buildings representative of the Romanian building stock, the PEC calculation methodology for buildings equipped with occupied-space heating systems. The targets of the report are the experimental testing of the calculation models known so far (NP 048-2000, Mc 001-2006, SR EN 13790:2009) on the CE INCERC Bucharest experimental building, together with the complex calculation algorithms specific to dynamic modeling, for the evaluation of the occupied-space heat demand in the cold season, specific to traditional buildings and to modern buildings equipped with solar radiation passive systems of the ventilated solar space type. The schedule of the measurements performed in the 2008-2009 cold season is presented, as well as the primary processing of the measured data and the experimental validation of the monthly heat demand calculation methods, based on the CE INCERC Bucharest building. The calculation error per heating season (153 days of measurements) between the measured and the calculated heat demand was 0.61%, an exceptional value confirming the phenomenological nature of the INCERC method, NP 048-2006. The mathematical model specific to the hourly thermal balance is recurrent-decisional with alternating steps. The experimental validation of the theoretical model is based on the measurements performed on the CE INCERC Bucharest building over a period of 57 days (06.01-04.03.2009). The measurements performed on the CE INCERC Bucharest building confirm the accuracy of the hourly calculation model by comparison to the measured values

  11. An Improved Fuzzy Based Missing Value Estimation in DNA Microarray Validated by Gene Ranking

    Directory of Open Access Journals (Sweden)

    Sujay Saha

    2016-01-01

    Full Text Available Most gene expression data analysis algorithms require the entire gene expression matrix without any missing values. Hence, it is necessary to devise methods that impute missing data values accurately. A number of imputation algorithms exist to estimate those missing values. This work starts with a microarray dataset containing multiple missing values. We first apply a modified version of the existing fuzzy-theory-based method LRFDVImpute to impute multiple missing values of time series gene expression data, and then validate the result of imputation by a genetic algorithm (GA) based gene ranking methodology along with some regular statistical validation techniques, such as the RMSE method. Gene ranking, to the best of our knowledge, has not yet been used to validate the result of missing value estimation. Firstly, the proposed method was tested on the very popular Spellman dataset, and the results show that error margins have been drastically reduced compared to some previous works, which indirectly validates the statistical significance of the proposed method. Then it was applied to four other 2-class benchmark datasets, the Colorectal Cancer tumours dataset (GDS4382), the Breast Cancer dataset (GSE349-350), the Prostate Cancer dataset, and DLBCL-FL (Leukaemia), for both missing value estimation and ranking of the genes, and the results show that the proposed method can reach 100% classification accuracy with very few dominant genes, which indirectly validates the biological significance of the proposed method.
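
    The RMSE validation mentioned above reduces to comparing known expression values (hidden artificially) with the values the algorithm imputes for them; a minimal sketch with placeholder numbers follows.

```python
import numpy as np

def rmse(true_values, imputed_values):
    """Root-mean-square error between known expression values (artificially removed)
    and the values estimated by the imputation algorithm."""
    t = np.asarray(true_values, float)
    i = np.asarray(imputed_values, float)
    return np.sqrt(np.mean((t - i) ** 2))

# Hypothetical check: hide known values, impute them, compare
print(rmse([2.10, 0.85, 1.40], [2.02, 0.91, 1.55]))
```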

  12. Validity of an observation method for assessing pain behavior in individuals with multiple sclerosis.

    Science.gov (United States)

    Cook, Karon F; Roddey, Toni S; Bamer, Alyssa M; Amtmann, Dagmar; Keefe, Francis J

    2013-09-01

    Pain is a common and complex experience for individuals who live with multiple sclerosis (MS), and it interferes with physical, psychological, and social function. A valid and reliable tool for quantifying observed pain behaviors in MS is critical to understanding how pain behaviors contribute to pain-related disability in this clinical population. The aim was to evaluate the reliability and validity of a pain behavioral observation protocol in individuals who have MS. Community-dwelling volunteers with MS (N=30), back pain (N=5), or arthritis (N=8) were recruited based on clinician referrals, advertisements, fliers, web postings, and participation in previous research. Participants completed measures of pain severity, pain interference, and self-reported pain behaviors and were videotaped doing typical activities (e.g., walking and sitting). Two coders independently recorded frequencies of pain behaviors by category (e.g., guarding and bracing), and interrater reliability statistics were calculated. Naïve observers reviewed videotapes of individuals with MS and rated their pain. Spearman correlations were calculated between pain behavior frequencies, self-reported pain, and pain ratings by naïve observers. Interrater reliability estimates indicated the reliability of pain codes in the MS sample. Kappa coefficients ranged from moderate (sighing=0.40) to substantial agreement (guarding=0.83). These values were comparable with those obtained in the combined back pain and arthritis sample. Concurrent validity was supported by correlations with self-reported pain (0.46-0.53) and with self-reports of pain behaviors (0.58). Construct validity was supported by a correlation of 0.87 between total pain behaviors observed by coders and mean pain ratings by naïve observers. Results support the use of the pain behavior observation protocol for assessing pain behaviors of individuals with MS. Valid assessments of pain behaviors of individuals with MS could lead to

  13. Validity of a Simple Method for Measuring Force-Velocity-Power Profile in Countermovement Jump.

    Science.gov (United States)

    Jiménez-Reyes, Pedro; Samozino, Pierre; Pareja-Blanco, Fernando; Conceição, Filipe; Cuadrado-Peñafiel, Víctor; González-Badillo, Juan José; Morin, Jean-Benoît

    2017-01-01

    To analyze the reliability and validity of a simple computation method to evaluate force (F), velocity (v), and power (P) output during a countermovement jump (CMJ) suitable for use in field conditions and to verify the validity of this computation method to compute the CMJ force-velocity (F-v) profile (including unloaded and loaded jumps) in trained athletes. Sixteen high-level male sprinters and jumpers performed maximal CMJs under 6 different load conditions (0-87 kg). A force plate sampling at 1000 Hz was used to record vertical ground-reaction force and derive vertical-displacement data during CMJ trials. For each condition, mean F, v, and P of the push-off phase were determined from both force-plate data (reference method) and simple computation measures based on body mass, jump height (from flight time), and push-off distance and used to establish the linear F-v relationship for each individual. Mean absolute bias values were 0.9% (± 1.6%), 4.7% (± 6.2%), 3.7% (± 4.8%), and 5% (± 6.8%) for F, v, P, and the slope of the F-v relationship (SFv), respectively. Both methods showed high correlations for F-v-profile-related variables (r = .985-.991). Finally, all variables computed from the simple method showed high reliability, with ICC >.980 and CV ... push-off distance, and jump height are known.
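
    A sketch of the Samozino-style "simple method" computations referred to in this record is given below; the body mass, jump height and push-off distance are illustrative values, and the formulas are the commonly published work-energy approximations rather than an excerpt from this paper.

```python
import numpy as np

def simple_fvp(mass, jump_height, push_off_distance, g=9.81):
    """Estimates of mean force, velocity and power during a countermovement jump
    push-off, from body mass (kg), jump height (m) and push-off distance (m)."""
    F = mass * g * (jump_height / push_off_distance + 1)   # mean vertical force (N)
    v = np.sqrt(g * jump_height / 2)                        # mean push-off velocity (m/s)
    return F, v, F * v                                      # mean power (W)

# Hypothetical athlete: 75 kg, 0.40 m jump height, 0.35 m push-off distance
print(simple_fvp(mass=75, jump_height=0.40, push_off_distance=0.35))
```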

  14. Development and validation of a method to estimate body weight in ...

    African Journals Online (AJOL)

    Mid-arm circumference (MAC) has previously been used as a surrogate indicator of habitus, and the objective of this study was to determine whether MAC cut-off values could be used to predict habitus scores (HSs) to create an objective and standardised weight estimation methodology, the PAWPER XL-MAC method.

  15. Development and validation of a thin-layer chromatography method for stability studies of naproxen

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar; Rodriguez Borges, Tania

    2011-01-01

    The validation of an analytical method was carried out for application to stability studies of future formulations of naproxen suppositories for infant and adult use. The factors that most influenced naproxen stability were determined; the greatest degradation occurred in oxidizing acid medium and under the action of light. The possible formation of esters between the free carboxyl group present in naproxen and the glyceryl monostearate present in the base was identified as one of the degradation paths in the new formulation. The results were satisfactory. A thin-layer chromatography-based method was developed and the best chromatographic conditions were selected. GF 254 silica gel plates and UV visualization at 254 nm were employed. Three solvent systems were evaluated, of which system A, made up of glacial acetic acid:tetrahydrofuran:toluene (3:9:90 v/v/v), allowed adequate resolution between the analyte and the possible degradation products, with a detection limit of 1 μg. The use of the suggested method was restricted to the identification of possible degradation products, for qualitative purposes only and not as a final assay. The method proved to be sensitive and selective enough to be applied for the stated objective, according to the validation results

  16. Development and validation of Ketorolac Tromethamine in eye drop formulation by RP-HPLC method

    Directory of Open Access Journals (Sweden)

    G. Sunil

    2017-02-01

    Full Text Available A simple, precise and accurate method was developed and validated for the analysis of Ketorolac Tromethamine in an eye drop formulation. An isocratic HPLC analysis was performed on a Kromosil C18 column (150 mm × 4.6 mm, 5 μm). The compound was separated with a mixture of methanol and ammonium dihydrogen phosphate buffer in the ratio of 55:45 v/v, with the pH adjusted to 3.0 with o-phosphoric acid, as the mobile phase at a flow rate of 1.5 mL min−1. UV detection was performed at 314 nm using photodiode array detection. The retention time was found to be 6.01 min. The system suitability parameters such as theoretical plate count, tailing and percentage RSD between six standard injections were within the limits. The method was validated according to ICH guidelines. Calibration was linear over the concentration range of 50–150 μg mL−1, as indicated by a correlation coefficient (r) of 0.999. The robustness of the method was evaluated by deliberately altering the chromatographic conditions. The developed method is applicable for routine quantitative analysis.

  17. Validation of micro-CT against the section method regarding the assessment of marginal leakage of sealants.

    NARCIS (Netherlands)

    Chen, X.; Cuijpers, V.M.J.I.; Fan, M.W.; Frencken, J.E.F.M.

    2012-01-01

    BACKGROUND: The aim of this study was to validate the micro-CT and related software against the section method using the stereomicroscope for marginal leakage assessment along the sealant-enamel interface. METHODS: Pits and fissures of the occlusal surface of 10 teeth were sealed with a

  18. Stability indicating method development and validation of assay method for the estimation of rizatriptan benzoate in tablet

    Directory of Open Access Journals (Sweden)

    Chandrashekhar K. Gadewar

    2017-05-01

    Full Text Available A simple, sensitive, precise and specific high performance liquid chromatography method was developed and validated for the determination of rizatriptan in rizatriptan benzoate tablets. The separation was carried out using a mobile phase consisting of acetonitrile and pH 3.4 phosphate buffer in a ratio of 20:80. The column used was a Zorbax SB CN (250 mm × 4.6 mm, 5 μm) with a flow rate of 1 ml/min and UV detection at 225 nm. The retention times of rizatriptan and benzoic acid were found to be 4.751 and 8.348 min, respectively. A forced degradation study of rizatriptan benzoate in its tablet form was conducted under conditions of hydrolysis, oxidation, heat and photolysis. Rizatriptan was found to be stable in basic buffer, while in acidic buffer it was found to degrade (water bath at 60 °C for 15 min). The detector response of rizatriptan is directly proportional to concentration over the range of 30% to 160% of the test concentration, i.e. 15.032 to 80.172 mcg/ml. Results of the analysis were validated statistically and by recovery studies (mean recovery = 99.44). The results of the study showed that the proposed method is simple, rapid, precise and accurate, and is useful for the routine determination of rizatriptan in pharmaceutical dosage forms.

  19. A new method for assessing content validity in model-based creation and iteration of eHealth interventions.

    Science.gov (United States)

    Kassam-Adams, Nancy; Marsac, Meghan L; Kohser, Kristen L; Kenardy, Justin A; March, Sonja; Winston, Flaura K

    2015-04-15

    The advent of eHealth interventions to address psychological concerns and health behaviors has created new opportunities, including the ability to optimize the effectiveness of intervention activities and then deliver these activities consistently to a large number of individuals in need. Given that eHealth interventions grounded in a well-delineated theoretical model for change are more likely to be effective and that eHealth interventions can be costly to develop, assuring the match of final intervention content and activities to the underlying model is a key step. We propose to apply the concept of "content validity" as a crucial checkpoint to evaluate the extent to which proposed intervention activities in an eHealth intervention program are valid (eg, relevant and likely to be effective) for the specific mechanism of change that each is intended to target and the intended target population for the intervention. The aims of this paper are to define content validity as it applies to model-based eHealth intervention development, to present a feasible method for assessing content validity in this context, and to describe the implementation of this new method during the development of a Web-based intervention for children. We designed a practical 5-step method for assessing content validity in eHealth interventions that includes defining key intervention targets, delineating intervention activity-target pairings, identifying experts and using a survey tool to gather expert ratings of the relevance of each activity to its intended target, its likely effectiveness in achieving the intended target, and its appropriateness with a specific intended audience, and then using quantitative and qualitative results to identify intervention activities that may need modification. We applied this method during our development of the Coping Coach Web-based intervention for school-age children. In the evaluation of Coping Coach content validity, 15 experts from five countries
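    The survey step described above produces a matrix of expert ratings per intervention activity. One common way to summarize such ratings quantitatively (an assumption here, not necessarily the scoring rule used by these authors) is an item-level content validity index: the proportion of experts rating an activity as relevant (e.g., 3 or 4 on a 4-point scale), computed separately for relevance, likely effectiveness, and audience appropriateness. A minimal sketch:

```python
import numpy as np

def content_validity_index(ratings, cutoff=3):
    """Item-level CVI: proportion of experts rating an item >= cutoff.

    ratings: 2D array, rows = items (intervention activities),
             columns = experts, values on a 1-4 relevance scale.
    Items with a low CVI are candidates for modification, mirroring
    the final step of the method described above.
    """
    ratings = np.asarray(ratings)
    return (ratings >= cutoff).mean(axis=1)

# Hypothetical ratings from 15 experts for 3 intervention activities
rng = np.random.default_rng(0)
ratings = rng.integers(1, 5, size=(3, 15))
print(content_validity_index(ratings))
```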

  20. Assessment of Advanced Life Support competence when combining different test methods--reliability and validity

    DEFF Research Database (Denmark)

    Ringsted, C; Lippert, F; Hesselfeldt, R

    2007-01-01

    Cardiac Arrest Simulation Test (CASTest) scenarios for the assessments according to guidelines 2005. AIMS: To analyse the reliability and validity of the individual sub-tests provided by ERC and to find a combination of MCQ and CASTest that provides a reliable and valid single effect measure of ALS...... that possessed high reliability, equality of test sets, and ability to discriminate between the two groups of supposedly different ALS competence. CONCLUSIONS: ERC sub-tests of ALS competence possess sufficient reliability and validity. A combined ALS score with equal weighting of one MCQ and one CASTest can...... competence. METHODS: Two groups of participants were included in this randomised, controlled experimental study: a group of newly graduated doctors, who had not taken the ALS course (N=17) and a group of students, who had passed the ALS course 9 months before the study (N=16). Reliability in terms of inter...

  1. Method for appraising model validity of randomised controlled trials of homeopathic treatment: multi-rater concordance study

    Science.gov (United States)

    2012-01-01

    Background A method for assessing the model validity of randomised controlled trials of homeopathy is needed. To date, only conventional standards for assessing intrinsic bias (internal validity) of trials have been invoked, with little recognition of the special characteristics of homeopathy. We aimed to identify relevant judgmental domains to use in assessing the model validity of homeopathic treatment (MVHT). We define MVHT as the extent to which a homeopathic intervention and the main measure of its outcome, as implemented in a randomised controlled trial (RCT), reflect 'state-of-the-art' homeopathic practice. Methods Using an iterative process, an international group of experts developed a set of six judgmental domains, with associated descriptive criteria. The domains address: (I) the rationale for the choice of the particular homeopathic intervention; (II) the homeopathic principles reflected in the intervention; (III) the extent of homeopathic practitioner input; (IV) the nature of the main outcome measure; (V) the capability of the main outcome measure to detect change; (VI) the length of follow-up to the endpoint of the study. Six papers reporting RCTs of homeopathy of varying design were randomly selected from the literature. A standard form was used to record each assessor's independent response per domain, using the optional verdicts 'Yes', 'Unclear', 'No'. Concordance among the eight verdicts per domain, across all six papers, was evaluated using the kappa (κ) statistic. Results The six judgmental domains enabled MVHT to be assessed with 'fair' to 'almost perfect' concordance in each case. For the six RCTs examined, the method allowed MVHT to be classified overall as 'acceptable' in three, 'unclear' in two, and 'inadequate' in one. Conclusion Future systematic reviews of RCTs in homeopathy should adopt the MVHT method as part of a complete appraisal of trial validity. PMID:22510227
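    The concordance statistic referred to here can be reproduced with standard tools: for multiple raters assigning categorical verdicts ('Yes', 'Unclear', 'No') per domain, Fleiss' kappa is the usual choice. The sketch below assumes a small synthetic ratings matrix rather than the study's actual data and uses the statsmodels implementation.

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Hypothetical verdicts for one judgmental domain:
# rows = papers (subjects), columns = 8 assessors,
# values: 0 = 'No', 1 = 'Unclear', 2 = 'Yes'.
verdicts = np.array([
    [2, 2, 2, 2, 1, 2, 2, 2],
    [0, 0, 1, 0, 0, 0, 1, 0],
    [2, 1, 2, 2, 2, 1, 2, 2],
    [1, 1, 1, 0, 1, 1, 2, 1],
    [2, 2, 2, 2, 2, 2, 2, 2],
    [0, 1, 0, 0, 0, 0, 0, 1],
])

# aggregate_raters converts rater-level codes to subject x category counts
counts, _ = aggregate_raters(verdicts)
print(f"Fleiss' kappa = {fleiss_kappa(counts):.3f}")
```

    Benchmarks such as those of Landis and Koch are then used to map kappa ranges onto verbal descriptors like 'fair' or 'almost perfect', as reported in the record above.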

  2. A simple HPLC method for the determination of halcinonide in lipid nanoparticles: development, validation, encapsulation efficiency, and in vitro drug permeation

    Directory of Open Access Journals (Sweden)

    Clarissa Elize Lopes

    2017-06-01

    Full Text Available ABSTRACT Halcinonide is a high-potency topical glucocorticoid used for skin inflammation treatments that presents toxic systemic effects. A simple and quick analytical method to quantify the amount of halcinonide encapsulated into lipid nanoparticles, such as polymeric lipid-core nanoparticles and solid lipid nanoparticles, was developed and validated regarding the drug's encapsulation efficiency and in vitro permeation. The development and validation of the analytical method were carried out using the high performance liquid chromatography with the UV detection at 239 nm. The validation parameters were specificity, linearity, precision and accuracy, limits of detection and quantitation, and robustness. The method presented an isocratic flow rate of 1.0 mL.min-1, a mobile phase methanol:water (85:15 v/v, and a retention time of 4.21 min. The method was validated according to international and national regulations. The halcinonide encapsulation efficiency in nanoparticles was greater than 99% and the in vitro drug permeation study showed that less than 9% of the drug permeated through the membrane, indicating a nanoparticle reservoir effect, which can reduce the halcinonide's toxic systemic effects. These studies demonstrated the applicability of the developed and validated analytical method to quantify halcinonide in lipid nanoparticles.

  3. Experimental validation of calculated atomic charges in ionic liquids

    Science.gov (United States)

    Fogarty, Richard M.; Matthews, Richard P.; Ashworth, Claire R.; Brandt-Talbot, Agnieszka; Palgrave, Robert G.; Bourne, Richard A.; Vander Hoogerstraete, Tom; Hunt, Patricia A.; Lovelock, Kevin R. J.

    2018-05-01

    A combination of X-ray photoelectron spectroscopy and near edge X-ray absorption fine structure spectroscopy has been used to provide an experimental measure of nitrogen atomic charges in nine ionic liquids (ILs). These experimental results are used to validate charges calculated with three computational methods: charges from electrostatic potentials using a grid-based method (ChelpG), natural bond orbital population analysis, and the atoms in molecules approach. By combining these results with those from a previous study on sulfur, we find that ChelpG charges provide the best description of the charge distribution in ILs. However, we find that ChelpG charges can lead to significant conformational dependence and therefore advise that small differences in ChelpG charges (<0.3 e) should be interpreted with care. We use these validated charges to provide physical insight into nitrogen atomic charges for the ILs probed.

  4. Validated Spectrophotometric Methods for Simultaneous Determination of Food Colorants and Sweeteners

    Directory of Open Access Journals (Sweden)

    Fatma Turak

    2013-01-01

    Full Text Available Two simple spectrophotometric methods have been proposed for the simultaneous determination of two colorants (Indigotin and Brilliant Blue) and two sweeteners (Acesulfame-K and Aspartame) in synthetic mixtures and chewing gums without any prior separation or purification. The first method, zero-crossing derivative spectrophotometry (ZCDS), is based on recording the first-derivative curves (for Indigotin, Brilliant Blue, and Acesulfame-K) and the third-derivative curve (for Aspartame) and determining each component using the zero-crossing technique. The other method, ratio derivative spectrophotometry (RDS), depends on the application of ratio spectra of first- and third-derivative spectrophotometry to resolve the interference due to spectral overlapping. Both colorants and sweeteners showed good linearity, with regression coefficients of 0.9992–0.9999. The LOD and LOQ values ranged from 0.05 to 0.33 μg mL−1 and from 0.06 to 0.47 μg mL−1, respectively. The intraday and interday precision tests produced good RSD% values (<0.81%); recoveries ranged from 99.78% to 100.67% for both methods. The accuracy and precision of the methods have been determined, and the methods have been validated by analyzing synthetic mixtures containing colorants and sweeteners. Both methods were applied to the above combination, and satisfactory results were obtained. The results obtained by applying the ZCDS method were statistically compared with those obtained by the RDS method.
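    The zero-crossing technique amounts to differentiating the overlapped spectra and reading the derivative amplitude of the mixture at a wavelength where the interfering component's derivative passes through zero. A minimal numerical sketch (synthetic Gaussian bands, not the published spectra) follows.

```python
import numpy as np

wl = np.linspace(350, 700, 701)                       # wavelength grid (nm)
band = lambda conc, mu, sigma: conc * np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

# Hypothetical overlapping absorption bands (not the published spectra)
analyte = band(1.0, 610, 30)
interferent = band(0.8, 650, 40)
mixture = analyte + interferent

d_mix = np.gradient(mixture, wl)                      # first-derivative spectra
d_int = np.gradient(interferent, wl)

# Zero crossing of the interferent's derivative (its absorption maximum):
# at this wavelength the mixture derivative reflects the analyte alone.
zc_idx = np.where(np.diff(np.sign(d_int)) != 0)[0][0]
print(f"zero crossing near {wl[zc_idx]:.0f} nm, "
      f"mixture derivative {d_mix[zc_idx]:.4f} vs "
      f"analyte derivative {np.gradient(analyte, wl)[zc_idx]:.4f}")
```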

  5. Validation of the Five-Phase Method for Simulating Complex Fenestration Systems with Radiance against Field Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Geisler-Moroder, David [Bartenbach GmbH, Aldrans (Austria); Lee, Eleanor S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ward, Gregory J. [Anyhere Software, Albany, NY (United States)

    2016-08-29

    The Five-Phase Method (5-pm) for simulating complex fenestration systems with Radiance is validated against field measurements. The capability of the method to predict workplane illuminances, vertical sensor illuminances, and glare indices derived from captured and rendered high dynamic range (HDR) images is investigated. To be able to accurately represent the direct sun part of the daylight not only in sensor point simulations, but also in renderings of interior scenes, the 5-pm calculation procedure was extended. The validation shows that the 5-pm is superior to the Three-Phase Method for predicting horizontal and vertical illuminance sensor values as well as glare indices derived from rendered images. Even with input data from global and diffuse horizontal irradiance measurements only, daylight glare probability (DGP) values can be predicted within 10% error of measured values for most situations.

  6. Validation of an improved abnormality insertion method for medical image perception investigations

    Science.gov (United States)

    Madsen, Mark T.; Durst, Gregory R.; Caldwell, Robert T.; Schartz, Kevin M.; Thompson, Brad H.; Berbaum, Kevin S.

    2009-02-01

    The ability to insert abnormalities in clinical tomographic images makes image perception studies with medical images practical. We describe a new insertion technique and its experimental validation that uses complementary image masks to select an abnormality from a library and place it at a desired location. The method was validated using a 4-alternative forced-choice experiment. For each case, four quadrants were simultaneously displayed consisting of 5 consecutive frames of a chest CT with a pulmonary nodule. One quadrant was unaltered, while the other 3 had the nodule from the unaltered quadrant artificially inserted. 26 different sets were generated and repeated with order scrambling for a total of 52 cases. The cases were viewed by radiology staff and residents who ranked each quadrant by realistic appearance. On average, the observers were able to correctly identify the unaltered quadrant in 42% of cases, and identify the unaltered quadrant both times it appeared in 25% of cases. Consensus, defined by a majority of readers, correctly identified the unaltered quadrant in only 29% of 52 cases. For repeats, the consensus observer successfully identified the unaltered quadrant only once. We conclude that the insertion method can be used to reliably place abnormalities in perception experiments.

  7. Validation of a spectrophotometric method to determine ciprofibrate content in tablets

    Directory of Open Access Journals (Sweden)

    Guilherme Nobre Lima do Nascimento

    2011-03-01

    Full Text Available Ciprofibrate is a drug indicated in cases of hypertriglyceridemia and mixed hyperlipidemia, but no monographs are available in official compendia for the analysis of this substance in tablets. The objective of this work was to develop and validate a spectrophotometric method for routine analysis of ciprofibrate in tablets. In this study, commercial and standard ciprofibrate were used, as well as placebo in absolute ethanol, analyzed by UV spectrophotometer. All tests followed the rules of Resolution RE-899, 2003. The results showed that the developed and validated method offers low cost, easy implementation, precision and accuracy, and may be included in the routine of quality control laboratories.

  8. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software in Young People with Down Syndrome.

    Science.gov (United States)

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Rey-Abella, Ferran; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2016-05-01

    People with Down syndrome present skeletal abnormalities in their feet that can be analyzed by commonly used gold standard indices (the Hernández-Corvo index, the Chippaux-Smirak index, the Staheli arch index, and the Clarke angle) based on footprint measurements. The use of Photoshop CS5 software (Adobe Systems Software Ireland Ltd, Dublin, Ireland) to measure footprints has been validated in the general population. The present study aimed to assess the reliability and validity of this footprint assessment technique in the population with Down syndrome. Using optical podography and photography, 44 footprints from 22 patients with Down syndrome (11 men [mean ± SD age, 23.82 ± 3.12 years] and 11 women [mean ± SD age, 24.82 ± 6.81 years]) were recorded in a static bipedal standing position. A blinded observer performed the measurements using a validated manual method three times during the 4-month study, with 2 months between measurements. Test-retest was used to check the reliability of the Photoshop CS5 software measurements. Validity and reliability were obtained by intraclass correlation coefficient (ICC). The reliability test for all of the indices showed very good values for the Photoshop CS5 method (ICC, 0.982-0.995). Validity testing also found no differences between the techniques (ICC, 0.988-0.999). The Photoshop CS5 software method is reliable and valid for the study of footprints in young people with Down syndrome.
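    Reliability and validity here are expressed as intraclass correlation coefficients. As a rough illustration of how such coefficients are obtained from repeated footprint measurements, the sketch below computes a two-way, single-measure ICC (ICC(3,1) in the Shrout-Fleiss nomenclature) from an ANOVA decomposition; the data are made up and the exact ICC model used in the study may differ.

```python
import numpy as np

def icc_3_1(x):
    """ICC(3,1): two-way mixed effects, consistency, single measurement.

    x: 2D array, rows = subjects (footprints), columns = repeated
    measurements of the same index (e.g., Staheli arch index).
    """
    x = np.asarray(x, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()
    ss_total = ((x - grand) ** 2).sum()
    ms_rows = ss_rows / (n - 1)
    ms_error = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)

# Hypothetical arch index measured 3 times on 10 footprints
rng = np.random.default_rng(1)
true_values = rng.uniform(0.4, 0.9, size=(10, 1))
measurements = true_values + rng.normal(0.0, 0.01, size=(10, 3))
print(f"ICC(3,1) = {icc_3_1(measurements):.3f}")
```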

  9. Contribution to the validation of thermal ratchetting prevision methods in metallic structures

    International Nuclear Information System (INIS)

    Rakotovelo, A.M.

    1998-03-01

    This work concerns the steady-state assessment of metallic structures subjected to thermomechanical cyclic loadings in a biaxial stress state. The effect of short-time mechanical overloads is also investigated. The first chapter is devoted to a bibliographic review of the behaviour of materials and structures in cyclic plasticity; works relating to both the experimental and numerical aspects of the steady-state assessment of such structures are presented. The experimental part of the study is presented in the second chapter. The experimental device was built to apply tension and torsion forces combined with cyclic thermal loading. A series of tests was then carried out, some of which included overloads in tension or torsion. The last chapter describes the numerical calculations using different models (linear isotropic hardening, linear kinematic hardening and the elasto-viscoplastic Chaboche model) and the application of some simplified methods for the ratchetting assessment of structures. Two categories of methods were considered: the first is based on an elastic analysis (Bree's diagram, 3 Sm rule, efficiency rule) and the second combines elastic analysis with an elastoplastic analysis of the first cycle (Gatt's and Taleb's methods). The results of this study made it possible: to validate, in the biaxial stress state, an expression which takes into account the effect of short-time mechanical overloads; to test the ability of the considered models to describe the evolution of the structure during the first cycle and to take into account the effect of short-time overloads, the elastoplastic Chaboche model appearing to be the most accurate in describing the structure's behaviour during the first cycles; and to validate some simplified methods. Certain methods based only on elastic analysis (Bree's diagram and efficiency rule) seem not suitable for the considered kind of

  10. Development and validation of an automated, microscopy-based method for enumeration of groups of intestinal bacteria

    NARCIS (Netherlands)

    Jansen, GJ; Wildeboer-Veloo, ACM; Tonk, RHJ; Franks, AH; Welling, G

    An automated microscopy-based method using fluorescently labelled 16S rRNA-targeted oligonucleotide probes directed against the predominant groups of intestinal bacteria was developed and validated. The method makes use of the Leica 600HR image analysis system, a Kodak MegaPlus camera model 1.4 and

  11. Risk-based criteria to support validation of detection methods for drinking water and air.

    Energy Technology Data Exchange (ETDEWEB)

    MacDonell, M.; Bhattacharyya, M.; Finster, M.; Williams, M.; Picel, K.; Chang, Y.-S.; Peterson, J.; Adeshina, F.; Sonich-Mullin, C.; Environmental Science Division; EPA

    2009-02-18

    This report was prepared to support the validation of analytical methods for threat contaminants under the U.S. Environmental Protection Agency (EPA) National Homeland Security Research Center (NHSRC) program. It is designed to serve as a resource for certain applications of benchmark and fate information for homeland security threat contaminants. The report identifies risk-based criteria from existing health benchmarks for drinking water and air for potential use as validation targets. The focus is on benchmarks for chronic public exposures. The priority sources are standard EPA concentration limits for drinking water and air, along with oral and inhalation toxicity values. Many contaminants identified as homeland security threats to drinking water or air would convert to other chemicals within minutes to hours of being released. For this reason, a fate analysis has been performed to identify potential transformation products and removal half-lives in air and water so appropriate forms can be targeted for detection over time. The risk-based criteria presented in this report to frame method validation are expected to be lower than actual operational targets based on realistic exposures following a release. Note that many target criteria provided in this report are taken from available benchmarks without assessing the underlying toxicological details. That is, although the relevance of the chemical form and analogues are evaluated, the toxicological interpretations and extrapolations conducted by the authoring organizations are not. It is also important to emphasize that such targets in the current analysis are not health-based advisory levels to guide homeland security responses. This integrated evaluation of chronic public benchmarks and contaminant fate has identified more than 200 risk-based criteria as method validation targets across numerous contaminants and fate products in drinking water and air combined. The gap in directly applicable values is

  12. Validation of newly developed and redesigned key indicator methods for assessment of different working conditions with physical workloads based on mixed-methods design: a study protocol.

    Science.gov (United States)

    Klussmann, Andre; Liebers, Falk; Brandstädt, Felix; Schust, Marianne; Serafin, Patrick; Schäfer, Andreas; Gebhardt, Hansjürgen; Hartmann, Bernd; Steinberg, Ulf

    2017-08-21

    The impact of work-related musculoskeletal disorders is considerable. The assessment of work tasks with physical workloads is crucial to estimate the work-related health risks of exposed employees. Three key indicator methods are available for risk assessment regarding manual lifting, holding and carrying of loads; manual pulling and pushing of loads; and manual handling operations. Three further KIMs for risk assessment regarding whole-body forces, awkward body postures and body movement have been developed de novo. In addition, the development of a newly drafted combined method for mixed exposures is planned. All methods will be validated regarding face validity, reliability, convergent validity, criterion validity and further aspects of utility under practical conditions. As part of the joint project MEGAPHYS (multilevel risk assessment of physical workloads), a mixed-methods study is being designed for the validation of KIMs and conducted in companies of different sizes and branches in Germany. Workplaces are documented and analysed by observations, applying KIMs, interviews and assessment of environmental conditions. Furthermore, a survey among the employees at the respective workplaces takes place with standardised questionnaires, interviews and physical examinations. It is intended to include 1200 employees at 120 different workplaces. For analysis of the quality criteria, recommendations of the COSMIN checklist (COnsensus-based Standards for the selection of health Measurement INstruments) will be taken into account. The study was planned and conducted in accordance with the German Medical Professional Code and the Declaration of Helsinki as well as the German Federal Data Protection Act. The design of the study was approved by ethics committees. We intend to publish the validated KIMs in 2018. Results will be published in peer-reviewed journals, presented at international meetings and disseminated to actual users for practical application. © Article

  13. Software phantom with realistic speckle modeling for validation of image analysis methods in echocardiography

    Science.gov (United States)

    Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten

    2014-03-01

    Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, make the majority of standard methods from image analysis non-optimal. Furthermore, validation of adapted computer vision methods proves to be difficult due to missing ground truth information. There is no widely accepted software phantom in the community and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy regarding the resulting textures.
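    A common way to obtain such speckle patterns (an illustration, not necessarily the specific simulation pipeline proposed in this record) is to treat speckle as multiplicative noise on a tissue reflectivity map, for example by drawing Rayleigh-distributed factors whose scale differs per tissue class. A minimal numpy sketch:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical anatomical label map: 0 = background, 1 = muscle, 2 = fat
labels = np.zeros((128, 128), dtype=int)
labels[32:96, 32:96] = 1
labels[48:80, 48:80] = 2

# Mean echogenicity and Rayleigh scale per tissue class (made-up values)
echogenicity = np.array([0.05, 0.5, 0.8])[labels]
rayleigh_scale = np.array([0.3, 0.8, 1.2])[labels]

# Multiplicative speckle: reflectivity times a Rayleigh-distributed field,
# followed by log compression as in B-mode display
speckle = rng.rayleigh(scale=rayleigh_scale)
b_mode = np.log1p(echogenicity * speckle)

print(b_mode.shape, float(b_mode.mean()), float(b_mode.std()))
```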

  14. Summary of Validation of Multi-Pesticide Methods for Various Pesticide Formulations

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    The validation of multi-pesticide methods applicable for various types of pesticide formulations is treated. In a worked-out practical example, i.e. lambda cyhalothrin, the theoretical considerations outlined in the General Guidance section are put into practice. GC conditions, selection of an internal standard and criteria for an acceptable repeatability of injections are outlined, followed by sample preparation, calibration, batch analysis and confirmation of results through comparison using different separation columns. Complete sets of data are displayed in tabular form for other pesticide active ingredients and real formulations. (author)

  15. Method for Determining Volumetric Efficiency and Its Experimental Validation

    Directory of Open Access Journals (Sweden)

    Ambrozik Andrzej

    2017-12-01

    Full Text Available Modern means of transport are basically powered by piston internal combustion engines. Increasingly rigorous demands are placed on IC engines in order to minimise the detrimental impact they have on the natural environment. That stimulates the development of research on piston internal combustion engines. The research involves experimental and theoretical investigations carried out using computer technologies. While being filled, the cylinder is considered to be an open thermodynamic system, in which non-stationary processes occur. To make calculations of thermodynamic parameters of the engine operating cycle, based on the comparison of cycles, it is necessary to know the mean constant value of cylinder pressure throughout this process. Because of the character of in-cylinder pressure pattern and difficulties in pressure experimental determination, in the present paper, a novel method for the determination of this quantity was presented. In the new approach, the iteration method was used. In the method developed for determining the volumetric efficiency, the following equations were employed: the law of conservation of the amount of substance, the first law of thermodynamics for open system, dependences for changes in the cylinder volume vs. the crankshaft rotation angle, and the state equation. The results of calculations performed with this method were validated by means of experimental investigations carried out for a selected engine at the engine test bench. A satisfactory congruence of computational and experimental results as regards determining the volumetric efficiency was obtained. The method for determining the volumetric efficiency presented in the paper can be used to investigate the processes taking place in the cylinder of an IC engine.

  16. Comparison of the Effects of Cross-validation Methods on Determining Performances of Classifiers Used in Diagnosing Congestive Heart Failure

    Directory of Open Access Journals (Sweden)

    Isler Yalcin

    2015-08-01

    Full Text Available Congestive heart failure (CHF) occurs when the heart is unable to provide sufficient pump action to maintain blood flow to meet the needs of the body. Early diagnosis is important since the mortality rate of the patients with CHF is very high. There are different validation methods to measure performances of classifier algorithms designed for this purpose. In this study, k-fold and leave-one-out cross-validation methods were tested for performance measures of five distinct classifiers in the diagnosis of the patients with CHF. Each algorithm was run 100 times and the average and the standard deviation of classifier performances were recorded. As a result, it was observed that average performance was enhanced and the variability of performances was decreased when the number of data sections used in the cross-validation method was increased.
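    The comparison described above is easy to reproduce in outline: run the same classifier under k-fold cross-validation with increasing k (leave-one-out being the limiting case) and record the mean and spread of the resulting accuracy estimates over repeated runs. The sketch below uses scikit-learn on a synthetic data set, not the CHF data from the study.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import KFold, LeaveOneOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=120, n_features=10, random_state=0)
clf = KNeighborsClassifier(n_neighbors=5)

# Repeat k-fold CV with different shuffles and look at the spread of the
# accuracy estimate; leave-one-out is the deterministic limiting case.
for k in (2, 5, 10):
    estimates = [cross_val_score(clf, X, y,
                                 cv=KFold(k, shuffle=True, random_state=r)).mean()
                 for r in range(100)]
    print(f"{k:>2}-fold: mean {np.mean(estimates):.3f}, sd {np.std(estimates):.4f}")

loo_mean = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print(f"LOO   : mean {loo_mean:.3f}")
```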

  17. Interinstrument comparison of remote-sensing devices and a new method for calculating on-road nitrogen oxides emissions and validation of vehicle-specific power.

    Science.gov (United States)

    Rushton, Christopher E; Tate, James E; Shepherd, Simon P; Carslaw, David C

    2018-02-01

    measurements made by the RSD4600 was constructed, validated, and shown to be more accurate than previous methods. Synchronized remote-sensing measurements of NO were taken using two different remote-sensing devices in an off-road study. It was found that the measurements taken by both instruments were well correlated. Fractional NO2 measurements from a prior study, measurable on only one device, were used to create new NOx emission factors for the device that could not be measured by the second device. These estimates were validated against direct measurement of total NOx emission factors and shown to be an improvement on previous methodologies. Validation of vehicle-specific power was performed with good correlation observed.

  18. [Method for evaluating the competence of specialists--the validation of 360-degree-questionnaire].

    Science.gov (United States)

    Nørgaard, Kirsten; Pedersen, Juri; Ravn, Lisbeth; Albrecht-Beste, Elisabeth; Holck, Kim; Fredløv, Maj; Møller, Lars Krag

    2010-04-19

    Assessment of physicians' performance focuses on the quality of their work. The aim of this study was to develop a valid, usable and acceptable multisource feedback assessment tool (MFAT) for hospital consultants. Statements were produced on consultant competencies within non-medical areas like collaboration, professionalism, communication, health promotion, academics and administration. The statements were validated by physicians and later by non-physician professionals after adjustments had been made. In a pilot test, a group of consultants was assessed using the final collection of statements of the MFAT. They received a report with their personal results and subsequently evaluated the assessment method. In total, 66 statements were developed and after validation they were reduced and reformulated to 35. Mean scores for relevance and "easy to understand" of the statements were in the range between "very high degree" and "high degree". In the pilot test, 18 consultants were assessed by themselves, by 141 other physicians and by 125 other professionals in the hospital. About two thirds greatly benefited of the assessment report and half identified areas for personal development. About a third did not want the head of their department to know the assessment results directly; however, two thirds found a potential value in discussing the results with the head. We developed an MFAT for consultants with relevant and understandable statements. A pilot test confirmed that most of the consultants gained from the assessment, but some did not like to share their results with their heads. For these specialists other methods should be used.

  19. Cross-validation and Peeling Strategies for Survival Bump Hunting using Recursive Peeling Methods

    Science.gov (United States)

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil

    2015-01-01

    We introduce a framework to build a survival/risk bump hunting model with a censored time-to-event response. Our Survival Bump Hunting (SBH) method is based on a recursive peeling procedure that uses a specific survival peeling criterion derived from non/semi-parametric statistics such as the hazards-ratio, the log-rank test or the Nelson--Aalen estimator. To optimize the tuning parameter of the model and validate it, we introduce an objective function based on survival or prediction-error statistics, such as the log-rank test and the concordance error rate. We also describe two alternative cross-validation techniques adapted to the joint task of decision-rule making by recursive peeling and survival estimation. Numerical analyses show the importance of replicated cross-validation and the differences between criteria and techniques in both low and high-dimensional settings. Although several non-parametric survival models exist, none addresses the problem of directly identifying local extrema. We show how SBH efficiently estimates extreme survival/risk subgroups unlike other models. This provides an insight into the behavior of commonly used models and suggests alternatives to be adopted in practice. Finally, our SBH framework was applied to a clinical dataset. In it, we identified subsets of patients characterized by clinical and demographic covariates with a distinct extreme survival outcome, for which tailored medical interventions could be made. An R package PRIMsrc (Patient Rule Induction Method in Survival, Regression and Classification settings) is available on CRAN (Comprehensive R Archive Network) and GitHub. PMID:27034730

  20. Validation of the actuator line method using near wake measurements of the MEXICO rotor

    DEFF Research Database (Denmark)

    Nilsson, Karl; Shen, Wen Zhong; Sørensen, Jens Nørkær

    2015-01-01

    The purpose of the present work is to validate the capability of the actuator line method to compute vortex structures in the near wake behind the MEXICO experimental wind turbine rotor. In the MEXICO project/MexNext Annex, particle image velocimetry measurements have made it possible to determine...

  1. A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation

    NARCIS (Netherlands)

    Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf

    2017-01-01

    This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes’ inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of
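    In the likelihood-ratio framework that this guideline addresses, validation is typically reported with performance metrics computed on sets of same-source and different-source comparisons; the log-likelihood-ratio cost (Cllr) is one commonly used example (mentioned here as background, not as a claim about the guideline's full protocol). A minimal computation:

```python
import numpy as np

def cllr(lr_same_source, lr_diff_source):
    """Log-likelihood-ratio cost (Cllr) for a set of validation comparisons.

    lr_same_source: likelihood ratios computed for same-source pairs
    lr_diff_source: likelihood ratios computed for different-source pairs
    Lower is better; a system that always outputs LR = 1 scores Cllr = 1.
    """
    lr_ss = np.asarray(lr_same_source, dtype=float)
    lr_ds = np.asarray(lr_diff_source, dtype=float)
    return 0.5 * (np.mean(np.log2(1.0 + 1.0 / lr_ss)) +
                  np.mean(np.log2(1.0 + lr_ds)))

# Hypothetical validation likelihood ratios
print(f"Cllr = {cllr([8.0, 30.0, 2.5, 120.0], [0.1, 0.02, 0.6, 0.005]):.3f}")
```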

  2. The Bland-Altman Method Should Not Be Used in Regression Cross-Validation Studies

    Science.gov (United States)

    O'Connor, Daniel P.; Mahar, Matthew T.; Laughlin, Mitzi S.; Jackson, Andrew S.

    2011-01-01

    The purpose of this study was to demonstrate the bias in the Bland-Altman (BA) limits of agreement method when it is used to validate regression models. Data from 1,158 men were used to develop three regression equations to estimate maximum oxygen uptake (R² = 0.40, 0.61, and 0.82, respectively). The equations were evaluated in a…
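    For context, the Bland-Altman approach summarizes agreement between two measurement methods as the mean difference plus or minus 1.96 standard deviations of the differences; the criticism in this record concerns applying that summary to predictions from a regression model. A short sketch of the basic calculation on synthetic values:

```python
import numpy as np

rng = np.random.default_rng(3)
measured = rng.normal(45.0, 8.0, size=200)               # e.g., measured VO2max
predicted = measured + rng.normal(0.0, 4.0, size=200)    # hypothetical regression estimates

diff = predicted - measured
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"bias = {bias:.2f}, limits of agreement = [{bias - loa:.2f}, {bias + loa:.2f}]")
```

    Because regression predictions have smaller variance than the criterion measurements, the differences tend to correlate with the magnitude of the measurement, which is the kind of artifact the authors demonstrate.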

  3. Development and validation of HPLC analytical method for quantitative determination of metronidazole in human plasma

    International Nuclear Information System (INIS)

    Safdar, K.A.; Shyum, S.B.; Usman, S.

    2016-01-01

    The objective of the present study was to develop a simple, rapid and sensitive reversed-phase high performance liquid chromatographic (RP-HPLC) analytical method with a UV detection system for the quantitative determination of metronidazole in human plasma. The chromatographic separation was performed using a C18 RP column (250 mm X 4.6 mm, 5 μm) as stationary phase and 0.01M potassium dihydrogen phosphate buffered at pH 3.0 and acetonitrile (83:17, v/v) as mobile phase at a flow rate of 1.0 ml/min. The UV detection was carried out at 320 nm. The method was validated as per the US FDA guideline for bioanalytical method validation and was found to be selective without interferences from mobile phase components, impurities and biological matrix. The method was found to be linear over the concentration range of 0.2812 μg/ml to 18.0 μg/ml (r2 = 0.9987) with an adequate level of accuracy and precision. The samples were found to be stable under various recommended laboratory and storage conditions. Therefore, the method can be used with an adequate level of confidence and assurance for bioavailability, bioequivalence and other pharmacokinetic studies of metronidazole in humans. (author)
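    Linearity figures such as the r2 reported here come from fitting a straight-line calibration to spiked-plasma standards and checking the fit and back-calculated accuracy at each level. A generic sketch of that step (synthetic data, not the study's calibration):

```python
import numpy as np

# Hypothetical calibration standards (concentration in ug/mL vs peak area)
conc = np.array([0.28, 0.56, 1.1, 2.3, 4.5, 9.0, 18.0])
rng = np.random.default_rng(7)
area = 1200.0 * conc + 50.0 + rng.normal(0.0, 150.0, conc.size)

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1.0 - ((area - pred) ** 2).sum() / ((area - area.mean()) ** 2).sum()

back_calc = (area - intercept) / slope            # back-calculated concentrations
accuracy = 100.0 * back_calc / conc               # % of nominal at each level
print(f"r^2 = {r2:.4f}")
print("accuracy (% nominal):", np.round(accuracy, 1))
```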

  4. Validation and application of an high-order spectral difference method for flow induced noise simulation

    KAUST Repository

    Parsani, Matteo; Ghorbaniasl, Ghader; Lacor, C.

    2011-01-01

    The method is based on the Ffowcs Williams-Hawkings approach, which provides noise contributions for monopole, dipole and quadrupole acoustic sources. This paper will focus on the validation and assessment of this hybrid approach using different test cases

  5. Toward feasible, valid, and reliable video-based assessments of technical surgical skills in the operating room

    DEFF Research Database (Denmark)

    Aggarwal, R.; Grantcharov, T.; Moorthy, K.

    2008-01-01

    .72). Conclusions: Video-based technical skills evaluation in the operating room is feasible, valid and reliable. Global rating scales hold promise for summative assessment, though further work is necessary to elucidate the value of procedural rating scales Publication date: 2008/2......Objective: To determine the feasibility, validity, inter-rater, and intertest reliability of 4 previously published video-based rating scales, for technical skills assessment on a benchmark laparoscopic procedure. Summary Background Data: Assessment of technical skills is crucial...... to the demonstration and maintenance of competent healthcare practitioners. Traditional assessment methods are prone to subjectivity through a lack of proven validity and reliability. Methods: Nineteen surgeons (6 novice and 13 experienced) performed a median of 2 laparoscopic cholecystectomies each (range 1-5) on 53...

  6. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods

    Science.gov (United States)

    Dozier, Samantha; Brown, Jeffrey; Currie, Alistair

    2011-01-01

    Simple Summary Many vaccines are tested for quality in experiments that require the use of large numbers of animals in procedures that often cause significant pain and distress. Newer technologies have fostered the development of vaccine quality control tests that reduce or eliminate the use of animals, but the availability of these newer methods has not guaranteed their acceptance by regulators or use by manufacturers. We discuss a strategic approach that has been used to assess and ultimately increase the use of non-animal vaccine quality tests in the U.S. and U.K. Abstract In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches. PMID:26486625

  7. Validity of observer ratings of the five-factor model of personality traits: a meta-analysis.

    Science.gov (United States)

    Oh, In-Sue; Wang, Gang; Mount, Michael K

    2011-07-01

    Conclusions reached in previous research about the magnitude and nature of personality-performance linkages have been based almost exclusively on self-report measures of personality. The purpose of this study is to address this void in the literature by conducting a meta-analysis of the relationship between observer ratings of the five-factor model (FFM) personality traits and overall job performance. Our results show that the operational validities of FFM traits based on observer ratings are higher than those based on self-report ratings. In addition, the results show that when based on observer ratings, all FFM traits are significant predictors of overall performance. Further, observer ratings of FFM traits show meaningful incremental validity over self-reports of corresponding FFM traits in predicting overall performance, but the reverse is not true. We conclude that the validity of FFM traits in predicting overall performance is higher than previously believed, and our results underscore the importance of disentangling the validity of personality traits from the method of measurement of the traits.

  8. Validation of uncertainty of weighing in the preparation of radionuclide standards by Monte Carlo Method

    International Nuclear Information System (INIS)

    Cacais, F.L.; Delgado, J.U.; Loayza, V.M.

    2016-01-01

    In preparing solutions for the production of radionuclide metrology standards, it is necessary to measure the quantity activity by mass. The gravimetric method by elimination is applied to perform weighing with smaller uncertainties. In this work, the uncertainty calculation approach implemented by Lourenco and Bobin according to the ISO GUM for the method by elimination is validated by the Monte Carlo method. The results obtained by both uncertainty calculation methods were consistent, indicating that the conditions for the application of the ISO GUM in the preparation of radioactive standards were fulfilled. (author)
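    The comparison referred to here, between the GUM law of propagation of uncertainty and a Monte Carlo evaluation, can be illustrated on a toy elimination-weighing model in which the dispensed mass is the difference of two balance readings, each with its own standard uncertainty (the values below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000

# Hypothetical balance readings (g) and standard uncertainties
r_before, u_before = 25.43210, 0.00005
r_after, u_after = 24.21897, 0.00005

# Monte Carlo propagation: sample the inputs, propagate through the model
mass = rng.normal(r_before, u_before, N) - rng.normal(r_after, u_after, N)
mc_mean, mc_std = mass.mean(), mass.std(ddof=1)

# GUM law of propagation for m = r_before - r_after (sensitivity coefficients = 1)
gum_u = np.hypot(u_before, u_after)

print(f"Monte Carlo: m = {mc_mean:.5f} g, u = {mc_std:.6f} g")
print(f"GUM        : m = {r_before - r_after:.5f} g, u = {gum_u:.6f} g")
```

    Agreement between the two uncertainty values, as reported in the record above, indicates that the linearization assumptions behind the GUM approach hold for this model.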

  9. Fuzzy decision-making: a new method in model selection via various validity criteria

    International Nuclear Information System (INIS)

    Shakouri Ganjavi, H.; Nikravesh, K.

    2001-01-01

    Modeling is considered the first step in scientific investigations. Several alternative models may be candidates to express a phenomenon. Scientists use various criteria to select one model among the competing models. Based on the solution of a fuzzy decision-making problem, this paper proposes a new method for model selection. The method enables the scientist to apply all desired validity criteria systematically by defining a proper possibility distribution function for each criterion. Finally, minimization of a utility function composed of the possibility distribution functions determines the best selection. The method is illustrated through a modeling example for the Average Daily Time Duration of Electrical Energy Consumption in Iran.
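    As an illustration of the general idea (not the paper's specific formulation), one can map each validity criterion of each candidate model onto a membership value in [0, 1], aggregate the memberships, and select the model with the best aggregate score:

```python
import numpy as np

# Hypothetical candidate models scored on two validity criteria:
# goodness of fit (higher is better) and complexity (lower is better).
models = {"M1": {"fit": 0.92, "complexity": 12},
          "M2": {"fit": 0.95, "complexity": 30},
          "M3": {"fit": 0.88, "complexity": 6}}

def mu_fit(r2):          # membership rises with fit above 0.8
    return float(np.clip((r2 - 0.8) / 0.2, 0.0, 1.0))

def mu_simplicity(k):    # membership falls as the parameter count grows
    return float(np.clip(1.0 - k / 40.0, 0.0, 1.0))

# Aggregate with the minimum operator (a common fuzzy "and") and pick the best
scores = {name: min(mu_fit(m["fit"]), mu_simplicity(m["complexity"]))
          for name, m in models.items()}
best = max(scores, key=scores.get)
print(scores, "->", best)
```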

  10. Development and validation of a stability-indicating capillary zone electrophoretic method for the assessment of entecavir and its correlation with liquid chromatographic methods.

    Science.gov (United States)

    Dalmora, Sergio Luiz; Nogueira, Daniele Rubert; D'Avila, Felipe Bianchini; Souto, Ricardo Bizogne; Leal, Diogo Paim

    2011-01-01

    A stability-indicating capillary zone electrophoresis (CZE) method was validated for the analysis of entecavir in pharmaceutical formulations, using nimesulide as an internal standard. A fused-silica capillary (50 µm i.d.; effective length, 40 cm) was used while being maintained at 25°C; the applied voltage was 25 kV. A background electrolyte solution consisted of a 20 mM sodium tetraborate solution at pH 10. Injections were performed using a pressure mode at 50 mbar for 5 s, with detection at 216 nm. The specificity and stability-indicating capability were proven through forced degradation studies, evaluating also the in vitro cytotoxicity test of the degraded products. The method was linear over the concentration range of 1-200 µg mL(-1) (r(2) = 0.9999), and was applied for the analysis of entecavir in tablet dosage forms. The results were correlated to those of validated conventional and fast LC methods, showing non-significant differences (p > 0.05).

  11. A Validated Stability-Indicating HPLC Method for Simultaneous Determination of Amoxicillin and Enrofloxacin Combination in an Injectable Suspension

    Directory of Open Access Journals (Sweden)

    Nidal Batrawi

    2017-02-01

    Full Text Available The combination of amoxicillin and enrofloxacin is a well-known mixture of veterinary drugs; it is used for the treatment of Gram-positive and Gram-negative bacteria. In the scientific literature, there is no high-performance liquid chromatography (HPLC-UV) method for the simultaneous determination of this combination. The objective of this work is to develop and validate an HPLC method for the determination of this combination. In this regard, a new, simple and efficient reversed-phase HPLC method for simultaneous qualitative and quantitative determination of amoxicillin and enrofloxacin, in an injectable preparation with a mixture of inactive excipients, has been developed and validated. The HPLC separation method was performed using a reversed-phase (RP) C18e (250 mm × 4.0 mm, 5 μm) column at room temperature, with a gradient mobile phase of acetonitrile and phosphate buffer containing methanol at pH 5.0, a flow rate of 0.8 mL/min and ultraviolet detection at 267 nm. This method was validated in accordance with the Food and Drug Administration (FDA) and the International Conference on Harmonisation (ICH) guidelines and showed excellent linearity, accuracy, precision, specificity, robustness, ruggedness, and system suitability results within the acceptance criteria. A stability-indicating study was also carried out and indicated that this method can also be used for purity and degradation evaluation of these formulations.

  12. Rapid detection of Salmonella in meat: Comparative and collaborative validation of a non-complex and cost effective pre-PCR protocol

    DEFF Research Database (Denmark)

    Löfström, Charlotta; Hansen, F.; Mansdal, S.

    2011-01-01

    samples using a real-time PCR method. The protocol included incubation in buffered peptone water, centrifugation of an aliquot and a boiling procedure. The validation study included comparative and collaborative trials recommended by the Nordic Organization for Validation of Alternative Methods (NordVal......). The comparative trial was performed against a culture based reference method (NMKL187, 2007) and a previously NordVal approved PCR method with a semi-automated magnetic bead-based DNA extraction step using 122 artificially contaminated samples. The limit of detection (LOD50) was found to be 3.0, 3.2 and 3.4 CFU...

  13. Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin.

    Science.gov (United States)

    Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R

    2015-11-01

    The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a "complete mystical experience" that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. © The Author(s) 2015.

  14. Experimental validation of a method characterizing bow tie filters in CT scanners using a real-time dose probe

    International Nuclear Information System (INIS)

    McKenney, Sarah E.; Nosratieh, Anita; Gelskey, Dale; Yang Kai; Huang Shinying; Chen Lin; Boone, John M.

    2011-01-01

    Purpose: Beam-shaping or ''bow tie'' (BT) filters are used to spatially modulate the x-ray beam in a CT scanner, but the conventional method of step-and-shoot measurement to characterize a beam's profile is tedious and time-consuming. The theory for characterization of bow tie relative attenuation (COBRA) method, which relies on a real-time dosimeter to address the issues of conventional measurement techniques, was previously demonstrated using computer simulations. In this study, the feasibility of the COBRA theory is further validated experimentally through the employment of a prototype real-time radiation meter and a known BT filter. Methods: The COBRA method consisted of four basic steps: (1) The probe was placed at the edge of a scanner's field of view; (2) a real-time signal train was collected as the scanner's gantry rotated with the x-ray beam on; (3) the signal train, without a BT filter, was modeled using peak values measured in the signal train of step 2; and (4) the relative attenuation of the BT filter was estimated from filtered and unfiltered data sets. The prototype probe was first verified to have an isotropic and linear response to incident x-rays. The COBRA method was then tested on a dedicated breast CT scanner with a custom-designed BT filter and compared to the conventional step-and-shoot characterization of the BT filter. Using basis decomposition of dual energy signal data, the thickness of the filter was estimated and compared to the BT filter's manufacturing specifications. The COBRA method was also demonstrated with a clinical whole body CT scanner using the body BT filter. The relative attenuation was calculated at four discrete x-ray tube potentials and used to estimate the thickness of the BT filter. Results: The prototype probe was found to have a linear and isotropic response to x-rays. The relative attenuation produced from the COBRA method fell within the error of the relative attenuation measured with the step-and-shoot method

  15. Validation of Field Methods to Assess Body Fat Percentage in Elite Youth Soccer Players.

    Science.gov (United States)

    Munguia-Izquierdo, Diego; Suarez-Arrones, Luis; Di Salvo, Valter; Paredes-Hernandez, Victor; Alcazar, Julian; Ara, Ignacio; Kreider, Richard; Mendez-Villanueva, Alberto

    2018-05-01

    This study determined the most effective field method for quantifying body fat percentage in male elite youth soccer players and developed prediction equations based on anthropometric variables. Forty-four male elite-standard youth soccer players aged 16.3-18.0 years underwent body fat percentage assessments, including bioelectrical impedance analysis and the calculation of various skinfold-based prediction equations. Dual X-ray absorptiometry provided a criterion measure of body fat percentage. Correlation coefficients, bias, limits of agreement, and differences were used as validity measures, and regression analyses were used to develop soccer-specific prediction equations. The equations from Sarria et al. (1998) and Durnin & Rahaman (1967) reached very large correlations and the lowest biases, and they reached neither the practically worthwhile difference nor the substantial difference between methods. The new youth soccer-specific skinfold equation included a combination of triceps and supraspinale skinfolds. None of the practical methods compared in this study are adequate for estimating body fat percentage in male elite youth soccer players, except for the equations from Sarria et al. (1998) and Durnin & Rahaman (1967). The new youth soccer-specific equation calculated in this investigation is the only field method specifically developed and validated in elite male players, and it shows potentially good predictive power. © Georg Thieme Verlag KG Stuttgart · New York.
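    Skinfold equations of the kind compared here generally work in two steps: a regression from the logarithm of a skinfold sum to body density, followed by conversion of density to body fat percentage (for example with the Siri equation, %fat = 495/density − 450). The sketch below uses placeholder regression coefficients purely for illustration; they are not the Durnin & Rahaman or Sarria et al. coefficients.

```python
import math

def body_fat_from_skinfolds(skinfold_sum_mm, a=1.1533, b=0.0643):
    """Two-step skinfold estimate of body fat percentage.

    density = a - b * log10(sum of skinfolds)   # a, b are placeholder values
    %fat    = 495 / density - 450               # Siri (1961) conversion
    """
    density = a - b * math.log10(skinfold_sum_mm)
    return 495.0 / density - 450.0

# Example: 34 mm sum of skinfolds with the placeholder coefficients
print(f"{body_fat_from_skinfolds(34.0):.1f} % body fat")
```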

  16. Method validation for determination of heavy metals in wine and slightly alcoholic beverages by ICP-MS

    International Nuclear Information System (INIS)

    Voica, Cezara; Dehelean, Adriana; Pamula, A

    2009-01-01

    The Organisation Internationale de la Vigne et du Vin (OIV) has fixed maximum permitted levels for some heavy metals in wine. Consequently, the need arose to determine very low concentrations of elements that may be present in wine at trace and ultra-trace levels. Inductively coupled plasma mass spectrometry (ICP-MS) is considered an excellent tool for detailed characterization of the elemental composition of many samples, including beverage samples. In this study, a method for the quantitative determination of toxic metals (Cr, As, Cd, Ni, Hg, Pb) in wines and slightly alcoholic beverages by ICP-MS was validated. Several parameters were taken into account and evaluated for the validation of the method, namely: linearity, the minimum detection limit, the limit of quantification, accuracy and uncertainty.

  17. Method validation for determination of heavy metals in wine and slightly alcoholic beverages by ICP-MS

    Energy Technology Data Exchange (ETDEWEB)

    Voica, Cezara; Dehelean, Adriana; Pamula, A, E-mail: cezara.voica@itim-cj.r [National Institute for Research and Development of Isotopic and Molecular Technologies, 65-103 Donath, 400293 Cluj-Napoca (Romania)

    2009-08-01

    The Organisation International de la Vigne et du Vin (OIV) has fixed upper limits for some heavy metals in wine. Consequently, there is a need to determine elements that may be present in wine at trace and ultra-trace concentrations. Inductively coupled plasma mass spectrometry (ICP-MS) is considered an excellent tool for detailed characterization of the elemental composition of many samples, including samples of drinks. In this study a method of quantitative analysis for the determination of toxic metals (Cr, As, Cd, Ni, Hg, Pb) in wines and slightly alcoholic beverages by ICP-MS was validated. Several parameters were taken into account and evaluated for the validation of the method, namely: linearity, the minimum detection limit, the limit of quantification, accuracy and uncertainty.

  18. Validation of a Residual Stress Measurement Method by Swept High-Frequency Eddy Currents

    International Nuclear Information System (INIS)

    Lee, C.; Shen, Y.; Lo, C. C. H.; Nakagawa, N.

    2007-01-01

    This paper reports on a swept high-frequency eddy current (SHFEC) measurement method developed for electromagnetic nondestructive characterization of residual stresses in shot peened aerospace materials. In this approach, we regard shot-peened surfaces as modified surface layers of varying conductivity, and determine the conductivity deviation profile by inversion of the SHFEC data. The SHFEC measurement system consists of a pair of closely matched printed-circuit-board coils driven by a laboratory instrument under software control. This provides improved sensitivity and high-frequency performance compared to conventional coils, so that swept-frequency EC measurements up to 50 MHz can be made to achieve the smallest skin depth of 80 μm for nickel-based superalloys. We devised a conductivity profile inversion procedure based on the laterally uniform multi-layer theory of Cheng, Dodd and Deeds. The main contribution of this paper is the methodology validation. Namely, the forward and inverse models were validated against measurements on artificial layer specimens consisting of metal films with different conductivities placed on a metallic substrate. The inversion determined the film conductivities, which were found to agree with those measured using the direct current potential drop (DCPD) method.

  19. Validation of a Residual Stress Measurement Method by Swept High-Frequency Eddy Currents

    Science.gov (United States)

    Lee, C.; Shen, Y.; Lo, C. C. H.; Nakagawa, N.

    2007-03-01

    This paper reports on a swept high-frequency eddy current (SHFEC) measurement method developed for electromagnetic nondestructive characterization of residual stresses in shot peened aerospace materials. In this approach, we regard shot-peened surfaces as modified surface layers of varying conductivity, and determine the conductivity deviation profile by inversion of the SHFEC data. The SHFEC measurement system consists of a pair of closely matched printed-circuit-board coils driven by a laboratory instrument under software control. This provides improved sensitivity and high-frequency performance compared to conventional coils, so that swept-frequency EC measurements up to 50 MHz can be made to achieve the smallest skin depth of 80 μm for nickel-based superalloys. We devised a conductivity profile inversion procedure based on the laterally uniform multi-layer theory of Cheng, Dodd and Deeds. The main contribution of this paper is the methodology validation. Namely, the forward and inverse models were validated against measurements on artificial layer specimens consisting of metal films with different conductivities placed on a metallic substrate. The inversion determined the film conductivities, which were found to agree with those measured using the direct current potential drop (DCPD) method.
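    The 80 μm skin depth quoted for nickel-based superalloys at 50 MHz follows from the standard eddy-current skin-depth formula; the conductivity and relative permeability used below are assumed, order-of-magnitude values for such an alloy, not figures taken from the paper.

```python
import math

def skin_depth(freq_hz: float, conductivity_s_per_m: float, mu_r: float = 1.0) -> float:
    """Standard eddy-current skin depth delta = 1 / sqrt(pi * f * mu * sigma), in metres."""
    mu0 = 4.0e-7 * math.pi
    return 1.0 / math.sqrt(math.pi * freq_hz * mu0 * mu_r * conductivity_s_per_m)

# Assumed properties of a nickel-based superalloy (roughly 0.8 MS/m, non-magnetic).
delta = skin_depth(freq_hz=50e6, conductivity_s_per_m=0.8e6, mu_r=1.0)
print(f"skin depth at 50 MHz: {delta * 1e6:.0f} um")   # about 80 um, consistent with the abstract
```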

  20. Amphenicols stability in medicated feed – development and validation of liquid chromatography method

    Directory of Open Access Journals (Sweden)

    Pietro Wojciech Jerzy

    2014-12-01

    A liquid chromatography-ultraviolet detection method for the determination of florfenicol (FF) and thiamphenicol (TAP) in feeds is presented. The method comprises the extraction of analytes from the matrix with a mixture of methanol and acetonitrile, drying of the extract, and its dissolution in phosphate buffer. The analysis was performed with a gradient programme of the mobile phase composed of acetonitrile and buffer (pH = 7.3) on a Zorbax Eclipse Plus C18 (150 × 4.6 mm, 5 μm) analytical column with UV (λ = 220 nm) detection. The analytical procedure has been successfully adopted and validated for quantitative determination of florfenicol and thiamphenicol in feed samples. Sensitivity, specificity, linearity, repeatability, and intralaboratory reproducibility were included in the validation. The mean recovery of amphenicols was 93.5% within the working range of 50-4000 mg/kg. Simultaneous determination of chloramphenicol, which is banned in feed, was also included within the same procedure in the FF and TAP stability studies. Storing the medicated feed at room temperature for up to one month decreased the concentration of the investigated drugs by as much as 45%. These findings are relevant to the successful provision of therapy to animals.

  1. Validated modified Lycopodium spore method development for standardisation of ingredients of an ayurvedic powdered formulation Shatavaryadi churna.

    Science.gov (United States)

    Kumar, Puspendra; Jha, Shivesh; Naved, Tanveer

    2013-01-01

    A validated modified Lycopodium spore method has been developed for simple and rapid quantification of herbal powdered drugs. The Lycopodium spore method was performed on ingredients of Shatavaryadi churna, an ayurvedic formulation used as an immunomodulator, galactagogue, aphrodisiac and rejuvenator. Estimation of the diagnostic characters of each ingredient of Shatavaryadi churna was carried out individually. Microscopic determination, counting of the identifying number, and measurement of the area, length and breadth of the identifying characters were performed using a Leica DMLS-2 microscope. The method was validated for intraday precision, linearity, specificity, repeatability, accuracy and system suitability. The method is simple, precise, sensitive and accurate, and can be used for routine standardisation of raw materials of herbal drugs. This method gives the ratio of individual ingredients in the powdered drug so that any adulteration of the genuine drug with its adulterant can be detected. The method shows very good linearity, with values between 0.988 and 0.999 for the number and area of identifying characters. The percentage purity of a sample drug can be determined by using the linear equation of the standard genuine drug.

  2. Raman fiber-optical method for colon cancer detection: Cross-validation and outlier identification approach

    Science.gov (United States)

    Petersen, D.; Naveed, P.; Ragheb, A.; Niedieker, D.; El-Mashtoly, S. F.; Brechmann, T.; Kötting, C.; Schmiegel, W. H.; Freier, E.; Pox, C.; Gerwert, K.

    2017-06-01

    Endoscopy plays a major role in the early recognition of cancers that are not externally accessible, and therewith in increasing the survival rate. Raman spectroscopic fiber-optical approaches can help to decrease the impact on the patient, increase objectivity in tissue characterization, reduce expenses and provide a significant time advantage in endoscopy. In gastroenterology an early recognition of malignant and precursor lesions is relevant. Instantaneous and precise differentiation between adenomas as precursor lesions for cancer and hyperplastic polyps on the one hand, and between high- and low-risk alterations on the other hand, is important. Raman fiber-optical measurements of colon biopsy samples taken during colonoscopy were carried out during a clinical study, and samples of adenocarcinoma (22), tubular adenomas (141), hyperplastic polyps (79) and normal tissue (101) from 151 patients were analyzed. This allows us to focus on the bioinformatic analysis and to set the stage for Raman endoscopic measurements. Since spectral differences between normal and cancerous biopsy samples are small, special care has to be taken in data analysis. Using a leave-one-patient-out cross-validation scheme, three different outlier identification methods were investigated to decrease the influence of systematic errors, like a residual risk of misplacement of the sample and spectral dilution of marker bands (esp. in cancerous tissue), and therewith optimize the experimental design. Furthermore, other validation methods like leave-one-sample-out and leave-one-spectrum-out cross-validation schemes were compared with leave-one-patient-out cross-validation. High-risk lesions were differentiated from low-risk lesions with a sensitivity of 79%, specificity of 74% and an accuracy of 77%, and cancer from normal tissue with a sensitivity of 79%, specificity of 83% and an accuracy of 81%. Additionally applied outlier identification enabled us to improve the recognition of neoplastic biopsy samples.

  3. Raman fiber-optical method for colon cancer detection: Cross-validation and outlier identification approach.

    Science.gov (United States)

    Petersen, D; Naveed, P; Ragheb, A; Niedieker, D; El-Mashtoly, S F; Brechmann, T; Kötting, C; Schmiegel, W H; Freier, E; Pox, C; Gerwert, K

    2017-06-15

    Endoscopy plays a major role in the early recognition of cancers that are not externally accessible, and therewith in increasing the survival rate. Raman spectroscopic fiber-optical approaches can help to decrease the impact on the patient, increase objectivity in tissue characterization, reduce expenses and provide a significant time advantage in endoscopy. In gastroenterology an early recognition of malignant and precursor lesions is relevant. Instantaneous and precise differentiation between adenomas as precursor lesions for cancer and hyperplastic polyps on the one hand, and between high- and low-risk alterations on the other hand, is important. Raman fiber-optical measurements of colon biopsy samples taken during colonoscopy were carried out during a clinical study, and samples of adenocarcinoma (22), tubular adenomas (141), hyperplastic polyps (79) and normal tissue (101) from 151 patients were analyzed. This allows us to focus on the bioinformatic analysis and to set the stage for Raman endoscopic measurements. Since spectral differences between normal and cancerous biopsy samples are small, special care has to be taken in data analysis. Using a leave-one-patient-out cross-validation scheme, three different outlier identification methods were investigated to decrease the influence of systematic errors, like a residual risk of misplacement of the sample and spectral dilution of marker bands (esp. in cancerous tissue), and therewith optimize the experimental design. Furthermore, other validation methods like leave-one-sample-out and leave-one-spectrum-out cross-validation schemes were compared with leave-one-patient-out cross-validation. High-risk lesions were differentiated from low-risk lesions with a sensitivity of 79%, specificity of 74% and an accuracy of 77%, and cancer from normal tissue with a sensitivity of 79%, specificity of 83% and an accuracy of 81%. Additionally applied outlier identification enabled us to improve the recognition of neoplastic biopsy samples.
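    A leave-one-patient-out cross-validation, in which all spectra from one patient are held out in each fold, can be organised as in the sketch below; the classifier, features and labels are placeholders rather than the study's actual spectra or pipeline.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))             # placeholder spectra (200 spectra, 50 features)
y = rng.integers(0, 2, size=200)           # placeholder labels: 0 = low risk, 1 = high risk
patients = rng.integers(0, 40, size=200)   # patient identifier for each spectrum

# Each fold holds out every spectrum belonging to a single patient.
logo = LeaveOneGroupOut()
correct = 0
for train_idx, test_idx in logo.split(X, y, groups=patients):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    correct += (clf.predict(X[test_idx]) == y[test_idx]).sum()

print(f"leave-one-patient-out accuracy: {correct / len(y):.2f}")
```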

  4. Design for validation: An approach to systems validation

    Science.gov (United States)

    Carter, William C.; Dunham, Janet R.; Laprie, Jean-Claude; Williams, Thomas; Howden, William; Smith, Brian; Lewis, Carl M. (Editor)

    1989-01-01

    Every complex system built is validated in some manner. Computer validation begins with review of the system design. As systems became too complicated for one person to review, validation began to rely on the application of ad hoc methods by many individuals. As the cost of changes mounted and the expense of failure increased, more organized procedures became essential. Attempts at devising and carrying out those procedures showed that validation is indeed a difficult technical problem. The successful transformation of the validation process into a systematic series of formally sound, integrated steps is necessary if the liability inherent in future digital-system-based avionic and space systems is to be minimized. A suggested framework and timetable for the transformation are presented. Basic working definitions of two pivotal ideas (validation and the system life-cycle) are provided, and it is shown how the two concepts interact. Many examples are given of past and present validation activities by NASA and others. A conceptual framework is presented for the validation process. Finally, important areas are listed for ongoing development of the validation process at NASA Langley Research Center.

  5. Validation of methods for determination of free water content in poultry meat

    Directory of Open Access Journals (Sweden)

    Jarmila Žítková

    2007-01-01

    Methods for determination of free water content in poultry meat are described in Commission Regulation EEC No 1538/91 as amended and in ČSN 57 3100. Two of them (methods A and D) have been validated in the conditions of a Czech poultry processing plant. The slaughtering capacity was 6000 pieces per hour and carcasses were chilled by air with spraying. All determinations were carried out in the plant's lab and in the lab of the Institute of Food Technology. Method A was used to detect the amount of water lost from frozen chicken during thawing in controlled conditions. Twenty carcasses from each of six weight groups (900 g–1400 g) were tested. The average values of thaw-loss water contents ranged between 0.46% and 1.71%; the average value of all 120 samples was 1.16%. The results were compared with the required maximum limit value of 3.3%. The water loss content was in negative correlation with the weight of the chicken (r = –0.56). Method D (chemical test) has been applied to determine the total water content of certain poultry cuts. It involved the determination of water and protein contents of 62 representative samples in total. The average values of the ratio of water weight to protein weight WA/RPA were 3.29 in breast fillets, 4.06 in legs with a portion of the back, 4.00 in legs, 3.85 in thighs and 4.10 in drumsticks. The results corresponded to the required limit values of 3.40 for breast fillets and 4.15 for leg cuts. The ratio of water weight to protein weight WA/RPA was correlated with the weight of the chicken negatively for breast fillets (r = –0.61) and positively for leg cuts (r = 0.70). The different correlations can be explained by the distribution of water, protein and fat in carcasses. The evaluation of the methods in terms of the percentage ratio of the average value to the limit showed that method D (results were at the level of 97% of the limit) was more exact than method A (results were at the level of 32% of the limit), but it is more expensive. Both methods

  6. Validated Reverse Phase HPLC Method for the Determination of Impurities in Etoricoxib

    Directory of Open Access Journals (Sweden)

    S. Venugopal

    2011-01-01

    This paper describes the development of a reverse phase HPLC method for etoricoxib in the presence of impurities and degradation products generated from forced degradation studies. The drug substance was subjected to stress conditions of hydrolysis, oxidation, photolysis and thermal degradation. Degradation of etoricoxib was observed under base and oxidation conditions. The drug was found to be stable under the other stress conditions studied. Successful separation of the drug from the process-related impurities and degradation products was achieved on a Zorbax SB CN (250 × 4.6 mm, 5 μm particle size) column using a reverse phase HPLC method. The isocratic method employed a mixture of buffer and acetonitrile in a ratio of 60:40. Disodium hydrogen orthophosphate (0.02 M) was used as the buffer, with the pH adjusted to 7.20 with 1 N sodium hydroxide solution. The HPLC method was developed and validated with respect to linearity, accuracy, precision, specificity and ruggedness.

  7. Validated spectophotometric methods for the assay of cinitapride hydrogen tartrate in pharmaceuticals

    Directory of Open Access Journals (Sweden)

    Satyanarayana K.V.V.

    2013-01-01

    Three simple, selective and rapid spectrophotometric methods have been established for the determination of cinitapride hydrogen tartrate (CHT) in pharmaceutical tablets. The proposed methods are based on the diazotization of CHT with sodium nitrite and hydrochloric acid, followed by coupling with resorcinol, 1-benzoylacetone and 8-hydroxyquinoline in alkaline medium for methods A, B and C respectively. The formed azo dyes are measured at 442, 465 and 552 nm for methods A, B and C respectively. The parameters that affect the reaction were carefully optimized. Under optimum conditions, Beer's law is obeyed over the ranges 2.0-32.0, 1.0-24.0 and 1.0-20.0 μg mL⁻¹ for methods A, B, and C, respectively. The calculated molar absorptivity values are 1.2853 × 10⁴, 1.9624 × 10⁴ and 3.92 × 10⁴ L mol⁻¹ cm⁻¹ for methods A, B and C, respectively. The results of the proposed procedures were validated statistically according to ICH guidelines. The proposed methods were successfully applied to the determination of CHT in Cintapro tablets without interference from common excipients encountered.
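    Molar absorptivity values such as those quoted follow from Beer's law, ε = A / (c·l), once the mass concentration is converted to molarity; the absorbance, concentration and molar mass in the sketch below are illustrative assumptions only, not the paper's data.

```python
def molar_absorptivity(absorbance: float, conc_ug_per_ml: float,
                       molar_mass_g_per_mol: float, path_cm: float = 1.0) -> float:
    """Beer's law: epsilon = A / (c * l), with c converted from ug/mL to mol/L."""
    conc_mol_per_l = (conc_ug_per_ml * 1e-3) / molar_mass_g_per_mol  # ug/mL -> g/L -> mol/L
    return absorbance / (conc_mol_per_l * path_cm)

# Purely illustrative numbers (not the paper's data): A = 0.45 at 20 ug/mL, assumed M = 550 g/mol.
print(f"epsilon = {molar_absorptivity(0.45, 20.0, 550.0):.3e} L mol^-1 cm^-1")
```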

  8. Methods for validating the performance of wearable motion-sensing devices under controlled conditions

    International Nuclear Information System (INIS)

    Bliley, Kara E; Kaufman, Kenton R; Gilbert, Barry K

    2009-01-01

    This paper presents validation methods for assessing the accuracy and precision of motion-sensing device (i.e. accelerometer) measurements. The main goals of this paper were to assess the accuracy and precision of these measurements against a gold standard, to determine if differences in manufacturing and assembly significantly affected device performance and to determine if measurement differences due to manufacturing and assembly could be corrected by applying certain post-processing techniques to the measurement data during analysis. In this paper, the validation of a posture and activity detector (PAD), a device containing a tri-axial accelerometer, is described. Validation of the PAD devices required the design of two test fixtures: a test fixture to position the device in a known orientation, and a test fixture to rotate the device at known velocities and accelerations. Device measurements were compared to these known orientations and accelerations. Several post-processing techniques were utilized in an attempt to reduce variability in the measurement error among the devices. In conclusion, some of the measurement errors due to the inevitable differences in manufacturing and assembly were significantly improved (p < 0.01) by these post-processing techniques
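    Comparing a tri-axial accelerometer's static output against a known orientation on a positioning fixture amounts to recovering tilt from the gravity vector; the readings below are invented and the simple pitch/roll model is an assumption, not the PAD's actual algorithm.

```python
import math

# Hypothetical static accelerometer reading (in g) with the device tilted on the test fixture.
ax, ay, az = 0.17, -0.03, 0.98

# Tilt angles recovered from the gravity vector (common pitch/roll convention, assumed here).
pitch = math.degrees(math.atan2(ax, math.sqrt(ay**2 + az**2)))
roll = math.degrees(math.atan2(ay, az))

known_pitch, known_roll = 10.0, -2.0   # fixture's known orientation (made-up values)
print(f"pitch error: {pitch - known_pitch:+.2f} deg, roll error: {roll - known_roll:+.2f} deg")
```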

  9. Validation of a method to determine methylmercury in fish tissues using gas chromatography

    International Nuclear Information System (INIS)

    Vega Bolannos, Luisa O.; Arias Verdes, Jose A.; Beltran Llerandi, Gilberto; Castro Diaz, Odalys; Moreno Tellez, Olga L.

    2000-01-01

    We validated a method to determine methylmercury in fish tissues using gas chromatography with an electron capture detector, as described by the Association of Official Analytical Chemists (AOAC) International. The linear curve range was 0.02 to 1 µg/ml and the linear correlation coefficient was 0.9979. A fish sample contaminated with 1 mg/kg methylmercury was analyzed 20 times to determine the repeatability of the method. The quantification limit was 0.16 mg/kg and the detection limit was 0.06 ppm. Fish samples contaminated with 0.2 to 10 mg/kg methylmercury showed recovery indexes from 94.66 to 108.8%.

  10. A validated high performance thin layer chromatography method for determination of yohimbine hydrochloride in pharmaceutical preparations

    OpenAIRE

    Jihan M Badr

    2013-01-01

    Background: Yohimbine is an indole alkaloid used as a promising therapy for erectile dysfunction. A number of methods were reported for the analysis of yohimbine in the bark or in pharmaceutical preparations. Materials and Method: In the present work, a simple and sensitive high performance thin layer chromatographic method is developed for determination of yohimbine (occurring as yohimbine hydrochloride) in pharmaceutical preparations and validated according to International Conference of Ha...

  11. Alternative method to validate the seasonal land cover regions of the conterminous United States

    Science.gov (United States)

    Zhiliang Zhu; Donald O. Ohlen; Raymond L. Czaplewski; Robert E. Burgan

    1996-01-01

    An accuracy assessment method involving double sampling and the multivariate composite estimator has been used to validate the prototype seasonal land cover characteristics database of the conterminous United States. The database consists of 159 land cover classes, classified using time series of 1990 1-km satellite data and augmented with ancillary data including...

  12. Validated stability-indicating spectrofluorimetric methods for the determination of ebastine in pharmaceutical preparations

    Directory of Open Access Journals (Sweden)

    Eid Manal

    2011-03-01

    Two sensitive, selective, economic, and validated spectrofluorimetric methods were developed for the determination of ebastine (EBS) in pharmaceutical preparations, depending on reaction with its tertiary amino group. Method I involves condensation of the drug with mixed anhydrides (citric and acetic anhydrides), producing a product with intense fluorescence, which was measured at 496 nm after excitation at 388 nm. Method IIA describes quantitative fluorescence quenching of eosin upon addition of the studied drug, where the decrease in the fluorescence intensity was directly proportional to the concentration of ebastine; the fluorescence quenching was measured at 553 nm after excitation at 457 nm. This method was extended (Method IIB) to apply first and second derivative synchronous spectrofluorimetric methods (FDSFS & SDSFS) for the simultaneous analysis of EBS in the presence of its alkaline, acidic, and UV degradation products. The proposed methods were successfully applied for the determination of the studied compound in its dosage forms. The results obtained were in good agreement with those obtained by a comparison method. Both methods were utilized to investigate the kinetics of the degradation of the drug.

  13. Validity of Evidence-Derived Criteria for Reactive Attachment Disorder: Indiscriminately Social/Disinhibited and Emotionally Withdrawn/Inhibited Types

    Science.gov (United States)

    Gleason, Mary Margaret; Fox, Nathan A.; Drury, Stacy; Smyke, Anna; Egger, Helen L.; Nelson, Charles A., III; Gregas, Matthew C.; Zeanah, Charles H.

    2011-01-01

    Objective: This study examined the validity of criteria for indiscriminately social/disinhibited and emotionally withdrawn/inhibited reactive attachment disorder (RAD). Method: As part of a longitudinal intervention trial of previously institutionalized children, caregiver interviews and direct observational measurements provided continuous and…

  14. A Study of Method Development, Validation, and Forced Degradation for Simultaneous Quantification of Paracetamol and Ibuprofen in Pharmaceutical Dosage Form by RP-HPLC Method

    OpenAIRE

    Jahan, Md. Sarowar; Islam, Md. Jahirul; Begum, Rehana; Kayesh, Ruhul; Rahman, Asma

    2014-01-01

    A rapid and stability-indicating reversed phase high-performance liquid chromatography (RP-HPLC) method was developed for simultaneous quantification of paracetamol and ibuprofen in their combined dosage form especially to get some more advantages over other methods already developed for this combination. The method was validated according to United States Pharmacopeia (USP) guideline with respect to accuracy, precision, specificity, linearity, solution stability, robustness, sensitivity, and...

  15. Validation of a digital photographic method for assessment of dietary quality of school lunch sandwiches brought from home

    DEFF Research Database (Denmark)

    Sabinsky, Marianne; Toft, Ulla; Andersen, Klaus K

    2013-01-01

    Background: It is a challenge to assess children's dietary intake. The digital photographic method (DPM) may be an objective method that can overcome some of these challenges. Objective: The aim of this study was to evaluate the validity and reliability of a DPM to assess the dietary quality of school lunch sandwiches brought from home. The lunches were photographed using a standardised DPM. From the digital images, the dietary components were estimated by a trained image analyst using weights or household measures and the dietary quality was assessed using a validated Meal Index of Dietary Quality (Meal IQ). The dietary components and the Meal IQ obtained from the digital images were validated against the objectively weighed foods of the school lunch sandwiches. To determine interrater reliability, the digital images were evaluated by a second image analyst. Results: Correlation coefficients between the DPM and the weighed foods ranged...

  16. Validation of multivariate classification methods using analytical fingerprints – concept and case study on organic feed for laying hens

    NARCIS (Netherlands)

    Alewijn, Martin; Voet, van der Hilko; Ruth, van Saskia

    2016-01-01

    Multivariate classification methods based on analytical fingerprints have found many applications in the food and feed area, but practical applications are still scarce due to a lack of a generally accepted validation procedure. This paper proposes a new approach for validation of this type of

  17. Establishing survey validity and reliability for American Indians through "think aloud" and test-retest methods.

    Science.gov (United States)

    Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L; Burgess, Katherine M; Puumala, Susan E; Wilton, Georgiana; Hanson, Jessica D

    2015-06-01

    The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To develop validity, content experts provided input into the survey measures, and a "think aloud" methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test-retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test-retest revealed that some of the questions performed better for the modified version, whereas others appeared to be more reliable for the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and in indicating specific survey questions that needed to be modified for this population. © The Author(s) 2015.

  18. Validation of a method to measure plutonium levels in marine sediments in Cuba

    International Nuclear Information System (INIS)

    Sibello Hernández, Rita Y.; Cartas Aguila, Héctor A.; Cozzella, María Letizia

    2008-01-01

    The main objective of this research was to develop and validate a method for the radiochemical separation of plutonium that is suitable, from an economic and practical point of view, for conditions in Cuba. This method allowed plutonium activity levels to be determined in marine sediments from Cienfuegos Bay. The selected method of radiochemical separation was anionic chromatography, and the measurement technique was quadrupole inductively coupled plasma mass spectrometry. The method was applied to a certified reference material; six repetitions were carried out and a good correspondence between the average measured value and the certified value of plutonium was achieved, so the trueness of the method was demonstrated. The precision of the method was also demonstrated, since a coefficient of variation of 11% was obtained at the 95% confidence level. The obtained results show that the presence of plutonium in the analyzed marine sediment samples is due only to global radioactive fallout. (author)

  19. An intercomparison of a large ensemble of statistical downscaling methods for Europe: Overall results from the VALUE perfect predictor cross-validation experiment

    Science.gov (United States)

    Gutiérrez, Jose Manuel; Maraun, Douglas; Widmann, Martin; Huth, Radan; Hertig, Elke; Benestad, Rasmus; Roessler, Ole; Wibig, Joanna; Wilcke, Renate; Kotlarski, Sven

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research (http://www.value-cost.eu). A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. This framework is based on a user-focused validation tree, guiding the selection of relevant validation indices and performance measures for different aspects of the validation (marginal, temporal, spatial, multi-variable). Moreover, several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur (assessment of intrinsic performance, effect of errors inherited from the global models, effect of non-stationarity, etc.). The list of downscaling experiments includes 1) cross-validation with perfect predictors, 2) GCM predictors -aligned with EURO-CORDEX experiment- and 3) pseudo reality predictors (see Maraun et al. 2015, Earth's Future, 3, doi:10.1002/2014EF000259, for more details). The results of these experiments are gathered, validated and publicly distributed through the VALUE validation portal, allowing for a comprehensive community-open downscaling intercomparison study. In this contribution we describe the overall results from Experiment 1), consisting of a European wide 5-fold cross-validation (with consecutive 6-year periods from 1979 to 2008) using predictors from ERA-Interim to downscale precipitation and temperatures (minimum and maximum) over a set of 86 ECA&D stations representative of the main geographical and climatic regions in Europe. As a result of the open call for contribution to this experiment (closed in Dec. 2015), over 40 methods representative of the main approaches (MOS and Perfect Prognosis, PP) and techniques (linear scaling, quantile mapping, analogs, weather typing, linear and generalized regression, weather generators, etc.) were submitted, including information both data
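    The Experiment 1 cross-validation splits 1979-2008 into five consecutive 6-year blocks, training on four blocks and validating on the held-out one; the sketch below only illustrates that splitting logic with placeholder bookkeeping, not any particular downscaling method.

```python
years = list(range(1979, 2009))            # 30 years: 1979..2008
n_folds = 5
block = len(years) // n_folds              # 6 consecutive years per fold

for k in range(n_folds):
    test_years = years[k * block:(k + 1) * block]
    train_years = [y for y in years if y not in test_years]
    # A downscaling method would be calibrated on train_years and validated on test_years here.
    print(f"fold {k + 1}: validate on {test_years[0]}-{test_years[-1]}, "
          f"train on the remaining {len(train_years)} years")
```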

  20. Development and validation of a high throughput LC–MS/MS method for simultaneous quantitation of pioglitazone and telmisartan in rat plasma and its application to a pharmacokinetic study

    Directory of Open Access Journals (Sweden)

    Pinaki Sengupta

    2017-12-01

    Management of cardiovascular risk factors in diabetes demands special attention due to their co-existence. A pioglitazone (PIO) and telmisartan (TLM) combination can be beneficial for effective control of cardiovascular complications in diabetes. In this research, we developed and validated a high throughput LC–MS/MS method for simultaneous quantitation of PIO and TLM in rat plasma. The developed method is more sensitive and can quantitate the analytes in a relatively shorter period of time compared to the previously reported methods for their individual quantification. Moreover, to date, there has been no bioanalytical method available to simultaneously quantitate PIO and TLM in a single run. The method was validated according to the USFDA guidelines for bioanalytical method validation. A linear response of the analytes was observed over the range of 0.005–10 µg/mL with satisfactory precision and accuracy. Accuracy at four quality control levels was within 94.27%–106.10%. The intra- and inter-day precision ranged from 2.32% to 10.14% and from 5.02% to 8.12%, respectively. The method was reproducible and sensitive enough to quantitate PIO and TLM in rat plasma samples from a preclinical pharmacokinetic study. Due to the potential of the PIO-TLM combination to be therapeutically explored, this method is expected to have significant usefulness in future. Keywords: LC–MS/MS, Rat plasma, Pharmacokinetic applicability, Telmisartan, Pioglitazone, Pharmacokinetic application

  1. Validation of an HPLC–UV method for the determination of digoxin residues on the surface of manufacturing equipment

    Directory of Open Access Journals (Sweden)

    ZORAN B. TODOROVIĆ

    2009-09-01

    In the pharmaceutical industry, an important step consists in the removal of possible drug residues from the equipment and areas involved. The cleaning procedures must be validated, and methods to determine trace amounts of drugs therefore have to be considered with special attention. An HPLC–UV method for the determination of digoxin residues on stainless steel surfaces was developed and validated in order to control a cleaning procedure. Cotton swabs moistened with methanol were used to remove any residues of the drug from stainless steel surfaces, giving recoveries of 85.9, 85.2 and 78.7% for three concentration levels. The precision of the results, reported as the relative standard deviation (RSD), was below 6.3%. The method was validated over a concentration range of 0.05–12.5 µg mL-1. Low quantities of drug residues were determined by HPLC–UV using a Symmetry C18 column (150 × 4.6 mm, 5 µm) at 20 °C with an acetonitrile–water (28:72, v/v) mobile phase at a flow rate of 1.1 mL min-1 and an injection volume of 100 µL, with detection at 220 nm. A simple, selective and sensitive HPLC–UV assay for the determination of digoxin residues on stainless steel was developed, validated and applied.
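    In cleaning validation, the residue found in a swab extract is typically converted to a per-area surface result and corrected for the swab recovery; the sketch below illustrates that arithmetic with invented values and is not taken from the paper.

```python
def surface_residue_ug_per_cm2(conc_ug_per_ml: float, extract_volume_ml: float,
                               swabbed_area_cm2: float, recovery_fraction: float) -> float:
    """Residue per unit area, corrected for the recovery of the swabbing procedure."""
    total_ug = conc_ug_per_ml * extract_volume_ml
    return total_ug / (swabbed_area_cm2 * recovery_fraction)

# Invented example: 0.6 ug/mL found in a 10 mL extract from a 25 cm2 area, 85.9% swab recovery.
print(f"{surface_residue_ug_per_cm2(0.6, 10.0, 25.0, 0.859):.3f} ug/cm2")
```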

  2. Validation of nitrogen-nitrate analysis by the chromotropic acid method

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Ana Claudia O.; Matoso, Erika, E-mail: anaclaudia.oliveira@marinha.mil.br [Centro Tecnológico da Marinha em São Paulo (CTMSP/CEA), Iperó, SP (Brazil). Centro Experimental ARAMAR

    2017-07-01

    The problems caused by contamination of water bodies demand strict control of disposal into rivers, seas and oceans. The nitrate ion is present in agricultural inputs, which are applied to the soil to boost plant growth. However, excess or indiscriminate use of these products contaminates water bodies, triggering eutrophication of aquatic ecosystems. Furthermore, due to diseases that can be caused by the ingestion of high levels of nitrate, such as methaemoglobinaemia, nitrate levels should be controlled in drinking waters and effluents. There are several methods for the determination of nitrate, the chromotropic acid method being a simple and low-cost solution. This method consists of adding chromotropic acid to the sample in the presence of H₂SO₄. The absorbance related to the yellow color produced can be measured with a UV-Vis spectrophotometer at 410 nm. In a modified form, this method can be applied to different aqueous matrices by the use of other reagents that eliminate interferences. The aim of this study was to validate the nitrate determination method in waters using chromotropic acid. This method is used at the Laboratório Radioecológico (LARE) to analyze effluent in compliance with the Wastewater Controlling Program of Centro Tecnológico da Marinha em São Paulo – Centro Experimental ARAMAR (CTMSP-CEA). The correlation coefficient for the linearity test was 0.9997. The evaluated detection limit was relatively high (LD = 0.045 mg N/L) if compared to ion chromatography, for example, but sufficient to determine the presence of this ion, considering the maximum limit proposed by the current legislation. The chromotropic acid method proved to be robust, accurate and precise, according to the parameters evaluated in this work. (author)

  3. Validation of nitrogen-nitrate analysis by the chromotropic acid method

    International Nuclear Information System (INIS)

    Santos, Ana Claudia O.; Matoso, Erika

    2017-01-01

    The problems caused by contamination of water bodies demand strict control of disposal into rivers, seas and oceans. The nitrate ion is present in agricultural inputs, which are applied to the soil to boost plant growth. However, excess or indiscriminate use of these products contaminates water bodies, triggering eutrophication of aquatic ecosystems. Furthermore, due to diseases that can be caused by the ingestion of high levels of nitrate, such as methaemoglobinaemia, nitrate levels should be controlled in drinking waters and effluents. There are several methods for the determination of nitrate, the chromotropic acid method being a simple and low-cost solution. This method consists of adding chromotropic acid to the sample in the presence of H₂SO₄. The absorbance related to the yellow color produced can be measured with a UV-Vis spectrophotometer at 410 nm. In a modified form, this method can be applied to different aqueous matrices by the use of other reagents that eliminate interferences. The aim of this study was to validate the nitrate determination method in waters using chromotropic acid. This method is used at the Laboratório Radioecológico (LARE) to analyze effluent in compliance with the Wastewater Controlling Program of Centro Tecnológico da Marinha em São Paulo – Centro Experimental ARAMAR (CTMSP-CEA). The correlation coefficient for the linearity test was 0.9997. The evaluated detection limit was relatively high (LD = 0.045 mg N/L) if compared to ion chromatography, for example, but sufficient to determine the presence of this ion, considering the maximum limit proposed by the current legislation. The chromotropic acid method proved to be robust, accurate and precise, according to the parameters evaluated in this work. (author)
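    Detection limits such as the 0.045 mg N/L quoted are commonly estimated from replicate blank measurements and the calibration slope (e.g. LOD = 3.3·σ/slope); the blank readings and slope below are invented, and this particular formula is an assumption, since the report does not state which criterion was applied.

```python
import statistics

# Invented replicate blank absorbances and an invented calibration slope (absorbance per mg N/L).
blank_absorbances = [0.0021, 0.0018, 0.0025, 0.0019, 0.0023, 0.0020, 0.0022]
slope = 0.150

sigma_blank = statistics.stdev(blank_absorbances)
lod = 3.3 * sigma_blank / slope     # common ICH-style estimate of the detection limit
loq = 10.0 * sigma_blank / slope    # corresponding quantification limit
print(f"LOD ~ {lod:.3f} mg N/L, LOQ ~ {loq:.3f} mg N/L")
```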

  4. Internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis

    NARCIS (Netherlands)

    Steultjens, M. P.; Dekker, J.; van Baar, M. E.; Oostendorp, R. A.; Bijlsma, J. W.

    1999-01-01

    To establish the internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis (OA). Data were obtained from 198 patients with OA of the hip or knee. Results of the observational method were compared with results of self-report

  5. Validation of a physical anthropology methodology using mandibles for gender estimation in a Brazilian population

    Science.gov (United States)

    CARVALHO, Suzana Papile Maciel; BRITO, Liz Magalhães; de PAIVA, Luiz Airton Saavedra; BICUDO, Lucilene Arilho Ribeiro; CROSATO, Edgard Michel; de OLIVEIRA, Rogério Nogueira

    2013-01-01

    Validation studies of physical anthropology methods in the different population groups are extremely important, especially in cases in which the population variations may cause problems in the identification of a native individual by the application of norms developed for different communities. Objective: This study aimed to estimate the gender of skeletons by application of the method of Oliveira, et al. (1995), previously used in a population sample from Northeast Brazil. Material and Methods: The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. Results: The results demonstrated that the application of the method of Oliveira, et al. (1995) in this population achieved very different outcomes between genders, with 100% for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of measurement data for the population analyzed allowed accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. Conclusion: It was concluded that methods involving physical anthropology present high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for the different populations due to differences in ethnic patterns, which are directly related to the phenotypic aspects. In this specific case, the method of Oliveira, et al. (1995) presented good accuracy and may be used for gender estimation in Brazil in two geographic regions, namely Northeast and Southeast; however, for other regions of the country (North, Central West and South), previous methodological

  6. Validation of a physical anthropology methodology using mandibles for gender estimation in a Brazilian population

    Directory of Open Access Journals (Sweden)

    Suzana Papile Maciel Carvalho

    2013-07-01

    Validation studies of physical anthropology methods in the different population groups are extremely important, especially in cases in which the population variations may cause problems in the identification of a native individual by the application of norms developed for different communities. OBJECTIVE: This study aimed to estimate the gender of skeletons by application of the method of Oliveira, et al. (1995), previously used in a population sample from Northeast Brazil. MATERIAL AND METHODS: The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. RESULTS: The results demonstrated that the application of the method of Oliveira, et al. (1995) in this population achieved very different outcomes between genders, with 100% for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of measurement data for the population analyzed allowed accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. CONCLUSION: It was concluded that methods involving physical anthropology present high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for the different populations due to differences in ethnic patterns, which are directly related to the phenotypic aspects. In this specific case, the method of Oliveira, et al. (1995) presented good accuracy and may be used for gender estimation in Brazil in two geographic regions, namely Northeast and Southeast; however, for other regions of the country (North, Central West and South

  7. An Anatomically Validated Brachial Plexus Contouring Method for Intensity Modulated Radiation Therapy Planning

    International Nuclear Information System (INIS)

    Van de Velde, Joris; Audenaert, Emmanuel; Speleers, Bruno; Vercauteren, Tom; Mulliez, Thomas; Vandemaele, Pieter; Achten, Eric; Kerckaert, Ingrid; D'Herde, Katharina; De Neve, Wilfried; Van Hoof, Tom

    2013-01-01

    Purpose: To develop contouring guidelines for the brachial plexus (BP) using anatomically validated cadaver datasets. Magnetic resonance imaging (MRI) and computed tomography (CT) were used to obtain detailed visualizations of the BP region, with the goal of achieving maximal inclusion of the actual BP in a small contoured volume while also accommodating for anatomic variations. Methods and Materials: CT and MRI were obtained for 8 cadavers positioned for intensity modulated radiation therapy. 3-dimensional reconstructions of soft tissue (from MRI) and bone (from CT) were combined to create 8 separate enhanced CT project files. Dissection of the corresponding cadavers anatomically validated the reconstructions created. Seven enhanced CT project files were then automatically fitted, separately in different regions, to obtain a single dataset of superimposed BP regions that incorporated anatomic variations. From this dataset, improved BP contouring guidelines were developed. These guidelines were then applied to the 7 original CT project files and also to 1 additional file, left out from the superimposing procedure. The percentage of BP inclusion was compared with the published guidelines. Results: The anatomic validation procedure showed a high level of conformity for the BP regions examined between the 3-dimensional reconstructions generated and the dissected counterparts. Accurate and detailed BP contouring guidelines were developed, which provided corresponding guidance for each level in a clinical dataset. An average margin of 4.7 mm around the anatomically validated BP contour is sufficient to accommodate for anatomic variations. Using the new guidelines, 100% inclusion of the BP was achieved, compared with a mean inclusion of 37.75% when published guidelines were applied. Conclusion: Improved guidelines for BP delineation were developed using combined MRI and CT imaging with validation by anatomic dissection

  8. An assessment of the validity of inelastic design analysis methods by comparisons of predictions with test results

    International Nuclear Information System (INIS)

    Corum, J.M.; Clinard, J.A.; Sartory, W.K.

    1976-01-01

    The use of computer programs that employ relatively complex constitutive theories and analysis procedures to perform inelastic design calculations on fast reactor system components introduces questions of validation and acceptance of the analysis results. We may ask ourselves, "How valid are the answers?" These questions, in turn, involve the concepts of verification of computer programs as well as qualification of the computer programs and of the underlying constitutive theories and analysis procedures. This paper addresses the latter - the qualification of the analysis methods for inelastic design calculations. Some of the work underway in the United States to provide the necessary information to evaluate inelastic analysis methods and computer programs is described, and typical comparisons of analysis predictions with inelastic structural test results are presented. It is emphasized throughout that rather than asking ourselves how valid, or correct, the analytical predictions are, we might more properly question whether or not the combination of the predictions and the associated high-temperature design criteria leads to an acceptable level of structural integrity. It is believed that in this context the analysis predictions are generally valid, even though exact correlations between predictions and actual behavior are not obtained and cannot be expected. Final judgment, however, must be reserved for the design analyst in each specific case. (author)

  9. Validation of QuEChERS method for the determination of some pesticide residues in two apple varieties.

    Science.gov (United States)

    Tiryaki, Osman

    2016-10-02

    This study was undertaken to validate the "quick, easy, cheap, effective, rugged and safe" (QuEChERS) method using Golden Delicious and Starking Delicious apple matrices spiked at 0.1 maximum residue limit (MRL), 1.0 MRL and 10 MRL levels of four pesticides (chlorpyrifos, dimethoate, indoxacarb and imidacloprid). For the extraction and cleanup, the original QuEChERS method was followed, and the samples were then subjected to liquid chromatography-triple quadrupole mass spectrometry (LC-MS/MS) for chromatographic analysis. According to the t test, the matrix effect was not significant for chlorpyrifos in either sample matrix, but it was significant for dimethoate, indoxacarb and imidacloprid in both sample matrices. Thus, matrix-matched calibration (MC) was used to compensate for the matrix effect and quantifications were carried out using MC. The overall recovery of the method was 90.15% with a relative standard deviation of 13.27% (n = 330). The estimated method detection limits of the analytes were below the MRLs. Other parameters of the method validation, such as recovery, precision, accuracy and linearity, were found to be within the required ranges.
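    Matrix effects of the kind compensated for here by matrix-matched calibration are often screened by comparing the slopes of solvent-based and matrix-matched calibration curves; the sketch below shows that comparison with invented data and is not the study's statistical procedure.

```python
import numpy as np

conc = np.array([0.01, 0.05, 0.1, 0.5, 1.0])                 # mg/kg, invented calibration levels
solvent_resp = np.array([980, 4900, 9800, 49000, 98000])      # invented detector responses (solvent)
matrix_resp = np.array([850, 4300, 8600, 43500, 86500])       # invented responses (apple matrix)

slope_solvent = np.polyfit(conc, solvent_resp, 1)[0]
slope_matrix = np.polyfit(conc, matrix_resp, 1)[0]
matrix_effect = 100.0 * (slope_matrix / slope_solvent - 1.0)   # % signal suppression or enhancement

print(f"matrix effect: {matrix_effect:+.1f}% (quantify against matrix-matched standards if large)")
```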

  10. [The development and validation of the methods for the quantitative determination of sibutramine derivatives in dietary supplements].

    Science.gov (United States)

    Stern, K I; Malkova, T L

    The objective of the present study was the development and validation of methods for the determination of the demethylated sibutramine derivatives, desmethyl sibutramine and didesmethyl sibutramine. Gas-liquid chromatography with a flame ionization detector was used for the quantitative determination of the above substances in dietary supplements. Conditions for the chromatographic determination of the analytes in the presence of the reference standard, methyl stearate, were proposed, allowing efficient separation to be achieved. The method has the necessary sensitivity, specificity, linearity, accuracy, and precision (on an intra-day and inter-day basis), which indicates its good validation characteristics. The proposed method can be employed in analytical laboratories for the quantitative determination of sibutramine derivatives in biologically active dietary supplements.

  11. Validation of the analytical method for sodium dichloroisocyanurate aimed at drinking water disinfection

    International Nuclear Information System (INIS)

    Martinez Alvarez, Luis Octavio; Alejo Cisneros, Pedro; Garcia Pereira, Reynaldo; Campos Valdez, Doraily

    2014-01-01

    Cuba has developed its first effervescent 3.5 mg sodium dichloroisocyanurate tablets, containing a non-therapeutic active principle. This ingredient releases a certain amount of chlorine when dissolved in a litre of water and can provide adequate disinfection of drinking water, which is ready to be consumed after 30 min. An analytical iodometric method applicable to the quality control of the effervescent 3.5 mg sodium dichloroisocyanurate tablets was developed and validated.

  12. Development of Method for X-band Weather Radar Calibration

    DEFF Research Database (Denmark)

    Nielsen, Jesper Ellerbæk; Thorndahl, Søren Liedtke; Rasmussen, Michael R.

    2013-01-01

    Calibration of the X-band LAWR (Local Area Weather Radar) is traditionally based on an assumed linear relation between the LAWR radar output and the rainfall intensity. However, closer inspections of the data reveal that the validity of this linear assumption is doubtful. Previous studies of this type of weather radar have also illustrated that the radar commonly has difficulties in estimating high rain rates. Therefore, a new radar–rainfall transformation model and a calibration method have been developed. The new method is based on nonlinear assumptions and is aimed at describing the whole
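    The abstract does not give the functional form of the new nonlinear radar-rainfall model; a power law is a common choice for such transformations and is shown below purely as an illustrative assumption, fitted to made-up radar counts and gauge intensities.

```python
import numpy as np

# Invented LAWR outputs and co-located rain gauge intensities (mm/h).
radar_counts = np.array([5.0, 12.0, 30.0, 55.0, 90.0, 140.0])
gauge_mm_h = np.array([0.4, 1.1, 3.5, 7.8, 15.0, 28.0])

# Fit a power-law transformation R = a * C**b in log space (assumed model form, not the paper's).
b, log_a = np.polyfit(np.log(radar_counts), np.log(gauge_mm_h), 1)
a = np.exp(log_a)

estimated = a * radar_counts**b
print(f"R = {a:.3f} * C^{b:.2f}; max abs error {np.abs(estimated - gauge_mm_h).max():.2f} mm/h")
```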

  13. Analytical method validation of GC-FID for the simultaneous measurement of hydrocarbons (C2-C4) in their gas mixture

    Directory of Open Access Journals (Sweden)

    Oman Zuas

    2016-09-01

    An accurate gas chromatography coupled to a flame ionization detector (GC-FID) method was validated for the simultaneous analysis of light hydrocarbons (C2-C4) in their gas mixture. The validation parameters were evaluated based on the ISO/IEC 17025 definition, including method selectivity, repeatability, accuracy, linearity, limit of detection (LOD), limit of quantitation (LOQ), and ruggedness. Under the optimum analytical conditions, the analysis of the gas mixture revealed that each target component was well separated, with high selectivity. The method was also found to be precise and accurate. The method linearity was high, with good correlation coefficient values (R2 ≥ 0.999) for all target components. It can be concluded that the developed GC-FID method is reliable and suitable for determination of light C2-C4 hydrocarbons (including ethylene, propane, propylene, isobutane, and n-butane) in their gas mixture. The validated method has been successfully applied to the estimation of light C2-C4 hydrocarbons in natural gas samples, showing high repeatability with a relative standard deviation (RSD) of less than 1.0% and good selectivity, with no interference from other possible components observed.

  14. Estimating incidence from prevalence in generalised HIV epidemics: methods and validation.

    Directory of Open Access Journals (Sweden)

    Timothy B Hallett

    2008-04-01

    HIV surveillance of generalised epidemics in Africa primarily relies on prevalence at antenatal clinics, but estimates of incidence in the general population would be more useful. Repeated cross-sectional measures of HIV prevalence are now becoming available for general populations in many countries, and we aim to develop and validate methods that use these data to estimate HIV incidence. Two methods were developed that decompose observed changes in prevalence between two serosurveys into the contributions of new infections and mortality. Method 1 uses cohort mortality rates, and method 2 uses information on survival after infection. The performance of these two methods was assessed using simulated data from a mathematical model and actual data from three community-based cohort studies in Africa. Comparison with simulated data indicated that these methods can accurately estimate incidence rates and changes in incidence in a variety of epidemic conditions. Method 1 is simple to implement but relies on locally appropriate mortality data, whilst method 2 can make use of the same survival distribution in a wide range of scenarios. The estimates from both methods are within the 95% confidence intervals of almost all actual measurements of HIV incidence in adults and young people, and the patterns of incidence over age are correctly captured. It is possible to estimate incidence from cross-sectional prevalence data with sufficient accuracy to monitor the HIV epidemic. Although these methods will theoretically work in any context, we have been able to test them only in southern and eastern Africa, where HIV epidemics are mature and generalised. The choice of method will depend on the local availability of HIV mortality data.
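    Method 1 decomposes the change in prevalence between two serosurveys into new infections and deaths among the infected; a highly simplified sketch of that bookkeeping is shown below, with invented survey values and an assumed mortality rate, and it is not the authors' exact estimator.

```python
import math

# Invented inputs: prevalence and adult population at two surveys T years apart,
# plus an assumed annual mortality rate among HIV-positive adults (method 1 uses cohort mortality).
prev1, prev2 = 0.120, 0.135
pop1, pop2 = 100_000, 104_000
years = 4.0
mort_pos = 0.08          # assumed annual mortality among infected adults

infected1 = prev1 * pop1
infected2 = prev2 * pop2

# Infected survivors expected from the first survey; the balance is attributed to new infections.
survivors = infected1 * math.exp(-mort_pos * years)
new_infections = infected2 - survivors

# Person-years at risk approximated from the average uninfected population over the interval.
person_years = 0.5 * ((pop1 - infected1) + (pop2 - infected2)) * years
incidence = new_infections / person_years
print(f"estimated incidence: {incidence * 100:.2f} per 100 person-years")
```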

  15. Analytical method validation of GC-FID for the simultaneous measurement of hydrocarbons (C2-C4) in their gas mixture

    OpenAIRE

    Oman Zuas; Harry budiman; Muhammad Rizky Mulyana

    2016-01-01

    An accurate gas chromatography coupled to a flame ionization detector (GC-FID) method was validated for the simultaneous analysis of light hydrocarbons (C2-C4) in their gas mixture. The validation parameters were evaluated based on the ISO/IEC 17025 definition including method selectivity, repeatability, accuracy, linearity, limit of detection (LOD), limit of quantitation (LOQ), and ruggedness. Under the optimum analytical conditions, the analysis of gas mixture revealed that each target comp...

  16. "INTRODUCING A FULL VALIDATED ANALYTICAL PROCEDURE AS AN OFFICIAL COMPENDIAL METHOD FOR FENTANYL TRANSDERMAL PATCHES"

    Directory of Open Access Journals (Sweden)

    Amir Mehdizadeh

    2005-04-01

    A simple, sensitive and specific HPLC method and also a simple and fast extraction procedure were developed for quantitative analysis of fentanyl transdermal patches. Chloroform, methanol and ethanol were used as extracting solvents, with recoveries of 92.1, 94.3 and 99.4%, respectively. Fentanyl was extracted with ethanol and the eluted fentanyl through the C18 column was monitored by UV detection at 230 nm. The linearity was in the range of 0.5-10 µg/mL with a correlation coefficient (r2) of 0.9992. Both intra- and inter-day accuracy and precision were within acceptable limits. The detection limit (DL) and quantitation limit (QL) were 0.15 and 0.5 µg/mL, respectively. Other validation characteristics such as selectivity, robustness and ruggedness were evaluated. Following method validation, a system suitability test (SST) including capacity factor (k´), plate number (N), tailing factor (T), and RSD was defined for routine tests.

  17. A chiral capillary electrophoresis method for ropivacaine hydrochloride in pharmaceutical formulations : Validation and comparison with chiral liquid chromatography

    NARCIS (Netherlands)

    Sänger-Van De Griend, C. E.; Wahlström, H.; Gröningsson, K.; Widahl-Näsman, Monica E.

    A capillary electrophoresis method for the determination of the enantiomeric purity of the local anaesthetic ropivacaine hydrochloride in injection solutions has been validated. The method showed the required limit of quantitation of 0.1% enantiomeric impurity. Good performances were shown for

  18. Sensitivity and uncertainty analyses applied to criticality safety validation, methods development. Volume 1

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Hopper, C.M.; Childs, R.L.; Parks, C.V.

    1999-01-01

    This report presents the application of sensitivity and uncertainty (S/U) analysis methodologies to the code/data validation tasks of a criticality safety computational study. Sensitivity and uncertainty analysis methods were first developed for application to fast reactor studies in the 1970s. This work has revitalized and updated the available S/U computational capabilities such that they can be used as prototypic modules of the SCALE code system, which contains criticality analysis tools currently used by criticality safety practitioners. After complete development, simplified tools are expected to be released for general use. The S/U methods that are presented in this volume are designed to provide a formal means of establishing the range (or area) of applicability for criticality safety data validation studies. The development of parameters that are analogous to the standard trending parameters forms the key to the technique. These parameters are the D parameters, which represent the differences by group of sensitivity profiles, and the ck parameters, which are the correlation coefficients for the calculational uncertainties between systems; each set of parameters gives information relative to the similarity between pairs of selected systems, e.g., a critical experiment and a specific real-world system (the application)
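
    To make the ck idea concrete, a minimal numpy sketch is given below. It computes the correlation coefficient of calculational uncertainties between two systems from their group-wise sensitivity profiles and a shared cross-section covariance matrix; the profiles and covariance values are invented for illustration and do not come from SCALE or from this report.

```python
import numpy as np

# Illustrative sketch: correlation coefficient ck between the calculational
# uncertainties of two systems (e.g. a critical experiment and an application),
# computed from group-wise sensitivity profiles S and a shared relative
# covariance matrix C of the nuclear data. All numbers are made up.

def ck(s_a, s_e, cov):
    """ck = (S_a C S_e^T) / sqrt((S_a C S_a^T) (S_e C S_e^T))."""
    num = s_a @ cov @ s_e
    den = np.sqrt((s_a @ cov @ s_a) * (s_e @ cov @ s_e))
    return num / den

s_app = np.array([0.10, 0.25, 0.30, 0.05])   # application sensitivity profile (4 groups)
s_exp = np.array([0.12, 0.22, 0.28, 0.07])   # experiment sensitivity profile
cov = np.diag([0.02, 0.01, 0.015, 0.03])     # simplified diagonal covariance

print(f"ck = {ck(s_app, s_exp, cov):.3f}")   # values near 1 indicate similar systems
```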

  19. A Validated RP-HPLC Method for Simultaneous Estimation of Atenolol and Indapamide in Pharmaceutical Formulations

    Directory of Open Access Journals (Sweden)

    G. Tulja Rani

    2011-01-01

    Full Text Available A simple, fast, precise, selective and accurate RP-HPLC method was developed and validated for the simultaneous determination of atenolol and indapamide from bulk and formulations. Chromatographic separation was achieved isocratically on a Waters C18 column (250 × 4.6 mm, 5 µm particle size) using a mobile phase of methanol and water (adjusted to pH 2.7 with 1% orthophosphoric acid) in the ratio of 80:20. The flow rate was 1 mL/min and the effluent was detected at 230 nm. The retention times of atenolol and indapamide were 1.766 min and 3.407 min, respectively. Linearity was observed in the concentration range of 12.5-150 µg/mL for atenolol and 0.625-7.5 µg/mL for indapamide. Percent recoveries obtained for the two drugs were 99.74-100.06% and 98.65-99.98%, respectively. The method was validated according to the ICH guidelines with respect to specificity, linearity, accuracy, precision and robustness. The method developed can be used for the routine analysis of atenolol and indapamide from their combined dosage form.

  20. Validation of Serious Games

    Directory of Open Access Journals (Sweden)

    Katinka van der Kooij

    2015-09-01

    Full Text Available The application of games for behavioral change has seen a surge in popularity, but evidence on the efficacy of these games is contradictory. Anecdotal findings seem to confirm their motivational value, whereas most quantitative findings from randomized controlled trials (RCTs) are negative or difficult to interpret. One cause for the contradictory evidence could be that the standard RCT validation methods are not sensitive to serious games' effects. To be able to adapt validation methods to the properties of serious games we need a framework that can connect properties of serious game design to the factors that influence the quality of quantitative research outcomes. The Persuasive Game Design model [1] is particularly suitable for this aim as it encompasses the full circle from game design to behavioral change effects on the user. We therefore use this model to connect game design features, such as the gamification method and the intended transfer effect, to factors that determine the conclusion validity of an RCT. In this paper we apply this model to develop guidelines for setting up validation methods for serious games. This way, we offer game designers and researchers practical handles on how to develop tailor-made validation methods.

  1. The Numerical Welding Simulation - Developments and Validation of Simplified and Bead Lumping Methods

    International Nuclear Information System (INIS)

    Baup, Olivier

    2001-01-01

    The aim of this work was to study the TIG multipass welding process on stainless steel by means of numerical methods, and then to work out simplified and bead-lumping methods in order to reduce the set-up and run times of these calculations. A simulation was used as reference for the validation of these methods; after the presentation of the test series that led to the option choices of this calculation (2D generalised plane strains, elastoplastic model with isotropic hardening, hardening restoration due to high temperatures), various simplifications were tried on a plate geometry. These simplifications concerned various modelling points while preserving a correct representation of plastic flow in the plate. The use of a reduced number of thermal fields characterising the bead deposit and a low number of tensile curves allows interesting results to be obtained while significantly decreasing the computing times. In addition, various bead-lumping methods have been studied, concerning both the shape and the thermal behaviour of the macro-deposits. The macro-deposit shapes studied are L-shaped, layer-shaped, or represent two beads one on top of the other. Among these three methods, only the one using a small number of lumped beads gave poor results, since the thermo-mechanical history was deeply modified near and inside the weld. Thereafter, the simplified methods were applied to a tubular geometry. On this new geometry, experimental measurements were made during welding, which allowed a validation of the reference calculation. Simplified and reference calculations gave approximately the same stress fields as found on the plate geometry. Finally, in the last part of this document a procedure for automatic data setting, permitting a significant reduction of the calculation preparation phase, is presented. It has been applied to the calculation of thick pipe welding in 90 beads; the results are compared with a simplified simulation realised by Framatome and with experimental measurements. A bead by…

  2. Flux form Semi-Lagrangian methods for parabolic problems

    Directory of Open Access Journals (Sweden)

    Bonaventura Luca

    2016-09-01

    Full Text Available A semi-Lagrangian method for parabolic problems is proposed, that extends previous work by the authors to achieve a fully conservative, flux-form discretization of linear and nonlinear diffusion equations. A basic consistency and stability analysis is proposed. Numerical examples validate the proposed method and display its potential for consistent semi-Lagrangian discretization of advection diffusion and nonlinear parabolic problems.

  3. Validity Evaluation of the Assessment Method for Postural Loading on the Upper Body in Printing Industry

    Directory of Open Access Journals (Sweden)

    Mohammad Khandan

    2016-07-01

    Full Text Available Background and Objectives: Musculoskeletal disorders and injuries are recognised as a global occupational challenge, and these injuries are mostly concentrated in the upper limbs. There are several methods to assess this kind of disorder, each of which has a different efficiency for various jobs depending on its strengths and weaknesses. This study aimed to assess the validity of the LUBA method for evaluating risk factors for musculoskeletal disorders in a printing industry in Qom province, 2014. Methods: In this descriptive cross-sectional study, all operational workers (n = 94) were investigated in 2014. The Nordic Musculoskeletal Questionnaire (NMQ) was used to collect data on musculoskeletal disorders. We also used the LUBA method to analyze postures in four different parts of the body (neck, shoulder, elbow, and wrist). The obtained data were analyzed using Mann-Whitney, Kruskal-Wallis, and kappa agreement tests. Results: The lumbar region of the back, with 35.1% prevalence, had the most problems. The results of the LUBA method showed that most postures were located at the second corrective action level and need further study. Agreement between the assessment of shoulder posture and its disorders was significant (p < 0.05). Conclusion: According to the results of this study on the reliability and predictive validity of the LUBA method in the printing industry, it can be concluded that this method is not a reliable method for posture assessment; however, further and more comprehensive studies are recommended.
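
    The kappa agreement statistic used in the analysis can be computed as in the hedged sketch below; the contingency table is invented and only illustrates the calculation, not the study's actual data.

```python
import numpy as np

# Illustrative sketch of Cohen's kappa for agreement between a posture-risk
# rating (e.g. a dichotomised LUBA action level) and reported disorders (NMQ).
# The contingency table below is hypothetical.

def cohens_kappa(table):
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_observed = np.trace(table) / n
    p_expected = (table.sum(axis=1) * table.sum(axis=0)).sum() / n**2
    return (p_observed - p_expected) / (1 - p_expected)

# rows: high/low posture risk; columns: disorder reported yes/no
table = [[20, 10],
         [15, 49]]
print(f"kappa = {cohens_kappa(table):.2f}")
```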

  4. Development of a validated HPLC method for the determination of sennoside A and B, two major constituents of Cassia obovata Coll.

    Directory of Open Access Journals (Sweden)

    Ghassemi-Dehkordi Nasrollah

    2014-04-01

    Full Text Available Introduction: Cassia obovata Coll. is the only Senna species which grows wild in Iran. In the present study, an optimised reverse-phase High Performance Liquid Chromatography (HPLC) method was established and validated for the quantification of sennosides A and B, the major constituents of C. obovata, using a simple and accurate procedure. Methods: HPLC analysis was done using a Waters 515 pump on a Nova-Pak C18 column (3.9 × 150 mm). Millennium software was used for the determination of sennosides A and B in Cassia species and for processing the data. The method was validated according to USP 32 requirements. Results: The impact of the solvent on the selectivity factor and partition coefficient parameters was evaluated. Using a conventional RP-18 L1 column, 3.9 × 150 mm, the mobile phase was selected after several trials with different mixtures of water and acetonitrile. Sennosides A and B were determined using the external standard calibration method. Using USP 35-NF 30, the LOD and LOQ were calculated. The reliability of the HPLC method for analysis of sennosides A + B was validated through its linearity, reproducibility, repeatability, and recovery. Finally, ethanol:water (1:1) extracts of Cassia obovata and Cassia angustifolia were standardized by assay of sennosides A and B using the validated HPLC method. Conclusion: With the above method, the determination of sennosides in Cassia species is entirely possible. Moreover, comparison of the results shows that, even though Cassia angustifolia is richer in sennosides, C. obovata could be considered as an alternative source of sennosides A and B.
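
    External standard calibration, as used here for sennosides A and B, amounts to fitting a line of peak area against standard concentration and reading sample concentrations off the fit; a hedged Python sketch with invented areas follows (it does not reproduce the paper's data).

```python
import numpy as np

# Illustrative external-standard calibration: fit peak area vs. concentration
# of sennoside standards, then quantify an unknown extract. All numbers are
# hypothetical.

std_conc = np.array([5.0, 10.0, 20.0, 40.0, 80.0])         # standard conc., µg/mL
std_area = np.array([1180., 2405., 4810., 9650., 19300.])  # detector response

slope, intercept = np.polyfit(std_conc, std_area, 1)        # linear calibration
r2 = np.corrcoef(std_conc, std_area)[0, 1] ** 2

sample_area = 6120.0
sample_conc = (sample_area - intercept) / slope
print(f"r2 = {r2:.4f}, sample concentration = {sample_conc:.1f} µg/mL")
```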

  5. Validation of methods for measurement of insulin secretion in humans in vivo

    DEFF Research Database (Denmark)

    Kjems, L L; Christiansen, E; Vølund, A

    2000-01-01

    To detect and understand the changes in beta-cell function in the pathogenesis of type 2 diabetes, an accurate and precise estimation of the prehepatic insulin secretion rate (ISR) is essential. There are two common methods to assess ISR: the deconvolution method (by Eaton and Polonsky), considered the … of these mathematical techniques for quantification of insulin secretion have been tested in dogs, but not in humans. In the present studies, we examined the validity of both methods to recover the known infusion rates of insulin and C-peptide mimicking ISR during an oral glucose tolerance test. ISR from both … , and a close agreement was found for the results of an oral glucose tolerance test. We also studied whether C-peptide kinetics are influenced by somatostatin infusion. The decay curves after bolus injection of exogenous biosynthetic human C-peptide, the kinetic parameters, and the metabolic clearance rate were …

  6. Statistical Analysis Methods for Physics Models Verification and Validation

    CERN Document Server

    De Luca, Silvia

    2017-01-01

    The validation and verification process is a fundamental step for any software like Geant4 and GeantV, which aims to perform data simulation using physics models and Monte Carlo techniques. As experimental physicists, we have to face the problem of comparing the results obtained using simulations with what the experiments actually observed. One way to solve the problem is to perform a consistency test. Within the Geant group, we developed a compact C++ library which will be added to the automated validation process on the Geant Validation Portal.
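
    A typical consistency test of this kind compares binned simulation output with experimental counts. The hedged Python sketch below uses a chi-square test from SciPy on invented histograms; it is only a stand-in for the C++ library mentioned in the record, not its actual implementation.

```python
import numpy as np
from scipy import stats

# Illustrative consistency test between a simulated and an observed
# distribution, binned into histograms. All counts are invented.

observed = np.array([48, 102, 155, 140, 88, 37])    # experimental counts per bin
simulated = np.array([52, 95, 160, 150, 80, 33])    # Monte Carlo counts per bin

# Scale the simulation to the same total before a chi-square comparison.
expected = simulated * observed.sum() / simulated.sum()
chi2, p_value = stats.chisquare(observed, expected)

print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
# A small p-value would flag a disagreement between model and experiment.
```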

  7. Implementation of the 3Rs (refinement, reduction, and replacement): validation and regulatory acceptance considerations for alternative toxicological test methods.

    Science.gov (United States)

    Schechtman, Leonard M

    2002-01-01

    Toxicological testing in the current regulatory environment is steeped in a history of using animals to answer questions about the safety of products to which humans are exposed. That history forms the basis for the testing strategies that have evolved to satisfy the needs of the regulatory bodies that render decisions that affect, for the most part, virtually all phases of premarket product development and evaluation and, to a lesser extent, postmarketing surveillance. Only relatively recently have the levels of awareness of, and responsiveness to, animal welfare issues reached current proportions. That paradigm shift, although sluggish, has nevertheless been progressive. New and alternative toxicological methods for hazard evaluation and risk assessment have now been adopted and are being viewed as a means to address those issues in a manner that considers humane treatment of animals yet maintains scientific credibility and preserves the goal of ensuring human safety. To facilitate this transition, regulatory agencies and regulated industry must work together toward improved approaches. They will need assurance that the methods will be reliable and the results comparable with, or better than, those derived from the current classical methods. That confidence will be a function of the scientific validation and resultant acceptance of any given method. In the United States, to fulfill this need, the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) and its operational center, the National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM), have been constituted as prescribed in federal law. Under this mandate, ICCVAM has developed a process and established criteria for the scientific validation and regulatory acceptance of new and alternative methods. The role of ICCVAM in the validation and acceptance process and the criteria instituted toward that end are described. Also

  8. Validation of the LWR-EIR methods for the evaluation of compact beds

    International Nuclear Information System (INIS)

    Foskolos, K.; Grimm, P.; Maeder, C.; Paratte, J.M.

    1983-10-01

    The EIR code system for the calculation of light water reactors is presented and the methods used are briefly described. The application of the system to various types of critical experiments and benchmark problems proves its good precision, even for heterogeneous configurations with strong neutron absorbers like Boral. As the accuracy of the multiplication factor k_eff is always better than 0.5% for normal LWR configurations, this code system is validated for the calculation of such configurations with a safety margin of 1.5% on k_eff. (Auth.)

  9. Development and validation of a spectrophotometry method for the determination of histamine in fresh tuna (Thunnus tunna)

    International Nuclear Information System (INIS)

    Chacon-Silva, Fainier; Barquero-Quiros, Miriam

    2002-01-01

    Histamine in foods can promote allergic reactions in sensitive persons. A colorimetric microscale method for histamine determination was developed and validated. Cu2+–histamine chelation occurs at pH 9.5. Dichloromethane extraction of the complex as the salt with tetrabromophenolphthalein ethyl ester allows photometric quantitation at 515 nm. The validation of the micro method was accomplished through its performance parameters: detection limit, quantitation limit, sensitivity, linearity, precision, and recovery. The methodology was applied to twenty raw tuna samples collected in the San Jose metropolitan area. It was found that 45% of the analyzed samples had a histamine content in the range of 100-200 mg/kg, a finding that indicates bacterial contamination; 15% of the samples were over the 500 mg/kg FDA level of sanitary risk. (Author) [es
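
    Performance parameters such as detection and quantitation limits are commonly derived from the calibration curve (ICH approach: LOD ≈ 3.3 σ/S, LOQ ≈ 10 σ/S, with σ the residual standard deviation and S the slope). The sketch below illustrates this with invented absorbance data; it is not the paper's own calculation.

```python
import numpy as np

# Illustrative ICH-style estimation of LOD and LOQ from a calibration curve:
# LOD = 3.3 * sigma / slope, LOQ = 10 * sigma / slope. Data are hypothetical.

conc = np.array([50., 100., 150., 200., 250.])          # histamine, mg/kg
absorbance = np.array([0.102, 0.201, 0.305, 0.398, 0.503])

slope, intercept = np.polyfit(conc, absorbance, 1)
residuals = absorbance - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                            # ddof=2: two fitted parameters

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD = {lod:.1f} mg/kg, LOQ = {loq:.1f} mg/kg")
```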

  10. Validity and reliability of the session-RPE method for quantifying training load in karate athletes.

    Science.gov (United States)

    Tabben, M; Tourny, C; Haddad, M; Chaabane, H; Chamari, K; Coquart, J B

    2015-04-24

    To test the construct validity and reliability of the session rating of perceived exertion (sRPE) method by examining the relationship between RPE and physiological parameters (heart rate, HR, and blood lactate concentration, [La-]) and the correlations between sRPE and two HR-based methods for quantifying internal training load (Banister's method and Edwards's method) during a karate training camp. Eighteen elite karate athletes, ten men (age: 24.2 ± 2.3 y, body mass: 71.2 ± 9.0 kg, body fat: 8.2 ± 1.3%, height: 178 ± 7 cm) and eight women (age: 22.6 ± 1.2 y, body mass: 59.8 ± 8.4 kg, body fat: 20.2 ± 4.4%, height: 169 ± 4 cm), were included in the study. During the training camp, subjects participated in eight karate training sessions comprising three training modes (4 tactical-technical, 2 technical-development, and 2 randori training), during which RPE, HR, and [La-] were recorded. Significant correlations were found between RPE and the physiological parameters (percentage of maximal HR: r = 0.75, 95% CI = 0.64-0.86; [La-]: r = 0.62, 95% CI = 0.49-0.75) and between sRPE and the HR-based measures of internal training load (r = 0.65-0.95), with good reliability of the same intensity across training sessions (Cronbach's α = 0.81, 95% CI = 0.61-0.92). This study demonstrates that the sRPE method is valid for quantifying internal training load and intensity in karate.
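
    For orientation, the three training-load measures compared in the study are commonly computed as in the sketch below, using the standard published formulas (sRPE load = RPE × duration; Edwards' TRIMP as time in five HR zones weighted 1-5; Banister's TRIMP with the usual exponential weighting for men) on invented session data; none of these numbers come from the paper itself.

```python
import math

# Illustrative computation of three internal training-load measures for one
# session. All session data are hypothetical.

duration_min = 90
rpe = 6                                   # CR-10 rating of perceived exertion
hr_rest, hr_max, hr_mean = 55, 195, 160

# Session-RPE load (Foster): RPE x duration in minutes
srpe_load = rpe * duration_min

# Edwards' TRIMP: minutes spent in five %HRmax zones, weighted 1..5
minutes_in_zone = {1: 10, 2: 20, 3: 30, 4: 20, 5: 10}   # zones 50-60% ... 90-100% HRmax
edwards_trimp = sum(zone * mins for zone, mins in minutes_in_zone.items())

# Banister's TRIMP (male weighting): D x HRr x 0.64 x exp(1.92 x HRr)
hr_ratio = (hr_mean - hr_rest) / (hr_max - hr_rest)
banister_trimp = duration_min * hr_ratio * 0.64 * math.exp(1.92 * hr_ratio)

print(f"sRPE load = {srpe_load} AU")
print(f"Edwards TRIMP = {edwards_trimp} AU")
print(f"Banister TRIMP = {banister_trimp:.1f} AU")
```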

  11. Development and validation of a stability indicating HPTLC-densitometric method for lafutidine

    Directory of Open Access Journals (Sweden)

    Dinesh Dhamecha

    2013-01-01

    Full Text Available Background: A simple, selective, precise, and stability-indicating high-performance thin layer chromatographic method has been established and validated for analysis of lafutidine in bulk drug and formulations. Materials and Methods: The compounds were analyzed on aluminium-backed silica gel 60 F254 plates with chloroform:ethanol:acetic acid (8:1:1) as mobile phase. Densitometric analysis of lafutidine was performed at 230 nm. Results: Regression analysis data for the calibration plots were indicative of a good linear relationship between response and concentration over the range 100-500 ng per spot. The correlation coefficient (r²) was 0.998 ± 0.002. Conclusion: Lafutidine was subjected to acid, base, peroxide, and sunlight degradation. In stability tests, the drug was susceptible to acid and basic hydrolysis, oxidation, and photodegradation.

  12. Development and Validation of New Spectrophotometric Methods to Determine Enrofloxacin in Pharmaceuticals

    Science.gov (United States)

    Rajendraprasad, N.; Basavaiah, K.

    2015-07-01

    Four spectrophotometric methods, based on oxidation with cerium(IV), are investigated and developed to determine EFX in pure form and in dosage forms. The first and second methods (methods A and B) are direct: after the oxidation of EFX with cerium(IV) in acid medium, the absorbance of the reduced and unreacted oxidant is measured at 275 and 320 nm, respectively. In the third (C) and fourth (D) methods, after the reaction between EFX and the oxidant is complete, the surplus oxidant is treated with either N-phenylanthranilic acid (NPA) or Alizarin Red S (ARS) dye and the absorbance of the oxidized NPA or ARS is measured at 440 or 420 nm. The methods showed good linearity over the concentration ranges of 0.5-5.0, 1.25-12.5, 10.0-100.0, and 6.0-60.0 μg/mL for methods A, B, C and D, respectively, with apparent molar absorptivity values of 4.42 × 10^4, 8.7 × 10^3, 9.31 × 10^2, and 2.28 × 10^3 L/(mol·cm). The limits of detection (LOD), quantification (LOQ), and Sandell's sensitivity values and other validation results have also been reported. The proposed methods are successfully applied to determine EFX in pure form and in dosage forms.

  13. Use of the Method of Triads in the Validation of Sodium and Potassium Intake in the Brazilian Longitudinal Study of Adult Health (ELSA-Brasil).

    Science.gov (United States)

    Pereira, Taísa Sabrina Silva; Cade, Nágela Valadão; Mill, José Geraldo; Sichieri, Rosely; Molina, Maria Del Carmen Bisi

    2016-01-01

    Biomarkers are a good choice for use in the validation of food frequency questionnaires due to the independence of their random errors. The aim was to assess the validity of the potassium and sodium intake estimated using the ELSA-Brasil Food Frequency Questionnaire. A subsample of participants in the ELSA-Brasil cohort was included in this study in 2009. Sodium and potassium intake were estimated using three methods: a semi-quantitative food frequency questionnaire, 12-hour nocturnal urinary excretion, and three 24-hour food records. Correlation coefficients were calculated between the methods, and the validity coefficients were calculated using the method of triads. The 95% confidence intervals for the validity coefficients were estimated using bootstrap sampling. Exact and adjacent agreement and disagreement of the estimated sodium and potassium intake quintiles were compared among the three methods. The sample consisted of 246 participants, aged 53 ± 8 years, 52% women. The validity coefficients for sodium were considered weak (ρ food frequency questionnaire-actual intake = 0.37 and ρ biomarker-actual intake = 0.21) to moderate (ρ food records-actual intake = 0.56). The validity coefficients were higher for potassium (ρ food frequency questionnaire-actual intake = 0.60; ρ biomarker-actual intake = 0.42; ρ food records-actual intake = 0.79). Conclusions: The ELSA-Brasil Food Frequency Questionnaire showed good validity in estimating potassium intake in epidemiological studies. For sodium, validity was weak, likely due to the non-quantification of the salt added to prepared food.
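
    The method of triads obtains the validity coefficient of each instrument against the unobserved true intake from the three pairwise correlations, e.g. ρ_Q = sqrt(r_QR · r_QB / r_RB) for the questionnaire. The sketch below applies these relations to invented correlations, not to the study's results.

```python
from math import sqrt

# Illustrative method-of-triads calculation. r_qr, r_qb and r_rb are the
# pairwise correlations between questionnaire (Q), reference food records (R)
# and biomarker (B); the values below are hypothetical.

def triads(r_qr, r_qb, r_rb):
    rho_q = sqrt(r_qr * r_qb / r_rb)   # questionnaire vs. true intake
    rho_r = sqrt(r_qr * r_rb / r_qb)   # food records vs. true intake
    rho_b = sqrt(r_qb * r_rb / r_qr)   # biomarker vs. true intake
    return rho_q, rho_r, rho_b

rho_q, rho_r, rho_b = triads(r_qr=0.45, r_qb=0.25, r_rb=0.33)
print(f"validity coefficients: Q = {rho_q:.2f}, R = {rho_r:.2f}, B = {rho_b:.2f}")
```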

  14. Cleaning Validation of Fermentation Tanks

    DEFF Research Database (Denmark)

    Salo, Satu; Friis, Alan; Wirtanen, Gun

    2008-01-01

    Reliable test methods for checking cleanliness are needed to evaluate and validate the cleaning process of fermentation tanks. Pilot scale tanks were used to test the applicability of various methods for this purpose. The methods found to be suitable for validation of the cleanliness were visual…

  15. Validation of a capillary electrophoresis method for the enantiomeric purity testing of ropivacaine, a new local anaesthetic compound

    NARCIS (Netherlands)

    Sänger-Van De Griend, C. E.; Gröningsson, K.

    A capillary electrophoresis method for the determination of the enantiomeric purity of ropivacaine, a new local anaesthetic compound developed by Astra Pain Control AB, has been validated. The method showed the required limit of quantitation of 0.1% enantiomeric impurity and proved to be robust.

  16. A Validated HPLC-DAD Method for Simultaneous Determination of Etodolac and Pantoprazole in Rat Plasma

    Directory of Open Access Journals (Sweden)

    Ali S. Abdelhameed

    2014-01-01

    Full Text Available A simple, sensitive, and accurate HPLC-DAD method has been developed and validated for the simultaneous determination of pantoprazole and etodolac in rat plasma as a tool for therapeutic drug monitoring. Optimal chromatographic separation of the analytes was achieved on a Waters Symmetry C18 column using a mobile phase that consisted of phosphate buffer pH ~4.0 as eluent A and acetonitrile as eluent B in a ratio of A:B 55:45 v/v for 6 min, pumped isocratically at a flow rate of 0.8 mL/min. The eluted analytes were monitored using a photodiode array detector set to quantify samples at 254 nm. The method was linear with r² = 0.9999 for PTZ and r² = 0.9995 for ETD over concentration ranges of 0.1-15 and 5-50 μg/mL for PTZ and ETD, respectively. The limits of detection were found to be 0.033 and 0.918 μg/mL for PTZ and ETD, respectively. The method was statistically validated for linearity, accuracy, precision, and selectivity following the International Conference on Harmonization (ICH) guidelines. The reproducibility of the method was reliable, with intra- and inter-day precision (%RSD) below 7.76% for PTZ and below 7.58% for ETD.

  17. Validation of a UV Spectrometric Method for the Assay of Tolfenamic Acid in Organic Solvents

    Directory of Open Access Journals (Sweden)

    Sofia Ahmed

    2015-01-01

    Full Text Available The present study has been carried out to validate a UV spectrometric method for the assay of tolfenamic acid (TA) in organic solvents. TA is insoluble in water; therefore, a total of thirteen commonly used organic solvents in which the drug is soluble have been selected. Fresh stock solutions of TA in each solvent at a concentration of 1 × 10-4 M (2.62 mg%) were prepared for the assay. The method has been validated according to the guideline of the International Conference on Harmonization, and parameters like linearity, range, accuracy, precision, sensitivity, and robustness have been studied. Although the method was found to be efficient for the determination of TA in all solvents, on the basis of the statistical data 1-octanol, followed by ethanol and methanol, was found to be comparatively better than the other studied solvents. No change in the stock solution stability of TA was observed in any solvent over 24 hours when stored either at room (25 ± 1 °C) or at refrigerated temperature (2-8 °C). A shift in the absorption maxima was observed for TA in various solvents, indicating drug-solvent interactions. The studied method is simple, rapid, economical, accurate, and precise for the assay of TA in different organic solvents.

  18. Validation of a computational method for assessing the impact of intra-fraction motion on helical tomotherapy plans

    Energy Technology Data Exchange (ETDEWEB)

    Ngwa, Wilfred; Meeks, Sanford L; Kupelian, Patrick A; Langen, Katja M [Department of Radiation Oncology, M D Anderson Cancer Center Orlando, 1400 South Orange Avenue, Orlando, FL 32806 (United States); Schnarr, Eric [TomoTherapy, Inc., 1240 Deming Way, Madison, WI 53717 (United States)], E-mail: wilfred.ngwa@orlandohealth.com

    2009-11-07

    In this work, a method for direct incorporation of patient motion into tomotherapy dose calculations is developed and validated. This computational method accounts for all treatment dynamics and can incorporate random as well as cyclical motion data. Hence, interplay effects between treatment dynamics and patient motion are taken into account during dose calculation. This allows for a realistic assessment of intra-fraction motion on the dose distribution. The specific approach entails modifying the position and velocity events in the tomotherapy delivery plan to accommodate any known motion. The computational method is verified through phantom and film measurements. Here, measured prostate motion and simulated respiratory motion tracks were incorporated in the dose calculation. The calculated motion-encoded dose profiles showed excellent agreement with the measurements. Gamma analysis using 3 mm and 3% tolerance criteria showed averages of over 97% and 96% of points passing for the prostate and breathing motion tracks, respectively. The profile and gamma analysis results validate the accuracy of this method for incorporating intra-fraction motion into the dose calculation engine for assessment of dosimetric effects on helical tomotherapy dose deliveries.
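
    Gamma analysis of the kind used for this validation compares a calculated dose profile with a measured one using combined dose-difference and distance-to-agreement tolerances. A simplified one-dimensional sketch with invented profiles is shown below; clinical implementations work in 2-D/3-D and are considerably more elaborate.

```python
import numpy as np

# Simplified 1-D gamma analysis (3%/3 mm) between a measured and a calculated
# dose profile. Profiles are invented; real implementations also handle
# normalisation, dose thresholds and interpolation more carefully.

def gamma_pass_rate(x, d_meas, d_calc, dose_tol=0.03, dist_tol=3.0):
    ref_dose = d_meas.max()
    pass_count = 0
    for xi, di in zip(x, d_meas):
        # gamma at this point: minimum combined criterion over all calculated points
        gamma_sq = ((x - xi) / dist_tol) ** 2 + ((d_calc - di) / (dose_tol * ref_dose)) ** 2
        if np.sqrt(gamma_sq.min()) <= 1.0:
            pass_count += 1
    return pass_count / len(x)

x = np.linspace(0, 100, 201)                       # position in mm
d_meas = np.exp(-((x - 50) / 20) ** 2)             # measured profile (a.u.)
d_calc = np.exp(-((x - 51) / 20) ** 2) * 1.02      # calculated, slightly shifted

print(f"gamma pass rate: {100 * gamma_pass_rate(x, d_meas, d_calc):.1f}%")
```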

  19. Experimental validation of the intrinsic spatial efficiency method over a wide range of sizes for cylindrical sources

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz-Ramírez, Pablo, E-mail: rapeitor@ug.uchile.cl; Larroquette, Philippe [Departamento de Física, Facultad de Ciencias, Universidad de Chile (Chile)]; Camilla, S. [Departamento de Física, Universidad Tecnológica Metropolitana (Chile)]

    2016-07-07

    The intrinsic spatial efficiency method is a new absolute method to determine the efficiency of a gamma spectroscopy system for any extended source. In the original work the method was experimentally demonstrated and validated for homogeneous cylindrical sources containing 137Cs, whose sizes varied over a small range (29.5 mm radius and 15.0 to 25.9 mm height). In this work we present an extension of the validation over a wide range of sizes. The dimensions of the cylindrical sources vary between 10 and 40 mm in height and 8 and 30 mm in radius. The cylindrical sources were prepared using the reference material IAEA-372, which had a specific activity of 11320 Bq/kg as of July 2006. The results obtained were better for the sources with 29 mm radius, showing a relative bias of less than 5%, and for the sources with 10 mm height, showing a relative bias of less than 6%. In comparison with the results obtained in the work where the method was first presented, the majority of these results show excellent agreement.

  20. DEVELOPMENT AND VALIDATION OF NUMERICAL METHOD FOR STRENGTH ANALYSIS OF LATTICE COMPOSITE FUSELAGE STRUCTURES

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Full Text Available Lattice composite fuselage structures are developed as an alternative to conventional composite structures based on laminated skin and stiffeners. The structural layout of lattice structures makes it possible to realize the advantages of current composite materials to a maximal extent while minimizing their main shortcomings, which provides higher weight efficiency for these structures in comparison with conventional analogues. Development and creation of lattice composite structures requires the development of novel methods of strength analysis, since conventional methods, as a rule, are aimed at the strength analysis of thin-walled elements and do not allow a confident estimation of the local strength of highly loaded unidirectional composite ribs. In the present work, a method for operative strength analysis of lattice composite structures is presented, based on specialized FE models of unidirectional composite ribs and their intersections. Within the framework of the method, every rib is modeled by a caisson structure consisting of an arbitrary number of flanges and webs, modeled by membrane finite elements. The parameters of the flanges and webs are calculated automatically from the condition of equality of the stiffness characteristics of the real rib and the model. This method allows local strength analysis of highly loaded ribs of the lattice structure to be performed without the use of three-dimensional finite elements, which shortens calculation times and considerably simplifies the analysis of the calculation results. For validation of the suggested method, the results of experimental investigations of a full-scale prototype of a shell of a lattice composite fuselage section have been used. The prototype of the lattice section was manufactured by CRISM and tested at TsAGI within the framework of a number of Russian and international scientific projects. The results of the validation have shown that the suggested method provides high operability of strength analysis, keeping…

  1. Validation of a method for radionuclide activity optimize in SPECT

    International Nuclear Information System (INIS)

    Perez Diaz, M.; Diaz Rizo, O.; Lopez Diaz, A.; Estevez Aparicio, E.; Roque Diaz, R.

    2007-01-01

    A discriminant method for optimizing the activity administered in NM studies is validated by comparison with ROC curves. The method is tested in 21 SPECT studies performed with a cardiac phantom. Three different cold lesions (L1, L2 and L3) were placed in the myocardium wall for each SPECT study. Three activities (84 MBq, 37 MBq or 18.5 MBq) of Tc-99m diluted in water were used as background. Linear discriminant analysis was used to select the parameters that characterize image quality (background-to-lesion (B/L) and signal-to-noise (S/N) ratios). Two clusters with different image quality (p = 0.021) were obtained from the selected variables. The first one involved the studies performed with 37 MBq and 84 MBq, and the second one included the studies with 18.5 MBq. The B/L ratios (B/L1, B/L2 and B/L3) are the parameters capable of constructing the function, with 100% of cases correctly classified into the clusters. The value of 37 MBq is the lowest tested activity for which good results for the B/Li variables were obtained, without significant differences from the results with 84 MBq (p > 0.05). The result is consistent with the applied ROC analysis. A correlation between both methods of r = 0.890 was obtained. (Author) 26 refs
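
    A linear discriminant analysis like the one described can be set up with scikit-learn as in the sketch below; the feature table of background-to-lesion and signal-to-noise ratios is invented and only illustrates how image-quality clusters could be separated, not the study's data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Illustrative linear discriminant analysis on image-quality parameters
# (background-to-lesion and signal-to-noise ratios) from SPECT studies.
# Feature values and labels are hypothetical.

# columns: B/L1, B/L2, B/L3, S/N
X = np.array([
    [0.35, 0.40, 0.38, 8.1],   # higher-activity studies (cluster 0)
    [0.33, 0.37, 0.36, 7.9],
    [0.36, 0.42, 0.40, 8.4],
    [0.55, 0.60, 0.58, 4.2],   # low-activity studies (cluster 1)
    [0.57, 0.63, 0.61, 4.0],
    [0.52, 0.59, 0.57, 4.5],
])
y = np.array([0, 0, 0, 1, 1, 1])

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
print("training accuracy:", lda.score(X, y))        # 1.0 = all cases correctly classified
print("predicted cluster for a new study:", lda.predict([[0.45, 0.50, 0.48, 6.0]]))
```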

  2. Validation of the cleaning and sanitization method for radiopharmaceutical production facilities

    International Nuclear Information System (INIS)

    Robles, Anita; Morote, Mario; Moore, Mariel; Castro, Delcy; Paragulla, Wilson; Novoa, Carlos; Otero, Manuel; Miranda, Jesus; Herrera, Jorge; Gonzales, Luis

    2014-01-01

    A protocol for the cleaning and sanitization of radiopharmaceutical production facilities has been designed and developed for the inner surfaces of the hot cells used for the production of Sodium Pertechnetate Tc-99m and Sm-153 EDTMP, considering an extreme situation for each hot cell. Cleaning is performed with double-distilled water and sanitization with two disinfectant solutions, 70% isopropyl alcohol and 3% hydrogen peroxide, in alternate weeks. Microbiological analysis of the sanitized surfaces was made after 20 minutes and 48 hours for the Tc-99m hot cell and 72 hours for the Sm-153 EDTMP hot cell, in 3 consecutive tests by the method of direct contact with plates containing culture medium, made for each sampling point (six in the first cell and five in the second). The results showed that the microbial load on the sanitized surfaces was below acceptable limits and that the lifetime of cleaning and sanitization is 48 hours for the Tc-99m hot cell and 72 hours for the EDTMP Sm-153 one. In conclusion, the method of cleaning and sanitization is effective in reducing or eliminating microbial contamination; therefore, the process is validated. (authors).

  3. A validated Fourier transform infrared spectroscopy method for quantification of total lactones in Inula racemosa and Andrographis paniculata.

    Science.gov (United States)

    Shivali, Garg; Praful, Lahorkar; Vijay, Gadgil

    2012-01-01

    Fourier transform infrared (FT-IR) spectroscopy is a technique widely used for detection and quantification of various chemical moieties. This paper describes the use of the FT-IR spectroscopy technique for the quantification of total lactones present in Inula racemosa and Andrographis paniculata. To validate the FT-IR spectroscopy method for quantification of total lactones in I. racemosa and A. paniculata. Dried and powdered I. racemosa roots and A. paniculata plant were extracted with ethanol and dried to remove ethanol completely. The ethanol extract was analysed in a KBr pellet by FT-IR spectroscopy. The FT-IR spectroscopy method was validated and compared with a known spectrophotometric method for quantification of lactones in A. paniculata. By FT-IR spectroscopy, the amount of total lactones was found to be 2.12 ± 0.47% (n = 3) in I. racemosa and 8.65 ± 0.51% (n = 3) in A. paniculata. The method showed comparable results with a known spectrophotometric method used for quantification of such lactones: 8.42 ± 0.36% (n = 3) in A. paniculata. Limits of detection and quantification for isoallantolactone were 1 µg and 10 µg respectively; for andrographolide they were 1.5 µg and 15 µg respectively. Recoveries were over 98%, with good intra- and interday repeatability: RSD ≤ 2%. The FT-IR spectroscopy method proved linear, accurate, precise and specific, with low limits of detection and quantification, for estimation of total lactones, and is less tedious than the UV spectrophotometric method for the compounds tested. This validated FT-IR spectroscopy method is readily applicable for the quality control of I. racemosa and A. paniculata. Copyright © 2011 John Wiley & Sons, Ltd.

  4. HPTLC Method Development and Validation of Zolpidem Tartrate in Bulk and Marketed Formulation

    OpenAIRE

    Abhay R. Shirode; Bharti G. Jadhav; Vilasrao J. Kadam

    2015-01-01

    High performance thin layer chromatography (HPTLC) offers many advantages over HPLC. It reduces the cost of analysis as compared to HPLC. The mobile phase consumption per sample is extremely low in HPTLC, hence reducing the acquisition and disposal costs. Considering the cost and suitability of analysis for the estimation of zolpidem tartrate in bulk and its marketed formulation, an HPTLC method was developed and validated. The Camag HPTLC system, employed with the winCATS software (ver. 1.4.1.8), was used ...

  5. Validation of a survey tool for use in cross-cultural studies

    Directory of Open Access Journals (Sweden)

    Costa FA

    2008-09-01

    Full Text Available There is a need for tools to measure the information patients need in order for healthcare professionals in general, and particularly pharmacists, to communicate effectively and play an active part in the way patients manage their medicines. Previous research has developed and validated constructs to measure patients' desires for information and their perceptions of how useful their medicines are. It is important to develop these tools for use in different settings and countries so that best practice is shared and is based on the best available evidence. Objectives: This project sought to validate a survey tool measuring the "Extent of Information Desired" (EID), the "Perceived Utility of Medicines" (PUM), and the "Anxiety about Illness" (AI) that had been previously translated for use with Portuguese patients. Methods: The scales were validated in a patient sample of 596: construct validity was explored with factor analysis (PCA) and internal consistency analysed using Cronbach's alpha. Criterion validity was explored by correlating scores to the AI scale and to patients' perceived health status. Discriminatory power was assessed using ANOVA. Temporal stability was explored in a sub-sample of patients who responded at two time points, using a t-test to compare their mean scores. Results: The construct validity results indicated the need to remove one item from the Perceived Harm of Medicines (PHM) and Perceived Benefit of Medicines (PBM) scales for use in a Portuguese sample, and to abandon the tolerance scale. The internal consistency was high for the EID, PBM and AI scales (alpha > 0.600) and acceptable for the PHM scale (alpha = 0.536). All scales, except the EID, were consistent over time (p > 0.05; p < 0.01). All the scales tested showed good discriminatory power. The comparison of the AI scale with the SF-36 indicated good criterion validity (p < 0.05). Conclusion: The translated tool was valid and reliable in Portuguese patients, excluding the Tolerance...
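
    Internal consistency of the kind reported here is usually summarised with Cronbach's alpha, which can be computed directly from the item-score matrix; the sketch below uses an invented response matrix rather than the study's data.

```python
import numpy as np

# Illustrative Cronbach's alpha for a scale: alpha = k/(k-1) * (1 - sum of
# item variances / variance of the total score). Responses are hypothetical.

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)      # shape: (respondents, items)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])
print(f"alpha = {cronbach_alpha(responses):.3f}")
```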

  6. Validation of the activity expansion method with ultrahigh pressure shock equations of state

    Science.gov (United States)

    Rogers, Forrest J.; Young, David A.

    1997-11-01

    Laser shock experiments have recently been used to measure the equation of state (EOS) of matter in the ultrahigh pressure region between condensed matter and a weakly coupled plasma. Some ultrahigh pressure data from nuclear-generated shocks are also available. Matter at these conditions has proven very difficult to treat theoretically. The many-body activity expansion method (ACTEX) has been used for some time to calculate EOS and opacity data in this region, for use in modeling inertial confinement fusion and stellar interior plasmas. In the present work, we carry out a detailed comparison with the available experimental data in order to validate the method. The agreement is good, showing that ACTEX adequately describes strongly shocked matter.

  7. Evolutionary Analysis Predicts Sensitive Positions of MMP20 and Validates Newly- and Previously-Identified MMP20 Mutations Causing Amelogenesis Imperfecta

    Directory of Open Access Journals (Sweden)

    Barbara Gasse

    2017-06-01

    Full Text Available Amelogenesis imperfecta (AI) designates a group of genetic diseases characterized by a large range of enamel disorders causing important social and health problems. These defects can result from mutations in enamel matrix proteins or protease encoding genes. A range of mutations in the enamel cleavage enzyme matrix metalloproteinase-20 gene (MMP20) produce enamel defects of varying severity. To address how various alterations produce a range of AI phenotypes, we performed a targeted analysis to find MMP20 mutations in French patients diagnosed with non-syndromic AI. Genomic DNA was isolated from saliva and MMP20 exons and exon-intron boundaries sequenced. We identified several homozygous or heterozygous mutations, putatively involved in the AI phenotypes. To validate missense mutations and predict sensitive positions in the MMP20 sequence, we evolutionarily compared 75 sequences extracted from the public databases using the Datamonkey webserver. These sequences were representative of mammalian lineages, covering more than 150 million years of evolution. This analysis allowed us to find 324 sensitive positions (out of the 483 MMP20 residues), pinpoint functionally important domains, and build an evolutionary chart of important conserved MMP20 regions. This is an efficient tool to identify new- and previously-identified mutations. We thus identified six functional MMP20 mutations in unrelated families, finding two novel mutated sites. The genotypes and phenotypes of these six mutations are described and compared. To date, 13 MMP20 mutations causing AI have been reported, making these genotypes and associated hypomature enamel phenotypes the most frequent in AI.

  8. Evolutionary Analysis Predicts Sensitive Positions of MMP20 and Validates Newly- and Previously-Identified MMP20 Mutations Causing Amelogenesis Imperfecta.

    Science.gov (United States)

    Gasse, Barbara; Prasad, Megana; Delgado, Sidney; Huckert, Mathilde; Kawczynski, Marzena; Garret-Bernardin, Annelyse; Lopez-Cazaux, Serena; Bailleul-Forestier, Isabelle; Manière, Marie-Cécile; Stoetzel, Corinne; Bloch-Zupan, Agnès; Sire, Jean-Yves

    2017-01-01

    Amelogenesis imperfecta (AI) designates a group of genetic diseases characterized by a large range of enamel disorders causing important social and health problems. These defects can result from mutations in enamel matrix proteins or protease encoding genes. A range of mutations in the enamel cleavage enzyme matrix metalloproteinase-20 gene ( MMP20 ) produce enamel defects of varying severity. To address how various alterations produce a range of AI phenotypes, we performed a targeted analysis to find MMP20 mutations in French patients diagnosed with non-syndromic AI. Genomic DNA was isolated from saliva and MMP20 exons and exon-intron boundaries sequenced. We identified several homozygous or heterozygous mutations, putatively involved in the AI phenotypes. To validate missense mutations and predict sensitive positions in the MMP20 sequence, we evolutionarily compared 75 sequences extracted from the public databases using the Datamonkey webserver. These sequences were representative of mammalian lineages, covering more than 150 million years of evolution. This analysis allowed us to find 324 sensitive positions (out of the 483 MMP20 residues), pinpoint functionally important domains, and build an evolutionary chart of important conserved MMP20 regions. This is an efficient tool to identify new- and previously-identified mutations. We thus identified six functional MMP20 mutations in unrelated families, finding two novel mutated sites. The genotypes and phenotypes of these six mutations are described and compared. To date, 13 MMP20 mutations causing AI have been reported, making these genotypes and associated hypomature enamel phenotypes the most frequent in AI.

  9. A Newly Improved Modified Method Development and Validation of Bromofenac Sodium Sesquihydrate in Bulk Drug Manufacturing

    OpenAIRE

    Sunil Kumar Yelamanchi V; Useni Reddy Mallu; I. V Kasi Viswanath; D. Balasubramanyam; G. Narshima Murthy

    2016-01-01

    The main objective of this study was to develop a simple, efficient, specific, precise and accurate, newly improved and modified reverse-phase high-performance liquid chromatographic purity (related substances) method for bromofenac sodium sesquihydrate active pharmaceutical ingredient and dosage form. Validation of an analytical method is the confirmation by examination, and the provision of objective evidence, that the particular requirements for a specific intended use are fulfilled as per ICH, USP...

  10. Development and Validation of LC Method for the Determination of Famciclovir in Pharmaceutical Formulation Using an Experimental Design

    Directory of Open Access Journals (Sweden)

    Srinivas Vishnumulaka

    2008-01-01

    Full Text Available A rapid and sensitive RP-HPLC method with UV detection (242 nm) for routine analysis of famciclovir in pharmaceutical formulations was developed. Chromatography was performed with a mobile phase containing a mixture of methanol and phosphate buffer (50:50, v/v) at a flow rate of 1.0 mL/min. Quantitation was accomplished with the internal standard method. The procedure was validated for linearity (correlation coefficient = 0.9999), accuracy, robustness and intermediate precision. Experimental design was used for the validation of robustness and intermediate precision. To test robustness, three factors were considered: the percentage (v/v) of methanol in the mobile phase, the flow rate and the pH; the flow rate, the percentage of organic modifier and the pH had a considerable effect on the response. For the intermediate precision measure, the variables considered were analyst, equipment and number of days. The RSD value (0.86%, n = 24) indicated an acceptable precision of the analytical method. The proposed method is simple, sensitive, precise, accurate and quick, and useful for routine quality control.

  11. Development and validation of an HPLC/UV/MS method for simultaneous determination of 18 preservatives in grapefruit seed extract.

    Science.gov (United States)

    Ganzera, Markus; Aberham, Anita; Stuppner, Hermann

    2006-05-31

    Grapefruit seed extracts are used in cosmetics, food supplements, and pesticides because of their antimicrobial properties, but suspicions about the true nature of the active compounds arose when synthetic disinfectants such as benzethonium or benzalkonium chloride were found in commercial products. The HPLC method presented herein allows the quality assessment (qualitative and quantitative) of these products for the first time. On the basis of a standard mixture of 18 preservatives most relevant for food and grapefruit products, a method was developed allowing the baseline separation of all compounds within 40 min. Optimum results were obtained with a C-8 stationary phase and a solvent system comprising aqueous trifluoroacetic acid, acetonitrile, and 2-propanol. The assay was fully validated and shown to be sensitive, accurate (recovery ≥ 96.1%), repeatable (σrel ≤ 3.5%), precise (intra-day variation ≤ 4.5%, inter-day variation ≤ 4.1%), and rugged. Without any modifications the method could be adopted for LC-MS experiments, where the compounds of interest were directly assignable in positive ESI mode. The quantitative results of several products for ecofarming confirmed previous studies, as seven out of nine specimens were adulterated with preservatives in varying composition. The samples either contained benzethonium chloride (2.5-176.9 mg/mL) or benzalkonium chloride (138.2-236.3 mg/mL), together with smaller amounts of 4-hydroxybenzoic acid esters, benzoic acid, and salicylic acid.

  12. What Does It Cost to Prevent On-Duty Firefighter Cardiac Events? A Content Valid Method for Calculating Costs

    Directory of Open Access Journals (Sweden)

    P. Daniel Patterson

    2013-01-01

    Full Text Available Cardiac arrest is a leading cause of mortality among firefighters. We sought to develop a valid method for determining the costs of a workplace prevention program for firefighters. In 2012, we developed a draft framework using human resource accounting and in-depth interviews with experts in the firefighting and insurance industries. The interviews produced a draft cost model with 6 components and 26 subcomponents. In 2013, we randomly sampled 100 fire chiefs out of >7,400 affiliated with the International Association of Fire Chiefs. We used the Content Validity Index (CVI) to identify the content-valid components of the draft cost model. This was accomplished by having fire chiefs rate the relevancy of cost components using a 4-point Likert scale (highly relevant to not relevant). We received complete survey data from 65 fire chiefs (65% response rate). We retained 5 components and 21 subcomponents based on CVI scores ≥0.70. The five main components include: (1) investment costs, (2) orientation and training costs, (3) medical and pharmaceutical costs, (4) education and continuing education costs, and (5) maintenance costs. Data from a diverse sample of fire chiefs have produced a content-valid method for calculating the cost of a prevention program among firefighters.
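
    The item-level Content Validity Index is simply the proportion of raters scoring an item 3 or 4 on the 4-point relevance scale; the sketch below computes it for invented ratings and applies the 0.70 retention threshold used in the study.

```python
import numpy as np

# Illustrative item-level Content Validity Index (I-CVI): the proportion of
# raters scoring an item 3 ("relevant") or 4 ("highly relevant") on a 4-point
# scale. The ratings matrix is hypothetical.

def item_cvi(ratings):
    ratings = np.asarray(ratings)               # shape: (raters, items)
    return (ratings >= 3).mean(axis=0)

ratings = np.array([
    [4, 3, 2, 4],
    [4, 4, 2, 3],
    [3, 4, 1, 4],
    [4, 3, 2, 4],
    [4, 4, 3, 3],
])
cvi = item_cvi(ratings)
for i, value in enumerate(cvi, start=1):
    keep = "retain" if value >= 0.70 else "drop"
    print(f"item {i}: I-CVI = {value:.2f} -> {keep}")
```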

  13. Validity studies among hierarchical methods of cluster analysis using cophenetic correlation coefficient

    International Nuclear Information System (INIS)

    Carvalho, Priscilla R.; Munita, Casimiro S.; Lapolli, André L.

    2017-01-01

    The literature presents many methods for partitioning a data base, and it is difficult to choose which is the most suitable, since the various combinations of methods based on different measures of dissimilarity can lead to different patterns of grouping and false interpretations. Nevertheless, little effort has been expended in evaluating these methods empirically using an archaeological data base. Therefore, the objective of this work is to make a comparative study of the different cluster analysis methods and identify which is the most appropriate. For this, the study was carried out using a data base of the Archaeometric Studies Group from IPEN-CNEN/SP, in which 45 samples of ceramic fragments from three archaeological sites were analyzed by instrumental neutron activation analysis (INAA) to determine the mass fractions of 13 elements (As, Ce, Cr, Eu, Fe, Hf, La, Na, Nd, Sc, Sm, Th, U). The methods used for this study were: single linkage, complete linkage, average linkage, centroid and Ward. The validation was done using the cophenetic correlation coefficient; comparing these values, the average linkage method obtained the best results. A script for the statistical program R with some functions was created to obtain the cophenetic correlation. By means of these values it was possible to choose the most appropriate method to be used on the data base. (author)
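
    The comparison reported here can be reproduced in outline with SciPy, which provides both hierarchical linkage and the cophenetic correlation coefficient (the study itself used an R script); the sketch below runs the five linkage methods on random stand-in data rather than the INAA element concentrations.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, cophenet
from scipy.spatial.distance import pdist

# Illustrative comparison of hierarchical clustering methods by cophenetic
# correlation coefficient. Random data stand in for the 45 x 13 matrix of
# element mass fractions used in the study.

rng = np.random.default_rng(0)
data = rng.normal(size=(45, 13))          # samples x elements (stand-in values)
distances = pdist(data)                   # pairwise Euclidean distances

for method in ["single", "complete", "average", "centroid", "ward"]:
    Z = linkage(data, method=method)
    c, _ = cophenet(Z, distances)
    print(f"{method:>8}: cophenetic correlation = {c:.3f}")
```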

  14. Validity studies among hierarchical methods of cluster analysis using cophenetic correlation coefficient

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Priscilla R.; Munita, Casimiro S.; Lapolli, André L., E-mail: prii.ramos@gmail.com, E-mail: camunita@ipen.br, E-mail: alapolli@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    The literature presents many methods for partitioning a data base, and it is difficult to choose which is the most suitable, since the various combinations of methods based on different measures of dissimilarity can lead to different patterns of grouping and false interpretations. Nevertheless, little effort has been expended in evaluating these methods empirically using an archaeological data base. Therefore, the objective of this work is to make a comparative study of the different cluster analysis methods and identify which is the most appropriate. For this, the study was carried out using a data base of the Archaeometric Studies Group from IPEN-CNEN/SP, in which 45 samples of ceramic fragments from three archaeological sites were analyzed by instrumental neutron activation analysis (INAA) to determine the mass fractions of 13 elements (As, Ce, Cr, Eu, Fe, Hf, La, Na, Nd, Sc, Sm, Th, U). The methods used for this study were: single linkage, complete linkage, average linkage, centroid and Ward. The validation was done using the cophenetic correlation coefficient; comparing these values, the average linkage method obtained the best results. A script for the statistical program R with some functions was created to obtain the cophenetic correlation. By means of these values it was possible to choose the most appropriate method to be used on the data base. (author)

  15. Multi-Trait Multi-Method Matrices for the Validation of Creativity and Critical Thinking Assessments for Secondary School Students in England and Greece

    Directory of Open Access Journals (Sweden)

    Ourania Maria Ventista

    2017-08-01

    Full Text Available The aim of this paper is the validation of measurement tools which assess critical thinking and creativity as general constructs instead of subject-specific skills. Specifically, this research examined whether there is convergent and discriminant (or divergent) validity between measurement tools of creativity and critical thinking. For this purpose, the multi-trait and multi-method matrix suggested by Campbell and Fiske (1959) was used. This matrix presents the correlations of the scores that students obtain in different assessments in order to reveal whether the assessments measure the same or different constructs. Specifically, the two methods used were written and oral exams, and the two traits measured were critical thinking and creativity. For the validation of the assessments, 30 secondary-school students in Greece and 21 in England completed the assessments. The sample in both countries provided similar results. The critical thinking tools demonstrated convergent validity when compared with each other and discriminant validity with the creativity assessments. Furthermore, creativity assessments which measure the same aspect of creativity demonstrated convergent validity. To conclude, this research provided indicators that critical thinking and creativity as general constructs can be measured in a valid way. However, since the sample was small, further investigation of the validation of the assessment tools with a bigger sample is recommended.

  16. Validation of FNAA method for testing the elements of Mn, Cr and Mg on the Gajahwong river sediment sample

    International Nuclear Information System (INIS)

    Wisjachudin Faisal; Elin Nuraini

    2010-01-01

    Validation of the FNAA method for the elements Mn, Cr and Mg has been performed. NBS SRM 8704 (Buffalo River Sediment) was used as the standard reference material, with the neutron generator operating at the optimum voltage of 110 kV. The energy-channel calibration line was obtained as the standard equation y = 1.034x + 151.21. From the analysis of the SRM, the results show that only Mg can be analyzed, because the Cr and Mn peaks are located at the same point (interference) and therefore cannot be resolved. From the analysis of the Mg element in the SRM, the precision and accuracy obtained are 95.53% and 94.88%, while the average expanded uncertainty for the various locations is 0.233 ± 0.012. The Mg content at various locations along the Gajahwong river ranged from 85.41 to 103.55 ppm. Comparison with previous studies shows that the contents of Fe, Al and Si are much higher than the Mg content. (author)

  17. Validity and extension of the SCS-CN method for computing infiltration and rainfall-excess rates

    Science.gov (United States)

    Mishra, Surendra Kumar; Singh, Vijay P.

    2004-12-01

    A criterion is developed for determining the validity of the Soil Conservation Service curve number (SCS-CN) method. According to this criterion, the existing SCS-CN method is found to be applicable when the potential maximum retention, S, is less than or equal to twice the total rainfall amount. The criterion is tested using published data of two watersheds. Separating the steady infiltration from capillary infiltration, the method is extended for predicting infiltration and rainfall-excess rates. The extended SCS-CN method is tested using 55 sets of laboratory infiltration data on soils varying from Plainfield sand to Yolo light clay, and the computed and observed infiltration and rainfall-excess rates are found to be in good agreement.
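
    A minimal sketch of the classical SCS-CN runoff equation together with the validity check S <= 2P described above is given below; the curve number and rainfall values are illustrative, and lambda = 0.2 is the conventional initial-abstraction ratio rather than a value taken from the paper.

    def scs_cn_runoff(P_mm: float, CN: float, lam: float = 0.2) -> float:
        """Direct runoff / rainfall excess Q (mm) for rainfall P (mm) and curve number CN."""
        S = 25400.0 / CN - 254.0                 # potential maximum retention (mm)
        if S > 2.0 * P_mm:
            print("warning: S > 2P, outside the validity criterion discussed above")
        Ia = lam * S                             # initial abstraction
        if P_mm <= Ia:
            return 0.0
        return (P_mm - Ia) ** 2 / (P_mm - Ia + S)

    print(scs_cn_runoff(P_mm=80.0, CN=75))       # about 27 mm of rainfall excess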

  18. Phytochemical analysis of Vernonanthura tweedieana and a validated UPLC-PDA method for the quantification of eriodictyol

    Directory of Open Access Journals (Sweden)

    Layzon Antonio Lemos da Silva

    Full Text Available AbstractVernonanthura tweedieana (Baker H. Rob., Asteraceae, is used in the Brazilian folk medicine for the treatment of respiratory diseases. In this work the phytochemical investigation of its ethanol extracts as well as the development and validation of an UPLC-PDA method for the quantification of the eriodictyol from the leaves were performed. The phytochemical study for this species lead to the identification of ethyl caffeate, naringenin and chrysoeriol in mixture, eriodictyol from leaves, and the mixture of 3-hydroxy-1-(4-hydroxy-3,5-dimethoxyphenyl-propan-1-one and evofolin B, apigenin, the mixture of caffeic and protocatechuic acid and luteolin from stems with roots, being reported for the first time for V. tweedieana, except for eriodictyol. The structural elucidation of all isolated compounds was achieved by 1H and 2D NMR spectroscopy, and in comparison with published data. An UPLC-PDA method for quantification of the eriodictyol in leaves of V. tweedieana was developed and validated for specificity, linearity, precision (repeatability and intermediate precision, limit of detection (LOD and limit of quantification (LOQ, accuracy and robustness. In this study, an excellent linearity was obtained (r2 = 0.9999, good precision (repeatability RSD = 2% and intermediate precision RSD = 8% and accuracy (average recovery from 98.6% to 99.7%. The content of eriodictyol in the extract of leaves of V. tweedieana was 41.40 ± 0.13 mg/g. Thus, this study allowed the optimization of a simple, fast and validated UPLC-PDA method which can be used to support the quality assessment of this herbal material.

  19. Method validation to measure Strontium-90 in urine sample for internal dosimetry assessment

    International Nuclear Information System (INIS)

    Bitar, A.; Maghrabi, M.; Alhamwi, A.

    2010-12-01

    Occupationally exposed individuals at some scientific centers in the Syrian Arab Republic may receive significant intakes by ingestion or inhalation during the production of radiopharmaceutical compounds. The radioactive intake received varies with the amount of radionuclides released during the preparation processes, the working conditions and the way the radiation protection procedures are applied. TLDs (thermoluminescence dosimeters) are usually used for external radiation monitoring of workers in radioisotope centers. During the external monitoring programme, it was noticed that some workers had been exposed to a high external dose resulting from a radiation accident in their laboratory while preparing Y-90 from Sr-90. For internal dose assessment, a chemical method to measure the amount of Sr-90 in urine samples was validated and is explained in detail in this study. Urine bioassays were carried out and the activities of 90Sr were determined using a liquid scintillation counter. The validated method was then used for internal occupational monitoring through the design of an internal monitoring programme, established for four workers who each handle about 20 mCi twice per month. Initially, a theoretical study was done to assess the maximum risks for the workers. The calculated internal doses showed that a routine internal monitoring programme must be applied for these workers. (author)
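
    The arithmetic that links a urine bioassay result to a committed dose is not spelled out in the abstract; the sketch below shows only the generic two-step estimate (intake from the measured excretion rate via an excretion fraction, then dose via a dose coefficient). Both numerical coefficients are placeholders to be replaced by the appropriate ICRP tabulated values, not figures from this study.

    def committed_dose_sv(urine_bq_per_day: float,
                          excretion_fraction: float = 1e-3,       # assumed excretion fraction at the measurement time
                          dose_coeff_sv_per_bq: float = 2.8e-8):  # assumed dose coefficient e(g) for Sr-90 intake
        intake_bq = urine_bq_per_day / excretion_fraction          # estimated intake
        return intake_bq * dose_coeff_sv_per_bq                    # committed effective dose

    print(f"{committed_dose_sv(0.5) * 1e3:.4f} mSv")               # 0.5 Bq/d in urine -> about 0.014 mSv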

  20. Development and Validation Dissolution Analytical Method of Nimesulide beta-Cyclodextrin 400 mg Tablet

    Directory of Open Access Journals (Sweden)

    Carlos Eduardo Carvalho Pereira

    2016-10-01

    Full Text Available Nimesulide (N-(4-nitro-2-phenoxyphenyl)methanesulfonamide) belongs to the class of non-steroidal anti-inflammatory drugs (NSAIDs) and to class II of the biopharmaceutical classification system. The complexation of nimesulide with beta-cyclodextrin is a pharmacological strategy to increase the solubility of the drug. The objective of this study was to develop and validate a dissolution analytical methodology for the nimesulide beta-cyclodextrin 400 mg tablet that meets the ANVISA guidelines for drug registration purposes. Once developed, the dissolution methodology was validated according to the parameters of Resolution RE no. 899/2003. During method development it was found that the most suitable duration of the dissolution test was 60 minutes and that the most suitable volume and dissolution medium were 900 mL of a 1% (w/v) aqueous solution of sodium lauryl sulfate. A rotation speed of 100 rpm with the paddle apparatus was the most appropriate for evaluating the dissolution of the drug. A spectrophotometric methodology was used to quantify the percentage of dissolved drug, with quantification performed at a wavelength of 390 nm. In the validation of the methodology, the system suitability, specificity/selectivity, linearity, precision, accuracy and robustness parameters were satisfactory and demonstrated that the developed dissolution methodology was properly executed. DOI: http://dx.doi.org/10.17807/orbital.v8i5.827
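
    As an illustration of the final quantification step, the sketch below converts an absorbance reading at 390 nm into the percentage of the 400 mg label claim dissolved in the 900 mL vessel, assuming a linear calibration curve. The slope, intercept and dilution factor are hypothetical values, not parameters from the validated method.

    def percent_dissolved(absorbance: float,
                          slope: float = 0.045,       # AU per (ug/mL), assumed
                          intercept: float = 0.002,   # AU, assumed
                          dilution: float = 10.0,     # sample dilution factor, assumed
                          volume_ml: float = 900.0,   # dissolution medium volume
                          label_claim_mg: float = 400.0) -> float:
        conc_ug_ml = (absorbance - intercept) / slope * dilution   # drug concentration in the vessel
        dissolved_mg = conc_ug_ml * volume_ml / 1000.0             # mass of drug dissolved
        return 100.0 * dissolved_mg / label_claim_mg

    print(f"{percent_dissolved(1.95):.1f}% dissolved")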